WorldWideScience

Sample records for matter supportive methodology

  1. Establishment of Requirements and Methodology for the Development and Implementation of GreyMatters, a Memory Clinic Information System.

    Science.gov (United States)

    Tapuria, Archana; Evans, Matt; Curcin, Vasa; Austin, Tony; Lea, Nathan; Kalra, Dipak

    2017-01-01

    The aim of the paper is to establish the requirements and methodology for the development process of GreyMatters, a memory clinic system, outlining the conceptual, practical, technical and ethical challenges, and the experiences of capturing clinical and research-oriented data along with the implementation of the system. The methodology for development of the information system involved phases of requirements gathering, modelling and prototype creation, and 'bench testing' the prototype with experts. The standard Institute of Electrical and Electronics Engineers (IEEE) recommended approach for the specification of software requirements was adopted. An electronic health record (EHR) standard, EN 13606, was used, clinical modelling was done through archetypes, and the project complied with data protection and privacy legislation. The requirements for GreyMatters were established. Though the initial development was complex, the requirements, methodology and standards adopted made the construction, deployment, adoption and population of a memory clinic and research database feasible. The electronic patient data, including the assessment scales, provide a rich source of objective data for audits and research, and help to establish study feasibility and identify potential participants for clinical trials. The establishment of requirements and methodology, addressing issues of data security and confidentiality, future data compatibility and interoperability, and medico-legal aspects such as access controls and audit trails, led to a robust and useful system. The evaluation supports that the system is an acceptable tool for clinical, administrative, and research use and forms a useful part of the wider information architecture.

  2. Fire Support Requirements Methodology Study, Phase 2 Proceedings of the Fire Support Methodology Workshop

    Science.gov (United States)

    1975-12-18

    It was not immediately clear that the approach would succeed in overcoming the deficiencies of present fire support methodologies, which demand an ... support require analysis up to Level 6. They also felt that deficiencies in technique were most serious at Levels 3, 4 and 5. It was accepted that ... Tk,t = Tk,t-1 - Mk,t-1 + ..., t > 2 (equations (2) and (3) of the excerpt), where Mk,t refers to the number of type k targets killed in time
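    The garbled recurrence in this excerpt appears to describe time-stepped target-attrition bookkeeping: the number of type k targets remaining at time t is the number remaining at t-1 minus those killed. As a loosely hedged illustration only, with an invented function name and entirely hypothetical numbers:

```python
# Illustrative sketch only: the report's equations are too garbled in this
# excerpt to reconstruct exactly, so this implements a generic attrition
# recursion of the form T[k][t] = T[k][t-1] - M[k][t-1] (targets remaining
# minus targets killed in the previous period), clipped at zero.

def remaining_targets(initial, kills_per_step):
    """Return the number of type-k targets remaining after each time step."""
    history = [initial]
    for killed in kills_per_step:
        # A target count can never go negative.
        history.append(max(history[-1] - killed, 0))
    return history

print(remaining_targets(100, [10, 25, 40, 40]))  # [100, 90, 65, 25, 0]
```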

  3. The Evaluation Methodology of Information Support

    Directory of Open Access Journals (Sweden)

    Lubos Necesal

    2016-01-01

    Knowledge, information and people are the motive force in today's organizations. Successful organizations need to find the right employees and provide them with the right, high-quality information. This is a complex problem. In a world where information plays an ever more important role, employees have to be skilled in information activities (searching, processing, saving, etc.) and in the information system(s) (IS) they work with. Organizations have to cover both these areas. Therefore, an effective instrument is needed that can be used to evaluate new employees at admission or as a regular evaluation of current employees, to evaluate whether the information system is an appropriate tool for fulfilling employees' tasks within the organization, and to evaluate how well the organization covers the foregoing areas. Such an instrument is the "Evaluation Methodology of Information Support in Organization". This paper defines the term "information support" and its role in the organization. The body of the paper proposes the "Evaluation Methodology of Information Support in Organization". The conclusion discusses the contributions of information support evaluation.

  4. Quark Matter 2017: Young Scientist Support

    Energy Technology Data Exchange (ETDEWEB)

    Evdokimov, Olga [University of Illinois at Chicago

    2017-07-31

    The Quark Matter conference series is among the major scientific events for the relativistic heavy-ion community. With an over-30-year history, the meetings are held about every 1½ years to showcase the progress made in theoretical and experimental studies of nuclear matter under extreme conditions. The 26th International Conference on Ultra-relativistic Nucleus-Nucleus Collisions (Quark Matter 2017) was held at the Hyatt Regency Hotel in downtown Chicago from Sunday, February 5th through Saturday, February 11th, 2017. The conference featured about 180 plenary and parallel presentations of the most significant recent results in the field, a poster session for additional presentations, and an evening public lecture. Following the tradition of previous Quark Matter meetings, the first day of the conference was dedicated entirely to a special program for young scientists (graduate students and postdoctoral researchers). This grant provided financial support for 235 young physicists, facilitating their attendance at the conference.

  5. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  6. WORK ALLOCATION IN COMPLEX PRODUCTION PROCESSES: A METHODOLOGY FOR DECISION SUPPORT

    OpenAIRE

    de Mello, Adriana Marotti; School of Economics, Business and Accounting at the University of São Paulo; Marx, Roberto; Polytechnic School, University of São Paulo; Zilbovicius, Mauro; Polytechnic School – University of São Paulo

    2013-01-01

    This article presents the development of a Methodology of Decision Support for Work Allocation in complex production processes. It is known that this decision is frequently taken empirically and that the methodologies available to support it are few and restricted in terms of their conceptual basis. Time-and-motion study is one of these methodologies, but its applicability is restricted in cases of more complex production processes. The method presented here was developed as a result of...

  7. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on the theoretical framework, the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and Learning Projects using ICT. These methodologies were applied to the teaching-learning process during the Algorithms and Complexity (A&C) course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application, supported by ICT, using learning objects. The aforementioned course was built on the Blackboard platform, where the operation of the methodologies was evaluated. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by the students, which verifies that the methodologies are functional.

  8. Sampling and analytical methodologies for instrumental neutron activation analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1992-01-01

    The IAEA supports a number of projects having to do with the analysis of airborne particulate matter by nuclear techniques. Most of this work involves the use of activation analysis in its various forms, particularly instrumental neutron activation analysis (INAA). This technique has been widely used in many different countries for the analysis of airborne particulate matter, and there are already many publications in scientific journals, books and reports describing such work. The present document represents an attempt to summarize the most important features of INAA as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of INAA to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability, although they are presented here in a way that takes account of the particular requirements arising from the use of INAA as the analytical technique. The analytical part of the document, however, is presented in a form that is applicable only to INAA. (Subsequent publications in this series are expected to deal specifically with other nuclear related techniques such as energy dispersive X ray fluorescence (ED-XRF) and particle induced X ray emission (PIXE) analysis). Although the methods and procedures described here have been found through experience to yield acceptable results, they should not be considered mandatory. Any other procedure used should, however, be chosen to be capable of yielding results at least of equal quality to those described

  10. Methodology to estimate particulate matter emissions from certified commercial aircraft engines.

    Science.gov (United States)

    Wayson, Roger L; Fleming, Gregg G; Lovinelli, Ralph

    2009-01-01

    Today, about one-fourth of U.S. commercial service airports, including 41 of the busiest 50, are either in nonattainment or maintenance areas per the National Ambient Air Quality Standards. U.S. aviation activity is forecast to triple by 2025, while at the same time the U.S. Environmental Protection Agency (EPA) is evaluating stricter particulate matter (PM) standards on the basis of documented human health and welfare impacts. Stricter federal standards are expected to impede capacity and limit aviation growth if regulatory-mandated emission reductions occur as for other non-aviation sources (e.g., automobiles, power plants). In addition, strong interest exists as to the role aviation emissions play in air quality and climate change issues. These reasons underpin the need to quantify and understand PM emissions from certified commercial aircraft engines, which has led to the need for a methodology to predict these emissions. Standardized sampling techniques to measure volatile and nonvolatile PM emissions from aircraft engines do not exist. As such, a first-order approximation (FOA) was derived to fill this need based on available information. FOA1.0 only allowed prediction of nonvolatile PM. FOA2.0 added volatile PM emissions on the basis of the ratio of nonvolatile to volatile emissions. Recent collaborative efforts by industry (manufacturers and airlines), research establishments, and regulators have begun to provide further insight into the estimation of PM emissions. The resultant PM measurement datasets are being analyzed to refine sampling techniques and progress towards standardized PM measurements. These preliminary measurement datasets also support the continued refinement of the FOA methodology. FOA3.0 disaggregated the prediction techniques to allow for independent prediction of nonvolatile and volatile emissions on a more theoretical basis. The Committee for Aviation Environmental Protection of the International Civil
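    The FOA2.0-style ratio approach described above can be sketched in a few lines. This is a hedged illustration only: the function name, the emission index, and the volatile-to-nonvolatile ratio below are hypothetical placeholders, not values from the FOA methodology or any certified engine.

```python
# Hedged sketch of the ratio approach the abstract attributes to FOA2.0:
# volatile PM is estimated as a fixed fraction of the nonvolatile PM
# emission index (EI, grams of PM per kg of fuel burned). All numbers
# here are invented for illustration.

def total_pm_grams(fuel_burn_kg, ei_nonvolatile_g_per_kg, vol_to_nonvol_ratio):
    # Derive the volatile EI from the nonvolatile EI via the assumed ratio,
    # then scale the combined EI by the fuel burned.
    ei_volatile = ei_nonvolatile_g_per_kg * vol_to_nonvol_ratio
    return fuel_burn_kg * (ei_nonvolatile_g_per_kg + ei_volatile)

# 1000 kg of fuel, nonvolatile EI of 0.05 g/kg, assumed ratio 0.4:
print(total_pm_grams(1000.0, 0.05, 0.4))  # 70.0 grams
```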

  11. Particulate Matter Filtration Design Considerations for Crewed Spacecraft Life Support Systems

    Science.gov (United States)

    Agui, Juan H.; Vijayakumar, R.; Perry, Jay L.

    2016-01-01

    Particulate matter filtration is a key component of crewed spacecraft cabin ventilation and life support system (LSS) architectures. The basic particulate matter filtration functional requirements as they relate to an exploration vehicle LSS architecture are presented. Particulate matter filtration concepts are reviewed and design considerations are discussed. A concept for a particulate matter filtration architecture suitable for exploration missions is presented. The conceptual architecture considers the results from developmental work and incorporates best practice design considerations.

  12. EVALUATION OF TRAINING AND METHODOLOGICAL SUPPORT OF UNIVERSITY COURSES (in Russian)

    Directory of Open Access Journals (Sweden)

    Natalia BELKINA

    2012-04-01

    The quality of teaching at a higher education institution certainly depends on the integrity and quality of its training and methodological support. However, in order to improve this quality, it is necessary to have a sound methodology for evaluating such support. This article contains a list of recommended university teaching course materials, criteria for evaluating their separate components, and an approach to calculating the quality levels of the separate components and of the teaching course materials as a whole.

  13. Methodology and Supporting Toolset Advancing Embedded Systems Quality

    DEFF Research Database (Denmark)

    Berger, Michael Stübert; Soler, José; Brewka, Lukasz Jerzy

    2013-01-01

    Software quality is of primary importance in the development of embedded systems that are often used in safety-critical applications. Moreover, as the life cycle of embedded products becomes increasingly tighter, productivity and quality are simultaneously required and closely interrelated towards delivering competitive products. In this context, the MODUS (Methodology and supporting toolset advancing embedded systems quality) project aims to provide a pragmatic and viable solution that will allow SMEs to substantially improve their positioning in the embedded-systems development market. This paper will describe the MODUS project with focus on the technical methodologies that will be developed, advancing embedded system quality.

  14. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking, such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, while the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  15. Post-Sale Customer Support Methodology in the TQM System

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Elizabeta Mitreva

    2014-06-01

    In this paper a survey of the activities in the post-sale period of the product is made and, based on the analysis of the results, a methodology that managers could use to design and implement a system of total quality management has been created. The implementation of this methodology is carried out in a simplified way and in less time, without having to study and deepen new knowledge of internal standardization, statistical process control, cost analysis and optimization of business processes. The purpose of this paper is to lay a good foundation for Macedonian companies in their post-sale product activities, to help them understand the philosophy of TQM (Total Quality Management) and the benefits to be achieved by implementing the system, and to set strategic directions for success. These activities begin by identifying the wishes and needs of customers/users, reengineering business processes for sales support, and ensuring the satisfaction of employees and all stakeholders. As a result of the implementation of this methodology in practice, improved competitiveness, increased efficiency, reduction of quality costs and increased productivity are noted. The methodology proposed in this paper brings together all the activities in the spiral of quality in a company that deals with post-sales support. Due to the necessity of the flow of information about quality across the entire enterprise, an information system is designed according to the QC-CEPyramid model in several steps.

  17. Methodology development for estimating support behavior of spacer grid spring in core

    International Nuclear Information System (INIS)

    Yoon, Kyung Ho; Kang, Heung Seok; Kim, Hyung Kyu; Song, Kee Nam

    1998-04-01

    The fuel rod (FR) support behavior changes during operation as a result of effects such as clad creep-down, spring force relaxation due to irradiation, and irradiation growth of spacer straps with time or increasing burnup. The FR support behavior is closely associated with FR damage due to fretting; therefore, analysis of the FR support behavior is normally required to minimize the damage. The characteristics of the parameters that affect the FR support behavior, and the methodology developed for estimating the FR support behavior in the reactor core, are described in this work. The FR support condition for the KOFA (KOrean Fuel Assembly) fuel has been analyzed by this method, and the results of the analysis show that fuel failure due to fuel rod fretting wear is closely related to the support behavior of the FR in the core. Therefore, the present methodology seems to be useful for estimating the actual FR support condition. In addition, optimization seems to be a reliable tool for establishing the optimal support condition on the basis of these results. (author). 15 refs., 3 tabs., 26 figs

  18. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
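    The three-axis structure described above (coverage, estimation, timing) can be represented as a small data structure. A hypothetical sketch only: the axis names come from the abstract, but the axis values and the cell content below are invented for illustration.

```python
# Hypothetical sketch of the paper's three-dimensional LCC matrix: the axes
# (coverage, estimation, timing) are named in the abstract; the axis values
# and the example estimate are invented placeholders.

from itertools import product

coverage = ["hardware", "software", "operations"]      # hypothetical axis values
estimation = ["parametric", "analogy", "expert"]       # methods named in the abstract
timing = ["development", "deployment", "maintenance"]  # hypothetical phases

# Initialize an empty cost-estimate cell for every axis combination.
lcc_matrix = {cell: None for cell in product(coverage, estimation, timing)}

# Record one hypothetical estimate (say, in millions of dollars).
lcc_matrix[("software", "parametric", "development")] = 1.2

print(len(lcc_matrix))  # 27 cells
```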

  19. Development of methodologies for coupled water-hammer analysis of piping systems and supports

    International Nuclear Information System (INIS)

    Kamil, H.; Gantayat, A.; Attia, A.; Goulding, H.

    1983-01-01

    The paper presents the results of an investigation on the development of methodologies for coupled water-hammer analyses. The study was conducted because the present analytical methods for calculation of loads on piping systems and supports resulting from water-hammer phenomena are overly conservative. This is mainly because the methods do not usually include interaction between the fluid and the piping and thus predict high loads on piping systems and supports. The objective of the investigation presented in this paper was to develop methodologies for coupled water-hammer analyses, including fluid-structure interaction effects, to be able to obtain realistic loads on piping systems and supports, resulting in production of more economical designs. (orig./RW)

  20. CONTENTS OF THE METHODOLOGICAL AND TECHNOLOGICAL SUPPORT OF THE EDUCATION QUALITY MANAGEMENT INFORMATION SYSTEM FOR FUTURE ECONOMISTS

    Directory of Open Access Journals (Sweden)

    Kostiantyn S. Khoruzhyi

    2014-12-01

    In this article, the content and nature of organizational activities in the scope of methodological and technological support of the education quality management information system (EQMIS) for future economists are described. The content of the organizational activities for the implementation of methodological and technological support of EQMIS for future economists includes four stages (preparatory, instructional/adaptational, methodological/basic, and experimental/evaluational) and contains a set of methodological and technological measures for each stage of the EQMIS implementation. A study of the pedagogical impact of the proposed methodology of using EQMIS in the formation of the professional competence of economics students was also conducted. The main stages, methods and sequence of implementation arrangements for the methodological and technological support of EQMIS are defined.

  1. A Performance-Based Technology Assessment Methodology to Support DoD Acquisition

    National Research Council Canada - National Science Library

    Mahafza, Sherry; Componation, Paul; Tippett, Donald

    2005-01-01

    ... This methodology is referred to as the Technology Performance Risk Index (TPRI). The TPRI can track technology readiness through a life cycle, or it can be used at a specific time to support a particular system milestone decision...

  2. Sampling and analytical methodologies for energy dispersive X-ray fluorescence analysis of airborne particulate matter

    International Nuclear Information System (INIS)

    1993-01-01

    The present document represents an attempt to summarize the most important features of the different forms of ED-XRF as applied to the analysis of airborne particulate matter. It is intended to serve as a set of guidelines for use by participants in the IAEA's own programmes, and other scientists, who are not yet fully experienced in the application of ED-XRF to airborne particulate samples, and who wish either to make a start on using this technique or to improve their existing procedures. The methodologies for sampling described in this document are of rather general applicability. Emphasis is also placed on the sources of errors affecting the sampling of airborne particulate matter. The analytical part of the document describes the different forms of ED-XRF and their potential applications. Spectrum evaluation, a key step in X-ray spectrometry, is covered in depth, including discussion on several calibration and peak fitting techniques and computer programs especially designed for this purpose. 148 refs, 25 figs, 13 tabs

  3. Methodological Challenges in Studies Comparing Prehospital Advanced Life Support with Basic Life Support.

    Science.gov (United States)

    Li, Timmy; Jones, Courtney M C; Shah, Manish N; Cushman, Jeremy T; Jusko, Todd A

    2017-08-01

    Determining the most appropriate level of care for patients in the prehospital setting during medical emergencies is essential. A large body of literature suggests that, compared with Basic Life Support (BLS) care, Advanced Life Support (ALS) care is not associated with increased patient survival or decreased mortality. The purpose of this special report is to synthesize the literature to identify common study design and analytic challenges in research studies that examine the effect of ALS, compared to BLS, on patient outcomes. The challenges discussed in this report include: (1) choice of outcome measure; (2) logistic regression modeling of common outcomes; (3) baseline differences between study groups (confounding); (4) inappropriate statistical adjustment; and (5) inclusion of patients who are no longer at risk for the outcome. These challenges may affect the results of studies, and thus, conclusions of studies regarding the effect of level of prehospital care on patient outcomes should require cautious interpretation. Specific alternatives for avoiding these challenges are presented. Li T , Jones CMC , Shah MN , Cushman JT , Jusko TA . Methodological challenges in studies comparing prehospital Advanced Life Support with Basic Life Support. Prehosp Disaster Med. 2017;32(4):444-450.
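    Challenge (2) above, logistic regression modeling of common outcomes, can be made concrete with a small worked example: when the outcome is common, the odds ratio reported by a logistic model overstates the risk ratio. The 2x2 counts below are hypothetical, not drawn from any of the studies reviewed.

```python
# Worked example of challenge (2): with a common outcome, the odds ratio
# diverges from (overstates) the risk ratio. Hypothetical counts: 80 of
# 100 ALS patients and 60 of 100 BLS patients experience the outcome.

def risk_ratio(a, n1, b, n2):
    """Risk ratio from event counts a/n1 (exposed) and b/n2 (unexposed)."""
    return (a / n1) / (b / n2)

def odds_ratio(a, n1, b, n2):
    """Odds ratio from the same 2x2 table."""
    return (a / (n1 - a)) / (b / (n2 - b))

rr = risk_ratio(80, 100, 60, 100)
orr = odds_ratio(80, 100, 60, 100)
print(round(rr, 3), round(orr, 3))  # 1.333 2.667
```

    For a rare outcome the two measures nearly coincide; it is the common-outcome case, typical of prehospital survival studies, that makes the choice of effect measure consequential.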

  4. Sources of particulate matter components in the Athabasca oil sands region: investigation through a comparison of trace element measurement methodologies

    Science.gov (United States)

    Phillips-Smith, Catherine; Jeong, Cheol-Heon; Healy, Robert M.; Dabek-Zlotorzynska, Ewa; Celo, Valbona; Brook, Jeffrey R.; Evans, Greg

    2017-08-01

    The province of Alberta, Canada, is home to three oil sands regions which, combined, contain the third largest deposit of oil in the world. Of these, the Athabasca oil sands region is the largest. As part of Environment and Climate Change Canada's program in support of the Joint Canada-Alberta Implementation Plan for Oil Sands Monitoring program, concentrations of trace elements in PM2.5 (particulate matter smaller than 2.5 µm in diameter) were measured through two campaigns that involved different methodologies: a long-term filter campaign and a short-term intensive campaign. In the long-term campaign, 24 h filter samples were collected once every 6 days over a 2-year period (December 2010-November 2012) at three air monitoring stations in the regional municipality of Wood Buffalo. For the intensive campaign (August 2013), hourly measurements were made with an online instrument at one air monitoring station; daily filter samples were also collected. The hourly and 24 h filter data were analyzed individually using positive matrix factorization. Seven emission sources of PM2.5 trace elements were thereby identified: two types of upgrader emissions, soil, haul road dust, biomass burning, and two sources of mixed origin. The upgrader emissions, soil, and haul road dust sources were identified through both methodologies, and both methodologies identified a mixed source, but these exhibited more differences than similarities. The second upgrader emissions and biomass burning sources were only resolved by the hourly and filter methodologies, respectively. The similarity of the receptor modeling results from the two methodologies provided reassurance as to the identity of the sources. Overall, much of the PM2.5-related trace elements were found to be anthropogenic, or at least to be aerosolized through anthropogenic activities. These emissions may in part explain the previously reported higher levels of trace elements in snow, water, and biota samples collected
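    Positive matrix factorization, as used in this study, is normally run with dedicated receptor-modeling software. The numpy sketch below illustrates only the underlying idea, factoring a samples-by-elements concentration matrix X into non-negative source contributions G and source profiles F (X ≈ G·F), using classic multiplicative updates rather than the PMF algorithm itself; the data are synthetic.

```python
import numpy as np

# Minimal non-negative factorization sketch (Lee-Seung multiplicative
# updates), illustrating the idea behind PMF: X (samples x elements) is
# approximated by G (samples x sources) times F (sources x elements),
# with all entries kept non-negative. Not the PMF algorithm proper.

def nmf(X, n_factors, n_iter=500, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    n, m = X.shape
    G = rng.random((n, n_factors)) + eps
    F = rng.random((n_factors, m)) + eps
    for _ in range(n_iter):
        # Multiplicative updates preserve non-negativity by construction.
        F *= (G.T @ X) / (G.T @ G @ F + eps)
        G *= (X @ F.T) / (G @ F @ F.T + eps)
    return G, F

# Synthetic "concentrations": 20 samples x 6 elements from 2 hidden sources.
rng = np.random.default_rng(1)
X = rng.random((20, 2)) @ rng.random((2, 6))
G, F = nmf(X, n_factors=2)
print(float(np.linalg.norm(X - G @ F) / np.linalg.norm(X)))  # small relative error
```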

  6. Investigation of optimal seismic design methodology for piping systems supported by elasto-plastic dampers. Part 1. Evaluation functions

    International Nuclear Information System (INIS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

    2009-01-01

    In this study, an optimal seismic design methodology that can consider the structural integrity of not only the piping systems but also the elasto-plastic supporting devices is developed. This methodology employs a genetic algorithm and can search for optimal conditions such as the location, capacity, and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Four types of evaluation functions are considered. It is found that the proposed optimal seismic design methodology is very effective and can be applied to the actual seismic design of piping systems supported by elasto-plastic dampers. The effectiveness of the evaluation functions is also clarified. (author)
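A genetic algorithm of this kind evolves candidate support configurations against an evaluation function. A toy sketch, assuming a one-dimensional piping model with 20 candidate support nodes and a synthetic cost that rewards evenly spaced supports (the actual study evaluates the structural response of the piping and the elasto-plastic dampers instead):

```python
import random

random.seed(42)
N_NODES = 20      # candidate support locations along the piping model (assumed)
N_SUPPORTS = 4    # number of dampers to place

def response(locations):
    """Toy stand-in for the seismic-response evaluation function:
    penalizes unevenly spaced supports (real use: FE analysis + damper model)."""
    locs = sorted(locations)
    gaps = [b - a for a, b in zip(locs, locs[1:])]
    return sum((g - N_NODES / N_SUPPORTS) ** 2 for g in gaps)

def crossover(a, b):
    cut = random.randrange(1, N_SUPPORTS)
    child = list(dict.fromkeys(a[:cut] + b))[:N_SUPPORTS]  # keep locations unique
    while len(child) < N_SUPPORTS:
        child.append(random.randrange(N_NODES))
    return child

def mutate(ind, rate=0.2):
    return [random.randrange(N_NODES) if random.random() < rate else g for g in ind]

pop = [random.sample(range(N_NODES), N_SUPPORTS) for _ in range(30)]
for _ in range(60):                      # generations
    pop.sort(key=response)
    elite = pop[:10]                     # keep the best configurations
    pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in range(20)]
best = min(pop, key=response)
print(sorted(best), response(best))
```

The chromosome here encodes only locations; the study's encoding also carries damper capacity and stiffness per support.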

  7. Mindset Matters: Supporting Student Persistence Through The Developmental Mathematics Pipeline

    OpenAIRE

    Kiser, Tracey Nicole

    2016-01-01

    Abstract of the Dissertation: Mindset Matters: Supporting Student Persistence Through The Developmental Mathematics Pipeline, by Tracey Nicole Kiser, Doctor of Education in Teaching and Learning, University of California, San Diego, 2016. Christopher P. Halter, Chair. Developmental mathematics is one of the most challenging leaks in the mathematics K-20 pipeline. Few students enter two-year colleges prepared to successfully engage in college-level mathematics classes. Many of the students who place into devel...

  8. Improving life cycle assessment methodology for the application of decision support

    DEFF Research Database (Denmark)

    Herrmann, Ivan Tengbjerg

    for the application of decision support and evaluation of uncertainty in LCA. From a decision maker’s (DM’s) point of view there are at least three main “illness” factors influencing the quality of the information that the DM uses for making decisions. The factors are not independent of each other, but it seems......) refrain from making a decision based on an LCA and thus support a decision on other parameters than the LCA environmental parameters. Conversely, it may in some decision support contexts be acceptable to base a decision on highly uncertain information. This all depends on the specific decision support...... the different steps. A deterioration of the quality in each step is likely to accumulate through the statistical value chain in terms of increased uncertainty and bias. Ultimately this can make final decision support problematic. The "Law of large numbers" (LLN) is the methodological tool/probability theory...
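The law of large numbers invoked above states that the mean of independent samples converges to the expected value as the sample size grows, which is what makes aggregated LCA estimates more trustworthy than single observations. A quick numeric illustration:

```python
import random
import statistics

random.seed(7)
true_mean = 0.5   # mean of Uniform(0, 1)
deviations = {}
for n in (10, 1_000, 100_000):
    sample = [random.random() for _ in range(n)]
    deviations[n] = abs(statistics.fmean(sample) - true_mean)
    print(n, round(deviations[n], 4))   # deviation shrinks as n grows
```

The catch noted in the abstract is that the LLN only controls random error; systematic bias in the statistical value chain does not average out.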

  9. Design Methodology of a Sensor Network Architecture Supporting Urgent Information and Its Evaluation

    Science.gov (United States)

    Kawai, Tetsuya; Wakamiya, Naoki; Murata, Masayuki

    Wireless sensor networks are expected to become an important social infrastructure which helps make our lives safe, secure, and comfortable. In this paper, we propose a design methodology for an architecture for fast and reliable transmission of urgent information in wireless sensor networks. In this methodology, instead of establishing a single complicated monolithic mechanism, several simple and fully distributed control mechanisms, which function at different spatial and temporal levels, are incorporated on each node. These mechanisms work autonomously and independently, responding to the surrounding situation. We also show an example of a network architecture designed following the methodology. We evaluated the performance of the architecture by extensive simulation and practical experiments, and our claim was supported by the results of these experiments.

  10. Catalytic combustion of particulate matter: Catalysts of alkaline nitrates supported on hydrous zirconium

    International Nuclear Information System (INIS)

    Galdeano, N.F.; Carrascull, A.L.; Ponzi, M.I.; Lick, I.D.; Ponzi, E.N.

    2004-01-01

    In order to explore a method to remove particulate matter, catalysts of different alkaline nitrates (Li, K, and Cs) supported on hydrous zirconium were prepared by the incipient wetness method and tested as catalysts for particulate matter combustion. The catalytic activity was determined using the temperature-programmed oxidation (TPO) technique with two setups: a thermogravimetric reactor and a fixed-bed reactor. In the first case the particulate matter/catalyst mixture was carefully milled in a mortar (tight contact), while in the second case more realistic operating conditions were used and the particulate matter/catalyst mixture was made with a spatula (loose contact). All prepared catalysts showed good activity for particulate matter combustion. The cesium catalyst presented the highest activity, decreasing the combustion temperature by 200 to 250 deg. C with respect to combustion without a catalyst. The catalyst with lithium nitrate became active at a temperature higher than its melting point, and the same occurred with the potassium catalyst. This was not the case for the catalyst containing cesium nitrate, which melts at 407 deg. C but became active from 350 deg. C

  11. Learning to Support Learning Together: An Experience with the Soft Systems Methodology

    Science.gov (United States)

    Sanchez, Adolfo; Mejia, Andres

    2008-01-01

    An action research approach called soft systems methodology (SSM) was used to foster organisational learning in a school regarding the role of the learning support department within the school and its relation with the normal teaching-learning activities. From an initial situation of lack of coordination as well as mutual misunderstanding and…

  12. Evaluating electronic performance support systems: A methodology focused on future use-in-practice

    NARCIS (Netherlands)

    Collis, Betty; Verwijs, C.A.

    1995-01-01

    Electronic performance support systems, as an emerging type of software environment, present many new challenges in relation to effective evaluation. In this paper, a global approach to a 'usage-orientated' evaluation methodology for software products is presented, followed by a specific example of

  13. An ultrasonic methodology for in-service inspection of shell weld of core support structure in a sodium cooled fast reactor

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Anish, E-mail: anish@igcar.gov.in; Rajkumar, K.V.; Sharma, Govind K.; Dhayalan, R.; Jayakumar, T.

    2015-02-15

    Highlights: • We demonstrate a novel ultrasonic methodology for in-service inspection of the shell weld of the core support structure in a sodium cooled fast breeder reactor. • The methodology comprises inspection of the shell weld, immersed in sodium, from the outside surface of the main vessel using ultrasonic guided waves. • The formation and propagation of guided wave modes are validated by finite element simulation of the inspection methodology. • A defect down to 20% of the 30 mm thick wall (∼6 mm) in the shell weld can be detected reliably using the developed methodology. - Abstract: The paper presents a novel ultrasonic methodology developed for in-service inspection (ISI) of the shell weld of the core support structure of the main vessel of the 500 MWe prototype fast breeder reactor (PFBR). The methodology comprises inspection of the shell weld, immersed in sodium, from the outside surface of the main vessel using a normal beam longitudinal wave ultrasonic transducer. Because of the curvature in the knuckle region of the main vessel, the normal beam longitudinal wave enters the support shell plate at an angle and forms guided waves by mode conversion and multiple reflections from the boundaries of the shell plate. Hence, this methodology can be used to detect defects in the shell weld of the core support structure. The successful demonstration of the methodology on a mock-up sector made of stainless steel indicated that an artificial defect down to 20% of the 30 mm thick wall (∼6 mm) in the shell weld can be detected reliably.

  14. Impact of renewables on electricity markets – Do support schemes matter?

    International Nuclear Information System (INIS)

    Winkler, Jenny; Gaio, Alberto; Pfluger, Benjamin; Ragwitz, Mario

    2016-01-01

    Rising renewable shares influence electricity markets in several ways: among others, average market prices are reduced and price volatility increases. Therefore, the “missing money problem” in energy-only electricity markets is more likely to occur in systems with high renewable shares. Nevertheless, renewables are supported in many countries due to their expected benefits. The kind of support instrument can however influence the degree to which renewables influence the market. While fixed feed-in tariffs lead to higher market impacts, more market-oriented support schemes such as market premiums, quota systems and capacity-based payments decrease the extent to which markets are affected. This paper analyzes the market impacts of different support schemes. For this purpose, a new module is added to an existing bottom-up simulation model of the electricity market. In addition, different degrees of flexibility in the electricity system are considered. A case study for Germany is used to derive policy recommendations regarding the choice of support scheme. - Highlights: •Renewable support schemes matter regarding the impact on electricity markets. •Market-oriented support schemes reduce the impact on electricity markets. •More flexible electricity systems reduce the need for market participation. •Sliding premiums combine market integration with a productive risk allocation.

  15. Methodological Aspects of In Vitro Assessment of Bio-accessible Risk Element Pool in Urban Particulate Matter

    Czech Academy of Sciences Publication Activity Database

    Sysalová, J.; Száková, J.; Tremlová, J.; Kašparovská, Kateřina; Kotlík, B.; Tlustoš, P.; Svoboda, Petr

    2014-01-01

    Vol. 161, No. 2 (2014), pp. 216-222. ISSN 0163-4984. Grant - others: GA ČR(CZ) GA521/09/1150; GA ČR(CZ) GAP503/12/0682. Program: GA; GA. Institutional support: RVO:67985823. Keywords: risk elements * urban particulate matter * in vitro tests * bio-accessibility. Subject RIV: CB - Analytical Chemistry, Separation. Impact factor: 1.748, year: 2014

  16. Methodology supporting production control in a foundry applying modern DISAMATIC molding line

    Directory of Open Access Journals (Sweden)

    Sika Robert

    2017-01-01

    Full Text Available The paper presents a methodology for production control using statistical methods in foundry conditions with an automatic DISAMATIC molding line. The authors were inspired by many years of experience in implementing IT tools for foundries. They noticed a lack of basic IT tools dedicated to specific casting processes that would greatly facilitate process oversight and thus improve the quality of manufactured products. More and more systems are installed in the ERP or CAx areas, but they integrate processes only partially, mainly in the areas of technology design and business management, such as finance and controlling. Monitoring of foundry processes can generate a large amount of process-related data. This is particularly noticeable in automated processes. An example is the modern DISAMATIC molding line, which integrates several casting processes, such as mold preparation, assembly, pouring, and shake-out. The authors proposed a methodology that supports the control of the above-mentioned foundry processes using statistical methods. Such an approach can be successfully used, for example, during periodic external audits. The methodology was implemented in the innovative DISAM-ProdC computer tool.
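Statistical production control of the kind described typically rests on control charts. A minimal Shewhart X̄-chart sketch on invented per-mold measurements (in practice the control limits are estimated from an in-control reference period, not from data that includes the drift):

```python
import statistics

# Hypothetical subgroups of 5 sand-compactability readings from a molding line
# (all values invented; the last subgroup simulates a process drift)
subgroups = [
    [38.1, 39.0, 38.6, 38.4, 38.9],
    [38.5, 38.2, 39.1, 38.8, 38.6],
    [38.3, 38.7, 38.5, 38.9, 38.5],
    [38.8, 38.4, 38.6, 38.7, 38.6],
    [39.5, 39.7, 39.6, 39.4, 39.8],   # drifted subgroup
]
A2 = 0.577  # standard X-bar chart constant for subgroup size n = 5

means = [statistics.fmean(g) for g in subgroups]
ranges = [max(g) - min(g) for g in subgroups]
center = statistics.fmean(means)            # grand mean (center line)
ucl = center + A2 * statistics.fmean(ranges)  # upper control limit
lcl = center - A2 * statistics.fmean(ranges)  # lower control limit

flags = ["OUT" if not (lcl <= m <= ucl) else "ok" for m in means]
for i, (m, f) in enumerate(zip(means, flags)):
    print(f"subgroup {i}: mean={m:.2f} [{f}]")
```

Only the drifted subgroup falls outside the limits, which is the signal an operator or auditor would act on.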

  17. Supporting the Future Total Force: A Methodology for Evaluating Potential Air National Guard Mission Assignments

    National Research Council Canada - National Science Library

    Lynch, Kristin F; Drew, John G; Sleeper, Sally; Williams, William A; Masters, James M; Luangkesorn, Louis; Tripp, Robert S; Lichter, Dahlia S; Roll, Charles R

    2007-01-01

    ... trained, highly experienced personnel with no aircraft to operate and support. The authors develop a methodology to evaluate missions that could be transferred from the active component to the ANG without significant cost to the total force...

  18. An Improved Cambridge Filter Pad Extraction Methodology to Obtain More Accurate Water and “Tar” Values: In Situ Cambridge Filter Pad Extraction Methodology

    Directory of Open Access Journals (Sweden)

    Ghosh David

    2014-07-01

    Full Text Available Previous investigations by others and internal investigations at Philip Morris International (PMI) have shown that the standard trapping and extraction procedure used for conventional cigarettes, defined in the International Standard ISO 4387 (Cigarettes -- Determination of total and nicotine-free dry particulate matter using a routine analytical smoking machine), is not suitable for high-water content aerosols. Errors occur because of water losses during the opening of the Cambridge filter pad holder to remove the filter pad and during the manual handling of the filter pad, and because the commercially available filter pad holder, which is constructed of plastic, may adsorb water. This results in inaccurate values for the water content, and erroneous and overestimated values for Nicotine Free Dry Particulate Matter (NFDPM). A modified 44 mm Cambridge filter pad holder and extraction equipment supporting an in situ extraction methodology have been developed and tested. The principle of the in situ extraction methodology is to avoid the above-mentioned water losses by extracting the loaded filter pad while it is kept in the Cambridge filter pad holder, which is hermetically sealed by two caps. This is achieved by flushing the extraction solvent numerous times through the hermetically sealed Cambridge filter pad holder by means of an in situ extractor. The in situ methodology showed a significantly more complete water recovery, resulting in more accurate NFDPM values for high-water content aerosols compared to the standard ISO methodology. The work presented in this publication demonstrates that the in situ extraction methodology applies to a wider range of smoking products and smoking regimens, whereas the standard ISO methodology only applies to a limited range of smoking products and smoking regimens, e.g., conventional cigarettes smoked under the ISO smoking regimen. In cases where a comparison of yields between the PMI HTP and

  19. Solid Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Supported by a generous quantity of full-color illustrations and interesting sidebars, Solid Matter introduces the basic characteristics and properties of solid matter. It briefly describes the cosmic connection of the elements, leading readers through several key events in human prehistory that resulted in more advanced uses of matter in the solid state. Chapters include: -Solid Matter: An Initial Perspective. -Physical Behavior of Matter. -The Gravity of Matter. -Fundamentals of Materials Science. -Rocks and Minerals. -Metals. -Building Materials. -Carbon: Earth's Most Versatile Element. -S

  20. 3 + 1-dimensional thin shell wormhole with deformed throat can be supported by normal matter

    Energy Technology Data Exchange (ETDEWEB)

    Mazharimousavi, S.H.; Halilsoy, M. [Eastern Mediterranean University, Department of Physics, Gazimagusa (Turkey)

    2015-06-15

    From the physics standpoint, the exotic matter problem is a major difficulty in thin shell wormholes (TSWs) with spherical/cylindrical throat topologies. We aim to circumvent this handicap by considering angle-dependent throats in 3 + 1 dimensions. By considering the throat of the TSW to be deformed spherical, i.e., a function of θ and φ, we present general conditions to be satisfied by the shape of the throat in order for the wormhole to be supported by matter with positive density in the static reference frame. We provide particular solutions/examples for the constraint conditions. (orig.)

  1. Final report of the accident phenomenology and consequence (APAC) methodology evaluation. Spills Working Group

    Energy Technology Data Exchange (ETDEWEB)

    Brereton, S.; Shinn, J. [Lawrence Livermore National Lab., CA (United States); Hesse, D [Battelle Columbus Labs., OH (United States); Kaninich, D. [Westinghouse Savannah River Co., Aiken, SC (United States); Lazaro, M. [Argonne National Lab., IL (United States); Mubayi, V. [Brookhaven National Lab., Upton, NY (United States)

    1997-08-01

    The Spills Working Group was one of six working groups established under the Accident Phenomenology and Consequence (APAC) methodology evaluation program. The objectives of APAC were to assess methodologies available in the accident phenomenology and consequence analysis area and to evaluate their adequacy for use in preparing DOE facility safety basis documentation, such as Basis for Interim Operation (BIO), Justification for Continued Operation (JCO), Hazard Analysis Documents, and Safety Analysis Reports (SARs). Additional objectives of APAC were to identify development needs and to define standard practices to be followed in the analyses supporting facility safety basis documentation. The Spills Working Group focused on methodologies for estimating four types of spill source terms: liquid chemical spills and evaporation, pressurized liquid/gas releases, solid spills and resuspension/sublimation, and resuspension of particulate matter from liquid spills.

  2. Status of Activities to Implement a Sustainable System of MC&A Equipment and Methodological Support at Rosatom Facilities

    Energy Technology Data Exchange (ETDEWEB)

    J.D. Sanders

    2010-07-01

    Under the U.S.-Russian Material Protection, Control and Accounting (MPC&A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC&A measurement system. These efforts have resulted in the development of a MC&A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEM Working Group. The MEMS SP covers implementation of MC&A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.

  3. UPC Scaling-up methodology for Deterministic Safety Assessment and Support to Plant Operation

    Energy Technology Data Exchange (ETDEWEB)

    Martínez-Quiroga, V.; Reventós, F.; Batet, Il.

    2015-07-01

    Best Estimate codes along with the necessary nodalizations are widely used tools in nuclear engineering for both Deterministic Safety Assessment (DSA) and Support to Plant Operation and Control. In this framework, the application of quality assurance procedures to both codes and nodalizations becomes an essential step prior to any significant study. Along these lines, the present paper introduces UPC SCUP, a systematic methodology based on the extrapolation of Integral Test Facility (ITF) post-test simulations by means of scaling analyses. In that sense, SCUP fills a gap in current nodalization qualification procedures, namely the validation of NPP nodalizations for Design Basis Accident conditions. Three pillars support SCUP: judicious selection of the experimental transients, full confidence in the quality of the ITF simulations, and simplicity in justifying discrepancies that appear between ITF and NPP counterpart transients. The techniques presented include the so-called Kv-scaled calculations as well as two new approaches, "hybrid nodalizations" and "scaled-up nodalizations". These last two methods have proven very helpful in producing the required qualification and in promoting further improvements in nodalization. The study of both LSTF and PKL counterpart tests has allowed the methodology to be qualified by comparison with experimental data. Post-test simulations at different scales made it possible to define which phenomena could be well reproduced by system codes and which could not, thereby also establishing the basis for the extrapolation to an NPP-scale calculation. Furthermore, the application of the UPC SCUP methodology demonstrated that selected phenomena can be scaled up and explained between counterpart simulations by carefully considering the differences in scale and design. (Author)

  4. Supplement to a Methodology for Succession Planning for Technical Experts

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Cain, Ronald A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Agreda, Carla L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-01

    This report complements A Methodology for Succession Planning for Technical Experts (Ron Cain, Shaheen Dewji, Carla Agreda, Bernadette Kirk, July 2017), which describes a draft methodology for identifying and evaluating the loss of key technical skills at nuclear operations facilities. This report targets the methodology for identifying critical skills, and the methodology is tested through interviews with selected subject matter experts.

  5. Counting stem cells : methodological constraints

    NARCIS (Netherlands)

    Bystrykh, Leonid V.; Verovskaya, Evgenia; Zwart, Erik; Broekhuis, Mathilde; de Haan, Gerald

    The number of stem cells contributing to hematopoiesis has been a matter of debate. Many studies use retroviral tagging of stem cells to measure clonal contribution. Here we argue that methodological factors can impact such clonal analyses. Whereas early studies had low resolution, leading to

  6. Identifying Opportunities for Decision Support Systems in Support of Regional Resource Use Planning: An Approach Through Soft Systems Methodology.

    Science.gov (United States)

    Zhu; Dale

    2000-10-01

    Regional resource use planning relies on key regional stakeholder groups using and having equitable access to appropriate social, economic, and environmental information and assessment tools. Decision support systems (DSS) can improve stakeholder access to such information and analysis tools. Regional resource use planning, however, is a complex process involving multiple issues, multiple assessment criteria, multiple stakeholders, and multiple values. There is a need for an approach to DSS development that can assist in understanding and modeling complex problem situations in regional resource use so that areas where DSSs could provide effective support can be identified, and the user requirements can be well established. This paper presents an approach based on the soft systems methodology for identifying DSS opportunities for regional resource use planning, taking the Central Highlands Region of Queensland, Australia, as a case study.

  7. Status of Activities to Implement a Sustainable System of MC and A Equipment and Methodological Support at Rosatom Facilities

    International Nuclear Information System (INIS)

    Sanders, J.D.

    2010-01-01

    Under the U.S.-Russian Material Protection, Control and Accounting (MPC and A) Program, the Material Control and Accounting Measurements (MCAM) Project has supported a joint U.S.-Russian effort to coordinate improvements of the Russian MC and A measurement system. These efforts have resulted in the development of a MC and A Equipment and Methodological Support (MEMS) Strategic Plan (SP), developed by the Russian MEM Working Group. The MEMS SP covers implementation of MC and A measurement equipment, as well as the development, attestation and implementation of measurement methodologies and reference materials at the facility and industry levels. This paper provides an overview of the activities conducted under the MEMS SP, as well as a status on current efforts to develop reference materials, implement destructive and nondestructive assay measurement methodologies, and implement sample exchange, scrap and holdup measurement programs across Russian nuclear facilities.

  8. Directed Graph Methodology for Acquisition Path Analysis: a possible tool to support the state-level approach

    International Nuclear Information System (INIS)

    Vincze, Arpad; Nemeth, Andras

    2013-01-01

    According to a recent statement, the IAEA seeks to develop a more effective safeguards system to achieve greater deterrence, because deterrence of proliferation is much more effective than detection. To achieve this goal, a less predictive safeguards system is being developed based on the advanced state-level approach that is driven by all available safeguards-relevant information. The 'directed graph analysis' is recommended as a possible methodology to implement acquisition path analysis by the IAEA to support the State evaluation process. The basic methodology is simple, well established, powerful, and its adaptation to the modelling of the nuclear profile of a State requires minimum software development. Based on this methodology the material flow network model has been developed under the Hungarian Support Programme to the IAEA, which is described in detail. In the proposed model, materials in different chemical and physical form can flow through pipes representing declared processes, material transports, diversions or undeclared processes. The nodes of the network are the material types, while the edges of the network are the pipes. A state parameter (p) is assigned to each node and edge representing the probability of their existence in the State. The possible application of this model in the State-level analytical approach will be discussed and outlook for further work will be given. The paper is followed by the slides of the presentation
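The directed-graph idea above can be sketched as a small material-flow network in which each edge (a declared or undeclared process) carries an existence probability p, and an acquisition path's plausibility is the product of its edge probabilities. All node names and probability values below are invented for illustration:

```python
import itertools

# Hypothetical material-flow network: nodes are material forms, directed edges
# are processes with an assumed existence probability p
edges = {
    ("natural U", "UF6"): 0.9,   # conversion
    ("UF6", "LEU"): 0.8,         # declared enrichment
    ("UF6", "HEU"): 0.05,        # undeclared enrichment
    ("LEU", "HEU"): 0.1,         # re-enrichment
}

def path_probability(path):
    """Probability that every edge (process) along the path exists."""
    p = 1.0
    for a, b in zip(path, path[1:]):
        p *= edges.get((a, b), 0.0)
    return p

def best_path(source, target, nodes):
    """Enumerate simple paths and return the most plausible acquisition path."""
    best, best_p = None, 0.0
    inner = [n for n in nodes if n not in (source, target)]
    for r in range(len(inner) + 1):
        for mid in itertools.permutations(inner, r):
            path = (source, *mid, target)
            p = path_probability(path)
            if p > best_p:
                best, best_p = path, p
    return best, best_p

nodes = {"natural U", "UF6", "LEU", "HEU"}
path, p = best_path("natural U", "HEU", nodes)
print(path, round(p, 4))
```

Real acquisition path analysis works on far larger state-specific networks and updates the edge probabilities from all available safeguards-relevant information; brute-force enumeration is only viable at this toy scale.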

  9. Size Matters: The Link between Staff Size and Perceived Organizational Support in Early Childhood Education

    Science.gov (United States)

    Ho, Dora; Lee, Moosung; Teng, Yue

    2016-01-01

    Purpose: The purpose of this paper is to examine the relationship between staff size and perceived organizational support (POS) in early childhood education (ECE) organizations. Design/methodology/approach: A territory-wide questionnaire survey was designed to investigate the perceptions of preschool teachers in Hong Kong on four dimensions of…

  10. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  11. A Multi-Criteria Methodology to Support Public Administration Decision Making Concerning Sustainable Energy Action Plans

    Directory of Open Access Journals (Sweden)

    Chiara Novello

    2013-08-01

    Full Text Available For municipalities that have joined the Covenant of Mayors promoted by the European Commission, the Sustainable Energy Action Plan (SEAP) represents a strategic tool for achieving the greenhouse gas reductions required by 2020. As far as energy retrofit actions in the residential building stock are concerned, which in small-to-medium municipalities is responsible for more than 60% of CO2 emissions, the scenarios for intervening are normally decided on the basis of an economic (cost/performance) analysis. This type of analysis, however, does not take into account aspects that are important for small and medium-sized communities, such as social aspects, environmental impacts, local economic development, and employment. A more comprehensive and effective tool to support the choices of public administrators is multi-criteria analysis. This study proposes a methodology that integrates multi-criteria analysis in order to support Public Administration/Local Authorities in programming Sustainable Energy Action Plans with a more targeted approach to sustainability. The methodology, based on the ELECTRE III method, was applied to a medium-size municipality in the Lombardy region of Italy. The results obtained with this approach are discussed in this paper.
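ELECTRE III builds pairwise outranking relations from weighted criteria. A deliberately simplified sketch of the concordance step only (full ELECTRE III adds indifference, preference, and veto thresholds plus a discordance index); the scenarios, criteria, and weights below are invented:

```python
import numpy as np

# Hypothetical scores of 3 retrofit scenarios on 4 criteria (higher is better),
# e.g. cost-effectiveness, CO2 reduction, social impact, local employment
scores = np.array([
    [0.7, 0.5, 0.8, 0.4],   # scenario A
    [0.6, 0.9, 0.5, 0.7],   # scenario B
    [0.8, 0.4, 0.6, 0.6],   # scenario C
])
weights = np.array([0.4, 0.3, 0.2, 0.1])  # assumed criterion weights, sum to 1

def concordance(i, j):
    """Weighted share of criteria on which alternative i is at least as good as j."""
    return weights[scores[i] >= scores[j]].sum()

n = len(scores)
C = np.array([[concordance(i, j) for j in range(n)] for i in range(n)])
print(np.round(C, 2))
```

An outranking relation "i outranks j" is then asserted when C[i, j] exceeds a chosen cutoff and no criterion vetoes it, and the relations are distilled into a ranking.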

  12. Architecture-Level Exploration of Alternative Interconnection Schemes Targeting 3D FPGAs: A Software-Supported Methodology

    Directory of Open Access Journals (Sweden)

    Kostas Siozios

    2008-01-01

    Full Text Available In current reconfigurable architectures, the interconnection structures increasingly contribute more to delay and power consumption. The demand for increased clock frequencies and logic density (smaller area footprint) makes the problem even more important. Three-dimensional (3D) architectures are able to alleviate this problem by accommodating a number of functional layers, each of which might be fabricated in a different technology. However, the benefits of such integration technology have not been sufficiently explored yet. In this paper, we propose a software-supported methodology for exploring and evaluating alternative interconnection schemes for 3D FPGAs. In order to support the proposed methodology, three new CAD tools were developed (part of the 3D MEANDER Design Framework). During our exploration, we study the impact of vertical interconnection between functional layers on a number of design parameters. More specifically, the average gains in operating frequency, power consumption, and wirelength are 35%, 32%, and 13%, respectively, compared to existing 2D FPGAs with identical logic resources. We also achieve a higher utilization ratio for the vertical interconnections compared to existing approaches, by 8%, for designing 3D FPGAs, leading to cheaper and more reliable devices.

  13. Investigation of optimal seismic design methodology for piping systems supported by elasto-plastic dampers. Part 2. Applicability for seismic waves with various frequency characteristics

    International Nuclear Information System (INIS)

    Ito, Tomohiro; Michiue, Masashi; Fujita, Katsuhisa

    2010-01-01

    In this study, the applicability of a previously developed optimal seismic design methodology, which can consider the structural integrity not only of piping systems but also of elasto-plastic supporting devices, is studied for seismic waves with various frequency characteristics. This methodology employs a genetic algorithm and can search for optimal conditions such as the supporting location and the capacity and stiffness of the supporting devices. Here, a lead extrusion damper is treated as a typical elasto-plastic damper. Numerical simulations are performed using a simple piping system model. As a result, it is shown that the proposed optimal seismic design methodology is applicable to the seismic design of piping systems subjected to seismic waves with various frequency characteristics. The mechanism of optimization is also clarified. (author)
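The genetic-algorithm search described above can be sketched as follows. The chromosome encoding (one damper location plus a capacity), the toy response function, and all parameter values are illustrative assumptions, not the paper's structural model:

```python
# A minimal genetic-algorithm sketch of the search described above: each
# individual encodes one damper's location (a node index) and its capacity,
# and the fitness function is a toy stand-in for the piping-system response
# analysis (the real study evaluates structural integrity under seismic waves).
import random

random.seed(0)
N_NODES = 10  # candidate support points along the piping model

def response(loc, capacity):
    """Toy seismic response to minimise: best near node 6, capacity ~3."""
    return (loc - 6) ** 2 + (capacity - 3.0) ** 2

def evolve(pop_size=20, generations=40, mutation_rate=0.2):
    pop = [(random.randrange(N_NODES), random.uniform(0.0, 6.0))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: response(*ind))
        survivors = pop[:pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            (l1, c1), (l2, c2) = random.sample(survivors, 2)
            loc, cap = random.choice((l1, l2)), (c1 + c2) / 2.0  # crossover
            if random.random() < mutation_rate:                  # mutation
                loc = random.randrange(N_NODES)
                cap = max(0.0, cap + random.gauss(0.0, 0.5))
            children.append((loc, cap))
        pop = survivors + children
    return min(pop, key=lambda ind: response(*ind))

best_loc, best_cap = evolve()
print(best_loc, round(best_cap, 2))
```

In the actual methodology the fitness evaluation would be a dynamic response analysis of the piping model under a given seismic wave, with penalties when the stress in the pipe or the damper exceeds its allowable limit.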

  14. Topical issues of psychological research materials on matters related to extremism

    Directory of Open Access Journals (Sweden)

    Sekerazh T.N.

    2014-12-01

    Full Text Available The article deals with the methodological support of psychological and linguistic research on "extremist" materials. It presents a comprehensive psycho-linguistic approach to the examination of information materials in cases related to combating extremism and terrorism, along with certain provisions of the methodology developed by the Russian Federal Center of Judicial Examination of the Ministry of Justice of the Russian Federation. Based on an analysis of "verbal" crimes related to the criminal-legal interpretation of extremism and terrorism, the types of prohibited public communicative action are highlighted, corresponding to seven types of "extremist" values. The article outlines the key features of the psychological analysis of "extremist" materials and the stages of such research. It is shown that the complex (psycho-linguistic) approach to the study of materials of extremist orientation is scientifically sound, methodically proven, and appropriate to the needs of law enforcement, judicial and investigative practice.

  15. Optimization Of Methodological Support Of Application Tax Benefits In Regions: Practice Of Perm Region

    Directory of Open Access Journals (Sweden)

    Alexandr Ivanovich Tatarkin

    2015-03-01

    Full Text Available In the article, the problem of methodological support for regional tax benefits is reviewed. The method of tax benefit assessment adopted in Perm Region was chosen as the object of analysis because the relatively long period over which the benefits have been applied has allowed a sufficient statistical base to be built. The article investigates the reliability of the assessments of the budgetary, economic, investment and social effectiveness of applying benefits that are based on the Method. Suggestions for its improvement are formulated.

  16. Baryonic Dark Matter

    OpenAIRE

    De Paolis, F.; Jetzer, Ph.; Ingrosso, G.; Roncadelli, M.

    1997-01-01

    Reasons supporting the idea that most of the dark matter in galaxies and clusters of galaxies is baryonic are discussed. Moreover, it is argued that most of the dark matter in galactic halos should be in the form of MACHOs and cold molecular clouds.

  17. [Exploration of a quantitative methodology to characterize the retention of PM2.5 and other atmospheric particulate matter by plant leaves: taking Populus tomentosa as an example].

    Science.gov (United States)

    Zhang, Zhi-Dan; Xi, Ben-Ye; Cao, Zhi-Guo; Jia, Li-Ming

    2014-08-01

    Taking Populus tomentosa as an example, a methodology called elution-weighing-particle-size analysis (EWPA) was proposed to quantitatively evaluate the ability of plant leaves to retain fine particulate matter (PM2.5, diameter d ≤ 2.5 μm) and other atmospheric particulate matter, using a laser particle size analyzer and a balance. The method provides a direct, accurate and easily operated measurement of the mass and particle size distribution of the atmospheric particulate matter retained by plant leaves. First, a pre-experiment was conducted to test the stability of the method. After cleaning, centrifugation and drying, the particulate matter was collected and weighed, and its particle size distribution was analyzed by the laser particle size analyzer. Finally, the mass of particulate matter retained per unit leaf area and per unit stand area was derived from the leaf area and the leaf area index. The method was applied to a P. tomentosa stand in Beijing Olympic Forest Park that had not experienced rain for 27 days. The results showed that the average particle size of the atmospheric particulate matter retained by P. tomentosa was 17.8 μm, and the volume percentages of the retained PM2.5, inhalable particulate matter (PM10, d ≤ 10 μm) and total suspended particles (TSP, d ≤ 100 μm) were 13.7%, 47.2%, and 99.9%, respectively. The masses of PM2.5, PM10, TSP and total particulate matter were 8.88 × 10⁻⁶, 30.6 × 10⁻⁶, 64.7 × 10⁻⁶ and 64.8 × 10⁻⁶ g·cm⁻², respectively. The retention of PM2.5, PM10, TSP and total particulate matter by the P. tomentosa stand amounted to 0.963, 3.32, 7.01 and 7.02 kg·hm⁻², respectively.
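The final translation step, from mass per unit leaf area to stand-level retention, is simple arithmetic. A back-of-the-envelope sketch that reproduces the reported PM2.5 figures; the leaf area index is back-calculated from those figures, since the abstract does not state it:

```python
# Per-leaf-area masses follow from the total retained mass and the volume
# fractions (assuming a uniform particle density across size classes), and
# stand-level retention scales with the leaf area index (LAI). The LAI used
# here is inferred from the reported numbers, not stated in the abstract.

TOTAL_MASS = 64.8e-6                       # g per cm^2 of leaf, all particles
VOLUME_FRACTIONS = {"PM2.5": 0.137, "PM10": 0.472, "TSP": 0.999}

per_leaf_area = {k: TOTAL_MASS * f for k, f in VOLUME_FRACTIONS.items()}

CM2_PER_HM2 = 1e8                          # 1 hm^2 = 10^8 cm^2
LAI = 1.083                                # assumed leaf area per ground area

stand_kg_per_hm2 = {k: m * CM2_PER_HM2 / 1e3 * LAI
                    for k, m in per_leaf_area.items()}

print(round(per_leaf_area["PM2.5"] * 1e6, 2))   # in units of 10^-6 g/cm^2
print(round(stand_kg_per_hm2["PM2.5"], 2))
```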

  18. Assessment of methodologies for radioactive waste management

    International Nuclear Information System (INIS)

    Hoos, I.R.

    1978-01-01

    No quantitative methodology is adequate to encompass and assess all the risks, and no risk/benefit calculation is fine-tuned enough to supply decision-makers with the full range of dimensions involved. Quality assurance cannot be conceived in terms of systems design alone, but must be maintained vigilantly and with integrity throughout the process. The responsibility of the NRC is fairly well established with respect to overall reactor safety; with respect to the management of radioactive wastes, its mission is not yet so clearly delineated. Herein lies a challenge and an opportunity. Where the known quantitative methodologies are restrictive and likely to have a negative feedback effect on authority and public support, a broader lens and a bolder thrust are called for. The cozy cocoon of figures ultimately protects no one. The Commission, having acknowledged that the management of radioactive wastes is not merely a technological matter, can now take the socially responsible position of exploring as fully, and confronting as candidly as possible, the total range of dimensions involved. Paradoxically, it is Charles J. Hitch, intellectual progenitor of the methodology, who observes that we may be missing the meaning of his message by relying too heavily on quantitative analysis and thus defining our task too narrowly. We live in a closed system, in which science and technology, politics and economics, and, above all, social and human elements interact, sometimes to create the problems, sometimes to articulate the questions, and sometimes to find viable solutions

  19. Extração de matéria orgânica aquática por abaixamento de temperatura: uma metodologia alternativa para manter a identidade da amostra Extraction of aquatic organic matter by temperature decreasing: an alternative methodology to keep the original sample characteristics

    Directory of Open Access Journals (Sweden)

    Rosana N. H. Martins de Almeida

    2003-03-01

    Full Text Available In this work, an alternative methodology was developed for the separation of aquatic organic matter (AOM) present in natural river waters. The process is based on lowering the temperature of the aqueous sample under controlled conditions that provoke freezing of the sample and separation of a dark extract, not frozen and rich in organic matter. The results showed that the rate of temperature decrease strongly influences the relative recovery of organic carbon, the enrichment, and the separation time of the organic matter present in the water samples. Elemental composition, infrared spectra and thermal analysis results showed that the alternative methodology is as little aggressive as possible in the attempt to maintain the integrity of the sample.

  20. Defluoridation of water using activated alumina in presence of natural organic matter via response surface methodology.

    Science.gov (United States)

    Samarghandi, Mohammad Reza; Khiadani, Mehdi; Foroughi, Maryam; Zolghadr Nasab, Hasan

    2016-01-01

    Adsorption onto activated alumina is considered one of the most widely practiced methods for defluoridation of freshwater. This study was conducted, therefore, to investigate the effect of natural organic matter (NOM) on the removal of fluoride by activated alumina using response surface methodology; to the authors' knowledge, this has not been previously investigated. Physico-chemical characterization of the alumina was carried out by scanning electron microscopy (SEM), Brunauer-Emmett-Teller (BET) analysis, Fourier transform infrared spectroscopy (FTIR), X-ray fluorescence (XRF), and X-ray diffraction (XRD). Response surface methodology (RSM) was applied to evaluate the single and combined effects of the independent variables, namely the initial concentration of fluoride, NOM concentration, and pH, on the process. The results revealed that while the presence of NOM and an increase of pH enhance fluoride adsorption on the activated alumina, the initial concentration of fluoride has an adverse effect on the efficiency. The experimental data were analyzed and found to be accurately and reliably fitted by a second-order polynomial model. Under the optimum removal conditions (fluoride concentration 20 mg/L, NOM concentration 20 mg/L, and pH 7), with a desirability value of 0.93 and a fluoride removal efficiency of 80.6%, no significant difference was noticed with respect to the previously reported affinity sequence of co-existing ions to activated alumina for fluoride removal. Moreover, the aluminum residual was found to be below the value recommended by the drinking water guideline. The increase of fluoride adsorption on the activated alumina as NOM concentration increases could be due to complexation between fluoride and adsorbed NOM.
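The second-order polynomial model referred to above has the general form y = β₀ + Σβᵢxᵢ + Σβᵢᵢxᵢ² + Σβᵢⱼxᵢxⱼ. A sketch of fitting such a model by least squares, on synthetic data spanning factor ranges similar to the study's three variables; none of the numbers are from the paper:

```python
# An illustrative sketch of fitting the second-order (quadratic) response-
# surface model for three factors (fluoride conc., NOM conc., pH). The data
# and coefficients are synthetic, generated here only to show the fitting step.
import numpy as np

rng = np.random.default_rng(1)

def quadratic_terms(X):
    """Expand 3 factors into the 10 terms of a full quadratic model."""
    x1, x2, x3 = X.T
    return np.column_stack([np.ones(len(X)), x1, x2, x3,
                            x1**2, x2**2, x3**2,
                            x1*x2, x1*x3, x2*x3])

# 30 synthetic runs over plausible factor ranges (F: 10-30, NOM: 0-20, pH: 3-11)
X = rng.uniform([10.0, 0.0, 3.0], [30.0, 20.0, 11.0], size=(30, 3))
true_beta = np.array([20.0, -1.2, 0.8, 6.0, 0.0, 0.0, -0.35, 0.02, 0.0, 0.0])
y = quadratic_terms(X) @ true_beta         # noiseless response for the demo

beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)
print(np.round(beta[:4], 3))
```

A Box-Behnken design would replace the random factor settings with the standard three-level design points; the least-squares fitting step is the same.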

  1. EVALUATION OF THE GRAI INTEGRATED METHODOLOGY AND THE IMAGIM SUPPORTWARE

    Directory of Open Access Journals (Sweden)

    J.M.C. Reid

    2012-01-01

    Full Text Available This paper describes the GRAI Integrated Methodology and identifies the need for computer tools to support enterprise modelling, design and integration. The IMAGIM tool is then evaluated in terms of its ability to support the GRAI Integrated Methodology. The GRAI Integrated Methodology is an Enterprise Integration methodology developed to support the design of CIM systems; it consists of the GRAI model and a structured approach. The latest addition to the methodology is the IMAGIM software tool, developed by the GRAI research group for the specific purpose of supporting the methodology.

  2. A Performance Tuning Methodology with Compiler Support

    Directory of Open Access Journals (Sweden)

    Oscar Hernandez

    2008-01-01

    Full Text Available We have developed an environment, based upon robust, existing, open source software, for tuning applications written using MPI, OpenMP or both. The goal of this effort, which integrates the OpenUH compiler and several popular performance tools, is to increase user productivity by providing an automated, scalable performance measurement and optimization system. In this paper we describe our environment, show how these complementary tools can work together, and illustrate the synergies possible by exploiting their individual strengths and combined interactions. We also present a methodology for performance tuning that is enabled by this environment. One of the benefits of using compiler technology in this context is that it can direct the performance measurements to capture events at different levels of granularity and help assess their importance, which we have shown to significantly reduce the measurement overheads. The compiler can also help when attempting to understand the performance results: it can supply information on how a code was translated and whether optimizations were applied. Our methodology combines two performance views of the application to find bottlenecks. The first is a high level view that focuses on OpenMP/MPI performance problems such as synchronization cost and load imbalances; the second is a low level view that focuses on hardware counter analysis with derived metrics that assess the efficiency of the code. Our experiments have shown that our approach can significantly reduce overheads for both profiling and tracing to acceptable levels and limit the number of times the application needs to be run with selected hardware counters. In this paper, we demonstrate the workings of this methodology by illustrating its use with selected NAS Parallel Benchmarks and a cloud resolving code.
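The hardware-counter analysis mentioned in the low-level view boils down to derived metrics such as instructions per cycle or cache-miss ratios. A small sketch with hypothetical counter names and values; the paper's toolchain computes such metrics from real measurements:

```python
# Derived efficiency metrics computed from raw hardware counters. The counter
# names mimic PAPI-style presets and the values are made up for illustration.
counters = {
    "TOT_INS": 4.0e9,   # instructions retired
    "TOT_CYC": 5.0e9,   # CPU cycles
    "L2_DCM":  2.0e7,   # L2 data cache misses
    "LST_INS": 1.2e9,   # load/store instructions
}

ipc = counters["TOT_INS"] / counters["TOT_CYC"]           # instructions/cycle
l2_miss_ratio = counters["L2_DCM"] / counters["LST_INS"]  # misses per load/store

print(round(ipc, 2), round(l2_miss_ratio, 4))
```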

  3. Kaupapa Maori Methodology: Trusting the Methodology through Thick and Thin

    Science.gov (United States)

    Hiha, Anne Aroha

    2016-01-01

    Kaupapa Maori is thoroughly theorised in academia in Aotearoa and those wishing to use it as their research methodology can find support through the writing of a number of Maori academics. What is not so well articulated, is the experiential voice of those who have used Kaupapa Maori as research methodology. My identity as a Maori woman…

  4. Methodological Reflections on the Contribution of Qualitative Research to the Evaluation of Clinical Ethics Support Services.

    Science.gov (United States)

    Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan

    2017-05-01

    This article describes a process of developing, implementing and evaluating a clinical ethics support service intervention, with the goal of building up a context-sensitive structure of minimal clinical ethics in an oncology department without a prior clinical ethics structure. Scholars from different disciplines have, for different reasons, called for improvement in the evaluation of clinical ethics support services (CESS) over several decades. However, while much has been said about the concepts and methodological challenges of evaluating CESS, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of developing, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regard to the evaluation and development of context-sensitive CESS. We further discuss our own approach in contrast to more traditional consult or committee models. © 2017 John Wiley & Sons Ltd.

  5. Detecting dark matter

    International Nuclear Information System (INIS)

    Dixon, Roger L.

    2000-01-01

    Dark matter is one of the most pressing problems in modern cosmology and particle physics research. This talk will motivate the existence of dark matter by reviewing the main experimental evidence for it: the rotation curves of galaxies and the motions of galaxies about one another. It will then review the corroborating theoretical motivations before combining all the supporting evidence to explore some of the possibilities for dark matter along with its expected properties. This will lay the groundwork for dark matter detection. A number of differing techniques are being developed and used to detect dark matter. These will be briefly discussed before the focus turns to cryogenic detection techniques. Finally, some preliminary results and expectations will be given for the Cryogenic Dark Matter Search (CDMS) experiment

  6. Tools and methodologies to support more sustainable biofuel feedstock production.

    Science.gov (United States)

    Dragisic, Christine; Ashkenazi, Erica; Bede, Lucio; Honzák, Miroslav; Killeen, Tim; Paglia, Adriano; Semroc, Bambi; Savy, Conrad

    2011-02-01

    Increasingly, government regulations, voluntary standards, and company guidelines require that biofuel production complies with sustainability criteria. For some stakeholders, however, compliance with these criteria may seem complex, costly, or unfeasible. What existing tools, then, might facilitate compliance with a variety of biofuel-related sustainability criteria? This paper presents four existing tools and methodologies that can help stakeholders assess (and mitigate) potential risks associated with feedstock production, and can thus facilitate compliance with requirements under different requirement systems. These include the Integrated Biodiversity Assessment Tool (IBAT), the ARtificial Intelligence for Ecosystem Services (ARIES) tool, the Responsible Cultivation Areas (RCA) methodology, and the related Biofuels + Forest Carbon (Biofuel + FC) methodology.

  7. Assessment of bioenergy potential in Sicily: A GIS-based support methodology

    International Nuclear Information System (INIS)

    Beccali, Marco; D'Alberti, Vincenzo; Franzitta, Vincenzo; Columba, Pietro

    2009-01-01

    A Geographical Information System (GIS) supported methodology has been developed in order to assess the technical and economic potential of biomass exploitation for energy production in Sicily. The methodology was based on the use of agricultural, economic, climatic, and infrastructural data in a GIS. Data about land use, transportation facilities, urban cartography, regional territorial planning, terrain digital model, lithology, climatic types, and civil and industrial users have been stored in the GIS to define potential areas for gathering the residues coming from the pruning of olive groves, vineyards, and other agricultural crops, and to assess biomass available for energy cultivation. Further, it was possible to assess the potential of biodiesel production, assuming the cultivation of rapeseed in arable crop areas. For the biomass used for direct combustion purposes, the economic availability has been assessed assuming a price of the biomass and comparing it with other fuels. This assessment has shown the strong competitiveness of firewood in comparison with traditional fossil fuels when the collection system is implemented in an efficient way. Moreover, the economic potential of biodiesel was assessed considering the on-going financial regime for fuel. At the same time, the study has shown a significant competitiveness of the finished biomass (pellets), and good potential for a long-term development of this market. An important result was the determination of biofuel production potential in Sicily. An outcome of the study was to show the opportunities stemming from the harmonisation of Energy Policy with the Waste Management System and Rural Development Plan. (author)

  8. K-Means Subject Matter Expert Refined Topic Model Methodology

    Science.gov (United States)

    2017-01-01

    computing environment, the Visual Basic for Applications (VBA) programming language presents the option as our programming language of choice. We propose...background, or access to other computational programming environments, to build topic models from free-text datasets using a familiar Excel-based...environment that restricts access to other software-based text analytic tools. Opportunities to deploy developmental versions of the methodology and

  9. Methodology of Young Minds Matter: The second Australian Child and Adolescent Survey of Mental Health and Wellbeing.

    Science.gov (United States)

    Hafekost, Jennifer; Lawrence, David; Boterhoven de Haan, Katrina; Johnson, Sarah E; Saw, Suzy; Buckingham, William J; Sawyer, Michael G; Ainley, John; Zubrick, Stephen R

    2016-09-01

    To describe the study design of Young Minds Matter: The second Australian Child and Adolescent Survey of Mental Health and Wellbeing. The aims of the study, sample design, development of survey content, field procedures and final questionnaires are detailed. During 2013-2014, a national household survey of the mental health and wellbeing of young people was conducted, involving a sample of 6310 families selected at random from across Australia. The survey included a face-to-face diagnostic interview with parents/carers of 4- to 17-year-olds and a self-report questionnaire completed by young people aged 11-17 years. The overall response rate to the survey was 55%, with 6310 parents/carers of eligible households participating in the survey. In addition, 2967, or 89%, of young people aged 11-17 years in these participating households completed a questionnaire. The survey sample was found to be broadly representative of the Australian population on major demographic characteristics when compared with data from the Census of Population and Housing. However, adjustments were made for an over-representation of younger children aged 4 to 7 years and of families with more than one eligible child in the household. Young Minds Matter provides updated national prevalence estimates of common child and adolescent mental disorders, describes patterns of service use, and will help to guide future decisions in the development of policy and provision of mental health services for children and adolescents. Advances in interviewing methodology, the addition of a data linkage component, and informed content development contributed to the improved breadth and quality of the data collected. © The Royal Australian and New Zealand College of Psychiatrists 2015.

  10. Watermark: An Application and Methodology for Interactive and Intelligent Decision Support for Groundwater Systems

    Science.gov (United States)

    Pierce, S. A.; Wagner, K.; Schwartz, S.; Gentle, J. N., Jr.

    2016-12-01

    As critical water resources face the effects of historic drought, increased demand, and potential contamination, the need has never been greater to develop resources to effectively communicate conservation and protection across a broad audience and geographical area. The Watermark application and macro-analysis methodology merges topical analysis of a context-rich corpus of policy texts with multi-attributed solution sets from integrated models of water resources and other subsystems, such as mineral, food, energy, or environmental systems, to construct a scalable, robust, and reproducible approach for identifying links between policy and science knowledge bases. The Watermark application is an open-source, interactive workspace to support science-based visualization and decision making. Designed with generalization in mind, Watermark is a flexible platform that allows for data analysis and inclusion of large datasets, with an interactive front-end capable of connecting with other applications as well as advanced computing resources. In addition, the Watermark analysis methodology offers functionality that streamlines communication with non-technical users for policy, education, or engagement with groups around scientific topics of societal relevance. The technology stack for Watermark was selected with the goal of creating a robust and dynamic modular codebase that can be adjusted to fit many use cases and scale to support usage loads that range from simple data display to complex scientific simulation-based modelling and analytics. The methodology uses topical analysis and simulation-optimization to systematically analyze the policy and management realities of resource systems and explicitly connect the social and problem contexts with science-based and engineering knowledge from models. A case example demonstrates use in a complex groundwater resources management study, highlighting multi-criteria spatial decision making and uncertainty comparisons.

  11. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions and by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  12. Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Water Security

    Science.gov (United States)

    Hakimdavar, Raha; Wood, Danielle; Eylander, John; Peters-Lidard, Christa; Smith, Jane; Doorn, Brad; Green, David; Hummel, Corey; Moore, Thomas C.

    2018-01-01

    River basins for which transboundary coordination and governance is a factor are of concern to US national security, yet there is often a lack of sufficient data-driven information available at the needed time horizons to inform transboundary water decision-making for the intelligence, defense, and foreign policy communities. To address this need, a two-day workshop entitled Transboundary Water: Improving Methodologies and Developing Integrated Tools to Support Global Water Security was held in August 2017 in Maryland. The committee that organized and convened the workshop (the Organizing Committee) included representatives from the National Aeronautics and Space Administration (NASA), the US Army Corps of Engineers Engineer Research and Development Center (ERDC), and the US Air Force. The primary goal of the workshop was to advance knowledge on the current US Government and partners' technical information needs and gaps to support national security interests in relation to transboundary water. The workshop also aimed to identify avenues for greater communication and collaboration among the scientific, intelligence, defense, and foreign policy communities. The discussion around transboundary water was considered in the context of the greater global water challenges facing US national security.

  13. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    Science.gov (United States)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations based on the results of the decision support tool that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analysis of tradeoffs across broad policy objectives such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that, with comparisons of kWh or

  14. CALS and the Product State Model - Methodology and Supporting Schools and Paradigms

    DEFF Research Database (Denmark)

    Larsen, Michael Holm

    1998-01-01

    This paper addresses the preliminary considerations in a research project, initiated in February 1997, regarding Continuous Acquisition and Life-cycle Support (CALS), which is part of the activities of CALS Center Denmark. The CALS concept is presented with a focus on the Product State Model (PSM). The PSM incorporates relevant information about each stage of the production process. The paper will describe the research object and the model object, and discuss part of the methodology for developing a Product State Model. The project is primarily technological; however, the organisational and human aspects that the methodology will be developed upon will also be discussed, as will the parameters for evaluating the PSM. In establishing the theoretical body of knowledge with respect to CALS, an identification of schools and paradigms within the research area of applying information technology in a manufacturing environment

  15. Review of indirect detection of dark matter with neutrinos

    Science.gov (United States)

    Danninger, Matthias

    2017-09-01

    Dark Matter could be detected indirectly through the observation of neutrinos produced in dark matter self-annihilations or decays. Searches for such neutrino signals have resulted in stringent constraints on the dark matter self-annihilation cross section and the scattering cross section with matter. In recent years these searches have made significant progress in sensitivity through new search methodologies, new detection channels, and through the availability of rich datasets from neutrino telescopes and detectors, like IceCube, ANTARES, Super-Kamiokande, etc. We review recent experimental results and put them in context with respect to other direct and indirect dark matter searches. We also discuss prospects for discoveries at current and next generation neutrino detectors.
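The cross-section constraints mentioned above rest on the standard expression for the neutrino flux from dark matter self-annihilation. A schematic form, for self-conjugate dark matter and with the usual astrophysical J-factor (symbols as conventionally defined, not taken from this review):

```latex
\frac{d\Phi_\nu}{dE_\nu}
  = \frac{\langle \sigma v \rangle}{8\pi\, m_\chi^{2}}\,
    \frac{dN_\nu}{dE_\nu}\,
    J(\Delta\Omega),
\qquad
J(\Delta\Omega)
  = \int_{\Delta\Omega}\!\int_{\mathrm{l.o.s.}}
    \rho_\chi^{2}(\ell,\Omega)\, d\ell\, d\Omega
```

Here ⟨σv⟩ is the thermally averaged self-annihilation cross section, m_χ the dark matter mass, dN_ν/dE_ν the neutrino yield per annihilation, and ρ_χ the dark matter density squared and integrated along the line of sight; for decaying dark matter, ρ² is replaced by ρ and the prefactor by 1/(4π m_χ τ_χ).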

  16. Application of response surface methodology for optimization of natural organic matter degradation by UV/H2O2 advanced oxidation process.

    Science.gov (United States)

    Rezaee, Reza; Maleki, Afshin; Jafari, Ali; Mazloomi, Sajad; Zandsalimi, Yahya; Mahvi, Amir H

    2014-01-01

    In this research, the removal of natural organic matter from aqueous solutions using an advanced oxidation process (UV/H2O2) was evaluated. Response surface methodology and a Box-Behnken design matrix were employed to design the experiments and to determine the optimal conditions. The effects of various parameters such as the initial concentration of H2O2 (100-180 mg/L), pH (3-11), time (10-30 min) and initial total organic carbon (TOC) concentration (4-10 mg/L) were studied. Analysis of variance (ANOVA) revealed a good agreement between the experimental data and the proposed quadratic polynomial model (R(2) = 0.98). Experimental results showed that TOC removal efficiency increased with increasing H2O2 concentration and time and with decreasing initial TOC concentration. Neutral and mildly acidic pH values also improved TOC removal. Accordingly, a TOC removal efficiency of 78.02% was obtained at the optimized values of the independent variables: H2O2 concentration 100 mg/L, pH 6.12, time 22.42 min and initial TOC concentration 4 mg/L. Further confirmation tests under the optimal conditions showed 76.50% TOC removal and confirmed that the model is in accordance with the experiments. In addition, TOC removal for natural water under the response surface methodology optimum conditions was 62.15%. This study showed that response surface methodology based on the Box-Behnken method is a useful tool for optimizing the operating parameters for TOC removal using the UV/H2O2 process.
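The model adequacy reported above (R² = 0.98 between the experimental data and the quadratic model) is the usual coefficient of determination. A sketch of its computation; the observed/predicted pairs below are invented for illustration only:

```python
# How a goodness-of-fit value such as the reported R² = 0.98 is computed for
# a fitted model: R² = 1 - SS_res / SS_tot. Data pairs are made up.

def r_squared(observed, predicted):
    mean = sum(observed) / len(observed)
    ss_tot = sum((y - mean) ** 2 for y in observed)            # total variance
    ss_res = sum((y - yhat) ** 2 for y, yhat in zip(observed, predicted))
    return 1.0 - ss_res / ss_tot

observed  = [45.0, 52.3, 60.1, 66.8, 71.5, 78.0]   # measured TOC removal, %
predicted = [46.0, 51.5, 60.5, 67.5, 70.8, 77.6]   # model predictions, %

print(round(r_squared(observed, predicted), 3))
```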

  17. Comparison of the sanitary effects of energy chains. Methodological aspects

    International Nuclear Information System (INIS)

    Fagnani, F.

    1979-01-01

    Beyond technical and economic matters, the development of an industrial technology involves more or less numerous indirect consequences. From this viewpoint, the author analyses the methodological problems raised in evaluating the sanitary and ecological impacts of the different energy-producing chains, considering successively the matter of technical interdependences, protection and safety regulations and site selection, classification of risks, and measurement problems related to sanitary effects [fr

  18. On indexes and subject matter of “global competitiveness”

    Directory of Open Access Journals (Sweden)

    A. V. Korotkov

    2017-01-01

    Full Text Available The aim of the research is to analyze the subject matter of a country’s competitiveness and to characterize the statistical indexes of competitiveness known in international practice from the perspective of a more elaborated theory of market competition. This aim follows from the identified problems. First, there is no generally accepted interpretation or joint understanding of competition and competitiveness at the country level; even the international organizations producing estimates of global competitiveness disagree on definitions of competitiveness. Secondly, the available source materials on country competitiveness make no connection to the theory of market competition and lack an original methodology. Thirdly, the well-known statistical indexes of global competitiveness have insufficient theoretical justification and differ in their sets of factors. All this highlights the incompleteness of the methodology and methodological support for studying competitiveness at the country level.

    Materials and methods. The research is based on the methodology of statistics, economic theory and marketing. The authors followed the basic principle of statistical methodology, the continuous combination of qualitative and quantitative analysis, in which the research begins and ends with qualitative analysis. A most important section of statistical methodology, the construction of statistical indexes, is widely used, and a method of statistical classifications is applied in the course of the analysis. A significant role is given to generalization and the analogue method, on the understanding that related terms should denote similar or nearly similar content. Modeling of competition and competitiveness is used throughout, which made it possible to develop a logical model of competition following from competition theory.

    Results. Based on the survey of definitions, the analysis of the subject matter of global

  19. Methodology. Volume 3

    Science.gov (United States)

    1997-02-01

    and supporting technical architectures » Evolution with changing needs and technology » Maturation to achieve continuous improvement • Institutional ...implementation issues » Institutional arrangements » Financial aspects » Scheduling matters – Such factors must be addressed to complete the

  20. STUDYING FOREST ROOT SYSTEMS - AN OVERVIEW OF METHODOLOGICAL PROBLEMS

    Science.gov (United States)

    The study of tree root systems is central to understanding forest ecosystem carbon and nutrient cycles, nutrient and water uptake, C allocation patterns by trees, soil microbial populations, adaptation of trees to stress, soil organic matter production, etc. Methodological probl...

  1. Methodology for the economic optimisation of energy storage systems for frequency support in wind power plants

    International Nuclear Information System (INIS)

    Johnston, Lewis; Díaz-González, Francisco; Gomis-Bellmunt, Oriol; Corchero-García, Cristina; Cruz-Zambrano, Miguel

    2015-01-01

    Highlights: • Optimisation of an energy storage system with a wind power plant for frequency response. • The energy storage option considered could be economically viable. • For a 50 MW wind farm, an energy storage system of 5.3 MW and 3 MW h was found. - Abstract: This paper proposes a methodology for the economic optimisation of the sizing of Energy Storage Systems (ESSs) whilst enhancing the participation of Wind Power Plants (WPPs) in network primary frequency control support. The methodology was designed flexibly, so it can be applied to different energy markets and can include different ESS technologies. It includes the formulation and solution of a Linear Programming (LP) problem, and was applied to the particular case of a 50 MW WPP equipped with a Vanadium Redox Flow battery (VRB) in the UK energy market. The analysis considers real data from the UK regular energy market and the UK frequency response market; data for wind power generation and energy storage costs are estimated from the literature. Results suggest that, under certain assumptions, ESSs can be profitable for the operator of a WPP that is providing frequency response. The ESS provides power reserves such that the WPP can generate close to the maximum energy available. The solution of the optimisation problem establishes that an ESS with a power rating of 5.3 MW and an energy capacity of about 3 MW h would be enough to provide such a service whilst maximising the income for the WPP operator, considering the regular and frequency regulation UK markets.
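The LP sizing problem can be illustrated with a toy model: choose a power rating P (MW) and energy capacity E (MWh) that maximise frequency-response revenue net of storage costs. All prices, the 30-minute duration requirement and the 10 MW cap below are invented placeholders, not the paper's UK market data:

```python
from scipy.optimize import linprog

# Hypothetical annualised figures (illustrative only).
revenue_per_mw = 50.0   # frequency-response payment per MW of reserve
cost_per_mw = 30.0      # power-conversion system cost per MW
cost_per_mwh = 20.0     # storage-capacity cost per MWh
min_duration_h = 0.5    # service must be sustainable for at least 30 min
p_max = 10.0            # grid-connection cap on the power rating, MW

# Decision variables x = [P, E]. linprog minimises, so negate the profit:
#   minimise (cost_per_mw - revenue_per_mw) * P + cost_per_mwh * E
c = [cost_per_mw - revenue_per_mw, cost_per_mwh]

# Duration constraint: E >= min_duration_h * P  ->  min_duration_h*P - E <= 0
A_ub = [[min_duration_h, -1.0]]
b_ub = [0.0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0.0, p_max), (0.0, None)], method="highs")
P_opt, E_opt = res.x
```

Under these numbers the net margin per MW is positive, so the optimiser pushes P to its cap and sizes E at the minimum duration (here 10 MW / 5 MWh); the paper's real model adds market time series and battery dynamics on top of this skeleton.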

  2. A new methodology to derive settleable particulate matter guidelines to assist policy-makers on reducing public nuisance

    Science.gov (United States)

    Machado, Milena; Santos, Jane Meri; Reisen, Valdério Anselmo; Reis, Neyval Costa; Mavroidis, Ilias; Lima, Ana T.

    2018-06-01

    Air quality standards for settleable particulate matter (SPM) are found in many countries around the world. As is well known, annoyance caused by SPM can be considered a community problem even if only a small proportion of the population is bothered, at rather infrequent occasions. Many authors have shown that SPM causes soiling in residential and urban environments and degradation of materials (e.g., objects and surface painting) that can impair the use and enjoyment of property and alter the normal activities of society. In this context, the main contribution of this paper is to propose guidance for establishing air quality standards for annoyance caused by SPM in metropolitan industrial areas. To attain this objective, a new methodology is proposed based on the nonlinear correlation between perceived annoyance (a qualitative variable) and the particle deposition rate (a quantitative variable). Since the response variable is binary (annoyed or not annoyed), a logistic regression model is used to estimate the probability of people being annoyed at different levels of particle deposition rate and to compute the odds ratio function, which gives, for a specific level of particle deposition rate, the estimated expected value of the population's perceived annoyance. The proposed methodology is verified on a data set measured in the metropolitan area of Great Vitória, Espirito Santo, Brazil. As a general conclusion, the estimated probability function of perceived annoyance as a function of SPM shows that 17% of inhabitants report annoyance at a very low particle deposition level of 5 g/(m2•30 days). In addition, for an increase of 1 g/(m2•30 days) in SPM, the smallest estimated odds ratio of perceived annoyance is 1.5, implying that the probability of annoyance occurring is almost 2 times as large as the probability of it not occurring.
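The logistic model and odds-ratio calculation described above fit in a few lines. The coefficients below are hypothetical values chosen to roughly reproduce the abstract's headline numbers (about 17% annoyance at 5 g/(m2•30 days) and an odds ratio near 1.5), not the fitted Brazilian estimates:

```python
import math

def annoyance_probability(spm, b0, b1):
    """Logistic model: P(annoyed) at settled-dust rate spm, in g/(m2*30 days)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * spm)))

# Illustrative coefficients (assumed, not from the paper).
b0, b1 = -3.61, 0.405

p5 = annoyance_probability(5.0, b0, b1)   # probability of annoyance at 5 g/(m2*30 days)

# For logistic regression, the odds ratio for a 1-unit increase in the
# predictor is exp(b1), and it is constant across deposition levels.
odds_ratio = math.exp(b1)
```

With these coefficients p5 comes out near 0.17 and the odds ratio near 1.5, mirroring the relationship the abstract reports between deposition rate and perceived annoyance.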

  3. A Methodology to Support Decision Making in Flood Plan Mitigation

    Science.gov (United States)

    Biscarini, C.; di Francesco, S.; Manciola, P.

    2009-04-01

    In the present paper we propose a novel methodology for supporting priority setting in the assessment of such issues, beyond the typical "expected value" approach. Scientific contributions and management aspects are merged to create a simplified method for basin plan implementation, based on risk and economic analyses. However, the economic evaluation is not the sole criterion for selecting a flood-damage reduction plan. Among the different criteria relevant to the decision process, safety and quality of human life, economic damage, the expenses related to the chosen measures and environmental issues should play a fundamental role in the decisions made by the authorities. Numerical indices taking into account administrative, technical, economic and risk aspects are defined and combined in a mathematical formula that defines a Priority Index (PI). In particular, the priority index defines a ranking of priority interventions, thus allowing the formulation of the investment plan. The research is mainly focused on the technical factors of risk assessment, providing quantitative and qualitative estimates of possible alternatives, together with measures of the risk associated with those alternatives. Moreover, the issues of risk management are analyzed, in particular with respect to the role of decision making in the presence of risk information. A great effort is devoted to making this index easy to formulate and effective, to allow a clear and transparent comparison between the alternatives. Summarizing, this document describes the major steps for incorporating risk analysis into the decision-making process: framing the problem in terms of risk analysis, applying appropriate tools and techniques to obtain quantified results, and using the quantified results in the choice of structural and non-structural measures. In order to prove the reliability of the proposed methodology and to show how risk-based information can be
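A Priority Index of the kind described, aggregating administrative, technical, economic and risk indices into a single ranking score, might be sketched as a weighted sum. The criteria weights and alternative scores below are purely illustrative, since the abstract does not give the actual formula:

```python
def priority_index(scores, weights):
    """Weighted aggregation of normalised criterion scores into a single PI.

    scores: dict criterion -> value in [0, 1] (higher = more urgent/beneficial);
    weights: same keys, summing to 1. Both schemes here are assumptions.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[k] * scores[k] for k in scores)

# Two hypothetical flood-mitigation alternatives with made-up scores.
alternatives = {
    "levee_upgrade":   {"risk": 0.9, "economic": 0.6, "technical": 0.7, "administrative": 0.5},
    "retention_basin": {"risk": 0.7, "economic": 0.7, "technical": 0.6, "administrative": 0.9},
}
weights = {"risk": 0.4, "economic": 0.3, "technical": 0.2, "administrative": 0.1}

# Rank interventions by descending PI to build the investment plan.
ranking = sorted(alternatives,
                 key=lambda a: priority_index(alternatives[a], weights),
                 reverse=True)
```

Here the risk-heavy weighting puts the levee upgrade first (PI 0.73 vs 0.70); in practice the weights would be negotiated with the decision-making authorities.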

  4. Macro Dark Matter

    CERN Document Server

    Jacobs, David M; Lynn, Bryan W.

    2015-01-01

    Dark matter is a vital component of the current best model of our universe, ΛCDM. There are leading candidates for what the dark matter could be (e.g. weakly-interacting massive particles, or axions), but no compelling observational or experimental evidence exists to support these particular candidates, nor any beyond-the-Standard-Model physics that might produce such candidates. This suggests that other dark matter candidates, including ones that might arise in the Standard Model, should receive increased attention. Here we consider a general class of dark matter candidates with characteristic masses and interaction cross-sections characterized in units of grams and cm^2, respectively -- we therefore dub these macroscopic objects Macros. Such dark matter candidates could potentially be assembled out of Standard Model particles (quarks and leptons) in the early universe. A combination of earth-based, astrophysical, and cosmological observations constrain a portion of the Macro parameter space; ho...

  5. Adaptation of the Methodological Support to the Specifics of Management of “Smart” Environment

    Directory of Open Access Journals (Sweden)

    Lazebnyk Iuliia O.

    2017-12-01

    The aim of the article is to justify the analytic base of the methodological support for adopting the new technology solutions necessary to improve management of the “smart” environment system. The article identifies the range of major problems facing modern large cities in the context of growing urbanization and substantiates the need to introduce the concept of “smart” environment for solving these problems. The main approaches to the definition of the concept of “smart” environment are considered. The main components of “smart” environment are identified and analyzed. The best practices of leading world cities, such as Dubai and Hong Kong, regarding the introduction of “smart” technologies are considered.

  6. Methodological spot of establishing silt deposit concentration in Serbian rivers

    Directory of Open Access Journals (Sweden)

    Dragićević Slavoljub

    2007-01-01

    The current methodology for sampling and establishing silt deposit concentration in Serbian rivers suffers from numerous deficiencies. Daily concentrations of this type of river deposit at most hydrological gauges were obtained on the basis of only one measurement, which raises the question of the representativeness of those samples. Taking deposit samples at a single point on the profile is somewhat problematic because of the dispersion of the obtained results, and the choice of sampling location is also a very important matter. These deficiencies may lead to serious errors in calculating the total carried deposit. For the above-mentioned reasons, we decided to take precise measurements of silt deposit concentration and to identify the methodological flaws of the measurements. The results of these measurements are analyzed and presented in this paper.

  7. Reliability Centered Maintenance - Methodologies

    Science.gov (United States)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  8. Alternative Perspectives on Sustainability: Indigenous Knowledge and Methodologies

    Directory of Open Access Journals (Sweden)

    Meg Parsons

    2017-02-01

    Indigenous knowledge (IK) is now recognized as being critical to the development of effective, equitable and meaningful strategies to address socio-ecological crises. However, efforts to integrate IK and Western science frequently encounter difficulties due to different systems of knowledge production and underlying worldviews. New approaches are needed so that sustainability can progress on the terms that matter the most for the people involved. In this paper we discuss a case study from Aotearoa New Zealand where an indigenous community is in the process of renegotiating and enacting new indigenous-led approaches to address coupled socio-ecological crises. We reflect on novel methodological approaches that highlight the ways in which projects and knowledge are co-produced by a multiplicity of human and non-human actors. To this end we draw on conceptualizations of environmental ethics offered by indigenous scholars and propose alternative bodies of thought, methods, and practices that can support the wider sustainability agenda.

  9. Effective management of changes: methodological and instrumental support

    Directory of Open Access Journals (Sweden)

    G. S. Merzlikina

    2017-01-01

    Small and medium-sized enterprises are characterized by maneuverability, readiness for change, and a focus on innovation, but the growing instability of the external and internal environment requires a company to develop increasingly complex control systems. Several models of management have emerged: by objectives, by processes and by changes. Achieving goals involves the development and implementation of strategy and tactics; in business, strategy derives from the goal set by the owner for the organization. Process management describes and defines the main elements and categories of each process, balancing responsibility and authority by creating a team to improve each business process. Management of changes is a special mechanism for the adoption and implementation of adequate management decisions. The article compares these management models and examines the criteria, indicators and factors for assessing the effectiveness of the management of an organization. The comparative analysis showed that management of changes is preferable for small and medium-sized businesses. Management of changes involves forming a view of future trends in the development of the organization and the active use of modern management methods by the entrepreneurial structure, which will ensure the economic stability of the organization. The effectiveness of the enterprise can be evaluated using performance indicators, and the article suggests a matrix for selecting such indicators that takes into account the sphere of influence. Recommendations are given on the choice of indicators of the effectiveness of achieving goals, and the values are indicated at which the enterprise acquires stability in such key factors of management effectiveness as the efficiency, capacity and sustainability of the organization. The theoretical and practical significance of this research is the development of methodological and

  10. A methodology to support the development of 4-year pavement management plan.

    Science.gov (United States)

    2014-07-01

    A methodology for forming and prioritizing pavement maintenance and rehabilitation (M&R) projects was developed. The Texas Department of Transportation (TxDOT) can use this methodology to generate defensible and cost-effective 4-year pavement man...

  11. The NLC Software Requirements Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shoaee, Hamid

    2002-08-20

    We describe the software requirements and development methodology developed for the NLC control system. Given the longevity of that project, and the likely geographical distribution of the collaborating engineers, the planned requirements management process is somewhat more formal than the norm in high energy physics projects. The short term goals of the requirements process are to accurately estimate costs, to decompose the problem, and to determine likely technologies. The long term goal is to enable a smooth transition from high level functional requirements to specific subsystem and component requirements for individual programmers, and to support distributed development. The methodology covers both ends of that life cycle. It covers both the analytical and documentary tools for software engineering, and project management support. This paper introduces the methodology, which is fully described in [1].

  12. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specification setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse, including discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.

  13. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support.

    Science.gov (United States)

    Proctor, Enola; Luke, Douglas; Calhoun, Annaliese; McMillen, Curtis; Brownson, Ross; McCrary, Stacey; Padek, Margaret

    2015-06-11

    Little is known about how well or under what conditions health innovations are sustained and their gains maintained once they are put into practice. Implementation science typically focuses on uptake by early adopters of one healthcare innovation at a time, and the later-stage challenges of scaling up and sustaining evidence-supported interventions receive too little attention. This project identifies the challenges associated with sustainability research and generates recommendations for accelerating and strengthening this work. A multi-method, multi-stage approach was used: (1) identifying and recruiting experts in sustainability as participants, (2) conducting research on sustainability using concept mapping, (3) action planning during an intensive working conference of sustainability experts to expand on the quantitative concept mapping results, and (4) consolidating results into a set of recommendations for research, methodological advances, and infrastructure building to advance understanding of sustainability. Participants comprised researchers, funders, and leaders in health, mental health, and public health with a shared interest in the sustainability of evidence-based health care. Prompted to identify important issues for sustainability research, participants generated 91 distinct statements, for which a concept mapping process produced 11 conceptually distinct clusters. During the conference, participants built upon the concept mapping clusters to generate recommendations for sustainability research. The recommendations fell into three domains: (1) pursue high-priority research questions as a unified agenda on sustainability; (2) advance methods for sustainability research; (3) advance infrastructure to support sustainability research. Implementation science needs to pursue later-stage translation research questions required for population impact.
Priorities include conceptual consistency and operational clarity for measuring sustainability, developing evidence

  14. Methodological exemplar of integrating quantitative and qualitative evidence - supportive care for men with prostate cancer:what are the most important components?

    OpenAIRE

    Huntley, Alyson; King, Anna J L; Moore, Theresa H M; Paterson, Charlotte; Persad, Raj; Sharp, Debbie J; Evans, Maggie A

    2017-01-01

    AIMS: To present a methodological exemplar of integrating findings from a quantitative and qualitative review on the same topic to provide insight into components of care that contribute to supportive care that is acceptable to men with prostate cancer.BACKGROUND: Men with prostate cancer are likely to live a long time with the disease, experience side effects from treatment and therefore have ongoing supportive care needs. Quantitative and qualitative reviews have been published but the find...

  15. Danish emission inventory for particular matter (PM)

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, M; Winther, M; Illerup, J B; Hjort Mikkelsen, M

    2003-11-01

    The first Danish emission inventory, reported in 2002, was a provisional estimate based on the data then available. This report documents the methodology, emission factors and references used for an improved Danish emission inventory for particulate matter, and presents results of the improved inventory for the year 2000. The particulate matter emission inventory includes TSP, PM10 and PM2.5. The report covers emission inventories for transport and stationary combustion; an appendix covering emissions from agriculture is also included. For the transport sector, both exhaust and non-exhaust emissions such as tyre and brake wear and road abrasion are included. (au)
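The core of any such inventory is the product of activity data and sector-specific emission factors, summed over sectors. A minimal sketch, with invented activity levels and factors rather than the Danish report's values:

```python
# Hypothetical sector activity data (TJ of fuel burned per year) and
# particulate-matter emission factors (kg of PM emitted per TJ).
activity_tj = {"road_transport": 120_000, "residential_wood": 40_000}
ef_pm_kg_per_tj = {"road_transport": 2.5, "residential_wood": 150.0}

# Inventory principle: emission = activity * emission factor (converted to tonnes).
emissions_tonnes = {
    sector: activity_tj[sector] * ef_pm_kg_per_tj[sector] / 1000.0
    for sector in activity_tj
}
total_pm = sum(emissions_tonnes.values())
```

A real inventory repeats this over every fuel/technology combination and pollutant size fraction (TSP, PM10, PM2.5), and adds non-exhaust sources such as tyre and brake wear as separate activity streams.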

  16. A Methodology for Safety Culture Impact Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kiyoon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of)

    2014-05-15

    The purpose of this study is to develop a methodology for assessing the impact of safety culture on nuclear power plants. A new methodology for assessing a safety culture impact index has been developed and applied to reference nuclear power plants. The developed SCII model might contribute to comparing the level of safety culture among nuclear power plants as well as to improving the safety of nuclear power plants. Safety culture is defined as the fundamental attitudes and behaviors of plant staff which demonstrate that nuclear safety is the most important consideration in all activities conducted in nuclear power operation. Through several nuclear power plant accidents, including Fukushima Daiichi in 2011 and Chernobyl in 1986, the safety of nuclear power plants has become a matter of broad interest. From the accident review reports, it can easily be seen that safety culture is important and one of the dominant contributors to accidents. However, a methodology for assessing the impact of safety culture has not yet been established analytically; it is difficult to assess the impact of safety culture quantitatively.

  17. A Methodology for Safety Culture Impact Assessment

    International Nuclear Information System (INIS)

    Han, Kiyoon; Jae, Moosung

    2014-01-01

    The purpose of this study is to develop a methodology for assessing the impact of safety culture on nuclear power plants. A new methodology for assessing a safety culture impact index has been developed and applied to reference nuclear power plants. The developed SCII model might contribute to comparing the level of safety culture among nuclear power plants as well as to improving the safety of nuclear power plants. Safety culture is defined as the fundamental attitudes and behaviors of plant staff which demonstrate that nuclear safety is the most important consideration in all activities conducted in nuclear power operation. Through several nuclear power plant accidents, including Fukushima Daiichi in 2011 and Chernobyl in 1986, the safety of nuclear power plants has become a matter of broad interest. From the accident review reports, it can easily be seen that safety culture is important and one of the dominant contributors to accidents. However, a methodology for assessing the impact of safety culture has not yet been established analytically; it is difficult to assess the impact of safety culture quantitatively

  18. Principles of Thermodynamics, one of the supports of the green economy and the role of the school in their awareness

    Directory of Open Access Journals (Sweden)

    Juan R. Cardentey Lorente

    2008-09-01

    A methodological request has been strengthened in recent decades, which calls for critical thinking about economic reality, taking into account certain fundamental principles and processes generalized to higher states of the development of matter. In this way, the principles of thermodynamics and of biological evolution appear as epistemological perspectives on knowledge. The question is how to design a new “economy of sustainability” which does not destroy the natural resources and the ecological systems that support it. Thus the Ecoeconomy emerges, aiming at the reconstruction of the biophysical bases of the economic process and calling for economic efficiency, social justice and sustainability.

  19. Detection of white matter lesion regions in MRI using SLIC0 and convolutional neural network.

    Science.gov (United States)

    Diniz, Pedro Henrique Bandeira; Valente, Thales Levi Azevedo; Diniz, João Otávio Bandeira; Silva, Aristófanes Corrêa; Gattass, Marcelo; Ventura, Nina; Muniz, Bernardo Carvalho; Gasparetto, Emerson Leandro

    2018-04-19

    White matter lesions are non-static brain lesions with a prevalence of up to 98% in the elderly population. Because they may be associated with several brain diseases, it is important that they are detected as soon as possible. Magnetic Resonance Imaging (MRI) provides three-dimensional data with the possibility to detect and emphasize contrast differences in soft tissues, providing rich information about human soft tissue anatomy. However, the amount of data in these images is far too great for manual analysis/interpretation, representing a difficult and time-consuming task for specialists. This work presents a computational methodology capable of detecting regions of white matter lesions in brain MRI of the FLAIR modality. The techniques highlighted in this methodology are SLIC0 clustering for candidate segmentation and convolutional neural networks for candidate classification. The methodology proposed here consists of four steps: (1) image acquisition, (2) image preprocessing, (3) candidate segmentation and (4) candidate classification. The methodology was applied to 91 magnetic resonance images provided by DASA and, in the detection of white matter lesion regions, achieved an accuracy of 98.73%, a specificity of 98.77% and a sensitivity of 78.79%, with 0.005 false positives and without any false positive reduction technique. The feasibility of analyzing brain MRI using SLIC0 and convolutional neural network techniques to successfully detect regions of white matter lesions is demonstrated. Copyright © 2018. Published by Elsevier B.V.
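The accuracy, specificity and sensitivity figures quoted above come from a standard binary confusion-matrix calculation over the classified candidates, which can be sketched as follows (the candidate counts are hypothetical, not the paper's data):

```python
def detection_metrics(tp, fp, tn, fn):
    """Accuracy, specificity and sensitivity from a binary confusion matrix.

    tp/fn: lesion candidates correctly/incorrectly classified;
    tn/fp: non-lesion candidates correctly/incorrectly classified.
    """
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    specificity = tn / (tn + fp)   # true-negative rate
    sensitivity = tp / (tp + fn)   # true-positive rate (recall)
    return accuracy, specificity, sensitivity

# Hypothetical candidate counts for illustration.
acc, spec, sens = detection_metrics(tp=78, fp=12, tn=988, fn=22)
```

The pattern in the paper, high specificity with lower sensitivity, typically arises when non-lesion candidates vastly outnumber lesion candidates after segmentation, as in this toy example.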

  20. Implications of the DAMA and CRESST experiments for mirror matter-type dark matter

    International Nuclear Information System (INIS)

    Foot, R.

    2004-01-01

    Mirror atoms are expected to be a significant component of the galactic dark matter halo if mirror matter is identified with the nonbaryonic dark matter in the Universe. Mirror matter can interact with ordinary matter via gravity and via the photon-mirror photon kinetic mixing interaction--causing mirror charged particles to couple to ordinary photons with an effective electric charge εe. This means that the nuclei of mirror atoms can elastically scatter off the nuclei of ordinary atoms, leading to nuclear recoils, which can be detected in existing dark matter experiments. We show that the dark matter experiments most sensitive to this type of dark matter candidate (via the nuclear recoil signature) are the DAMA/NaI and CRESST/Sapphire experiments. Furthermore, we show that the impressive annual modulation signal obtained by the DAMA/NaI experiment can be explained by mirror matter-type dark matter for |ε| ∼ 5x10^-9 and is supported by DAMA's absolute rate measurement as well as the CRESST/Sapphire data. This value of |ε| is consistent with the value obtained from various solar system anomalies including the Pioneer spacecraft anomaly, anomalous meteorite events and the lack of small craters on the asteroid Eros. It is also consistent with standard big bang nucleosynthesis.

  1. Hazard classification methodology

    International Nuclear Information System (INIS)

    Brereton, S.J.

    1996-01-01

    This document outlines the hazard classification methodology used to determine the hazard classification of the NIF LTAB, OAB, and the support facilities on the basis of radionuclides and chemicals. The hazard classification determines the safety analysis requirements for a facility

  2. NASA’s Universe of Learning: Engaging Subject Matter Experts to Support Museum Alliance Science Briefings

    Science.gov (United States)

    Marcucci, Emma; Slivinski, Carolyn; Lawton, Brandon L.; Smith, Denise A.; Squires, Gordon K.; Biferno, Anya A.; Lestition, Kathleen; Cominsky, Lynn R.; Lee, Janice C.; Rivera, Thalia; Walker, Allyson; Spisak, Marilyn

    2018-06-01

NASA's Universe of Learning creates and delivers science-driven, audience-driven resources and experiences designed to engage and immerse learners of all ages and backgrounds in exploring the universe for themselves. The project is a unique partnership between the Space Telescope Science Institute, Caltech/IPAC, Jet Propulsion Laboratory, Smithsonian Astrophysical Observatory, and Sonoma State University and is part of the NASA SMD Science Activation Collective. The NASA's Universe of Learning projects draw on the expertise of subject matter experts (scientists and engineers) from across the broad range of NASA Astrophysics themes and missions. One such project, which draws strongly on the expertise of the community, is the NASA's Universe of Learning Science Briefings, conducted in collaboration with the NASA Museum Alliance. This collaboration presents a monthly hour-long discussion of relevant NASA astrophysics topics or events to an audience composed largely of informal educators from informal learning environments. These professional learning opportunities use experts and resources within the astronomical community to support increased interest and engagement of the informal learning community in NASA Astrophysics-related concepts and events. Briefings are designed to create a foundation for this audience using (1) broad science themes, (2) special events, or (3) breaking science news. The NASA's Universe of Learning team engages subject matter experts to speak and present their science at these briefings, providing a direct connection to NASA Astrophysics science and giving the audience an opportunity to interact directly with scientists and engineers involved in NASA missions. To maximize the usefulness of the Museum Alliance Science Briefings, each briefing highlights resources related to the science theme to support informal educators in incorporating science content into their venues and/or interactions with the public. During this

  3. Differences in regional grey matter volumes in currently ill patients with anorexia nervosa.

    Science.gov (United States)

    Phillipou, Andrea; Rossell, Susan Lee; Gurvich, Caroline; Castle, David Jonathan; Abel, Larry Allen; Nibbs, Richard Grant; Hughes, Matthew Edward

    2018-01-01

    Neurobiological findings in anorexia nervosa (AN) are inconsistent, including differences in regional grey matter volumes. Methodological limitations often contribute to the inconsistencies reported. The aim of this study was to improve on these methodologies by utilising voxel-based morphometry (VBM) analysis with the use of diffeomorphic anatomic registration through an exponentiated lie algebra algorithm (DARTEL), in a relatively large group of individuals with AN. Twenty-six individuals with AN and 27 healthy controls underwent a T1-weighted magnetic resonance imaging (MRI) scan. AN participants were found to have reduced grey matter volumes in a number of areas including regions of the basal ganglia (including the ventral striatum), and parietal and temporal cortices. Body mass index (BMI) and global scores on the Eating Disorder Examination Questionnaire (EDE-Q) were also found to correlate with grey matter volumes in a region of the brainstem (including the substantia nigra and ventral tegmental area) in AN, and predicted 56% of the variance in grey matter volumes in this area. The brain regions associated with grey matter reductions in AN are consistent with regions responsible for cognitive deficits associated with the illness including anhedonia, deficits in affect perception and saccadic eye movement abnormalities. Overall, the findings suggest reduced grey matter volumes in AN that are associated with eating disorder symptomatology. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
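The statement that BMI and EDE-Q scores "predicted 56% of the variance" is an R² from a regression of regional grey matter volume on those predictors. As a reminder of how such a figure is computed, here is a one-predictor ordinary-least-squares R² on made-up numbers (purely hypothetical data, not the study's):

```python
# Ordinary least squares in pure Python: fit y = a + b*x and report
# R^2 = 1 - SS_res / SS_tot, the fraction of variance explained.

def ols_r2(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    b = sxy / sxx                     # slope
    a = my - b * mx                   # intercept
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1.0 - ss_res / ss_tot

# hypothetical BMIs and regional grey matter volumes (arbitrary units):
bmi = [14.1, 15.0, 15.8, 16.5, 17.2, 18.0]
vol = [0.41, 0.44, 0.44, 0.47, 0.49, 0.50]
print(round(ols_r2(bmi, vol), 2))
```

The study's 56% figure comes from a two-predictor fit, but the variance-explained interpretation is the same.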

  4. State and perspectives of methodological support of materials research of products from Zirconium alloys for fuel rods and fuel assemblies of VVER

    International Nuclear Information System (INIS)

    Gusev, A.; Markelov, V.; Novikov, V.; Zheltkovskaya, T.; Malgin, A.; Shevyakov, A.; Bekrenev, S.

    2015-01-01

The basic methodological framework for the study of the characteristics of zirconium products was created at JSC «VNIINM». The reliability of the experimental results was confirmed by metrological certification procedures. Further development of the methodological support at «VNIINM» for Zr product research involves developing and validating methods to determine: mechanical characteristics under internal pressure; the contractile strain ratio (CSR); expansion due to compression (EDC); plane strain tensile (PST) behaviour; resistance to multi-cycle and low-cycle fatigue; texture parameters using the orientation distribution function; and the electrical characteristics of the oxide film by impedance

  5. A methodology for system-of-systems design in support of the engineering team

    Science.gov (United States)

    Ridolfi, G.; Mooij, E.; Cardile, D.; Corpino, S.; Ferrari, G.

    2012-04-01

    Space missions have experienced a trend of increasing complexity in the last decades, resulting in the design of very complex systems formed by many elements and sub-elements working together to meet the requirements. In a classical approach, especially in a company environment, the two steps of design-space exploration and optimization are usually performed by experts inferring on major phenomena, making assumptions and doing some trial-and-error runs on the available mathematical models. This is done especially in the very early design phases where most of the costs are locked-in. With the objective of supporting the engineering team and the decision-makers during the design of complex systems, the authors developed a modelling framework for a particular category of complex, coupled space systems called System-of-Systems. Once modelled, the System-of-Systems is solved using a computationally cheap parametric methodology, named the mixed-hypercube approach, based on the utilization of a particular type of fractional factorial design-of-experiments, and analysis of the results via global sensitivity analysis and response surfaces. As an applicative example, a system-of-systems of a hypothetical human space exploration scenario for the support of a manned lunar base is presented. The results demonstrate that using the mixed-hypercube to sample the design space, an optimal solution is reached with a limited computational effort, providing support to the engineering team and decision makers thanks to sensitivity and robustness information. The analysis of the system-of-systems model that was implemented shows that the logistic support of a human outpost on the Moon for 15 years is still feasible with currently available launcher classes. The results presented in this paper have been obtained in cooperation with Thales Alenia Space—Italy, in the framework of a regional programme called STEPS. STEPS—Sistemi e Tecnologie per l'EsPlorazione Spaziale is a research
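The fractional factorial design-of-experiments underlying the mixed-hypercube approach can be illustrated with its simplest instance, a 2^(3-1) half fraction with generator C = AB. The response model and effect sizes below are toy assumptions, not the paper's system-of-systems model:

```python
# Two-level fractional factorial design with main-effect estimation:
# a generic stand-in for the paper's mixed-hypercube sampling.
import itertools

def half_fraction_2k3():
    """2^(3-1) design: full factorial in A and B; generator C = A*B."""
    return [(a, b, a * b) for a, b in itertools.product((-1, 1), repeat=2)]

def toy_model(a, b, c):
    # hypothetical system response, e.g. a mass or cost figure
    return 10.0 + 3.0 * a - 1.5 * b + 0.5 * c

def main_effects(runs, response):
    """Main effect of factor j: (2/n) * sum_i x_ij * y_i."""
    y = [response(*r) for r in runs]
    n = len(runs)
    return [sum(r[j] * y[i] for i, r in enumerate(runs)) * 2.0 / n
            for j in range(3)]

runs = half_fraction_2k3()           # 4 runs instead of 8
print(main_effects(runs, toy_model))  # → [6.0, -3.0, 1.0]
```

The half fraction recovers all three main effects from only four model evaluations, at the price of aliasing C with the AB interaction; this is the "computationally cheap" trade-off the methodology exploits at larger scale.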

  6. Laboratory Calibration Studies in Support of ORGANICS on the International Space Station: Evolution of Organic Matter in Space

    Science.gov (United States)

    Ruiterkamp, R.; Ehrenfreund, P.; Halasinski, T.; Salama, F.; Foing, B.; Schmidt, W.

    2002-01-01

    This paper describes the scientific overview and current status of ORGANICS an exposure experiment performed on the International Space Station (ISS) to study the evolution of organic matter in space (PI: P. Ehrenfreund), with supporting laboratory experiments performed at NASA Ames. ORGANICS investigates the chemical evolution of samples submitted to long-duration exposure to space environment in near-Earth orbit. This experiment will provide information on the nature, evolution, and survival of carbon species in the interstellar medium (ISM) and in solar system targets.

  7. Governance matters II - updated indicators for 2000-01

    OpenAIRE

    Kaufmann, Daniel; Kraay, Aart; Zoido-Lobaton, Pablo

    2002-01-01

    The authors construct aggregate governance indicators for six dimensions of governance, covering 175 countries in 2000-01. They apply the methodology developed in Kaufmann, Kraay, and Zoido-Lobaton ("Aggregating Governance Indicators", Policy Research Working Paper 2195, and "Governance Matters", Policy Research Working Paper 2196, October 1999) to newly available data at governance indica...

  8. Autonomy support physical education: history, design, methodology and analysis regarding motivation in teenage students

    Directory of Open Access Journals (Sweden)

    Marina Martínez-Molina

    2013-07-01

In any area of education it is recognized how important it is that students are motivated. But this requires teachers who motivate, and actions that bring about this state in students. Autonomy support may be the key to improving the motivation of learners, as well as an indicator in the search for other improvements in the teaching-learning process. The aim of this study was to analyze the potential importance of supporting autonomy in students (both in learning and in the acquisition of habits) and to exemplify the design, methodology and analysis that make it possible to meet these objectives. The study draws on a sample of 758 high school students (347 men, 45.8%; 411 women, 54.2%) from the Region of Murcia, aged between 12 and 18 (M = 15.22, SD = 1.27). The instrument used is a questionnaire consisting of the following scales: Learning Climate Questionnaire (LCQ), Sport Motivation Scale (SMS), Intention to partake in leisure-time physical activity (Intention-PFTL), Sport Satisfaction Instrument for Physical Education (SSI-EF) and the scale of Importance and usefulness of Physical Education (IEF). The results may improve on and extend much of the existing work and provide further guidance that teachers can use to improve their teaching performance.

  9. Investigating 3S Synergies to Support Infrastructure Development and Risk-Informed Methodologies for 3S by Design

    International Nuclear Information System (INIS)

    Suzuki, M.; Izumi, Y.; Kimoto, T.; Naoi, Y.; Inoue, T.; Hoffheins, B.

    2010-01-01

In 2008, Japan and other G8 countries pledged to support the Safeguards, Safety, and Security (3S) Initiative to raise awareness of 3S worldwide and to assist countries in setting up nuclear energy infrastructures that are essential cornerstones of a successful nuclear energy program. The goals of the 3S initiative are to ensure that countries already using nuclear energy or those planning to use nuclear energy are supported by strong national programs in safety, security, and safeguards not only for reliability and viability of the programs, but also to prove to the international audience that the programs are purely peaceful and that nuclear material is properly handled, accounted for, and protected. In support of this initiative, the Japan Atomic Energy Agency (JAEA) has been conducting detailed analyses of the R and D programs and cultures of each of the 'S' areas to identify overlaps where synergism and efficiencies might be realized, to determine where there are gaps in the development of a mature 3S culture, and to coordinate efforts with other Japanese and international organizations. As an initial outcome of this study, incoming JAEA employees are being introduced to 3S as part of their induction training and the idea of a President's Award program is being evaluated. Furthermore, some overlaps in 3S missions might be exploited to share facility instrumentation, as with Joint-Use-Equipment (JUE), in which cameras and radiation detectors are shared by the State and the IAEA. Lessons learned in these activities can be applied to developing more efficient and effective 3S infrastructures for incorporating into Safeguards by Design methodologies. They will also be useful in supporting human resources and technology development projects associated with Japan's planned nuclear security center for Asia, which was announced during the 2010 Nuclear Security Summit. In this presentation, a risk-informed approach regarding integration of 3S will be introduced. An initial

  10. Integrated cost estimation methodology to support high-performance building design

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, Prasad; Greden, Lara; Eijadi, David; McDougall, Tom [The Weidt Group, Minnetonka (United States); Cole, Ray [Axiom Engineers, Monterey (United States)

    2007-07-01

Design teams evaluating the performance of energy conservation measures (ECMs) calculate energy savings rigorously with established modelling protocols, accounting for the interaction between various measures. However, incremental cost calculations do not have a similar rigor. Often there is no recognition of cost reductions with integrated design, nor is there assessment of cost interactions amongst measures. This lack of rigor feeds the notion that high-performance buildings cost more, creating a barrier for design teams pursuing aggressive high-performance outcomes. This study proposes an alternative integrated methodology to arrive at a lower perceived incremental cost for improved energy performance. The methodology is based on the use of energy simulations as a means towards integrated design and cost estimation. Various points along the spectrum of integration are identified and characterized by the amount of design effort invested, the scheduling of effort, and relative energy performance of the resultant design. It includes a study of the interactions between building system parameters as they relate to capital costs. Several cost interactions amongst energy measures are found to be significant. The value of this approach is demonstrated with alternatives in a case study that shows the differences between perceived costs for energy measures along various points on the integration spectrum. These alternatives show design tradeoffs and identify how decisions would have been different with a standard costing approach. Areas of further research to make the methodology more robust are identified. Policy measures to encourage the integrated approach and reduce the barriers towards improved energy performance are discussed.

  11. FlooDSuM - a decision support methodology for assisting local authorities in flood situations

    Science.gov (United States)

    Schwanbeck, Jan; Weingartner, Rolf

    2014-05-01

Decision making in flood situations is a difficult task, especially in small to medium-sized mountain catchments (30 - 500 km2), which are usually characterized by complex topography, high drainage density and quick runoff response to rainfall events. Operating hydrological models driven by numerical weather prediction systems, which have lead-times of several hours up to a few days, would be beneficial in this case, as time for prevention could be gained. However, the spatial and quantitative accuracy of such meteorological forecasts usually decreases with increasing lead-time. In addition, the sensitivity of rainfall-runoff models to inaccuracies in estimates of areal rainfall increases with decreasing catchment size. Accordingly, decisions on flood alerts should ideally be based on areal rainfall from high-resolution, short-term numerical weather prediction, nowcasts or even real-time measurements, transformed into runoff by a hydrological model. In order to benefit from the best possible rainfall data while retaining enough time for alerting and prevention, the hydrological model should be fast and easily applicable by decision makers within local authorities themselves. The proposed decision support methodology FlooDSuM (Flood Decision Support Methodology) aims to meet those requirements. Applying FlooDSuM, a few successive binary decisions of increasing complexity are processed following a flow-chart-like structure. Prepared data and straightforwardly applicable tools are provided for each of these decisions. Maps showing the current flood disposition are used for the first step. As long as the danger of flooding cannot be excluded, more and more complex and time-consuming methods are applied. For the final decision, a set of scatter plots relating areal precipitation to peak flow is provided. These plots also take further decisive parameters into account, such as storm duration, the distribution of rainfall intensity in time, as well as the
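The flow-chart logic of successive binary checks can be sketched as follows; all thresholds and the rainfall/peak-flow points here are illustrative assumptions, not values from FlooDSuM:

```python
# A minimal cascade of binary decisions of increasing cost, stopping as
# soon as flooding can be excluded. The final step reads an expected peak
# flow off a rainfall/peak-flow relation (a linear-interpolation stand-in
# for the methodology's scatter plots).

def _interp(x, xs, ys):
    """Piecewise-linear interpolation, clamped at the ends."""
    if x <= xs[0]:
        return ys[0]
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
    return ys[-1]

def flood_alert(disposition_high, forecast_rain_mm, rain_peakflow_points,
                alert_peakflow_m3s):
    # Step 1: cheap check against the pre-computed flood-disposition map
    if not disposition_high:
        return "no alert"
    # Step 2: is the forecast areal rainfall large enough to matter?
    if forecast_rain_mm < 30.0:            # illustrative threshold
        return "no alert"
    # Step 3: expected peak flow from the rainfall/peak-flow relation
    xs, ys = zip(*sorted(rain_peakflow_points))
    peak = _interp(forecast_rain_mm, xs, ys)
    return "alert" if peak >= alert_peakflow_m3s else "raise readiness"

points = [(20.0, 5.0), (40.0, 20.0), (60.0, 45.0)]   # (mm, m^3/s), made up
print(flood_alert(True, 50.0, points, 30.0))          # → alert
```

The point of the cascade is that the cheap early checks resolve most situations, so the costlier steps run only when time pressure is real.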

  12. Using a life cycle assessment methodology for the analysis of two treatment systems of food-processing industry wastewaters

    DEFF Research Database (Denmark)

    Maya Altamira, Larisa; Schmidt, Jens Ejbye; Baun, Anders

    2007-01-01

…criteria involve sludge disposal strategies and electrical energy consumption. However, there is a need to develop a systematic methodology to quantify relevant environmental indicators, comprising information on the wastewater treatment system in a life cycle perspective, and to identify which are the parameters that have the greatest influence on the potential environmental impacts of the systems analyzed. In this study, we present a systematic methodology for the analysis of the operation of two modern wastewater treatment technologies: biological removal of nitrogen and organic matter by activated sludge (Scenario 1), and anaerobic removal of organic matter by a continuous stirred tank reactor (Scenario 2). Both technologies were applied to wastewater coming from a fish meal industry and a pet food industry, discharging about 250 to 260 thousand cubic meters of wastewater per year. The methodology…

  13. Thin-shell wormholes supported by total normal matter

    Energy Technology Data Exchange (ETDEWEB)

    Mazharimousavi, S.H.; Halilsoy, M. [Eastern Mediterranean University, Department of Physics, Gazimagusa (Turkey)

    2014-09-15

    The Zipoy-Voorhees-Weyl (ZVW) spacetime characterized by mass (M) and oblateness (δ) is proposed in the construction of viable thin-shell wormholes (TSWs). A departure from spherical/cylindrical symmetry yields a positive total energy in spite of the fact that the local energy density may take negative values. We show that oblateness of the bumpy sources/black holes can be incorporated as a new degree of freedom that may play a role in the resolution of the exotic matter problem in TSWs. A small velocity perturbation reveals, however, that the resulting TSW is unstable. (orig.)

  14. Dark matter in spiral galaxies

    International Nuclear Information System (INIS)

    Albada, T.S. van; Sancisi, R.

    1986-01-01

    Mass models of spiral galaxies based on the observed light distribution, assuming constant M/L for bulge and disc, are able to reproduce the observed rotation curves in the inner regions, but fail to do so increasingly towards and beyond the edge of the visible material. The discrepancy in the outer region can be accounted for by invoking dark matter; some galaxies require at least four times as much dark matter as luminous matter. There is no evidence for a dependence on galaxy luminosity or morphological type. Various arguments support the idea that a distribution of visible matter with constant M/L is responsible for the circular velocity in the inner region, i.e. inside approximately 2.5 disc scalelengths. Luminous matter and dark matter seem to 'conspire' to produce the flat observed rotation curves in the outer region. It seems unlikely that this coupling between disc and halo results from the large-scale gravitational interaction between the two components. Attempts to determine the shape of dark halos have not yet produced convincing results. (author)
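The disc-halo "conspiracy" can be illustrated with a toy rotation curve: an exponential disc alone gives a declining v(r) beyond a few scale lengths, while adding a pseudo-isothermal halo keeps the outer curve roughly flat. The spherical enclosed-mass approximation and all parameter values below are illustrative simplifications, not the mass models of the paper:

```python
# Toy rotation curves: v(r) = sqrt(G * M(<r) / r) for disc, halo, and total.
import math

G = 4.30091e-6  # gravitational constant in kpc (km/s)^2 / Msun

def v_disc(r_kpc, m_disc=5e10, r_d=3.0):
    """Exponential mass profile, treated spherically for simplicity."""
    x = r_kpc / r_d
    m_enc = m_disc * (1.0 - (1.0 + x) * math.exp(-x))
    return math.sqrt(G * m_enc / r_kpc)

def v_halo(r_kpc, rho0=1.0e7, r_c=5.0):
    """Pseudo-isothermal halo: M(<r) = 4 pi rho0 rc^2 (r - rc atan(r/rc))."""
    m_enc = 4.0 * math.pi * rho0 * r_c**2 * (r_kpc - r_c * math.atan(r_kpc / r_c))
    return math.sqrt(G * m_enc / r_kpc)

def v_total(r_kpc):
    # velocities add in quadrature since the squared speeds are G M(<r)/r
    return math.sqrt(v_disc(r_kpc)**2 + v_halo(r_kpc)**2)

for r in (2.0, 8.0, 20.0):
    print(r, round(v_disc(r)), round(v_total(r)))   # disc falls; total stays flat
```

With these (assumed) parameters the disc-only speed drops by roughly a quarter between 8 and 20 kpc while the combined curve falls by only about a tenth, which is the flatness the abstract attributes to dark matter.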

  15. Pedagogical support of competence formation: methodological bases and experimental context

    OpenAIRE

    NABIEV VALERY SHARIFYANOVICH

    2016-01-01

The article considers the methodological basis of the competence approach. It discusses topical issues in organizing a holistic educational process. The article presents the author's original solutions and the results of experimental verification of the specified conditions for the pedagogical support of educational and training activities.

  16. Inclusion of products of physicochemical oxidation of organic wastes in matter recycling of biological-technical life support systems.

    Science.gov (United States)

    Tikhomirov, Alexander A.; Kudenko, Yurii; Trifonov, Sergei; Ushakova, Sofya

The inclusion of the products of 'wet' incineration of human and plant wastes in an H2O2 medium using alternating current into the matter recycling of a biological-technical life support system (BTLSS) has been considered. Fluid and gaseous components were shown to be the products of such processing. In particular, the final product contained all the nitrogen forms necessary for plant cultivation: NO2, NO3 and NH4+. When the base solution included urine, the NH4+ form dominated. At mineralization of human solid wastes, NO2 and NH4+ were registered in approximately equal amounts. A comparative analysis of the mineral composition of oxidized human wastes and the standard Knop solution has been carried out. On the basis of that analysis, methods are suggested for diluting solutions prepared with the addition of oxidized human wastes for further use in plant irrigation. Reasonable levels of productivity were obtained for wheat cultivated on these solutions. CO2, N2 and O2 were determined to be the main components of the gas mixture emitted during the process; these gases integrate easily into the matter recycling process of a closed ecosystem. Data were obtained on the feasibility of cultivating plants in the atmosphere that results from closing a gas loop comprising the physicochemical facility and a vegetation chamber with plant representatives of the LSS phototrophic unit. A conclusion is drawn on advancing research toward a matter recycling process in the integrated physical-chemical-biological model system.
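The dilution step mentioned above reduces to the standard C1·V1 = C2·V2 relation. A trivial helper; the concentrations in the example are hypothetical, not the paper's measured values:

```python
# Dilution arithmetic: how much stock solution is needed to hit a target
# concentration in a given batch volume (C1*V1 = C2*V2).

def dilution_volume(c_stock, c_target, v_target):
    """Volume of stock needed to prepare v_target at c_target."""
    return v_target * c_target / c_stock

# e.g. a hypothetical oxidized-waste stock with 800 mg/L nitrate, diluted
# toward a Knop-like 230 mg/L target in a 10 L irrigation batch:
print(dilution_volume(800.0, 230.0, 10.0))  # → 2.875 (litres of stock)
```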

  17. Integrating cost information with health management support system: an enhanced methodology to assess health care quality drivers.

    Science.gov (United States)

    Kohli, R; Tan, J K; Piontek, F A; Ziege, D E; Groot, H

    1999-08-01

    Changes in health care delivery, reimbursement schemes, and organizational structure have required health organizations to manage the costs of providing patient care while maintaining high levels of clinical and patient satisfaction outcomes. Today, cost information, clinical outcomes, and patient satisfaction results must become more fully integrated if strategic competitiveness and benefits are to be realized in health management decision making, especially in multi-entity organizational settings. Unfortunately, traditional administrative and financial systems are not well equipped to cater to such information needs. This article presents a framework for the acquisition, generation, analysis, and reporting of cost information with clinical outcomes and patient satisfaction in the context of evolving health management and decision-support system technology. More specifically, the article focuses on an enhanced costing methodology for determining and producing improved, integrated cost-outcomes information. Implementation issues and areas for future research in cost-information management and decision-support domains are also discussed.

  18. Workshops as a Research Methodology

    DEFF Research Database (Denmark)

    Ørngreen, Rikke; Levinsen, Karin Tweddell

    2017-01-01

This paper contributes to knowledge on workshops as a research methodology, and specifically on how such workshops pertain to e-learning. A literature review illustrated that workshops are discussed according to three different perspectives: workshops as a means, workshops as practice, and workshops as a research methodology. Focusing primarily on the latter, this paper presents five studies on upper secondary and higher education teachers' professional development and on teaching and learning through video conferencing. Through analysis and discussion of these studies' findings, we argue that workshops provide a platform that can aid researchers in identifying and exploring relevant factors in a given domain by providing means for understanding complex work and knowledge processes that are supported by technology (for example, e-learning). The approach supports identifying factors…

  19. Developing knowledge management systems with an active expert methodology

    International Nuclear Information System (INIS)

    Sandahl, K.

    1992-01-01

    Knowledge management, understood as the ability to store, distribute and utilize human knowledge in an organization, is the subject of this dissertation. In particular we have studied the design of methods and supporting software for this process. Detailed and systematic description of the design and development processes of three case-study implementations of knowledge management software are provided. The outcome of the projects is explained in terms of an active expert development methodology, which is centered around support for a domain expert to take substantial responsibility for the design and maintenance of a knowledge management system in a given area of application. Based on the experiences from the case studies and the resulting methodology, an environment for automatically supporting knowledge management was designed in the KNOWLEDGE-LINKER research project. The vital part of this architecture is a knowledge acquisition tool, used directly by the experts in creating and maintaining a knowledge base. An elaborated version of the active expert development methodology was then formulated as the result of applying the KNOWLEDGE-LINKER approach in a fourth case study. This version of the methodology is also accounted for and evaluated together within the supporting KNOWLEDGE-LINKER architecture. (au)

  20. Proposed Objective Odor Control Test Methodology for Waste Containment

    Science.gov (United States)

    Vos, Gordon

    2010-01-01

The Orion Cockpit Working Group has requested that an odor control testing methodology be proposed to evaluate the odor containment effectiveness of waste disposal bags to be flown on the Orion Crew Exploration Vehicle. As a standardized "odor containment" test does not appear to be a matter of record for the project, a new test method is being proposed. This method is based on existing test methods used in industrial hygiene for the evaluation of respirator fit in occupational settings, and takes into consideration peer-reviewed documentation of human odor thresholds for standardized contaminants, industry-standard atmospheric testing methodologies, and established criteria for laboratory analysis. The proposed methodology is quantitative, though it can readily be complemented with a qualitative subjective assessment. Isoamyl acetate (IAA, also known as isopentyl acetate) is commonly used in respirator fit testing, and there are documented methodologies for measuring its airborne concentration quantitatively. IAA is a clear, colorless liquid with a banana-like odor, a documented human odor detection threshold of 0.025 ppm, and a limit of quantitation of 15 ppb.
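For laboratory work, the ppm thresholds quoted above are usually converted to mass concentrations. A quick unit check under the ideal-gas assumption (25 °C, 1 atm, molar volume ≈ 24.45 L/mol):

```python
# Convert a gas-phase concentration from ppm (by volume) to mg/m^3:
# mg/m^3 = ppm * molecular_weight / molar_volume.

MOLAR_VOLUME_L = 24.45   # L/mol for an ideal gas at 25 degC, 1 atm
MW_IAA = 130.19          # g/mol, isoamyl acetate (C7H14O2)

def ppm_to_mg_m3(ppm, mw_g_mol, molar_volume_l=MOLAR_VOLUME_L):
    return ppm * mw_g_mol / molar_volume_l

# the 0.025 ppm odor threshold corresponds to roughly 0.13 mg/m^3:
print(round(ppm_to_mg_m3(0.025, MW_IAA), 3))  # → 0.133
```

The same helper converts the 15 ppb quantitation limit, confirming that instrument sensitivity sits well below the human detection threshold.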

  1. Nucleation in Polymers and Soft Matter

    Science.gov (United States)

    Xu, Xiaofei; Ting, Christina L.; Kusaka, Isamu; Wang, Zhen-Gang

    2014-04-01

    Nucleation is a ubiquitous phenomenon in many physical, chemical, and biological processes. In this review, we describe recent progress on the theoretical study of nucleation in polymeric fluids and soft matter, including binary mixtures (polymer blends, polymers in poor solvents, compressible polymer-small molecule mixtures), block copolymer melts, and lipid membranes. We discuss the methodological development for studying nucleation as well as novel insights and new physics obtained in the study of the nucleation behavior in these systems.
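For reference, the classical nucleation theory (CNT) expressions that the polymer and soft-matter theories in this review refine: critical radius r* = 2γ/Δμ_v and barrier ΔG* = 16πγ³/(3Δμ_v²) for a spherical nucleus. The numerical example below uses water-like values purely as an assumption:

```python
# Classical nucleation theory for a spherical droplet: balance of surface
# cost (4 pi r^2 gamma) against bulk gain (-(4/3) pi r^3 dmu_v).
import math

def cnt_barrier(gamma, dmu_v):
    """gamma: surface tension (J/m^2); dmu_v: driving force per unit
    volume (J/m^3). Returns (critical radius in m, barrier in J)."""
    r_star = 2.0 * gamma / dmu_v
    dg_star = 16.0 * math.pi * gamma**3 / (3.0 * dmu_v**2)
    return r_star, dg_star

# water-like numbers: gamma ~ 0.072 J/m^2, dmu_v ~ 1e8 J/m^3
r, dg = cnt_barrier(0.072, 1.0e8)
print(r, dg)   # nanometre-scale critical radius, barrier of order 100 kT
```

In the polymeric systems reviewed, γ and Δμ_v themselves become composition- and chain-length-dependent, which is where the field-theoretic treatments depart from this textbook baseline.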

  2. UNSAPONIFIABLE MATTER FROM SUGAR CANE FILTER CAKE USING ETHANOL AS SOLVENT

    Directory of Open Access Journals (Sweden)

    Inés María San Anastacio Rebollar

    2016-07-01

In this paper, we propose a methodology for obtaining unsaponifiable matter from sugar cane filter cake in which only 96 °GL ethanol is used as solvent. The wax is extracted from the mud with ethanol (96 ºGL purity) by means of a leaching process using a mud/ethanol ratio of 0.05 kg/L at 70 ºC, atmospheric pressure, an agitation speed of 700 rpm and an extraction time of 2.5 hours. Under these conditions an extraction yield of 86.21% is obtained. The extract then reacts with alcoholic NaOH at 70 ºC for 75 minutes at atmospheric pressure with shaking at 200 rpm. The proposed methodology yields 1.942 g of impure unsaponifiable matter starting from 50 g of mud and 1.05 L of 96 °GL ethanol.
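A back-of-the-envelope consistency check on the figures quoted above, using only the abstract's own numbers:

```python
# Check that 50 g of mud in 1.05 L of ethanol matches the stated
# 0.05 kg/L ratio, and compute the unsaponifiable-matter yield per mass.

mud_g = 50.0
ethanol_l = 1.05
unsap_g = 1.942

ratio_kg_per_l = (mud_g / 1000.0) / ethanol_l
yield_pct = 100.0 * unsap_g / mud_g

print(round(ratio_kg_per_l, 3))   # → 0.048, close to the stated 0.05 kg/L
print(round(yield_pct, 2))        # → 3.88 (% unsaponifiable matter by mass)
```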

  3. Methodological guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-04-01

The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs.

  4. Methodological guidelines

    International Nuclear Information System (INIS)

    Halsnaes, K.; Callaway, J.M.; Meyer, H.J.

    1999-01-01

The guideline document establishes a general overview of the main components of climate change mitigation assessment. This includes an outline of key economic concepts, scenario structure, common assumptions, modelling tools and country study assumptions. The guidelines are supported by Handbook Reports that contain more detailed specifications of calculation standards, input assumptions and available tools. The major objectives of the project have been to provide a methodology, an implementing framework and a reporting system which countries can follow in meeting their future reporting obligations under the FCCC and for GEF enabling activities. The project builds upon the methodology development and application in the UNEP National Abatement Costing Studies (UNEP, 1994a). The various elements provide countries with a road map for conducting climate change mitigation studies and submitting national reports as required by the FCCC. (au) 121 refs

  5. Recycling Piaget: Posthumanism and Making Children's Knowledge Matter

    Science.gov (United States)

    Aslanian, Teresa K.

    2018-01-01

    A growing body of research incorporates children's perspectives into the research process. If we are to take children's perspectives seriously in education research, research methodologies must be capable of addressing issues that matter to children. This article engages in a theoretical discussion that considers how a posthuman research…

  6. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information, along with references, that supports the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing these data is required prior to their use in quality-affecting activities and in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data

  7. Creativity in phenomenological methodology

    DEFF Research Database (Denmark)

    Dreyer, Pia; Martinsen, Bente; Norlyk, Annelise

    2014-01-01

    Nursing research is often concerned with lived experiences in human life using phenomenological and hermeneutic approaches. These empirical studies may use different creative expressions and art-forms to describe and enhance an embodied and personalised understanding of lived experiences. Drawing on the methodologies of van Manen, Dahlberg, Lindseth & Norberg, the aim of this paper is to argue that the increased focus on creativity and arts in research methodology is valuable to gain a deeper insight into lived experiences. We illustrate this point through examples from empirical nursing studies, and discuss how this may support a respectful renewal of phenomenological research traditions in nursing research.

  8. Hot Strange Hadronic Matter in an Effective Model

    Science.gov (United States)

    Qian, Wei-Liang; Su, Ru-Keng; Song, Hong-Qiu

    2003-10-01

    An effective model used to describe strange hadronic matter with nucleons, Λ-hyperons, and Ξ-hyperons is extended to finite temperature. The extended model is used to study the density, temperature, and strangeness fraction dependence of the effective masses of baryons in the matter. The thermodynamical quantities, such as free energy and pressure, as well as the equation of state of the matter, are given. This work was supported in part by the National Natural Science Foundation of China under Grant Nos. 10075071, 10047005, 19947001, 19975010, and 10235030, the CAS Knowledge Innovation Project No. KJCX2-N11, the State Key Basic Research Development Program under Grant No. G200077400, and the Exploration Project of the Knowledge Innovation Program of the Chinese Academy of Sciences

  9. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The subject of research presented in this paper is the definition of a methodology for the development of credit analysis in companies and its application in lending operations in the Republic of Serbia. With the developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction presents the bank's analysis of the loan applicant, carried out in order to minimize and manage credit risk. In examining the subject matter, the processing of the credit application is described, along with the procedure for analyzing the financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework applied in a concrete company is presented. In the third part, models are presented which banks should use to protect themselves against risk exposure, i.e. to reduce losses on loan operations in our country and to adjust to market conditions in an optimal way.

  10. Definition of supportive care: does the semantic matter?

    Science.gov (United States)

    Hui, David

    2014-07-01

    'Supportive care' is a commonly used term in oncology; however, no consensus definition exists. This represents a barrier to communication in both the clinical and research settings. In this review, we propose a unifying conceptual framework for supportive care and discuss the proper use of this term in the clinical and research settings. A recent systematic review revealed several themes for supportive care: a focus on symptom management and improvement of quality of life, and care for patients on treatments and those with advanced stage disease. These findings are consistent with a broad definition for supportive care: 'the provision of the necessary services for those living with or affected by cancer to meet their informational, emotional, spiritual, social, or physical needs during their diagnostic, treatment, or follow-up phases encompassing issues of health promotion and prevention, survivorship, palliation, and bereavement.' Supportive care can be classified as primary, secondary, and tertiary based on the level of specialization. For example, palliative care teams provide secondary supportive care for patients with advanced cancer. Until a consensus definition is available for supportive care, this term should be clearly defined or cited whenever it is used.

  11. Unlocking color and flavor in superconducting strange quark matter

    International Nuclear Information System (INIS)

    Alford, Mark; Berges, Juergen; Rajagopal, Krishna

    1999-01-01

    We explore the phase diagram of strongly interacting matter with massless u and d quarks as a function of the strange quark mass m_s and the chemical potential μ for baryon number. Neglecting electromagnetism, we describe the different baryonic and quark matter phases at zero temperature. For quark matter, we support our model-independent arguments with a quantitative analysis of a model which uses a four-fermion interaction abstracted from single-gluon exchange. For any finite m_s, at sufficiently large μ we find quark matter in a color-flavor-locked state which leaves a global vector-like SU(2)_{color+L+R} symmetry unbroken. As a consequence, chiral symmetry is always broken in sufficiently dense quark matter. As the density is reduced, for sufficiently large m_s we observe a first-order transition from the color-flavor-locked phase to a color superconducting phase analogous to that in two-flavor QCD. At this unlocking transition chiral symmetry is restored. For realistic values of m_s our analysis indicates that chiral symmetry breaking may be present for all densities down to those characteristic of baryonic matter. This supports the idea that quark matter and baryonic matter may be continuously connected in nature. We map the gaps at the quark Fermi surfaces in the high density color-flavor-locked phase onto gaps at the baryon Fermi surfaces at low densities

  12. Science That Matters: Exploring Science Learning and Teaching in Primary Schools

    Science.gov (United States)

    Fitzgerald, Angela; Smith, Kathy

    2016-01-01

    To help support primary school students to better understand why science matters, teachers must first be supported to teach science in ways that matter. In moving to this point, this paper identifies the dilemmas and tensions primary school teachers face in the teaching of science. The balance is then readdressed through a research-based…

  13. Behavioural and Autonomic Regulation of Response to Sensory Stimuli among Children: A Systematic Review of Relationship and Methodology.

    Science.gov (United States)

    Gomez, Ivan Neil; Lai, Cynthia Y Y; Morato-Espino, Paulin Grace; Chan, Chetwyn C H; Tsang, Hector W H

    2017-01-01

    Previous studies have explored the correlates of behavioural and autonomic regulation of response to sensory stimuli in children; however, a comprehensive review of such a relationship is lacking. This systematic review was performed to critically appraise the current evidence on this relationship and describe the methods used in these studies. Online databases were systematically searched for peer-reviewed, full-text articles in the English language between 1999 and 2016, initially screened by title and abstract, and appraised and synthesized by two independent review authors. Fourteen Level III-3 cross-sectional studies were included for systematic review, among which six studies explored the relationship between behavioural and physiological regulation of responses to sensory stimuli. Three studies reported significant positive weak correlations among ASD children; however, no correlations were found in typically developing children. Methodological differences related to individual differences among participants, the measures used, and varied laboratory experimental settings were noted. This review suggests inconclusive evidence supporting the relationship between behavioural and physiological regulation of responses to sensory stimuli among children. Methodological differences may have confounded the results of the current evidence. We present methodological recommendations to address this matter for future research. This trial is registered with PROSPERO registration number CRD42016043887.

  14. Behavioural and Autonomic Regulation of Response to Sensory Stimuli among Children: A Systematic Review of Relationship and Methodology

    Directory of Open Access Journals (Sweden)

    Ivan Neil Gomez

    2017-01-01

    Full Text Available Background. Previous studies have explored the correlates of behavioural and autonomic regulation of response to sensory stimuli in children; however, a comprehensive review of such a relationship is lacking. This systematic review was performed to critically appraise the current evidence on this relationship and describe the methods used in these studies. Methods. Online databases were systematically searched for peer-reviewed, full-text articles in the English language between 1999 and 2016, initially screened by title and abstract, and appraised and synthesized by two independent review authors. Results. Fourteen Level III-3 cross-sectional studies were included for systematic review, among which six studies explored the relationship between behavioural and physiological regulation of responses to sensory stimuli. Three studies reported significant positive weak correlations among ASD children; however, no correlations were found in typically developing children. Methodological differences related to individual differences among participants, the measures used, and varied laboratory experimental settings were noted. Conclusion. This review suggests inconclusive evidence supporting the relationship between behavioural and physiological regulation of responses to sensory stimuli among children. Methodological differences may have confounded the results of the current evidence. We present methodological recommendations to address this matter for future research. This trial is registered with PROSPERO registration number CRD42016043887.

  15. A conceptual methodology to design a decision support system to leak detection programs in water networks

    International Nuclear Information System (INIS)

    Di Federico, V.; Bottarelli, M.; Di Federico, I.

    2005-01-01

    The paper outlines a conceptual methodology to develop a decision support system to assist technicians managing water networks in selecting the appropriate leak detection method(s). First, the necessary knowledge about the network is recapitulated: location and characteristics of its physical components, but also water demand, breaks in pipes, and water quality data. Second, the water balance in a typical Italian Agency is discussed, suggesting methods and procedures to evaluate and/or estimate each term in the mass balance equation. Then the available methods for leak detection are described in detail, from those useful in the pre-localization phase to those commonly adopted to pinpoint pipe failures and allow rapid repair. Criteria to estimate the costs associated with each of these methods are provided. Finally, the proposed structure of the DSS is described.
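A toy version of the water-balance bookkeeping described above might look like the following sketch; the category names follow common IWA water-balance usage and every volume is a made-up annual figure, not data from the paper.

```python
# Hypothetical annual volumes for a small network [m^3/year]; the split into
# billed/unbilled authorized consumption and apparent losses mirrors the terms
# that the mass balance equation has to evaluate or estimate.
system_input = 1_200_000
billed_authorized = 850_000
unbilled_authorized = 40_000   # e.g. flushing, firefighting (estimated)
apparent_losses = 60_000       # metering errors, unauthorized use (estimated)

# Non-revenue water: everything that entered the system but was not billed
non_revenue_water = system_input - billed_authorized

# Real losses (leakage) fall out as the residual of the balance
real_losses = (system_input - billed_authorized
               - unbilled_authorized - apparent_losses)

print(f"non-revenue water: {non_revenue_water:,} m^3 "
      f"({non_revenue_water / system_input:.0%} of input)")
print(f"estimated real losses (leakage): {real_losses:,} m^3")
```

The leakage figure obtained this way is what a pre-localization campaign would then try to narrow down to individual district metered areas.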

  16. Nucleation of strange matter in dense stellar cores

    International Nuclear Information System (INIS)

    Horvath, J.E.; Benvenuto, O.G.; Vucetich, H.

    1992-01-01

    We investigate the nucleation of strange quark matter inside hot, dense nuclear matter. Applying Zel'dovich's kinetic theory of nucleation we find a lower limit of the temperature T for strange-matter bubbles to appear, which happens to be satisfied inside the Kelvin-Helmholtz cooling era of a compact star life but not much after it. Our bounds thus suggest that a prompt conversion could be achieved, giving support to earlier expectations for nonstandard type-II supernova scenarios

  17. Dark Matter Benchmark Models for Early LHC Run-2 Searches: Report of the ATLAS/CMS Dark Matter Forum

    CERN Document Server

    Abercrombie, Daniel; Akilli, Ece; Alcaraz Maestre, Juan; Allen, Brandon; Alvarez Gonzalez, Barbara; Andrea, Jeremy; Arbey, Alexandre; Azuelos, Georges; Azzi, Patrizia; Backovic, Mihailo; Bai, Yang; Banerjee, Swagato; Beacham, James; Belyaev, Alexander; Boveia, Antonio; Brennan, Amelia Jean; Buchmueller, Oliver; Buckley, Matthew R.; Busoni, Giorgio; Buttignol, Michael; Cacciapaglia, Giacomo; Caputo, Regina; Carpenter, Linda; Filipe Castro, Nuno; Gomez Ceballos, Guillelmo; Cheng, Yangyang; Chou, John Paul; Cortes Gonzalez, Arely; Cowden, Chris; D'Eramo, Francesco; De Cosa, Annapaola; De Gruttola, Michele; De Roeck, Albert; De Simone, Andrea; Deandrea, Aldo; Demiragli, Zeynep; DiFranzo, Anthony; Doglioni, Caterina; du Pree, Tristan; Erbacher, Robin; Erdmann, Johannes; Fischer, Cora; Flaecher, Henning; Fox, Patrick J.; Fuks, Benjamin; Genest, Marie-Helene; Gomber, Bhawna; Goudelis, Andreas; Gramling, Johanna; Gunion, John; Hahn, Kristian; Haisch, Ulrich; Harnik, Roni; Harris, Philip C.; Hoepfner, Kerstin; Hoh, Siew Yan; Hsu, Dylan George; Hsu, Shih-Chieh; Iiyama, Yutaro; Ippolito, Valerio; Jacques, Thomas; Ju, Xiangyang; Kahlhoefer, Felix; Kalogeropoulos, Alexis; Kaplan, Laser Seymour; Kashif, Lashkar; Khoze, Valentin V.; Khurana, Raman; Kotov, Khristian; Kovalskyi, Dmytro; Kulkarni, Suchita; Kunori, Shuichi; Kutzner, Viktor; Lee, Hyun Min; Lee, Sung-Won; Liew, Seng Pei; Lin, Tongyan; Lowette, Steven; Madar, Romain; Malik, Sarah; Maltoni, Fabio; Martinez Perez, Mario; Mattelaer, Olivier; Mawatari, Kentarou; McCabe, Christopher; Megy, Theo; Morgante, Enrico; Mrenna, Stephen; Narayanan, Siddharth M.; Nelson, Andy; Novaes, Sergio F.; Padeken, Klaas Ole; Pani, Priscilla; Papucci, Michele; Paulini, Manfred; Paus, Christoph; Pazzini, Jacopo; Penning, Bjorn; Peskin, Michael E.; Pinna, Deborah; Procura, Massimiliano; Qazi, Shamona F.; Racco, Davide; Re, Emanuele; Riotto, Antonio; Rizzo, Thomas G.; Roehrig, Rainer; Salek, David; Sanchez Pineda, Arturo; Sarkar, Subir; 
Schmidt, Alexander; Schramm, Steven Randolph; Shepherd, William; Singh, Gurpreet; Soffi, Livia; Srimanobhas, Norraphat; Sung, Kevin; Tait, Tim M.P.; Theveneaux-Pelzer, Timothee; Thomas, Marc; Tosi, Mia; Trocino, Daniele; Undleeb, Sonaina; Vichi, Alessandro; Wang, Fuquan; Wang, Lian-Tao; Wang, Ren-Jie; Whallon, Nikola; Worm, Steven; Wu, Mengqing; Wu, Sau Lan; Yang, Hongtao; Yang, Yong; Yu, Shin-Shan; Zaldivar, Bryan; Zanetti, Marco; Zhang, Zhiqing; Zucchetta, Alberto

    2015-01-01

    This document is the final report of the ATLAS-CMS Dark Matter Forum, a forum organized by the ATLAS and CMS collaborations with the participation of experts on theories of Dark Matter, to select a minimal basis set of dark matter simplified models that should support the design of the early LHC Run-2 searches. A prioritized, compact set of benchmark models is proposed, accompanied by studies of the parameter space of these models and a repository of generator implementations. This report also addresses how to apply the Effective Field Theory formalism for collider searches and presents the results of such interpretations.

  18. Estimation of the laser cutting operating cost by support vector regression methodology

    Science.gov (United States)

    Jović, Srđan; Radović, Aleksandar; Šarkoćević, Živče; Petković, Dalibor; Alizamir, Meysam

    2016-09-01

    Laser cutting is a popular manufacturing process utilized to cut various types of materials economically. The operating cost is affected by laser power, cutting speed, assist gas pressure, nozzle diameter and focus point position, as well as the workpiece material. In this article, the process factors investigated were laser power, cutting speed, air pressure and focal point position. The aim of this work is to relate the operating cost to the process parameters mentioned above. CO2 laser cutting of stainless steel of medical grade AISI316L has been investigated. The main goal was to analyze the operating cost through the laser power, cutting speed, air pressure, focal point position and material thickness. Since estimating the laser operating cost is a complex, non-linear task, soft computing optimization algorithms can be used. The intelligent soft computing scheme support vector regression (SVR) was implemented. The performance of the proposed estimator was confirmed with the simulation results. The SVR results are then compared with artificial neural network and genetic programming approaches. According to the results, a greater improvement in estimation accuracy can be achieved through SVR than through the other soft computing methodologies. The new optimization methods benefit from the soft computing capabilities of global and multiobjective optimization, rather than relying on a trial-and-error starting point or on collapsing multiple criteria into a single criterion.
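As a rough illustration of the SVR approach (not the authors' actual dataset, kernel settings, or cost model), one could fit scikit-learn's SVR to synthetic process-factor data and check held-out accuracy:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)

# Hypothetical process factors: laser power [W], cutting speed [mm/min],
# assist gas pressure [bar], focal point position [mm], thickness [mm]
X = rng.uniform([800, 1000, 6, -2, 1], [2000, 4000, 14, 2, 6], size=(200, 5))

# Stand-in operating-cost response with mild noise (placeholder, not the
# paper's measured costs)
y = (0.002 * X[:, 0] - 0.0005 * X[:, 1] + 0.1 * X[:, 2]
     + 0.05 * X[:, 3] ** 2 + 0.3 * X[:, 4] + rng.normal(0, 0.05, 200))

# Standardize inputs, then fit an RBF-kernel SVR on 150 training samples
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.01))
model.fit(X[:150], y[:150])

# Held-out accuracy: R^2 on the remaining 50 samples
r2 = model.score(X[150:], y[150:])
print(f"test R^2 = {r2:.3f}")
```

In a study like this one, the same held-out comparison would be repeated for the neural network and genetic programming estimators.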

  19. An integrated probabilistic risk analysis decision support methodology for systems with multiple state variables

    International Nuclear Information System (INIS)

    Sen, P.; Tan, John K.G.; Spencer, David

    1999-01-01

    Probabilistic risk analysis (PRA) methods have been proven to be valuable in risk and reliability analysis. However, a weak link seems to exist between methods for analysing risks and those for making rational decisions. The integrated decision support system (IDSS) methodology presented in this paper attempts to address this issue in a practical manner. It consists of three phases: a PRA phase, a risk sensitivity analysis (SA) phase and an optimisation phase, which are implemented through an integrated computer software system. In the risk analysis phase the problem is analysed by the Boolean representation method (BRM), a PRA method that can deal with systems with multiple state variables and feedback loops. In the second phase the results obtained from the BRM are utilised directly to perform importance and risk SA. In the third phase, the problem is formulated as a multiple objective decision making problem in the form of multiple objective reliability optimisation. An industrial example is included. The resultant solutions of a five-objective reliability optimisation are presented, on the basis of which rational decision making can be explored
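The optimisation phase can be hinted at with a toy Pareto filter over hypothetical candidate designs trading failure probability against cost; the BRM analysis itself and the paper's five-objective formulation are not reproduced here.

```python
# Hypothetical candidate system designs: (probability of failure, relative cost).
# Both objectives are to be minimised.
designs = {
    "A": (0.020, 1.0),
    "B": (0.015, 1.4),
    "C": (0.015, 1.2),
    "D": (0.008, 2.5),
    "E": (0.025, 1.1),
}

def dominated(x, others):
    """A design is dominated if some other design is at least as good on
    both objectives and different from it."""
    return any(o[0] <= x[0] and o[1] <= x[1] and o != x for o in others)

# Keep only the non-dominated (Pareto-optimal) designs
pareto = sorted(name for name, obj in designs.items()
                if not dominated(obj, designs.values()))
print("Pareto-optimal designs:", pareto)
```

Rational decision making then reduces to choosing among the Pareto set, e.g. design A (cheap, least reliable of the set) versus D (reliable, expensive).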

  20. Microbial chlorination of organic matter in forest soil: investigation using 36Cl-chloride and its methodology.

    Science.gov (United States)

    Rohlenová, J; Gryndler, M; Forczek, S T; Fuksová, K; Handova, V; Matucha, M

    2009-05-15

    Chloride, which comes into the forest ecosystem largely from the sea as aerosol (and has in the past been assumed to be inert), causes chlorination of soil organic matter. Studies of the chlorination showed that the content of organically bound chlorine in temperate forest soils is higher than that of chloride, and various chlorinated compounds are produced. Our study of the chlorination of organic matter in the fermentation horizon of forest soil, using the radioisotope 36Cl and tracer techniques, shows that microbial chlorination clearly prevails over abiotic chlorination, with the chlorination of soil organic matter being enzymatically mediated and proportional to chloride content and time. Long-term (>100 days) chlorination leads to more stable chlorinated substances contained in the organic layer of forest soil (over time, chlorine is bound progressively more firmly in humic acids) and volatile organochlorines are formed. Penetration of chloride into microorganisms can be documented by the freezing/thawing technique. Chloride absorption in microorganisms in soil and in litter residues in the fermentation horizon complicates the analysis of 36Cl-chlorinated soil. The results show that the analytical procedure used should be tested for every soil type under study.

  1. Search for supporting methodologies - Or how to support SEI for 35 years

    Science.gov (United States)

    Handley, Thomas H., Jr.; Masline, Richard C.

    1991-01-01

    Concepts relevant to the development of an evolvable information management system are examined in terms of support for the Space Exploration Initiative. The issues of interoperability within NASA and industry initiatives are studied, including the Open Systems Interconnection standard and the operating system of the Open Software Foundation. The requirements of partitioning functionality into separate areas are determined, with attention given to the infrastructure required to ensure system-wide compliance. The need for a decision-making context is a key to the distributed implementation of the program, and this environment is concluded to be the next step in developing an evolvable, interoperable, and securable support network.

  2. [Extraction Optimization of Rhizome of Curcuma longa by Response Surface Methodology and Support Vector Regression].

    Science.gov (United States)

    Zhou, Pei-pei; Shan, Jin-feng; Jiang, Jian-lan

    2015-12-01

    To optimize the microwave-assisted extraction of curcuminoids from Curcuma longa. On the basis of a single-factor experiment, the ethanol concentration, the ratio of liquid to solid and the microwave time were selected for further optimization. Support Vector Regression (SVR) and Central Composite Design-Response Surface Methodology (CCD) algorithms were utilized to design and establish models respectively, while Particle Swarm Optimization (PSO) was introduced to optimize the parameters of the SVR models and to search for the optimal points of the models. The sum of curcumin, demethoxycurcumin and bisdemethoxycurcumin, determined by HPLC, was used as the evaluation indicator. The optimal parameters of microwave-assisted extraction were as follows: ethanol concentration of 69%, ratio of liquid to solid of 21:1, microwave time of 55 s. Under those conditions, the sum of the three curcuminoids was 28.97 mg/g (per gram of rhizome powder). Both the CCD model and the SVR model were credible, as they predicted similar process conditions and the deviation in yield was less than 1.2%.
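A minimal PSO search over a stand-in response surface might look as follows; the quadratic surface below is a hypothetical placeholder whose optimum is placed at the conditions quoted in the abstract, not the paper's fitted CCD or SVR model.

```python
import numpy as np

rng = np.random.default_rng(1)

def yield_model(x):
    """Stand-in response surface peaking at ethanol 69 %, ratio 21:1,
    time 55 s (values taken from the abstract); scales are arbitrary."""
    opt = np.array([69.0, 21.0, 55.0])
    scale = np.array([10.0, 5.0, 20.0])
    return 28.97 - np.sum(((x - opt) / scale) ** 2, axis=-1)

# Search bounds: ethanol [%], liquid:solid ratio, microwave time [s]
lo, hi = np.array([40.0, 5.0, 10.0]), np.array([90.0, 40.0, 120.0])
n, dims, iters = 30, 3, 200

pos = rng.uniform(lo, hi, (n, dims))   # particle positions
vel = np.zeros((n, dims))              # particle velocities
pbest, pbest_val = pos.copy(), yield_model(pos)

for _ in range(iters):
    gbest = pbest[np.argmax(pbest_val)]            # swarm-best position
    r1, r2 = rng.random((n, dims)), rng.random((n, dims))
    # Standard PSO update: inertia + cognitive + social terms
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)
    val = yield_model(pos)
    improved = val > pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]

best = pbest[np.argmax(pbest_val)]
print("optimum found near:", np.round(best, 1))
```

The same swarm update can also tune SVR hyperparameters, with cross-validated error taking the place of `yield_model`.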

  3. Hanford Site baseline risk assessment methodology

    International Nuclear Information System (INIS)

    1993-03-01

    This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site

  4. Methodology for performing surveys for fixed contamination

    International Nuclear Information System (INIS)

    Durham, J.S.; Gardner, D.L.

    1994-10-01

    This report describes a methodology for performing instrument surveys for fixed contamination that can be used to support the release of material from radiological areas, including release to controlled areas and release from radiological control. The methodology, which is based on a fast scan survey and a series of statistical, fixed measurements, meets the requirements of the U.S. Department of Energy Radiological Control Manual (RadCon Manual) (DOE 1994) and DOE Order 5400.5 (DOE 1990) for surveys for fixed contamination and requires less time than a conventional scan survey. The confidence interval associated with the new methodology conforms to the draft national standard for surveys. The methodology that is presented applies only to surveys for fixed contamination. Surveys for removable contamination are not discussed, and the new methodology does not affect surveys for removable contamination
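One conventional way to implement the "series of statistical, fixed measurements" step is a one-sided upper confidence limit on the mean instrument reading; the sketch below uses made-up count data, a tabulated Student-t value, and an arbitrary release limit, not the report's actual formulas or limits.

```python
import math
from statistics import mean, stdev

# Hypothetical fixed measurements at randomly selected points [counts/min]
counts_cpm = [52, 48, 55, 50, 47, 53, 49, 51, 54, 46]
release_limit_cpm = 60.0   # illustrative release criterion

n = len(counts_cpm)
m, s = mean(counts_cpm), stdev(counts_cpm)

# Tabulated one-sided Student-t critical value, 95 % confidence, 9 d.o.f.
t_095_df9 = 1.833

# 95 % upper confidence limit on the mean count rate
ucl = m + t_095_df9 * s / math.sqrt(n)

verdict = "allowed" if ucl < release_limit_cpm else "not allowed"
print(f"mean = {m:.1f} cpm, 95% UCL = {ucl:.1f} cpm, release {verdict}")
```

Comparing the upper confidence limit, rather than the bare mean, to the limit is what gives the stated confidence that the surveyed item truly meets the release criterion.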

  5. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals
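The kind of risk-weighted pro forma calculation described can be sketched as follows; the discount rate, costs, benefits, and success probability are illustrative placeholders, not figures from the paper.

```python
def expected_npv(dev_cost, annual_benefit, support_cost, years, p_success, rate):
    """Expected NPV of a software proposal: development cost paid up front,
    then benefits net of support costs for `years` years, weighted by the
    probability of successful development and implementation."""
    pv_net = sum((annual_benefit - support_cost) / (1 + rate) ** t
                 for t in range(1, years + 1))
    return p_success * pv_net - dev_cost

# Hypothetical in-house software proposal, evaluated like a drilling prospect
npv = expected_npv(dev_cost=500_000, annual_benefit=300_000,
                   support_cost=60_000, years=5, p_success=0.7, rate=0.12)
print(f"expected NPV = ${npv:,.0f}")
```

Because the result is an expected NPV, it can be ranked on the same scale as the expected monetary value of competing exploration proposals.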

  6. New methodology for fast prediction of wheel wear evolution

    Science.gov (United States)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is the substitution of calculations over the whole network by the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is the substitution of dynamic calculations (time-integration calculations) by quasi-static calculations (solving the quasi-static situation of a vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies support the conclusion that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
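The characteristic-point idea can be caricatured with a toy wear estimate: accumulate wear at a handful of representative contact conditions instead of over every point of the route. The energy-based wear law, its coefficient, and the percentile-based choice of points below are all assumptions for illustration, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(7)

# "Full network": contact normal force [N] and creepage at many route points
forces = rng.uniform(40e3, 80e3, 5000)
slips = rng.uniform(0.001, 0.01, 5000)

def wear_increment(force, slip, k=1e-9):
    # Hypothetical energy-based wear law: removed material ~ k * force * slip
    return k * force * slip

# Reference: wear summed over every point of the route
full = wear_increment(forces, slips).sum()

# Characteristic-point shortcut: a few representative conditions, each
# weighted by the number of route points it stands in for
percentiles = [10, 30, 50, 70, 90]
f_rep = np.percentile(forces, percentiles)
s_rep = np.percentile(slips, percentiles)
approx = wear_increment(f_rep, s_rep).sum() * (len(forces) / len(percentiles))

err = abs(approx - full) / full
print(f"relative error of characteristic-point estimate: {err:.1%}")
```

Even this crude pairing of percentiles stays within roughly ten percent of the full sum while evaluating a thousandth of the points, which is the flavour of trade-off the paper quantifies properly.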

  7. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as the total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  8. Necessity of Dividing General Methodology of Psychology into a Number of Particular Trends of Meta-Scientific Research

    OpenAIRE

    D A Yatsenko

    2013-01-01

    The article states the illegitimacy of including problems and matters not directly connected with the methods of scientifically studying psychological phenomena in works on the methodology of psychology. It also argues for the necessity of separating such problems from the methodology of psychology and transferring them into different meta-psychological sciences: philosophy of psychology, logics of psychology, fundamental problem psychology, meta-theoretical psychology. In conclusi...

  9. Matter-antimatter and matter-matter interactions at intermediate energies

    International Nuclear Information System (INIS)

    Santos, Antonio Carlos Fontes dos

    2002-01-01

    This article presents some of the recent experimental advances in the study of antimatter-matter and matter-matter interactions, and shows how some of the subtle differences have stimulated great theoretical effort to explain the experimentally observed results

  10. Socioeconomic status, white matter, and executive function in children.

    Science.gov (United States)

    Ursache, Alexandra; Noble, Kimberly G

    2016-10-01

    A growing body of evidence links socioeconomic status (SES) to children's brain structure. Few studies, however, have specifically investigated relations of SES to white matter structure. Further, although several studies have demonstrated that family SES is related to development of brain areas that support executive functions (EF), less is known about the role that white matter structure plays in the relation of SES to EF. One possibility is that white matter differences may partially explain SES disparities in EF (i.e., a mediating relationship). Alternatively, SES may differentially shape brain-behavior relations such that the relation of white matter structure to EF may differ as a function of SES (i.e., a moderating relationship). In a diverse sample of 1082 children and adolescents aged 3-21 years, we examined socioeconomic disparities in white matter macrostructure and microstructure. We further investigated relations between family SES, children's white matter volume and integrity in tracts supporting EF, and performance on EF tasks. Socioeconomic status was associated with fractional anisotropy (FA) and volume in multiple white matter tracts. Additionally, family income moderated the relation between white matter structure and cognitive flexibility. Specifically, across multiple tracts of interest, lower FA or lower volume was associated with reduced cognitive flexibility among children from lower income families. In contrast, children from higher income families showed preserved cognitive flexibility in the face of low white matter FA or volume. SES factors did not mediate or moderate links between white matter and either working memory or inhibitory control. This work adds to a growing body of literature suggesting that the socioeconomic contexts in which children develop not only shape cognitive functioning and its underlying neurobiology, but may also shape the relations between brain and behavior.
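The income-moderation finding can be illustrated with a simulated interaction regression; the data, effect sizes, and sign of the interaction below are invented to mimic the reported pattern (FA matters more at low income), not the study's actual values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1000

income = rng.normal(0.0, 1.0, n)   # standardized family income
fa = rng.normal(0.0, 1.0, n)       # standardized fractional anisotropy

# Simulated outcome: a negative income x FA interaction means the FA slope
# shrinks as income rises, i.e. low FA hurts flexibility mainly at low income
flexibility = (0.2 * income + 0.3 * fa
               - 0.15 * income * fa + rng.normal(0, 1, n))

# OLS with an interaction term: y ~ 1 + income + fa + income*fa
X = np.column_stack([np.ones(n), income, fa, income * fa])
beta, *_ = np.linalg.lstsq(X, flexibility, rcond=None)
print("interaction coefficient ~", round(beta[3], 2))
```

A significantly negative interaction coefficient is the statistical signature of moderation: brain-behavior slopes that differ by socioeconomic context.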

  11. Methodology for Analyzing and Developing Information Management Infrastructure to Support Telerehabilitation

    Directory of Open Access Journals (Sweden)

    Andi Saptono

    2009-09-01

    Full Text Available The proliferation of advanced technologies led researchers within the Rehabilitation Engineering Research Center on Telerehabilitation (RERC-TR to devise an integrated infrastructure for clinical services using the University of Pittsburgh (PITT model. This model describes five required characteristics for a telerehabilitation (TR infrastructure: openness, extensibility, scalability, cost-effectiveness, and security. The infrastructure is to deliver clinical services over distance to improve access to health services for people living in underserved or remote areas. The methodological approach to design, develop, and employ this infrastructure is explained and detailed for the remote wheelchair prescription project, a research task within the RERC-TR. The availability of this specific clinical service and personnel outside of metropolitan areas is limited due to the lack of specialty expertise and access to resources. The infrastructure is used to deliver expertise in wheeled mobility and seating through teleconsultation to remote clinics, and has been successfully deployed to five rural clinics in Western Pennsylvania. Keywords: Telerehabilitation, Information Management, Infrastructure Development Methodology, Videoconferencing, Online Portal, Database

  12. Is It Necessary to Articulate a Research Methodology When Reporting on Theoretical Research?

    Directory of Open Access Journals (Sweden)

    Juliana Smith

    2017-05-01

    Full Text Available In this paper the authors share their insights on whether it is necessary to articulate a research methodology when reporting on theoretical research. Initially the authors, one being a supervisor and the other a PhD student and colleague, were confronted with the question during the supervision and writing of a thesis on theoretical research. Reflection on the external examiners’ reports about whether a research methodology for theoretical research is necessary prompted the writing of this paper. In order to answer the question, the characteristics of theoretical research are clarified, and contrasting views regarding the necessity of including a research methodology section in such a thesis are examined. The paper also highlights the justification for including a research methodology in a thesis that reports on theoretical research, investigates the soundness of such justification and finally draws conclusions on the matter.

  13. What Matters Most: Using High-Traction Instructional Strategies to Increase Student Success

    Science.gov (United States)

    Turner, Curtis

    2016-01-01

    What matters most when it comes to increasing achievement and student success in the developmental classroom? Recent reform efforts in developmental education have brought sweeping changes in some states. New curricular pathways, redesigned courses, and a handful of new instructional delivery methodologies have been the result. Although these are…

  14. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in an enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is a lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team-member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The methodology used is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. The Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  15. Status of the Gen-IV Proliferation Resistance and Physical Protection (PRPP) Evaluation Methodology

    International Nuclear Information System (INIS)

    Whitlock, J.; Bari, R.; Peterson, P.; Padoani, F.; Cojazzi, G.G.M.; Renda, G.; ); Cazalet, J.; Haas, E.; Hori, K.; Kawakubo, Y.; Chang, S.; Kim, H.; Kwon, E.-H.; Yoo, H.; Chebeskov, A.; Pshakin, G.; Pilat, J.F.; Therios, I.; Bertel, E.

    2015-01-01

    Methodologies have been developed within the Generation IV International Forum (GIF) to support the assessment and improvement of system performance in the areas of safeguards, security, economics and safety. Of these four areas, safeguards and security are the subjects of the GIF working group on Proliferation Resistance and Physical Protection (PRPP). Since the PRPP methodology (now at Revision 6) represents a mature, generic, and comprehensive evaluation approach, and is freely available on the GIF public website, several non-GIF technical groups have chosen to utilize the PRPP methodology for their own goals. Indeed, the results of the evaluations performed with the methodology are intended for three types of generic users: system designers, programme policy makers, and external stakeholders. The PRPP Working Group developed the methodology through a series of demonstration and case studies. In addition, over the past few years various national and international groups have applied the methodology to inform nuclear energy system designs, as well as to support the development of approaches to advanced safeguards. A number of international workshops have also been held which have introduced the methodology to design groups and other stakeholders. In this paper we summarize the technical progress and accomplishments of the PRPP evaluation methodology, including applications outside GIF, and we outline the PRPP methodology's relationship with the IAEA's INPRO methodology. Current challenges with the efficient implementation of the methodology are outlined, along with our path forward for increasing its accessibility to a broader stakeholder audience - including supporting the next generation of skilled professionals in the nuclear non-proliferation field. (author)

  16. CosmoQuest: Supporting Subject Matter Experts in Broadening the Impacts of their Work beyond their Institutional Walls.

    Science.gov (United States)

    Noel-Storr, J.; Buxner, S.; Grier, J.; Gay, P.

    2016-12-01

    CosmoQuest is a virtual research facility, which, like its physical counterparts, provides tools for scientists to acquire reduced data products (thanks to our cadre of citizen scientists working to analyze images and produce results online), and also to participate in education and outreach activities, either directly through CosmoQuest activities (such as CosmoAcademy and the Educators' Zone) or with the support of CosmoQuest. Here, we present our strategies to inspire, engage and support Subject Matter Experts (SMEs - Scientists, Engineers, Technologists and Mathematicians) in activities outside of their institutions, and beyond college classroom teaching. We provide support for SMEs who are interested in increasing the impacts of their science knowledge and expertise by interacting with people online, or in other venues outside of their normal work environment. This includes a broad spectrum of opportunities for those interested in hosting webinars; running short courses for the public; using Facebook, Twitter or other social media to communicate science; or other diverse activities such as supporting an open house, science fair, or star party. As noted by Katheryn Woods-Townsend and colleagues, "...face-to-face interactions with scientists allowed students to view scientists as approachable and normal people, and to begin to understand the range of scientific areas and careers that exist. Scientists viewed the scientist-student interactions as a vehicle for science communication" (2015). As CosmoQuest fosters these relationships, we present a framework for SMEs which combines opportunities for continuing professional development (virtually and in person at conferences) with ongoing online support, creating a dynamic professional learning network. The goal of this is to deepen SME capacity (knowledge, attitudes and behaviors), both encouraging and empowering them to connect to broader audiences in new ways.

  17. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM Panel has been divided into four teams: the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM Panel plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency for the facility under consideration, as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility-specific data, but this data is not provided by this document.

  18. Theory, Software and Testing Examples for Decision Support Systems

    OpenAIRE

    Lewandowski, A.; Wierzbicki, A.P.

    1987-01-01

    Research in methodology of Decision Support Systems is one of the activities within the System and Decision Sciences Program which was initiated seven years ago and is still in the center of interests of SDS. During these years several methodological approaches and software tools have been developed; among others the DIDAS (Dynamic Interactive Decision Analysis and Support) and SCDAS (Selection Committed Decision Analysis and Support). Both methodologies gained a certain level of popularity a...

  19. Eighteenth Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics

    CERN Document Server

    Landau, David P; Schüttler, Heinz-Bernd; Computer Simulation Studies in Condensed-Matter Physics XVIII

    2006-01-01

    This volume represents a "status report" emanating from presentations made during the 18th Annual Workshop on Computer Simulation Studies in Condensed Matter Physics at the Center for Simulational Physics at the University of Georgia in March 2005. It provides a broad overview of the most recent advances in the field, spanning the range from statistical physics to soft condensed matter and biological systems. Results on nanostructures and materials are included, as are several descriptions of advances in quantum simulations and quantum computing, as well as methodological advances.

  20. The methodological defense of realism scrutinized.

    Science.gov (United States)

    Wray, K Brad

    2015-12-01

    I revisit an older defense of scientific realism, the methodological defense, a defense developed by both Popper and Feyerabend. The methodological defense of realism concerns the attitude of scientists, not philosophers of science. The methodological defense is as follows: a commitment to realism leads scientists to pursue the truth, which in turn is apt to put them in a better position to get at the truth. In contrast, anti-realists lack the tenacity required to develop a theory to its fullest. As a consequence, they are less likely to get at the truth. My aim is to show that the methodological defense is flawed. I argue that a commitment to realism does not always benefit science, and that there is reason to believe that a research community with both realists and anti-realists in it may be better suited to advancing science. A case study of the Copernican Revolution in astronomy supports this claim. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Methodology for evaluation of diagnostic performance

    International Nuclear Information System (INIS)

    Metz, C.E.

    1992-01-01

    Effort in this project during the past year has focused on the development, refinement, and distribution of computer software that will allow current Receiver Operating Characteristic (ROC) methodology to be used conveniently and reliably by investigators in a variety of evaluation tasks in diagnostic medicine; and on the development of new ROC methodology that will broaden the spectrum of evaluation tasks and/or experimental settings to which the fundamental approach can be applied. Progress has been limited by the amount of financial support made available to the project
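    As a minimal illustration of the ROC methodology the project distributes software for, the area under the empirical ROC curve (AUC) can be computed directly from rating data as the probability that a diseased case receives a higher confidence score than a normal case, with ties counted as half. The ratings below are invented for illustration and are not from any actual reader study.

```python
import numpy as np

def empirical_auc(scores_pos, scores_neg):
    """Probability that a diseased case outranks a normal case (ties count half)."""
    sp = np.asarray(scores_pos, dtype=float)[:, None]
    sn = np.asarray(scores_neg, dtype=float)[None, :]
    return (sp > sn).mean() + 0.5 * (sp == sn).mean()

# Hypothetical 5-point confidence ratings from a diagnostic reading task
diseased = [3, 4, 5, 5, 4, 2, 5]
normal   = [1, 2, 1, 3, 2, 4, 1]
print(round(empirical_auc(diseased, normal), 3))  # 0.888
```

    This nonparametric estimate is the same quantity the binormal ROC model fits smoothly; for small rating studies the two usually agree closely.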

  2. A Methodological Approach to Support Collaborative Media Creation in an E-Learning Higher Education Context

    Science.gov (United States)

    Ornellas, Adriana; Muñoz Carril, Pablo César

    2014-01-01

    This article outlines a methodological approach to the creation, production and dissemination of online collaborative audio-visual projects, using new social learning technologies and open-source video tools, which can be applied to any e-learning environment in higher education. The methodology was developed and used to design a course in the…

  3. An ultrasonic methodology for muscle cross section measurement in support of space flight

    Science.gov (United States)

    Hatfield, Thomas R.; Klaus, David M.; Simske, Steven J.

    2004-09-01

    The number one priority for any manned space mission is the health and safety of its crew. The study of the short and long term physiological effects on humans is paramount to ensuring crew health and mission success. One of the challenges associated in studying the physiological effects of space flight on humans, such as loss of bone and muscle mass, has been that of readily attaining the data needed to characterize the changes. The small sampling size of astronauts, together with the fact that most physiological data collection tends to be rather tedious, continues to hinder elucidation of the underlying mechanisms responsible for the observed changes that occur in space. Better characterization of the muscle loss experienced by astronauts requires that new technologies be implemented. To this end, we have begun to validate a 360° ultrasonic scanning methodology for muscle measurements and have performed empirical sampling of a limb surrogate for comparison. Ultrasonic wave propagation was simulated using 144 stations of rotated arm and calf MRI images. These simulations were intended to provide a preliminary check of the scanning methodology and data analysis before its implementation with hardware. Pulse-echo waveforms were processed for each rotation station to characterize fat, muscle, bone, and limb boundary interfaces. The percentage error between MRI reference values and calculated muscle areas, as determined from reflection points for calf and arm cross sections, was -2.179% and +2.129%, respectively. These successful simulations suggest that ultrasound pulse scanning can be used to effectively determine limb cross-sectional areas. Cross-sectional images of a limb surrogate were then used to simulate signal measurements at several rotation angles, with ultrasonic pulse-echo sampling performed experimentally at the same stations on the actual limb surrogate to corroborate the results. The objective of the surrogate sampling was to compare the signal
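    The reported -2.179% and +2.129% figures are signed percentage errors of the ultrasound-derived cross-sectional areas relative to the MRI reference values. A one-line sketch of that comparison, with invented area values rather than the paper's data:

```python
def percent_error(calculated, reference):
    """Signed percentage error of a calculated area vs. a reference value."""
    return (calculated - reference) / reference * 100.0

# Hypothetical limb cross-sectional areas in cm^2 (illustrative only)
print(round(percent_error(97.8, 100.0), 3))  # -2.2
```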

  4. Silicon quantum dots: surface matters

    Czech Academy of Sciences Publication Activity Database

    Dohnalová, K.; Gregorkiewicz, T.; Kůsová, Kateřina

    2014-01-01

    Roč. 26, č. 17 (2014), 1-28 ISSN 0953-8984 R&D Projects: GA ČR GPP204/12/P235 Institutional support: RVO:68378271 Keywords : silicon quantum dots * quantum dot * surface chemistry * quantum confinement Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 2.346, year: 2014

  5. ANALYSIS OF PARTICULATE ORGANIC MATTER IN HOLOCENE SEDIMENTS OF COASTAL PLAIN FROM PERO BEACH, CABO FRIO, RIO DE JANEIRO, BRAZIL

    Directory of Open Access Journals (Sweden)

    Taísa Camila Silveira de Souza

    2016-06-01

    Full Text Available The study of palynofacies along a core drilled on the coastal plain of Cabo Frio, State of Rio de Janeiro, was carried out in order to contribute to the knowledge of the paleoenvironmental evolution of the Pero Beach region. The ages obtained from 14C datings show that the studied core records the past 6761 ± 130 yrs cal BP. Thirty samples were prepared by standard palynofacies methodology. About three hundred particles of particulate organic material were classified and recorded for each sample. Statistical methods were employed for the associations of particulate organic matter (R-mode cluster analysis) and for the levels, i.e. samples (Q-mode cluster analysis), analyzed along the core. Furthermore, the Phytoclast - Total Organic Carbon ratio (Phy-TOC) was used to verify the proximity of the source area. The three major groups of particulate organic matter found along the studied core are Phytoclasts, Amorphous Organic Matter (AOM) and Palynomorphs. The samples showed, in general, a predominance of phytoclasts (73.2%), followed by AOM (18.6%) and Palynomorphs (8.2%). Supported by the statistical analysis, it was possible to deduce that the study area has evolved since the middle Holocene from a marine environment to a paleolagoon.
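    The R-mode / Q-mode distinction used in such palynofacies work amounts to clustering the data matrix by columns (particle groups) versus by rows (samples). A toy sketch with invented counts, not the core's data, assuming percentage-normalized compositions and Euclidean distances, which a hierarchical clustering routine would then link:

```python
import numpy as np

# Hypothetical counts: rows = samples (levels), cols = particle groups
# columns: phytoclasts, AOM, palynomorphs
counts = np.array([
    [220,  55, 25],
    [230,  50, 20],
    [120, 150, 30],
    [110, 160, 30],
])
rel = counts / counts.sum(axis=1, keepdims=True) * 100  # percent composition

def euclid_dist(M):
    """Pairwise Euclidean distance matrix between the rows of M."""
    d2 = ((M[:, None, :] - M[None, :, :]) ** 2).sum(-1)
    return np.sqrt(d2)

q_mode = euclid_dist(rel)    # Q-mode: distances between samples (levels)
r_mode = euclid_dist(rel.T)  # R-mode: distances between particle groups
print(q_mode.round(1))
```

    Levels with similar phytoclast/AOM/palynomorph proportions end up close in the Q-mode matrix, which is what groups samples into the paleoenvironmental stages described above.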

  6. Challenges and Possibilities of Implementation of the Inverted Classroom Methodology: Case Study in a Private Higher Education Institution

    Directory of Open Access Journals (Sweden)

    Paulo Rodrigues Milhorato

    2016-12-01

    Full Text Available Change is the word that characterizes the transition from the industrial society to the knowledge society. In this sense it is necessary to re-evaluate educational processes in light of the needs of today's society. In this scenario, the model proposed by Eric Mazur during the 1990s, called the inverted classroom, emerges as a possibility. This model uses technology to support teaching. This article aims to describe the effects of the inverted classroom methodology in a private IES. The methodology was based on qualitative and quantitative research, using questionnaires made available in Google Docs and/or in print, applied to students of the Faculdade Pitágoras located in the metropolitan area of Belo Horizonte, as well as interviews with teachers and participant observation. Results show that the students’ profile is favorable to the inverted classroom model: they are young and in permanent contact with technology. Nevertheless, their daily routine, a deficient educational background and the need to work in order to afford a private IES make the application of this model quite complex.

  7. Methodological issues involved in conducting qualitative research on support for nurses directly involved with women who chose to terminate their pregnancy

    Directory of Open Access Journals (Sweden)

    Antoinette Gmeiner

    2001-11-01

    Full Text Available The purpose of this article is to describe the methodological issues involved in conducting qualitative research to explore and describe nurses’ experience of being directly involved with termination of pregnancies, and in developing guidelines for support for these nurses. Summary: The purpose of this article is to describe the methodological issues around conducting qualitative research in which nurses’ experience of their direct involvement in termination of pregnancy was explored and described. *Please note: This is a reduced version of the abstract. Please refer to PDF for full text.

  8. Exploring the multiple-hit hypothesis of preterm white matter damage using diffusion MRI

    Directory of Open Access Journals (Sweden)

    Madeleine L. Barnett

    2018-01-01

    Conclusion: This study suggests multiple perinatal risk factors have an independent association with diffuse white matter injury at term equivalent age and exposure to multiple perinatal risk factors exacerbates dMRI defined, clinically significant white matter injury. Our findings support the multiple hit hypothesis for preterm white matter injury.

  9. Modified dark matter: Relating dark energy, dark matter and baryonic matter

    Science.gov (United States)

    Edmonds, Douglas; Farrah, Duncan; Minic, Djordje; Ng, Y. Jack; Takeuchi, Tatsu

    Modified dark matter (MDM) is a phenomenological model of dark matter, inspired by gravitational thermodynamics. For an accelerating universe with positive cosmological constant (Λ), such phenomenological considerations lead to the emergence of a critical acceleration parameter related to Λ. Such a critical acceleration is an effective phenomenological manifestation of MDM, and it is found in correlations between dark matter and baryonic matter in galaxy rotation curves. The resulting MDM mass profiles, which are sensitive to Λ, are consistent with observational data at both the galactic and cluster scales. In particular, the same critical acceleration appears both in the galactic and cluster data fits based on MDM. Furthermore, using some robust qualitative arguments, MDM appears to work well on cosmological scales, even though quantitative studies are still lacking. Finally, we comment on certain nonlocal aspects of the quanta of modified dark matter, which may lead to novel nonparticle phenomenology and which may explain why, so far, dark matter detection experiments have failed to detect dark matter particles.

  10. Introducing an ILS methodology into research reactors

    International Nuclear Information System (INIS)

    Lorenzo, N. de; Borsani, R.C.

    2003-01-01

    Integrated Logistics Support (ILS) is the managerial organisation that co-ordinates the activities of many disciplines to develop the supporting resources (training, staffing, design aids, equipment removal routes, etc.) required by technologically complex systems. The application of an ILS methodology in defence projects is described in several places, but it is infrequently illustrated for other areas; the present paper therefore deals with applying this approach to research reactors under design or already in operation. Although better results are obtained when it is applied from the very beginning of a project, it can also be applied successfully to facilities already in operation to improve their capability in a cost-effective way. In applying this methodology, the key objectives must first be identified in order to tailor the whole approach. In high power multipurpose reactors, obtaining maximum profit at the lowest possible cost without reducing safety levels is generally the key issue, while in others the goal is to minimise drawbacks such as spurious shutdowns and low quality experimental results, or even to reduce staff dose to ALARA values. These items need to be quantified to establish a system status baseline against which the evolution of the process can be traced. Thereafter, specific logistics analyses should be performed in the different areas composing the system. RAMS (Reliability, Availability, Maintainability and Supportability), Manning, Training Needs and Supplying Needs are some examples of these special logistic assessments. The following paragraphs summarise the different areas encompassed by this ILS methodology. Plant design is influenced by focussing the designers' attention on the objectives already identified. Careful design reviews are performed only in an early design stage, since later application is of little use. In this paper a methodology is presented, including appropriate tools for ensuring the designers abide by ILS issues and key objectives through the

  11. Wigner's infinite spin representations and inert matter

    Energy Technology Data Exchange (ETDEWEB)

    Schroer, Bert [CBPF, Rio de Janeiro (Brazil); Institut fuer Theoretische Physik FU-Berlin, Berlin (Germany)

    2017-06-15

    Positive energy ray representations of the Poincare group are naturally subdivided into three classes according to their mass and spin content: m > 0, m = 0 finite helicity and m = 0 infinite spin. For a long time the localization properties of the massless infinite spin class remained unknown, until it became clear that such matter does not permit compact spacetime localization and its generating covariant fields are localized on semi-infinite space-like strings. Using a new perturbation theory for higher spin fields we present arguments which support the idea that infinite spin matter cannot interact with normal matter and we formulate conditions under which this also could happen for finite spin s > 1 fields. This raises the question of a possible connection between inert matter and dark matter. (orig.)

  12. International Conference on Polarised Neutrons for Condensed Matter Investigations (PNCMI 2016)

    International Nuclear Information System (INIS)

    2017-01-01

    The present volume of the Journal of Physics: Conference Series represents the Proceedings of the 11th International Conference on Polarised Neutrons for Condensed Matter Investigation (PNCMI) held in Freising, Germany from July 4–7, 2016. The conference, attended by more than 120 scientists from various academic, government, and industrial institutions in Europe, Asia and the Americas, was organized by the Jülich Centre for Neutron Science of the Forschungszentrum Jülich. PNCMI 2016 continued the successful previous conferences in this series, covering the latest condensed matter investigations using polarised neutrons and state-of-the-art methodologies, from effective polarization of neutron beams to wide-angle polarization analysis, as well as applications for novel instrumentation and experiments, with emphasis on prospects for new science and new instrument concepts. The conference program included invited and contributed oral presentations and posters which demonstrated the activities using polarized neutrons all over the world and showed the deep interest in developing the topic. The presentations tackled all areas of science including multiferroics and chirality, strongly correlated electron systems, superconductors, frustrated and disordered systems, magnetic nanomaterials, thin films and multilayers, soft matter and biology, and imaging, as well as further developments in polarized neutron techniques and methods, including nuclear polarisation, Larmor techniques and depolarisation methods. We would like to thank all speakers for their presentations and all attendees for their participation. We would also like to gratefully acknowledge the financial support by J-PARC and AIRBUS DS as Premium Sponsors and Swiss Neutronics, ISIS, LLB, PSI and Mirrotron as Standard Sponsors of this conference. The next PNCMI will take place in Great Britain in 2018 and will be organized by ISIS. Alexander Ioffe (Conference Chair) Thomas Gutberlet (Conference Secretary) (paper)

  13. DTI and VBM reveal white matter changes without associated gray matter changes in patients with idiopathic restless legs syndrome

    Science.gov (United States)

    Belke, Marcus; Heverhagen, Johannes T; Keil, Boris; Rosenow, Felix; Oertel, Wolfgang H; Stiasny-Kolster, Karin; Knake, Susanne; Menzler, Katja

    2015-01-01

    Background and Purpose: We evaluated cerebral white and gray matter changes in patients with iRLS in order to shed light on the pathophysiology of this disease. Methods: Twelve patients with iRLS were compared to 12 age- and sex-matched controls using whole-head diffusion tensor imaging (DTI) and voxel-based morphometry (VBM) techniques. Evaluation of the DTI scans included the voxelwise analysis of the fractional anisotropy (FA), radial diffusivity (RD), and axial diffusivity (AD). Results: Diffusion tensor imaging revealed areas of altered FA in subcortical white matter bilaterally, mainly in temporal regions as well as in the right internal capsule, the pons, and the right cerebellum. These changes overlapped with changes in RD. Voxel-based morphometry did not reveal any gray matter alterations. Conclusions: We showed altered diffusion properties in several white matter regions in patients with iRLS. White matter changes could mainly be attributed to changes in RD, a parameter thought to reflect altered myelination. Areas with altered white matter microstructure included areas in the internal capsule which include the corticospinal tract to the lower limbs, thereby supporting studies that suggest changes in sensorimotor pathways associated with RLS. PMID:26442748

  14. Methodology for assessing the concentrations of the primary marine aerosol in coastal areas

    International Nuclear Information System (INIS)

    Barsanti, P.; Briganti, G.; Cappelletti, A.; Marri, P.

    2009-01-01

    European and Italian regulations (DM 60/2002) fix for atmospheric particulate matter PM10 a threshold of 50 μg/m³ (24 hour mean) not to be exceeded more than 35 times per year; unfortunately, such prescriptions do not distinguish anthropic contributions from natural ones (sea salt, Saharan sand, coastal erosion, volcanic ashes, etc.). The aim of this study is to set up a methodology to estimate sea salt emissions, both from the open sea and from the surf zone, and to model the atmospheric dispersion of marine aerosols. The proposed methodology, applied to the coastal zone between Massa Carrara and Viareggio (Tuscany, Italy), shows that specific open sea emissions are generally very low in comparison with those of the surf zone: they are not negligible only with strong winds, but such meteorological conditions are neither persistent nor very frequent in the selected area. On the contrary, surf zone contributions are much stronger (by at least one order of magnitude), peak-shaped and more persistent than the former, and can lead to high PM10 concentration fields up to a few kilometres inland. The comparison between model outputs and observations, at two points placed 2000 and 4000 m from the shoreline, has shown an amount of sea salt in total PM10 even greater than 70% by mass. The existence of a surf zone, which can persist for many hours or days even after a storm, can produce both elevated PM concentrations and gradients, mainly for light winds perpendicular to the shoreline. This work, supported by the MINNI Project (www.minni.org), is applicable to other coastal areas as well and aims to furnish an overview of marine particulate production and atmospheric dispersion processes; it is the starting point of an experimental investigation program supported by institutional air quality authorities [it
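    The DM 60/2002 compliance test described above reduces to counting 24-hour means above 50 μg/m³ and comparing the count against the 35-per-year allowance; the study's point is that subtracting a modelled sea-salt share can change that count. A toy sketch with invented daily values and invented sea-salt fractions:

```python
# Hypothetical daily PM10 means (µg/m³) and modelled sea-salt mass fractions
# (illustrative values only, not measurements from the Tuscany campaign).
daily_pm10 = [42.0, 61.5, 55.0, 38.2, 71.0, 49.9, 58.3]
sea_salt_fraction = [0.10, 0.72, 0.65, 0.08, 0.70, 0.12, 0.60]

LIMIT = 50.0          # DM 60/2002 24-h limit value, µg/m³
ALLOWED_PER_YEAR = 35  # permitted number of exceedances per calendar year

exceedances = sum(1 for c in daily_pm10 if c > LIMIT)
# Remove the modelled natural (sea-salt) contribution before re-counting
anthropic = [c * (1 - f) for c, f in zip(daily_pm10, sea_salt_fraction)]
anthropic_exceedances = sum(1 for c in anthropic if c > LIMIT)
print(exceedances, anthropic_exceedances)  # 4 0
```

    In this invented week all four raw exceedances vanish once the sea-salt share is removed, mirroring the paper's claim that surf-zone salt alone can drive nominal PM10 violations near the shoreline.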

  15. Administration, knowledge management and methodological work in the Complex Educational Disciplines

    Directory of Open Access Journals (Sweden)

    Tania Alina Mena-Silva

    2014-06-01

    Full Text Available Today, educational disciplines have evolved to a higher and more complex stage, in which diverse study modalities, types of professors and teaching subjects converge and interrelate within a single discipline. This makes the professor's work more complex, since the subject must be prepared methodologically on more than one occasion; in this way the Complex Educational Discipline takes shape. Administration and knowledge management are essential elements for carrying out methodological work with the community of professors, based on a process of methodological preparation centered on the professor.

  16. Identification and Characterization of Particulate Matter Concentrations at Construction Jobsites

    Directory of Open Access Journals (Sweden)

    Ingrid P. S. Araújo

    2014-11-01

    Full Text Available The identification and characterization of particulate matter (PM) concentrations from construction site activities pose major challenges due to diverse characteristics related to different aspects, such as concentration, particle size and particle composition. Moreover, the characterization of particulate matter is influenced by meteorological conditions, including temperature, humidity, rainfall and wind speed. This paper is part of a broader investigation that aims to develop a methodology for assessing the environmental impacts caused by PM emissions arising from construction activities. The objective of this paper is to identify and characterize the PM emissions on a construction site for different aerodynamic diameters (PM2.5, PM10 and total suspended particulates (TSP)), based on an exploratory study. Initially, a protocol was developed to standardize the construction site selection criteria, laboratory procedures, field sample collection and laboratory analysis. This protocol was applied on a multifamily residential building construction site during three different construction phases (earthworks, superstructure and finishing), with the aim of measuring and monitoring PM concentrations arising from construction activities. The particulate matter was characterized for different particle sizes. Results showed that the higher TSP emissions arising from construction activities provoked environmental impacts. Some limitations of the results were identified, especially with regard to the need for a detailed investigation of the influence of different construction phases on PM emissions. The findings provided significant knowledge about various situations, serving as a basis for improving the existing methodology for particulate matter collection on construction sites and for the development of future studies on specific construction site phases.

  17. Application and licensing requirements of the Framatome ANP RLBLOCA methodology

    International Nuclear Information System (INIS)

    Martin, R.P.; Dunn, B.M.

    2004-01-01

    The Framatome ANP Realistic Large-Break LOCA methodology (FANP RLBLOCA) is an analysis approach approved by the US NRC for supporting the licensing basis of 3- and 4-loop Westinghouse PWRs and CE 2x4 PWRs. It was developed consistent with the NRC's Code Scaling, Applicability, and Uncertainty (CSAU) methodology for performing best-estimate large-break LOCA analyses. The CSAU methodology consists of three key elements, with the second and third elements addressing uncertainty identification and application. Unique to the CSAU methodology is the use of engineering judgment and the Phenomena Identification and Ranking Table (PIRT) defined in the first element to lay the groundwork for achieving the ultimate goal of quantifying the total uncertainty in predicted measures of interest associated with the large-break LOCA. It is the PIRT that not only directs the methodology development, but also directs the methodology review. While the FANP RLBLOCA methodology was generically approved, a plant-specific application is customized in two ways, addressing how the unique plant characterization 1) is translated to code input and 2) relates to the unique methodology licensing requirements. Related to the former, plants are required by 10 CFR 50.36 to define a technical specification limiting condition for operation based on the following criteria: 1. Installed instrumentation that is used in the control room to detect, and indicate, a significant abnormal degradation of the reactor coolant pressure boundary. 2. A process variable, design feature, or operating restriction that is an initial condition of a design basis accident or transient analysis that either assumes the failure of or presents a challenge to the integrity of a fission product barrier. 3. A structure, system, or component that is part of the primary success path and which functions or actuates to mitigate a design basis accident or transient that either assumes the failure of or presents a challenge to the integrity of a

  18. CATHARE code development and assessment methodologies

    International Nuclear Information System (INIS)

    Micaelli, J.C.; Barre, F.; Bestion, D.

    1995-01-01

    The CATHARE thermal-hydraulic code has been developed jointly by Commissariat a l'Energie Atomique (CEA), Electricite de France (EdF), and Framatome for safety analysis. Since the beginning of the project (September 1979), development and assessment activities have followed a methodology supported by two series of experimental tests: separate effects tests and integral effects tests. The purpose of this paper is to describe this methodology, the code assessment status, and the evolution to take into account two new components of this program: the modeling of three-dimensional phenomena and the requirements of code uncertainty evaluation.

  19. Enacting a Place-Responsive Research Methodology: Walking Interviews with Educators

    Science.gov (United States)

    Lynch, Jonathan; Mannion, Greg

    2016-01-01

    Place-based and place-responsive approaches to outdoor learning and education are developing in many countries, but there is a dearth of theoretically-supported methodologies to take a more explicit account of place in research in these areas. In response, this article outlines one theoretical framing for place-responsive methodologies for…

  20. Software engineering methodologies and tools

    Science.gov (United States)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  1. Physics through the 1990s: condensed-matter physics

    International Nuclear Information System (INIS)

    1986-01-01

    The volume presents the current status of condensed-matter physics from developments since the 1970s to opportunities in the 1990s. Topics include electronic structure, vibrational properties, critical phenomena and phase transitions, magnetism, semiconductors, defects and diffusion, surfaces and interfaces, low-temperature physics, liquid-state physics, polymers, nonlinear dynamics, instabilities, and chaos. Appendices cover the connections between condensed-matter physics and applications of national interest, new experimental techniques and materials, laser spectroscopy, and national facilities for condensed-matter physics research. The needs of the research community regarding support for individual researchers and for national facilities are presented, as are recommendations for improved government-academic-industrial relations

  2. Interacting dark matter disguised as warm dark matter

    International Nuclear Information System (INIS)

    Boehm, Celine; Riazuelo, Alain; Hansen, Steen H.; Schaeffer, Richard

    2002-01-01

    We explore some of the consequences of dark-matter-photon interactions on structure formation, focusing on the evolution of cosmological perturbations and performing both an analytical and a numerical study. We compute the cosmic microwave background anisotropies and matter power spectrum in this class of models. We find, as the main result, that when dark matter and photons are coupled, dark matter perturbations can experience a new damping regime in addition to the usual collisional Silk damping effect. Such dark matter particles (having quite large photon interactions) behave like cold dark matter or warm dark matter as far as the cosmic microwave background anisotropies or matter power spectrum are concerned, respectively. These dark-matter-photon interactions leave specific imprints at sufficiently small scales on both of these two spectra, which may allow us to put new constraints on the acceptable photon-dark-matter interactions. Under the conservative assumption that the abundance of 10^12 M_⊙ galaxies is correctly given by the cold dark matter, and without any knowledge of the abundance of smaller objects, we obtain the limit on the ratio of the dark-matter-photon cross section to the dark matter mass σ_{γ-DM}/m_DM ≲ 10^-6 σ_Th/(100 GeV) ≅ 6x10^-33 cm² GeV^-1
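    The quoted limit can be checked numerically; a small sketch, assuming the standard Thomson cross section and the 10^-6 prefactor implied by the quoted value:

```python
# Numerical check of the bound sigma_{gamma-DM}/m_DM <~ 1e-6 * sigma_Th / (100 GeV).
# The 1e-6 prefactor is an inference consistent with the quoted ~6e-33 cm^2 GeV^-1.
SIGMA_THOMSON = 6.652e-25  # Thomson cross section, cm^2

bound = 1e-6 * SIGMA_THOMSON / 100.0  # cm^2 per GeV
print(f"sigma/m bound ~ {bound:.2e} cm^2 GeV^-1")
```

    The result, about 6.7x10^-33 cm² GeV^-1, agrees with the value quoted in the abstract.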

  3. Particle Dark Matter and DAMA/LIBRA

    International Nuclear Information System (INIS)

    Bernabei, R.; Nozzoli, F.; Belli, P.; Cappella, F.; D'Angelo, A.; Prosperi, D.; Cerulli, R.; Dai, C. J.; He, H. L.; Ma, X. H.; Sheng, X. D.; Wang, R. G.; Incicchitti, A.; Montecchia, F.; Ye, Z. P.

    2010-01-01

    The DAMA/LIBRA set-up (about 250 kg highly radiopure NaI(Tl) sensitive mass) is running at the Gran Sasso National Laboratory of the I.N.F.N. The first DAMA/LIBRA results confirm the evidence for the presence of a Dark Matter particle component in the galactic halo, as pointed out by the former DAMA/NaI set-up; cumulatively the data support such evidence at 8.2 σ C.L. and satisfy all the many peculiarities of the Dark Matter annual modulation signature. The main aspects and prospects of this model independent experimental approach will be outlined.

  4. Investigating Safety, Safeguards and Security (3S) Synergies to Support Infrastructure Development and Risk-Informed Methodologies for 3S by Design

    International Nuclear Information System (INIS)

    Suzuki, M.; Izumi, Y.; Kimoto, T.; Naoi, Y.; Inoue, T.; Hoffheins, B.

    2010-01-01

    In 2008, Japan and other G8 countries pledged to support the Safeguards, Safety, and Security (3S) Initiative to raise awareness of 3S worldwide and to assist countries in setting up nuclear energy infrastructures that are essential cornerstones of a successful nuclear energy program. The goals of the 3S initiative are to ensure that countries already using nuclear energy or those planning to use nuclear energy are supported by strong national programs in safety, security, and safeguards, not only for reliability and viability of the programs, but also to prove to the international audience that the programs are purely peaceful and that nuclear material is properly handled, accounted for, and protected. In support of this initiative, the Japan Atomic Energy Agency (JAEA) has been conducting detailed analyses of the R and D programs and cultures of each of the 'S' areas to identify overlaps where synergism and efficiencies might be realized, to determine where there are gaps in the development of a mature 3S culture, and to coordinate efforts with other Japanese and international organizations. As an initial outcome of this study, incoming JAEA employees are being introduced to 3S as part of their induction training and the idea of a President's Award program is being evaluated. Furthermore, some overlaps in 3S missions might be exploited to share facility instrumentation, as with Joint-Use-Equipment (JUE), in which cameras and radiation detectors are shared by the State and IAEA. Lessons learned in these activities can be applied to developing more efficient and effective 3S infrastructures for incorporation into Safeguards by Design methodologies. They will also be useful in supporting human resources and technology development projects associated with Japan's planned nuclear security center for Asia, which was announced during the 2010 Nuclear Security Summit. In this presentation, a risk-informed approach regarding integration of 3S will be introduced. An initial

  5. Methodological Interactionism : Theory and Application to the Firm and to the Building of Trust

    NARCIS (Netherlands)

    Nooteboom, B.

    2007-01-01

    Recent insights from the ‘embodied cognition’ perspective in cognitive science, supported by neural research, provide a basis for a ‘methodological interactionism’ that transcends both the methodological individualism of economics and the methodological collectivism of (some) sociology, and is

  6. Is DevOps another Project Management Methodology?

    Directory of Open Access Journals (Sweden)

    Logica BANICA

    2017-01-01

    Full Text Available In this paper, the authors aim to present the concept of DevOps (Development & Operations, considering its degree of novelty in the area of project management. Firstly, the authors will bring theoretical arguments to support the idea that DevOps is an early-stage methodology, built on the Agile principles, but coming with its own contributions to project management for software development and implementation. Therefore, we believe that after a short time, DevOps will replace its predecessors. Secondly, we experienced this methodology by developing a small project in an academic environment with three teams of master students, using VersionOne software. The Conclusions will emphasize the relevance and the future expected effects of the DevOps methodology in the project management domain.

  7. Territory development as economic and geographical activity (theory, methodology, practice

    Directory of Open Access Journals (Sweden)

    Vitaliy Nikolaevich Lazhentsev

    2013-03-01

    Full Text Available The accents in the description of the theory and methodology of territory development are shifted from the distribution of national benefits to the formation of territorial natural and economic systems and the organization of economic and geographical activity. The author reveals the concept of «territory development» and reviews its place in the theory and methodology of human geography and regional economy. In the article the individual directions of economic activity are considered. The author has made an attempt to define the subject matter of five levels of «ideal» territorial and economic systems as part of objects of nature, society, population settlement, production, infrastructure and management. The author's position on the interpretation of the sequence of mechanisms of territory development, working according to a Nested Doll principle (mechanism of economy, economic management mechanism, controlling mechanism of economy), is presented. The author shows the indicators which authentically define territory development.

  8. Unstable gravitino dark matter and neutrino flux

    International Nuclear Information System (INIS)

    Covi, L.; Grefe, M.; Ibarra, A.; Tran, D.

    2008-09-01

    The gravitino is a promising supersymmetric dark matter candidate which does not require exact R-parity conservation. In fact, even with some small R-parity breaking, gravitinos are sufficiently long-lived to constitute the dark matter of the Universe, while yielding a cosmological scenario consistent with primordial nucleosynthesis and the high reheating temperature required for thermal leptogenesis. In this paper, we compute the neutrino flux from direct gravitino decay and gauge boson fragmentation in a simple scenario with bilinear R-parity breaking. Our choice of parameters is motivated by a proposed interpretation of anomalies in the extragalactic gamma-ray spectrum and the positron fraction in terms of gravitino dark matter decay. We find that the generated neutrino flux is compatible with present measurements. We also discuss the possibility of detecting these neutrinos in present and future experiments and conclude that it is a challenging task. However, if detected, this distinctive signal might bring significant support to the scenario of gravitinos as decaying dark matter. (orig.)

  9. Influence of natural organic matter (NOM) coatings on nanoparticle adsorption onto supported lipid bilayers.

    Science.gov (United States)

    Bo, Zhang; Avsar, Saziye Yorulmaz; Corliss, Michael K; Chung, Minsub; Cho, Nam-Joon

    2017-10-05

    As the worldwide usage of nanoparticles in commercial products continues to increase, there is growing concern about the environmental risks that nanoparticles pose to biological systems, including potential damage to cellular membranes. A detailed understanding of how different types of nanoparticles behave in environmentally relevant conditions is imperative for predicting and mitigating potential membrane-associated toxicities. Herein, we investigated the adsorption of two popular nanoparticles (silver and buckminsterfullerene) onto biomimetic supported lipid bilayers of varying membrane charge (positive and negative). The quartz crystal microbalance-dissipation (QCM-D) measurement technique was employed to track the adsorption kinetics. Particular attention was focused on understanding how natural organic matter (NOM) coatings affect nanoparticle-bilayer interactions. Both types of nanoparticles preferentially adsorbed onto the positively charged bilayers, although NOM coatings on the nanoparticle and lipid bilayer surfaces could either inhibit or promote adsorption in certain electrolyte conditions. While past findings showed that NOM coatings inhibit membrane adhesion, our findings demonstrate that the effects of NOM coatings are more nuanced depending on the type of nanoparticle and electrolyte condition. Taken together, the results demonstrate that NOM coatings can modulate the lipid membrane interactions of various nanoparticles, suggesting a possible way to improve the environmental safety of nanoparticles. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. A Cybernetic Design Methodology for 'Intelligent' Online Learning Support

    Science.gov (United States)

    Quinton, Stephen R.

    The World Wide Web (WWW) provides learners and knowledge workers convenient access to vast stores of information, so much so that present methods for refinement of a query or search result are inadequate - there is far too much potentially useful material. The problem often encountered is that users usually do not recognise what may be useful until they have progressed some way through the discovery, learning, and knowledge acquisition process. Additional support is needed to structure and identify potentially relevant information, and to provide constructive feedback. In short, support for learning is needed. The learning envisioned here is not simply the capacity to recall facts or to recognise objects. The focus is on learning that results in the construction of knowledge. Although most online learning platforms are efficient at delivering information, most do not provide tools that support learning as envisaged in this chapter. It is conceivable that Web-based learning environments can incorporate software systems that assist learners to form new associations between concepts and synthesise information to create new knowledge. This chapter details the rationale and theory behind a research study that aims to evolve Web-based learning environments into 'intelligent thinking' systems that respond to natural language human input. Rather than functioning simply as a means of delivering information, it is argued that online learning solutions will one day interact directly with students to support their conceptual thinking and cognitive development.

  11. 42 CFR 493.649 - Methodology for determining fee amount.

    Science.gov (United States)

    2010-10-01

    ... fringe benefit costs to support the required number of State inspectors, management and direct support... full time equivalent employee. Included in this cost are salary and fringe benefit costs, necessary... 42 Public Health 5 2010-10-01 2010-10-01 false Methodology for determining fee amount. 493.649...

  12. Strength of Temporal White Matter Pathways Predicts Semantic Learning.

    Science.gov (United States)

    Ripollés, Pablo; Biel, Davina; Peñaloza, Claudia; Kaufmann, Jörn; Marco-Pallarés, Josep; Noesselt, Toemme; Rodríguez-Fornells, Antoni

    2017-11-15

    Learning the associations between words and meanings is a fundamental human ability. Although the language network is cortically well defined, the role of the white matter pathways supporting novel word-to-meaning mappings remains unclear. Here, by using contextual and cross-situational word learning, we tested whether learning the meaning of a new word is related to the integrity of the language-related white matter pathways in 40 adults (18 women). The arcuate, uncinate, inferior-fronto-occipital and inferior-longitudinal fasciculi were virtually dissected using manual and automatic deterministic fiber tracking. Critically, the automatic method allowed assessing the white matter microstructure along the tract. Results demonstrate that the microstructural properties of the left inferior-longitudinal fasciculus predict contextual learning, whereas the left uncinate was associated with cross-situational learning. In addition, we identified regions of special importance within these pathways: the posterior middle temporal gyrus, thought to serve as a lexical interface and specifically related to contextual learning; the anterior temporal lobe, known to be an amodal hub for semantic processing and related to cross-situational learning; and the white matter near the hippocampus, a structure fundamental for the initial stages of new-word learning and, remarkably, related to both types of word learning. No significant associations were found for the inferior-fronto-occipital fasciculus or the arcuate. While previous results suggest that learning new phonological word forms is mediated by the arcuate fasciculus, these findings show that the temporal pathways are the crucial neural substrate supporting one of the most striking human abilities: our capacity to identify correct associations between words and meanings under referential indeterminacy. SIGNIFICANCE STATEMENT The language-processing network is cortically (i.e., gray matter) well defined. However, the role of the

  13. Hanford Site Risk Assessment Methodology. Revision 3

    International Nuclear Information System (INIS)

    1995-05-01

    This methodology has been developed to prepare human health and ecological evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act of 1980 (CERCLA) remedial investigations (RI) and the Resource Conservation and Recovery Act of 1976 (RCRA) facility investigations (FI) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order (Ecology et al. 1994), referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies site-specific risk assessment considerations and integrates them with approaches for evaluating human and ecological risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.

  14. Some methodological issues in neuroradiological research in psychiatry

    International Nuclear Information System (INIS)

    Becker, T.; Retz, W.; Hofmann, E.; Becker, G.; Teichmann, E.; Gsell, W.

    1995-01-01

    An outline is given of some of the methodological issues discussed in neuroradiological research on psychiatric illnesses. The strengths and shortcomings of magnetic resonance imaging (MRI) in depicting and quantifying brain structures are described. Temporal lobe anatomy and pathology are easily accessible to MRI, whereas limits on anatomical delineation hamper approaches to frontal lobe study. White matter hyperintense lesions are sensitively depicted by MRI, but specificity is limited. The distinction of vascular and primary degenerative dementia is considerably improved by CT and MRI analysis. Computed tomography (CT) and MRI have enhanced the understanding of treatable organic psychiatric disorders, e.g., normal pressure hydrocephalus. Subcortical and white matter pathology has been replicated in CT and MRI studies of late onset psychiatric disorders; clinical overlap with cerebrovascular disease or neurodegeneration may be of import. Transcranial sonography findings of brainstem structural change specific to unipolar depression may contribute to the understanding of affective psychoses. Magnetic resonance spectroscopy and functional MRI are likely to stimulate psychiatric research in the future. (author)

  15. Matter, dark matter, and anti-matter: in search of the hidden universe

    CERN Document Server

    Mazure, Alain

    2012-01-01

    For over ten years, the dark side of the universe has been headline news. Detailed studies of the rotation of spiral galaxies, and 'mirages' created by clusters of galaxies bending the light from very remote objects, have convinced astronomers of the presence of large quantities of dark (unseen) matter in the cosmos. Moreover, in the 1990s, it was discovered that some four to five billion years ago the expansion of the universe entered a phase of acceleration. This implies the existence of dark energy. The nature of these 'dark' ingredients remains a mystery, but they seem to comprise about 95 percent of the matter/energy content of the universe. As for ordinary matter, although we are immersed in a sea of dark particles, including primordial neutrinos and photons from 'fossil' cosmological radiation, both we and our environment are made of ordinary, baryonic matter. Strangely, even if 15-20 percent of matter is baryonic matter, this represents only 4-5 percent of the total matter/energy content of the cosmos...

  16. Relationship between grey matter integrity and executive abilities in aging.

    Science.gov (United States)

    Manard, Marine; Bahri, Mohamed Ali; Salmon, Eric; Collette, Fabienne

    2016-07-01

    This cross-sectional study was designed to investigate grey matter changes that occur in healthy aging and the relationship between grey matter characteristics and executive functioning. Thirty-six young adults (18-30 years old) and 43 seniors (60-75 years old) were included. A general executive score was derived from a large battery of neuropsychological tests assessing three major aspects of executive functioning (inhibition, updating and shifting). Age-related grey matter changes were investigated by comparing young and older adults using voxel-based morphometry and voxel-based cortical thickness methods. A widespread difference in grey matter volume was found across many brain regions, whereas cortical thinning was mainly restricted to central areas. Multivariate analyses showed age-related changes in relatively similar brain regions to the respective univariate analyses but appeared more limited. Finally, in the older adult sample, a significant relationship was observed between global executive performance and decreased grey matter volume in anterior (i.e. frontal, insular and cingulate cortex) but also some posterior brain areas (i.e. temporal and parietal cortices) as well as subcortical structures. Results of this study highlight the distribution of age-related effects on grey matter volume and show that cortical atrophy does not appear primarily in "frontal" brain regions. From a cognitive viewpoint, age-related executive functioning seems to be related to grey matter volume but not to cortical thickness. Therefore, our results also highlight the influence of methodological aspects (from preprocessing to statistical analysis) on the pattern of results, which could explain the lack of consensus in the literature. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. White matter cysts in patients with tuberous sclerosis

    International Nuclear Information System (INIS)

    Marti-Bonmati, L.; Dosda, R.; Menor, F.; Arana, E.; Poyatos, C.

    1999-01-01

    The presence of cysts in the white matter of the central nervous system of patients with tuberous sclerosis (TS) is an uncommon finding that has been reported only recently in neuroimaging studies. This article assesses the prevalence of these lesions in a large series of patients studied by magnetic resonance imaging (MRI) and their relationship to other epidemiological and imaging findings. MRI studies were performed in 46 patients (23 males and 23 females) with a mean age of 12.7 years, and the results were examined retrospectively in the search for cortical tubers, subependymal nodules and white matter nodules, lines and cysts. Nine patients (19.6%) presented cysts in white matter. Seven had only one cyst and the remaining two patients each had two. Multiple regression analysis relating the presence of the cysts to other neuroimaging findings in these patients revealed a statistically significant relationship only with white matter nodules (odds ratio: 7.5; p=0.006). White matter cysts are small, deep-seated, supratentorial lesions. There is a statistically significant relationship between the presence of these cysts and that of nodular lesions in the white matter. This finding supports the theory that the cysts originate from white matter nodules. (Author) 17 refs

  18. Dissipative dark matter halos: The steady state solution

    Science.gov (United States)

    Foot, R.

    2018-02-01

    Dissipative dark matter, where dark matter particle properties closely resemble familiar baryonic matter, is considered. Mirror dark matter, which arises from an isomorphic hidden sector, is a specific and theoretically constrained scenario. Other possibilities include models with more generic hidden sectors that contain massless dark photons [unbroken U(1) gauge interactions]. Such dark matter not only features dissipative cooling processes but also is assumed to have nontrivial heating sourced by ordinary supernovae (facilitated by the kinetic mixing interaction). The dynamics of dissipative dark matter halos around rotationally supported galaxies, influenced by heating as well as cooling processes, can be modeled by fluid equations. For a sufficiently isolated galaxy with a stable star formation rate, the dissipative dark matter halos are expected to evolve to a steady state configuration which is in hydrostatic equilibrium and where heating and cooling rates locally balance. Here, we take into account the major cooling and heating processes, and numerically solve for the steady state solution under the assumptions of spherical symmetry, negligible dark magnetic fields, and that supernova sourced energy is transported to the halo via dark radiation. For the parameters considered, and assumptions made, we were unable to find a physically realistic solution for the constrained case of mirror dark matter halos. Halo cooling generally exceeds heating at realistic halo mass densities. This problem can be rectified in more generic dissipative dark matter models, and we discuss a specific example in some detail.
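    The steady-state conditions invoked above can be written schematically; a generic sketch (not the paper's exact fluid system), with $P$ the pressure, $\rho$ the density, $g$ the local gravitational acceleration, and $\Gamma$, $\Lambda$ the heating and cooling rates per unit volume:

```latex
% Hydrostatic equilibrium and local energy balance (schematic form):
\frac{dP}{dr} = -\rho(r)\, g(r), \qquad \Gamma_{\mathrm{heat}}(r) = \Lambda_{\mathrm{cool}}(r)
```

    The reported difficulty for mirror halos corresponds to $\Lambda_{\mathrm{cool}} > \Gamma_{\mathrm{heat}}$ at realistic halo mass densities, so no radius-by-radius balance exists.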

  19. Quantifying Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

Quantifying Matter explains how scientists learned to measure matter and quantify some of its most fascinating and useful properties. It presents many of the most important intellectual achievements and technical developments that led to the scientific interpretation of substance. Complete with full-color photographs, this exciting new volume describes the basic characteristics and properties of matter. Chapters include: Exploring the Nature of Matter; The Origin of Matter; The Search for Substance; Quantifying Matter During the Scientific Revolution; Understanding Matter's Electromagnet

  20. International Expert Review of Sr-Can: Safety Assessment Methodology - External review contribution in support of SSI's and SKI's review of SR-Can

    International Nuclear Information System (INIS)

    Sagar, Budhi; Egan, Michael; Roehlig, Klaus-Juergen; Chapman, Neil; Wilmot, Roger

    2008-03-01

In 2006, SKB published a safety assessment (SR-Can) as part of its work to support a licence application for the construction of a final repository for spent nuclear fuel. The purposes of the SR-Can project were stated in the main project report to be: 1. To make a first assessment of the safety of potential KBS-3 repositories at Forsmark and Laxemar to dispose of canisters as specified in the application for the encapsulation plant. 2. To provide feedback to design development, to SKB's research and development (R and D) programme, to further site investigations and to future safety assessments. 3. To foster a dialogue with the authorities that oversee SKB's activities, i.e. the Swedish Nuclear Power Inspectorate, SKI, and the Swedish Radiation Protection Authority, SSI, regarding interpretation of applicable regulations, as a preparation for the SR-Site project. To help inform their review of SKB's proposed approach to development of the long-term safety case, the authorities appointed three international expert review teams to carry out a review of SKB's SR-Can safety assessment report. Comments from one of these teams - the Safety Assessment Methodology (SAM) review team - are presented in this document. The SAM review team's scope of work included an examination of SKB's documentation of the assessment ('Long-term safety for KBS-3 Repositories at Forsmark and Laxemar - a first evaluation' and several supporting reports) and hearings with SKB staff and contractors, held in March 2007. As directed by SKI and SSI, the SAM review team focused on methodological aspects and sought to determine whether SKB's proposed safety assessment methodology is likely to be suitable for use in the future SR-Site and to assess its consistency with the Swedish regulatory framework. No specific evaluation of long-term safety or site acceptability was undertaken by any of the review teams.
SKI and SSI's Terms of Reference for the SAM review team requested that consideration be given

  1. Spaceflight Effect on White Matter Structural Integrity

    Science.gov (United States)

    Lee, Jessica K.; Kopplemans, Vincent; Paternack, Ofer; Bloomberg, Jacob J.; Mulavara, Ajitkumar P.; Seidler, Rachael D.

    2017-01-01

    Recent reports of elevated brain white matter hyperintensity (WMH) counts and volume in postflight astronaut MRIs suggest that further examination of spaceflight's impact on the microstructure of brain white matter is warranted. To this end, retrospective longitudinal diffusion-weighted MRI scans obtained from 15 astronauts were evaluated. In light of the recent reports of microgravity-induced cephalad fluid shift and gray matter atrophy seen in astronauts, we applied a technique to estimate diffusion tensor imaging (DTI) metrics corrected for free water contamination. This approach enabled the analysis of white matter tissue-specific alterations that are unrelated to fluid shifts, occurring from before spaceflight to after landing. After spaceflight, decreased fractional anisotropy (FA) values were detected in an area encompassing the superior and inferior longitudinal fasciculi and the inferior fronto-occipital fasciculus. Increased radial diffusivity (RD) and decreased axial diffusivity (AD) were also detected within overlapping regions. In addition, FA values in the corticospinal tract decreased and RD measures in the precentral gyrus white matter increased from before to after flight. The results show disrupted structural connectivity of white matter in tracts involved in visuospatial processing, vestibular function, and movement control as a result of spaceflight. The findings may help us understand the structural underpinnings of the extensive spaceflight-induced sensorimotor remodeling. Prospective longitudinal assessment of the white matter integrity in astronauts is needed to characterize the evolution of white matter microstructural changes associated with spaceflight, their behavioral consequences, and the time course of recovery. Supported by a grant from the National Space Biomedical Research Institute, NASA NCC 9-58.
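The scalar metrics named above (FA, AD, RD) are standard functions of the diffusion tensor eigenvalues. A minimal sketch with illustrative eigenvalues (not astronaut data):

```python
import math

# Standard DTI scalar metrics from tensor eigenvalues (l1 >= l2 >= l3).
# The eigenvalues used below are illustrative white-matter-like values
# in mm^2/s, not measurements from the study.
def dti_metrics(l1, l2, l3):
    md = (l1 + l2 + l3) / 3.0                       # mean diffusivity
    num = (l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    fa = math.sqrt(0.5 * num / den) if den else 0.0  # fractional anisotropy
    ad = l1                                          # axial diffusivity
    rd = (l2 + l3) / 2.0                             # radial diffusivity
    return fa, md, ad, rd

fa, md, ad, rd = dti_metrics(1.7e-3, 0.4e-3, 0.3e-3)
```

The free-water correction mentioned in the abstract is a separate modeling step applied before these eigenvalues are interpreted; it is not shown here.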

  2. Hanford Site baseline risk assessment methodology. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    1993-03-01

    This methodology has been developed to prepare human health and environmental evaluations of risk as part of the Comprehensive Environmental Response, Compensation, and Liability Act remedial investigations (RIs) and the Resource Conservation and Recovery Act facility investigations (FIs) performed at the Hanford Site pursuant to the Hanford Federal Facility Agreement and Consent Order referred to as the Tri-Party Agreement. Development of the methodology has been undertaken so that Hanford Site risk assessments are consistent with current regulations and guidance, while providing direction on flexible, ambiguous, or undefined aspects of the guidance. The methodology identifies Site-specific risk assessment considerations and integrates them with approaches for evaluating human and environmental risk that can be factored into the risk assessment program supporting the Hanford Site cleanup mission. Consequently, the methodology will enhance the preparation and review of individual risk assessments at the Hanford Site.

  3. Improvement in Product Development: Use of back-end data to support upstream efforts of Robust Design Methodology

    Directory of Open Access Journals (Sweden)

    Vanajah Siva

    2012-12-01

Full Text Available In the area of Robust Design Methodology (RDM), little has been done on how to use and work with data from the back-end of the product development process to support upstream improvement. The purpose of this paper is to suggest RDM practices for the use of customer claims data in early design phases as a basis for improvements. The back-end data, when systematically analyzed and fed back into the product development process, aid in closing the product development loop from claims to improvement in the design phase. This is proposed through a flow of claims data analysis tied to an existing tool, namely Failure Mode and Effects Analysis (FMEA). The systematic and integrated analysis of back-end data is suggested as an upstream effort of RDM to increase understanding of noise factors during product usage, based on the feedback of claims data to FMEA, and to address continuous improvement in product development.

  4. A White Paper on keV sterile neutrino Dark Matter

    Czech Academy of Sciences Publication Activity Database

    Adhikari, R.; Agostini, M.; Ky, N. A.; Araki, T.; Archidiacono, M.; Bahr, M.; Baur, J.; Dragoun, Otokar; Vénos, Drahoslav; Zuber, K.

    2017-01-01

    Roč. 2017, č. 1 (2017), č. článku 025. ISSN 1475-7516 R&D Projects: GA ČR(CZ) GAP203/12/1896 Institutional support: RVO:61389005 Keywords : cosmological neutrinos * dark matter experiments * dark matter theory * particle physics - cosmology connection Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics OBOR OECD: Astronomy (including astrophysics,space science) Impact factor: 4.734, year: 2016

  5. Dark Matter

    Directory of Open Access Journals (Sweden)

    Einasto J.

    2011-06-01

Full Text Available I give a review of the development of the concept of dark matter. The dark matter story passed through several stages, from a minor observational puzzle to a major challenge for the theory of elementary particles. Modern data suggest that dark matter is the dominant matter component in the Universe and that it consists of some unknown non-baryonic particles; the properties of dark matter particles therefore determine the structure of the cosmic web.

  6. Synthesis of methodology development and case studies

    OpenAIRE

    Roetter, R.P.; Keulen, van, H.; Laar, van, H.H.

    2000-01-01

The Systems Research Network for Ecoregional Land Use Planning in Support of Natural Resource Management in Tropical Asia (SysNet) was financed under the Ecoregional Fund, administered by the International Service for National Agricultural Research (ISNAR). The objective of the project was to develop and evaluate methodologies and tools for land use analysis, and apply them at the subnational scale to support agricultural and environmental policy formulation. In the framework of this projec...

  7. White matter abnormalities in tuberous sclerosis complex

    Energy Technology Data Exchange (ETDEWEB)

Griffiths, P.D. [Sheffield Univ. (United Kingdom). Academic Dept. of Radiology; Bolton, P. [Cambridge Univ. (United Kingdom). Section of Developmental Psychiatry; Verity, C. [Addenbrooke's NHS Trust, Cambridge (United Kingdom). Dept. of Paediatric Radiology

    1998-09-01

    The aim of this study was to investigate and describe the range of white matter abnormalities in children with tuberous sclerosis complex by means of MR imaging. Material and Methods: A retrospective cross-sectional study was performed on the basis of MR imaging findings in 20 cases of tuberous sclerosis complex in children aged 17 years or younger. Results: White matter abnormalities were present in 19/20 (95%) cases of tuberous sclerosis complex. These were most frequently (19/20 cases) found in relation to cortical tubers in the supratentorial compartment. White matter abnormalities related to tubers were found in the cerebellum in 3/20 (15%) cases. White matter abnormalities described as radial migration lines were found in relation to 5 tubers in 3 (15%) children. In 4/20 (20%) cases, white matter abnormalities were found that were not related to cortical tubers. These areas had the appearance of white matter cysts in 3 cases and infarction in the fourth. In the latter case there was a definable event in the clinical history, supporting the diagnosis of stroke. Conclusion: A range of white matter abnormalities were found by MR imaging in tuberous sclerosis complex, the commonest being gliosis and hypomyelination related to cortical tubers. Radial migration lines were seen infrequently in relation to cortical tubers and these are thought to represent heterotopic glia and neurons along the expected path of cortical migration. (orig.)

  8. Neutrino-Pair Exchange Long-Range Force Between Aggregate Matter

    OpenAIRE

    Segarra, A.

    2016-01-01

We study the long-range force arising between two electrically neutral aggregates of matter due to a neutrino-pair exchange, in the limit of zero neutrino mass. The conceptual basis for the construction of the effective potential comes from the coherent scattering amplitude at low values of t. This amplitude is obtained using the methodology of an unsubtracted dispersion relation in t at threshold for s, where (s, t) are the Lorentz-invariant scattering variables. The ultraviolet be...

  9. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents. Final report

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

Full text: The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended where value tradeoffs are postponed until the very last stage of the decision process. Use of efficient frontiers is made to exclude all technically inferior solutions and present the decision maker with all non-dominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed where the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application on the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; next, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of European Communities through CEPN. (author)
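The efficient-frontier step described above amounts to discarding dominated alternatives. A minimal sketch with hypothetical (cost, residual dose) pairs, where lower is better in both criteria:

```python
# Non-dominated (Pareto/efficient-frontier) filter over alternatives
# scored on two criteria, e.g. (cost, residual dose). The alternatives
# and numbers are hypothetical, not from the Chernobyl application.
def non_dominated(points):
    """points: list of (cost, dose) tuples; lower is better in both."""
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

alts = [(10, 5), (4, 9), (6, 6), (9, 6), (5, 12)]
print(non_dominated(alts))
```

The decision maker then picks a point on this frontier, which implicitly fixes the value trade-off between the criteria.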

  10. Mirror matter as self-interacting dark matter

    International Nuclear Information System (INIS)

    Mohapatra, R.N.; Nussinov, S.; Teplitz, V.L.

    2002-01-01

    It has been argued that the observed core density profile of galaxies is inconsistent with having a dark matter particle that is collisionless and that alternative dark matter candidates which are self-interacting may explain observations better. One new class of self-interacting dark matter that has been proposed in the context of mirror universe models of particle physics is the mirror hydrogen atom, whose stability is guaranteed by the conservation of mirror baryon number. We show that the effective transport cross section for mirror hydrogen atoms has the right order of magnitude for solving the 'cuspy' halo problem. Furthermore, the suppression of dissipation effects for mirror atoms due to a higher mirror mass scale prevents the mirror halo matter from collapsing into a disk, strengthening the argument for mirror matter as galactic dark matter

  11. Physics through the 1990s: Condensed-matter physics

    International Nuclear Information System (INIS)

    1986-01-01

    In this survey of condensed-matter physics we describe the current status of the field, present some of the significant discoveries and developments in it since the early 1970s, and indicate some areas in which we expect that important discoveries will be made in the next decade. We also describe the resources that will be required to produce these discoveries. This volume is organized as follows. The first part is devoted to a discussion of the importance of condensed-matter physics; to brief descriptions of several of the most significant discoveries and advances in condensed-matter physics made in the 1970s and early 1980s, and of areas that appear to provide particularly exciting research opportunities in the next decade; and to a presentation of the support needs of condensed-matter physicists in the next decade and of recommendations aimed at their provision. Next, the subfields of condensed-matter physics are reviewed in detail. The volume concludes with several appendixes in which new materials, new experimental techniques, and the National Facilities are reviewed

  12. A game-based decision support methodology for competitive systems design

    Science.gov (United States)

    Briceno, Simon Ignacio

This dissertation describes the development of a game-based methodology that facilitates the exploration and selection of research and development (R&D) projects under uncertain competitive scenarios. The proposed method provides an approach that analyzes competitor positioning and formulates response strategies to forecast the impact of technical design choices on a project's market performance. A critical decision in the conceptual design phase of propulsion systems is the selection of the best architecture, centerline, core size, and technology portfolio. This selection can be challenging when considering evolving requirements from both the airframe manufacturing company and the airlines in the market. Furthermore, the exceedingly high cost of core architecture development and its associated risk make this strategic architecture decision the most important one for an engine company. Traditional conceptual design processes emphasize performance and affordability as their main objectives. These areas alone, however, do not provide decision-makers with enough information as to how successful their engine will be in a competitive market. A key objective of this research is to examine how firm characteristics such as their relative differences in completing R&D projects, differences in the degree of substitutability between different project types, and first/second-mover advantages affect their product development strategies. Several quantitative methods are investigated that analyze business and engineering strategies concurrently. In particular, formulations based on the well-established mathematical field of game theory are introduced to obtain insights into the project selection problem. The use of game theory is explored in this research as a method to assist the selection process of R&D projects in the presence of imperfect market information.
The proposed methodology focuses on two influential factors: the schedule uncertainty of project completion times and
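The game-theoretic core of such a selection problem can be sketched as a search for pure-strategy Nash equilibria in a small payoff matrix. The two firms, actions, and payoffs below are hypothetical and not drawn from the dissertation:

```python
# Hypothetical 2x2 R&D game: each of two engine firms picks a project
# type ("deriv" = derivative engine, "core" = new core). Payoffs are
# illustrative market shares. A profile is a pure Nash equilibrium if
# neither firm gains by unilaterally deviating.
payoff_a = {("deriv", "deriv"): 3, ("deriv", "core"): 2,
            ("core", "deriv"): 5, ("core", "core"): 1}
payoff_b = {("deriv", "deriv"): 3, ("deriv", "core"): 5,
            ("core", "deriv"): 2, ("core", "core"): 1}

def pure_nash():
    acts = ("deriv", "core")
    equilibria = []
    for a in acts:
        for b in acts:
            best_a = all(payoff_a[(a, b)] >= payoff_a[(x, b)] for x in acts)
            best_b = all(payoff_b[(a, b)] >= payoff_b[(a, y)] for y in acts)
            if best_a and best_b:
                equilibria.append((a, b))
    return equilibria

print(pure_nash())
```

With these payoffs the equilibria are the two differentiated profiles (one firm develops a new core while the other does not), the kind of positioning insight the methodology above seeks under uncertainty.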

  13. Methodology for Environmental Impact Assessment; Metodik foer miljoekonsekvensbedoemning

    Energy Technology Data Exchange (ETDEWEB)

    Malmlund, Anna (Structor Miljoebyraan Stockholm AB (Sweden))

    2010-12-15

    This report is an appendix to 'Environmental Impact Assessment Interim storage, encapsulation and disposal of spent nuclear fuel'. The appendix presents the methodology and criteria used in support investigations to conduct impact assessments.

  14. A state-impact-state methodology for assessing environmental impact in land use planning

    International Nuclear Information System (INIS)

    Chen, Longgao; Yang, Xiaoyan; Chen, Longqian; Potter, Rebecca; Li, Yingkui

    2014-01-01

The implementation of land use planning (LUP) has a large impact on environmental quality, yet a widely accepted and consolidated approach to assessing the LUP environmental impact using Strategic Environmental Assessment (SEA) is lacking. In this paper, we developed a state-impact-state (SIS) model employed in the LUP environmental impact assessment (LUPEA). Using the matter-element (ME) and Extenics methods, the methodology based on the SIS model was established and applied in the LUPEA of Zoucheng County, China. The results show that: (1) this methodology provides an intuitive and easily understood logical model for both the theoretical analysis and application of LUPEA; (2) the spatial multi-temporal assessment from the base year, through the near-future year, to the planning target year suggests a positive impact on environmental quality in the whole county despite certain environmental degradation in some towns; (3) besides the spatial assessment, other achievements were obtained, including the environmental elements influenced by land use and their weights, the identification of key indicators in LUPEA, and appropriate environmental mitigation measures; and (4) this methodology can be used to achieve multi-temporal assessment of LUP environmental impact at the county or town level in other areas. - Highlights: • A State-Impact-State model for Land Use Planning Environmental Assessment (LUPEA). • Matter-element (ME) and Extenics methods were embedded in the LUPEA. • The model was applied to the LUPEA of Zoucheng County. • The assessment shows improving environmental quality since 2000 in Zoucheng County. • The method provides a useful tool for the LUPEA at the county level.
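The matter-element (Extenics) machinery referred to above rests on an extension distance from an interval and a dependent function built from it. A minimal sketch of the elementary form, with illustrative intervals and values (the study's actual indicators and intervals are not reproduced here):

```python
# Elementary Extenics dependent function: rho(x, [a, b]) is the
# extension distance of a value x from an interval, and K(x) grades how
# well x fits a classical interval x0 nested in a wider field interval
# xw. Intervals and the test value below are illustrative only.
def rho(x, a, b):
    """Extension distance of x from the interval [a, b]."""
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def dependent_degree(x, x0, xw):
    """K(x) for classical interval x0 = (a, b) inside field xw = (c, d)."""
    d0 = rho(x, *x0)
    dw = rho(x, *xw)
    if dw == d0:          # zero-denominator guard; one common convention
        return -d0
    return d0 / (dw - d0)

# a value inside the classical interval gets a positive degree,
# a value outside it gets a negative degree
print(dependent_degree(5.0, (4.0, 8.0), (0.0, 10.0)))
```

In an assessment like the one above, each indicator value is graded this way against the intervals of each quality class, and the weighted degrees decide the class of each spatial unit.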

  15. A state-impact-state methodology for assessing environmental impact in land use planning

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Longgao [Institute of land resources, Jiangsu Normal University, Xuzhou 221116 (China); Yang, Xiaoyan [Institute of land resources, Jiangsu Normal University, Xuzhou 221116 (China); School of Environmental Science and Spatial Informatics, China University of Mining and Technology, Xuzhou 221116 (China); Chen, Longqian, E-mail: cumt_chenlongqian@163.com [School of Environmental Science and Spatial Informatics, China University of Mining and Technology, Xuzhou 221116 (China); Potter, Rebecca; Li, Yingkui [Department of Geography, University of Tennessee, Knoxville, TN 37996 (United States)

    2014-04-01

The implementation of land use planning (LUP) has a large impact on environmental quality, yet a widely accepted and consolidated approach to assessing the LUP environmental impact using Strategic Environmental Assessment (SEA) is lacking. In this paper, we developed a state-impact-state (SIS) model employed in the LUP environmental impact assessment (LUPEA). Using the matter-element (ME) and Extenics methods, the methodology based on the SIS model was established and applied in the LUPEA of Zoucheng County, China. The results show that: (1) this methodology provides an intuitive and easily understood logical model for both the theoretical analysis and application of LUPEA; (2) the spatial multi-temporal assessment from the base year, through the near-future year, to the planning target year suggests a positive impact on environmental quality in the whole county despite certain environmental degradation in some towns; (3) besides the spatial assessment, other achievements were obtained, including the environmental elements influenced by land use and their weights, the identification of key indicators in LUPEA, and appropriate environmental mitigation measures; and (4) this methodology can be used to achieve multi-temporal assessment of LUP environmental impact at the county or town level in other areas. - Highlights: • A State-Impact-State model for Land Use Planning Environmental Assessment (LUPEA). • Matter-element (ME) and Extenics methods were embedded in the LUPEA. • The model was applied to the LUPEA of Zoucheng County. • The assessment shows improving environmental quality since 2000 in Zoucheng County. • The method provides a useful tool for the LUPEA at the county level.

  16. Effects of support masses on seismic response of piping and supports

    International Nuclear Information System (INIS)

    Iotti, R.C.; Dinkevich, S.

    1985-01-01

A special methodology is presented for quantitatively predicting when the effect of piping restraint masses is significant and should be explicitly considered in piping seismic analyses that use the response spectrum method. It is concluded that the effect of support mass in the unrestrained direction is to increase piping and support responses by a percentage no larger than twice the ratio of the support mass to the supported pipe span mass. In the restrained direction, the mass of the support significantly reduces its dynamic stiffness, so that for low support stiffnesses and a relatively large mass the support can act as an amplifier of vibration. The dynamic effect, however, is negligible for very stiff supports. (orig.)
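The stated bound for the unrestrained direction (a response increase of at most twice the support-to-span mass ratio) lends itself to a simple screening check. The masses below are illustrative, not from the paper:

```python
# Screening check based on the bound quoted above: the fractional
# increase in piping/support response is no larger than twice the ratio
# of support mass to supported pipe span mass. Numbers are illustrative.
def max_response_increase(m_support, m_span):
    """Upper bound on the fractional response increase."""
    return 2.0 * m_support / m_span

# e.g. a 20 kg restraint on a 500 kg supported pipe span
print(max_response_increase(20.0, 500.0))  # bound of 0.08, i.e. at most ~8 %
```

If the bound is small compared with the analysis margins, the restraint mass can be neglected; otherwise it should be modeled explicitly.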

3D Integrated Methodologies for the Documentation and the Virtual Reconstruction of an Archaeological Site

    Science.gov (United States)

    Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.

    2015-02-01

Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its main role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments in on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to check different acquisition systems and test their effectiveness, considering each methodology individually or integrated. This research builds on the awareness that integrating different survey methodologies can in fact increase the representative efficacy of the final representations, which are then based on a wider and verified set of georeferenced metric data. In particular, the integration of methods allows reducing or neutralizing issues related to the survey of composite and complex objects, since the most appropriate tools and techniques can be chosen considering the characteristics of each part of an archaeological site (i.e. urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of urban structures and monuments, sometimes hard to reach, was realized using active and passive techniques (range-based and image-based methods). This acquisition was planned in order to obtain not only the basic support for interpretation analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with point clouds from Structure-from-Motion techniques into the same reference system, given by a topographical and GPS survey.
These 3D models are not only the final results of the metric

  18. Wormhole supported by dark energy admitting conformal motion

    Energy Technology Data Exchange (ETDEWEB)

    Bhar, Piyali [Government General Degree College, Singur, Department of Mathematics, Hooghly, West Bengal (India); Rahaman, Farook; Banerjee, Ayan [Jadavpur University, Department of Mathematics, Kolkata, West Bengal (India); Manna, Tuhina [St. Xavier' s College, Department of Mathematics and Statistics (Commerce Evening), Kolkata, West Bengal (India)

    2016-12-15

In this article, we study the possibility of sustaining static and spherically symmetric traversable wormhole geometries admitting conformal motion in Einstein gravity, which presents a more systematic approach to searching for a relation between matter and geometry. In wormhole physics, the presence of exotic matter is a fundamental ingredient, and we show that this exotic source can be of dark energy type, which supports the existence of wormhole spacetimes. In this work we model a wormhole supported by dark energy which admits conformal motion. We also discuss the possibility of the detection of wormholes in the outer regions of galactic halos by means of gravitational lensing. Studies of the total gravitational energy for the exotic matter inside a static wormhole configuration are also performed. (orig.)

  19. Ratcheting Up The Search for Dark Matter

    Energy Technology Data Exchange (ETDEWEB)

    McDermott, Samuel Dylan [Univ. of Michigan, Ann Arbor, MI (United States)

    2014-01-01

The last several years have included remarkable advances in two of the primary areas of fundamental particle physics: the search for dark matter and the discovery of the Higgs boson. This dissertation will highlight some contributions made on the forefront of these exciting fields. Although the circumstantial evidence supporting the dark matter hypothesis is now almost undeniably significant, indisputable direct proof is still lacking. As the direct searches for dark matter continue, we can maximize our prospects of discovery by using theoretical techniques complementary to the observational searches to rule out additional, otherwise accessible parameter space. In this dissertation, I report bounds on a wide range of dark matter theories. The models considered here cover the spectrum from the canonical case of self-conjugate dark matter with weak-scale interactions, to electrically charged dark matter, to non-annihilating, non-fermionic dark matter. These bounds are obtained from considerations of astrophysical and cosmological data, including, respectively: diffuse gamma ray photon observations; structure formation considerations, along with an explication of the novel local dark matter structure due to galactic astrophysics; and the existence of old pulsars in dark-matter-rich environments. I also consider the prospects for a model of neutrino dark matter which has been motivated by a wide set of seemingly contradictory experimental results. In addition, I include a study that provides the tools to begin solving the speculative "inverse" problem of extracting dark matter properties solely from hypothetical nuclear energy spectra, which we may face if dark matter is discovered with multiple direct detection experiments. In contrast to the null searches for dark matter, we have the example of the recent discovery of the Higgs boson. The Higgs boson is the first fundamental scalar particle ever observed, and precision measurements of the production and

  20. Exchange of Information in Tax Matters

    Directory of Open Access Journals (Sweden)

    Paweł Szwajdler

    2017-01-01

Full Text Available The main aim of this paper is to present issues related to the exchange of tax information. The author focuses on models of exchange of information and the boundaries of obligations with reference to the above-mentioned problems. Automatic exchange of information, spontaneous exchange of information and exchange of information on request are analysed in this work on the basis of the OECD Convention on Mutual Administrative Assistance in Tax Matters, Council Directive 2011/16 and the OECD Model Agreement on Exchange of Information in Tax Matters. In the summary, it is shown that the most efficient method of exchange of tax information is automatic exchange of information. Furthermore, it is argued that exchange on request can be associated with negative phenomena such as fishing expeditions, while spontaneous exchange of information is thought to play only a supportive role. The author also considers that the boundaries of exchange of information in tax matters were regulated so as to protect the jeopardised raison d'État.

  1. Dark matter in the universe

    International Nuclear Information System (INIS)

    Turner, M.S.; Chicago Univ., IL

    1990-11-01

What is the quantity and composition of material in the Universe? This is one of the most fundamental questions we can ask about the Universe, and its answer bears on a number of important issues including the formation of structure in the Universe, and the ultimate fate and the earliest history of the Universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand: Most of the material in the Universe does not give off detectable radiation, i.e., is "dark"; the dark matter associated with bright galaxies contributes somewhere between 10% and 30% of the critical density (by comparison, luminous matter contributes less than 1%); baryonic matter contributes between 1.1% and 12% of critical. The case for the spatially-flat, Einstein-de Sitter model is supported by three compelling theoretical arguments - structure formation, the temporal Copernican principle, and inflation - and by some observational data. If Ω is indeed unity - or even just significantly greater than 0.1 - then there is a strong case for a Universe comprised of nonbaryonic matter. There are three well-motivated particle dark-matter candidates: an axion of mass 10^-6 eV to 10^-4 eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either being planned or are underway. 63 refs

  2. Methodology for the interactive graphic simulator construction

    International Nuclear Information System (INIS)

    Milian S, Idalmis; Rodriguez M, Lazaro; Lopez V, Miguel A.

    1997-01-01

    The PC-supported Interactive Graphic Simulators (IGS) have successfully been used for industrial training programs in many countries. This paper is intended to illustrate the general methodology applied by our research team for the construction of this kind of conceptual or small scale simulators. The information and tools available to achieve this goal are also described. The applicability of the present methodology was confirmed with the construction of a set of IGS for nuclear power plant operators' training programs in Cuba. One of them, relating to reactor kinetics, is shown and briefly described in this paper. (author). 11 refs., 3 figs

  3. Investigation of the organic matter in inactive nuclear tank liquids

    International Nuclear Information System (INIS)

    Schenley, R.L.; Griest, W.H.

    1990-08-01

    Environmental Protection Agency (EPA) methodology for regulatory organics fails to account for the organic matter that is suggested by total organic carbon (TOC) analysis in the Oak Ridge National Laboratory (ORNL) inactive nuclear waste-tank liquids and sludges. Identification and measurement of the total organics are needed to select appropriate waste treatment technologies. An initial investigation was made of the nature of the organics in several waste-tank liquids. This report details the analysis of ORNL wastes

  4. Unattended Monitoring System Design Methodology

    International Nuclear Information System (INIS)

    Drayer, D.D.; DeLand, S.M.; Harmon, C.D.; Matter, J.C.; Martinez, R.L.; Smith, J.D.

    1999-01-01

    A methodology for designing Unattended Monitoring Systems starting at a systems level has been developed at Sandia National Laboratories. This proven methodology provides a template that describes the process for selecting and applying appropriate technologies to meet unattended system requirements, as well as providing a framework for development of both training courses and workshops associated with unattended monitoring. The design and implementation of unattended monitoring systems is generally intended to respond to some form of policy-based requirements resulting from international agreements or domestic regulations. Once the monitoring requirements are established, a review of the associated process and its related facilities enables identification of strategic monitoring locations and development of a conceptual system design. The detailed design effort results in the definition of detection components as well as the supporting communications network and data management scheme. The data analysis then enables a coherent display of the knowledge generated during the monitoring effort. The resultant knowledge is then compared to the original system objectives to ensure that the design adequately addresses the fundamental principles stated in the policy agreements. Implementation of this design methodology will ensure that comprehensive unattended monitoring system designs provide appropriate answers to those critical questions imposed by specific agreements or regulations. This paper describes the main features of the methodology and discusses how it can be applied in real world situations
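    The design flow described above, from monitoring requirements through detection components and back to a check against the original objectives, amounts to a requirements-traceability check. A minimal sketch, with all requirement and component names invented for illustration:

```python
# Hypothetical traceability check: every monitoring requirement must be
# covered by at least one detection component in the conceptual design.
requirements = {"R1: detect material movement",
                "R2: record door access",
                "R3: verify seal integrity"}

# Mapping from each designed component to the requirements it addresses
# (illustrative data, not drawn from the methodology itself).
design = {
    "motion sensor": {"R1: detect material movement"},
    "door switch":   {"R2: record door access"},
}

def uncovered(reqs, components):
    """Return the requirements that no component in the design addresses."""
    covered = set().union(*components.values()) if components else set()
    return reqs - covered

gaps = uncovered(requirements, design)
```

    Here the check flags the seal-integrity requirement as uncovered, so the conceptual design would be revised before detailed design begins.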

  5. Proposing C4ISR Architecture Methodology for Homeland Security

    National Research Council Canada - National Science Library

    Farah-Stapleton, Monica F; Dimarogonas, James; Eaton, Rodney; Deason, Paul J

    2004-01-01

    This presentation describes how a network architecture methodology developed for the Army's Future Force could be applied to the requirements of Civil Support, Homeland Security/Homeland Defense (CS HLS/HLD...

  6. Methodological issues involved in conducting qualitative research ...

    African Journals Online (AJOL)

    The purpose of this article is to describe the methodological issues involved in conducting qualitative research to explore and describe nurses' experience of being directly involved with termination of pregnancies and developing guidelines for support for these nurses. The article points out the sensitivity and responsibility ...

  7. Polycyclic aromatic hydrocarbons and organic matter associated to particulate matter emitted from atmospheric fluidized bed coal combustion

    International Nuclear Information System (INIS)

    Mastral, A.M.; Callen, M.S.; Garcia, T.

    1999-01-01

    The polycyclic aromatic hydrocarbon (PAH) and organic matter (OM) content associated with particulate matter (PM) emissions from atmospheric fluidized bed coal combustion have been studied. The two main aims of the work were (a) to study OM and PAH emissions in the solid phase as a function of the coal fluidized bed combustion (FBC) variables and (b) to check whether there is any correlation between the OM and PAH contained in the PM. The combustion was carried out in a laboratory-scale plant under different combustion conditions: temperature, percentage of oxygen excess, and total air flow. The PAH associated with the particulate matter were analyzed by fluorescence spectroscopy in the synchronous mode (FS) after PM extraction by sonication with dimethylformamide (DMF). It can be concluded that there is no direct relationship between the OM content and the PAH carried on the emitted PM; in addition, PM and OM show no dependence on each other

  8. Is there a dichotomy in the Dark Matter as well as in the Baryonic Matter properties of ellipticals?

    NARCIS (Netherlands)

    Napolitano, NR; Capaccioli, M; Arnaboldi, M; Merrifield, MR; Douglas, NG; Kuijken, K; Romanowsky, AJ; Freeman, KC; Ryder, SD; Pisano, DJ; Walker, MA; Freeman, KC

    2004-01-01

    We have found a correlation between the M/L global gradients and the structural parameters of the luminous components of a sample of 19 early-type galaxies. Such a correlation supports the hypothesis that there is a connection between the dark matter content and the evolution of the baryonic

  9. Present stage evaluation of Furnas calculus methodology qualification

    International Nuclear Information System (INIS)

    1987-07-01

    This technical note evaluates the present stage of the FURNAS Calculus Methodology Qualification related to the reload licensing process and to licensing support for operational questions at the Angra 1 NPP, in the transient and core thermal-hydraulic areas. (Author) [pt

  10. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used. To validate the new methodology, the results were compared with the FSAR. The results obtained with the KEPRI methodology are similar to those of the FSAR, and the sensitivity results for the postulated parameters were similar to those of the existing methodology

  11. Intelligent systems/software engineering methodology - A process to manage cost and risk

    Science.gov (United States)

    Friedlander, Carl; Lehrer, Nancy

    1991-01-01

    A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.

  12. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    Science.gov (United States)

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of

  13. PWR control system design using advanced linear and non-linear methodologies

    International Nuclear Information System (INIS)

    Rabindran, N.; Whitmarsh-Everiss, M.J.

    2004-01-01

    Consideration is here given to the methodology deployed for non-linear heuristic analysis in the time domain supported by multi-variable linear control system design methods for the purposes of operational dynamics and control system analysis. This methodology is illustrated by the application of structural singular value μ analysis to Pressurised Water Reactor control system design. (author)
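    Structured singular value (μ) analysis itself requires substantially more machinery, but the largest singular value of a system matrix is the standard upper bound on μ. A stdlib sketch for the 2×2 case, offered only as an illustration of that bound, not as the paper's method:

```python
import math

def largest_singular_value_2x2(a, b, c, d):
    """Largest singular value of [[a, b], [c, d]], computed analytically
    as the square root of the larger eigenvalue of A^T A."""
    # Entries of the symmetric matrix A^T A = [[p, q], [q, r]].
    p = a * a + c * c
    q = a * b + c * d
    r = b * b + d * d
    # Eigenvalues of a 2x2 symmetric matrix via the quadratic formula.
    mean = 0.5 * (p + r)
    disc = math.sqrt(max(0.0, mean * mean - (p * r - q * q)))
    return math.sqrt(mean + disc)

# For a diagonal matrix the largest singular value is the largest |entry|.
sigma = largest_singular_value_2x2(3.0, 0.0, 0.0, 4.0)
```

    For robustness analysis, mu(A) <= sigma_max(A) always holds, which is why singular-value computations of this kind appear as building blocks inside μ-analysis tools.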

  14. Regional gray matter growth, sexual dimorphism, and cerebral asymmetry in the neonatal brain.

    Science.gov (United States)

    Gilmore, John H; Lin, Weili; Prastawa, Marcel W; Looney, Christopher B; Vetsa, Y Sampath K; Knickmeyer, Rebecca C; Evans, Dianne D; Smith, J Keith; Hamer, Robert M; Lieberman, Jeffrey A; Gerig, Guido

    2007-02-07

    Although there has been recent interest in the study of childhood and adolescent brain development, very little is known about normal brain development in the first few months of life. In older children, there are regional differences in cortical gray matter development, whereas cortical gray and white matter growth after birth has not been studied to a great extent. The adult human brain is also characterized by cerebral asymmetries and sexual dimorphisms, although very little is known about how these asymmetries and dimorphisms develop. We used magnetic resonance imaging and an automatic segmentation methodology to study brain structure in 74 neonates in the first few weeks after birth. We found robust cortical gray matter growth compared with white matter growth, with occipital regions growing much faster than prefrontal regions. Sexual dimorphism is present at birth, with males having larger total brain cortical gray and white matter volumes than females. In contrast to adults and older children, the left hemisphere is larger than the right hemisphere, and the normal pattern of fronto-occipital asymmetry described in older children and adults is not present. Regional differences in cortical gray matter growth are likely related to differential maturation of sensory and motor systems compared with prefrontal executive function after birth. These findings also indicate that whereas some adult patterns of sexual dimorphism and cerebral asymmetries are present at birth, others develop after birth.

  15. Requirements model generation to support requirements elicitation: The Secure Tropos experience

    NARCIS (Netherlands)

    Kiyavitskaya, N.; Zannone, N.

    2008-01-01

    In recent years several efforts have been devoted by researchers in the Requirements Engineering community to the development of methodologies for supporting designers during requirements elicitation, modeling, and analysis. However, these methodologies often lack tool support to facilitate their

  16. Using Facebook to Support Novice Teachers

    Science.gov (United States)

    Staudt, Denise; St. Clair, Norman; Martinez, Elda E.

    2013-01-01

    Providing quality support for novice teachers as they enter the profession has been an ongoing concern of educator preparation programs. This article describes the efforts of one teacher preparation program in addressing this matter by utilizing Facebook[R] to provide sustained support and professional development for its beginning teachers. We…

  17. Scoping paper on new CDM baseline methodology for cross-border power trade

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    Poeyry has been sub-contracted by Carbon Limits, under the African Development Bank CDM Support Programme, to prepare a new CDM baseline methodology for cross border trade, based on a transmission line from Ethiopia to Kenya. The first step in that process is to review the response of the UNFCCC, particularly the Methodologies Panel ('Meth Panel') of the CDM Executive Board, to the various proposals on cross-border trade and interconnection of grids. This report reviews the Methodology Panel and Executive Board decisions on 4 requests for revisions of ACM2 'Consolidated baseline methodology for grid-connected electricity generation from renewable sources', and 5 proposed new baseline methodologies (NM255, NM269, NM272, NM318, NM342), all of which were rejected. We analyse the reasons the methodologies were rejected, and whether the proposed draft Approved Methodology (AM) that the Methodology Panel created in response to NM269 and NM272 is a suitable basis for a new methodology proposal.(auth)

  18. Response spectrum analysis for multi-supported subsystems

    International Nuclear Information System (INIS)

    Reed, J.W.

    1983-01-01

    A methodology was developed to analyze multi-supported subsystems (e.g., piping systems) for seismic or other dynamic forces using response spectrum input. Currently, subsystems which are supported at more than one location in a nuclear power plant building are analyzed either by the time-history method or by response spectrum procedures, where spectra which envelop all support locations are used. The former procedure is exceedingly expensive, while the latter procedure is inexpensive but very conservative. Improved analysis procedures are currently being developed which are either coupled- or uncoupled-system approaches. For the coupled-system approach, response feedback between the subsystem and building system is included. For the uncoupled-system approach, feedback is neglected; however, either time history or response spectrum methods can be used. The methodology developed for analyzing multi-supported subsystems is based on the assumption that the building response and the subsystem response are uncoupled. This is the same assumption implicitly made by analysts who design singly-supported subsystems using floor response spectrum input. This approach implies that there is no response feedback between the primary building system and the subsystem, which is generally found to be conservative. The methodology developed for multi-supported subsystems makes this same assumption and thus should produce results with the same ease and degree of accuracy as results obtained for singly-supported subsystems. (orig./HP)
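    The floor-response-spectrum idea underlying this kind of analysis, sweeping single-degree-of-freedom oscillators over a range of periods against a base-acceleration history and recording each peak response, can be sketched with a simple central-difference integration. The pulse input, time step, and damping ratio below are illustrative assumptions, not values from the paper:

```python
import math

def sdof_peak(ground_acc, dt, period, zeta=0.05):
    """Peak relative displacement of a damped single-degree-of-freedom
    oscillator driven by base acceleration, integrating
        x'' + 2*zeta*w*x' + w^2*x = -a_g(t)
    with the explicit central-difference scheme."""
    w = 2.0 * math.pi / period
    A = 1.0 / dt**2 + zeta * w / dt
    B = 2.0 / dt**2 - w**2
    C = 1.0 / dt**2 - zeta * w / dt
    x_prev = 0.5 * dt**2 * (-ground_acc[0])   # x at t = -dt, starting at rest
    x = 0.0
    peak = 0.0
    for a in ground_acc:
        x_next = (-a + B * x - C * x_prev) / A
        x_prev, x = x, x_next
        peak = max(peak, abs(x))
    return peak

# Response spectrum of an illustrative half-sine base-acceleration pulse.
dt = 0.001
pulse = [math.sin(math.pi * i * dt / 0.5) if i * dt < 0.5 else 0.0
         for i in range(4000)]
periods = [0.1, 0.2, 0.5, 1.0, 2.0]
spectrum = {T: sdof_peak(pulse, dt, T) for T in periods}
```

    For a multi-supported subsystem, the uncoupled approach described above would repeat this sweep with the response history at each support location, then combine the resulting support spectra.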

  19. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.
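    An analysis of alternatives of this kind typically reduces to scoring each methodology or tool against weighted criteria. A minimal sketch, with the criteria, weights, and scores all invented for illustration:

```python
# Illustrative weighted-criteria scoring for an analysis of alternatives.
# Criteria, weights, and scores are invented, not taken from the report.
criteria_weights = {"cyber coverage": 0.5, "data needs": 0.2, "maturity": 0.3}

alternatives = {
    "Tool A": {"cyber coverage": 4, "data needs": 3, "maturity": 5},
    "Tool B": {"cyber coverage": 2, "data needs": 5, "maturity": 4},
}

def weighted_score(scores, weights):
    """Weighted sum of per-criterion scores (higher is better)."""
    return sum(weights[c] * scores[c] for c in weights)

ranked = sorted(alternatives,
                key=lambda name: weighted_score(alternatives[name],
                                                criteria_weights),
                reverse=True)
```

    Tailoring the analysis to cyber risk, as the report describes, would amount to choosing cyber-specific criteria and weights before scoring.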

  20. Support system for the professional integration of people with disability into the labour market.

    Science.gov (United States)

    Filgueiras, Ernesto; Vilar, Elisângela; Rebelo, Francisco

    2015-01-01

    Successful cases of professional reintegration are achieved when adequate conditions are created for adapting a worker with disability to the working environment and the professional activity, allowing them to carry out all their functions without restriction. In this sense, this paper presents a methodology for the professional integration of people with disability in service companies and industry. Its results include a matrix for analysing a set of observables relevant to the reintegration of people with disability into the labour market, as well as an auxiliary tool for those who work in personnel recruitment. The main objective was to develop a software tool, based on crossing data obtained from the analysis of individual capacities with the requirements of the job, to optimise the relationship between the worker and the workplace. A series of strategies that individuals can adopt, and possible adaptations to the workplace, were also considered as ways to reduce the handicap in accomplishing different activities. The methodology is divided into two phases: Phase I covers the assessment criteria and classification of the indispensable functional characteristics of the individuals; Phase II concerns the assessment criteria for the jobs and the functions to be performed. As a result, an evaluation tool was developed to match individuals' capabilities with job requirements, and software was created to support the evaluation and help professionals during the assessment. This methodology, together with the support tool, proved to be an inclusive approach, as it gives priority to the capacities of the individuals and the real necessities of the workplaces.
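    The core of such a tool, crossing an individual's functional capacities with job requirements to flag feasibility and candidate adaptations, can be sketched in a few lines. The rating scale and the worker/job profiles below are invented for illustration:

```python
# Illustrative capability/requirement matching on an assumed 0-5 scale.
def match(capacities, requirements):
    """Return (feasible, gaps): feasible is True when every required level
    is met; gaps lists the requirements the person does not currently meet,
    which are the candidates for workplace adaptation."""
    gaps = {task: level for task, level in requirements.items()
            if capacities.get(task, 0) < level}
    return (not gaps), gaps

# Hypothetical profiles (not taken from the paper).
worker = {"lifting": 2, "fine motor": 5, "standing": 1}
job = {"fine motor": 4, "standing": 3}

feasible, gaps = match(worker, job)
```

    In this example the fine-motor requirement is met but standing falls short, so the method would point to an adaptation (for instance, a seated workstation) rather than exclusion.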

  1. Evaluation of modulation transfer function of optical lens system by support vector regression methodologies - A comparative study

    Science.gov (United States)

    Petković, Dalibor; Shamshirband, Shahaboddin; Saboohi, Hadi; Ang, Tan Fong; Anuar, Nor Badrul; Rahman, Zulkanain Abdul; Pavlović, Nenad T.

    2014-07-01

    The quantitative assessment of image quality is an important consideration in any type of imaging system. The modulation transfer function (MTF) is a graphical description of the sharpness and contrast of an imaging system or of its individual components; it is also known as the spatial frequency response. The MTF curve has different meanings according to the corresponding frequency. The MTF of an optical system specifies the contrast transmitted by the system as a function of image size, and is determined by the inherent optical properties of the system. In this study, the polynomial and radial basis function (RBF) kernels are applied in Support Vector Regression (SVR) to estimate and predict the MTF value of an actual optical system from experimental tests. Instead of minimizing the observed training error, SVR_poly and SVR_rbf attempt to minimize the generalization error bound so as to achieve generalized performance. The experimental results show that the SVR_rbf approach achieves better predictive accuracy and generalization capability than the SVR_poly soft computing methodology.
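    The Python standard library has no SVM solver, so the sketch below substitutes kernel ridge regression, a close relative of SVR that shares the RBF kernel, to illustrate kernel-based estimation of an MTF-like curve. The synthetic data, gamma, and regularization values are assumptions for the sketch, not the paper's settings:

```python
import math

def rbf(u, v, gamma=1.0):
    """Radial basis function kernel, as used by the SVR_rbf variant."""
    return math.exp(-gamma * (u - v) ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def fit_krr(xs, ys, gamma=1.0, lam=1e-4):
    """Kernel ridge regression: alpha = (K + lam*I)^-1 y."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j], gamma) + (lam if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, ys)
    return lambda x: sum(a * rbf(x, xi, gamma) for a, xi in zip(alpha, xs))

# Synthetic MTF-like curve: contrast falling with spatial frequency.
freqs = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
mtf = [math.exp(-2.0 * f) for f in freqs]
predict = fit_krr(freqs, mtf, gamma=4.0)
```

    With a library such as scikit-learn available, `fit_krr` would be replaced by an actual SVR fit with `kernel="rbf"` or `kernel="poly"`, which is the comparison the paper reports.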

  2. Dark matter in the Universe

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. (Fermi National Accelerator Lab., Batavia, IL (USA); Chicago Univ., IL (USA). Enrico Fermi Inst.)

    1991-03-01

    What is the quantity and composition of material in the universe? This is one of the most fundamental questions we can ask about the universe, and its answer bears on a number of important issues including the formation of structure in the universe, and the ultimate fate and the earliest history of the universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand: most of the material in the universe does not give off detectable radiation, i.e., is ''dark''; the dark matter associated with bright galaxies contributes somewhere between 10% and 30% of the critical density (by comparison luminous matter contributes less than 1%); baryonic matter contributes between 1.1% and 12% of critical. The case for the spatially-flat, Einstein-de Sitter model is supported by three compelling theoretical arguments -- structure formation, the temporal Copernican principle, and inflation -- and by some observational data. If Ω is indeed unity -- or even just significantly greater than 0.1 -- then there is a strong case for a universe comprised of nonbaryonic matter. There are three well motivated particle dark-matter candidates: an axion of mass 10⁻⁶ eV to 10⁻⁴ eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either being planned or are underway. 71 refs., 6 figs.

  3. Dark matter in the universe

    Energy Technology Data Exchange (ETDEWEB)

    Turner, M.S. (Fermi National Accelerator Lab., Batavia, IL (USA); Chicago Univ., IL (USA). Enrico Fermi Inst.)

    1990-11-01

    What is the quantity and composition of material in the Universe? This is one of the most fundamental questions we can ask about the Universe, and its answer bears on a number of important issues including the formation of structure in the Universe, and the ultimate fate and the earliest history of the Universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand: Most of the material in the Universe does not give off detectable radiation, i.e., is ''dark''; the dark matter associated with bright galaxies contributes somewhere between 10% and 30% of the critical density (by comparison luminous matter contributes less than 1%); baryonic matter contributes between 1.1% and 12% of critical. The case for the spatially-flat, Einstein-de Sitter model is supported by three compelling theoretical arguments--structure formation, the temporal Copernican principle, and inflation--and by some observational data. If Ω is indeed unity--or even just significantly greater than 0.1--then there is a strong case for a Universe comprised of nonbaryonic matter. There are three well motivated particle dark-matter candidates: an axion of mass 10⁻⁶ eV to 10⁻⁴ eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either being planned or are underway. 63 refs.

  4. Gray Matter Is Targeted in First-Attack Multiple Sclerosis

    Energy Technology Data Exchange (ETDEWEB)

    Schutzer, Steven E.; Angel, Thomas E.; Liu, Tao; Schepmoes, Athena A.; Xie, Fang; Bergquist, Jonas P.; Vecsei, Laszlo; Zadori, Denes; Camp, David G.; Holland, Bart K.; Smith, Richard D.; Coyle, Patricia K.

    2013-09-10

    The cause of multiple sclerosis (MS), its driving pathogenesis at the earliest stages, and what factors allow the first clinical attack to manifest remain unknown. Some imaging studies suggest gray rather than white matter may be involved early, and some postulate this may be predictive of developing MS. Other imaging studies are in conflict. To determine if there was objective molecular evidence of gray matter involvement in early MS we used high-resolution mass spectrometry to identify proteins in the cerebrospinal fluid (CSF) of first-attack MS patients (two independent groups) compared to established relapsing remitting (RR) MS and controls. We found that the CSF proteins in first-attack patients were differentially enriched for gray matter components (axon, neuron, synapse). Myelin components did not distinguish these groups. The results support that gray matter dysfunction is involved early in MS, and also may be integral for the initial clinical presentation.

  5. Interactions between dark energy and dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Baldi, Marco

    2009-03-20

    We have investigated interacting dark energy cosmologies both concerning their impact on the background evolution of the Universe and their effects on cosmological structure growth. For the former aspect, we have developed a cosmological model featuring a matter species consisting of particles with a mass that increases with time. In such model the appearance of a Growing Matter component, which is negligible in early cosmology, dramatically slows down the evolution of the dark energy scalar field at a redshift around six, and triggers the onset of the accelerated expansion of the Universe, therefore addressing the Coincidence Problem. We propose to identify this Growing Matter component with cosmic neutrinos, in which case the present dark energy density can be related to the measured average mass of neutrinos. For the latter aspect, we have implemented the new physical features of interacting dark energy models into the cosmological N-body code GADGET-2, and we present the results of a series of high-resolution simulations for a simple realization of dark energy interaction. As a consequence of the new physics, cold dark matter and baryon distributions evolve differently both in the linear and in the non-linear regime of structure formation. Already on large scales, a linear bias develops between these two components, which is further enhanced by the non-linear evolution. We also find, in contrast with previous work, that the density profiles of cold dark matter halos are less concentrated in coupled dark energy cosmologies compared with ΛCDM. Also, the baryon fraction in halos in the coupled models is significantly reduced below the universal baryon fraction. These features alleviate tensions between observations and the ΛCDM model on small scales. Our methodology is ideally suited to explore the predictions of coupled dark energy models in the fully non-linear regime, which can provide powerful constraints for the viable parameter

  6. Interactions between dark energy and dark matter

    International Nuclear Information System (INIS)

    Baldi, Marco

    2009-01-01

    We have investigated interacting dark energy cosmologies both concerning their impact on the background evolution of the Universe and their effects on cosmological structure growth. For the former aspect, we have developed a cosmological model featuring a matter species consisting of particles with a mass that increases with time. In such model the appearance of a Growing Matter component, which is negligible in early cosmology, dramatically slows down the evolution of the dark energy scalar field at a redshift around six, and triggers the onset of the accelerated expansion of the Universe, therefore addressing the Coincidence Problem. We propose to identify this Growing Matter component with cosmic neutrinos, in which case the present dark energy density can be related to the measured average mass of neutrinos. For the latter aspect, we have implemented the new physical features of interacting dark energy models into the cosmological N-body code GADGET-2, and we present the results of a series of high-resolution simulations for a simple realization of dark energy interaction. As a consequence of the new physics, cold dark matter and baryon distributions evolve differently both in the linear and in the non-linear regime of structure formation. Already on large scales, a linear bias develops between these two components, which is further enhanced by the non-linear evolution. We also find, in contrast with previous work, that the density profiles of cold dark matter halos are less concentrated in coupled dark energy cosmologies compared with ΛCDM. Also, the baryon fraction in halos in the coupled models is significantly reduced below the universal baryon fraction. These features alleviate tensions between observations and the ΛCDM model on small scales. Our methodology is ideally suited to explore the predictions of coupled dark energy models in the fully non-linear regime, which can provide powerful constraints for the viable parameter space of such scenarios

  7. When Family-Supportive Supervision Matters: Relations between Multiple Sources of Support and Work-Family Balance

    Science.gov (United States)

    Greenhaus, Jeffrey H.; Ziegert, Jonathan C.; Allen, Tammy D.

    2012-01-01

    This study examines the mechanisms by which family-supportive supervision is related to employee work-family balance. Based on a sample of 170 business professionals, we found that the positive relation between family-supportive supervision and balance was fully mediated by work interference with family (WIF) and partially mediated by family…

  8. AN INTEGRATED METHODOLOGY FOR CUSTOMER RELATIONSHIP MANAGEMENT CUSTOMIZATION

    Directory of Open Access Journals (Sweden)

    Ricardo Colomo Palacios

    2008-02-01

    Full Text Available The importance and presence of technological solutions supporting CRM in organizations have been a vital business fact since the late nineties. Presently, the number of manufacturers in the market has dramatically decreased because of continuous takeovers and mergers, but the market has, on the other hand, gained momentum because of the sudden appearance of open-source and on-demand solutions. In this scope, a unified methodology centered on CRM solutions is of paramount importance, since implementation has traditionally been linked to either system integration or overall solution design. Based on two complementary de-facto standards for the implementation and development of Information Systems, namely ESA and the Dyché CRM systems implementation methodology, in this paper we provide a CRM business solution customization methodology that is independent of both the integrator and the tool-maker perspective.

  9. Integrated approach methodology: A handbook for power plant assessment

    International Nuclear Information System (INIS)

    Roush, M.L.; Modarres, M.; Hunt, R.N.M.; Kreps, D.; Pearce, R.

    1987-10-01

    This handbook is a practical document that provides the principles and steps of a method to help a utility's decision-making process on matters concerning plant safety and economy. It provides a framework for analyzing the manner in which plant equipment and personnel work together to achieve successful operation; also making possible the quantitative evaluation of individual contributors to success in overall plant operation. The methodology does not purport to instruct utilities on the proper way to run a power plant. Rather, it is an analytical tool to aid a utility in using plant data and other hands-on knowledge of its own personnel to solve practical problems

  10. Ancillary reactive power service allocation cost in deregulated markets: a methodology

    International Nuclear Information System (INIS)

    Hernandez, J. Horacio Tovar; Jimenez-Guzman, Miguel; Gutierrez-Alcaraz, Guillermo

    2005-01-01

    This paper presents a methodology to allocate reactive power costs in deregulated markets. The reactive power supply service is decomposed into voltage regulation and reactive power spinning reserve. The proposed methodology is based on sensitivities and the postage-stamp method in order to allocate the total service cost among all participants. To achieve this goal, the system operator identifies voltage support and/or reactive power requirements and looks for suitable providers. A case study on a simplified southeastern Mexican grid is presented to illustrate the methodology. (Author)
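
    The postage-stamp component of the proposed allocation charges each participant in proportion to its share of total demand, irrespective of network location. A minimal sketch of that pro-rata step (the function name and figures are illustrative, not taken from the paper):

    ```python
    def postage_stamp_allocation(total_cost, demands):
        """Allocate a total service cost pro rata to each participant's demand.

        demands: dict mapping participant name -> demand (e.g. MVAr of
        reactive support consumed). Location is deliberately ignored --
        that is the defining trait of the postage-stamp method.
        """
        total_demand = sum(demands.values())
        return {name: total_cost * d / total_demand for name, d in demands.items()}

    # Hypothetical example: allocate a $1200 reactive-support cost.
    charges = postage_stamp_allocation(1200.0, {"A": 30.0, "B": 50.0, "C": 20.0})
    # charges -> {"A": 360.0, "B": 600.0, "C": 240.0}
    ```

    The sensitivity-based part of the methodology, which the paper layers on top of this, would further adjust each share by that participant's locational impact on voltage support.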

  11. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More Than Qualitative Methods.

    Science.gov (United States)

    Bowleg, Lisa

    2017-10-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with this premise, I address four themes in this commentary. First, I criticize the ubiquitous and uncritical use of the term health disparities in U.S. public health. Next, I advocate for the increased use of qualitative methodologies-namely, photovoice and critical ethnography-that, pursuant to critical approaches, prioritize dismantling social-structural inequities as a prerequisite to health equity. Thereafter, I discuss epistemological stance and its influence on all aspects of the research process. Finally, I highlight my critical discourse analysis HIV prevention research based on individual interviews and focus groups with Black men, as an example of a critical health equity research approach.

  12. Hermeneutics as a Methodological Resource for Understanding Empathy in On-Line Learning Environments

    Science.gov (United States)

    Walshaw, Margaret; Duncan, Wayne

    2015-01-01

    Hermeneutics is both a philosophical tradition and a methodological resource. In this qualitative study, hermeneutics provided, simultaneously, a framework and a methodology for understanding empathy in synchronous multimedia conferencing. As a framework for the design of the study, hermeneutics supported the overriding objective to understand the…

  13. Monitoring and diagnosis for sensor fault detection using GMDH methodology

    International Nuclear Information System (INIS)

    Goncalves, Iraci Martinez Pereira

    2006-01-01

    The fault detection and diagnosis system is an Operator Support System dedicated to specific functions that alerts operators to sensor and actuator fault problems and guides them in the diagnosis before the normal alarm limits are reached. Operator Support Systems emerged to reduce the panel complexity caused by the increase of available information in nuclear power plant control rooms. In this work a Monitoring and Diagnosis System was developed based on the GMDH (Group Method of Data Handling) methodology and applied to the IPEN research reactor IEA-R1. The system performs the monitoring by comparing values calculated by the GMDH model with measured values. The methodology was first applied to theoretical models: a heat exchanger model and a theoretical model of the IPEN reactor. The results obtained with the theoretical models provided a basis for applying the methodology to actual reactor operation data. Three GMDH models were developed for monitoring actual operation data: the first using just the thermal process variables, the second also considering some nuclear variables, and the third considering all the reactor variables. The three models presented excellent results, showing the viability of using the methodology to monitor operation data. The comparison between the results of the three models also shows the capacity of the methodology to choose by itself the best set of input variables for model optimization. For the implementation of the system diagnosis, faults were simulated in the actual temperature variable values by adding a step change. The fault values correspond to a typical temperature descalibration, and the result of monitoring the faulty data was then used to build a simple diagnosis system based on fuzzy logic. (author)
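
    The monitoring step, comparing model-calculated values with measured values and flagging a step-change descalibration, amounts to a residual threshold test. A toy sketch (the threshold and temperature values are invented for illustration; the actual system uses GMDH predictions and a fuzzy-logic diagnosis stage):

    ```python
    def detect_fault(measured, predicted, threshold):
        """Flag samples where the residual between the measured value and
        the model prediction exceeds a threshold, as in residual-based
        sensor fault detection."""
        return [abs(m - p) > threshold for m, p in zip(measured, predicted)]

    # A healthy sensor, then a simulated +2.0 step descalibration after sample 3.
    predicted = [60.0, 60.1, 60.2, 60.1, 60.0]
    measured  = [60.1, 60.0, 60.3, 62.1, 62.0]
    alarms = detect_fault(measured, predicted, threshold=0.5)
    # alarms -> [False, False, False, True, True]
    ```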

  14. Applying Costs, Risks and Values Evaluation (CRAVE) methodology to Engineering Support Request (ESR) prioritization

    Science.gov (United States)

    Joglekar, Prafulla N.

    1994-01-01

    Given a limited budget, the problem of prioritization among Engineering Support Requests (ESRs) with varied sizes, shapes, and colors is a difficult one. At the Kennedy Space Center (KSC), the recently developed 4-Matrix (4-M) method represents a step in the right direction as it attempts to combine the traditional criteria of technical merits only with the new concern for cost-effectiveness. However, the 4-M method was not adequately successful in the actual prioritization of ESRs for the fiscal year 1995 (FY95). This research identifies a number of design issues that should help us to develop better methods. It emphasizes that, given the variety and diversity of ESRs, one should not expect a single method to help in the assessment of all ESRs. One conclusion is that a methodology such as Costs, Risks, and Values Evaluation (CRAVE) should be adopted. It is also clear that the development of methods such as 4-M requires input not only from engineers with technical expertise in ESRs but also from personnel with adequate background in the theory and practice of cost-effectiveness analysis. At KSC, ESR prioritization is one part of the Ground Support Working Teams (GSWT) Integration Process. It was discovered that the more important barriers to the incorporation of cost-effectiveness considerations in ESR prioritization lie in this process. The culture of integration, and the corresponding structure of review by a committee of peers, is not conducive to the analysis and confrontation necessary in the assessment and prioritization of ESRs. Without assistance from appropriately trained analysts charged with the responsibility to analyze and be confrontational about each ESR, the GSWT steering committee will continue to make its decisions based on incomplete understanding, inconsistent numbers, and at times, colored facts. The current organizational separation of the prioritization and the funding processes is also identified as an important barrier to the

  15. Essential methodological considerations when using grounded theory.

    Science.gov (United States)

    Achora, Susan; Matua, Gerald Amandu

    2016-07-01

    To suggest important methodological considerations when using grounded theory. A research method widely used in nursing research is grounded theory, at the centre of which is theory construction. However, researchers still struggle with some of its methodological issues. Although grounded theory is widely used to study and explain issues in nursing practice, many researchers are still failing to adhere to its rigorous standards. Researchers should articulate the focus of their investigations - the substantive area of interest as well as the focal population. This should be followed by a succinct explanation of the strategies used to collect and analyse data, supported by clear coding processes. Finally, the resolution of the core issues, including the core category and related categories, should be explained to advance readers' understanding. Researchers should endeavour to understand the tenets of grounded theory. This enables 'neophytes' in particular to make methodological decisions that will improve their studies' rigour and fit with grounded theory. This paper complements the current dialogue on improving the understanding of grounded theory methodology in nursing research. The paper also suggests important procedural decisions researchers need to make to preserve their studies' scientific merit and fit with grounded theory.

  16. Overview of a performance assessment methodology for low-level radioactive waste disposal facilities

    International Nuclear Information System (INIS)

    Kozak, M.W.; Chu, M.S.Y.

    1991-01-01

    A performance assessment methodology has been developed for use by the US Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. This paper provides a summary and an overview of the modeling approaches selected for the methodology. The overview includes discussions of the philosophy and structure of the methodology. This performance assessment methodology is designed to provide the NRC with a tool for performing confirmatory analyses in support of license reviews related to postclosure performance. The methodology allows analyses of dose to individuals from off-site releases under normal conditions as well as on-site doses to inadvertent intruders. 24 refs., 1 tab

  17. The specific aspects for the ASSET methodology implementation in Romania

    Energy Technology Data Exchange (ETDEWEB)

    Serbanescu, D [National Commission for Nuclear Activities Control of Romania (Romania)

    1997-10-01

    The main aspects of the implementation of a root cause analysis methodology are as follows: The Test Operating Licence requires that a systematic root cause analysis method be implemented for event analysis, to clarify the three questions of the ASSET methodology; a training seminar on the ASSET methodology for the plant staff was held at Cernavoda 1 NPP in April 1997, with IAEA support; the self-assessment process for the events which occurred during the commissioning phases has to be performed by the plant by the end of this year; an ASSET Peer Review of the plant self-assessment is planned for 1998; the Regulatory Authority has the task of independently evaluating the plant's conclusions on various events. The tool used by CNCAN is the ASSET methodology.

  18. The specific aspects for the ASSET methodology implementation in Romania

    International Nuclear Information System (INIS)

    Serbanescu, D.

    1997-01-01

    The main aspects of the implementation of a root cause analysis methodology are as follows: The Test Operating Licence requires that a systematic root cause analysis method be implemented for event analysis, to clarify the three questions of the ASSET methodology; a training seminar on the ASSET methodology for the plant staff was held at Cernavoda 1 NPP in April 1997, with IAEA support; the self-assessment process for the events which occurred during the commissioning phases has to be performed by the plant by the end of this year; an ASSET Peer Review of the plant self-assessment is planned for 1998; the Regulatory Authority has the task of independently evaluating the plant's conclusions on various events. The tool used by CNCAN is the ASSET methodology.

  19. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More than Qualitative Methods

    Science.gov (United States)

    Bowleg, Lisa

    2017-01-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with…

  20. Indirect research of dark matter toward dwarf galaxies with the ANTARES neutrino telescope

    International Nuclear Information System (INIS)

    Dumas, Alexis

    2014-01-01

    The first part of this document summarizes the astrophysical arguments for supposing the existence of dark matter. The cosmological model ΛCDM is presented, as well as the concept of the cross section of dark matter self-annihilation. The dwarf galaxy satellites of the Milky Way, the sources of our study, are introduced in a second chapter. After recalling the large structures that make up the Universe, the issues related to dwarf galaxies are addressed: the missing satellites problem, the distribution of the dark matter density within them, and the tidal forces due to the Milky Way. The second part discusses the modeling of the dark matter density in dwarf galaxies. The methodology, using the Jeans equation and the dispersion of projected stellar velocities, is presented. Three dark matter profiles are retained: NFW, Burkert and Einasto, along with fifteen dwarf galaxies. Neutrino production during the self-annihilation of dark matter is then addressed. The energy spectra of the neutrinos are generated with the PYTHIA software and compared with other results for the galactic center. Twenty-three mass hypotheses for the dark matter candidates are chosen, ranging from 25 GeV/c² to 100 TeV/c². Five self-annihilation channels are selected for the analysis: bb̄, W⁺W⁻, τ⁺τ⁻, μ⁺μ⁻ and νμν̄μ. The third part includes a presentation of the detector used for the study, the ANTARES neutrino telescope. Three reconstruction algorithms developed and used in the collaboration are also detailed: AAFIT, BBFit and GridFit. The analysis of the ANTARES data, aimed at finding a neutrino excess characteristic of dark matter self-annihilation, is summarized in the sixth and final chapter. No excess was observed, and a limit on the cross section of dark matter self-annihilation was determined. (author)
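
    The three halo profiles retained in the thesis have standard closed forms; a minimal sketch of those parametric densities (normalizations and scale radii are free parameters that such studies fit per galaxy, the values below are illustrative):

    ```python
    import math

    def nfw(r, rho_s, r_s):
        """Navarro-Frenk-White profile: rho(r) = rho_s / [(r/r_s)(1 + r/r_s)^2]."""
        x = r / r_s
        return rho_s / (x * (1.0 + x) ** 2)

    def burkert(r, rho_0, r_0):
        """Burkert profile: rho(r) = rho_0 r_0^3 / [(r + r_0)(r^2 + r_0^2)] -- cored."""
        return rho_0 * r_0 ** 3 / ((r + r_0) * (r ** 2 + r_0 ** 2))

    def einasto(r, rho_s, r_s, alpha):
        """Einasto profile: rho(r) = rho_s exp{-(2/alpha)[(r/r_s)^alpha - 1]}."""
        return rho_s * math.exp(-(2.0 / alpha) * ((r / r_s) ** alpha - 1.0))
    ```

    Note the qualitative difference such fits probe: NFW is cuspy (ρ ∝ 1/r at small r) while Burkert is cored (ρ → ρ0 at the center), which bears directly on the dark matter distribution issues in dwarf galaxies mentioned above.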

  1. Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies

    International Nuclear Information System (INIS)

    White, M.K.; Buelt, J.L.; Stottlemyre, J.A.

    1991-02-01

    Because of the complexity and number of waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS) can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies; it is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets of compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk achieved by each recommended alternative after remedial action. The methodology will also help determine the implementability of the remedial alternatives at a site and will access cost-estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed: the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab

  2. Assessing the Atmospheric Pollution of Energy Facilities for Supporting Energy Policy Decisions

    International Nuclear Information System (INIS)

    Meneses Ruiz, E.; Alonso García, D.; Pérez Zayas, G.; Piñera Hernández, I.; Martinez Varona, M.; Molina Esquivel, E.

    2015-01-01

    The impacts of different energy facilities on the environment and human health are a matter of interest and concern throughout the world. For example, fossil fuels are one of the energy sources with the most undesirable effects on the environment, but this energy is still one of the most competitive on the market, especially for developing countries. However, it is necessary to find a balance between the costs of achieving a lower level of environmental and health injury and the benefits of providing electricity at a reasonable cost. With a view to solving the current deficit in energy production (mainly in electricity generation) in the light of major transformations in the energy sector, the Cuban Government is evaluating ways of incorporating new sources and technologies and expanding existing capabilities. In this context non-fossil energy sources will play an increasingly important role. The present work shows the results obtained in the framework of the IAEA Technical Cooperation Project CUB7007. The project integrated several tools and methodologies in the field of air quality modelling and assessment, emissions measurement and nuclear techniques. The main objective was to assess atmospheric pollution from various energy facilities for supporting energy policy decisions by incorporating nuclear techniques (proton-induced X-ray emission, neutron activation and X-ray fluorescence) for estimating the elemental composition of particulate matter. As a result, national laboratories were consolidated in the application of nuclear and non-nuclear techniques to support environmental studies, especially the analysis of emissions in chimneys and ambient air sampling. Moreover, all energy technologies considered in the national strategy of development were assessed. (author)

  3. A methodology for comprehensive strategic planning and program prioritization

    Science.gov (United States)

    Raczynski, Christopher Michael

    2008-10-01

    The process developed in this work, Strategy Optimization for the Allocation of Resources (SOAR), is a strategic planning methodology based on Integrated Product and Process Development and systems engineering techniques. Utilizing a top-down approach, the process starts with the creation of the organization vision and its measures of effectiveness. These measures are prioritized based on their application to external world scenarios which will frame the future. The programs which will be used to accomplish this vision are identified by decomposing the problem. Information is gathered on the programs as to their application, cost, schedule, risk, and other pertinent attributes. The relationships between the levels of the hierarchy are mapped utilizing subject matter experts, and these connections are then used to determine the overall benefit of the programs to the vision of the organization. Through a Multi-Objective Genetic Algorithm (MOGA), a tradespace of potential program portfolios can be created, amongst which the decision maker can allocate resources. The information and portfolios are presented to the decision maker through the use of a Decision Support System (DSS) which collects and visualizes all the data in a single location. This methodology was tested on a science and technology planning exercise conducted by the United States Navy. A thorough decomposition was defined and technology programs were identified which had the potential to provide benefit to the vision. The prioritization of the top-level capabilities was performed through the use of a rank-ordering scheme, and a previous naval application was used to demonstrate a cumulative voting scheme. Voting was performed utilizing the Nominal Group Technique to capture the relationships between the levels of the hierarchy. Interrelationships between the technologies were identified, and a MOGA was utilized to optimize portfolios with respect to these constraints; the resulting information was placed in a DSS. This
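
    The portfolio-optimization step can be illustrated with a deliberately small, single-objective genetic algorithm: select a subset of programs that maximizes total benefit under a budget cap. This is a toy stand-in for the Multi-Objective Genetic Algorithm used in SOAR, with invented program data:

    ```python
    import random

    def ga_portfolio(benefits, costs, budget, pop=40, gens=60, seed=0):
        """Toy genetic algorithm for program portfolio selection: maximize
        total benefit subject to a budget cap (infeasible portfolios score 0).
        A bitstring encodes which programs are funded."""
        rng = random.Random(seed)
        n = len(benefits)

        def fitness(bits):
            cost = sum(c for b, c in zip(bits, costs) if b)
            return sum(v for b, v in zip(bits, benefits) if b) if cost <= budget else 0.0

        popn = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
        for _ in range(gens):
            popn.sort(key=fitness, reverse=True)
            elite = popn[: pop // 2]          # keep the better half
            children = []
            while len(elite) + len(children) < pop:
                a, b = rng.sample(elite, 2)   # single-point crossover
                cut = rng.randrange(1, n)
                child = a[:cut] + b[cut:]
                child[rng.randrange(n)] ^= 1  # point mutation
                children.append(child)
            popn = elite + children
        best = max(popn, key=fitness)
        return best, fitness(best)

    # Hypothetical program data: five candidate programs.
    benefits = [10.0, 6.0, 8.0, 3.0, 7.0]
    costs = [4.0, 3.0, 5.0, 1.0, 4.0]
    best, value = ga_portfolio(benefits, costs, budget=9.0)
    ```

    A real multi-objective variant would return the whole non-dominated front of portfolios rather than a single best individual, which is what gives the decision maker a tradespace to choose from.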

  4. Phenomena based Methodology for Process Synthesis incorporating Process Intensification

    DEFF Research Database (Denmark)

    Lutze, Philip; Babi, Deenesh Kavi; Woodley, John

    2013-01-01

    Process intensification (PI) has the potential to improve existing as well as conceptual processes, in order to achieve a more sustainable production. PI can be achieved at different levels, that is, the unit operation, functional and/or phenomena level. The highest impact is expected by looking at processes at the lowest level of aggregation, which is the phenomena level. In this paper, a phenomena based synthesis/design methodology incorporating process intensification is presented. Using this methodology, a systematic identification of necessary and desirable (integrated) phenomena, as well as generation and screening of phenomena based flowsheet options, are presented using a decomposition based solution approach. The developed methodology, as well as necessary tools and supporting methods, are highlighted through a case study involving the production of isopropyl-acetate.

  5. D matter

    International Nuclear Information System (INIS)

    Shiu, Gary; Wang Liantao

    2004-01-01

    We study the properties and phenomenology of particlelike states originating from D branes whose spatial dimensions are all compactified. They are nonperturbative states in string theory and we refer to them as D matter. In contrast to other nonperturbative objects such as 't Hooft-Polyakov monopoles, D-matter states could have perturbative couplings among themselves and with ordinary matter. The lightest D particle (LDP) could be stable because it is the lightest state carrying certain (integer or discrete) quantum numbers. Depending on the string scale, they could be cold dark matter candidates with properties similar to that of WIMPs or wimpzillas. The spectrum of excited states of D matter exhibits an interesting pattern which could be distinguished from that of Kaluza-Klein modes, winding states, and string resonances. We speculate about possible signatures of D matter from ultrahigh energy cosmic rays and colliders

  6. Does Market Remoteness Matter?

    OpenAIRE

    Moctar, Ndiaye; Elodie, Maitre d’Hôtel; Tristan, Le Cotty

    2015-01-01

    This paper addresses the role of market remoteness in explaining maize price volatility in Burkina Faso. A model of price formation is introduced to demonstrate formally that transport costs between urban and rural markets exacerbate maize price volatility. Empirical support is provided to the proposition by exploring an unusually rich data set of monthly maize price series across 28 markets over 2004-13. The methodology relies on an autoregressive conditional heteroskedasticity model to inve...
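
    The autoregressive conditional heteroskedasticity family used in the paper captures volatility clustering by letting today's variance depend on yesterday's squared shock. A minimal ARCH(1) simulator (parameters are illustrative; the paper estimates, rather than simulates, such a model on maize price series):

    ```python
    import random

    def simulate_arch1(omega, alpha, n, seed=0):
        """Simulate an ARCH(1) return series: r_t = sigma_t * z_t with
        sigma_t^2 = omega + alpha * r_{t-1}^2 and z_t ~ N(0, 1)."""
        rng = random.Random(seed)
        returns = []
        prev_r = 0.0
        for _ in range(n):
            var = omega + alpha * prev_r ** 2   # conditional variance
            r = (var ** 0.5) * rng.gauss(0.0, 1.0)
            returns.append(r)
            prev_r = r
        return returns

    series = simulate_arch1(omega=0.1, alpha=0.4, n=500)
    ```

    For alpha < 1 the process is covariance-stationary with unconditional variance omega / (1 − alpha), here 0.1 / 0.6 ≈ 0.17; large shocks transiently raise the conditional variance, producing the volatility clustering that motivates this model class for price series.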

  7. FOREWORD: Computational methodologies for designing materials Computational methodologies for designing materials

    Science.gov (United States)

    Rahman, Talat S.

    2009-02-01

    It would be fair to say that in the past few decades, theory and computer modeling have played a major role in elucidating the microscopic factors that dictate the properties of functional novel materials. Together with advances in experimental techniques, theoretical methods are becoming increasingly capable of predicting properties of materials at different length scales, thereby bringing within sight the long-sought goal of designing material properties according to need. Advances in computer technology and its availability at a reasonable cost around the world have made it all the more urgent to disseminate what is now known about these modern computational techniques. In this special issue on computational methodologies for materials by design we have tried to solicit articles from authors whose works collectively represent the microcosm of developments in the area. This turned out to be a difficult task for a variety of reasons, not the least of which is space limitation in this special issue. Nevertheless, we gathered twenty articles that represent some of the important directions in which theory and modeling are proceeding in the general effort to capture the ability to produce materials by design. The majority of papers presented here focus on technique developments that are expected to further uncover the fundamental processes responsible for material properties, and for their growth modes and morphological evolutions. As for material properties, some of the articles here address the challenges that continue to emerge from attempts at accurate descriptions of magnetic properties, of electronically excited states, and of sparse matter, all of which demand new looks at density functional theory (DFT). I should hasten to add that much of the success in accurate computational modeling of materials emanates from the remarkable predictive power of DFT, without which we would not be able to place the subject on firm theoretical grounds.
As we know and will also

  8. 29th Workshop on Recent Developments in Computer Simulation Studies in Condensed Matter Physics

    International Nuclear Information System (INIS)

    2016-01-01

    support of this year's workshop. These Proceedings contain both invited papers and contributed presentations on problems in both classical and quantum condensed matter physics. As usual, topics ranged from hard and soft condensed matter to biologically inspired problems and purely methodological advances. While familiar topics like phase transitions were still on display, the trends in biophysics, dynamical behavior and complex systems demonstrated the continuing progression in the focus of computational condensed matter physics. We hope that readers will benefit from specialized results as well as profit from exposure to new algorithms, methods of analysis, and conceptual developments. Athens, GA, U.S.A. April, 2016 D. P. Landau M. Bachmann S. P. Lewis (paper)

  9. Progressing recovery-oriented care in psychiatric inpatient units: Occupational therapy’s role in supporting a stronger peer workforce

    Directory of Open Access Journals (Sweden)

    Chris Lloyd

    2017-10-01

    Full Text Available Purpose - Initiated by the service user movement, recovery-oriented practices are one of the keystones of modern mental health care. Over the past two decades, substantial gains have been made with introducing recovery-oriented practice in many areas of mental health practice, but there remain areas where progress is delayed, notably, the psychiatric inpatient environment. The peer support workforce can play a pivotal role in progressing recovery-oriented practices. The purpose of this paper is to provide a pragmatic consideration of how occupational therapists can influence mental health systems to work proactively with a peer workforce. Design/methodology/approach - The authors reviewed current literature and considered practical approaches to building a peer workforce in collaboration with occupational therapists. Findings - It is suggested that the peer support workforce should be consciously enhanced in the inpatient setting to support culture change as a matter of priority. Occupational therapists working on inpatient units should play a key role in promoting and supporting the growth in the peer support workforce. Doing so will enrich the Occupational Therapy profession as well as improving service user outcomes. Originality/value - This paper seeks to provide a pragmatic consideration of how occupational therapists can influence mental health systems to work proactively with a peer workforce.

  10. Case Study Methodology: Flexibility, Rigour, and Ethical Considerations for the Scholarship of Teaching and Learning

    Directory of Open Access Journals (Sweden)

    Marion L. Pearson

    2015-12-01

    Full Text Available Individuals and teams engaging in the scholarship of teaching and learning (SoTL) in multidisciplinary higher education settings must make decisions regarding the choice of research methodology and methods. These decisions are guided by the research context and the goals of the inquiry. With reference to our own recent experiences investigating pedagogical and curricular practices in a pharmacy program, we outline case study methodology as one of the many options available for SoTL inquiry. Case study methodology has the benefits of flexibility in terms of the types of research questions that can be addressed and the data collection methods that can be employed. Conducted with proper attention to the context of the case(s) selected, ethical treatment of participants, and data management, case studies also have the necessary rigour to be credible and generalizable. In the matter of generalization, however, we recommend that the readers of a case study draw their own conclusions about the applicability of the findings to other settings.

  11. White matter pathways in persistent developmental stuttering: Lessons from tractography.

    Science.gov (United States)

    Kronfeld-Duenias, Vered; Civier, Oren; Amir, Ofer; Ezrati-Vinacour, Ruth; Ben-Shachar, Michal

    2018-03-01

    Fluent speech production relies on the coordinated processing of multiple brain regions. This highlights the role of neural pathways that connect distinct brain regions in producing fluent speech. Here, we aim to investigate the role of the white matter pathways in persistent developmental stuttering (PDS), where speech fluency is disrupted. We use diffusion weighted imaging and tractography to compare the white matter properties between adults who do and do not stutter. We compare the diffusion properties along 18 major cerebral white matter pathways. We complement the analysis with an overview of the methodology and a roadmap of the pathways implicated in PDS according to the existing literature. We report differences in the microstructural properties of the anterior callosum, the right inferior longitudinal fasciculus and the right cingulum in people who stutter compared with fluent controls. Persistent developmental stuttering is consistently associated with differences in bilateral distributed networks. We review evidence showing that PDS involves differences in bilateral dorsal fronto-temporal and fronto-parietal pathways, in callosal pathways, in several motor pathways and in basal ganglia connections. This entails an important role for long range white matter pathways in this disorder. Using a wide-lens analysis, we demonstrate differences in additional, right hemispheric pathways, which go beyond the replicable findings in the literature. This suggests that the affected circuits may extend beyond the known language and motor pathways. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Cosmology and Dark Matter

    CERN Document Server

    Tkachev, Igor

    2017-01-01

    This lecture course covers cosmology from the particle physicist perspective. Therefore, the emphasis will be on the evidence for the new physics in cosmological and astrophysical data together with minimal theoretical frameworks needed to understand and appreciate the evidence. I review the case for non-baryonic dark matter and describe popular models which incorporate it. In parallel, the story of dark energy will be developed, which includes accelerated expansion of the Universe today, the Universe origin in the Big Bang, and support for the Inflationary theory in CMBR data.

  13. Methodological issues in studies of air pollution and reproductive health.

    Science.gov (United States)

    Woodruff, Tracey J; Parker, Jennifer D; Darrow, Lyndsey A; Slama, Rémy; Bell, Michelle L; Choi, Hyunok; Glinianaia, Svetlana; Hoggatt, Katherine J; Karr, Catherine J; Lobdell, Danelle T; Wilhelm, Michelle

    2009-04-01

    In the past decade there have been an increasing number of scientific studies describing possible effects of air pollution on perinatal health. These papers have mostly focused on commonly monitored air pollutants, primarily ozone (O₃), particulate matter (PM), sulfur dioxide (SO₂), carbon monoxide (CO), and nitrogen dioxide (NO₂), and various indices of perinatal health, including fetal growth, pregnancy duration, and infant mortality. While most published studies have found some marker of air pollution related to some types of perinatal outcomes, variability exists in the nature of the pollutants and outcomes associated. Synthesis of the findings has been difficult for various reasons, including differences in study design and analysis. A workshop was held in September 2007 to discuss methodological differences in the published studies as a basis for understanding differences in study findings and to identify priorities for future research, including novel approaches for existing data. Four broad topic areas were considered: confounding and effect modification, spatial and temporal exposure variations, vulnerable windows of exposure, and multiple pollutants. Here we present a synopsis of the methodological issues and challenges in each area and make recommendations for future study. Two key recommendations include: (1) parallel analyses of existing data sets using a standardized methodological approach to disentangle true differences in associations from methodological differences among studies; and (2) identification of animal studies to inform important mechanistic research gaps. This work is of critical public health importance because of widespread exposure and because perinatal outcomes are important markers of future child and adult health.

  14. Dark matter detectors

    International Nuclear Information System (INIS)

    Forster, G.

    1995-01-01

    A fundamental question of astrophysics and cosmology is the nature of dark matter. Astrophysical observations show clearly the existence of some kind of dark matter, though they cannot yet reveal its nature. Dark matter can consist of baryonic particles, or of other (known or unknown) elementary particles. Baryonic dark matter probably exists in the form of dust, gas, or small stars. Other elementary particles constituting the dark matter can possibly be measured in terrestrial experiments. Possibilities for dark matter particles are neutrinos, axions and weakly interacting massive particles (WIMPs). While a direct detection of relic neutrinos seems at the moment impossible, there are experiments looking for baryonic dark matter in the form of Massive Compact Halo Objects, and for particle dark matter in the form of axions and WIMPs. (orig.)

  15. Light-matter interaction physics and engineering at the nanoscale

    CERN Document Server

    Weiner, John

    2013-01-01

    This book draws together the essential elements of classical electrodynamics, surface wave physics, plasmonic materials, and circuit theory of electrical engineering to provide insight into the essential physics of nanoscale light-matter interaction and to provide design methodology for practical nanoscale plasmonic devices. A chapter on classical and quantal radiation also highlights the similarities (and differences) between the classical fields of Maxwell's equations and the wave functions of Schrödinger's equation. The aim of this chapter is to provide a semiclassical picture of atomic absorption and emission of radiation, lending credence and physical plausibility to the "rules" of standard wave-mechanical calculations.

  16. Disposal Of Waste Matter

    International Nuclear Information System (INIS)

    Kim, Jeong Hyeon; Lee, Seung Mu

    1989-02-01

    This book deals with the disposal of waste matter and the management of solid waste in cities. It covers an introduction; the definition of waste matter; the meaning and systems of waste management; current conditions in the country; the collection and transportation of waste; the disposal of liquid waste; industrial wastes such as plastics, waste gas sludge, pulp and sulfuric acid; and waste recycling technologies such as the Black Clawson, Monroe and Rome recycling systems.

  17. Impeded Dark Matter

    Energy Technology Data Exchange (ETDEWEB)

    Kopp, Joachim; Liu, Jia [PRISMA Cluster of Excellence & Mainz Institute for Theoretical Physics,Johannes Gutenberg University,Staudingerweg 7, 55099 Mainz (Germany); Slatyer, Tracy R. [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States); Wang, Xiao-Ping [PRISMA Cluster of Excellence & Mainz Institute for Theoretical Physics,Johannes Gutenberg University,Staudingerweg 7, 55099 Mainz (Germany); Xue, Wei [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States)

    2016-12-12

    We consider dark matter models in which the mass splitting between the dark matter particles and their annihilation products is tiny. Compared to the previously proposed Forbidden Dark Matter scenario, the mass splittings we consider are much smaller, and are allowed to be either positive or negative. To emphasize this modification, we dub our scenario “Impeded Dark Matter”. We demonstrate that Impeded Dark Matter can be easily realized without requiring tuning of model parameters. For negative mass splitting, we demonstrate that the annihilation cross-section for Impeded Dark Matter depends linearly on the dark matter velocity or may even be kinematically forbidden, making this scenario almost insensitive to constraints from the cosmic microwave background and from observations of dwarf galaxies. Accordingly, it may be possible for Impeded Dark Matter to yield observable signals in clusters or the Galactic center, with no corresponding signal in dwarfs. For positive mass splitting, we show that the annihilation cross-section is suppressed by the small mass splitting, which helps light dark matter to survive increasingly stringent constraints from indirect searches. As specific realizations for Impeded Dark Matter, we introduce a model of vector dark matter from a hidden SU(2) sector, and a composite dark matter scenario based on a QCD-like dark sector.

  18. Impeded Dark Matter

    International Nuclear Information System (INIS)

    Kopp, Joachim; Liu, Jia; Slatyer, Tracy R.; Wang, Xiao-Ping; Xue, Wei

    2016-01-01

    We consider dark matter models in which the mass splitting between the dark matter particles and their annihilation products is tiny. Compared to the previously proposed Forbidden Dark Matter scenario, the mass splittings we consider are much smaller, and are allowed to be either positive or negative. To emphasize this modification, we dub our scenario “Impeded Dark Matter”. We demonstrate that Impeded Dark Matter can be easily realized without requiring tuning of model parameters. For negative mass splitting, we demonstrate that the annihilation cross-section for Impeded Dark Matter depends linearly on the dark matter velocity or may even be kinematically forbidden, making this scenario almost insensitive to constraints from the cosmic microwave background and from observations of dwarf galaxies. Accordingly, it may be possible for Impeded Dark Matter to yield observable signals in clusters or the Galactic center, with no corresponding signal in dwarfs. For positive mass splitting, we show that the annihilation cross-section is suppressed by the small mass splitting, which helps light dark matter to survive increasingly stringent constraints from indirect searches. As specific realizations for Impeded Dark Matter, we introduce a model of vector dark matter from a hidden SU(2) sector, and a composite dark matter scenario based on a QCD-like dark sector.

  19. Dark matter in the universe

    Science.gov (United States)

    Turner, Michael S.

    1991-01-01

    What is the quantity and composition of material in the Universe? This is one of the most fundamental questions we can ask about the Universe, and its answer bears on a number of important issues including the formation of structure in the Universe, and the ultimate fate and the earliest history of the Universe. Moreover, answering this question could lead to the discovery of new particles, as well as shedding light on the nature of the fundamental interactions. At present, only a partial answer is at hand. Most of the material in the Universe does not give off detectable radiation; it is dark. The dark matter associated with bright galaxies contributes somewhere between 10 and 30 percent of the critical density; baryonic matter contributes between 1.1 and 12 percent of the critical density. The case for the spatially flat, Einstein-de Sitter model is supported by three compelling theoretical arguments - structure formation, the temporal Copernican principle, and inflation - and by some observational data. If Omega is indeed unity, or even just significantly greater than 0.1, then there is a strong case for a Universe comprised of nonbaryonic matter. There are three well motivated particle dark matter candidates: an axion of mass 10^-6 eV to 10^-4 eV; a neutralino of mass 10 GeV to about 3 TeV; or a neutrino of mass 20 eV to 90 eV. All three possibilities can be tested by experiments that are either planned or are underway.
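
    The critical density against which these contributions are measured follows from the Friedmann equation, rho_c = 3 H0^2 / (8 pi G). A quick numerical check (the Hubble constant below is an illustrative choice, not a value from this record):

```python
# Critical density of the Universe: rho_c = 3 H0^2 / (8 pi G).
# H0 = 70 km/s/Mpc is an illustrative value, not taken from the record.
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
Mpc = 3.0857e22      # metres per megaparsec
H0 = 70e3 / Mpc      # Hubble constant converted to s^-1

rho_c = 3 * H0**2 / (8 * math.pi * G)  # critical density, kg m^-3
print(f"critical density ~ {rho_c:.1e} kg/m^3")  # about 9e-27 kg/m^3
```

    A component contributing 10 percent of the critical density (Omega = 0.1) therefore corresponds to roughly 1e-27 kg per cubic metre averaged over the Universe.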

  20. Dark matter and rotation curves of spiral galaxies

    Czech Academy of Sciences Publication Activity Database

    Křížek, Michal; Křížek, Filip; Somer, L.

    2016-01-01

    Roč. 25, April (2016), s. 64-77 ISSN 1313-2709 R&D Projects: GA MŠk(CZ) LG15052 Institutional support: RVO:67985840 ; RVO:61389005 Keywords : red dwarf * dark matter * spiral galaxy Subject RIV: BA - General Mathematics http://www.astro.bas.bg/AIJ/issues/n25/MKrizek.pdf

  1. Neutrino signals from gravitino dark matter with broken R-parity

    Energy Technology Data Exchange (ETDEWEB)

    Grefe, M.

    2008-12-15

    The gravitino is a promising supersymmetric dark matter candidate, even without strict R-parity conservation. In fact, with some small R-parity violation, gravitinos are sufficiently long-lived to constitute the dark matter of the universe, while the resulting cosmological scenario is consistent with primordial nucleosynthesis and the high reheating temperature needed for thermal leptogenesis. Furthermore, in this scenario the gravitino is unstable and might thus be accessible by indirect detection via its decay products. We compute in this thesis the partial decay widths for the gravitino in models with bilinear R-parity breaking. In addition, we determine the neutrino signal from astrophysical gravitino dark matter decays. Finally, we discuss the feasibility of detecting these neutrino signals in present and future neutrino experiments, and conclude that it will be a challenging task. However, if detected, this distinctive signal might bring considerable support to the scenario of decaying gravitino dark matter. (orig.)

  2. Origin of heat-induced structural changes in dissolved organic matter

    Czech Academy of Sciences Publication Activity Database

    Drastík, M.; Novák, František; Kučerík, J.

    2013-01-01

    Roč. 90, č. 2 (2013), s. 789-795 ISSN 0045-6535 Institutional support: RVO:60077344 Keywords : dissolved organic matter * humic substances * hydration * hysteresis Subject RIV: DF - Soil Science Impact factor: 3.499, year: 2013

  3. Evidence of dark matter from biological observations

    International Nuclear Information System (INIS)

    Zioutas, K.

    1990-01-01

    In accordance with the generally accepted properties of dark matter (DM) candidates, the probability of their interaction with living matter must be equal to that for inorganic matter, and the expected effects might be unique and provide the etiology related to the appearance of several biological phenomena having sometimes fatal late effects. Although collisions with DM are rare, the charged secondaries (recoiling atoms) are expected to be high linear energy transfer particles favouring the highest relative biological effectiveness values for this, as yet invisible, part of the natural background radiation. A few cases are given, where a correlation between DM interaction and phenomena in living matter might already exist, or can show up in existing data: biorhythms with periodicities identical to known cosmic frequencies are explainable with gravitationally clustered DM around the sun, the moon, the earth, etc. The observed arrhythmia, when biological probes are moved (in airplanes, satellites, etc.), supports this idea strongly. It is also proposed to implement some of the biological properties and processes (such as element composition and chemical reactions) in future DM detectors in order to improve their sensitivity. The interdisciplinary feedback is bidirectional: huge DM detectors could be used in an attempt to understand enigmatic biological behaviour. (orig.)

  4. Quality research in healthcare: are researchers getting enough statistical support?

    Directory of Open Access Journals (Sweden)

    Ambler Gareth

    2006-01-01

    Full Text Available Abstract Background Reviews of peer-reviewed health studies have highlighted problems with their methodological quality. As published health studies form the basis of many clinical decisions including evaluation and provisions of health services, this has scientific and ethical implications. The lack of involvement of methodologists (defined as statisticians or quantitative epidemiologists) has been suggested as one key reason for this problem and this has been linked to the lack of access to methodologists. This issue was highlighted several years ago and it was suggested that more investments were needed from health care organisations and Universities to alleviate this problem. Methods To assess the current level of methodological support available for health researchers in England, we surveyed the 25 National Health Services Trusts in England, that are the major recipients of the Department of Health's research and development (R&D) support funding. Results and discussion The survey shows that the earmarking of resources to provide appropriate methodological support to health researchers in these organisations is not widespread. Neither the level of R&D support funding received nor the volume of research undertaken by these organisations showed any association with the amount they spent in providing a central resource for methodological support for their researchers. Conclusion The promotion and delivery of high quality health research requires that organisations hosting health research and their academic partners put in place funding and systems to provide appropriate methodological support to ensure valid research findings. If resources are limited, health researchers may have to rely on short courses and/or a limited number of advisory sessions which may not always produce satisfactory results.

  5. International Expert Review of Sr-Can: Safety Assessment Methodology - External review contribution in support of SSI's and SKI's review of SR-Can

    Energy Technology Data Exchange (ETDEWEB)

    Sagar, Budhi (Center for Nuclear Waste Regulatory Analyses, Southwest Research Inst., San Antonio, TX (US)); Egan, Michael (Quintessa Limited, Henley-on-Thames (GB)); Roehlig, Klaus-Juergen (Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH (DE)); Chapman, Neil (Independent Consultant (XX)); Wilmot, Roger (Galson Sciences Limited, Oakham (GB))

    2008-03-15

    In 2006, SKB published a safety assessment (SR-Can) as part of its work to support a licence application for the construction of a final repository for spent nuclear fuel. The purposes of the SR-Can project were stated in the main project report to be: 1. To make a first assessment of the safety of potential KBS-3 repositories at Forsmark and Laxemar to dispose of canisters as specified in the application for the encapsulation plant. 2. To provide feedback to design development, to SKB's research and development (R and D) programme, to further site investigations and to future safety assessments. 3. To foster a dialogue with the authorities that oversee SKB's activities, i.e. the Swedish Nuclear Power Inspectorate, SKI, and the Swedish Radiation Protection Authority, SSI, regarding interpretation of applicable regulations, as a preparation for the SR-Site project. To help inform their review of SKB's proposed approach to development of the long-term safety case, the authorities appointed three international expert review teams to carry out a review of SKB's SR-Can safety assessment report. Comments from one of these teams - the Safety Assessment Methodology (SAM) review team - are presented in this document. The SAM review team's scope of work included an examination of SKB's documentation of the assessment ('Long-term safety for KBS-3 Repositories at Forsmark and Laxemar - a first evaluation' and several supporting reports) and hearings with SKB staff and contractors, held in March 2007. As directed by SKI and SSI, the SAM review team focused on methodological aspects and sought to determine whether SKB's proposed safety assessment methodology is likely to be suitable for use in the future SR-Site and to assess its consistency with the Swedish regulatory framework. No specific evaluation of long-term safety or site acceptability was undertaken by any of the review teams. SKI and SSI's Terms of Reference for the SAM

  6. What do QCD sum rules tell us about dense matter?

    International Nuclear Information System (INIS)

    Cohen, T.D.; Washington Univ., Seattle, WA

    1995-01-01

    The QCD sum rule approach to the properties of hadrons in both the vacuum and in nuclear matter is discussed. The primary limitation for the nuclear matter case is the absence of reliable phenomenological information about the form of the spectral function and about the value of certain four quark condensates. The approach gives moderate evidence in support of the Dirac phenomenology picture of strong attractive Lorentz scalar and repulsive Lorentz vector optical potentials. The approach gives weak evidence for decreasing vector meson masses in medium. (orig.)

  7. Separating the effects of organic matter-mineral interactions and organic matter chemistry on the sorption of diuron and phenanthrene.

    Science.gov (United States)

    Ahangar, Ahmad Gholamalizadeh; Smernik, Ronald J; Kookana, Rai S; Chittleborough, David J

    2008-06-01

    Even though it is well established that soil C content is the primary determinant of the sorption affinity of soils for non-ionic compounds, it is also clear that organic carbon-normalized sorption coefficients (KOC) vary considerably between soils. Two factors that may contribute to KOC variability are variations in organic matter chemistry between soils and interactions between organic matter and soil minerals. Here, we quantify these effects for two non-ionic sorbates: diuron and phenanthrene. The effect of organic matter-mineral interactions was evaluated by comparing KOC for demineralized (HF-treated) soils, with KOC for the corresponding whole soils. For diuron and phenanthrene, average ratios of KOC of the HF-treated soils to KOC of the whole soils were 2.5 and 2.3, respectively, indicating a substantial depression of KOC due to the presence of minerals in the whole soils. The effect of organic matter chemistry was determined by correlating KOC against distributions of C types determined using solid-state ¹³C NMR spectroscopy. For diuron, KOC was positively correlated with aryl C and negatively correlated with O-alkyl C, for both whole and HF-treated soils, whereas for phenanthrene, these correlations were only present for the HF-treated soils. We suggest that the lack of a clear effect of organic matter chemistry on whole soil KOC for phenanthrene is due to an over-riding influence of organic matter-mineral interactions in this case. This hypothesis is supported by a correlation between the increase in KOC on HF-treatment and the soil clay content for phenanthrene, but not for diuron.
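
    The organic carbon normalization discussed in this record is simply KOC = Kd / fOC, the measured sorption coefficient divided by the soil's organic-carbon fraction. A minimal sketch of that calculation and of the treated-to-whole ratio the study reports (all coefficients and carbon fractions below are hypothetical, not data from the study):

```python
# K_OC = K_d / f_OC: sorption coefficient normalized to the soil's
# organic-carbon fraction. All numbers are hypothetical illustrations.
def k_oc(k_d, f_oc):
    """Organic carbon-normalized sorption coefficient (L/kg OC)."""
    return k_d / f_oc

# Whole soil vs the same soil after HF demineralization:
whole = k_oc(k_d=4.0, f_oc=0.02)     # ~200 L/kg OC
treated = k_oc(k_d=90.0, f_oc=0.30)  # ~300 L/kg OC
print(treated / whole)  # a ratio > 1 suggests minerals depress K_OC
```

    A treated/whole ratio above one, as in the record's averages of 2.5 and 2.3, is the signature of mineral depression of KOC.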

  8. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from

  9. NeOn Methodology for Building Ontology Networks: Specification, Scheduling and Reuse

    OpenAIRE

    Suárez-Figueroa, Mari Carmen

    2010-01-01

    A new ontology development paradigm has started; its emphasis lies on the reuse and possible subsequent reengineering of knowledge resources, on the collaborative and argumentative ontology development, and on the building of ontology networks; this new trend is the opposite of building new ontologies from scratch. To help ontology developers in this new paradigm, it is important to provide strong methodological support. This thesis presents some contributions to the methodological area of...

  10. Understanding palliative care on the heart failure care team: an innovative research methodology.

    Science.gov (United States)

    Lingard, Lorelei A; McDougall, Allan; Schulz, Valerie; Shadd, Joshua; Marshall, Denise; Strachan, Patricia H; Tait, Glendon R; Arnold, J Malcolm; Kimel, Gil

    2013-05-01

    There is a growing call to integrate palliative care for patients with advanced heart failure (HF). However, the knowledge to inform integration efforts comes largely from interview and survey research with individual patients and providers. This work has been critically important in raising awareness of the need for integration, but it is insufficient to inform solutions that must be enacted not by isolated individuals but by complex care teams. Research methods are urgently required to support systematic exploration of the experiences of patients with HF, family caregivers, and health care providers as they interact as a care team. To design a research methodology that can support systematic exploration of the experiences of patients with HF, caregivers, and health care providers as they interact as a care team. This article describes in detail a methodology that we have piloted and are currently using in a multisite study of HF care teams. We describe three aspects of the methodology: the theoretical framework, an innovative sampling strategy, and an iterative system of data collection and analysis that incorporates four data sources and four analytical steps. We anticipate that this innovative methodology will support groundbreaking research in both HF care and other team settings in which palliative integration efforts are emerging for patients with advanced nonmalignant disease. Copyright © 2013 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.

  11. A methodology and supply chain management inspired reference ontology for modeling healthcare teams.

    Science.gov (United States)

    Kuziemsky, Craig E; Yazdi, Sara

    2011-01-01

    Numerous studies and strategic plans are advocating more team based healthcare delivery that is facilitated by information and communication technologies (ICTs). However before we can design ICTs to support teams we need a solid conceptual model of team processes and a methodology for using such a model in healthcare settings. This paper draws upon success in the supply chain management domain to develop a reference ontology of healthcare teams and a methodology for modeling teams to instantiate the ontology in specific settings. This research can help us understand how teams function and how we can design ICTs to support teams.

  12. Dark Matter

    International Nuclear Information System (INIS)

    Bashir, A.; Cotti, U.; De Leon, C. L.; Raya, A; Villasenor, L.

    2008-01-01

    One of the biggest scientific mysteries of our time resides in the identification of the particles that constitute a large fraction of the mass of our Universe, generically known as dark matter. We review the observations and the experimental data that imply the existence of dark matter. We briefly discuss the properties of the two best dark-matter candidate particles and the experimental techniques presently used to try to discover them. Finally, we mention a proposed project that has recently emerged within the Mexican community to look for dark matter.

  13. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise/generally acknowledged structure, unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to move the problem of environmental restoration from the scientific sphere into practical terms. Already the first steps clearly showed an imperfection of existing technologies, managerial and regulatory schemes; lack of qualified specialists, relevant methods and techniques; uncertainties in methodology of decision-making, etc. Thus, building up (or maybe, structuring) of a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavored to substantiate the last thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology

  14. Characterising the grey matter correlates of leukoaraiosis in cerebral small vessel disease

    Directory of Open Access Journals (Sweden)

    Christian Lambert

    2015-01-01

    We demonstrate that SVD severity is associated with regional cortical thinning. Furthermore, a quantitative measure of SVD severity (WMH volume) can be predicted from grey matter measures, supporting an association between white and grey matter damage. The pattern of cortical thinning and volumetric decline is distinctive for SVD severity compared to ageing. These results, taken together, suggest that there is a phenotypic pattern of atrophy associated with SVD severity.
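
    Predicting a continuous severity measure such as WMH volume from regional grey matter values is, at its simplest, a least-squares regression. A sketch on synthetic data (none of the numbers come from the study, and the real analysis will differ):

```python
# Predicting a continuous severity measure (e.g. WMH volume) from
# regional grey matter measures via least squares. Synthetic data only.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_regions = 40, 3
gm = rng.normal(size=(n_subjects, n_regions))          # grey matter measures
true_w = np.array([2.0, -1.0, 0.5])                    # made-up weights
wmh = gm @ true_w + 0.1 * rng.normal(size=n_subjects)  # noisy "WMH volume"

X = np.column_stack([np.ones(n_subjects), gm])         # add an intercept column
coef, *_ = np.linalg.lstsq(X, wmh, rcond=None)
print(coef[1:])  # fitted weights should land near [2.0, -1.0, 0.5]
```

    The point of such a fit is the one made in the record: if grey matter measures carry predictive weight for WMH volume, white and grey matter damage are statistically linked.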

  15. Teaching Theory of Science and Research Methodology to Nursing Students: A Practice-Developing Approach

    DEFF Research Database (Denmark)

    Sievert, Anne; Chaiklin, Seth

    , in a principled way, to select subject-matter content for a course for nursing students on theory of science and research methodology. At the same time, the practical organisation of the project was motivated by a practice-developing research perspective. The purpose of the presentation is to illustrate how the idea of practice-developing research was realised in this concrete project. A short introduction is first given to explain the practical situation that motivated the need and interest to select subject matter for teaching. Then, the main part of the presentation explains the considerations involved… developed. On the basis of this presentation, it should be possible to get a concrete image of one form for practice-developing research. The presentation concludes with a discussion that problematises the sense in which general knowledge about development of nursing school teaching practice has been…

  16. Study of methodology diversification in diagnostics

    International Nuclear Information System (INIS)

    Suda, Kazunori; Yonekawa, Tsuyoshi; Yoshikawa, Shinji; Hasegawa, Makoto

    1999-03-01

    There are several research activities to enhance the safety and reliability of nuclear power plant operation and maintenance. We are developing a concept of an autonomous operation system in which the role of operators is replaced with artificial intelligence. The purpose of the study described in this report is to develop an operator support system for abnormal plant situations. Conventionally, diagnostic modules based on individual methodologies such as expert systems have been developed and verified. In this report, methodology diversification is considered in order to integrate diagnostic modules whose performance has been confirmed using information processing techniques. Technical issues to be considered in diagnostic methodology diversification are: (1) reliability of input data, (2) diversification of knowledge models, algorithms and reasoning schemes, and (3) mutual complement and robustness. The diagnostic modules utilizing the different approaches defined along with the diversification strategy were evaluated using a fast breeder plant simulator. As a result, we confirmed that no single diagnostic module can meet the accuracy criteria for the entire set of anomaly events. In contrast, we confirmed that every abnormality could be precisely diagnosed by a mutual combination of modules. In other words, the legitimacy of the approach selected by the diversification strategy was shown, and methodology diversification proved clearly effective for abnormality diagnosis. It has also been confirmed that the diversified diagnostic system implemented in this study is able to maintain its accuracy even when the scale of an encountered abnormality differs from the reference cases embedded in the knowledge base. (author)
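
    The mutual complement of diverse diagnostic modules described in this record can be sketched, in a much-simplified form, as majority voting across modules; the modules, thresholds, and event names below are hypothetical:

```python
# Combining diverse diagnostic modules by simple majority vote, one
# simplified way to realize the "mutual complement" the record describes.
# Modules, thresholds, and event names are hypothetical.
def diagnose(modules, observation):
    """Return the event that the most modules agree on."""
    votes = {}
    for module in modules:
        event = module(observation)
        votes[event] = votes.get(event, 0) + 1
    return max(votes, key=votes.get)

# Three diverse detectors for the same (made-up) anomaly signal:
modules = [lambda o: "leak" if o > 0.5 else "normal",
           lambda o: "leak" if o > 0.7 else "normal",
           lambda o: "leak" if o > 0.9 else "normal"]
print(diagnose(modules, 0.8))  # "leak": two of the three modules agree
```

    A single module with the 0.9 threshold would miss this event; the combination catches it, which is the robustness argument the record makes.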

  17. Methodological and epistemological problems in the sociological study of foreign immigration

    Directory of Open Access Journals (Sweden)

    Ramón Llopis Goig

    2014-11-01

Full Text Available The aim of this paper is to identify and analyze the main technical and methodological difficulties faced by sociological research on immigration carried out in Spain in recent years. More concretely, it seeks to uncover the epistemological assumptions underlying the social research practices that generate these techno-methodological problems. To that end, we first study some problems generated in the sociological study of immigration as a consequence of the prevalence, in social research practice, of a neopositivist methodology, which we have called the «public opinion syndrome». Secondly, we examine the difficulties derived from the need to take «empirical reflexivity» into account methodologically. And thirdly, we present the difficulties derived from the implicit assumption of a «methodological nationalism». These analyses are supplemented with several reflections aimed at establishing guidelines for the design and implementation of future research

  18. Clustering Categories in Support Vector Machines

    DEFF Research Database (Denmark)

    Carrizosa, Emilio; Nogales-Gómez, Amaya; Morales, Dolores Romero

    2017-01-01

    The support vector machine (SVM) is a state-of-the-art method in supervised classification. In this paper the Cluster Support Vector Machine (CLSVM) methodology is proposed with the aim to increase the sparsity of the SVM classifier in the presence of categorical features, leading to a gain in in...

  19. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons-grade plutonium, and immobilized plutonium. These waste forms will be disposed of in containers with corrosion-resistant barriers thick enough to prevent water penetration for up to 10,000 years. Criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  20. The Evolution of Galaxies by the Incompatibility between Dark Matter and Baryonic Matter

    OpenAIRE

    Chung, Ding-Yu

    2001-01-01

In this paper, the evolution of galaxies is driven by the incompatibility between dark matter and baryonic matter. Due to their structural difference, baryonic matter and dark matter are incompatible with each other, like oil droplets and water in an emulsion. In the interfacial zone between dark matter and baryonic matter, this incompatibility generates the modification of Newtonian dynamics that keeps dark matter and baryonic matter apart. The five periods of baryonic structure development in the order of incre...

  1. Dark Matter

    Indian Academy of Sciences (India)

What You See Ain't What You Got, Resonance, Vol.4, No.9, 1999. Dark Matter. 2. Dark Matter in the Universe. Bikram Phookun and Biman Nath. In Part 1 of this article we learnt that there is compelling evidence, from the dynamics of spiral galaxies like our own, that there must be non-luminous matter in them. In this.

  2. Resource management for sustainable development : The application of a methodology to support resource management for the adequate application of Construction Systems to enhance sustainability in the lower income dwelling construction industry in Costa Rica

    NARCIS (Netherlands)

    Egmond - de Wilde De Ligny, van E.L.C.; Erkelens, P.A.; Jonge, de S.; Vliet, van A.A.M.

    2000-01-01

    This paper describes the results of the application of a methodology to support resource management for the enhancement of sustainability in the construction industry. Particular emphasis is given to the sustainability of manufacturing and application of construction systems for low income housing

  3. D1.5 WEKIT Framework and Training Methodology

    NARCIS (Netherlands)

    Limbu, Bibeg

    2017-01-01

    The document reports on the status of the WEKIT framework. Building up on the methodologies described in D1.3, it outlines the work done and progress made so far in the Task 1.3. The WEKIT framework was drafted to guide and support the development and implementation of the project. It aims to

  4. Soil Radiological Characterisation Methodology

    International Nuclear Information System (INIS)

    Attiogbe, Julien; Aubonnet, Emilie; De Maquille, Laurence; De Moura, Patrick; Desnoyers, Yvon; Dubot, Didier; Feret, Bruno; Fichet, Pascal; Granier, Guy; Iooss, Bertrand; Nokhamzon, Jean-Guy; Ollivier Dehaye, Catherine; Pillette-Cousin, Lucien; Savary, Alain

    2014-12-01

This report presents the general methodology and best-practice approaches, combining proven existing techniques for sampling and characterisation, to assess the contamination of soils prior to remediation. It is based on feedback from projects conducted by the main French nuclear stakeholders involved in remediation and dismantling (EDF, CEA, AREVA and IRSN). Applying this methodology will enable project managers to obtain the elements necessary for drawing up the files associated with remediation operations, as required by the regulatory authorities. It is applicable to each of the steps necessary for piloting remediation work-sites, depending on the objectives targeted (release into the public domain, re-use, etc.). The main part describes the applied statistical methodology, covering exploratory analysis and variogram data, and the identification of singular points and their location. The results obtained permit the drawing up of a map identifying the contaminated surface and subsurface areas. The report covers radiological site characterisation from the initial investigations, based on historical and functional analysis, through to checking that the remediation objectives have been met. It is followed by an example application drawn from feedback on the remediation of a contaminated site at the Fontenay aux Roses facility, and is supplemented by a glossary of the main terms used in the field, drawn from various publications and international standards. This technical report supports ISO standard ISO/TC 85/SC 5 N 18557, 'Sampling and characterisation principles for soils, buildings and infrastructures contaminated by radionuclides for remediation purposes'. (authors) [fr

  5. Summary of the IAEA's BIOMASS reference biosphere methodology for Environment Agency staff

    International Nuclear Information System (INIS)

    Coughtrey, P.

    2001-01-01

    The International Atomic Energy Agency (IAEA) programme on BIOsphere Modelling and ASSessment (BIOMASS) was launched in October 1996, and will complete during 2001. The BIOMASS programme aimed to develop and apply a methodology for defining biospheres for practical radiological assessment of releases from radioactive waste disposal. This report provides a summary description of the BIOMASS methodology. The BIOMASS methodology has been developed through international collaboration and represents a major milestone in biosphere modelling. It provides an approach supported by a wide range of developers, regulators, biosphere experts and safety assessment specialists. The Environment Agency participated actively in the BIOMASS programme

  6. Examining the effect of psychopathic traits on gray matter volume in a community substance abuse sample.

    Science.gov (United States)

    Cope, Lora M; Shane, Matthew S; Segall, Judith M; Nyalakanti, Prashanth K; Stevens, Michael C; Pearlson, Godfrey D; Calhoun, Vince D; Kiehl, Kent A

    2012-11-30

    Psychopathy is believed to be associated with brain abnormalities in both paralimbic (i.e., orbitofrontal cortex, insula, temporal pole, parahippocampal gyrus, posterior cingulate) and limbic (i.e., amygdala, hippocampus, anterior cingulate) regions. Recent structural imaging studies in both community and prison samples are beginning to support this view. Sixty-six participants, recruited from community corrections centers, were administered the Hare psychopathy checklist-revised (PCL-R), and underwent magnetic resonance imaging (MRI). Voxel-based morphometry was used to test the hypothesis that psychopathic traits would be associated with gray matter reductions in limbic and paralimbic regions. Effects of lifetime drug and alcohol use on gray matter volume were covaried. Psychopathic traits were negatively associated with gray matter volumes in right insula and right hippocampus. Additionally, psychopathic traits were positively associated with gray matter volumes in bilateral orbital frontal cortex and right anterior cingulate. Exploratory regression analyses indicated that gray matter volumes within right hippocampus and left orbital frontal cortex combined to explain 21.8% of the variance in psychopathy scores. These results support the notion that psychopathic traits are associated with abnormal limbic and paralimbic gray matter volume. Furthermore, gray matter increases in areas shown to be functionally impaired suggest that the structure-function relationship may be more nuanced than previously thought. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
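
As a rough illustration of "variance explained", the following computes R² for a single hypothetical predictor of psychopathy score; the paper's own analysis combined two regional volumes (right hippocampus and left orbital frontal cortex) to reach 21.8%, and every value below is invented for the sketch.

```python
# Hedged sketch: R^2 (proportion of variance explained) from a simple
# one-predictor linear regression. For one predictor, R^2 equals the
# squared Pearson correlation. All data values are invented.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    return (sxy * sxy) / (sxx * syy)  # squared correlation coefficient

volume = [0.9, 1.1, 1.0, 0.8, 1.2]       # hypothetical regional volumes
score = [28.0, 20.0, 24.0, 30.0, 18.0]   # hypothetical PCL-R scores
print(round(r_squared(volume, score), 3))
# → 0.985
```

A multiple regression with two regional volumes, as in the study, generalises this by fitting both predictors jointly and reporting the model R².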

  7. External Costs and Benefits of Energy. Methodologies, Results and Effects on Renewable Energies Competitivity

    International Nuclear Information System (INIS)

    Saez, R.; Cabal, H.; Varela, M.

    1999-01-01

This study attempts to give a summarised vision of the concept of externality in energy production, and of the social and economic usefulness of its evaluation and its consideration as support for political decision-making in matters of environmental regulation, technology selection for new plants, the establishment of priorities in energy plans, etc. The most relevant environmental externalities are described, such as effects on health, ecosystems, materials and climate, as well as some socioeconomic externalities such as employment, increase of the GDP, and the reduction and depletion of energy resources. The different methodologies used in recent years are reviewed, as well as the principal results obtained in the most relevant studies carried out internationally on this topic. Special mention is made of the European study 'National Implementation of the ExternE Methodology in the EU'; the results obtained are presented in Table 2 of this study. Also summarised are the results obtained in the evaluation of the environmental externalities of the Spanish electrical system as a function of the fuel cycle. In this last case the results are approximate, since they were obtained by extrapolation from those for ten representative plants geographically distributed throughout the Peninsula. Finally, the study analyses the influence that internalising the external costs of conventional energies can have on the competitiveness and market share of renewable energies, which cause fewer environmental effects and therefore produce much smaller external costs. The mechanisms of internalisation, and the question of whether or not these costs should be incorporated in the price of energy, are also discussed. (Author) 30 refs

  8. Becoming ecological citizens: connecting people through performance art, food matter and practices

    Science.gov (United States)

    Roe, Emma; Buser, Michael

    2016-01-01

Engaging the interest of Western citizens in the complex food connections that shape their own and others’ personal wellbeing around issues such as food security and access is challenging. This article is critical of the food marketplace as the site for informing consumer behaviour and argues instead for arts-based participatory activities to support the performance of ecological citizens in non-commercial spaces. Following the ongoing methodological and conceptual fascination with performance, matter and practice in cultural food studies, we outline what the ecological citizen, formed through food’s agentive potential, does and could do. This is an ecological citizen, defined not in its traditional relation to the state but rather to the world of humans and non-humans whose lives are materially interconnected through nourishment. The article draws on the theories of Berlant, Latour, Bennett and Massumi. Our methodology is a collaborative arts-led research project that explored and juxtaposed diverse food practices with artist Paul Hurley, researchers, community partners, volunteers and participants in Bristol, UK. It centred on a 10-day exhibition where visitors were exposed to a series of interactive explorations with and about food. Our experience leads us to outline two steps for enacting ecological citizenship. The first step is to facilitate sensory experiences that enable the agential qualities of foodstuffs to shape knowledge making. The second is to create a space where people can perform, or relate differently, in unusual manners to food. Through participating in the project and visiting the exhibition, people were invited to respond not only as ‘ethical consumers’ but also as ‘ecological citizens’. This participatory approach to research can contribute to understandings of human-world entanglements. PMID:29708123

  9. Codecaying Dark Matter.

    Science.gov (United States)

    Dror, Jeff Asaf; Kuflik, Eric; Ng, Wee Hao

    2016-11-18

    We propose a new mechanism for thermal dark matter freeze-out, called codecaying dark matter. Multicomponent dark sectors with degenerate particles and out-of-equilibrium decays can codecay to obtain the observed relic density. The dark matter density is exponentially depleted through the decay of nearly degenerate particles rather than from Boltzmann suppression. The relic abundance is set by the dark matter annihilation cross section, which is predicted to be boosted, and the decay rate of the dark sector particles. The mechanism is viable in a broad range of dark matter parameter space, with a robust prediction of an enhanced indirect detection signal. Finally, we present a simple model that realizes codecaying dark matter.

  10. Topological hierarchy matters — topological matters with superlattices of defects

    International Nuclear Information System (INIS)

    He Jing; Kou Su-Peng

    2016-01-01

Topological insulators/superconductors are new states of quantum matter with metallic edge/surface states. In this paper, we review the effect of defects in these topological states and study new types of topological matters — topological hierarchy matters. We find that both topological defects (quantized vortices) and non-topological defects (vacancies) can induce topological mid-gap states in the topological hierarchy matters after considering the superlattice of defects. These topological mid-gap states have nontrivial topological properties, including the nonzero Chern number and the gapless edge states. Effective tight-binding models are obtained to describe the topological mid-gap states in the topological hierarchy matters. (topical review)

  11. Methodology to extract of humic substances of lombricompost and evaluation of their performance

    International Nuclear Information System (INIS)

    Torrente Trujillo, Armando; Gomez Zambrano, Jairo

    1995-01-01

The present work was developed at the Facultad de Ciencias Agropecuarias of the Universidad Nacional de Colombia, located in Palmira, Valle del Cauca. The research consisted, on the one hand, in developing an appropriate methodology to extract the humic substances contained in lombricompost and, on the other hand, in evaluating the organic carbon yield of the fulvic and humic acids. The lombricompost sources consisted of organic matter such as cow dung, filter press cake, coffee pulp and Paspalum notatum, with and without application of lime. The proposed methodology comprises sixteen steps, which are completely described in the work. The method showed that the humic acids in the lombricompost are richer than the fulvic ones; moreover, among the four sources used in the experiment, the filter press cake was distinct from and higher in carbon yield than coffee pulp and Paspalum notatum

  12. Two Stage Fuzzy Methodology to Evaluate the Credit Risks of Investment Projects

    OpenAIRE

    O. Badagadze; G. Sirbiladze; I. Khutsishvili

    2014-01-01

The work proposes a decision support methodology for credit risk minimization in the selection of investment projects. The methodology provides two stages of project evaluation. A preliminary selection of projects with minor credit risks is made using the Expertons Method. The second stage ranks the chosen projects using the Possibilistic Discrimination Analysis Method. The latter is a new modification of the well-known Method of Fuzzy Discrimination Analysis.
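
The two-stage flow can be sketched as follows; the filtering and ranking scores here are simple numeric stand-ins, not the Expertons or possibilistic discrimination computations themselves, and all project data are invented.

```python
# Illustrative two-stage project selection: stage 1 filters out projects
# whose credit risk is too high, stage 2 ranks the survivors. The scoring
# values are stand-ins for the methods named in the abstract.

def stage1_filter(projects, risk_limit):
    """Preliminary selection: keep projects whose risk score is acceptable."""
    return [p for p in projects if p["risk"] <= risk_limit]

def stage2_rank(projects):
    """Rank the remaining projects by evaluation score (higher is better)."""
    return sorted(projects, key=lambda p: p["score"], reverse=True)

projects = [
    {"name": "A", "risk": 0.2, "score": 0.7},
    {"name": "B", "risk": 0.6, "score": 0.9},  # excluded at stage 1: too risky
    {"name": "C", "risk": 0.1, "score": 0.8},
]
ranked = stage2_rank(stage1_filter(projects, risk_limit=0.3))
print([p["name"] for p in ranked])
# → ['C', 'A']
```

Note that project B would win a single-stage ranking on score alone; the two-stage structure is what guarantees that no high-risk project reaches the ranking step.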

  13. Update in the methodology of the chronic stress paradigm: internal control matters

    Directory of Open Access Journals (Sweden)

    Boyks Marco

    2011-04-01

Full Text Available To date, the reliability of induction of a depressive-like state using chronic stress models is confronted by many methodological limitations. We believe that the modifications to the stress paradigm in mice proposed herein allow some of these limitations to be overcome. Here, we discuss a variant of the standard stress paradigm, which results in anhedonia. This anhedonic state was defined by a decrease in sucrose preference that was not exhibited by all animals. As such, we propose the use of non-anhedonic, stressed mice as an internal control in experimental mouse models of depression. The application of an internal control for the effects of stress, along with optimized behavioural testing, can enable the analysis of biological correlates of stress-induced anhedonia versus the consequences of stress alone in a chronic-stress depression model. This is illustrated, for instance, by distinct physiological and molecular profiles in anhedonic and non-anhedonic groups subjected to stress. These results argue for the use of a subgroup of individuals who are negative for the induction of a depressive phenotype during experimental paradigms of depression as an internal control, for more refined modeling of this disorder in animals.

  14. Update in the methodology of the chronic stress paradigm: internal control matters.

    Science.gov (United States)

    Strekalova, Tatyana; Couch, Yvonne; Kholod, Natalia; Boyks, Marco; Malin, Dmitry; Leprince, Pierre; Steinbusch, Harry Mw

    2011-04-27

    To date, the reliability of induction of a depressive-like state using chronic stress models is confronted by many methodological limitations. We believe that the modifications to the stress paradigm in mice proposed herein allow some of these limitations to be overcome. Here, we discuss a variant of the standard stress paradigm, which results in anhedonia. This anhedonic state was defined by a decrease in sucrose preference that was not exhibited by all animals. As such, we propose the use of non-anhedonic, stressed mice as an internal control in experimental mouse models of depression. The application of an internal control for the effects of stress, along with optimized behavioural testing, can enable the analysis of biological correlates of stress-induced anhedonia versus the consequences of stress alone in a chronic-stress depression model. This is illustrated, for instance, by distinct physiological and molecular profiles in anhedonic and non-anhedonic groups subjected to stress. These results argue for the use of a subgroup of individuals who are negative for the induction of a depressive phenotype during experimental paradigms of depression as an internal control, for more refined modeling of this disorder in animals.
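
The internal-control criterion described above can be sketched as a split of stressed animals by their sucrose preference. The 65% threshold and the intake values below are illustrative assumptions, not figures taken from the paper.

```python
# Hedged sketch: split stressed mice into an anhedonic group (reduced
# sucrose preference) and a non-anhedonic internal-control group.
# The 0.65 criterion and all intake values are invented for illustration.

def sucrose_preference(sucrose_ml, water_ml):
    """Sucrose intake as a fraction of total fluid intake."""
    return sucrose_ml / (sucrose_ml + water_ml)

def split_by_anhedonia(stressed_mice, threshold=0.65):
    anhedonic, internal_control = [], []
    for mouse_id, sucrose_ml, water_ml in stressed_mice:
        if sucrose_preference(sucrose_ml, water_ml) < threshold:
            anhedonic.append(mouse_id)
        else:
            internal_control.append(mouse_id)  # stressed but not anhedonic
    return anhedonic, internal_control

mice = [("m1", 4.0, 6.0), ("m2", 8.0, 2.0), ("m3", 5.0, 5.0)]
anhedonic, control = split_by_anhedonia(mice)
print(anhedonic, control)
# → ['m1', 'm3'] ['m2']
```

Comparing biological measures between the two groups, both of which received identical stress, is what isolates correlates of anhedonia from the consequences of stress alone.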

  15. Methodology applied to develop the DHIE: applied methodology

    CSIR Research Space (South Africa)

    Herselman, Marlien

    2016-12-01

    Full Text Available This section will address the methodology that was applied to develop the South African Digital Health Innovation Ecosystem (DHIE). Each chapter under Section B represents a specific phase in the methodology....

  16. Distinct optical chemistry of dissolved organic matter in urban pond ecosystems

    Czech Academy of Sciences Publication Activity Database

    McEnroe, N. A.; Williams, C. J.; Xenopoulos, M. A.; Porcal, Petr; Frost, P. C.

    2013-01-01

    Roč. 8, č. 11 (2013), e80334 E-ISSN 1932-6203 Institutional support: RVO:60077344 Keywords : dissolved organic matter * photodegradation * fluorescence * PARAFAC Subject RIV: DA - Hydrology ; Limnology Impact factor: 3.534, year: 2013

  17. Contextual assessment of organisational culture - methodological development in two case studies

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.

    2002-01-01

Despite the acknowledged significance of organisational culture in the nuclear field, previous cultural studies have concentrated on purely safety related matters, or been only descriptive in nature. New kinds of methods, taking into account the overall objectives of the organisation, were needed to assess culture and develop its working practices appropriately. VTT developed the Contextual Assessment of Organisational Culture (CAOC) methodology during the FINNUS programme. The methodology utilises two concepts, organisational culture and core task. The core task can be defined as the core demands and content of work that the organisation has to accomplish in order to be effective. The core task concept is used in assessing the central dimensions of the organisation's culture. Organisational culture is defined as a solution the company has generated in order to fulfil the perceived demands of its core task. The CAOC-methodology was applied in two case studies, in the Radiation and Nuclear Safety Authority of Finland and in the maintenance unit of Loviisa NPP. The aim of the studies was not only to assess the given culture, but also to give the personnel new concepts and new tools for reflecting on their organisation, their jobs and on appropriate working practices. The CAOC-methodology contributes to the design and redesign of work in complex sociotechnical systems. It strives to enhance organisations' capability to assess their current working practices and the meanings attached to them, compare these to the actual demands of their basic mission, and so change maladaptive practices. (orig.)

  18. Dark Matter Caustics

    International Nuclear Information System (INIS)

    Natarajan, Aravind

    2010-01-01

    The continuous infall of dark matter with low velocity dispersion in galactic halos leads to the formation of high density structures called caustics. Dark matter caustics are of two kinds : outer and inner. Outer caustics are thin spherical shells surrounding galaxies while inner caustics have a more complicated structure that depends on the dark matter angular momentum distribution. The presence of a dark matter caustic in the plane of the galaxy modifies the gas density in its neighborhood which may lead to observable effects. Caustics are also relevant to direct and indirect dark matter searches.

  19. Application of validity theory and methodology to patient-reported outcome measures (PROMs): building an argument for validity.

    Science.gov (United States)

    Hawkins, Melanie; Elsworth, Gerald R; Osborne, Richard H

    2018-07-01

    Data from subjective patient-reported outcome measures (PROMs) are now being used in the health sector to make or support decisions about individuals, groups and populations. Contemporary validity theorists define validity not as a statistical property of the test but as the extent to which empirical evidence supports the interpretation of test scores for an intended use. However, validity testing theory and methodology are rarely evident in the PROM validation literature. Application of this theory and methodology would provide structure for comprehensive validation planning to support improved PROM development and sound arguments for the validity of PROM score interpretation and use in each new context. This paper proposes the application of contemporary validity theory and methodology to PROM validity testing. The validity testing principles will be applied to a hypothetical case study with a focus on the interpretation and use of scores from a translated PROM that measures health literacy (the Health Literacy Questionnaire or HLQ). Although robust psychometric properties of a PROM are a pre-condition to its use, a PROM's validity lies in the sound argument that a network of empirical evidence supports the intended interpretation and use of PROM scores for decision making in a particular context. The health sector is yet to apply contemporary theory and methodology to PROM development and validation. The theoretical and methodological processes in this paper are offered as an advancement of the theory and practice of PROM validity testing in the health sector.

  20. Mastering the wrinkling of self-supported graphene

    Czech Academy of Sciences Publication Activity Database

    Pacáková, Barbara; Verhagen, Timotheus; Bouša, Milan; Huebner, U.; Vejpravová, Jana; Kalbáč, Martin; Frank, Otakar

    2017-01-01

    Roč. 7, Aug (2017), s. 1-10, č. článku 10003. ISSN 2045-2322 R&D Projects: GA ČR GA14-15357S; GA MŠk LL1301; GA MŠk(CZ) LM2015073 Institutional support: RVO:68378271 ; RVO:61388955 Keywords : wrinkling * graphene Subject RIV: BM - Solid Matter Physics ; Magnetism; CF - Physical ; Theoretical Chemistry (UFCH-W) OBOR OECD: Condensed matter physics (including formerly solid state physics, supercond.); Physical chemistry (UFCH-W) Impact factor: 4.259, year: 2016

  1. A multicriteria decision support methodology for evaluating airport expansion plans

    NARCIS (Netherlands)

    Vreeker, R.; Nijkamp, P.; ter Welle, C.

    2001-01-01

    Rational decision-making requires an assessment of advantages and disadvantages of choice possibilities, including non-market effects (such as externalities). This also applies to strategic decision-making in the transport sector (including aviation). In the past decades various decision support and

  2. Genetic covariance functions for live weight, condition score, and dry-matter intake measured at different lactation stages of Holstein-Friesian heifers

    NARCIS (Netherlands)

    Koenen, E.P.C.; Veerkamp, R.F.

    1998-01-01

    Genetic parameters for live weight, body condition score and dry-matter intake of dairy heifers were estimated using covariance function methodology. Data were from 469 heifers of the Langhill Dairy Cattle Research Centre and included observations during the first 25 weeks in lactation. Genetic

  3. Hypothetical Dark Matter/Axion rockets: What can be said about Dark Matter in terms of space physics propulsion

    International Nuclear Information System (INIS)

    Beckwith, Andrew

    2009-01-01

This paper discusses dark matter (DM) particle candidates from non-supersymmetry (SUSY) processes and explores how a DM candidate particle in the 100-400 GeV range could be created. Thrust from DM particles is also proposed for photon rockets and axion rockets. A magnetic field would be used to convert DM particles into near-photon-like particles in a chamber, creating thrust from the discharge of those particles. The presence of DM particles suggests that the thrust from the emerging near-photon-like particles would be greater than with conventional photon rockets. This amplifies and improves on an 'axion rocket ramjet' for interstellar travel. It is assumed that the same methodology used in an axion ramjet could be used with DM, with perhaps greater thrust/power conversion efficiencies.

  4. How can computers support, enrich, and transform collaborative creativity

    DEFF Research Database (Denmark)

    Dalsgaard, Peter; Inie, Nanna; Hansen, Nicolai Brodersen

    2017-01-01

The aim of the workshop is to examine and discuss how computers can support, enrich, and transform collaborative creative processes. By exploring and combining methodological, theoretical, and design-oriented perspectives, we wish to examine the implications, potentials, and limitations of different approaches to providing digital support for collaborative creativity. Participation in the workshop requires participants to actively document and identify salient themes in one or more examples of computer-supported collaborative creativity, and the resulting material will serve as the empirical...

  5. Methodology development to support NPR strategic planning. Final report

    International Nuclear Information System (INIS)

    1996-01-01

    This report covers the work performed in support of the Office of New Production Reactors during the 9 month period from January through September 1990. Because of the rapid pace of program activities during this time period, the emphasis on work performed shifted from the strategic planning emphasis toward supporting initiatives requiring a more immediate consideration and response. Consequently, the work performed has concentrated on researching and helping identify and resolve those issues considered to be of most immediate concern. Even though they are strongly interrelated, they can be separated into two broad categories as follows: The first category encompasses program internal concerns. Included are issues associated with the current demand for accelerating staff growth, satisfying the immediate need for appropriate skill and experience levels, team building efforts necessary to assure the development of an effective operating organization, ability of people and organizations to satisfactorily understand and execute their assigned roles and responsibilities, and the general facilitation of inter/intra organization communications and working relationships. The second category encompasses program execution concerns. These include those efforts required in development of realistic execution plans and implementation of appropriate control mechanisms which provide for effective forecasting, planning, managing, and controlling of on-going (or soon to be) program substantive activities according to the master integrated schedule and budget

  6. Matter-antimatter and matter-matter interactions at intermediate energies; Interacao materia-antimateria e materia-materia a energias intermediarias

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Antonio Carlos Fontes dos [Missouri Univ., Rolla, MO (United States). Dept. of Physics]. E-mail: antoniocfs@hotmail.com

    2002-07-01

    This article presents some of the recent experimental advances in the study of antimatter-matter and matter-matter interactions; some of the subtle differences observed experimentally have stimulated a great theoretical effort to explain the results.

  7. Asymmetric dark matter

    International Nuclear Information System (INIS)

    Kaplan, David E.; Luty, Markus A.; Zurek, Kathryn M.

    2009-01-01

    We consider a simple class of models in which the relic density of dark matter is determined by the baryon asymmetry of the Universe. In these models a B-L asymmetry generated at high temperatures is transferred to the dark matter, which is charged under B-L. The interactions that transfer the asymmetry decouple at temperatures above the dark matter mass, freezing in a dark matter asymmetry of order the baryon asymmetry. This explains the observed relation between the baryon and dark matter densities for the dark matter mass in the range 5-15 GeV. The symmetric component of the dark matter can annihilate efficiently to light pseudoscalar Higgs particles a or via t-channel exchange of new scalar doublets. The first possibility allows for h^0 → aa decays, while the second predicts a light charged Higgs-like scalar decaying to τν. Direct detection can arise from Higgs exchange in the first model or a nonzero magnetic moment in the second. In supersymmetric models, the would-be lightest supersymmetric partner can decay into pairs of dark matter particles plus standard model particles, possibly with displaced vertices.
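    The stated relation between the baryon and dark matter densities can be sketched with a back-of-the-envelope estimate (not taken verbatim from the paper; it assumes the dark matter number density inherits an asymmetry comparable to the baryon one):

```latex
% Rough estimate: if n_DM ~ n_b from the shared B-L asymmetry, then
\[
  \frac{\Omega_{\rm DM}}{\Omega_b}
  \;\approx\; \frac{m_{\rm DM}\, n_{\rm DM}}{m_p\, n_b}
  \;\approx\; \frac{m_{\rm DM}}{m_p}
  \quad\Longrightarrow\quad
  m_{\rm DM} \;\approx\; 5\, m_p \;\approx\; 5~\mathrm{GeV}
  \;\;\text{for the observed}\;\; \frac{\Omega_{\rm DM}}{\Omega_b} \approx 5 ,
\]
```

which is consistent with the 5-15 GeV mass window quoted in the abstract once order-one transfer factors are allowed for.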

  8. Pure Gravities via Color-Kinematics Duality for Fundamental Matter

    CERN Document Server

    Johansson, Henrik

    2015-01-01

    We give a prescription for the computation of loop-level scattering amplitudes in pure Einstein gravity, and four-dimensional pure supergravities, using the color-kinematics duality. Amplitudes are constructed using double copies of pure (super-)Yang-Mills parts and additional contributions from double copies of fundamental matter, which are treated as ghosts. The opposite-statistics states cancel the unwanted dilaton and axion in the bosonic theory, as well as the extra matter supermultiplets in supergravities. As a spinoff, we obtain a prescription for obtaining amplitudes in supergravities with arbitrary non-self-interacting matter. As a prerequisite, we extend the color-kinematics duality from the adjoint to the fundamental representation of the gauge group. We explain the numerator relations that the fundamental kinematic Lie algebra should satisfy. We give nontrivial evidence supporting our construction using explicit tree and loop amplitudes, as well as more general arguments.

  9. Trace elements in suspended particulate matter and liquid fraction of the Arno River waters

    International Nuclear Information System (INIS)

    Capannesi, G.; Cecchi, A.; Mando, P.A.

    1984-01-01

    The concentrations of 46 elements along the course of the Arno River (Tuscany, Italy) have been determined by means of Instrumental Neutron Activation Analysis. Both the suspended particulate matter and the liquid fraction have been investigated. No chemical treatment has been performed on the samples, either before or after irradiation. Anticoincidence techniques have been employed in the γ spectroscopy. The results are briefly discussed, also from a methodological point of view. 4 references, 16 figures, 2 tables

  10. The quark matter

    International Nuclear Information System (INIS)

    Rho, Mannque.

    1980-04-01

    The present status of our understanding of the physics of hadronic (nuclear or neutron) matter under extreme conditions, in particular at high densities, is discussed. This is a problem which challenges three disciplines of physics: nuclear physics, astrophysics and particle physics. It is generally believed that we now have a correct and perhaps ultimate theory of the strong interactions, namely quantum chromodynamics (QCD). The constituents of this theory are quarks and gluons, so highly dense matter should be describable in terms of these constituents alone. This question bears directly on the phenomenon of quark confinement, one of the least understood aspects of particle physics. For nuclear physics, the possibility of a phase change between nuclear matter and quark matter introduces entirely new degrees of freedom in the description of nuclei and will perhaps bring a deeper understanding of nuclear dynamics. In astrophysics, the properties of neutron stars will be properly understood only when the equation of state of 'neutron' matter at densities exceeding that of nuclear matter can be reliably calculated. Most fascinating is the possibility of quark stars existing in nature, not entirely an absurd idea. Finally, the quark matter - nuclear matter phase transition must have occurred in the early stage of the universe when matter expanded from high temperature and density; this could be an essential ingredient in big-bang cosmology.

  11. Life Support Filtration System Trade Study for Deep Space Missions

    Science.gov (United States)

    Agui, Juan H.; Perry, Jay L.

    2017-01-01

    The National Aeronautics and Space Administration's (NASA) technical developments for highly reliable life support systems aim to maximize the viability of long duration deep space missions. Among the life support system functions, airborne particulate matter filtration is a significant driver of launch mass because of the large geometry required to provide adequate filtration performance and because of the number of replacement filters needed to sustain a mission. A trade analysis incorporating various launch, operational and maintenance parameters was conducted to investigate the trade-offs between the various particulate matter filtration configurations. In addition to typical launch parameters such as mass, volume and power, the amount of crew time dedicated to system maintenance becomes an increasingly crucial factor for long duration missions. The trade analysis evaluated these parameters for conventional particulate matter filtration technologies and a new multi-stage particulate matter filtration system under development by NASA's Glenn Research Center. The multi-stage filtration system features modular components that allow for physical configuration flexibility. Specifically, the filtration system components can be configured in distributed, centralized, and hybrid physical layouts that can result in considerable mass savings compared to conventional particulate matter filtration technologies. The trade analysis results are presented and implications for future transit and surface missions are discussed.

  12. Effects of soil organic matter on the development of the microbial polycyclic aromatic hydrocarbons (PAHs) degradation potentials

    International Nuclear Information System (INIS)

    Yang, Y.; Zhang, N.; Xue, M.; Lu, S.T.; Tao, S.

    2011-01-01

    Microbial activity in soils is a critical factor governing the degradation of organic micro-pollutants. The present study was conducted to analyze the effects of soil organic matter on the development of degradation potentials for polycyclic aromatic hydrocarbons (PAHs). Most of the PAH degradation kinetics of the indigenous microorganisms developed in soils could be fitted with logistic growth models. Microbial activities were relatively lower in the soils with the lowest and highest organic matter contents, likely due to nutrient limitation and PAH sequestration, respectively. Microbial activities developed in humic acid (HA) were much higher than those developed in humin, which was demonstrated to sequester organic pollutants more strongly. The results suggest that nutrient support and sequestration are the two major mechanisms by which soil organic matter influences the development of microbial PAH degradation potentials. - Research highlights: → PAH degradation kinetics obey logistic models. → Degradation potentials depend on soil organic carbon content. → Humin inhibits the development of PAH degradation activity. → Nutrient support and sequestration regulate microbial degradation capacity. - Soil organic matter regulated PAH degradation potentials through nutrient support and sequestration.
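    Fitting a logistic growth model to degradation kinetics, as described above, can be sketched as follows (a minimal illustration with made-up time/fraction-degraded data, not the study's measurements):

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    """Logistic growth curve: K = plateau (max fraction degraded),
    r = rate constant, t0 = inflection time."""
    return K / (1.0 + np.exp(-r * (t - t0)))

# Hypothetical incubation data: time (days) vs. fraction of PAH degraded
t = np.array([0, 5, 10, 15, 20, 30, 45, 60], dtype=float)
y = np.array([0.01, 0.05, 0.15, 0.35, 0.55, 0.75, 0.82, 0.84])

# Fit with rough initial guesses for (K, r, t0)
params, _ = curve_fit(logistic, t, y, p0=[0.8, 0.2, 15.0])
K, r, t0 = params
print(f"plateau K={K:.2f}, rate r={r:.3f}/day, inflection t0={t0:.1f} days")
```

Comparing the fitted plateau K and rate r across soils with different organic matter contents is one way such kinetics can quantify a degradation potential.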

  13. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: 'Kursk' submarine study

    Directory of Open Access Journals (Sweden)

    A. Baklanov

    2003-01-01

    Full Text Available There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that the possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of the Scandinavia and Northwest Russia could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  14. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-03-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that the possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of the Scandinavia and Northwest Russia could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.

  15. Methodology for prediction and estimation of consequences of possible atmospheric releases of hazardous matter: "Kursk" submarine study

    Science.gov (United States)

    Baklanov, A.; Mahura, A.; Sørensen, J. H.

    2003-06-01

    There are objects with some periods of higher than normal levels of risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various operations at high risk. A methodology for risk assessment is suggested and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second - during the operation stage. The suggested methodology is applied on an example of the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators allowed to identify the hypothetically impacted geographical regions and territories. In cases of atmospheric transport toward the most populated areas, the forecasts of possible consequences during phases of the high and medium potential risk levels based on a unit hypothetical release (e.g. 1 Bq) are performed. The analysis showed that the possible deposition fractions of 10-11 (Bq/m2) over the Kola Peninsula, and 10-12 - 10-13 (Bq/m2) for the remote areas of the Scandinavia and Northwest Russia could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.
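    The unit-release convention used in these forecasts scales linearly: a deposition field computed for a 1 Bq release can be multiplied by any hypothesised source term. A minimal sketch (the regions follow the abstract, but the source term and exact fractions are illustrative assumptions):

```python
# Deposition forecasts computed for a unit release (1 Bq) scale linearly
# with the actual source term, so one dispersion run covers many scenarios.

# Unit-release deposition fractions (Bq/m^2 per Bq released), illustrative
unit_deposition = {
    "Kola Peninsula": 1e-11,
    "Scandinavia (remote)": 1e-12,
    "Northwest Russia (remote)": 1e-13,
}

def scale_deposition(unit_fields, source_term_bq):
    """Scale unit-release deposition fields to a hypothesised source term."""
    return {region: frac * source_term_bq for region, frac in unit_fields.items()}

# Hypothetical source term of 1e15 Bq (for illustration only)
deposition = scale_deposition(unit_deposition, 1e15)
for region, value in deposition.items():
    print(f"{region}: {value:.1e} Bq/m^2")
```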

  16. A Q-Methodological Study of the Kubler-Ross Stage Theory.

    Science.gov (United States)

    Metzger, Anne M.

    1979-01-01

    Investigated the correspondence between stage changes hypothesized by the Kubler-Ross theory and the perception of the course of illness by seriously ill patients and their spouses. Supported the use of Q-methodology as a research procedure for investigations of terminal illness. (Author)

  17. Nonlinear Methodologies for Identifying Seismic Event and Nuclear Explosion Using Random Forest, Support Vector Machine, and Naive Bayes Classification

    Directory of Open Access Journals (Sweden)

    Longjun Dong

    2014-01-01

    Full Text Available The discrimination of seismic events and nuclear explosions is a complex and nonlinear problem. The nonlinear methodologies Random Forests (RF), Support Vector Machines (SVM), and Naïve Bayes Classifier (NBC) were applied to discriminate seismic events. Twenty earthquakes and twenty-seven explosions, characterized by nine ratios of the energies contained within predetermined "velocity windows" and by calculated distance, are used in the discriminators. Based on leave-one-out cross-validation, ROC curves, and the calculated accuracies on training and test samples, the discriminating performances of RF, SVM, and NBC were discussed and compared. The RF method clearly shows the best predictive power, with a maximum area under the ROC curve of 0.975 among RF, SVM, and NBC. The discriminant accuracies of RF, SVM, and NBC on the test samples are 92.86%, 85.71%, and 92.86%, respectively. It has been demonstrated that the presented RF model can not only identify seismic events automatically with high accuracy, but can also rank the discriminant indicators according to the calculated values of their weights.
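    The classifier comparison described above can be sketched with scikit-learn (a minimal illustration on synthetic two-class data standing in for the earthquake/explosion features, not the paper's actual dataset):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import roc_auc_score

# Synthetic stand-in for the 47 events with ~10 discriminant features
X, y = make_classification(n_samples=47, n_features=10, n_informative=6,
                           random_state=0)

models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(probability=True, random_state=0),
    "NBC": GaussianNB(),
}

# Leave-one-out cross-validation, scored by area under the ROC curve
for name, model in models.items():
    proba = cross_val_predict(model, X, y, cv=LeaveOneOut(),
                              method="predict_proba")[:, 1]
    print(f"{name}: ROC AUC = {roc_auc_score(y, proba):.3f}")

# Random Forests also rank the discriminant indicators by importance
rf = models["RF"].fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1]
print("feature ranking (best first):", ranking)
```

The feature-importance ranking at the end mirrors the paper's claim that RF can sort the discriminant indicators by weight.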

  18. Matter and dark matter from false vacuum decay

    Energy Technology Data Exchange (ETDEWEB)

    Buchmueller, W.; Schmitz, K.; Vertongen, G.

    2010-08-15

    We study tachyonic preheating associated with the spontaneous breaking of B-L, the difference of baryon and lepton number. Reheating occurs through the decays of heavy Majorana neutrinos which are produced during preheating and in decays of the Higgs particles of B-L breaking. Baryogenesis is an interplay of nonthermal and thermal leptogenesis, accompanied by thermally produced gravitino dark matter. The proposed mechanism simultaneously explains the generation of matter and dark matter, thereby relating the absolute neutrino mass scale to the gravitino mass. (orig.)

  19. Matter and dark matter from false vacuum decay

    International Nuclear Information System (INIS)

    Buchmueller, W.; Schmitz, K.; Vertongen, G.

    2010-08-01

    We study tachyonic preheating associated with the spontaneous breaking of B-L, the difference of baryon and lepton number. Reheating occurs through the decays of heavy Majorana neutrinos which are produced during preheating and in decays of the Higgs particles of B-L breaking. Baryogenesis is an interplay of nonthermal and thermal leptogenesis, accompanied by thermally produced gravitino dark matter. The proposed mechanism simultaneously explains the generation of matter and dark matter, thereby relating the absolute neutrino mass scale to the gravitino mass. (orig.)

  20. #BlackBabiesMatter: Analyzing Black Religious Media in Conservative and Progressive Evangelical Communities

    Directory of Open Access Journals (Sweden)

    Monique Moultrie

    2017-11-01

    Full Text Available This article explores how conservative and progressive black Protestants interrogate the theological theme of the sacrality of black life through digital media. The innovations of religious media in black evangelical communities remain an understudied phenomenon in African American religion, making this an apt arena for further discovery. This current intervention into the study of African American Religion examines digital activism through examples of religious media produced by blacks for black audiences. This article begins its interrogation of the sacrality of black life by juxtaposing those who contend that Black Babies Matter as pro-birth-oriented, religiously motivated activists with those religious opponents asserting Black Lives Matter who present an intersectional pro-life approach. The comparison of views relies on womanist cultural analysis as its main methodology to analyze and interpret digital media and explore its ramifications for African American Religion.

  1. Cosmological constraints on the gravitational interactions of matter and dark matter

    International Nuclear Information System (INIS)

    Bai, Yang; Salvado, Jordi; Stefanek, Ben A.

    2015-01-01

    Although there is overwhelming evidence of dark matter from its gravitational interaction, we still do not know its precise gravitational interaction strength or whether it obeys the equivalence principle. Using the latest available cosmological data and working within the framework of ΛCDM, we first update the measurement of the multiplicative factor of cosmology-relevant Newton's constant over the standard laboratory-based value and find that it is consistent with one. In general relativity, dark matter equivalence principle breaking can be mimicked by a long-range dark matter force mediated by an ultra light scalar field. Using the Planck three year data, we find that the dark matter "fifth-force" strength is constrained to be weaker than 10^-4 of the gravitational force. We also introduce a phenomenological, post-Newtonian two-fluid description to explicitly break the equivalence principle by introducing a difference between dark matter inertial and gravitational masses. Depending on the decoupling time of the dark matter and ordinary matter fluids, the ratio of the dark matter gravitational mass to inertial mass is constrained to be unity at the 10^-6 level

  2. An ontological case base engineering methodology for diabetes management.

    Science.gov (United States)

    El-Sappagh, Shaker H; El-Masri, Samir; Elmogy, Mohammed; Riad, A M; Saddik, Basema

    2014-08-01

    Ontology engineering covers issues related to ontology development and use. In a Case Based Reasoning (CBR) system, ontology plays two main roles: the first as the case base and the second as the domain ontology. However, the ontology engineering literature does not provide adequate guidance on how to build, evaluate, and maintain ontologies. This paper proposes an ontology engineering methodology for generating case bases in the medical domain. It mainly focuses on representing cases in the form of an ontology to support semantic case retrieval and to enhance all knowledge-intensive CBR processes. A case study on a diabetes diagnosis case base is provided to evaluate the proposed methodology.

  3. Seven (and a half) reasons to believe in mirror matter: from neutrino puzzles to the inferred dark matter in the universe

    International Nuclear Information System (INIS)

    Foot, R.

    2001-02-01

    Parity and time reversal are obvious and plausible candidates for fundamental symmetries of nature. Hypothesising that these symmetries exist implies the existence of a new form of matter, called mirror matter. The mirror matter theory (or exact parity model) makes four main predictions: 1) Dark matter in the form of mirror matter should exist in the Universe (i.e. mirror galaxies, stars, planets, meteoroids...), 2) Maximal ordinary neutrino - mirror neutrino oscillations if neutrinos have mass, 3) Orthopositronium should have a shorter effective lifetime than predicted by QED (in 'vacuum' experiments) because of the effects of photon-mirror photon mixing, and 4) Higgs production and decay rates should be 50% lower than in the standard model due to Higgs - mirror Higgs mixing (assuming that the separation of the Higgs masses is larger than their decay widths). At the present time there is strong experimental/observational evidence supporting the first three of these predictions, while the fourth one is not tested yet because the Higgs boson, predicted in the standard model of particle physics, is yet to be found. This experimental/observational evidence is rich and varied, ranging from the atmospheric and solar neutrino deficits, MACHO gravitational microlensing events, strange properties of extra-solar planets, the existence of 'isolated' planets, and the orthopositronium lifetime anomaly, to Tunguska and other strange 'meteor' events including, perhaps, the origin of the moon. The purpose of this article is to provide a not too technical review of these ideas along with some new results

  4. An information system supporting design for reliability and maintenance

    International Nuclear Information System (INIS)

    Rit, J.F.; Beraud, M.T.

    1997-01-01

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on the results and assumptions of each other about the reliability and operations of the plant. Therefore a support information system must be carefully designed. Concurrently with development of the methodology, a research oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.)

  5. An information system supporting design for reliability and maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Rit, J.F.; Beraud, M.T

    1997-12-31

    EDF is currently developing a methodology to integrate availability, operating experience and maintenance in the design of power plants. This involves studies that depend closely on the results and assumptions of each other about the reliability and operations of the plant. Therefore a support information system must be carefully designed. Concurrently with development of the methodology, a research oriented information system was designed and built. It is based on the database model of a logistic support repository that we tailored to our needs. (K.A.) 10 refs.

  6. Perancangan Data Warehouse Nilai Mahasiswa Dengan Kimball Nine-Step Methodology

    Directory of Open Access Journals (Sweden)

    Ganda Wijaya

    2017-04-01

    Abstract Student grades have many components that can be analyzed to support decision making. On this basis, the authors conducted a study of student grades, using the database of the Bureau of Academic and Student Affairs Administration of Bina Sarana Informatika (BAAK BSI). The focus of this research is: "How can a data warehouse be modelled to meet management's needs for student grade data in support of evaluation, planning and decision making?". A data warehouse of student grades needs to be built so that information and reports can be obtained and multi-dimensional analysis can be performed, which in turn can assist management in making policy. Development of the system was done using the System Development Life Cycle (SDLC) with a Waterfall approach, while the data warehouse was designed using the Kimball nine-step methodology. The results obtained take the form of a star schema and a student grade data warehouse. The data warehouse can provide summary information that is fast, accurate and continuous, so as to assist management in making policies for the future. In general, the benefit of this research is as an additional reference for building a data warehouse using the Kimball nine-step methodology.   Keywords: Data Warehouse, Kimball Nine-Step Methodology.
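    A star schema for student grades, as produced by such a methodology, can be sketched as a fact table keyed to dimension tables (the table and column names below are illustrative assumptions, not those of the BAAK BSI database):

```python
import pandas as pd

# Dimension tables (illustrative)
dim_student = pd.DataFrame({"student_id": [1, 2], "name": ["Ami", "Budi"],
                            "major": ["Informatics", "Accounting"]})
dim_course = pd.DataFrame({"course_id": [10, 20],
                           "course": ["Databases", "Statistics"]})
dim_term = pd.DataFrame({"term_id": [201], "year": [2016], "semester": [1]})

# Fact table: one row per (student, course, term) with the grade measure
fact_grades = pd.DataFrame({
    "student_id": [1, 1, 2, 2],
    "course_id": [10, 20, 10, 20],
    "term_id": [201, 201, 201, 201],
    "grade": [3.5, 3.0, 2.5, 4.0],
})

# Multi-dimensional analysis: average grade per major per course
report = (fact_grades
          .merge(dim_student, on="student_id")
          .merge(dim_course, on="course_id")
          .groupby(["major", "course"])["grade"].mean()
          .reset_index())
print(report)
```

Because every analysis joins the central fact table to small dimensions, new reporting dimensions (lecturer, class, region) can be added without restructuring the facts.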

  7. Dark matter universe.

    Science.gov (United States)

    Bahcall, Neta A

    2015-10-06

    Most of the mass in the universe is in the form of dark matter--a new type of nonbaryonic particle not yet detected in the laboratory or in other detection experiments. The evidence for the existence of dark matter through its gravitational impact is clear in astronomical observations--from the early observations of the large motions of galaxies in clusters and the motions of stars and gas in galaxies, to observations of the large-scale structure in the universe, gravitational lensing, and the cosmic microwave background. The extensive data consistently show the dominance of dark matter and quantify its amount and distribution, assuming general relativity is valid. The data inform us that the dark matter is nonbaryonic, is "cold" (i.e., moves nonrelativistically in the early universe), and interacts only weakly with matter other than by gravity. The current Lambda cold dark matter cosmology--a simple (but strange) flat cold dark matter model dominated by a cosmological constant Lambda, with only six basic parameters (including the density of matter and of baryons, the initial mass fluctuations amplitude and its scale dependence, and the age of the universe and of the first stars)--fits remarkably well all the accumulated data. However, what is the dark matter? This is one of the most fundamental open questions in cosmology and particle physics. Its existence requires an extension of our current understanding of particle physics or otherwise points to a modification of gravity on cosmological scales. The exploration and ultimate detection of dark matter are led by experiments for direct and indirect detection of this still mysterious particle.

  8. Destination brands and website evaluation: a research methodology

    Directory of Open Access Journals (Sweden)

    J Fernández-Cavia

    2013-10-01

    Full Text Available Introduction: The World Wide Web has become the primary instrument used by tourists to search for information. As a result, tourism websites for destinations need to be appealing and must convey their brand image in an appropriate, effective manner. However, there is no scientifically sound, universally accepted methodology for assessing the quality and communicative effectiveness of destination websites. The development of such a methodology is one of the tasks we have proposed within the framework of the research project "New strategies for advertising and promoting Spanish tourism brands online" (CSO2008-02627), funded by the Spanish Ministry of Science and Innovation. Method: The project team has developed an interdisciplinary, all-embracing analysis template combining certain automated analyses with other qualitative and quantitative ones. The template comprises a total of 12 subject areas and 154 indicators prepared on the basis of contributions from prominent experts in each of the fields of work. This article sets out the analysis methodology drawn up, and possible applications are given. Results: The primary aim of the project is to provide an assessment methodology that makes it possible to optimise destination brand websites, thus providing a tool to support the work of public tourism destination managers.
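    An assessment template of this kind reduces, computationally, to scoring indicators grouped by subject area and aggregating them; a minimal sketch (the area names, weights, and scores are hypothetical, not the project's actual 12-area/154-indicator template):

```python
# Hypothetical scores: subject area -> list of indicator scores in [0, 1]
scores = {
    "usability": [0.8, 0.6, 0.9],
    "brand image": [0.7, 0.5],
    "interactivity": [1.0, 0.4, 0.6, 0.8],
}
# Hypothetical per-area weights (sum to 1)
weights = {"usability": 0.4, "brand image": 0.35, "interactivity": 0.25}

def area_score(indicators):
    """Mean of the indicator scores within one subject area."""
    return sum(indicators) / len(indicators)

def overall_score(scores, weights):
    """Weighted average of area scores -> a single site quality score."""
    return sum(weights[a] * area_score(s) for a, s in scores.items())

print(f"overall: {overall_score(scores, weights):.3f}")
```

Whether areas are weighted equally or by expert judgment is itself a methodological choice the template would have to fix.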

  9. Dark Matter Searches

    International Nuclear Information System (INIS)

    Moriyama, Shigetaka

    2008-01-01

    Recent cosmological observations, as well as historical observations of the rotational curves of galaxies, strongly suggest the existence of dark matter. It is also widely believed that dark matter consists of unknown elementary particles. However, astrophysical observations based on gravitational effects alone do not provide sufficient information on the properties of dark matter. This article reviews the status of dark matter searches based on observations of high-energy neutrinos from the sun and the earth and on observations of nuclear recoils in laboratory targets. Successful detection of dark matter by these methods would facilitate systematic studies of its properties. Finally, the XMASS experiment, which is due to start at the Kamioka Observatory, is introduced

  10. Methodological approach and tools for systems thinking in health systems research: technical assistants' support of health administration reform in the Democratic Republic of Congo as an application.

    Science.gov (United States)

    Ribesse, Nathalie; Bossyns, Paul; Marchal, Bruno; Karemere, Hermes; Burman, Christopher J; Macq, Jean

    2017-03-01

    In the field of development cooperation, systems thinking and complex systems theories are increasingly recognised as a methodological approach, and the same holds in health systems research, which informs health development aid interventions. However, practical applications remain scarce to date. The objective of this article is to contribute to the body of knowledge by presenting the tools, inspired by systems thinking and complexity theories, and the methodological lessons learned from their application. These tools were used in a case study; detailed results of that study are being prepared for publication in additional articles. Applying a complexity 'lens', the case study examines the role of long-term international technical assistance in supporting health administration reform at the provincial level in the Democratic Republic of Congo. The Methods section presents the guiding principles of systems thinking and complex systems, their relevance and implications for the subject under study, and the existing tools associated with those theories that inspired the design of the data collection and analysis process. The tools and their application processes are presented in the Results section, followed in the Discussion section by a critical analysis of their innovative potential and emergent challenges. The overall methodology provides a coherent whole, with each tool bringing a different and complementary perspective on the system.

  11. Dense Cold Matter

    Directory of Open Access Journals (Sweden)

    Stavinskiy Alexey

    2014-04-01

    A possible way to create dense cold baryonic matter in the laboratory is discussed. The density of this matter is comparable to, or even larger than, the density of a neutron star core. The properties of this matter can be controlled by trigger conditions. An experimental program for the study of the properties of dense cold matter in light and heavy ion collisions at the initial energy range √s_NN ~ 2-3 GeV is proposed.

  12. Scenario development methodologies

    International Nuclear Information System (INIS)

    Eng, T.; Hudson, J.; Stephansson, O.

    1994-11-01

    In the period 1981-1994, SKB has studied several methodologies to systematize and visualize all the features, events and processes (FEPs) that can influence a repository for radioactive waste in the future. All the work performed is based on the terminology and basic findings in the joint SKI/SKB work on scenario development presented in the SKB Technical Report 89-35. The methodologies studied are a) Event tree analysis, b) Influence diagrams and c) Rock Engineering Systems (RES) matrices. Each one of the methodologies is explained in this report as well as examples of applications. One chapter is devoted to a comparison between the two most promising methodologies, namely: Influence diagrams and the RES methodology. In conclusion a combination of parts of the Influence diagram and the RES methodology is likely to be a promising approach. 26 refs

  13. Expatriate support and success : A systematic review of organization-based sources of social support

    NARCIS (Netherlands)

    van der Laken, P.A.; van Engen, M.L.; van Veldhoven, M.J.P.M.; Paauwe, J.

    2016-01-01

    Purpose The purpose of this paper is to review empirical research on the relationship between organization-based social support and the success of international assignments (IAs). Design/methodology/approach Four search engines were used to obtain empirical studies relating organization-based social

  14. Methodological aspects of EEG and Body dynamics measurements during motion.

    Directory of Open Access Journals (Sweden)

    Pedro Reis

    2014-03-01

    EEG involves the recording, analysis, and interpretation of voltages recorded on the human scalp, originating from brain grey matter. EEG is one of the favorite methods to study and understand processes that underlie behavior, because it is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties when recording EEG during movement, such as movement artifacts. Thus, most studies of the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that have emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions on how to avoid and reduce motion artifacts, as well as hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for the determination of real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks.
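One synchronization technique of the kind described above is to align every recording system on a shared clock via a trigger pulse seen by all devices. A minimal sketch of that idea follows; the device names, sampling rates, and sample indices are hypothetical, not taken from the article:

```python
def sample_time(index, rate_hz, trigger_index, trigger_time_s=0.0):
    """Map a stream's sample index to the shared clock, given the index at
    which the common trigger pulse was seen and the stream's sampling rate."""
    return trigger_time_s + (index - trigger_index) / rate_hz

# EEG at 1000 Hz saw the trigger at sample 500;
# motion capture at 100 Hz saw the same trigger at sample 50.
eeg_t = sample_time(1500, 1000.0, 500)   # 1.0 s after the trigger
mocap_t = sample_time(150, 100.0, 50)    # the same instant on the shared clock
print(eeg_t, mocap_t)
```

Once mapped this way, samples from streams with different rates can be compared at equal shared-clock times.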

  15. Clarifying the links between social support and health: culture, stress, and neuroticism matter.

    Science.gov (United States)

    Park, Jiyoung; Kitayama, Shinobu; Karasawa, Mayumi; Curhan, Katherine; Markus, Hazel R; Kawakami, Norito; Miyamoto, Yuri; Love, Gayle D; Coe, Christopher L; Ryff, Carol D

    2013-02-01

    Although it is commonly assumed that social support positively predicts health, the empirical evidence has been inconsistent. We argue that three moderating factors must be considered: (1) support-approving norms (cultural context); (2) support-requiring situations (stressful events); and (3) support-accepting personal style (low neuroticism). Our large-scale cross-cultural survey of Japanese and US adults found significant associations between perceived support and health. The association was more strongly evident among Japanese (from a support-approving cultural context) who reported high life stress (in a support-requiring situation). Moreover, the link between support and health was especially pronounced if these Japanese were low in neuroticism.
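The moderation argument above can be pictured as a toy subgroup analysis: the support-health correlation is computed separately for a subgroup where the three moderators favour support (support-approving context, high stress, low neuroticism) and one where they do not. The data below are invented purely for illustration and are not from the study:

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical data: perceived support (1-5) and a health score per person.
support = [1, 2, 3, 4, 5]
health_favourable = [1.1, 2.0, 3.2, 3.9, 5.1]  # moderators favour support
health_unfavourable = [3, 1, 4, 2, 3]          # moderators do not

r_fav = pearson_r(support, health_favourable)
r_unfav = pearson_r(support, health_unfavourable)
print(round(r_fav, 2), round(r_unfav, 2))
```

A strong support-health association in the first subgroup and a weak one in the second is the pattern the abstract reports for Japanese respondents under high stress with low neuroticism.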

  17. Supportability Analysis in LCI Environment

    OpenAIRE

    Dragan Vasiljevic; Ana Horvat

    2013-01-01

    Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology comprises a review of the modern logistics engineering literature with the objective of collecting and synthesizing the knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties which are fully compatible with the requirement of s...

  18. TWRS privatization support project waste characterization database development

    International Nuclear Information System (INIS)

    1995-11-01

    Pacific Northwest National Laboratory requested support from ICF Kaiser Hanford Company in assembling radionuclide and chemical analyte sample data and inventory estimates for fourteen Hanford underground storage tanks: 241-AN-102, -104, -105, -106, and -107; 241-AP-102, -104, and -105; 241-AW-101, -103, and -105; 241-AZ-101 and -102; and 241-C-109. Sample data were assembled for sixteen radionuclides and thirty-five chemical analytes. The characterization data were provided to Pacific Northwest National Laboratory in support of the Tank Waste Remediation Services Privatization Support Project. The purpose of this report is to present the results and document the methodology used in preparing the waste characterization data set for that project. This report describes the methodology used in assembling the waste characterization information and how that information was validated by a panel of independent technical reviewers. Also contained in this report are the various data sets created: the master data set, a subset, and an unreviewed data set. The master data set contains waste composition information for Tanks 241-AN-102 and -107; 241-AP-102 and -105; 241-AW-101; and 241-AZ-101 and -102. The subset contains only the validated analytical sample data from the master data set. The unreviewed data set contains all collected but unreviewed sample data for Tanks 241-AN-104, -105, and -106; 241-AP-104; 241-AW-103 and -105; and 241-C-109. The methodology used to review the waste characterization information was found to be an accurate, useful way to separate invalid or questionable data from the more reliable data. In the future, this methodology should be considered when validating waste characterization information.
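The three data sets described above (master, validated subset, unreviewed) amount to a partition of sample records by review and validation status. A minimal sketch follows; the records, field names, and values are hypothetical, not taken from the Hanford data:

```python
# Hypothetical sample records with review/validation flags.
records = [
    {"tank": "241-AN-102", "analyte": "Cs-137", "value": 1.2, "reviewed": True,  "valid": True},
    {"tank": "241-AN-102", "analyte": "NO3",    "value": 8.4, "reviewed": True,  "valid": False},
    {"tank": "241-AN-104", "analyte": "Cs-137", "value": 0.9, "reviewed": False, "valid": False},
]

master = [r for r in records if r["reviewed"]]          # all reviewed data
validated_subset = [r for r in master if r["valid"]]    # validated sample data only
unreviewed = [r for r in records if not r["reviewed"]]  # collected but not yet reviewed

print(len(master), len(validated_subset), len(unreviewed))
```

Keeping the unreviewed records separate rather than discarding them mirrors the report's choice to publish all collected data while flagging its reliability.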

  19. The baryonic self similarity of dark matter

    International Nuclear Information System (INIS)

    Alard, C.

    2014-01-01

    Cosmological simulations indicate that dark matter halos have specific self-similar properties. However, the halo similarity is affected by baryonic feedback. Using momentum-driven winds as a model of the baryon feedback, an equilibrium condition is derived which directly implies the emergence of a new type of similarity. The new self-similar solution has constant acceleration at a reference radius for both dark matter and baryons. This model receives strong support from observations of galaxies. The new self-similar properties imply that the total acceleration at larger distances is scale-free, that the transition between the dark-matter- and baryon-dominated regimes occurs at a constant acceleration, and that the maximum amplitude of the velocity curve at larger distances is proportional to M^(1/4). These results demonstrate that this self-similar model is consistent with the basics of modified Newtonian dynamics (MOND) phenomenology. In agreement with the observations, the coincidence between the self-similar model and MOND breaks down at the scale of clusters of galaxies. Numerical experiments show that the behavior of the density near the origin is closely approximated by an Einasto profile.
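The quoted M^(1/4) scaling of the maximum velocity amplitude can be checked numerically. The reference values v0 and m0 below are arbitrary placeholders, since the abstract fixes only the exponent:

```python
def v_max(mass, v0=200.0, m0=1.0e11):
    """Maximum amplitude of the velocity curve, proportional to mass**(1/4).
    v0 is the amplitude at the (arbitrary) reference mass m0."""
    return v0 * (mass / m0) ** 0.25

# A factor of 16 in mass should exactly double the velocity amplitude,
# since 16**(1/4) = 2.
ratio = v_max(16 * 1.0e11) / v_max(1.0e11)
print(ratio)
```

This is the same quarter-power mass dependence that underlies the baryonic Tully-Fisher-like phenomenology the abstract links to MOND.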

  20. Current status and new trends in the methodology of safety assessment for near surface disposal facilities

    International Nuclear Information System (INIS)

    Ilie, Petre; Didita, Liana; Danchiv, Alexandru

    2008-01-01

    The main goal of this paper is to present the status of the safety assessment methodology at the end of the IAEA CRP 'Application of Safety Assessment Methodology for Near-Surface Radioactive Waste Disposal Facilities (ASAM)', and the new trends outlined at the launch of the follow-up project 'Practical Implementation of Safety Assessment Methodologies in a Context of Safety Case of Near-Surface Facilities (PRISM)'. Over the duration of the ASAM project, the ISAM methodology was confirmed as providing a good framework for conducting safety assessment calculations. At the same time, the ASAM project identified the limitations of the ISAM methodology as currently formulated. The major limitations concern the use of safety assessment to inform practical decisions about alternative waste and risk management strategies for real disposal sites. As a result of these limitations, the PRISM project was established as an extension of the ISAM and ASAM projects. Based on the outcomes of the ASAM project, the main objectives of the PRISM project are: 1 - to develop an overview of what constitutes an adequate safety case and safety assessment with a view to supporting decision-making processes; 2 - to provide practical illustrations of how the safety assessment methodology could be used to address some specific issues arising from the ASAM project and national cases; 3 - to support harmonization with the IAEA's international safety standards. (authors)

  1. Dark Matter

    International Nuclear Information System (INIS)

    Holt, S. S.; Bennett, C. L.

    1995-01-01

    These proceedings represent papers presented at the Astrophysics conference in Maryland, organized by NASA Goddard Space Flight Center and the University of Maryland. The topics covered included low-mass stars as dark matter, dark matter in galaxies and clusters, cosmic microwave background anisotropy, cold and hot dark matter, and the large-scale distribution and motions of galaxies. Eighty-five papers were presented, of which 10 have been abstracted for the Energy Science and Technology database.

  2. Signals of dark matter in a supersymmetric two dark matter model

    International Nuclear Information System (INIS)

    Fukuoka, Hiroki; Suematsu, Daijiro; Toma, Takashi

    2011-01-01

    Supersymmetric radiative neutrino mass models often have two dark matter candidates. One is the usual lightest neutralino, with odd R parity, and the other is a new neutral particle whose stability is guaranteed by a discrete symmetry that forbids tree-level neutrino Yukawa couplings. If their relic abundances are comparable, the dark matter phenomenology can differ substantially from that of the minimal supersymmetric standard model (MSSM). We study this in a supersymmetric radiative neutrino mass model with conserved R parity and a Z_2 symmetry weakly broken by the anomaly effect. The second dark matter candidate, with odd parity under this new Z_2, is metastable and decays to the neutralino dark matter. Charged particles and photons associated with this decay can cause deviations from the expected cosmic-ray background. Direct searches for the neutralino dark matter are also expected to show different features from the MSSM, since the relic abundance is not composed of the neutralino dark matter alone. We discuss the nature of dark matter in this model by analyzing these signals quantitatively.

  3. Teacher's mathematical communication profile in explaining subject matter

    Science.gov (United States)

    Umami, Rohmatul; Budayasa, I. Ketut; Suwarsono, St.

    2017-12-01

    This study aimed to describe a teacher's mathematical communication profile in explaining a subject matter. It is a qualitative study. A junior teacher (i.e., a teacher with 1 to 5 years of experience) teaching high-school mathematics at X-Social Class was selected as the subject of this study. The data were collected by observing the teacher's mathematical communication in explaining a given material (i.e., the rule of sine) in class, followed by an in-depth interview. The results showed that the junior teacher explained the subject matter in a systematic, complete, fluent, and centered manner. She began by reminding students of the previous material related to the current material to be learned, then informed them of the current learning objectives, and finally delivered the subject matter. To support her explanation, the teacher also provided related information, directed the students' attention to the given material by asking them particular related questions, and did not use any confusing terms. However, the study found that some high-school teachers still used less appropriate language in explaining materials.

  4. Revisiting the dose calculation methodologies in European decision support systems

    DEFF Research Database (Denmark)

    Andersson, Kasper Grann; Roos, Per; Hou, Xiaolin

    2012-01-01

    The paper presents examples of current needs for improvement and extended applicability of the European decision support systems. The systems were originally created for prediction of the radiological consequences of accidents at nuclear installations. They could however also be of great value in...... for, to introduce new knowledge and thereby improve prognoses....

  5. Methodological Problems of Nanotechnoscience

    Science.gov (United States)

    Gorokhov, V. G.

    Recently, we have reported on definitions of nanotechnology as a new type of NanoTechnoScience and on nanotheory as a cluster of different natural-science and engineering theories. Nanotechnology is not only a new type of scientific-engineering discipline; it also evolves in a “nonclassical” way. Nanoontology, or the nano scientific world view, serves as a methodological orientation for choosing the theoretical means and methods for solving scientific and engineering problems, which makes it possible to move from one explanation and scientific world view to another. Thus, nanotechnology is both a field of scientific knowledge and a sphere of engineering activity; in other words, NanoTechnoScience is similar to Systems Engineering, understood as the analysis and design of large-scale, complex man/machine systems, but applied to micro- and nanosystems. Nano systems engineering, like macro systems engineering, includes not only systems design but also complex research. The design orientation changes the priorities of this research and the relation to knowledge: not only “knowledge about something”, but knowledge as a means of activity, since from the beginning the control and restructuring of matter at the nanoscale is a necessary element of nanoscience.

  6. PREFACE: Quark Matter 2008

    Science.gov (United States)

    Jan-e Alam; Subhasis Chattopadhyay; Tapan Nayak

    2008-10-01

    Quark Matter 2008—the 20th International Conference on Ultra-Relativistic Nucleus-Nucleus Collisions was held in Jaipur, the Pink City of India, from 4-10 February 2008. Organizing Quark Matter 2008 in India itself indicates the international recognition of the Indian contribution to the field of heavy-ion physics, which was initiated and nurtured by Bikash Sinha, Chair of the conference. The conference was inaugurated by the Honourable Chief Minister of Rajasthan, Smt. Vasundhara Raje, followed by the keynote address by Professor Carlo Rubbia. The scientific programme started with the theoretical overview 'SPS to RHIC and onwards to LHC' by Larry McLerran, followed by several theoretical and experimental overview talks on the ongoing experiments at SPS and RHIC. The future experiments at the LHC, FAIR and J-PARC, along with the theoretical predictions, were discussed in great depth. Lattice QCD predictions on the nature of the phase transition and the critical point were vigorously debated during several plenary and parallel session presentations. The conference was enriched by the presence of an unprecedented number of participants: about 600, representing 31 countries across the globe. This issue contains papers based on plenary talks and oral presentations given at the conference. Besides invited and contributed talks, there were also a large number of poster presentations. Members of the International Advisory Committee played a pivotal role in the selection of speakers, both for plenary and parallel session talks. The contributions of the Organizing Committee in all aspects, from helping to prepare the academic programme down to arranging local hospitality, were much appreciated. We thank the members of both committees for making Quark Matter 2008 a very effective and interesting platform for scientific deliberations. Quark Matter 2008 was financially supported by: Air Liquide (New Delhi) Board of Research Nuclear Sciences (Mumbai) Bose

  7. Baryonic dark matter

    International Nuclear Information System (INIS)

    Uson, Juan M.

    2000-01-01

    Many searches for baryonic dark matter have been conducted but, so far, all have been unsuccessful. Indeed, no more than 1% of the dark matter can be in the form of hydrogen burning stars. It has recently been suggested that most of the baryons in the universe are still in the form of ionized gas so that it is possible that there is no baryonic dark matter. Although it is likely that a significant fraction of the dark matter in the Milky Way is in a halo of non-baryonic matter, the data do not exclude the possibility that a considerable amount, perhaps most of it, could be in a tenuous halo of diffuse ionized gas

  8. Inelastic dark matter

    International Nuclear Information System (INIS)

    Smith, David; Weiner, Neal

    2001-01-01

    Many observations suggest that much of the matter of the universe is nonbaryonic. Recently, the DAMA NaI dark matter direct detection experiment reported an annual modulation in their event rate consistent with a WIMP relic. However, the Cryogenic Dark Matter Search (CDMS) Ge experiment excludes most of the region preferred by DAMA. We demonstrate that if the dark matter can only scatter by making a transition to a slightly heavier state (Δm∼100 keV), the experiments are no longer in conflict. Moreover, differences in the energy spectrum of nuclear recoil events could distinguish such a scenario from the standard WIMP scenario. Finally, we discuss the sneutrino as a candidate for inelastic dark matter in supersymmetric theories

  9. EnergiTools. A methodology for performance monitoring and diagnosis

    International Nuclear Information System (INIS)

    Ancion, P.; Bastien, R.; Ringdahl, K.

    2000-01-01

    EnergiTools is a performance monitoring and diagnostic tool that combines the power of on-line process data acquisition with advanced diagnosis methodologies. Analytical models based on thermodynamic principles are combined with neural networks to validate sensor data and to estimate missing or faulty measurements. Advanced diagnostic technologies are then applied to point out potential faults and areas to be investigated further. The diagnosis methodologies are based on Bayesian belief networks. Expert knowledge is captured in the form of fault-symptom relationships and includes historical information such as the likelihood of faults and symptoms. The methodology produces the likelihood of component failure root causes using the expert knowledge base. EnergiTools is used at the Ringhals nuclear power plant, where it has led to the diagnosis of various performance issues. Three case studies based on this plant's data and model are presented to illustrate the diagnosis support methodologies implemented in EnergiTools. In the first case, the analytical data qualification technique points out several faulty measurements. The application of a neural network for the estimation of the nuclear reactor power by interpreting several plant indicators is then illustrated. The use of the Bayesian belief networks is finally described. (author)
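The fault-symptom reasoning described above can be sketched as a single-fault Bayesian update: each candidate fault's prior is multiplied by the probability of the observed symptom pattern, then the results are normalized. The fault names, priors, and conditional probabilities below are invented for illustration and are not taken from the EnergiTools knowledge base:

```python
# Toy knowledge base: P(fault) and P(symptom present | fault).
priors = {"condenser_fouling": 0.01, "sensor_drift": 0.05}
likelihood = {
    "condenser_fouling": {"low_efficiency": 0.9, "temp_mismatch": 0.2},
    "sensor_drift":      {"low_efficiency": 0.3, "temp_mismatch": 0.8},
}

def posterior(observed):
    """observed maps symptom -> present (True/False).
    Returns the normalized P(fault | evidence), assuming exactly one fault."""
    scores = {}
    for fault, prior in priors.items():
        p = prior
        for symptom, present in observed.items():
            p_sym = likelihood[fault][symptom]
            p *= p_sym if present else (1.0 - p_sym)
        scores[fault] = p
    total = sum(scores.values())
    return {fault: p / total for fault, p in scores.items()}

post = posterior({"low_efficiency": True, "temp_mismatch": False})
print(max(post, key=post.get))
```

A full belief network also encodes dependencies between symptoms; this independence assumption is the simplest special case of the approach the abstract describes.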

  10. White matter microstructure damage in tremor-dominant Parkinson's disease patients

    International Nuclear Information System (INIS)

    Luo, ChunYan; Song, Wei; Chen, Qin; Yang, Jing; Shang, Hui-Fang; Gong, QiYong

    2017-01-01

    Resting tremor is one of the cardinal motor features of Parkinson's disease (PD). Several lines of evidence suggest resting tremor may have different underlying pathophysiological processes from those of bradykinesia and rigidity. The current study aims to identify white matter microstructural abnormalities associated with resting tremor in PD. We recruited 60 patients with PD (30 with tremor-dominant PD and 30 with nontremor-dominant PD) and 26 normal controls. All participants underwent clinical assessment and diffusion tensor MRI. We used tract-based spatial statistics to investigate white matter integrity across the entire white matter tract skeleton. Compared with both healthy controls and the nontremor-dominant PD patients, the tremor-dominant PD patients were characterized by increased mean diffusivity (MD) and axial diffusivity (AD) along multiple white matter tracts, mainly involving the cerebello-thalamo-cortical (CTC) pathway. The mean AD value in clusters with significant difference was correlated with resting tremor score in the tremor-dominant PD patients. There was no significant difference between the nontremor-dominant PD patients and controls. Our results support the notion that resting tremor in PD is a distinct condition in which significant microstructural white matter changes exist and provide evidence for the involvement of the CTC in tremor genesis of PD. (orig.)

  11. Archetype modeling methodology.

    Science.gov (United States)

    Moner, David; Maldonado, José Alberto; Robles, Montserrat

    2018-03-01

    Clinical Information Models (CIMs) expressed as archetypes play an essential role in the design and development of current Electronic Health Record (EHR) information structures. Although there exist many experiences about using archetypes in the literature, a comprehensive and formal methodology for archetype modeling does not exist. Having a modeling methodology is essential to develop quality archetypes, in order to guide the development of EHR systems and to allow the semantic interoperability of health data. In this work, an archetype modeling methodology is proposed. This paper describes its phases, the inputs and outputs of each phase, and the involved participants and tools. It also includes the description of the possible strategies to organize the modeling process. The proposed methodology is inspired by existing best practices of CIMs, software and ontology development. The methodology has been applied and evaluated in regional and national EHR projects. The application of the methodology provided useful feedback and improvements, and confirmed its advantages. The conclusion of this work is that having a formal methodology for archetype development facilitates the definition and adoption of interoperable archetypes, improves their quality, and facilitates their reuse among different information systems and EHR projects. Moreover, the proposed methodology can be also a reference for CIMs development using any other formalism. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. A New Approach for Deep Gray Matter Analysis Using Partial-Volume Estimation.

    Science.gov (United States)

    Bonnier, Guillaume; Kober, Tobias; Schluep, Myriam; Du Pasquier, Renaud; Krueger, Gunnar; Meuli, Reto; Granziera, Cristina; Roche, Alexis

    2016-01-01

    The existence of partial volume effects in brain MR images makes it challenging to understand the physio-pathological alterations underlying signal changes due to pathology across groups of healthy subjects and patients. In this study, we implement a new approach to disentangle gray and white matter alterations in the thalamus and the basal ganglia. The proposed method was applied to a cohort of early multiple sclerosis (MS) patients and healthy subjects to evaluate tissue-specific alterations related to diffuse inflammatory or neurodegenerative processes. Forty-three relapsing-remitting MS (RRMS) patients and nineteen healthy controls (HC) underwent 3T MRI including: (i) fluid-attenuated inversion recovery, double inversion recovery, and magnetization-prepared gradient echo for lesion count, and (ii) T1 relaxometry. We applied a partial volume estimation algorithm to T1 relaxometry maps to estimate gray and white matter local concentrations, as well as T1 values characteristic of gray and white matter, in the thalamus and the basal ganglia. Statistical tests were performed to compare groups in terms of global T1 values, tissue-characteristic T1 values, and tissue concentrations. Significant increases in global T1 values were observed in the thalamus (p = 0.038) and the putamen (p = 0.026) in RRMS patients compared to HC. In the thalamus, the T1 increase was associated with a significant increase in gray matter characteristic T1 (p = 0.0016), with no significant effect in white matter. The presented methodology provides additional information to standard MR signal averaging approaches and holds promise for identifying the presence and nature of diffuse pathology in neuro-inflammatory and neurodegenerative diseases.
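The tissue-mixture idea behind partial-volume estimation can be illustrated by its simplest two-compartment case: a voxel's T1 is modelled as a concentration-weighted mix of characteristic gray- and white-matter T1 values, which can be inverted for the gray-matter fraction. The numbers below are illustrative placeholders; the actual algorithm estimates concentrations and characteristic T1 values jointly rather than assuming them:

```python
def gm_concentration(t1_voxel, t1_gm=1400.0, t1_wm=900.0):
    """Invert t1_voxel = c * t1_gm + (1 - c) * t1_wm for the gray-matter
    fraction c, clamped to the physically meaningful range [0, 1].
    T1 values are in milliseconds (hypothetical characteristic values)."""
    c = (t1_voxel - t1_wm) / (t1_gm - t1_wm)
    return max(0.0, min(1.0, c))

print(gm_concentration(1150.0))  # halfway between the two characteristic T1s
```

Separating concentration changes from changes in the characteristic T1 values themselves is what lets the method distinguish tissue loss from tissue alteration.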

  13. Spanish methodological approach for biosphere assessment of radioactive waste disposal

    International Nuclear Information System (INIS)

    Agüero, A.; Pinedo, P.; Cancio, D.; Simón, I.; Moraleda, M.; Pérez-Sánchez, D.; Trueba, C.

    2007-01-01

    The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS 'Reference Biospheres Methodology' and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates

  15. Qualitative case study methodology in nursing research: an integrative review.

    Science.gov (United States)

    Anthony, Susan; Jack, Susan

    2009-06-01

    This paper is a report of an integrative review conducted to critically analyse the contemporary use of qualitative case study methodology in nursing research. Increasing complexity in health care and increasing use of case study in nursing research support the need for current examination of this methodology. In 2007, a search for case study research (published 2005-2007) indexed in the CINAHL, MEDLINE, EMBASE, PsychINFO, Sociological Abstracts and SCOPUS databases was conducted. A sample of 42 case study research papers met the inclusion criteria. Whittemore and Knafl's integrative review method guided the analysis. Confusion exists about the name, nature and use of case study. This methodology, including terminology and concepts, is often invisible in qualitative study titles and abstracts. Case study is an exclusive methodology and an adjunct to exploring particular aspects of phenomena under investigation in larger or mixed-methods studies. A high quality of case study exists in nursing research. Judicious selection and diligent application of literature review methods promote the development of nursing science. Case study is becoming entrenched in the nursing research lexicon as a well-accepted methodology for studying phenomena in health and social care, and its growing use warrants continued appraisal to promote nursing knowledge development. Attention to all case study elements, process and publication is important in promoting authenticity, methodological quality and visibility.

  16. An evaluation of methodology for seismic qualification of equipment, cable trays, and ducts in ALWR plants by use of experience data

    International Nuclear Information System (INIS)

    Bandyopadhyay, K.K.; Kana, D.D.; Kennedy, R.P.; Schiff, A.J.

    1997-07-01

    Advanced Reactor Corporation (ARC) has developed a methodology for seismic qualification of equipment, cable trays and ducts in Advanced Light Water Reactor plants. A Panel (members of which acted as individuals) supported by the Office of Nuclear Regulatory Research of the Nuclear Regulatory Commission has evaluated this methodology. The review approach and observations are included in this report. In general, the Panel supports the ARC methodology with some exceptions and provides recommendations for further improvements. 26 refs., 10 figs., 1 tab

  17. An evaluation of methodology for seismic qualification of equipment, cable trays, and ducts in ALWR plants by use of experience data

    Energy Technology Data Exchange (ETDEWEB)

    Bandyopadhyay, K.K.; Kana, D.D.; Kennedy, R.P.; Schiff, A.J.

    1997-07-01

    Advanced Reactor Corporation (ARC) has developed a methodology for seismic qualification of equipment, cable trays and ducts in Advanced Light Water Reactor plants. A Panel (members of which acted as individuals) supported by the Office of Nuclear Regulatory Research of the Nuclear Regulatory Commission has evaluated this methodology. The review approach and observations are included in this report. In general, the Panel supports the ARC methodology with some exceptions and provides recommendations for further improvements. 26 refs., 10 figs., 1 tab.

  18. Mapping Dark Matter in Simulated Galaxy Clusters

    Science.gov (United States)

    Bowyer, Rachel

    2018-01-01

    Galaxy clusters are the most massive bound objects in the Universe with most of their mass being dark matter. Cosmological simulations of structure formation show that clusters are embedded in a cosmic web of dark matter filaments and large scale structure. It is thought that these filaments are found preferentially close to the long axes of clusters. We extract galaxy clusters from the simulations "cosmo-OWLS" in order to study their properties directly and also to infer their properties from weak gravitational lensing signatures. We investigate various stacking procedures to enhance the signal of the filaments and large scale structure surrounding the clusters to better understand how the filaments of the cosmic web connect with galaxy clusters. This project was supported in part by the NSF REU grant AST-1358980 and by the Nantucket Maria Mitchell Association.

  19. Effectiveness of the Size Matters Handwriting Program.

    Science.gov (United States)

    Pfeiffer, Beth; Rai, Gillian; Murray, Tammy; Brusilovskiy, Eugene

    2015-04-01

    The purpose of the research was to study changes in handwriting legibility among kindergarten, first- and second-grade students in response to the Size Matters curricular-based handwriting program. A two-group pre-post-test design was implemented at two public schools with half of the classrooms assigned to receive the Size Matters program and the other continuing to receive standard instruction. All participants completed two standardized handwriting measures at pre-test and after 40 instructional sessions were completed with the classes receiving the handwriting program. Results identified significant changes in legibility in the handwriting intervention group for all three grades when compared with the standard instruction group. The results of this study support the use of a curricular-embedded handwriting program and provide the foundation for future research examining the impact of handwriting legibility on learning outcomes.

  20. Supersymmetric dark-matter Q-balls and their interactions in matter

    International Nuclear Information System (INIS)

    Kusenko, Alexander; Loveridge, Lee C.; Shaposhnikov, Mikhail

    2005-01-01

    Supersymmetric extensions of the Standard Model contain nontopological solitons, Q-balls, which can be stable and can be a form of cosmological dark matter. Understanding the interaction of SUSY Q-balls with matter fermions is important for both astrophysical limits and laboratory searches for these dark-matter candidates. We show that a baryon scattering off a baryonic SUSY Q-ball can convert into its antiparticle with a high probability, while the baryon number of the Q-ball is increased by two units. For a SUSY Q-ball interacting with matter, this process dominates over those previously discussed in the literature

  1. Level 2 PSA methodology and severe accident management

    International Nuclear Information System (INIS)

    1997-01-01

    The objective of the work was to review current Level 2-PSA (Probabilistic Safety Assessment) methodologies and practices and to investigate how Level 2-PSA can support severe accident management programmes, i.e. the development, implementation, training and optimisation of accident management strategies and measures. For the most part, the presented material reflects the state in 1996. Current Level 2 PSA results and methodologies are reviewed and evaluated with respect to plant type specific and generic insights. Approaches and practices for using PSA results in the regulatory context and for supporting severe accident management programmes by input from level 2 PSAs are examined. The work is based on information contained in: PSA procedure guides, PSA review guides and regulatory guides for the use of PSA results in risk informed decision making; plant specific PSAs and PSA related literature exemplifying specific procedures, methods, analytical models, relevant input data and important results, use of computer codes and results of code calculations. The PSAs are evaluated with respect to results and insights. In the conclusion section, the present state of risk informed decision making, in particular in the level 2 domain, is described and substantiated by relevant examples

  2. Exothermic dark matter

    International Nuclear Information System (INIS)

    Graham, Peter W.; Saraswat, Prashant; Harnik, Roni; Rajendran, Surjeet

    2010-01-01

    We propose a novel mechanism for dark matter to explain the observed annual modulation signal at DAMA/LIBRA which avoids existing constraints from every other dark matter direct detection experiment including CRESST, CDMS, and XENON10. The dark matter consists of at least two light states with mass ∼few GeV and splittings ∼5 keV. It is natural for the heavier states to be cosmologically long-lived and to make up an O(1) fraction of the dark matter. Direct detection rates are dominated by the exothermic reactions in which an excited dark matter state downscatters off of a nucleus, becoming a lower energy state. In contrast to (endothermic) inelastic dark matter, the most sensitive experiments for exothermic dark matter are those with light nuclei and low threshold energies. Interestingly, this model can also naturally account for the observed low-energy events at CoGeNT. The only significant constraint on the model arises from the DAMA/LIBRA unmodulated spectrum but it can be tested in the near future by a low-threshold analysis of CDMS-Si and possibly other experiments including CRESST, COUPP, and XENON100.
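The sensitivity claim above can be made concrete with the standard inelastic-scattering kinematic relation (a textbook formula from the inelastic dark matter literature, not reproduced from this abstract):

```latex
% Minimum dark matter speed required to produce a nuclear recoil of energy E_R
% on a nucleus of mass m_N, with dark matter-nucleus reduced mass \mu and
% mass splitting \delta (\delta < 0 for exothermic downscattering):
v_{\min}(E_R) \;=\; \frac{1}{\sqrt{2\, m_N E_R}}\,
    \left|\, \frac{m_N E_R}{\mu} + \delta \,\right|
```

Because a negative δ lowers v_min at small recoil energies, light nuclei with low threshold energies probe the largest portion of the halo velocity distribution, which is why such experiments are the most sensitive to exothermic dark matter.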

  3. Hybrid Dark Matter

    OpenAIRE

    Chao, Wei

    2018-01-01

Dark matter can be produced in the early universe via the freeze-in or freeze-out mechanisms. Both scenarios were investigated in the literature, but the production of dark matter via a combination of these two mechanisms has not been addressed. In this paper we propose a hybrid dark matter model where dark matter has two components, with one component produced thermally and the other produced non-thermally. We present for the first time the analytical calculation for the relic abundance of th...

  4. Speech Matters

    DEFF Research Database (Denmark)

    Hasse Jørgensen, Stina

    2011-01-01

About Speech Matters - Katarina Gregos, the Greek curator's exhibition at the Danish Pavilion, the Venice Biennale 2011.

  5. Organization of the Master Tutor in Higher Education: Methodological Support

    OpenAIRE

    Asya Suchanu

    2013-01-01

The article reveals the distinctive features of tutor support in the preparation of future teachers in the humanities within the master's programme, and the ways and means of professional development of tomorrow's specialists. It substantiates the importance and content of the tutor's pedagogical help to first-year students, which manifests itself in optimizing individual learning trajectories, leading to effective self-fulfilment and positive socialization of students.

  6. A comparison of the character of algal extracellular versus cellular organic matter produced by cyanobacterium, diatom and green alga

    Czech Academy of Sciences Publication Activity Database

    Pivokonský, Martin; Šafaříková, Jana; Barešová, Magdalena; Pivokonská, Lenka; Kopecká, Ivana

    2014-01-01

Roč. 51, March (2014), s. 37-46 ISSN 0043-1354 R&D Projects: GA AV ČR IAA200600902 Institutional support: RVO:67985874 Keywords: Algal organic matter * Extracellular organic matter * Cellular organic matter * Peptide/protein content * Hydrophobicity * Molecular weight fractionation Subject RIV: BK - Fluid Dynamics Impact factor: 5.528, year: 2014 http://www.sciencedirect.com/science/article/pii/S004313541301021X

  7. Divergent discourse between protests and counter-protests: #BlackLivesMatter and #AllLivesMatter.

    Science.gov (United States)

    Gallagher, Ryan J; Reagan, Andrew J; Danforth, Christopher M; Dodds, Peter Sheridan

    2018-01-01

Since the shooting of Black teenager Michael Brown by White police officer Darren Wilson in Ferguson, Missouri, the protest hashtag #BlackLivesMatter has amplified critiques of extrajudicial killings of Black Americans. In response to #BlackLivesMatter, other Twitter users have adopted #AllLivesMatter, a counter-protest hashtag whose content argues that equal attention should be given to all lives regardless of race. Through a multi-level analysis of over 860,000 tweets, we study how these protests and counter-protests diverge by quantifying aspects of their discourse. We find that #AllLivesMatter facilitates opposition between #BlackLivesMatter and hashtags such as #PoliceLivesMatter and #BlueLivesMatter in such a way that historically echoes the tension between Black protesters and law enforcement. In addition, we show that a significant portion of #AllLivesMatter use stems from hijacking by #BlackLivesMatter advocates. Beyond simply injecting #AllLivesMatter with #BlackLivesMatter content, these hijackers use the hashtag to directly confront the counter-protest notion of "All lives matter." Our findings suggest that the Black Lives Matter movement was able to grow, exhibit diverse conversations, and avoid derailment on social media by making discussion of counter-protest opinions a central topic of #AllLivesMatter, rather than the movement itself.
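As a toy illustration of the kind of co-occurrence measurement used in such hashtag studies (the tweets and the simple proxy below are invented, not the authors' data or code):

```python
# Invented example tweets; the fraction of #AllLivesMatter tweets that also
# carry #BlackLivesMatter content is a crude proxy for the hashtag
# hijacking the authors quantify.

tweets = [
    "All lives can't matter until Black lives matter #AllLivesMatter #BlackLivesMatter",
    "Equal attention for everyone #AllLivesMatter",
    "#AllLivesMatter #BlueLivesMatter",
    "#BlackLivesMatter justice for Ferguson",
]

def cooccurrence_share(tweets, anchor, other):
    """Among tweets containing `anchor`, the share that also contain `other`."""
    anchored = [t.lower() for t in tweets if anchor.lower() in t.lower()]
    if not anchored:
        return 0.0
    return sum(other.lower() in t for t in anchored) / len(anchored)

rate = cooccurrence_share(tweets, "#AllLivesMatter", "#BlackLivesMatter")
print(rate)  # 1 of the 3 #AllLivesMatter tweets also uses #BlackLivesMatter
```

A real analysis would of course tokenize hashtags properly and work at the scale of hundreds of thousands of tweets; this only shows the shape of the measurement.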

  8. Building an integrated methodology of learning that can optimally support improvements in healthcare.

    Science.gov (United States)

    Lynn, Joanne

    2011-04-01

    The methods for healthcare reform are strikingly underdeveloped, with much reliance on political power. A methodology that combined methods from sources such as clinical trials, experience-based wisdom, and improvement science could be among the aims of the upcoming work in the USA on comparative effectiveness and on the agenda of the Center for Medicare and Medicaid Innovation in the Centers for Medicare and Medicaid Services. Those working in quality improvement have an unusual opportunity to generate substantial input into these processes through professional organisations such as the Academy for Healthcare Improvement and dominant leadership organisations such as the Institute for Healthcare Improvement.

  9. System Anthropological Psychology: Methodological Foundations

    Directory of Open Access Journals (Sweden)

    Vitaliy Y. Klochko

    2012-01-01

Full Text Available The article considers the methodological foundations of system anthropological psychology (SAP) as a scientific branch developed by a well-represented group of Siberian scientists. SAP is a theory based on the axiomatics of the cultural-historical psychology of L.S. Vygotsky and on transspective analysis, a specially developed means to define the tendencies of science developing as a self-organizing system. Transspective analysis has revealed regularities in the constantly growing complexity of professional-psychological thinking along the course of emergence of scientific cognition. It has proved that the field of modern psychology is shaped by theories constructed with ideation of different grades of complexity. The concept "dynamics of the paradigm of science" is introduced; it allows transitions to be acknowledged from the ordinary-binary logic characteristic of classical science to a binary-ternary logic, adequate to non-classical science, and then to a ternary-multidimensional logic, which is now at the stage of emergence. The latter is employed in SAP construction. It involves the following basic methodological principles: the principle of directed (selective) interaction and the principle of the generative effect of selective interaction. The concept of "complementary interaction", applied in the natural as well as the humanitarian sciences, is reconsidered in the context of psychology. The conclusion is made that the principle of selectivity and directedness of interaction is relevant to the whole Universe, embracing all kinds of systems including living ones. Different levels of matter organization, representing semantic structures of various complexity, use one and the same principle of meaning making, through which the Universe ensures its sustainability as a self-developing phenomenon. This methodology provides an explanation for the nature and stages of emergence of the multidimensional life space of an individual, which comes as a foundation for generation of such features of

  10. Dark matters

    International Nuclear Information System (INIS)

    Silk, Joseph

    2010-01-01

    One of the greatest mysteries in the cosmos is that it is mostly dark. That is, not only is the night sky dark, but also most of the matter and the energy in the universe is dark. For every atom visible in planets, stars and galaxies today there exists at least five or six times as much 'Dark Matter' in the universe. Astronomers and particle physicists today are seeking to unravel the nature of this mysterious but pervasive dark matter, which has profoundly influenced the formation of structure in the universe. Dark energy remains even more elusive, as we lack candidate fields that emerge from well established physics. I will describe various attempts to measure dark matter by direct and indirect means, and discuss the prospects for progress in unravelling dark energy.

  11. Indoor/outdoor Particulate Matter Number and Mass Concentration in Modern Offices

    Czech Academy of Sciences Publication Activity Database

    Chatoutsidou, S.E.; Ondráček, Jakub; Tesař, Ondřej; Tørseth, K.; Ždímal, Vladimír; Lazaridis, M.

    2015-01-01

    Roč. 92, OCT 2015 (2015), s. 462-474 ISSN 0360-1323 EU Projects: European Commission(XE) 315760 Institutional support: RVO:67985858 Keywords : modern offices * particulate matter * mechanical ventilation Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 3.394, year: 2015

  12. A methodology for supporting decisions on the establishment of protective measures after severe nuclear accidents

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Kollas, J.G.

    1994-06-01

The objective of this report is to demonstrate the use of a methodology supporting decisions on protective measures following severe nuclear accidents. A multicriteria decision analysis approach is recommended, where value tradeoffs are postponed until the very last stage of the decision process. Efficient frontiers are used to exclude all technically inferior solutions and to present the decision maker with all nondominated solutions. A choice among these solutions implies a value trade-off among the multiple criteria. An interactive computer package has been developed in which the decision maker can choose a point on the efficient frontier in the consequence space and immediately see the alternative in the decision space resulting in the chosen consequences. The methodology is demonstrated through an application to the choice among possible protective measures in contaminated areas of the former USSR after the Chernobyl accident. Two distinct cases are considered: first, a decision is to be made only on the basis of the level of soil contamination with Cs-137 and the total cost of the chosen protective policy; next, the decision is based on the geographic dimension of the contamination and the total cost. Three alternative countermeasure actions are considered for population segments living on soil contaminated at a certain level or in a specific geographic region: (a) relocation of the population; (b) improvement of the living conditions; and (c) no countermeasures at all. This is the final deliverable of the CEC-CIS Joint Study Project 2, Task 5: Decision-Aiding-System for Establishing Intervention Levels, performed under Contracts COSU-CT91-0007 and COSU-CT92-0021 with the Commission of the European Communities through CEPN
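The efficient-frontier screening described in the abstract can be sketched in a few lines. The policies and criterion values below are hypothetical, with residual dose and total cost both to be minimized:

```python
# A policy is dominated if some other policy is at least as good on both
# criteria and strictly better on at least one; the efficient frontier is
# the set of nondominated policies left for the decision maker.

def nondominated(alternatives):
    """Return names of alternatives not dominated by any other."""
    frontier = []
    for name, dose, cost in alternatives:
        dominated = any(
            (d2 <= dose and c2 <= cost) and (d2 < dose or c2 < cost)
            for _, d2, c2 in alternatives
        )
        if not dominated:
            frontier.append(name)
    return frontier

# Hypothetical (residual dose, total cost) pairs, arbitrary units:
policies = [
    ("relocation",          0.1, 9.0),   # low dose, high cost
    ("improved conditions", 0.4, 3.0),
    ("no countermeasures",  1.0, 0.0),
    ("partial relocation",  0.5, 5.0),   # dominated by "improved conditions"
]

print(nondominated(policies))  # ['relocation', 'improved conditions', 'no countermeasures']
```

Only a choice among the three surviving policies involves a genuine value trade-off between dose and cost, which is exactly the choice the report leaves to the decision maker.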

  13. Concentrated dark matter: Enhanced small-scale structure from codecaying dark matter

    OpenAIRE

    Dror, Jeff A.; Kuflik, Eric; Melcher, Brandon; Watson, Scott

    2018-01-01

    We study the cosmological consequences of codecaying dark matter—a recently proposed mechanism for depleting the density of dark matter through the decay of nearly degenerate particles. A generic prediction of this framework is an early dark matter dominated phase in the history of the Universe, that results in the enhanced growth of dark matter perturbations on small scales. We compute the duration of the early matter dominated phase and show that the perturbations are robust against washout...

  14. Hybrid probabilistic and possibilistic safety assessment. Methodology and application

    International Nuclear Information System (INIS)

    Kato, Kazuyuki; Amano, Osamu; Ueda, Hiroyoshi; Ikeda, Takao; Yoshida, Hideji; Takase, Hiroyasu

    2002-01-01

    This paper presents a unified methodology to handle variability and ignorance by using probabilistic and possibilistic techniques respectively. The methodology has been applied to the safety assessment of geological disposal of high-level radioactive waste. Uncertainties associated with scenarios, models and parameters were defined in terms of fuzzy membership functions derived through a series of interviews to the experts, while variability was formulated by means of probability density functions (pdfs) based on available data sets. The exercise demonstrated the applicability of the new methodology and, in particular, its advantage in quantifying uncertainties based on expert opinion and in providing information on the dependence of assessment results on the level of conservatism. In addition, it was shown that sensitivity analysis can identify key parameters contributing to uncertainties associated with results of the overall assessment. The information mentioned above can be utilized to support decision-making and to guide the process of disposal system development and optimization of protection against potential exposure. (author)
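A minimal sketch of the hybrid idea, with all numbers invented: ignorance is carried as a triangular fuzzy number evaluated at an alpha-cut (the level of conservatism), while variability is sampled from a pdf:

```python
import random

# Hypothetical assessment: an uncertain model factor (ignorance, fuzzy) scales
# a variable release term (variability, sampled from a lognormal pdf).

def triangular_alpha_cut(low, mode, high, alpha):
    """Interval of a triangular fuzzy number at membership level alpha."""
    return (low + alpha * (mode - low), high - alpha * (high - mode))

def assess(alpha, n=10000, seed=1):
    rng = random.Random(seed)
    k_lo, k_hi = triangular_alpha_cut(0.5, 1.0, 2.0, alpha)  # fuzzy model factor
    lo_sum = hi_sum = 0.0
    for _ in range(n):
        release = rng.lognormvariate(0.0, 0.5)  # variable parameter (pdf)
        lo_sum += k_lo * release
        hi_sum += k_hi * release
    return lo_sum / n, hi_sum / n  # mean-dose interval at this alpha level

lo, hi = assess(alpha=0.5)    # mid-level conservatism: an interval result
lo1, hi1 = assess(alpha=1.0)  # alpha = 1: fuzziness collapses to the mode
```

The width of the resulting interval shows how strongly the conclusion depends on the chosen level of conservatism, which is the dependence the exercise demonstrated.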

  15. Capturing Data Connections within the Climate Data Initiative to Support Resiliency

    Science.gov (United States)

    Ramachandran, R.; Bugbee, K.; Weigel, A. M.; Tilmes, C.

    2015-12-01

The Climate Data Initiative (CDI) focuses on preparing the United States for the impacts of climate change by leveraging existing federal climate-relevant data to stimulate innovation and private-sector entrepreneurship supporting national climate-change preparedness. To achieve these goals, relevant data was curated around seven thematic areas relevant to climate change resiliency. Data for each theme was selected by subject matter experts from various Federal agencies and collected in Data.gov at http://climate.data.gov. While the curation effort for each theme has been immensely valuable on its own, in the end the themes essentially become a long directory or list, and the valuable connections between datasets and their intended uses are lost. The user therefore understands that the datasets in the list have been approved by the CDI subject matter experts, but has less certainty when making connections between the various datasets and their possible applications. Additionally, the curated list can be overwhelming, and the intended use of individual datasets can be difficult to interpret. In order to better address the needs of the CDI data end users, the CDI team has been developing a new controlled vocabulary that will assist in capturing connections between datasets. This new vocabulary will be implemented in the Global Change Information System (GCIS), which has the capability to link individual items within the system. This presentation will highlight the methodology used to develop the controlled vocabulary that will aid end users in both understanding and locating relevant datasets for their intended use.
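The kind of linkage such a controlled vocabulary enables can be sketched as subject-predicate-object triples. The dataset names, theme and predicates below are invented for illustration; GCIS itself manages links of this general shape:

```python
# Hypothetical triples connecting datasets to a theme and to intended uses.

triples = [
    ("sea_level_trends", "memberOf", "Coastal Flooding"),
    ("sea_level_trends", "usedFor", "flood risk mapping"),
    ("storm_surge_model", "memberOf", "Coastal Flooding"),
    ("storm_surge_model", "usedFor", "evacuation planning"),
]

def datasets_for(theme):
    """All datasets linked to a theme via the memberOf predicate."""
    return sorted({s for s, p, o in triples if p == "memberOf" and o == theme})

def uses_of(dataset):
    """All intended uses recorded for a dataset."""
    return sorted(o for s, p, o in triples if s == dataset and p == "usedFor")

print(datasets_for("Coastal Flooding"))  # ['sea_level_trends', 'storm_surge_model']
```

With such links in place, a user starting from a theme can navigate to datasets and from a dataset to its curated applications, rather than scanning a flat directory.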

  16. Predicting human miRNA target genes using a novel evolutionary methodology

    KAUST Repository

    Aigli, Korfiati; Kleftogiannis, Dimitrios A.; Konstantinos, Theofilatos; Spiros, Likothanassis; Athanasios, Tsakalidis; Seferina, Mavroudi

    2012-01-01

The discovery of miRNAs had great impacts on traditional biology. Typically, miRNAs have the potential to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. The experimental identification of their targets has many drawbacks including cost, time and low specificity, and these are the reasons why many computational approaches have been developed so far. However, existing computational approaches do not include any advanced feature selection technique and they are facing problems concerning their classification performance and their interpretability. In the present paper, we propose a novel hybrid methodology which combines genetic algorithms and support vector machines in order to locate the optimal feature subset while achieving high classification performance. The proposed methodology was compared with two of the most promising existing methodologies in the problem of predicting human miRNA targets. Our approach outperforms existing methodologies in terms of classification performances while selecting a much smaller feature subset. © 2012 Springer-Verlag.
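A toy sketch of the wrapper approach the paper describes, with everything invented for illustration: a genetic algorithm searches over feature-subset bitmasks, and a simple nearest-centroid rule stands in for the SVM when evaluating fitness:

```python
import random

def make_data(n=200, seed=0):
    """Synthetic two-class data: 3 informative features plus 5 noise features."""
    rng = random.Random(seed)
    data = []
    for _ in range(n):
        label = rng.randint(0, 1)
        informative = [rng.gauss(label * 2.0, 1.0) for _ in range(3)]
        noise = [rng.gauss(0.0, 1.0) for _ in range(5)]
        data.append((informative + noise, label))
    return data

def accuracy(mask, data):
    """Nearest-centroid classification accuracy using only the selected features."""
    idx = [i for i, bit in enumerate(mask) if bit]
    if not idx:
        return 0.0
    cents = {}
    for lab in (0, 1):
        rows = [x for x, y in data if y == lab]
        cents[lab] = [sum(r[i] for r in rows) / len(rows) for i in idx]
    hits = 0
    for x, y in data:
        d = {lab: sum((x[i] - c) ** 2 for i, c in zip(idx, cents[lab]))
             for lab in (0, 1)}
        hits += int(min(d, key=d.get) == y)
    return hits / len(data)

def ga_select(data, n_feat=8, pop=20, gens=15, seed=0):
    """Evolve feature-subset bitmasks; fitness lightly penalizes subset size."""
    rng = random.Random(seed)
    fit = lambda m: accuracy(m, data) - 0.01 * sum(m)
    population = [[rng.randint(0, 1) for _ in range(n_feat)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fit, reverse=True)
        parents = population[: pop // 2]          # truncation selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_feat)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # bit-flip mutation
                j = rng.randrange(n_feat)
                child[j] = 1 - child[j]
            children.append(child)
        population = parents + children
    return max(population, key=fit)

data = make_data()
best = ga_select(data)
```

The size penalty in the fitness is what pushes the search toward the small, high-performing subsets the paper reports; a real implementation would evaluate each mask with a cross-validated SVM instead of the centroid rule.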

  17. Predicting human miRNA target genes using a novel evolutionary methodology

    KAUST Repository

    Aigli, Korfiati

    2012-01-01

The discovery of miRNAs had great impacts on traditional biology. Typically, miRNAs have the potential to bind to the 3' untranslated region (UTR) of their mRNA target genes for cleavage or translational repression. The experimental identification of their targets has many drawbacks including cost, time and low specificity, and these are the reasons why many computational approaches have been developed so far. However, existing computational approaches do not include any advanced feature selection technique and they are facing problems concerning their classification performance and their interpretability. In the present paper, we propose a novel hybrid methodology which combines genetic algorithms and support vector machines in order to locate the optimal feature subset while achieving high classification performance. The proposed methodology was compared with two of the most promising existing methodologies in the problem of predicting human miRNA targets. Our approach outperforms existing methodologies in terms of classification performances while selecting a much smaller feature subset. © 2012 Springer-Verlag.

  18. ACADEMIC INTEGRITY SUPPORT SYSTEM FOR UKRAINIAN UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    V. G. Sherstjuk

    2017-04-01

Full Text Available Purpose. To develop a methodology for providing academic integrity in the university. The methodology is based on a Web-oriented academic integrity support system, developed by the authors, which forms part of the information system for learning process control. The academic integrity support system is aimed at maintaining academic integrity as a basic institutional value, which will help to reduce corruption, plagiarism and other types of academic dishonesty. Methodology. The approach to the problem is based on the development of an information system for educational process control with integral elements of quality control, of which the information subsystem for academic integrity support is the basic part. Findings. The proposed information system supports the following levels: monitoring of the educational process; audit of internal processes, which is necessary for developing an effective quality control system; assessment of the achievements of educational process participants; and formalization of the interaction of educational process participants. The system is aimed at the development of a new academic society based on the following principles: open access to information, by which the access of a wide audience to information enables participation, forming a sense of responsibility and social control; transparency of information, by which its relevance, quality and reliability are meant; responsibility of all members of the educational process; measurability, by which any action in the educational process should be measurable; detail in describing actions, results and processes; and support, by which automatic tools are meant for realizing the principles of open access to information, transparency of information, responsibility of all participants of the educational process, measurability and detail. The practical realization of the information system is based on the development of a common repository of university information.

  19. Conducting compositions of matter

    Science.gov (United States)

    Viswanathan, Tito (Inventor)

    2000-01-01

    The invention provides conductive compositions of matter, as well as methods for the preparation of the conductive compositions of matter, solutions comprising the conductive compositions of matter, and methods of preparing fibers or fabrics having improved anti-static properties employing the conductive compositions of matter.

  20. Mixed-Reality Prototypes to Support Early Creative Design

    Science.gov (United States)

    Safin, Stéphane; Delfosse, Vincent; Leclercq, Pierre

    The domain we address is creative design, mainly architecture. Rooted in a multidisciplinary approach as well as a deep understanding of architecture and design, our method aims at proposing adapted mixed-reality solutions to support two crucial activities: sketch-based preliminary design and distant synchronous collaboration in design. This chapter provides a summary of our work on a mixed-reality device, based on a drawing table (the Virtual Desktop), designed specifically to address real-life/business-focused issues. We explain our methodology, describe the two supported activities and the related users’ needs, detail the technological solution we have developed, and present the main results of multiple evaluation sessions. We conclude with a discussion of the usefulness of a profession-centered methodology and the relevance of mixed reality to support creative design activities.

  1. Stars of strange matter

    International Nuclear Information System (INIS)

    Bethe, H.A.; Brown, G.E.; Cooperstein, J.

    1987-01-01

We investigate suggestions that quark matter with strangeness per baryon of order unity may be stable. We model this matter at nuclear matter densities as a gas of close-packed Λ-particles. From the known mass of the Λ-particle we obtain an estimate of the energy and chemical potential of strange matter at nuclear densities. These are sufficiently high to preclude any phase transition from neutron matter to strange matter in the region near nucleon matter density. Including effects from gluon exchange phenomenologically, we investigate higher densities, consistently making approximations which underestimate the density of transition. In this way we find a transition density ρ_tr ≳ 7ρ_0, where ρ_0 is nuclear matter density. This is not far from the maximum density in the center of the most massive neutron stars that can be constructed. Since we have underestimated ρ_tr and still find it to be ≈ 7ρ_0, we do not believe that the transition from neutron to quark matter is likely in neutron stars. Moreover, measured masses of observed neutron stars are ≅ 1.4 M_sun, where M_sun is the solar mass. For such masses, the central (maximum) density is ρ_c < 7ρ_0. Transition to quark matter is certainly excluded for these densities. (orig.)

  2. Dynamical matter-parity breaking and gravitino dark matter

    International Nuclear Information System (INIS)

    Schmidt, Jonas; Weniger, Christoph; Yanagida, Tsutomu T.; Tokyo Univ.

    2010-10-01

Scenarios where gravitinos with GeV masses make up dark matter are known to be in tension with high reheating temperatures, as required by e.g. thermal leptogenesis. This tension comes from the longevity of the NLSPs, which can destroy the successful predictions of the standard primordial nucleosynthesis. However, a small violation of matter parity can open new decay channels for the NLSP, avoiding the BBN problems, while being compatible with experimental cosmic-ray constraints. In this paper, we propose a model where matter parity, which we assume to be embedded in the U(1)_B-L gauge symmetry, is broken dynamically in a hidden sector at low scales. This can naturally explain the smallness of the matter parity breaking in the visible sector. We discuss the dynamics of the corresponding pseudo Nambu-Goldstone modes of B-L breaking in the hidden sector, and we comment on typical cosmic-ray and collider signatures in our model. (orig.)

  3. Dynamical matter-parity breaking and gravitino dark matter

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, Jonas; Weniger, Christoph [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Tokyo Univ. (JP). Inst. for the Physics and Mathematics of the Universe (IPMU); Yanagida, Tsutomu T. [Tokyo Univ. (JP). Inst. for the Physics and Mathematics of the Universe (IPMU); Tokyo Univ. (Japan). Dept. of Physics

    2010-10-15

Scenarios where gravitinos with GeV masses make up dark matter are known to be in tension with high reheating temperatures, as required by e.g. thermal leptogenesis. This tension comes from the longevity of the NLSPs, which can destroy the successful predictions of the standard primordial nucleosynthesis. However, a small violation of matter parity can open new decay channels for the NLSP, avoiding the BBN problems, while being compatible with experimental cosmic-ray constraints. In this paper, we propose a model where matter parity, which we assume to be embedded in the U(1)_B-L gauge symmetry, is broken dynamically in a hidden sector at low scales. This can naturally explain the smallness of the matter parity breaking in the visible sector. We discuss the dynamics of the corresponding pseudo Nambu-Goldstone modes of B-L breaking in the hidden sector, and we comment on typical cosmic-ray and collider signatures in our model. (orig.)

  4. Dark matter universe

    Science.gov (United States)

    Bahcall, Neta A.

    2015-01-01

    Most of the mass in the universe is in the form of dark matter—a new type of nonbaryonic particle not yet detected in the laboratory or in other detection experiments. The evidence for the existence of dark matter through its gravitational impact is clear in astronomical observations—from the early observations of the large motions of galaxies in clusters and the motions of stars and gas in galaxies, to observations of the large-scale structure in the universe, gravitational lensing, and the cosmic microwave background. The extensive data consistently show the dominance of dark matter and quantify its amount and distribution, assuming general relativity is valid. The data inform us that the dark matter is nonbaryonic, is “cold” (i.e., moves nonrelativistically in the early universe), and interacts only weakly with matter other than by gravity. The current Lambda cold dark matter cosmology—a simple (but strange) flat cold dark matter model dominated by a cosmological constant Lambda, with only six basic parameters (including the density of matter and of baryons, the initial mass-fluctuation amplitude and its scale dependence, and the age of the universe and of the first stars)—fits all the accumulated data remarkably well. However, what is the dark matter? This is one of the most fundamental open questions in cosmology and particle physics. Its existence requires an extension of our current understanding of particle physics or otherwise points to a modification of gravity on cosmological scales. The exploration and ultimate detection of dark matter are led by experiments for direct and indirect detection of this as-yet mysterious particle. PMID:26417091

  5. Hidden charged dark matter

    International Nuclear Information System (INIS)

    Feng, Jonathan L.; Kaplinghat, Manoj; Tu, Huitzu; Yu, Hai-Bo

    2009-01-01

    Can dark matter be stabilized by charge conservation, just as the electron is in the standard model? We examine the possibility that dark matter is hidden, that is, neutral under all standard model gauge interactions, but charged under an exact U(1) gauge symmetry of the hidden sector. Such candidates are predicted in WIMPless models, supersymmetric models in which hidden dark matter has the desired thermal relic density for a wide range of masses. Hidden charged dark matter has many novel properties not shared by neutral dark matter: (1) bound state formation and Sommerfeld-enhanced annihilation after chemical freeze out may reduce its relic density, (2) similar effects greatly enhance dark matter annihilation in protohalos at redshifts of z ∼ 30, (3) Compton scattering off hidden photons delays kinetic decoupling, suppressing small scale structure, and (4) Rutherford scattering makes such dark matter self-interacting and collisional, potentially impacting properties of the Bullet Cluster and the observed morphology of galactic halos. We analyze all of these effects in a WIMPless model in which the hidden sector is a simplified version of the minimal supersymmetric standard model and the dark matter is a hidden sector stau. We find that charged hidden dark matter is viable and consistent with the correct relic density for reasonable model parameters and dark matter masses in the range 1 GeV ≲ m_X ≲ 10 TeV. At the same time, in the preferred range of parameters, this model predicts cores in the dark matter halos of small galaxies and other halo properties that may be within the reach of future observations. These models therefore provide a viable and well-motivated framework for collisional dark matter with Sommerfeld enhancement, with novel implications for astrophysics and dark matter searches.
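
    The Sommerfeld enhancement mentioned in point (1) can be illustrated with the standard Coulomb-limit formula S(v) = (πα_X/v) / (1 − e^(−πα_X/v)) for an exactly massless hidden photon. The sketch below is only illustrative: the coupling value and velocities are hypothetical, and the paper's own analysis may use a more detailed treatment.

    ```python
    import math

    def sommerfeld_factor(alpha_x: float, v: float) -> float:
        """Coulomb-limit Sommerfeld enhancement for annihilation of particles
        with relative velocity v (in units of c) that attract each other via a
        massless hidden photon with fine-structure constant alpha_x:
            S = x / (1 - exp(-x)),  where  x = pi * alpha_x / v.
        """
        x = math.pi * alpha_x / v
        return x / (1.0 - math.exp(-x))

    # Enhancement grows as the typical velocity drops, e.g. from around
    # chemical freeze out (v ~ 0.3) to protohalos (v ~ 1e-4):
    print(sommerfeld_factor(0.01, 0.3))   # ≈ 1.05 (mild)
    print(sommerfeld_factor(0.01, 1e-4))  # ≈ 314  (large, S → pi*alpha/v)
    ```

    In the small-x limit S → 1 (no enhancement), while for x ≫ 1 the factor scales as 1/v, which is why late-time, low-velocity environments such as protohalos see the largest boost.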

  6. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin Leigh [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  7. RISMC Toolkit and Methodology Research and Development Plan for External Hazards Analysis

    International Nuclear Information System (INIS)

    Coleman, Justin Leigh

    2016-01-01

    This report includes the description and development plan for a Risk Informed Safety Margins Characterization (RISMC) toolkit and methodology that will evaluate multihazard risk in an integrated manner to support the operating nuclear fleet.

  8. The dark-matter world: Are there dark-matter galaxies?

    OpenAIRE

    Hwang, W-Y. Pauchy

    2011-01-01

    We attempt to answer whether neutrinos and antineutrinos, such as those in the cosmic neutrino background, would clusterize among themselves or even with other dark-matter particles, over a certain time span, say 1 Gyr. With neutrino masses in place, the similarity with ordinary matter increases, and so does our confidence in neutrino clustering if time is long enough. In particular, the clusterings could happen with some seeds (cf. see the text for definition), the chance in the dark-matter...

  9. Memory Matters

    Science.gov (United States)

    Memory Matters (KidsHealth / For Kids): What's in ... of your complex and multitalented brain. What Is Memory? When an event happens, when you learn something, ...

  10. Matter in transition

    International Nuclear Information System (INIS)

    Anderson, Lara B.; Gray, James; Raghuram, Nikhil; Taylor, Washington

    2016-01-01

    We explore a novel type of transition in certain 6D and 4D quantum field theories, in which the matter content of the theory changes while the gauge group and other parts of the spectrum remain invariant. Such transitions can occur, for example, for SU(6) and SU(7) gauge groups, where matter fields in a three-index antisymmetric representation and the fundamental representation are exchanged in the transition for matter in the two-index antisymmetric representation. These matter transitions are realized by passing through superconformal theories at the transition point. We explore these transitions in dual F-theory and heterotic descriptions, where a number of novel features arise. For example, in the heterotic description the relevant 6D SU(7) theories are described by bundles on K3 surfaces where the geometry of the K3 is constrained in addition to the bundle structure. On the F-theory side, non-standard representations such as the three-index antisymmetric representation of SU(N) require Weierstrass models that cannot be realized from the standard SU(N) Tate form. We also briefly describe some other situations, with groups such as Sp(3), SO(12), and SU(3), where analogous matter transitions can occur between different representations. For SU(3), in particular, we find a matter transition between adjoint matter and matter in the symmetric representation, giving an explicit Weierstrass model for the F-theory description of the symmetric representation that complements another recent analogous construction.

  11. Mimicking dark matter through a non-minimal gravitational coupling with matter

    International Nuclear Information System (INIS)

    Bertolami, O.; Páramos, J.

    2010-01-01

    In this study one resorts to the phenomenology of models endowed with a non-minimal coupling between matter and geometry, in order to develop a mechanism through which dynamics similar to that due to the presence of dark matter is generated. As a first attempt, one tries to account for the flattening of the galaxy rotation curves as an effect of the non-(covariant) conservation of the energy-momentum tensor of visible matter. Afterwards, one assumes instead that this non-minimal coupling modifies the scalar curvature in a way that can be interpreted as a dark matter component (albeit with negative pressure). It is concluded that it is possible to mimic known dark matter density profiles through an appropriate power-law coupling f_2 = (R/R_0)^n, with a negative index n — a fact that reflects the dominance of dark matter at large distances. The properties of the model are extensively discussed, and possible cosmological implications are addressed.

  12. Morals Matter in Economic Games

    Science.gov (United States)

    Brodbeck, Felix C.; Kugler, Katharina G.; Reif, Julia A. M.; Maier, Markus A.

    2013-01-01

    Contrary to predictions from Expected Utility Theory and Game Theory, when making economic decisions in interpersonal situations, people take the interest of others into account and express various forms of solidarity, even in one-shot interactions with anonymous strangers. Research in other-regarding behavior is dominated by behavioral economical and evolutionary biological approaches. Psychological theory building, which addresses mental processes underlying other-regarding behavior, is rare. Based on Relational Models Theory (RMT, [1]) and Relationship Regulation Theory (RRT, [2]) it is proposed that moral motives influence individuals’ decision behavior in interpersonal situations via conscious and unconscious (automatic) processes. To test our propositions we developed the ‘Dyadic Solidarity Game’ and its solitary equivalent, the ‘Self-Insurance Game’. Four experiments, in which the moral motives “Unity” and “Proportionality” were manipulated, support the propositions made. First, it was shown that consciously activated moral motives (via framing of the overall goal of the experiment) and unconsciously activated moral motives (via subliminal priming) influence other-regarding behavior. Second, this influence was only found in interpersonal, not in solitary situations. Third, by combining the analyses of the two experimental games the extent to which participants apply the Golden Rule (“treat others how you wish to be treated”) could be established. Individuals with a “Unity” motive treated others like themselves, whereas individuals with a “Proportionality” motive gave others less than they gave themselves. The four experiments not only support the assumption that morals matter in economic games, they also deliver new insights in how morals matter in economic decision making. PMID:24358115

  13. Morals matter in economic games.

    Directory of Open Access Journals (Sweden)

    Felix C Brodbeck

    Full Text Available Contrary to predictions from Expected Utility Theory and Game Theory, when making economic decisions in interpersonal situations, people take the interest of others into account and express various forms of solidarity, even in one-shot interactions with anonymous strangers. Research in other-regarding behavior is dominated by behavioral economical and evolutionary biological approaches. Psychological theory building, which addresses mental processes underlying other-regarding behavior, is rare. Based on Relational Models Theory (RMT, [1]) and Relationship Regulation Theory (RRT, [2]) it is proposed that moral motives influence individuals' decision behavior in interpersonal situations via conscious and unconscious (automatic) processes. To test our propositions we developed the 'Dyadic Solidarity Game' and its solitary equivalent, the 'Self-Insurance Game'. Four experiments, in which the moral motives "Unity" and "Proportionality" were manipulated, support the propositions made. First, it was shown that consciously activated moral motives (via framing of the overall goal of the experiment) and unconsciously activated moral motives (via subliminal priming) influence other-regarding behavior. Second, this influence was only found in interpersonal, not in solitary situations. Third, by combining the analyses of the two experimental games the extent to which participants apply the Golden Rule ("treat others how you wish to be treated") could be established. Individuals with a "Unity" motive treated others like themselves, whereas individuals with a "Proportionality" motive gave others less than they gave themselves. The four experiments not only support the assumption that morals matter in economic games, they also deliver new insights in how morals matter in economic decision making.

  14. Morals matter in economic games.

    Science.gov (United States)

    Brodbeck, Felix C; Kugler, Katharina G; Reif, Julia A M; Maier, Markus A

    2013-01-01

    Contrary to predictions from Expected Utility Theory and Game Theory, when making economic decisions in interpersonal situations, people take the interest of others into account and express various forms of solidarity, even in one-shot interactions with anonymous strangers. Research in other-regarding behavior is dominated by behavioral economical and evolutionary biological approaches. Psychological theory building, which addresses mental processes underlying other-regarding behavior, is rare. Based on Relational Models Theory (RMT, [1]) and Relationship Regulation Theory (RRT, [2]) it is proposed that moral motives influence individuals' decision behavior in interpersonal situations via conscious and unconscious (automatic) processes. To test our propositions we developed the 'Dyadic Solidarity Game' and its solitary equivalent, the 'Self-Insurance Game'. Four experiments, in which the moral motives "Unity" and "Proportionality" were manipulated, support the propositions made. First, it was shown that consciously activated moral motives (via framing of the overall goal of the experiment) and unconsciously activated moral motives (via subliminal priming) influence other-regarding behavior. Second, this influence was only found in interpersonal, not in solitary situations. Third, by combining the analyses of the two experimental games the extent to which participants apply the Golden Rule ("treat others how you wish to be treated") could be established. Individuals with a "Unity" motive treated others like themselves, whereas individuals with a "Proportionality" motive gave others less than they gave themselves. The four experiments not only support the assumption that morals matter in economic games, they also deliver new insights in how morals matter in economic decision making.

  15. Correlation between white matter damage and gray matter lesions in multiple sclerosis patients

    Directory of Open Access Journals (Sweden)

    Xue-mei Han

    2017-01-01

    Full Text Available We observed the characteristics of white matter fibers and gray matter in multiple sclerosis patients, to identify changes in diffusion tensor imaging fractional anisotropy values following white matter fiber injury. We analyzed the correlation between fractional anisotropy values and changes in whole-brain gray matter volume. The participants included 20 patients with relapsing-remitting multiple sclerosis and 20 healthy volunteers as controls. All subjects underwent head magnetic resonance imaging and diffusion tensor imaging. Our results revealed that fractional anisotropy values decreased and gray matter volumes were reduced in the genu and splenium of corpus callosum, left anterior thalamic radiation, hippocampus, uncinate fasciculus, right corticospinal tract, bilateral cingulate gyri, and inferior longitudinal fasciculus in multiple sclerosis patients. Gray matter volumes were significantly different between the two groups in the right frontal lobe (superior frontal, middle frontal, precentral, and orbital gyri), right parietal lobe (postcentral and inferior parietal gyri), right temporal lobe (caudate nucleus), right occipital lobe (middle occipital gyrus), right insula, right parahippocampal gyrus, and left cingulate gyrus. The voxel sizes of atrophic gray matter positively correlated with fractional anisotropy values in white matter association fibers in the patient group. These findings suggest that white matter fiber bundles are extensively injured in multiple sclerosis patients. The main areas of gray matter atrophy in multiple sclerosis are the frontal lobe, parietal lobe, caudate nucleus, parahippocampal gyrus, and cingulate gyrus. Gray matter atrophy is strongly associated with white matter injury in multiple sclerosis patients, particularly with injury to association fibers.

  16. Soft matter physics

    CERN Document Server

    Doi, Masao

    2013-01-01

    Soft matter (polymers, colloids, surfactants and liquid crystals) is an important class of materials in modern technology. It also forms the basis of many future technologies, for example in medical and environmental applications. Soft matter shows complex behaviour intermediate between fluids and solids, and used to be a synonym for complex materials. Due to the developments of the past two decades, soft condensed matter can now be discussed on the same sound physical basis as solid condensed matter. The purpose of this book is to provide an overview of soft matter for undergraduate and graduate students.

  17. Gaseous Matter

    CERN Document Server

    Angelo, Joseph A

    2011-01-01

    Gaseous Matter focuses on the many important discoveries that led to the scientific interpretation of matter in the gaseous state. This new, full-color resource describes the basic characteristics and properties of several important gases, including air, hydrogen, helium, oxygen, and nitrogen. The nature and scope of the science of fluids is discussed in great detail, highlighting the most important scientific principles upon which the field is based. Chapters include: Gaseous Matter: An Initial Perspective; Physical Characteristics of Gases; The Rise of the Science of Gases; Kinetic Theory of

  18. A methodological review of qualitative case study methodology in midwifery research.

    Science.gov (United States)

    Atchan, Marjorie; Davis, Deborah; Foureur, Maralyn

    2016-10-01

    To explore the use and application of case study research in midwifery. Case study research provides rich data for the analysis of complex issues and interventions in the healthcare disciplines; however, a gap in the midwifery research literature was identified. A methodological review of midwifery case study research using recognized templates, frameworks and reporting guidelines facilitated comprehensive analysis. An electronic database search using the date range January 2005-December 2014: Maternal and Infant Care, CINAHL Plus, Academic Search Complete, Web of Knowledge, SCOPUS, Medline, Health Collection (Informit), Cochrane Library, Health Source: Nursing/Academic Edition, Wiley online and ProQuest Central. Narrative evaluation was undertaken. Clearly worded questions reflected the problem and purpose. The application, strengths and limitations of case study methods were identified through a quality appraisal process. The review identified both case study research's applicability to midwifery and its low uptake, especially in clinical studies. Many papers included the necessary criteria to achieve rigour. The included measures of authenticity and methodology were varied. A high standard of authenticity was observed, suggesting authors considered these elements to be routine inclusions. Technical aspects were lacking in many papers, namely a lack of reflexivity and incomplete transparency of processes. This review raises the profile of case study research in midwifery. Midwives will be encouraged to explore whether case study research is suitable for their investigation. The raised profile will demonstrate further applicability; encourage support and wider adoption in the midwifery setting. © 2016 John Wiley & Sons Ltd.

  19. Mitochondrial dysfunction in alveolar and white matter developmental failure in premature infants.

    Science.gov (United States)

    Ten, Vadim S

    2017-02-01

    At birth, some organs in premature infants are not developed enough to meet the challenges of extra-uterine life. Although growth and maturation continue after premature birth, postnatal organ development may become sluggish or even arrested, leading to organ dysfunction. There is no clear mechanistic concept of this postnatal organ developmental failure in premature neonates. This review introduces a concept-forming hypothesis: Mitochondrial bioenergetic dysfunction is a fundamental mechanism of organ maturation failure in premature infants. Data collected in support of this hypothesis are relevant to two major diseases of prematurity: white matter injury and broncho-pulmonary dysplasia. In these diseases, totally different clinical manifestations are defined by the same biological process, developmental failure of the main functional units: alveoli in the lungs and axonal myelination in the brain. Although the molecular pathways regulating alveolar and white matter maturation differ, proper bioenergetic support of growth and maturation remains a critical biological requirement for any actively developing organ. Literature analysis suggests that successful postnatal pulmonary and white matter development highly depends on mitochondrial function, which can be inhibited by sublethal postnatal stress. In premature infants, sublethal stress results mostly in organ maturation failure without excessive cellular demise.

  20. A national-scale remote sensing-based methodology for quantifying tidal marsh biomass to support "Blue Carbon" accounting

    Science.gov (United States)

    Byrd, K. B.; Ballanti, L.; Nguyen, D.; Simard, M.; Thomas, N.; Windham-Myers, L.; Castaneda, E.; Kroeger, K. D.; Gonneea, M. E.; O'Keefe Suttles, J.; Megonigal, P.; Troxler, T.; Schile, L. M.; Davis, M.; Woo, I.

    2016-12-01

    According to 2013 IPCC Wetlands Supplement guidelines, tidal marsh Tier 2 or Tier 3 accounting must include aboveground biomass carbon stock changes. To support this need, we are using free satellite and aerial imagery to develop a national scale, consistent remote sensing-based methodology for quantifying tidal marsh aboveground biomass. We are determining the extent to which additional satellite data will increase the accuracy of this "blue carbon" accounting. Working in 6 U.S. estuaries (Cape Cod, MA, Chesapeake Bay, MD, Everglades, FL, Mississippi Delta, LA, San Francisco Bay, CA, and Puget Sound, WA), we built a tidal marsh biomass dataset (n=2404). Landsat reflectance data were matched spatially and temporally with field plots using Google Earth Engine. We quantified percent cover of green vegetation, non-vegetation, and open water in Landsat pixels using segmentation of 1m National Agriculture Imagery Program aerial imagery. Sentinel-1A C-band backscatter data were used in Chesapeake, Mississippi Delta and Puget Sound. We tested multiple Landsat vegetation indices and Sentinel backscatter metrics in 30m scale biomass linear regression models by region. Scaling biomass by fraction green vegetation significantly improved biomass estimation (e.g. Cape Cod: R2 = 0.06 vs. R2 = 0.60, n=28). The best vegetation indices differed by region, though indices based on the shortwave infrared-1 and red bands were most predictive in the Everglades and the Mississippi Delta, while the soil adjusted vegetation index was most predictive in Puget Sound and Chesapeake. Backscatter metrics significantly improved model predictions over vegetation indices alone; consistently across regions, the most significant metric was the range in backscatter values within the green vegetation segment of the Landsat pixel (e.g. Mississippi Delta: R2 = 0.47 vs. R2 = 0.59, n=15). Results support using remote sensing of biomass stock change to estimate greenhouse gas emission factors in tidal
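
    The scaling step described above (weighting a Landsat vegetation index by the pixel's green-vegetation fraction before regressing against field biomass) can be sketched as follows. This is a hedged illustration: the SAVI formula is the standard one (soil factor L = 0.5), but the function names, the single-predictor linear model, and the sample values are assumptions, not the study's actual per-region models or data.

    ```python
    import numpy as np

    def savi(nir, red, L=0.5):
        """Soil Adjusted Vegetation Index: (NIR - Red) / (NIR + Red + L) * (1 + L)."""
        nir = np.asarray(nir, dtype=float)
        red = np.asarray(red, dtype=float)
        return (nir - red) / (nir + red + L) * (1.0 + L)

    def fit_biomass_model(vi, f_green, biomass):
        """Ordinary least squares for biomass ~ a + b * (vi * f_green),
        i.e. the vegetation index scaled by the pixel's fraction of green
        vegetation (the mixed-pixel correction). Returns (a, b, R^2)."""
        x = np.asarray(vi, dtype=float) * np.asarray(f_green, dtype=float)
        y = np.asarray(biomass, dtype=float)
        A = np.column_stack([np.ones_like(x), x])
        (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        resid = y - (a + b * x)
        r2 = 1.0 - resid.var() / y.var()
        return a, b, r2

    # Hypothetical plot data: NIR/red reflectances, green fractions from
    # segmented aerial imagery, and field-measured biomass (g/m^2).
    vi = savi([0.45, 0.50, 0.40, 0.55], [0.10, 0.08, 0.12, 0.07])
    f_green = np.array([0.9, 0.6, 0.3, 1.0])
    biomass = np.array([820.0, 560.0, 250.0, 980.0])
    a, b, r2 = fit_biomass_model(vi, f_green, biomass)
    ```

    The reported jump in R² (e.g. 0.06 to 0.60 at Cape Cod) comes precisely from replacing `vi` with `vi * f_green` as the predictor, so that open water and bare soil inside a 30 m pixel no longer dilute the vegetation signal.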

  1. Actions to promote professional ethics in the people supported education

    Directory of Open Access Journals (Sweden)

    Sucel Batista-Fonseca

    2016-06-01

    Full Text Available An action plan aimed at strengthening professional ethics, supported by the methodology of popular education, is a tool for achieving quality in institutions, used by managers and by workers committed to efficiency in our organizations. This study seeks to propose an action plan that promotes ethics in institutions, supported by the methodology of popular education. The proposal was developed through documentary analysis, using theoretical methods such as analysis-synthesis, induction and deduction, and drawing on the technique of participant observation. The authors investigated professional ethics and popular education and analyzed these categories separately. The literature review showed that the methodology of popular education is an essential tool for encouraging professional ethics.

  2. Variability of insulin degludec and glargine U300: A matter of methodology or just marketing?

    Science.gov (United States)

    Heise, Tim; Heckermann, Sascha; DeVries, J Hans

    2018-05-17

    The variability in the time-action profiles of insulin preparations, in particular basal insulins, has been a matter of debate ever since the publication of a glucose clamp study comparing the day-to-day variability of three different basal insulins (glargine U100, detemir and NPH) in 2004 [1]. While critics did not contest the findings of a lower variability of some basal insulins in this and a later [2] glucose clamp study, they did question the relevance of a lower pharmacokinetic (PK) and pharmacodynamic (PD) variability for clinical endpoints [3, 4]. Nevertheless, this has not stopped marketeers from widely using the results of glucose clamp studies to promote insulins for higher predictability or a suggested flat PK/PD profile fully covering 24 hours [5]. This article is protected by copyright. All rights reserved.

  3. Proceedings of Arcom Doctoral Workshop Research Methodology

    OpenAIRE

    Scott, Lloyd

    2018-01-01

    Editorial: Welcome to this special doctoral workshop on Research Methodology, which forms part of what is now a well-established support mechanism for researchers in the discipline of the Built Environment and more particularly construction management. The ARCOM doctoral series, around now for some seventeen years, has addressed many of the diverse research areas that PhD researchers in the discipline have chosen to focus on in their doctoral journey. This doctoral workshop has as ...

  4. Peer-led Aboriginal parent support: Program development for vulnerable populations with participatory action research.

    Science.gov (United States)

    Munns, Ailsa; Toye, Christine; Hegney, Desley; Kickett, Marion; Marriott, Rhonda; Walker, Roz

    2017-10-01

    Participatory action research (PAR) is a credible, culturally appropriate methodology that can be used to effect collaborative change within vulnerable populations. This PAR study was undertaken in a Western Australian metropolitan setting to develop and evaluate the suitability, feasibility and effectiveness of an Aboriginal peer-led home visiting programme. A secondary aim, addressed in this paper, was to explore and describe the research methodology used for the study and provide recommendations for its implementation in other similar situations. PAR using action learning sets was employed to develop the parent support programme, and data addressing the secondary, methodological aim were collected through focus groups using semi-structured and unstructured interview schedules. Findings were addressed throughout the action research process to enhance it. The themes that emerged from the data and addressed the methodological aim were the need for safe communication processes, supportive engagement processes, and supportive organisational processes. Aboriginal peer support workers (PSWs) and community support agencies identified three important elements central to their capacity to engage and work within the PAR methodology. This research has provided innovative data, highlighting processes and recommendations for child health nurses to engage with the PSWs, parents and community agencies to explore culturally acceptable elements for an empowering methodology for peer-led home visiting support. There is potential for this nursing research to credibly inform policy development for Aboriginal child and family health service delivery, in addition to other vulnerable population groups. Child health nurses/researchers can use these new understandings to work in partnership with Aboriginal communities and families to develop empowering and culturally acceptable strategies for developing Aboriginal parent support for the early years. Impact Statement Child

  5. A framework for assessing the adequacy and effectiveness of software development methodologies

    Science.gov (United States)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  6. Risk-based Regulatory Evaluation Program methodology

    International Nuclear Information System (INIS)

    DuCharme, A.R.; Sanders, G.A.; Carlson, D.D.; Asselin, S.V.

    1987-01-01

    The objectives of this DOE-supported Regulatory Evaluation Program are to analyze and evaluate the safety importance and economic significance of existing regulatory guidance in order to assist in the improvement of the regulatory process for current generation and future design reactors. A risk-based cost-benefit methodology was developed to evaluate the safety benefit and cost of specific regulations or Standard Review Plan sections. Risk-based methods can be used in lieu of or in combination with deterministic methods in developing regulatory requirements and reaching regulatory decisions.

  7. Matter and Energy

    CERN Document Server

    Karam, P Andrew

    2011-01-01

    In Matter and Energy, readers will learn about the many forms of energy, the wide variety of particles in nature, and Albert Einstein's world-changing realization of how matter can be changed into pure energy. The book also examines the recent discoveries of dark matter and dark energy and the future of the universe.

  8. A framework for characterizing usability requirements elicitation and analysis methodologies (UREAM)

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Mannaert, H.

    2012-01-01

    Dedicated methodologies for the elicitation and analysis of usability requirements have been proposed in literature, usually developed by usability experts. The usability of these approaches by non-expert software engineers is not obvious. In this paper, the objective is to support developers and

  9. The intersections between TRIZ and forecasting methodology

    Directory of Open Access Journals (Sweden)

    Georgeta BARBULESCU

    2010-12-01

    Full Text Available The authors’ intention is to correlate the basic knowledge in using the TRIZ methodology (Theory of Inventive Problem Solving; in Russian: Teoriya Resheniya Izobretatelskikh Zadatch) as a problem-solving tool meant to help decision makers perform more significant forecasting exercises. The idea is to identify the TRIZ features and instruments (the 40 inventive principles, for instance) for putting the noise-and-signal problem in evidence, for trend identification (qualitative and quantitative tendencies), and as support tools in technological forecasting, so as to enable decision makers to refine, and increase the level of confidence in, the forecasting results. The current interest in connecting TRIZ to forecasting methodology relates to the massive application of TRIZ methods and techniques for engineering system development worldwide and to the growing application of TRIZ’s concepts and paradigms for improvements of non-engineering systems (including business and economic applications).

  10. Methodology for combining dynamic responses. Technical report

    International Nuclear Information System (INIS)

    Mattu, R.K.

    1980-05-01

    Procedures in accordance with Appendix A of 10 CFR 50, GDC 2, call for an appropriate combination of the effects of the accident loads and loads caused by natural phenomena (such as earthquakes) to be reflected in the design bases of safety equipment. This requirement of interaction of loads has been implemented in various ways both within the NRC and the nuclear industry. An NRR Working Group constituted to examine load combination methodologies developed recommendations which were published in September 1978 as NUREG-0484 (PB-287 432). Revision 1 of NUREG-0484 extends the conclusions of the original NUREG-0484 on the use of the SRSS methodology for the combination of SSE and LOCA responses beyond the RCPB to any other ASME Section III, Class 1, 2, or 3 affected system, component or support, and provides criteria for the combination of dynamic responses other than SSE and LOCA.

  11. Chemical plant innovative safety investments decision-support methodology.

    Science.gov (United States)

    Reniers, G L L; Audenaert, A

    2009-01-01

    This article examines the extent to which investing in safety during the creation of a new chemical installation proves profitable. The authors propose a management-supporting cost-benefit model that identifies and evaluates investments in safety within a chemical company. This innovative model differentiates between serious accidents and less serious accidents, thus providing an authentic image of prevention-related costs and benefits. Classic cost-benefit analyses, which do not make such differentiations, yield only a rudimentary image of the potential profitability of investments in safety, and the management conclusions that can be drawn from them are of a very limited nature. The proposed model is applied to a real case study, in which the proposed investments in safety at an appointed chemical installation are weighed against the estimated hypothetical benefits resulting from the preventive measures to be installed there. In the case study in question, the proposed prevention investments appear to be justified. Such an economic exercise may be very important to chemical corporations trying to (further) improve their safety investments.
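
    The core differentiation described above can be sketched as a toy calculation: the expected benefit of a safety investment is the avoided risk (probability reduction times consequence) accumulated over an operating horizon, computed separately for serious and less serious accidents. All figures and the simple linear model are invented for illustration and are not the authors' actual model.

```python
# Toy cost-benefit check that values rare/catastrophic and frequent/moderate
# accident classes separately instead of lumping them into one average.

def expected_benefit(p_before, p_after, consequence, horizon_years):
    """Annual risk reduction times consequence, accumulated over a horizon."""
    return (p_before - p_after) * consequence * horizon_years

# Invented annual probabilities, consequences ($), and a 20-year horizon
serious = expected_benefit(1e-4, 2e-5, 5e8, 20)       # rare, catastrophic
less_serious = expected_benefit(5e-2, 1e-2, 2e5, 20)  # frequent, moderate

investment = 5e5  # hypothetical one-off safety investment ($)
total_benefit = serious + less_serious
print(total_benefit, total_benefit > investment)
```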

  12. A Network Based Methodology to Reveal Patterns in Knowledge Transfer

    Directory of Open Access Journals (Sweden)

    Orlando López-Cruz

    2015-12-01

    Full Text Available This paper motivates, presents and demonstrates in use a methodology based on complex network analysis to support research aimed at identifying the sources in the process of knowledge transfer at the interorganizational level. The importance of this methodology is that it states a unified model for revealing knowledge-sharing patterns and for comparing results from multiple studies on data from different periods of time and different sectors of the economy. The methodology does not address the underlying statistical processes; for those, national statistics departments (NSDs) provide documents and tools at their websites. Rather, it provides a guide for modelling inferences drawn from the processed data, revealing links between the sources and recipients of transferred knowledge, where the recipient identifies the source as the main origin of new knowledge creation. Some national statistics departments design these surveys to characterize innovation dynamics in firms and to analyze the use of public support instruments, and scholars build different researches on this characterization. Measures of the dimensions of the network composed of manufacturing firms and other organizations form the basis for inquiring into the structure that emerges when firms take ideas from other organizations to incept innovations. The two sets of actors, the organizations (or the events they organize) that provide ideas and the firms that receive them, are the nodes of a two-mode network, with each link connecting a source to a destination. The resulting demonstrated design satisfies the objective of being a methodological model for identifying the sources of knowledge effectively used in innovation.
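
    The two-mode network described above can be sketched in a few lines: one node set for knowledge sources (organizations, or events they organize) and one for recipient firms, with a projection onto the firm set revealing which firms draw on the same sources. The data and names below are invented for illustration.

```python
# Minimal two-mode (bipartite) network and its one-mode projection onto
# the firm set: two firms are linked when they share at least one source,
# with the edge weight counting shared sources.

from collections import defaultdict
from itertools import combinations

# source -> firms that reported it as their main source of new knowledge
edges = {
    "UniversityA": {"Firm1", "Firm2"},
    "TradeFairX": {"Firm2", "Firm3"},
    "SupplierB": {"Firm1", "Firm3"},
}

projection = defaultdict(int)
for source, firms in edges.items():
    for f1, f2 in combinations(sorted(firms), 2):
        projection[(f1, f2)] += 1  # weight = number of shared sources

print(dict(projection))
```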

  13. Condensed elementary particle matter

    International Nuclear Information System (INIS)

    Kajantie, K.

    1996-01-01

    Quark matter is a special case of condensed elementary particle matter, matter governed by the laws of particle physics. The talk discusses how far one can get in the study of particle matter by reducing the problem to computations based on the action. As an example the computation of the phase diagram of electroweak matter is presented. It is quite possible that ultimately an antireductionist attitude will prevail: experiments will reveal unpredicted phenomena not obviously reducible to the study of the action. (orig.)

  14. Developing Agent-Oriented Video Surveillance System through Agent-Oriented Methodology (AOM)

    Directory of Open Access Journals (Sweden)

    Cheah Wai Shiang

    2016-12-01

    Full Text Available Agent-oriented methodology (AOM) is a comprehensive and unified agent methodology for agent-oriented software development. Although AOM is claimed to be able to cope with complex system development, it has not yet been determined to what extent this is true. It is therefore vital to conduct an investigation to validate this methodology. This paper presents the adoption of AOM in developing an agent-oriented video surveillance system (VSS). An intruder-handling scenario is designed and implemented through AOM. AOM provides an alternative method to engineer a distributed security system in a systematic manner. It presents the security system from a holistic view, provides a better conceptualization of an agent-oriented security system, and supports rapid prototyping as well as simulation of the video surveillance system.

  15. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is a part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors', developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application using a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system; the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram; the reduction of the boron injection tank concentration and the elimination of its heat tracing; and the reduction of reactor coolant system flow. (author)

  16. Ordinary Dark Matter versus Mysterious Dark Matter in Galactic Rotation

    Science.gov (United States)

    Gallo, C. F.; Feng, James

    2008-04-01

    To theoretically describe the measured rotational velocity curves of spiral galaxies, there are two different approaches and conclusions. (1) Ordinary dark matter. We assume Newtonian gravity/dynamics and successfully find (via computer) mass distributions in bulge/disk configurations that duplicate the measured rotational velocities. There is ordinary dark matter within the galactic disk towards the cooler periphery, which has lower emissivity/opacity. There are no mysteries in this scenario based on verified physics. (2) Mysterious dark matter. Others inaccurately assume that the galactic mass distributions follow the measured light distributions, and then the measured rotational velocity curves are not duplicated. To alleviate this discrepancy, speculations are invoked regarding "massive peripheral spherical halos of mysterious dark matter." But no matter has been detected in this untenable halo configuration, and many unverified "mysteries" are invoked as necessary and convenient. Conclusion: the first approach, utilizing Newtonian gravity/dynamics and searching for ordinary mass distributions within the galactic disk, simulates reality and agrees with the data.
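
    The first approach rests on the standard Newtonian relation v(r) = sqrt(G M(r) / r) between the enclosed mass M(r) and the circular rotation velocity. The sketch below only illustrates that relation with an invented mass profile; it is not the authors' fitting procedure. A roughly linear M(r) yields the flat rotation curves that are measured.

```python
# Circular rotation velocity from an assumed enclosed-mass profile,
# v(r) = sqrt(G * M(r) / r), in SI units.

import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30  # solar mass, kg
KPC = 3.086e19    # kiloparsec, m

def v_circ(r_kpc, m_enclosed_msun):
    """Circular velocity (m/s) at radius r_kpc for enclosed mass in M_sun."""
    r = r_kpc * KPC
    return math.sqrt(G * m_enclosed_msun * M_SUN / r)

# A hypothetical M(r) growing linearly with r gives a flat rotation curve
for r_kpc in (5, 10, 20):
    m = 1e10 * r_kpc  # invented profile, M_sun
    print(r_kpc, round(v_circ(r_kpc, m) / 1e3), "km/s")
```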

  17. WEAKLY INTERACTING MASSIVE PARTICLE DARK MATTER AND FIRST STARS: SUPPRESSION OF FRAGMENTATION IN PRIMORDIAL STAR FORMATION

    International Nuclear Information System (INIS)

    Smith, Rowan J.; Glover, Simon C. O.; Klessen, Ralf S.; Iocco, Fabio; Schleicher, Dominik R. G.; Hirano, Shingo; Yoshida, Naoki

    2012-01-01

    We present the first three-dimensional simulations to include the effects of dark matter annihilation feedback during the collapse of primordial minihalos. We begin our simulations from cosmological initial conditions and account for dark matter annihilation in our treatment of the chemical and thermal evolution of the gas. The dark matter is modeled using an analytical density profile that responds to changes in the peak gas density. We find that the gas can collapse to high densities despite the additional energy input from the dark matter. No objects supported purely by dark matter annihilation heating are formed in our simulations. However, we find that dark matter annihilation heating has a large effect on the evolution of the gas following the formation of the first protostar. Previous simulations without dark matter annihilation found that protostellar disks around Population III stars rapidly fragmented, forming multiple protostars that underwent mergers or ejections. When dark matter annihilation is included, however, these disks become stable to radii of 1000 AU or more. In the cases where fragmentation does occur, it is a wide binary that is formed.

  18. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research has been supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing out the results of a GO-FLOW analysis in the form of figures or tables. These functions are illustrated through the analysis of sample systems such as a PWR AFWS and a BWR ECCS. The appendices describe the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in them. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total system, GO-FLOW has become a powerful tool in a living PSA. (author) 54 refs

  19. Methodological Approaches in Conducting Overviews: Current State in HTA Agencies

    Science.gov (United States)

    Pieper, Dawid; Antoine, Sunya-Lee; Morfeld, Jana-Carina; Mathes, Tim; Eikermann, Michaela

    2014-01-01

    Objectives: Overviews search for reviews rather than for primary studies. They might have the potential to support decision making within a shorter time frame by reducing production time. We aimed to summarize available instructions for authors intending to conduct overviews as well as the currently applied methodology of overviews in…

  20. Dirac matter

    CERN Document Server

    Rivasseau, Vincent; Fuchs, Jean-Noël

    2017-01-01

    This fifteenth volume of the Poincare Seminar Series, Dirac Matter, describes the surprising resurgence, as a low-energy effective theory of conducting electrons in many condensed matter systems, including graphene and topological insulators, of the famous equation originally invented by P.A.M. Dirac for relativistic quantum mechanics. In five highly pedagogical articles, as befits their origin in lectures to a broad scientific audience, this book explains why Dirac matters. Highlights include the detailed "Graphene and Relativistic Quantum Physics", written by the experimental pioneer, Philip Kim, and devoted to graphene, a form of carbon crystallized in a two-dimensional hexagonal lattice, from its discovery in 2004-2005 by the future Nobel prize winners Kostya Novoselov and Andre Geim to the so-called relativistic quantum Hall effect; the review entitled "Dirac Fermions in Condensed Matter and Beyond", written by two prominent theoreticians, Mark Goerbig and Gilles Montambaux, who consider many other mater...

  1. Quantifying indices of short- and long-range white matter connectivity at each cortical vertex.

    Directory of Open Access Journals (Sweden)

    Maria Carmela Padula

    Full Text Available Several neurodevelopmental diseases are characterized by impairments in cortical morphology along with altered white matter connectivity. However, the relationship between these two measures is not yet clear. In this study, we propose a novel methodology to compute and display metrics of white matter connectivity at each cortical point. After co-registering the extremities of the tractography streamlines with the cortical surface, we computed two measures of connectivity at each cortical vertex: the mean tract length and the proportion of short- and long-range connections. The proposed measures were tested in a clinical sample of 62 patients with 22q11.2 deletion syndrome (22q11DS) and 57 typically developing individuals. Using these novel measures, we achieved a fine-grained visualization of the white matter connectivity patterns at each vertex of the cortical surface. We observed an intriguing pattern of both increased and decreased short- and long-range connectivity in 22q11DS that provides novel information about the nature and topology of white matter alterations in the syndrome. We argue that the method presented in this study opens avenues for additional analyses of the relationship between cortical properties and patterns of underlying structural connectivity, which will help clarify the intrinsic mechanisms that lead to altered brain structure in neurodevelopmental disorders.
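
    The two per-vertex measures described above can be sketched as a simple aggregation: once each streamline endpoint is assigned to a cortical vertex, compute the mean tract length and the fraction of short-range connections at that vertex. The streamline data and the 40 mm short/long cutoff below are invented for illustration; the study's actual threshold and co-registration steps are not reproduced here.

```python
# Per-vertex connectivity metrics from (vertex, tract length) pairs:
# mean tract length and proportion of short-range connections.

from collections import defaultdict

# (vertex_id, tract_length_mm) for each streamline endpoint (made-up data)
endpoints = [(0, 25.0), (0, 90.0), (0, 35.0), (1, 120.0), (1, 80.0)]

SHORT_RANGE_MM = 40.0  # assumed cutoff between short- and long-range

lengths = defaultdict(list)
for vertex, length in endpoints:
    lengths[vertex].append(length)

metrics = {
    v: {
        "mean_length": sum(ls) / len(ls),
        "short_range_frac": sum(l < SHORT_RANGE_MM for l in ls) / len(ls),
    }
    for v, ls in lengths.items()
}
print(metrics)
```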

  2. Strategies for dark matter detection

    International Nuclear Information System (INIS)

    Silk, J.

    1988-01-01

    The present status of alternative forms of dark matter, both baryonic and nonbaryonic, is reviewed. Alternative arguments are presented for the predominance of either cold dark matter (CDM) or of baryonic dark matter (BDM). Strategies are described for dark matter detection, both for dark matter that consists of weakly interacting relic particles and for dark matter that consists of dark stellar remnants

  3. Soil organic matter studies

    International Nuclear Information System (INIS)

    1977-01-01

    A total of 77 papers were presented and discussed during this symposium, of which 37 are included in this Volume II. The topics covered in this volume include: biochemical transformation of organic matter in soils; bitumens in soil organic matter; characterization of humic acids; carbon dating of organic matter in soils; use of modern techniques in soil organic matter research; use of municipal sludge with special reference to heavy metal constituents, soil nitrogen, and physical and chemical properties of soils; the relationship of soil organic matter and plant metabolism; interaction between agrochemicals and organic matter; and peat. Separate entries have been prepared for the 20 papers that discuss the use of nuclear techniques in these studies.

  4. Supporting Teachers in Inclusive Education

    Directory of Open Access Journals (Sweden)

    Alekhina S.V.

    2015-03-01

    Full Text Available The article regards the provision of support to teachers involved in inclusive education as the main requirement for the successful realization of inclusion. The methodological framework used in the study is a resource approach. The article describes ways of extending the means of supporting teachers. It also argues for consolidating all the educators of inclusive schools into inclusive teams equally interested in the joint work of administration and educators on intervention programs.

  5. Exploratory analysis of diffusion tensor imaging in children with attention deficit hyperactivity disorder: evidence of abnormal white matter structure.

    Science.gov (United States)

    Pastura, Giuseppe; Doering, Thomas; Gasparetto, Emerson Leandro; Mattos, Paulo; Araújo, Alexandra Prüfer

    2016-06-01

    Abnormalities in the white matter microstructure of the attentional system have been implicated in the aetiology of attention deficit hyperactivity disorder (ADHD). Diffusion tensor imaging (DTI) is a promising magnetic resonance imaging (MRI) technology that has increasingly been used in studies of white matter microstructure in the brain. The main objective of this work was to perform an exploratory analysis of white matter tracts in a sample of children with ADHD versus typically developing children (TDC). For this purpose, 13 drug-naive children with ADHD of both genders underwent MRI using DTI acquisition methodology and tract-based spatial statistics. The results were compared to those of a sample of 14 age- and gender-matched TDC. Lower fractional anisotropy was observed in the splenium of the corpus callosum, right superior longitudinal fasciculus, bilateral retrolenticular part of the internal capsule, bilateral inferior fronto-occipital fasciculus, left external capsule and posterior thalamic radiation (including right optic radiation). We conclude that white matter tracts in attentional and motor control systems exhibited signs of abnormal microstructure in this sample of drug-naive children with ADHD.

  6. Guide for prioritizing power plant productivity improvement projects: handbook of availability improvement methodology

    International Nuclear Information System (INIS)

    1981-01-01

    As part of its program to help improve electrical power plant productivity, the Department of Energy (DOE) has developed a methodology for evaluating productivity improvement projects. This handbook presents a simplified version of this methodology called the Availability Improvement Methodology (AIM), which provides a systematic approach for prioritizing plant improvement projects. Also included in this handbook is a description of the data-taking requirements necessary to support the AIM methodology, benefit/cost analysis, and root cause analysis for tracing persistent power plant problems. In applying the AIM methodology, utility engineers should be mindful that replacement power costs are frequently greater for forced outages than for planned outages; equivalent availability includes both. A cost-effective ranking of alternative plant improvement projects must therefore distinguish between those projects which will reduce forced outages and those which might reduce planned outages. As is the case with any analytical procedure, engineering judgement must be exercised with respect to the results of purely mathematical calculations.
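
    The prioritization idea, including the distinction between forced- and planned-outage replacement power costs, can be sketched as a benefit/cost ranking. All project names, avoided-outage figures, and cost rates below are invented for illustration and are not taken from the handbook.

```python
# Toy benefit/cost ranking of improvement projects, valuing avoided forced
# outage hours at a higher replacement-power rate than planned outage hours.

projects = [
    # (name, forced hours avoided/yr, planned hours avoided/yr, cost in $)
    ("Replace feedwater pump", 120, 0, 400_000),
    ("Optimize refueling outage", 0, 200, 300_000),
    ("Upgrade turbine controls", 60, 80, 350_000),
]

FORCED_COST = 25_000   # $/h replacement power during forced outage (assumed)
PLANNED_COST = 10_000  # $/h replacement power during planned outage (assumed)

def benefit_cost(p):
    """Annual replacement-power savings divided by project cost."""
    name, forced, planned, cost = p
    return (forced * FORCED_COST + planned * PLANNED_COST) / cost

for p in sorted(projects, key=benefit_cost, reverse=True):
    print(p[0], round(benefit_cost(p), 2))
```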

  7. Critical dialogical approach: A methodological direction for occupation-based social transformative work.

    Science.gov (United States)

    Farias, Lisette; Laliberte Rudman, Debbie; Pollard, Nick; Schiller, Sandra; Serrata Malfitano, Ana Paula; Thomas, Kerry; van Bruggen, Hanneke

    2018-05-03

    Calls for embracing the potential and responsibility of occupational therapy to address socio-political conditions that perpetuate occupational injustices have materialized in the literature. However, to reach beyond traditional frameworks informing practice, this social agenda requires the incorporation of diverse epistemological and methodological approaches to support action commensurate with social transformative goals. Our intent is to present a methodological approach that can help extend the ways of thinking, or frameworks, used in occupational therapy and science to support the ongoing development of practices with and for individuals and collectives affected by marginalizing conditions. We describe the epistemological and theoretical underpinnings of a methodological approach drawing on Freire's and Bakhtin's work. Integrating our shared experience of taking part in an example study, we discuss the unique advantages of co-generating data using two methods aligned with this approach: dialogical interviews and critical reflexivity. Key considerations when employing this approach are presented, based on its proposed epistemological and theoretical stance and our shared experiences engaging in it. A critical dialogical approach offers one way forward in expanding occupational therapy and science scholarship by promoting collaborative knowledge generation and the examination of taken-for-granted understandings that shape individuals' assumptions and actions.

  8. Sterile neutrino dark matter

    CERN Document Server

    Merle, Alexander

    2017-01-01

    This book is a new look at one of the hottest topics in contemporary science, Dark Matter. It is the pioneering text dedicated to sterile neutrinos as candidate particles for Dark Matter, challenging some of the standard assumptions which may be true for some Dark Matter candidates but not for all. So, this can be seen either as an introduction to a specialized topic or an out-of-the-box introduction to the field of Dark Matter in general. No matter if you are a theoretical particle physicist, an observational astronomer, or a ground based experimentalist, no matter if you are a grad student or an active researcher, you can benefit from this text, for a simple reason: a non-standard candidate for Dark Matter can teach you a lot about what we truly know about our standard picture of how the Universe works.

  9. Dark matter: the astrophysical case

    International Nuclear Information System (INIS)

    Silk, J.

    2012-01-01

    The identification of dark matter is one of the most urgent problems in cosmology. I describe the astrophysical case for dark matter, from both an observational and a theoretical perspective. This overview will therefore focus on the observational motivations rather than the particle physics aspects of dark matter constraints on specific dark matter candidates. First, however, I summarize the astronomical evidence for dark matter, then I highlight the weaknesses of the standard cold dark matter model (LCDM) to provide a robust explanation of some observations. The greatest weakness in the dark matter saga is that we have not yet identified the nature of dark matter itself

  10. White matter structural connectivity and episodic memory in early childhood

    Directory of Open Access Journals (Sweden)

    Chi T. Ngo

    2017-12-01

    Full Text Available Episodic memory undergoes dramatic improvement in early childhood; the reason for this is poorly understood. In adults, episodic memory relies on a distributed neural network. Key brain regions supporting these processes include the hippocampus, portions of the parietal cortex, and portions of the prefrontal cortex, each of which shows a different developmental profile. Here we asked whether developmental differences in the axonal pathways connecting these regions may account for the robust gains in episodic memory in young children. Using diffusion weighted imaging, we examined whether white matter connectivity between brain regions implicated in episodic memory differed with age and was associated with memory performance differences in 4- and 6-year-old children. Results revealed that the white matter connecting the hippocampus to the inferior parietal lobule significantly predicted children's performance on episodic memory tasks. In contrast, variation in the white matter connecting the hippocampus to the medial prefrontal cortex did not relate to memory performance. These findings suggest that structural connectivity between the hippocampus and lateral parietal regions is relevant to the development of episodic memory. Keywords: White matter, Memory development, Episodic memory, Diffusion weighted imaging

  11. Contextual factors, methodological principles and teacher cognition

    Directory of Open Access Journals (Sweden)

    Rupert Walsh

    2014-01-01

    Full Text Available Teachers in various contexts worldwide are sometimes unfairly criticized for not putting teaching methods developed for the well-resourced classrooms of Western countries into practice. Factors such as the teachers’ “misconceptualizations” of “imported” methods, including Communicative Language Teaching (CLT, are often blamed, though the challenges imposed by “contextual demands,” such as large class sizes, are sometimes recognised. Meanwhile, there is sometimes an assumption that in the West there is a happy congruence between policy supportive of CLT or Task-Based Language Teaching, teacher education and supervision, and curriculum design with teachers’ cognitions and their practices. Our case study of three EFL teachers at a UK adult education college is motivated by a wish to question this assumption. Findings from observational and interview data suggest the practices of two teachers were largely consistent with their methodological principles, relating to stronger and weaker forms of CLT respectively, as well as to more general educational principles, such as a concern for learners; the supportive environment seemed to help. The third teacher appeared to put “difficult” contextual factors, for example, tests, ahead of methodological principles without, however, obviously benefiting. Implications highlight the important role of teacher cognition research in challenging cultural assumptions.

  12. Size Matters: The Effect of the Scramble for Africa on Informal Institutions and Development

    OpenAIRE

    Dimico, Arcangelo

    2013-01-01

    We argue that the partition of ethnic groups following the Scramble for Africa does not itself matter for development in Africa. It matters only when the partitioned groups are relatively small because small groups lack political representation which may promote ethnic mobilization and foster support for informal (rather than formal) institutions which then may affect development. Furthermore, the analysis of data from the Afrobarometer shows that the persistence of informal/tribal institutio...

  13. A Novel Type of Oil-Generating Organic Matter: Crystal-Enclosed Organic Matter

    Institute of Scientific and Technical Information of China (English)

    Zhou Zhongyi; Pei Cunmin; et al.

    1992-01-01

    The comparative study of organic matter in carbonate rocks and argillaceous rocks from the same horizon indicates that the organic thermal maturities of carbonate rocks are much lower than those of argillaceous rocks. An extensive analysis of extracted and enclosed organic matter from the same sample shows that enclosed organic matter is different from extracted organic matter, and that the thermal maturity of the former is usually lower than that of the latter in terms of biomarker structural parameters. It seems that carbonate minerals can preserve organic matter and retard organic maturation. The enclosed organic matter, abundant in most carbonate rocks, will be released from the minerals and transformed into oil and gas during the high-thermal-maturity stage.

  14. Drift design methodology and preliminary application for the Yucca Mountain Site Characterization Project

    International Nuclear Information System (INIS)

    Hardy, M.P.; Bauer, S.J.

    1991-12-01

    Excavation stability in an underground nuclear waste repository is required during the construction, emplacement, retrieval (if required), and closure phases to ensure worker health and safety, and to prevent the development of potential pathways for radionuclide migration in the post-closure period. Stable excavations are developed by appropriate excavation procedures, design of the room shape, design and installation of rock support and reinforcement systems, and implementation of appropriate monitoring and maintenance programs. In addition to the loads imposed by the in situ stress field, the repository drifts will be impacted by thermal loads developed after waste emplacement and, periodically, by seismic loads from naturally occurring earthquakes and underground nuclear events. A priori evaluation of stability is required for design of the ground support system, to confirm that the thermal loads are reasonable, and to support the license application process. In this report, a design methodology for assessing drift stability is presented, based on site conditions together with empirical and analytical methods. Analytical numerical methods are emphasized at this time because empirical data are unavailable for excavations in welded tuff either at elevated temperatures or under seismic loads. The analytical methodology incorporates analysis of rock masses that are systematically jointed, randomly jointed, and sparsely jointed. In situ, thermal, and seismic loads are considered. Methods of evaluating the analytical results and estimating ground support requirements for the full range of expected ground conditions are outlined. The results of a preliminary application of the methodology using the limited available data are presented. 26 figs., 55 tabs

  15. Collapsed Dark Matter Structures

    Science.gov (United States)

    Buckley, Matthew R.; DiFranzo, Anthony

    2018-02-01

    The distributions of dark matter and baryons in the Universe are known to be very different: The dark matter resides in extended halos, while a significant fraction of the baryons have radiated away much of their initial energy and fallen deep into the potential wells. This difference in morphology leads to the widely held conclusion that dark matter cannot cool and collapse on any scale. We revisit this assumption and show that a simple model where dark matter is charged under a "dark electromagnetism" can allow dark matter to form gravitationally collapsed objects with characteristic mass scales much smaller than that of a Milky-Way-type galaxy. Though the majority of the dark matter in spiral galaxies would remain in the halo, such a model opens the possibility that galaxies and their associated dark matter play host to a significant number of collapsed substructures. The observational signatures of such structures are not well explored but potentially interesting.

  17. Multi-technical approach to characterize the dissolved organic matter from clay-stone

    International Nuclear Information System (INIS)

    Blanchart, Pascale; Michels, Raymond; Faure, Pierre; Parant, Stephane; Bruggeman, Christophe; De Craen, Mieke

    2012-01-01

Document available in extended abstract form only. Currently, different clay formations (Boom Clay, Callovo-Oxfordian argillites, Opalinus Clay, Toarcian shales...) are studied as reference host rocks for methodological studies on the geological disposal of high-level and long-lived radioactive waste. While a significant effort is being made on the characterization of the mineral composition and the reactivity of the clays as barriers, the occurrence of organic matter, even in low proportions, cannot be neglected. The organic matter appears as gas (C1-C4, as identified in the Bure underground facilities), as solid (kerogen), as hydrocarbon liquids (free hydrocarbons within the kerogen or adsorbed on minerals), as well as in the aqueous phase (Dissolved Organic Matter - DOM). DOM raises specific interest, as it may have complexation properties towards metals and rare earth elements and is potentially mobile. Therefore, it is important to characterize the DOM as part of a study of the feasibility of geological disposal. In this study, four host rocks were studied: - The Callovo-Oxfordian shales of the Bure Underground Research Laboratory (Meuse, France); - The Opalinus Clay of the Mont Terri Underground Research Laboratory (Switzerland); - The Toarcian shales of Tournemire (Aveyron, France); - The Boom Clay formation studied in the HADES Underground Research Laboratory (Mol, Belgium). Organic matter characteristics vary between formations in terms of (i) origin (mainly marine type II; mixtures of marine type II and higher-plant type III organic matter, often poorly preserved), (ii) TOC contents, and (iii) thermal maturity (for instance, the Opalinus Clay and Toarcian shales are more mature and have lower oxygen contents compared to the Callovo-Oxfordian shales and Boom Clay). These differences in organic matter quality may influence the quantity and the quality of DOM. The DOM of the rocks was isolated by Soxhlet extraction using pure water. A quantitative and qualitative multi

  18. Soil organic matter

    International Nuclear Information System (INIS)

    1976-01-01

    The nature, content and behaviour of the organic matter, or humus, in soil are factors of fundamental importance for soil productivity and the development of optimum conditions for growth of crops under diverse temperate, tropical and arid climatic conditions. In the recent symposium on soil organic matter studies - as in the two preceding ones in 1963 and 1969 - due consideration was given to studies involving the use of radioactive and stable isotopes. However, the latest symposium was a departure from previous efforts in that non-isotopic approaches to research on soil organic matter were included. A number of papers dealt with the behaviour and functions of organic matter and suggested improved management practices, the use of which would contribute to increasing agricultural production. Other papers discussed the turnover of plant residues, the release of plant nutrients through the biodegradation of organic compounds, the nitrogen economy and the dynamics of transformation of organic forms of nitrogen. In addition, consideration was given to studies on the biochemical transformation of organic matter, characterization of humic acids, carbon-14 dating and the development of modern techniques and their impact on soil organic matter research

  19. Computational intelligence as a platform for data collection methodology in management science

    DEFF Research Database (Denmark)

    Jespersen, Kristina Risom

    2006-01-01

With the increased focus in management science on how to collect data close to the real world of managers, agent-based simulations have interesting prospects that are usable for the design of business applications aimed at the collection of data. As a new generation of data collection...... methodologies this chapter discusses and presents a behavioral simulation founded in the agent-based simulation life cycle and supported by Web technology. With agent-based modeling the complexity of the method is increased without limiting the research due to the technological support, because this makes...... it possible to exploit the advantages of a questionnaire, an experimental design, a role-play and a scenario, as such gaining the synergy effect of these methodologies. At the end of the chapter an example of a simulation is presented for researchers and practitioners to study....

  20. Effective description of dark matter self-interactions in small dark matter haloes

    International Nuclear Information System (INIS)

    Kummer, Janis

    2017-07-01

Self-interacting dark matter may have striking astrophysical signatures, such as observable offsets between galaxies and dark matter in merging galaxy clusters. Numerical N-body simulations used to predict such observables typically treat the galaxies as collisionless test particles, a questionable assumption given that each galaxy is embedded in its own dark matter halo. To enable a more accurate treatment we develop an effective description of small dark matter haloes taking into account the two major effects due to dark matter self-scatterings: deceleration and evaporation. We point out that self-scatterings can have a sizeable impact on the trajectories of galaxies, diminishing the separation between galaxies and dark matter in merging clusters. This effect depends sensitively on the underlying particle physics, in particular the angular dependence of the self-scattering cross section, and cannot be predicted from the momentum transfer cross section alone.

  1. Gray Matter Alterations in Adults with Attention-Deficit/Hyperactivity Disorder Identified by Voxel Based Morphometry

    Science.gov (United States)

    Seidman, Larry J.; Biederman, Joseph; Liang, Lichen; Valera, Eve M.; Monuteaux, Michael C.; Brown, Ariel; Kaiser, Jonathan; Spencer, Thomas; Faraone, Stephen V.; Makris, Nikos

    2014-01-01

Background Gray and white matter volume deficits have been reported in many structural magnetic resonance imaging (MRI) studies of children with attention-deficit/hyperactivity disorder (ADHD); however, there is a paucity of structural MRI studies of adults with ADHD. This study used voxel based morphometry and applied an a priori region of interest approach based on our previous work, as well as from well-developed neuroanatomical theories of ADHD. Methods Seventy-four adults with DSM-IV ADHD and 54 healthy control subjects comparable on age, sex, race, handedness, IQ, reading achievement, frequency of learning disabilities, and whole brain volume had an MRI on a 1.5T Siemens scanner. A priori region of interest hypotheses focused on reduced volumes in ADHD in dorsolateral prefrontal cortex, anterior cingulate cortex, caudate, putamen, inferior parietal lobule, and cerebellum. Analyses were carried out by FSL-VBM 1.1. Results Relative to control subjects, ADHD adults had significantly smaller gray matter volumes in parts of six of these regions at p ≤ .01, whereas parts of the dorsolateral prefrontal cortex and inferior parietal lobule were significantly larger in ADHD at this threshold. However, a number of other regions were also smaller or larger in ADHD (especially fronto-orbital cortex) at this threshold. Only the caudate remained significantly smaller at the family-wise error rate. Conclusions Adults with ADHD have subtle volume reductions in the caudate and possibly other brain regions involved in attention and executive control, supporting frontostriatal models of ADHD. Modest group brain volume differences are discussed in the context of the nature of the samples studied and voxel based morphometry methodology. PMID:21183160

  2. Phase transition from nuclear matter to color superconducting quark matter

    Energy Technology Data Exchange (ETDEWEB)

    Bentz, W. E-mail: bentz@keyaki.cc.u-tokai.ac.jp; Horikawa, T.; Ishii, N.; Thomas, A.W

    2003-06-02

    We construct the nuclear and quark matter equations of state at zero temperature in an effective quark theory (the Nambu-Jona-Lasinio model), and discuss the phase transition between them. The nuclear matter equation of state is based on the quark-diquark description of the single nucleon, while the quark matter equation of state includes the effects of scalar diquark condensation (color superconductivity). The effect of diquark condensation on the phase transition is discussed in detail.

  3. Dark matter and cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Schramm, D.N.

    1992-03-01

The cosmological dark matter problem is reviewed. The Big Bang Nucleosynthesis constraints on the baryon density are compared with the densities implied by visible matter, dark halos, dynamics of clusters, gravitational lenses, large-scale velocity flows, and the {Omega} = 1 flatness/inflation argument. It is shown that (1) the majority of baryons are dark; and (2) non-baryonic dark matter is probably required on large scales. It is also noted that halo dark matter could be either baryonic or non-baryonic. Discrimination between "cold" and "hot" non-baryonic candidates is shown to depend on the assumed "seeds" that stimulate structure formation. Gaussian density fluctuations, such as those induced by quantum fluctuations, favor cold dark matter, whereas topological defects such as strings, textures or domain walls may work equally or better with hot dark matter. A possible connection between cold dark matter, globular cluster ages and the Hubble constant is mentioned. Recent large-scale structure measurements, coupled with microwave anisotropy limits, are shown to raise some questions for the previously favored density fluctuation picture. Accelerator and underground limits on dark matter candidates are also reviewed.

  5. Dark matter and cosmology

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1992-03-01

The cosmological dark matter problem is reviewed. The Big Bang Nucleosynthesis constraints on the baryon density are compared with the densities implied by visible matter, dark halos, dynamics of clusters, gravitational lenses, large-scale velocity flows, and the Ω = 1 flatness/inflation argument. It is shown that (1) the majority of baryons are dark; and (2) non-baryonic dark matter is probably required on large scales. It is also noted that halo dark matter could be either baryonic or non-baryonic. Discrimination between "cold" and "hot" non-baryonic candidates is shown to depend on the assumed "seeds" that stimulate structure formation. Gaussian density fluctuations, such as those induced by quantum fluctuations, favor cold dark matter, whereas topological defects such as strings, textures or domain walls may work equally or better with hot dark matter. A possible connection between cold dark matter, globular cluster ages and the Hubble constant is mentioned. Recent large-scale structure measurements, coupled with microwave anisotropy limits, are shown to raise some questions for the previously favored density fluctuation picture. Accelerator and underground limits on dark matter candidates are also reviewed

  6. Methodological individualism in experimental games: not so easily dismissed.

    Science.gov (United States)

    Krueger, Joachim I

    2008-06-01

    Orthodox game theory and social preference models cannot explain why people cooperate in many experimental games or how they manage to coordinate their choices. The theory of evidential decision making provides a solution, based on the idea that people tend to project their own choices onto others, whatever these choices might be. Evidential decision making preserves methodological individualism, and it works without recourse to social preferences. Rejecting methodological individualism, team reasoning is a thinly disguised resurgence of the group mind fallacy, and the experiments reported by Colman et al. [Colman, A. M., Pulford, B. D., & Rose, J. (this issue). Collective rationality in interactive decisions: Evidence for team reasoning. Acta Psychologica, doi:10.1016/j.actpsy.2007.08.003.] do not offer evidence that uniquely supports team reasoning.
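The projection logic behind evidential decision making can be made concrete with a toy one-shot Prisoner's Dilemma. The payoff values and the projection probability below are illustrative choices, not figures taken from the papers under discussion; a minimal sketch:

```python
# Evidential expected utility in a one-shot Prisoner's Dilemma.
# Under evidential reasoning, a player treats her own choice as
# diagnostic of the partner's choice with probability p (projection).
# Payoffs and p are illustrative, not values from the cited studies.

R, S, T, P = 3, 0, 5, 1  # reward, sucker, temptation, punishment

def evidential_eu(p):
    """Return (EU of cooperating, EU of defecting) when one's own
    choice is projected onto the partner with probability p."""
    eu_coop = p * R + (1 - p) * S    # partner mirrors me with prob. p
    eu_defect = p * P + (1 - p) * T
    return eu_coop, eu_defect

# Strong projection makes cooperation the higher-EU choice
# (here, whenever p > (T - S) / (R - S - P + T) = 5/7):
coop, defect = evidential_eu(0.8)
print(coop > defect)  # cooperation wins at p = 0.8
```

Note that no social preference enters: the agent maximizes her own expected payoff, which is how this account preserves methodological individualism.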

  7. Proceedings of workshop on dark matter and the structure of the universe

    International Nuclear Information System (INIS)

    Sasaki, Misao

    1989-10-01

    The workshop on 'Dark matter and the structure of the universe' was held from January 29 to February 1, 1989 at the Research Institute for Theoretical Physics, Hiroshima University. It aimed at clarifying the basic theoretical problems of the dark matter and the structure of the universe, and gaining inspiration on the direction of future research. In the first half of the workshop, the observed data on the large scale structure were critically reviewed, and some new ideas and theoretical frameworks which relate the actual cosmological structure to the observable quantities were presented. In the second half of the workshop, the various possible matters being proposed for the dark matter were examined in the light of both observed (or experimental) data and theoretical predictions. The speakers in the workshop gave well prepared, stimulative talks, and made it possible for the participants to have fruitful and constructive discussions. The workshop was supported partially by the Grant in Aid for Scientific Research, Ministry of Education, and by the Research Institute for Theoretical Physics, Hiroshima University. In this report, eight presentations on observational and theoretical cosmology and ten on dark matter and galaxy formation are collected. (K.I.)

  8. Comparative study of researcher community support and supervisory support among Finnish and Danish PhD-students

    DEFF Research Database (Denmark)

    Cornér, Solveig; Pyhältö, Kirsi; Peltonen, Jouni

Prior research on doctoral supervision and researcher communities has identified social support as a key determinant of the doctoral journey (Jairam & Kahl, 2012; Zhao, Golde & McCormick, 2007). Supervisory support, for instance, in terms of constructive feedback and encouragement (Pyhältö......, 313-329. • Pyhältö, K., Vekkaila (o.s. Tuomainen), J., & Keskinen, J. (2015). Fit matters in the supervisory relationship: Doctoral students’ and supervisors’ perceptions about supervisory activities. Innovations in Education and Teaching International, 52(1), 4-16. • Zhao, C-M, Golde, C.M., McCormick...

  9. EDITORIAL: Nobel Symposium 148: Graphene and Quantum Matter Nobel Symposium 148: Graphene and Quantum Matter

    Science.gov (United States)

    Niemi, Antti; Wilczek, Frank; Ardonne, Eddy; Hansson, Hans

    2012-01-01

The 2010 Nobel Symposium on Graphene and Quantum Matter was held at the Grand Hotel in Saltsjöbaden south of Stockholm on 27-31 May. The main theme of the meeting was graphene, and the symposium turned out to be very timely: two of the participants, Andre Geim and Konstantin Novoselov, returned to Stockholm less than six months later to receive the 2010 Nobel Prize in Physics. In these proceedings leading experts give up-to-date, historical, experimental, theoretical and technological perspectives on the remarkable material graphene, and several papers also make connections to other states of quantum matter. Saltsjöbaden is beautifully situated in the inner archipelago of Stockholm. It provided a pleasant setting for the talks and the ensuing discussions that took place in an enthusiastic and friendly atmosphere. The social programme included a boat trip in the light summer night and a dinner at the renowned Grand Hotel. These proceedings are ordered thematically, starting with historical overviews, followed by first experimental and then theoretical papers on the physics of graphene. Next are several papers addressing more general topics in quantum matter and finally contributions on the technological applications of graphene. We hope that this volume will serve as a source of knowledge and inspiration for any physicist interested in graphene, and at the same time provide a snapshot of a young field of research that is developing at very high speed. We are grateful to Marja Fahlander for excellent administrative support, and to the Nobel Foundation who funded the symposium.

  10. A Systematic Review of Unmet Information and Psychosocial Support Needs of Adults Diagnosed with Thyroid Cancer.

    Science.gov (United States)

    Hyun, Yong Gyu; Alhashemi, Ahmad; Fazelzad, Rouhi; Goldberg, Alyse S; Goldstein, David P; Sawka, Anna M

    2016-09-01

    Patient education and psychosocial support to patients are important elements of comprehensive cancer care, but the needs of thyroid cancer survivors are not well understood. The published English-language quantitative literature on (i) unmet medical information and (ii) psychosocial support needs of thyroid cancer survivors was systematically reviewed. A librarian information specialist searched seven electronic databases and a hand search was conducted. Two reviewers independently screened citations from the electronic search and reviewed relevant full-text papers. There was consensus between reviewers on the included papers, and duplicate independent abstraction was performed. The results were summarized descriptively. A total of 1984 unique electronic citations were screened, and 51 full-text studies were reviewed (three from the hand search). Seven cross-sectional, single-arm, survey studies were included, containing data from 6215 thyroid cancer survivor respondents. The respective study sizes ranged from 57 to 2398 subjects. All of the studies had some methodological limitations. Unmet information needs were variable relating to the disease, diagnostic tests, treatments, and co-ordination of medical care. There were relatively high unmet information needs related to aftercare (especially long-term effects of the disease or its treatment and its management) and psychosocial concerns (including practical and financial matters). Psychosocial support needs were incompletely met. Patient information on complementary and alternative medicine was very limited. In conclusion, thyroid cancer survivors perceive many unmet information needs, and these needs extend to aftercare. Psychosocial information and supportive care needs may be insufficiently met in this population. More work is needed to improve knowledge translation and psychosocial support for thyroid cancer survivors.

  11. mHealth in psychiatry: time for methodological change.

    Science.gov (United States)

    Nicholas, Jennifer; Boydell, Katherine; Christensen, Helen

    2016-05-01

    A multitude of mental health apps are available to consumers through the Apple and Google app stores. However, evidence supporting the effectiveness of mHealth is scant. We argue this gap between app availability and research evidence is primarily due to unsuitable knowledge translation practices and therefore suggest abandoning the randomised controlled trial as the primary app evaluation paradigm. Alternative evaluation methodologies such as iterative participatory research and single case designs are better aligned with mHealth translational needs. A further challenge to the use of mobile technology in mental health is the dissemination of information about app quality to consumers. Strategies to facilitate successful dissemination of quality resources must consider several factors, such as target audience and context. In practice, structured solutions to inform consumers of evidence-informed apps could range from the development of consumer used tools to app accreditation portals. Consumer enthusiasm for apps represents an opportunity to increase access and support for psychiatric populations. However, adoption of alternative research methodologies and the development of dissemination strategies are vital before this opportunity can be substantially seized. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  12. The matter-ekpyrotic bounce scenario in Loop Quantum Cosmology

    Science.gov (United States)

    Haro, Jaume; Amorós, Jaume; Aresté Saló, Llibert

    2017-09-01

We will perform a detailed study of the matter-ekpyrotic bouncing scenario in Loop Quantum Cosmology using the methods of the dynamical systems theory. We will show that when the background is driven by a single scalar field, at very late times, in the contracting phase, all orbits depict a matter dominated Universe, which evolves to an ekpyrotic phase. After the bounce the Universe enters the expanding phase, where the orbits leave the ekpyrotic regime going to a kination (also named deflationary) regime. Moreover, this scenario supports the production of heavy massive particles conformally coupled with gravity, which reheats the universe at temperatures compatible with the nucleosynthesis bounds, and also the production of massless particles non-conformally coupled with gravity, leading to very high reheating temperatures but ensuring the nucleosynthesis success. Dealing with cosmological perturbations, these background dynamics produce a nearly scale invariant power spectrum for the modes that leave the Hubble radius, in the contracting phase, when the Universe is quasi-matter dominated, whose spectral index and corresponding running is compatible with the recent observational data obtained by the Planck team.
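The near scale invariance of modes exiting the Hubble radius during matter-dominated contraction follows from a standard matter-bounce argument (a general result of that literature, not a derivation specific to this paper):

```latex
% Matter-dominated contraction, written in conformal time \tau:
a(t)\propto(-t)^{2/3}
  \;\Longrightarrow\; a(\tau)\propto\tau^{2},
  \qquad \frac{z''}{z}=\frac{a''}{a}=\frac{2}{\tau^{2}},
% the same effective potential as in de Sitter space. The
% Mukhanov-Sasaki mode equation and its solution are
v_k''+\Bigl(k^{2}-\frac{2}{\tau^{2}}\Bigr)v_k=0,
  \qquad
  v_k=\frac{e^{-ik\tau}}{\sqrt{2k}}\Bigl(1-\frac{i}{k\tau}\Bigr),
% so on super-Hubble scales (|k\tau|\ll 1),
% |v_k|\propto k^{-3/2}|\tau|^{-1}, and
\mathcal{P}_{\zeta}(k)
  =\frac{k^{3}}{2\pi^{2}}\Bigl|\frac{v_k}{z}\Bigr|^{2}
  \propto k^{0},
% i.e. n_s = 1, with a small tilt once the contraction is only
% quasi-matter dominated (equation of state slightly away from w = 0).
```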

  13. Cigarette smoking is associated with reduced microstructural integrity of cerebral white matter.

    Science.gov (United States)

    Gons, Rob A R; van Norden, Anouk G W; de Laat, Karlijn F; van Oudheusden, Lucas J B; van Uden, Inge W M; Zwiers, Marcel P; Norris, David G; de Leeuw, Frank-Erik

    2011-07-01

The results indicate that smoking affects the microstructural integrity of cerebral white matter and support previous data that smoking is associated with impaired cognition. Importantly, they suggest that quitting smoking may reverse the impaired structural integrity.

  14. Secretly asymmetric dark matter

    Science.gov (United States)

    Agrawal, Prateek; Kilic, Can; Swaminathan, Sivaramakrishnan; Trendafilova, Cynthia

    2017-01-01

    We study a mechanism where the dark matter number density today arises from asymmetries generated in the dark sector in the early Universe, even though the total dark matter number remains zero throughout the history of the Universe. The dark matter population today can be completely symmetric, with annihilation rates above those expected from thermal weakly interacting massive particles. We give a simple example of this mechanism using a benchmark model of flavored dark matter. We discuss the experimental signatures of this setup, which arise mainly from the sector that annihilates the symmetric component of dark matter.

  15. Right hemisphere grey matter structure and language outcomes in chronic left hemisphere stroke

    Science.gov (United States)

    Xing, Shihui; Lacey, Elizabeth H.; Skipper-Kallal, Laura M.; Jiang, Xiong; Harris-Love, Michelle L.; Zeng, Jinsheng

    2016-01-01

    The neural mechanisms underlying recovery of language after left hemisphere stroke remain elusive. Although older evidence suggested that right hemisphere language homologues compensate for damage in left hemisphere language areas, the current prevailing theory suggests that right hemisphere engagement is ineffective or even maladaptive. Using a novel combination of support vector regression-based lesion-symptom mapping and voxel-based morphometry, we aimed to determine whether local grey matter volume in the right hemisphere independently contributes to aphasia outcomes after chronic left hemisphere stroke. Thirty-two left hemisphere stroke survivors with aphasia underwent language assessment with the Western Aphasia Battery-Revised and tests of other cognitive domains. High-resolution T1-weighted images were obtained in aphasia patients and 30 demographically matched healthy controls. Support vector regression-based multivariate lesion-symptom mapping was used to identify critical language areas in the left hemisphere and then to quantify each stroke survivor’s lesion burden in these areas. After controlling for these direct effects of the stroke on language, voxel-based morphometry was then used to determine whether local grey matter volumes in the right hemisphere explained additional variance in language outcomes. In brain areas in which grey matter volumes related to language outcomes, we then compared grey matter volumes in patients and healthy controls to assess post-stroke plasticity. Lesion–symptom mapping showed that specific left hemisphere regions related to different language abilities. After controlling for lesion burden in these areas, lesion size, and demographic factors, grey matter volumes in parts of the right temporoparietal cortex positively related to spontaneous speech, naming, and repetition scores. Examining whether domain general cognitive functions might explain these relationships, partial correlations demonstrated that grey matter
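The lesion-symptom mapping step can be illustrated with a deliberately simplified stand-in: a mass-univariate map built from point-biserial correlations between binary lesion status and a behavioural score, rather than the multivariate support vector regression the study actually used. All data and dimensions below are synthetic:

```python
# Simplified mass-univariate lesion-symptom mapping (didactic stand-in
# for SVR-based multivariate mapping). For each voxel, correlate binary
# lesion status across patients with a behavioural score; strongly
# negative values flag voxels whose damage tracks worse performance.
import math

def lesion_symptom_map(lesions, scores):
    """lesions: per-patient binary voxel vectors (list of lists);
    scores: per-patient language scores. Returns one point-biserial
    correlation per voxel."""
    n = len(scores)
    n_vox = len(lesions[0])
    mean_s = sum(scores) / n
    sd_s = math.sqrt(sum((s - mean_s) ** 2 for s in scores) / n)
    out = []
    for v in range(n_vox):
        x = [patient[v] for patient in lesions]
        mean_x = sum(x) / n
        sd_x = math.sqrt(sum((xi - mean_x) ** 2 for xi in x) / n)
        if sd_x == 0 or sd_s == 0:
            out.append(0.0)  # voxel lesioned in none or all patients
            continue
        cov = sum((xi - mean_x) * (si - mean_s)
                  for xi, si in zip(x, scores)) / n
        out.append(cov / (sd_x * sd_s))
    return out

# Synthetic 4-patient, 2-voxel example: voxel 0 is lesioned only in the
# low scorers, voxel 1 is unrelated to the scores.
lesions = [[1, 0], [1, 1], [0, 0], [0, 1]]
scores = [40, 45, 90, 85]
print(lesion_symptom_map(lesions, scores))  # voxel 0 strongly negative
```

In the study itself the resulting lesion burden in critical left hemisphere regions is then regressed out before asking whether right hemisphere grey matter volume explains additional variance.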

  16. AEGIS methodology and a perspective from AEGIS methodology demonstrations

    International Nuclear Information System (INIS)

    Dove, F.H.

    1981-03-01

    Objectives of AEGIS (Assessment of Effectiveness of Geologic Isolation Systems) are to develop the capabilities needed to assess the post-closure safety of waste isolation in geologic formation; demonstrate these capabilities on reference sites; apply the assessment methodology to assist the NWTS program in site selection, waste package and repository design; and perform repository site analyses for the licensing needs of NWTS. This paper summarizes the AEGIS methodology, the experience gained from methodology demonstrations, and provides an overview in the following areas: estimation of the response of a repository to perturbing geologic and hydrologic events; estimation of the transport of radionuclides from a repository to man; and assessment of uncertainties

  17. Testing of tunnel support : dynamic load testing of rockbolt elements to provide data for safer support design.

    CSIR Research Space (South Africa)

    Ortlepp, WD

    1998-06-01

Full Text Available This research report discusses the development of a realistic and controllable method of testing support tendons dynamically. This method, achieved in the research project, offers a new and fresh opportunity for improving the design methodology...

  18. Qualitative interviewing: methodological challenges in Arab settings.

    Science.gov (United States)

    Hawamdeh, Sana; Raigangar, Veena

    2014-01-01

    To explore some of the main methodological challenges faced by interviewers in Arab settings, particularly during interviews with psychiatric nurses. Interviews are a tool used commonly in qualitative research. However, the cultural norms and practices of interviewees must be considered to ensure that an appropriate interviewing style is used, a good interviewee-interviewer relationship formed and consent for participation obtained sensitively. A study to explore the nature of psychiatric nurses' practices that used unstructured interviews. This is a methodology paper that discusses a personal experience of addressing many challenges that are specific to qualitative interviewing in Arab settings, supported by literature on the topic. Suggestions for improving the interview process to make it more culturally sensitive are provided and recommendations for future research are made. Openness, flexibility and a reflexive approach by the researcher can help manage challenges in Arab settings. Researchers should allow themselves to understand the cultural elements of a population to adapt interviewing methods with the aim of generating high quality qualitative research.

  19. Audit calculation for the LOCA methodology for KSNP

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Un Chul; Park, Chang Hwan; Choi, Yong Won; Yoo, Jun Soo [Seoul National Univ., Seoul (Korea, Republic of)

    2006-11-15

The objective of this research is to perform the regulatory audit calculation for the LOCA methodology for KSNP. For the LBLOCA calculation, several uncertainty variables and new ranges for them are added to those of the previous KINS-REM to improve the applicability of KINS-REM to KSNP LOCA, and those results are applied to the LBLOCA audit calculation by a statistical method. For the SBLOCA calculation, after selecting BATHSY9.1.b, which is not used by KHNP, the results of RELAP5/MOD3.3 and RELAP5/MOD3.3ef-sEM for KSNP SBLOCA are compared to evaluate the conservativeness and applicability of the RELAP5/MOD3.3ef-sEM code for KSNP SBLOCA. The results of this research can be used to support the activities of KINS in reviewing the LOCA methodology for KSNP proposed by KHNP.

  20. Methodological proposal for the definition of improvement strategies in logistics of SME

    Directory of Open Access Journals (Sweden)

    Yeimy Liseth Becerra

    2014-12-01

Full Text Available A methodological proposal for defining improvement strategies in the logistics of SMEs is presented as a means to fulfill a specific objective of the project Methodological design on storage logistics, acquisition, and appropriation of information and communication systems for Colombian SMEs, bakery subsector, which is currently run by the research group SEPRO of the Universidad Nacional de Colombia and supported by Colciencias. The work corresponds to the completion of the last stage of the base project and aims to implement the corresponding objective raised in the research project that the SEPRO group has been developing. To do this, a review was made of the methodology used during the execution of the base project, as well as of the state of the art of techniques used in similar research for the evaluation and definition of improvement strategies in SME logistics. The reviewed techniques were compared and a proposed methodology was configured, consisting of the techniques that represented the greatest advantages for the development of the research.

  1. Projecting labor demand and worker immigration at nuclear power plant construction sites: an evaluation of methodology

    International Nuclear Information System (INIS)

    Herzog, H.W. Jr; Schlottmann, A.M.; Schriver, W.R.

    1981-12-01

The study evaluates methodology employed for the projection of labor demand at, and worker migration to, nuclear power plant construction sites. In addition, suggestions are offered as to how this projection methodology might be improved. The study focuses on projection methodologies which forecast either construction worker migration or labor requirements of alternative types of construction activity. Suggested methodological improvements relate both to institutional factors within the nuclear power plant construction industry and to a better use of craft-specific data on construction worker demand/supply. In addition, the timeliness and availability of the regional occupational data required to support or implement these suggestions are examined.

  2. IAEA support for operating nuclear reactors

    International Nuclear Information System (INIS)

    Akira, O.

    2010-01-01

    The IAEA programme, under the pillar of science and technology, provides support to the existing fleet of nuclear power plants (NPPs) for excellence in operation, support to new countries for infrastructure development, stimulating technology innovation for sustainable development and building national capability. Practical activities include methodology development, information sharing and providing guidance documents and state-of-the-art reports, networking of research activities, and review services using guidance documents as a basis of evaluation. This paper elaborates more on the IAEA's activities in support of the existing fleet of nuclear power plants

  3. Methodology for Monitoring Sustainable Development of Isolated Microgrids in Rural Communities

    Directory of Open Access Journals (Sweden)

    Claudia Rahmann

    2016-11-01

Full Text Available Microgrids are a rapidly evolving and increasingly common form of local power generation used to serve the needs of both rural and urban communities. In this paper, we present a methodology to evaluate the evolution of the sustainability of stand-alone microgrid projects. The proposed methodology considers a composite sustainability index (CSI) that includes both positive and negative impacts of the operation of the microgrid in a given community. The CSI is constructed along the environmental, social, economic and technical dimensions of the microgrid. The sub-indexes of each dimension are aggregated into the CSI via a set of adaptive weighting factors, which indicate the relative importance of the corresponding dimension in the sustainability goals. The proposed methodology aims to be a support instrument for policy makers, especially when defining sound corrective measures to guarantee the sustainability of small, isolated microgrid projects. To validate the performance of the proposed methodology, a microgrid installed in the northern part of Chile (Huatacondo) has been used as a benchmark project.
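The abstract does not specify the aggregation rule beyond "adaptive weighting factors"; a minimal linear-weighting sketch, assuming sub-indexes already normalized to a common 0-1 scale (the dimension values and weights below are invented), might look like:

```python
# Hypothetical sketch of a composite sustainability index (CSI):
# per-dimension sub-indexes are aggregated with adaptive weights.
def composite_sustainability_index(sub_indexes, weights):
    """Weighted aggregation of dimension sub-indexes (all on a 0-1 scale).

    sub_indexes, weights: dicts keyed by dimension name; weights sum to 1.
    """
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[d] * sub_indexes[d] for d in sub_indexes)

# invented example values for the four dimensions named in the abstract
sub_indexes = {"environmental": 0.8, "social": 0.6, "economic": 0.5, "technical": 0.9}
weights = {"environmental": 0.25, "social": 0.25, "economic": 0.25, "technical": 0.25}
csi = composite_sustainability_index(sub_indexes, weights)
print(round(csi, 3))  # 0.7
```

Adaptive weighting would then amount to updating the `weights` dict over time as policy priorities shift, leaving the aggregation itself unchanged.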

  4. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a proximate analysis, and a correlation between the mean values and maximum sampling errors of the methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
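A minimal sketch of the kind of sampling-error calculation described, using a normal approximation and invented replicate measurements (the paper's actual estimator and sample sizes may differ):

```python
from math import sqrt
from statistics import mean, stdev

def max_sampling_error(samples, z=1.96):
    """Half-width of an approximate 95% confidence interval for the mean
    (normal approximation; a t-quantile would be used for very small n)."""
    return z * stdev(samples) / sqrt(len(samples))

# hypothetical replicate ash-content measurements (% dry basis)
ash = [4.1, 4.3, 3.9, 4.2, 4.0, 4.1, 4.2, 4.0]
m = mean(ash)
e = max_sampling_error(ash)
print(f"ash = {m:.2f} +/- {e:.2f} %")  # ash = 4.10 +/- 0.09 %
```

The same half-width computed for each property (moisture, volatiles, fixed carbon, ash) is what a sampling map would report per fuel batch.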

  5. Methodology for Automatic Ontology Generation Using Database Schema Information

    Directory of Open Access Journals (Sweden)

    JungHyen An

    2018-01-01

Full Text Available An ontology is a model language that supports functions to integrate conceptually distributed domain knowledge and to infer relationships among the concepts. Ontologies are developed based on the target domain knowledge; as a result, methodologies to automatically generate an ontology from metadata that characterize the domain knowledge are becoming important. However, existing methodologies for automatically generating an ontology from metadata require the domain metadata to be given in a predetermined template, and it is difficult to manage the growing data on the ontology itself when the domain OWL (Web Ontology Language) individuals increase continuously. The database schema captures features of the domain knowledge and provides structural functions to efficiently process the knowledge-based data. In this paper, we propose a methodology to automatically generate ontologies and manage OWL individuals through an interaction of the database and the ontology. We describe the automatic ontology generation process with an example schema and demonstrate the effectiveness of the automatically generated ontology by comparing it with existing ontologies using an ontology quality score.
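The abstract does not give the mapping rules; a common convention in schema-to-ontology work is tables to classes, columns to datatype properties, and foreign keys to object properties. A sketch under that assumption (the schema format, table names, and property-naming scheme are all hypothetical):

```python
# Hypothetical relational schema: table -> {columns, foreign keys}.
schema = {
    "Employee": {"columns": ["name", "salary"], "fks": {"dept_id": "Department"}},
    "Department": {"columns": ["title"], "fks": {}},
}

def schema_to_ontology(schema):
    """Map tables -> classes, columns -> datatype properties,
    foreign keys -> object properties (one common convention)."""
    onto = {"classes": [], "datatype_properties": [], "object_properties": []}
    for table, spec in schema.items():
        onto["classes"].append(table)
        for col in spec["columns"]:
            onto["datatype_properties"].append((f"{table}.{col}", table))
        for fk, target in spec["fks"].items():
            onto["object_properties"].append((f"has{target}", table, target))
    return onto

onto = schema_to_ontology(schema)
print(onto["classes"])            # ['Employee', 'Department']
print(onto["object_properties"])  # [('hasDepartment', 'Employee', 'Department')]
```

Row inserts in the database would then map to new OWL individuals of the corresponding class, which is the database-ontology interaction the abstract describes.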

  6. A Methodology to Institutionalise User Experience in Provincial Government

    Directory of Open Access Journals (Sweden)

    Marco Cobus Pretorius

    2014-12-01

Full Text Available Problems experienced with website usability can prevent users from accessing and adopting technology, such as e-Government. At present, a number of guidelines exist for e-Government website user experience (UX) design; however, the effectiveness of the implementation of these guidelines depends on the expertise of the website development team and on an organisation’s understanding of UX. Despite the highlighted importance of UX, guidelines are rarely applied in South African e-Government website designs. UX guidelines cannot be implemented if there is a lack of executive support, trained staff, budget and user-centred design processes. The goal of this research is to propose and evaluate a methodology (called the “Institutionalise UX in Government (IUXG) methodology”) to institutionalise UX in South African Provincial Governments (SAPGs). The Western Cape Government in South Africa was used as a case study to evaluate the proposed IUXG methodology. The results show that the IUXG methodology can assist SAPGs to establish UX as standard practice and improve the UX maturity levels.

  7. PREFACE: Strangeness in Quark Matter (SQM2009) Strangeness in Quark Matter (SQM2009)

    Science.gov (United States)

    Fraga, Eduardo; Kodama, Takeshi; Padula, Sandra; Takahashi, Jun

    2010-09-01

The 14th International Conference on Strangeness in Quark Matter (SQM2009) was held in Brazil from 27 September to 2 October 2009 at Hotel Atlântico, Búzios, Rio de Janeiro. The conference was jointly organized by Universidade Federal do Rio de Janeiro, Universidade Estadual de Campinas, Centro Brasileiro de Pesquisas Físicas, Universidade de São Paulo, Universidade Estadual Paulista and Universidade Federal do Rio Grande do Sul. Over 120 scientists from Argentina, Brazil, China, France, Germany, Hungary, Italy, Japan, Mexico, The Netherlands, Norway, Poland, Russia, Slovakia, South Africa, Switzerland, the UK and the USA gathered at the meeting to discuss the physics of hot and dense matter through the signals of strangeness and also the behavior of heavy quarks. The topics covered were strange and heavy quark production in nuclear collisions, strange and heavy quark production in elementary processes, bulk matter phenomena associated with strange and heavy quarks, and strangeness in astrophysics. In view of the LHC era and many other upcoming new machines, together with recent theoretical developments, sessions focused on 'New developments and new facilities' and 'Open questions' were also included. A stimulating round-table discussion on 'Physics opportunities in the next decade in the view of strangeness and heavy flavor in matter' was chaired in a relaxed atmosphere by Grazyna Odyniec and conducted by P Braun-Munzinger, W Florkowski, K Redlich, K Šafařík and H Stöcker. We thank these colleagues for pointing out to young participants new physics directions to be pursued. We also thank J Dunlop and K Redlich for excellent introductory lectures given on the Sunday evening pre-conference session. In spite of the not-so-helpful weather, the beauty and charm of the town of Búzios helped to make the meeting successful. Nevertheless, the most important contributions were the excellent talks, whose contents are part of these proceedings, given

  8. White Matter Tracts Connected to the Medial Temporal Lobe Support the Development of Mnemonic Control.

    Science.gov (United States)

    Wendelken, Carter; Lee, Joshua K; Pospisil, Jacqueline; Sastre, Marcos; Ross, Julia M; Bunge, Silvia A; Ghetti, Simona

    2015-09-01

    One of the most important factors driving the development of memory during childhood is mnemonic control, or the capacity to initiate and maintain the processes that guide encoding and retrieval operations. The ability to selectively attend to and encode relevant stimuli is a particularly useful form of mnemonic control, and is one that undergoes marked improvement over childhood. We hypothesized that structural integrity of white matter tracts, in particular those connecting medial temporal lobe memory regions to other cortical areas, and/or those connecting frontal and parietal control regions, should contribute to successful mnemonic control. To test this hypothesis, we examined the relationship between structural integrity of selected white matter tracts and an experimental measure of mnemonic control, involving enhancement of memory by attention at encoding, in 116 children aged 7-11 and 25 young adults. We observed a positive relationship between integrity of uncinate fasciculus and mnemonic enhancement across age groups. In adults, but not in children, we also observed an association between mnemonic enhancement and integrity of ventral cingulum bundle and ventral fornix/fimbria. Integrity of fronto-parietal tracts, including dorsal cingulum and superior longitudinal fasciculus, was unrelated to mnemonic enhancement. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history is long and its academic achievements are substantial, it seems to attract less interest from the industrial and regulatory viewpoints. The authors expect this survey can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. Most of the methodologies share similar concepts; among them, DDET (Discrete Dynamic Event Tree) seems to be a backbone for most of them, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
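As an illustration of the DDET concept described above — scenarios branching at discrete points in time, with sequence probabilities that depend on the order of events — a toy sketch might look like this (the branch points and probabilities are hypothetical):

```python
# Illustrative discrete dynamic event tree (DDET) expansion: at each
# branching time, every scenario splits on a component's success/failure,
# so sequences carry their full time-ordered event history.
branch_points = [  # (time [s], event, P(failure)) -- invented values
    (10.0, "pump_start", 0.01),
    (60.0, "valve_open", 0.05),
]

def expand_ddet(branch_points):
    scenarios = [([], 1.0)]  # (event history, scenario probability)
    for t, event, p_fail in branch_points:
        nxt = []
        for history, prob in scenarios:
            nxt.append((history + [(t, event, "ok")], prob * (1 - p_fail)))
            nxt.append((history + [(t, event, "fail")], prob * p_fail))
        scenarios = nxt
    return scenarios

scenarios = expand_ddet(branch_points)
print(len(scenarios))                              # 4
print(round(sum(p for _, p in scenarios), 12))     # 1.0
```

A real DDET couples each branch to a deterministic plant simulation between branching times, which is what makes the approach both deterministic and stochastic.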

  10. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities for improved PSA are emerging by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned is dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history is long and its academic achievements are substantial, it seems to attract less interest from the industrial and regulatory viewpoints. The authors expect this survey can contribute to a better understanding of dynamic PSA in terms of algorithm, practice, and applicability. Most of the methodologies share similar concepts; among them, DDET (Discrete Dynamic Event Tree) seems to be a backbone for most of them, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  11. Brain Structural Effects of Psychopharmacological Treatment in Bipolar Disorder

    Science.gov (United States)

    McDonald, Colm

    2015-01-01

Bipolar disorder is associated with subtle neuroanatomical deficits including lateral ventricular enlargement, grey matter deficits incorporating limbic system structures, and distributed white matter pathophysiology. Substantial heterogeneity has been identified by structural neuroimaging studies to date, and differential psychotropic medication use is potentially a substantial contributor to this. This selective review of structural neuroimaging and diffusion tensor imaging studies considers evidence that lithium, mood stabilisers, antipsychotic medication and antidepressant medications are associated with neuroanatomical variation. Most studies are negative and suffer from methodological weaknesses in terms of directly assessing medication effects on neuroanatomy, since they commonly comprise post-hoc assessments of medication associations with neuroimaging metrics in small heterogeneous patient groups. However, the studies which report positive findings tend to form a relatively consistent picture whereby lithium and antiepileptic mood stabiliser use is associated with increased regional grey matter volume, especially in limbic structures. These findings are further supported by the more methodologically robust studies which include large numbers of patients or repeated intra-individual scanning in longitudinal designs. Some similar findings of an apparently ameliorative effect of lithium on white matter microstructure are also emerging. There is less support for an effect of antipsychotic or antidepressant medication on brain structure in bipolar disorder, but these studies are further limited by methodological difficulties. In general, the literature to date supports a normalising effect of lithium and mood stabilisers on brain structure in bipolar disorder, which is consistent with the neuroprotective characteristics of these medications identified by preclinical studies. PMID:26412064

  12. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    ... in the pharmaceutical industry, Clinical trial methodology emphasizes the importance of statistical thinking in clinical research and presents the methodology as a key component of clinical research...

  13. Diseases of white matter

    International Nuclear Information System (INIS)

    Holland, B.A.

    1987-01-01

    The diagnosis of white matter abnormalities was revolutionized by the advent of computed tomography (CT), which provided a noninvasive method of detection and assessment of progression of a variety of white matter processes. However, the inadequacies of CT were recognized early, including its relative insensitivity to small foci of abnormal myelin in the brain when correlated with autopsy findings and its inability to image directly white matter diseases of the spinal cord. Magnetic resonance imaging (MRI), on the other hand, sensitive to the slight difference in tissue composition of normal gray and white matter and to subtle increase in water content associated with myelin disorders, is uniquely suited for the examination of white matter pathology. Its clinical applications include the evaluation of the normal process of myelination in childhood and the various white matter diseases, including disorders of demyelination and dysmyelination

  14. Proposed systematic methodology for analysis of Pb-210 radioactivity in residues produced in Brazilian natural gas pipes

    International Nuclear Information System (INIS)

    Ferreira, Aloisio Cordilha

    2003-11-01

Since the 80's, the potential radiological hazards due to the handling of solid wastes contaminated with long-lived Rn-222 progeny - Pb-210 in particular - produced in gas pipes and removed by pigging operations have been a subject of growing concern outside our country. Nevertheless, little or no attention has been paid to this matter in Brazilian plants up to now, and these hazards are frequently underestimated or even ignored. The main purpose of this work was to propose a systematic methodology for the analysis of Pb-210 radioactivity in black-powder samples from some Brazilian plants, through the evaluation of the technical viability of direct Pb-210 gamma spectrometry and Bi-210 beta counting. In both cases, one in five samples of black powder analysed showed relevant activity (above 1 Bq/kg) of Pb-210, these results probably being related to particular features of each specific plant (production levels, reservoir geochemical profile, etc.), in such a way that no single pattern is observed. For the proposed methodology, gamma spectrometry proved to be the most reliable technique, showing a 3.5% standard deviation and, at a 95% confidence level, overall agreement with the range of Pb-210 activity concentration presented in the standard sample reference sheet provided by the IAEA for intercomparison purposes. In the Brazilian scene, however, the availability of statistically supported evidence is insufficient to allow the potential radiological hazard due to the management of black powder to be discarded. Thus, further research efforts are recommended in order to identify the potentially critical regions or plants where gas exploration, production and processing practices will require a regular program of radiological surveillance in the near future. (author)
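The acceptance logic implied by the abstract — agreement with an IAEA reference-sheet value at a 95% confidence level, given the reported 3.5% relative standard deviation — can be sketched as follows (the activity values are invented, and the normal-approximation test is an assumption, not the author's stated procedure):

```python
# Hypothetical intercomparison check: does a measured Pb-210 activity
# concentration agree with the reference value within z * sigma?
def within_reference(measured, reference, rel_sd=0.035, z=1.96):
    """True if |measured - reference| <= z * (rel_sd * measured)."""
    sigma = rel_sd * measured
    return abs(measured - reference) <= z * sigma

# invented activity concentrations in Bq/kg
print(within_reference(10.2, 10.0))  # True:  |0.2| <= 1.96 * 0.357
print(within_reference(12.0, 10.0))  # False: |2.0| >  1.96 * 0.420
```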

  15. Matter and gravitons in the gravitational collapse

    Directory of Open Access Journals (Sweden)

    Roberto Casadio

    2016-12-01

Full Text Available We consider the effects of gravitons in the collapse of baryonic matter that forms a black hole. We first note that the effective number of (soft off-shell) gravitons that account for the (negative) Newtonian potential energy generated by the baryons is conserved and always in agreement with Bekenstein's area law of black holes. Moreover, their (positive) interaction energy reproduces the expected post-Newtonian correction and becomes of the order of the total ADM mass of the system when the size of the collapsing object approaches its gravitational radius. This result supports a scenario in which the gravitational collapse of regular baryonic matter produces a corpuscular black hole without central singularity, in which both gravitons and baryons are marginally bound and form a Bose–Einstein condensate at the critical point. The Hawking emission of baryons and gravitons is then described by the quantum depletion of the condensate and we show the two energy fluxes are comparable, albeit negligibly small on astrophysical scales.
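Schematically, the corpuscular bookkeeping described in the abstract can be summarized as follows (an order-of-magnitude sketch only; conventions differ between papers, and the precise coefficients are not given in the abstract):

```latex
% Order-of-magnitude corpuscular relations (units with c = \hbar = 1):
% a source of ADM mass M inside its gravitational radius R_H \sim G M
% is described by N soft gravitons of wavelength of order R_H.
N \,\simeq\, \frac{M^2}{m_p^2} \,\simeq\, \frac{R_H^2}{\ell_p^2}
\;\propto\; \text{horizon area (Bekenstein scaling)},
\qquad
\varepsilon \,\simeq\, \frac{1}{R_H} \,\simeq\, \frac{m_p^2}{M},
\qquad
N\,\varepsilon \,\simeq\, M .
```

The last relation expresses the marginal binding at the critical point: the total energy of the N gravitons is of the order of the ADM mass itself.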

  16. Matter and gravitons in the gravitational collapse

    Energy Technology Data Exchange (ETDEWEB)

    Casadio, Roberto, E-mail: casadio@bo.infn.it [Dipartimento di Fisica e Astronomia, Alma Mater Universià di Bologna, via Irnerio 46, 40126 Bologna (Italy); I.N.F.N., Sezione di Bologna, IS FLAG, viale B. Pichat 6/2, I-40127 Bologna (Italy); Giugno, Andrea, E-mail: A.Giugno@physik.uni-muenchen.de [Arnold Sommerfeld Center, Ludwig-Maximilians-Universität, Theresienstraße 37, 80333 München (Germany); Giusti, Andrea, E-mail: andrea.giusti@bo.infn.it [Dipartimento di Fisica e Astronomia, Alma Mater Universià di Bologna, via Irnerio 46, 40126 Bologna (Italy); I.N.F.N., Sezione di Bologna, IS FLAG, viale B. Pichat 6/2, I-40127 Bologna (Italy)

    2016-12-10

    We consider the effects of gravitons in the collapse of baryonic matter that forms a black hole. We first note that the effective number of (soft off-shell) gravitons that account for the (negative) Newtonian potential energy generated by the baryons is conserved and always in agreement with Bekenstein's area law of black holes. Moreover, their (positive) interaction energy reproduces the expected post-Newtonian correction and becomes of the order of the total ADM mass of the system when the size of the collapsing object approaches its gravitational radius. This result supports a scenario in which the gravitational collapse of regular baryonic matter produces a corpuscular black hole without central singularity, in which both gravitons and baryons are marginally bound and form a Bose–Einstein condensate at the critical point. The Hawking emission of baryons and gravitons is then described by the quantum depletion of the condensate and we show the two energy fluxes are comparable, albeit negligibly small on astrophysical scales.

  17. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant-status uncertainties. → DRHM can generate a 50-100 K PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate a greater safety margin as compared with classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, there are generally two kinds of uncertainties to be identified and quantified: model uncertainties and plant-status uncertainties. In particular, it takes huge effort to systematically quantify the individual model uncertainties of a best-estimate LOCA code, such as RELAP5 or TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant-status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant-status uncertainty on the PCT calculation. Generally, the DRHM methodology can generate about 80-100 K of margin on PCT as compared to Appendix K bounding-state LOCA analysis.
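The abstract invokes a CSAU-style statistical treatment of plant-status uncertainties; one standard ingredient of such treatments is Wilks' nonparametric tolerance-limit formula for sizing the number of code runs. A minimal sketch (the abstract does not state which run counts DRHM actually uses):

```python
# Wilks' first-order, one-sided tolerance-limit formula: the largest of n
# runs bounds the `coverage` percentile with probability `confidence` when
# 1 - coverage**n >= confidence, i.e. n >= ln(1 - confidence) / ln(coverage).
from math import ceil, log

def wilks_runs(coverage=0.95, confidence=0.95):
    """Minimum number of code runs for a one-sided, first-order limit."""
    return ceil(log(1.0 - confidence) / log(coverage))

print(wilks_runs())            # 59 runs for the classic 95/95 criterion
print(wilks_runs(0.95, 0.99))  # 90 runs for 95% coverage at 99% confidence
```

With 59 randomly sampled plant-status vectors, the highest computed PCT then serves as the 95/95 bounding estimate.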

  18. Baryonic matter and beyond

    OpenAIRE

    Fukushima, Kenji

    2014-01-01

    We summarize recent developments in identifying the ground state of dense baryonic matter and beyond. The topics include deconfinement from baryonic matter to quark matter, a diquark mixture, topological effect coupled with chirality and density, and inhomogeneous chiral condensates.

  19. Methodological Quality of Consensus Guidelines in Implant Dentistry.

    Science.gov (United States)

    Faggion, Clovis Mariano; Apaza, Karol; Ariza-Fritas, Tania; Málaga, Lilian; Giannakopoulos, Nikolaos Nikitas; Alarcón, Marco Antonio

    2017-01-01

Consensus guidelines are useful for improving clinical decision making, so the methodological evaluation of these guidelines is of paramount importance; low-quality information may lead to inadequate or harmful clinical decisions. The aim was to evaluate the methodological quality of consensus guidelines published in implant dentistry using a validated methodological instrument. The six implant dentistry journals with impact factors were scrutinised for consensus guidelines related to implant dentistry. Two assessors independently selected consensus guidelines, and four assessors independently evaluated their methodological quality using the Appraisal of Guidelines for Research & Evaluation (AGREE) II instrument. Disagreements in the selection and evaluation of guidelines were resolved by consensus. First, the consensus guidelines were analysed alone; then, the systematic reviews conducted to support the guidelines were included in the analysis. Non-parametric statistics for dependent variables (Wilcoxon signed rank test) were used to compare both groups. Of 258 initially retrieved articles, 27 consensus guidelines were selected. Median scores in four domains (applicability, rigour of development, stakeholder involvement, and editorial independence), expressed as percentages of maximum possible domain scores, were below 50% (median, 26%, 30.70%, 41.70%, and 41.70%, respectively). The consensus guidelines and consensus guidelines + systematic reviews data sets could be compared for 19 guidelines, and the results showed significant improvements in all domain scores (p < 0.05). Improvement in the methodological quality of consensus guidelines published in implant dentistry journals is needed. The findings of the present study may help researchers to better develop consensus guidelines in implant dentistry, which will improve the quality and trustworthiness of the information needed to make proper clinical decisions.
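The AGREE II domain scores quoted above are standardised as percentages of the maximum possible score; the instrument's scaling rule can be sketched as follows (the appraiser ratings below are invented for illustration):

```python
# AGREE II scaled domain score:
#   (obtained - minimum possible) / (maximum possible - minimum possible) * 100,
# where each item in the domain is rated 1-7 by each appraiser.
def agree_domain_score(ratings):
    """ratings: one list of 1-7 item scores per appraiser."""
    n_appraisers, n_items = len(ratings), len(ratings[0])
    obtained = sum(sum(r) for r in ratings)
    min_possible = 1 * n_items * n_appraisers
    max_possible = 7 * n_items * n_appraisers
    return 100.0 * (obtained - min_possible) / (max_possible - min_possible)

# four appraisers, three items in the domain (hypothetical ratings)
ratings = [[3, 4, 2], [4, 4, 3], [2, 3, 3], [4, 5, 3]]
print(round(agree_domain_score(ratings), 1))  # 38.9
```

A domain scoring 38.9% would fall below the 50% threshold that the study uses to flag weak domains.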

  20. Never forget a name: white matter connectivity predicts person memory

    Science.gov (United States)

    Metoki, Athanasia; Alm, Kylie H.; Wang, Yin; Ngo, Chi T.; Olson, Ingrid R.

    2018-01-01

Through learning and practice, we can acquire numerous skills, ranging from the simple (whistling) to the complex (memorizing operettas in a foreign language). It has been proposed that complex learning requires a network of brain regions that interact with one another via white matter pathways. One candidate white matter pathway, the uncinate fasciculus (UF), has exhibited mixed results for this hypothesis: some studies have shown UF involvement across a range of memory tasks, while other studies report null results. Here, we tested the hypothesis that the UF supports associative memory processes and that this tract can be parcellated into sub-tracts that support specific types of memory. Healthy young adults performed behavioral tasks (two face-name learning tasks, one word pair memory task) and underwent a diffusion-weighted imaging scan. Our results revealed that variation in UF microstructure was significantly associated with individual differences in performance on both face-name tasks, as well as the word association memory task. A UF sub-tract, functionally defined by its connectivity between face-selective regions in the anterior temporal lobe and orbitofrontal cortex, selectively predicted face-name learning. In contrast, connectivity between the fusiform face patch and both anterior face patches had no predictive validity. These findings suggest that there is a robust and replicable relationship between the UF and associative learning and memory. Moreover, this large white matter pathway can be subdivided to reveal discrete functional profiles. PMID:28646241
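The tract-behavior analysis described above is, at its core, an individual-differences correlation between a microstructural measure (e.g. mean fractional anisotropy along the UF) and task performance. A toy sketch with invented values:

```python
# Hypothetical individual-differences analysis: correlate a per-participant
# white matter measure with a behavioral memory score. All values invented.
def pearson_r(x, y):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

uf_fa = [0.42, 0.45, 0.39, 0.47, 0.44, 0.41]   # mean FA along the UF
memory = [0.55, 0.62, 0.48, 0.70, 0.60, 0.52]  # face-name accuracy
print(round(pearson_r(uf_fa, memory), 2))
```

In practice such a correlation would be computed across all participants with age and motion covariates partialled out, which is beyond this sketch.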