WorldWideScience

Sample records for methodologies methods techniques

  1. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  2. Methodology for using root locus technique for mobile robots path planning

    Directory of Open Access Journals (Sweden)

    Mario Ricardo Arbulú Saavedra

    2015-11-01

    Full Text Available This paper presents the analysis and implementation methodology of the dynamic-systems root-locus technique applied to obstacle-free path planning for mobile robots. First, the analysis and identification of the morphological behavior of paths depending on root locations in the complex plane are performed; path types and their attraction and repulsion features in the presence of other roots, similar to those obtained with artificial potential fields, are identified. An implementation methodology for this path-planning technique is proposed, starting from three different methods of locating roots for the obstacles in the scene. These methods differ in the obstacle key points selected as roots, such as borders, crossing points with the original path, center and vertices. Finally, a behavior analysis of the general technique and the effectiveness of each method is performed, with 20 tests for each one, obtaining a value of 65% for the selected method. Modifications and possible improvements to this methodology are also proposed.
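
    The attraction/repulsion behavior the abstract describes can be sketched with complex arithmetic, treating obstacles as repulsive poles and the goal as an attractive zero; this is only an illustrative analogy to the paper's root-locus formulation, with invented gains and an invented scene:

```python
def field_step(z, goal, obstacles, gain=0.05, rep=0.2):
    """One integration step of the attraction/repulsion field (illustrative gains)."""
    v = -(z - goal)                              # attraction toward the goal ("zero")
    for p in obstacles:
        d = z - p
        v += rep * d / (abs(d) ** 2 + 1e-9)      # repulsion away from each "pole"
    return z + gain * v

def plan(start, goal, obstacles, tol=0.1, max_steps=2000):
    """Follow the field until the goal is reached (or give up)."""
    z, path = start, [start]
    for _ in range(max_steps):
        if abs(z - goal) < tol:
            break
        z = field_step(z, goal, obstacles)
        path.append(z)
    return path

# goal at 10+10j, one obstacle placed off the direct line
path = plan(0 + 0j, 10 + 10j, obstacles=[5 + 4j])
```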

  3. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    Science.gov (United States)

    Bose, Anindya

    The dissertation attempts to demonstrate that the Program Evaluation and Review Technique (PERT)/Critical Path Method (CPM), or some modified version thereof, can be developed into an information system design methodology. The methodology utilizes PERT/CPM, which isolates the basic functional units of a system and sets them in a dynamic time/cost…
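
    The PERT/CPM core the dissertation builds on (isolating functional units and ordering them in time) can be sketched as a longest-path computation over an activity graph; the tasks and durations below are illustrative, not from the dissertation:

```python
def critical_path(durations, deps):
    """durations: {task: time}; deps: {task: [prerequisites]}.
    Returns (project length, one critical path as a task list)."""
    earliest = {}          # earliest finish time per task (memoized)
    via = {}               # predecessor realizing that earliest finish

    def finish(t):
        if t not in earliest:
            start, prev = 0, None
            for d in deps.get(t, []):
                f = finish(d)
                if f > start:
                    start, prev = f, d
            earliest[t], via[t] = start + durations[t], prev
        return earliest[t]

    end = max(durations, key=finish)     # task with the latest finish
    path, t = [], end
    while t is not None:                 # walk predecessors back to the start
        path.append(t)
        t = via[t]
    return earliest[end], path[::-1]

durations = {"spec": 3, "design": 5, "code": 7, "test": 4, "docs": 2}
deps = {"design": ["spec"], "code": ["design"], "docs": ["design"], "test": ["code"]}
length, path = critical_path(durations, deps)
# length == 19, path == ["spec", "design", "code", "test"]
```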

  4. METHODOLOGY OF TECHNIQUE PREPARATION FOR LOW VISION JAVELIN THROWERS

    Directory of Open Access Journals (Sweden)

    Milan Matić

    2013-07-01

    Full Text Available Javelin throwing for disabled athletes has been expanding for several years, and world records improve year after year. The essential part of preparing low-vision javelin throwers is mastering the technique elements crucial for achieving better results. Methods of theoretical analysis and descriptive and comparative survey methods were applied. Relevant knowledge in the area of low-vision javelin throwing was analyzed and systematized, then interpreted theoretically and applied to a top javelin thrower, which served as a basis for an innovative approach to methodology and practice with disabled athletes. Because of visual impairment, coordination and balance are challenged, and this limitation is precisely what distinguishes the methodology explained in this article. Apart from goals focused on improving condition and competition results, more specialized goals should be considered, e.g. improving orientation, balance and the socialization process for people who have low vision. The special approach used in technique preparation brought significant improvement in the technique of our famous Paralympian Grlica Miloš. In addition to the technique improvement, he achieved better results at major competitions and won several valuable international prizes. The area of sport for disabled people is not sufficiently present in the practice of sports professionals. More articles and scientific surveys on this topic are needed for further work and improvement of results with these athletes.

  5. Methodology for attainment of density and effective atomic number through dual energy technique using microtomographic images

    International Nuclear Information System (INIS)

    Alves, H.; Lima, I.; Lopes, R.T.

    2014-01-01

    The dual energy technique for computerized microtomography is a promising method for identifying mineralogy in geological samples of heterogeneous composition. It can also assist in differentiating objects with very similar attenuation coefficients, which are usually not separable during image processing and analysis of microtomographic data. Therefore, the development of a feasible and applicable methodology for dual energy analysis of microtomographic images was sought. - Highlights: • Dual energy technique is promising for identification of distribution of minerals. • A feasible methodology of dual energy in analysis of tomographic images was sought. • The dual energy technique is efficient for density and atomic number identification. • Simulation showed that the proposed methodology agrees with theoretical data. • Nondestructive characterization of distribution of density and chemical composition.
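
    The underlying inversion can be illustrated with a common two-term parameterization of attenuation (a Compton-like term plus a photoelectric term scaling with a power of Z). The coefficients, exponent and measured values below are invented for illustration and are not the authors' calibration:

```python
M = 3.8  # typical photoelectric Z-exponent from the dual-energy literature

def invert_dual_energy(mu_lo, mu_hi, a_lo, b_lo, a_hi, b_hi):
    """Solve mu_lo = rho*(a_lo + b_lo*Z**M) and mu_hi = rho*(a_hi + b_hi*Z**M)
    for density rho and effective atomic number Z."""
    r = mu_lo / mu_hi                            # ratio eliminates rho
    x = (r * a_hi - a_lo) / (b_lo - r * b_hi)    # x = Z**M
    z_eff = x ** (1.0 / M)
    rho = mu_lo / (a_lo + b_lo * x)
    return rho, z_eff

# round-trip check with invented coefficients: rho = 2.5, Z = 14
mu_lo = 2.5 * (0.20 + 0.010 * 14.0 ** M)
mu_hi = 2.5 * (0.18 + 0.002 * 14.0 ** M)
rho, z_eff = invert_dual_energy(mu_lo, mu_hi, 0.20, 0.010, 0.18, 0.002)
# recovers rho ~ 2.5 and z_eff ~ 14.0
```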

  6. TECHNIQUE AND METHODOLOGY OF TRAINING IN SWIMMING CRAWL

    Directory of Open Access Journals (Sweden)

    Selim Alili

    2013-07-01

    Full Text Available The paper presents the technique of crawl swimming and the methodology for teaching it. It covers the position of the head and body, the leg kick, arm movements, exercises for training the leg kick, training drills, and exercises for improving coordination of the technique on dry land and in water. A swimmer who masters this technique can swim fast, as the crawl is the fastest discipline. It is therefore a favorite way of swimming and a pleasure to watch on the big stage.

  7. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    Full Text Available The paper describes a methodology for developing new test methods and forming solutions for their development. The methodology is based on individual elements of the systems and process approaches, which contribute to an effective research strategy for the object, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts and their interrelations and mutual influence; this makes it possible to solve the assigned tasks and achieve the goal. The methodology is based on the use of fuzzy cognitive maps, and the choice of the method on which the solution-formation model is built is considered. The methodology provides for recording a model for a new test method as a finite set of objects, these objects being the characteristics significant for the test method. A causal relationship is then established between the objects, and the values of the fitness indicators, the observability of the method, and the metrological tolerance for each indicator are established. The work is aimed at the overall goal of ensuring the quality of tests by improving the methodology for developing test methods.
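
    A minimal sketch of the fuzzy cognitive map machinery the methodology relies on: concepts hold activation levels in [0, 1], signed weights encode causal influence, and the map is iterated to a steady state. The concepts and weights below are invented for illustration:

```python
import math

def fcm_run(weights, state, steps=50, lam=2.0):
    """Iterate a fuzzy cognitive map. weights[i][j] = influence of concept i on j."""
    n = len(state)
    for _ in range(steps):
        new = []
        for j in range(n):
            # self-memory term plus weighted causal inputs, squashed by a sigmoid
            total = state[j] + sum(weights[i][j] * state[i] for i in range(n))
            new.append(1.0 / (1.0 + math.exp(-lam * total)))
        state = new
    return state

# 3 invented concepts: test coverage -> defect detection -> product quality
W = [[0.0, 0.7, 0.0],
     [0.0, 0.0, 0.8],
     [0.0, 0.0, 0.0]]
final = fcm_run(W, [0.9, 0.1, 0.1])
```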

  8. Dosimetric methodology for extremities of individuals occupationally exposed to beta radiation using the optically stimulated luminescence technique

    International Nuclear Information System (INIS)

    Pinto, Teresa Cristina Nathan Outeiro

    2010-01-01

    A dosimetric methodology was established for the determination of extremity doses of individuals occupationally exposed to beta radiation, using Al2O3:C detectors and the optically stimulated luminescence (OSL) reader system microStar (Landauer). The main parts of the work were: characterization of the dosimetric material Al2O3:C using the OSL technique; establishment of the dose evaluation methodology; dose rate determination of beta radiation sources; application of the established method in a practical test with individuals occupationally exposed to beta radiation during a calibration simulation of clinical applicators; and validation of the methodology by comparing the dose results of the practical test using the OSL and thermoluminescence (TL) techniques. The results show that both the Al2O3:C detectors and the OSL technique may be utilized for individual monitoring of extremities exposed to beta radiation. (author)

  9. Quantitative assessments of distributed systems methodologies and techniques

    CERN Document Server

    Bruneo, Dario

    2015-01-01

    Distributed systems employed in critical infrastructures must fulfill dependability, timeliness, and performance specifications. Since these systems most often operate in an unpredictable environment, their design and maintenance require quantitative evaluation of deterministic and probabilistic timed models. This need gave birth to an abundant literature devoted to formal modeling languages combined with analytical and simulative solution techniques. The aim of the book is to provide an overview of techniques and methodologies dealing with such specific issues in the context of distributed systems.

  10. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Science.gov (United States)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting system-on-chip (SoC) designs. Such increases in complexity pose complicated validation challenges, and researchers have responded with a range of methodologies, notably dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce time spent and achieve fast time to market. This paper therefore focuses on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a way to shorten time to market. OVM is thus proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.

  11. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    Directory of Open Access Journals (Sweden)

    Diana Guzys

    2015-05-01

    Full Text Available In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.
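
    The Delphi mechanics described above (iterative survey rounds, collective opinion, flagged areas of disagreement) can be illustrated with a toy consensus check; the interquartile-range threshold and the ratings below are invented, not from the study:

```python
import statistics

def consensus_items(ratings_by_item, iqr_threshold=1.0):
    """ratings_by_item: {item: [panellist ratings]} for one survey round.
    Returns {item: (consensus reached?, median rating)}."""
    reached = {}
    for item, ratings in ratings_by_item.items():
        q = statistics.quantiles(ratings, n=4)   # q[0] = Q1, q[2] = Q3
        iqr = q[2] - q[0]                        # spread of panel opinion
        reached[item] = (iqr <= iqr_threshold, statistics.median(ratings))
    return reached

# one invented round: 6 panellists rating two priorities on a 1-5 scale
round_1 = {
    "health literacy screening": [4, 5, 4, 4, 5, 4],   # opinions cluster
    "community outreach":        [1, 5, 2, 4, 3, 5],   # opinions diverge
}
result = consensus_items(round_1)
```

    Items that fail the check would be fed back to the panel in the next round, which is the iterative structure the article describes.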

  12. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    Science.gov (United States)

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  13. Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method

    Science.gov (United States)

    Ramlo, Susan

    2015-01-01

    Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…

  14. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities. However, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002 and 2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002 and 2012 reporting on research which used work sampling to examine nursing workload. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. Author suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
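
    Two of the quantities discussed in the review, activity proportions from sampled observations and observer inter-rater reliability, can be sketched as follows; Cohen's kappa is used here as one common agreement statistic, and the observation codes are invented:

```python
from collections import Counter

def proportions(observations):
    """Share of sampled moments spent on each activity."""
    counts = Counter(observations)
    total = len(observations)
    return {activity: n / total for activity, n in counts.items()}

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two observers' activity codes."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    pa, pb = Counter(rater_a), Counter(rater_b)
    expected = sum(pa[c] / n * pb[c] / n for c in set(rater_a) | set(rater_b))
    return (observed - expected) / (1 - expected)

obs = ["direct care", "direct care", "documentation", "medication", "direct care"]
p = proportions(obs)   # 60% direct care, 20% documentation, 20% medication
```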

  15. Group techniques as a methodological strategy in acquiring teamwork abilities by college students

    Directory of Open Access Journals (Sweden)

    César Torres Martín

    2013-02-01

    Full Text Available Within the framework of the European Higher Education Area, an adaptation of the teaching-learning process is being promoted through pedagogical renewal, introducing into the classroom a greater number of active or participative methodologies in order to give students more autonomy in that process. This requires incorporating basic skills into the university curriculum, especially "teamwork". Through group techniques, students can acquire interpersonal and cognitive skills, as well as abilities that will enable them to face different group situations throughout their academic and professional careers. These techniques are necessary not only as a methodological strategy in the classroom, but also as a reflection instrument with which students can assess their behavior in groups, with the aim of modifying the conduct strategies through which their relationships with others influence their learning process. Hence the importance of this ability in sensitizing students positively toward collective work. Using the action-research method in the classroom during one semester, with systematic intervention through different group techniques, we present the results obtained by means of an analysis of the qualitative data, for which the selected instruments were group discussion and personal reflection.

  16. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Directory of Open Access Journals (Sweden)

    Muhammad Nurul Zhafirah

    2017-01-01

    Full Text Available Increased demand for internet of things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting system-on-chip (SoC) designs. Such increases in complexity pose complicated validation challenges, and researchers have responded with a range of methodologies, notably dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce time spent and achieve fast time to market. This paper therefore focuses on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as a way to shorten time to market. OVM is thus proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.

  17. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
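
    The first stage the abstract describes, background-tissue separation by clustering, can be illustrated with a toy one-dimensional k-means over pixel intensities; this is a sketch of the general idea, not the authors' pipeline, and the intensity values are invented:

```python
def kmeans_1d(values, k=2, iters=100):
    """Minimal k-means on scalar values (e.g. grayscale pixel intensities)."""
    centers = sorted(values)[:: max(1, len(values) // k)][:k]   # spread-out init
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:                 # assign each value to nearest center
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        new = [sum(c) / len(c) if c else centers[i] for i, c in enumerate(clusters)]
        if new == centers:               # converged
            break
        centers = new
    return centers

# bright background (~240) vs darker tissue (~90), invented values
pixels = [238, 242, 245, 250, 88, 92, 95, 85, 240, 90]
centers = sorted(kmeans_1d(pixels, k=2))
```

    A pixel would then be labeled "tissue" or "background" by its nearest center; the abstract's third stage applies the same idea again, within tissue regions, to detect fibrosis.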

  18. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Ly, Canh; Tran, Nghia; Kilic, Ozlem (Army Research Laboratory). Report on a methodology for designing and developing a new ultra-wideband antenna based on bio-inspired optimization techniques. Approved for public release.

  19. Applications of mixed-methods methodology in clinical pharmacy research.

    Science.gov (United States)

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to mixing elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature; the four most commonly used in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, and it is advisable to adhere to it to ensure methodological rigour. When to use: Mixed methods are best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another; developing a scale or questionnaire; or answering different research questions within a single study. Two case studies are presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  20. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  1. Methodology for qualitative content analysis with the mind map technique using Nvivo and FreeMind software

    Directory of Open Access Journals (Sweden)

    José Leonardo Oliveira Lima

    2016-12-01

    Full Text Available Introduction: In a survey it is not enough to choose tools, resources and procedures; it is important to understand the method beyond the techniques and their relationship with philosophy, epistemology and methodology. Objective: To discuss theoretical and methodological concerns about qualitative research in Information Science and the process of Qualitative Content Analysis (QCA) in the user-studies field, and to show a path for QCA integrated with the mind-map technique for developing categories and indicators, using Qualitative Data Analysis Software (QDAS) and mind-map design tools. Methodology: The research was descriptive, methodological, bibliographical and fieldwork-based, conducted with open interviews that were processed using the QCA method with the support of the QDAS Nvivo and the FreeMind software for mind-map design. Results: The theory of qualitative research and QCA is presented, together with a methodological path for QCA using the techniques and software mentioned above. Conclusions: When it comes to qualitative research, the theoretical framework suggests the need for more dialogue between Information Science and other disciplines. The QCA process evidenced a viable path that might help further related investigations using QDAS, and showed the contribution of mind maps and their design software to developing the indicators and categories of QCA.

  2. Carving Technique: Methodical Perspectives

    Directory of Open Access Journals (Sweden)

    Adela BADAU

    2015-09-01

    Full Text Available Alpine skiing has undergone major changes and adjustments due both to technological innovations in materials and to updates of theoretical and methodological concepts at all levels of specific training. Purpose: the introduction of technological innovation in materials specific to carving skis calls for a review of methodology, aiming at bringing execution technique to superior levels in order to obtain positive results. The study took place in Poiana Brasov between December 2014 and March 2015, on an 800 m long slope, and comprised a single experimental group of four males and four females in the cadet category, who carried out two lessons per day. The tests targeted technique level for slalom and giant slalom skiing against four criteria: leg work, pelvis movement, torso position and arm work. As a result of the research and of the statistical-mathematical analysis of the individual values, the giant slalom race registered an average improvement of 3.5 points between the tests, while the slalom race registered 4 points. In conclusion, the use of a scientifically applied methodology that selects the most efficient means of action specific to children's skiing leads to technical improvement at an advanced level.

  3. Methodology to evaluate the impact of erosion in cultivated soils applying the 137Cs technique

    International Nuclear Information System (INIS)

    Gil Castillo, R.; Peralta Vital, J.L.; Carrazana, J.; Riverol, M.; Penn, F.; Cabrera, E.

    2004-01-01

    The present paper shows the results obtained in the framework of two nuclear projects on the application of nuclear techniques to evaluate erosion rates in cultivated soils. On the basis of investigations with the 137Cs technique carried out in the province of Pinar del Rio, a methodology to evaluate the erosion impact on cropland was obtained and validated for the first time. The methodology includes all the relevant stages for adequate application of the 137Cs technique, from the initial step of area selection, through the soil sampling process and the selection of conversion models, to the final step of evaluating the results. During the validation of the methodology in soils of the municipality of San Juan y Martinez, the erosion rates estimated by the methodology were successfully compared with the values obtained by watershed segment measurements (the traditional technique). The methodology is a technical guide for the adequate application of the 137Cs technique to estimate soil redistribution rates in cultivated soils.
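
    One widely used conversion model for cultivated soils, the proportional model of Walling and He, shows how a 137Cs inventory deficit maps to an erosion rate; the input values below are illustrative only, and the conversion model actually selected in the paper may differ:

```python
def proportional_model(inv_ref, inv_sample, plough_depth_m,
                       bulk_density_kg_m3, years_since_1963):
    """Proportional model: erosion rate in t ha^-1 yr^-1.
    inv_ref / inv_sample are 137Cs inventories (Bq m^-2) at a reference
    (undisturbed) site and at the sampled point."""
    x_percent = 100.0 * (inv_ref - inv_sample) / inv_ref   # inventory loss, %
    return (10.0 * plough_depth_m * bulk_density_kg_m3 * x_percent
            / (100.0 * years_since_1963))

# e.g. reference inventory 2500 Bq m^-2, sample 1800 Bq m^-2,
# 0.2 m plough layer, bulk density 1300 kg m^-3, 41 years since 1963
rate = proportional_model(2500.0, 1800.0, 0.2, 1300.0, 41.0)
```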

  4. A dynamic systems engineering methodology research study. Phase 2: Evaluating methodologies, tools, and techniques for applicability to NASA's systems projects

    Science.gov (United States)

    Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.

    1989-01-01

    A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.

  5. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review.

    Science.gov (United States)

    Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth

    2017-11-28

    The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive) or (2) examined the utility of a supplementary search method (analytical) or (3) identified/explored factors that impact on the utility of a supplementary method, when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods.

  6. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and applying complementary methods allows fuller characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  7. Prosopography of social and political groups historically located: method or research technique?

    Directory of Open Access Journals (Sweden)

    Lorena Madruga Monteiro

    2014-06-01

Full Text Available The scientific status of the prosopographical approach has been questioned across disciplinary domains. Whether prosopography is a technique, a research tool, an auxiliary science or a method is debated both in scientific argument and among those dedicated to explaining the assumptions of prosopographical research. In the social sciences, for example, prosopography is seen not merely as a research instrument but as a method associated with a theoretical construct for apprehending the social world. Historians who use prosopographical analysis, in turn, oscillate over whether the analysis of collective biography is a method or a research technique. Against this background, this article discusses the prosopographical approach through its different uses. The study presents a literature review that examines prosopography first as a technique of historical research and then as a method of sociological analysis, before highlighting its procedures and methodological limits.

  8. Systematization of types and methods of radiation therapy methods and techniques of irradiation of patients

    International Nuclear Information System (INIS)

    Vajnberg, M.Sh.

    1991-01-01

The paper is concerned with principles for systematizing and classifying the types and methods of radiation therapy, and with approaches to regulating its terminology. These are based on distinguishing the concepts of radiation therapy and irradiation of patients. The author gives a concise historical review of how the methodology of radiation therapy improved as its methods and facilities developed. Problems of terminology are discussed. A table of types and methods of radiation therapy and of methods and techniques of irradiation is given. The appendices contain a table of typical legends and examples of the graphic signs used to denote methods of irradiation. The potential for practical use of the system is described

  9. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

This research evaluated the currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; it then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With the proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results may change. However, insight into the software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)
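The index-and-weight evaluation this abstract describes lends itself to a simple weighted-scoring sketch. Only the technique and index names below come from the abstract; every score and weight is a hypothetical placeholder, not the authors' published data:

```python
# Hypothetical weighted-scoring sketch of the index-based evaluation.
# Benefit-type indexes get positive weights; complexity and implementation
# cost are penalized with negative weights. All numbers are illustrative.

WEIGHTS = {
    "dynamic_capability": 1.0,
    "completeness": 1.0,
    "achievability": 1.0,
    "detail": 1.0,
    "signal_noise": 1.0,
    "complexity": -1.0,           # cost-type index
    "implementation_cost": -1.0,  # cost-type index
}

# Assumed 1-5 scores for two candidate combinations (not the paper's data):
SCORES = {
    "PHA+FMEA+FTA+Markov": {
        "dynamic_capability": 2, "completeness": 4, "achievability": 2,
        "detail": 3, "signal_noise": 2, "complexity": 3,
        "implementation_cost": 4,
    },
    "DFM": {
        "dynamic_capability": 5, "completeness": 3, "achievability": 4,
        "detail": 4, "signal_noise": 4, "complexity": 4,
        "implementation_cost": 4,
    },
}

def weighted_score(scores):
    """Weighted sum of index scores for one technique combination."""
    return sum(WEIGHTS[k] * v for k, v in scores.items())

# Rank combinations by weighted score, best first.
ranking = sorted(SCORES, key=lambda t: weighted_score(SCORES[t]), reverse=True)
print(ranking)
```

In practice the weights would be negotiated by the stakeholders the abstract mentions; changing them changes the ranking, which is exactly the "platform for consensus" role described.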

  10. Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis.

    Science.gov (United States)

    Liao, Hongjing; Hitchcock, John

    2018-06-01

This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary (i.e., basic) design techniques, such as sampling/participant recruitment strategies, data collection methods, and analytic details; and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques, although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and that authors should develop strategies yielding fuller methodological description. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Theoretical and methodological basis of the comparative historical and legal method development

    Directory of Open Access Journals (Sweden)

    Д. А. Шигаль

    2015-05-01

Full Text Available Problem setting. The development of any scientific method always raises questions both about its structural and functional characteristics and its place in the system of scientific methods, and about the practicability of such methodological work. This paper attempts to give a detailed response to the major comments and objections that arise concerning the separation of the comparative historical and legal method as an independent means of special scientific knowledge. Recent research and publications analysis. Analyzing research and publications within the theme of this article, it should be noted that attention to methodological issues of both general and legal science has been paid by such prominent foreign and domestic scholars as I. D. Andreev, Yu. Ya. Baskin, O. L. Bygych, M. A. Damirli, V. V. Ivanov, I. D. Koval'chenko, V. F. Kolomyitsev, D. V. Lukyanov, L. A. Luts, J. Maida, B. G. Mogilnytsky, N. M. Onishchenko, N. M. Parkhomenko, O. V. Petryshyn, S. P. Pogrebnyak, V. I. Synaisky, V. M. Syryh, O. F. Skakun, A. O. Tille, D. I. Feldman and others. It should be noted that, despite the large number of scientific papers in this field, the interest of the research community in the methodology of the history of state and law remains unfairly low. Paper objective. The purpose of this paper is to provide the theoretical and methodological rationale for the need to separate and develop the comparative historical and legal method, in the form of answers to the most common questions and objections that arise in the scientific community in this regard. Paper main body. The development of comparative historical and legal means of knowledge is quite justified because it meets the requirements of scientific-method efficiency, whose criteria are the speed of achieving the goal, the ease of use of a given way of scientific knowledge, the universality of research methods, the convenience of the techniques used, and so on. Combining the

  12. Methodology or method? A critical review of qualitative case study reports

    Directory of Open Access Journals (Sweden)

    Nerida Hyett

    2014-05-01

Full Text Available Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  13. Methodology or method? A critical review of qualitative case study reports

    Science.gov (United States)

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  14. Covariance methodology applied to 35S disintegration rate measurements by the CIEMAT/NIST method

    International Nuclear Information System (INIS)

    Koskinas, M.F.; Nascimento, T.S.; Yamazaki, I.M.; Dias, M.S.

    2014-01-01

The Nuclear Metrology Laboratory (LMN) at IPEN is carrying out measurements in an LSC (Liquid Scintillation Counting) system, applying the CIEMAT/NIST method. In this context 35S is an important radionuclide for medical applications, and it is difficult to standardize by other primary methods due to its low beta-ray energy. The CIEMAT/NIST method is a standard technique used by most metrology laboratories to improve accuracy and speed up beta emitter standardization. The focus of the present work was to apply the covariance methodology to determining the overall uncertainty in the 35S disintegration rate. All partial uncertainties involved in the measurements were considered, taking into account all possible correlations between each pair of them. - Highlights: ► 35S disintegration rate measured in a Liquid Scintillation Counting system using the CIEMAT/NIST method. ► Covariance methodology applied to the overall uncertainty in the 35S disintegration rate. ► Monte Carlo simulation was applied to determine 35S activity in the 4πβ(PC)-γ coincidence system
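The covariance methodology the abstract refers to propagates correlated input uncertainties through the measurement model. A minimal sketch, under the assumption of a simple model n = c / eff (net counting rate over detection efficiency) and entirely hypothetical numbers rather than the paper's 35S data:

```python
import numpy as np

# Illustrative covariance propagation: the combined variance of n = c / eff
# is u_n^2 = J @ Sigma @ J, where J is the Jacobian of n with respect to
# (c, eff) and Sigma is the input covariance matrix, including the
# off-diagonal correlation term. All numbers are assumptions.

c, eff = 1000.0, 0.80              # net rate (1/s) and efficiency (assumed)
u_c, u_eff, r = 5.0, 0.008, 0.3    # standard uncertainties and correlation

n = c / eff                        # disintegration rate
J = np.array([1.0 / eff, -c / eff**2])          # Jacobian d(n)/d(c, eff)
Sigma = np.array([[u_c**2,          r * u_c * u_eff],
                  [r * u_c * u_eff, u_eff**2]])  # input covariance matrix
u_n = float(np.sqrt(J @ Sigma @ J))              # combined standard uncertainty

print(round(n, 1), round(u_n, 2))
```

Setting r = 0 recovers the usual uncorrelated quadrature sum; the point of the covariance approach is that the off-diagonal term can either inflate or cancel part of the combined uncertainty.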

  15. Robust Optimization in Simulation : Taguchi and Response Surface Methodology

    NARCIS (Netherlands)

    Dellino, G.; Kleijnen, J.P.C.; Meloni, C.

    2008-01-01

    Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a 'robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by
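Taguchi's view, which the abstract builds on, scores each candidate design by a signal-to-noise (S/N) ratio computed over noise scenarios. A minimal sketch with invented response values, showing only the "larger-the-better" variant:

```python
import math

# Taguchi "larger-the-better" S/N ratio: each design is simulated under
# several noise conditions, and the design with the highest S/N is the
# most robust. All response values below are illustrative assumptions.

def sn_larger_the_better(ys):
    """Taguchi S/N = -10*log10(mean(1/y^2)); higher means more robust."""
    return -10.0 * math.log10(sum(1.0 / y**2 for y in ys) / len(ys))

# Simulated outputs of two candidate designs under three noise scenarios:
designs = {
    "A": [10.0, 10.2, 9.9],   # steady performance
    "B": [12.0, 7.0, 11.5],   # higher peaks but noisier
}
best = max(designs, key=lambda d: sn_larger_the_better(designs[d]))
print(best)
```

Design B has the single best response but design A wins on S/N, which is the robustness trade-off the Taguchi view formalizes.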

  16. Towards the methodological optimization of the moss bag technique in terms of contaminants concentrations and replicability values

    Science.gov (United States)

    Ares, A.; Fernández, J. A.; Carballeira, A.; Aboal, J. R.

    2014-09-01

The moss bag technique is a simple and economical environmental monitoring tool used to monitor air quality. However, routine use of the method is not yet possible because the protocols involved have not been standardized. Some of the most variable methodological aspects include (i) selection of moss species, (ii) ratio of moss weight to surface area of the bag, (iii) duration of exposure, and (iv) height of exposure. In the present study, the best option for each of these aspects was selected on the basis of the mean concentrations and data replicability of Cd, Cu, Hg, Pb and Zn measured during at least two exposure periods in environments affected by different degrees of contamination. The optimal choices for the studied aspects were the following: (i) Sphagnum denticulatum, (ii) 5.68 mg of moss tissue per cm2 of bag surface, (iii) 8 weeks of exposure, and (iv) 4 m height of exposure. Duration of exposure and height of exposure accounted for most of the variability in the data. The aim of this methodological study was to provide data to help establish a standardized protocol that will enable use of the moss bag technique by public authorities.
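The replicability criterion used to choose among protocol options can be sketched as a comparison of coefficients of variation across replicate bags. The option labels and concentration values below are invented for illustration, not the study's measurements:

```python
import statistics

# Toy replicability comparison: compute the coefficient of variation (CV)
# across replicate moss bags for each protocol option and prefer the
# option with the lowest CV. All values are hypothetical.

def cv_percent(values):
    """Coefficient of variation (%) of replicate measurements."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate Cd concentrations (ng/g) for two exposure times:
replicates = {
    "4 weeks": [52.0, 61.0, 45.0],
    "8 weeks": [98.0, 102.0, 95.0],
}
best = min(replicates, key=lambda k: cv_percent(replicates[k]))
print(best)
```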

  17. Adaptability of laser diffraction measurement technique in soil physics methodology

    Science.gov (United States)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

There are efforts around the world to harmonize soil particle size distribution (PSD) data obtained by laser diffractometer measurements (LDM) with those of sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g., type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (which follow different standards) are dissimilar and can hardly be harmonized with each other either. A need therefore arose to build a database containing PSD values measured by the pipette method according to the Hungarian standard (MSZ-08. 0205: 1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of a statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits to the LDM data, the clay fraction was highly underestimated and the silt fraction overestimated compared to the pipette method. Consequently, soil texture classes determined from the LDM measurements differ significantly from the results of the pipette method. Following previous surveys, and relating the two datasets to each other for optimization, the clay/silt boundary for LDM was changed. Comparing the PSD results of the pipette method with those of the LDM, the modified size limits gave higher similarities for the clay and silt fractions. Extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm, and thus changing the lower size limit of the silt fraction, makes the pipette method and LDM more easily comparable. Higher correlations were also found between clay content and water vapor adsorption and specific surface area with the modified limit. Texture classes were also found to be less dissimilar. The difference between the results of the two kinds of PSD measurement methods could be further reduced by taking into account other routinely analyzed soil parameters.
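The effect of moving the clay/silt boundary can be sketched as reading the clay fraction off a cumulative PSD curve at either limit. The curve points below are invented, not the study's data; the two boundary values (0.002 and 0.0066 mm) are the ones discussed above:

```python
import numpy as np

# Read the clay fraction from a cumulative PSD curve (% finer vs. particle
# diameter) at the conventional 0.002 mm or the modified 0.0066 mm
# clay/silt boundary, interpolating on log-diameter. Curve is hypothetical.

diam_mm = np.array([0.0005, 0.002, 0.0066, 0.02, 0.063, 0.2, 2.0])
pct_finer = np.array([8.0, 15.0, 28.0, 45.0, 70.0, 90.0, 100.0])

def clay_fraction(limit_mm):
    """Percent of particles finer than the given clay/silt boundary."""
    return float(np.interp(np.log10(limit_mm), np.log10(diam_mm), pct_finer))

conventional = clay_fraction(0.002)   # conventional boundary
modified = clay_fraction(0.0066)      # modified boundary from the study
print(conventional, modified)
```

With any monotonically increasing cumulative curve, raising the boundary necessarily raises the reported clay fraction, which is the direction of correction the study describes for LDM data.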

  18. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    Science.gov (United States)

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  19. Issues in Learning About and Teaching Qualitative Research Methods and Methodology in the Social Sciences

    Directory of Open Access Journals (Sweden)

    Franz Breuer

    2007-01-01

Full Text Available For many qualitative researchers in the social sciences, learning about and teaching qualitative research methods and methodology raises a number of questions. This topic was the focus of a symposium held during the Second Berlin Summer School for Qualitative Research Methods in July 2006. In this contribution, some of the issues discussed during the symposium are taken up and extended, and some basic dimensions underlying these issues are summarized. How qualitative research methods and methodology are taught is closely linked to the ways in which qualitative researchers in the social sciences conceptualize themselves and their discipline. In the following, we distinguish between a paradigmatic and a pragmatic view. From a pragmatic point of view, qualitative research methods are considered research strategies or techniques and can be taught in the sense of recipes with specific steps to be carried out. According to a paradigmatic point of view (strongly inspired by constructivism), qualitative research methods and methodology are conceptualized as a craft to be practiced together by a "master" and an "apprentice." Moreover, the teaching of qualitative research methods also depends heavily on the institutional standing of qualitative compared to quantitative research methods. Based on these considerations, five basic dimensions of learning about and teaching qualitative research methods are suggested: ways of teaching (ranging from the presentation of textbook knowledge to cognitive apprenticeship) and instructors' experience with these; institutional contexts, including their development and the teaching of qualitative research methods in other than university contexts; the "fit" between personality and method, including relevant personal skills and talents; and, as a special type of instructional context that has increasingly gained importance, distance learning and its implications for learning about and teaching qualitative research methods

  20. Distance learning methodology and technique in scientific and vocational communication (on the example of the master’s distance course in linguistics)

    Directory of Open Access Journals (Sweden)

    S. S. Khromov

    2016-01-01

Full Text Available The article is devoted to elaborating the methodology and technique of a master's distance course in linguistics for Russian students. The research novelty lies in the fact that the course presents the results of the methodological and scientific work of teachers and students. Within the course framework we plan to transfer the communicative-activity concept to distance forms of education, modeling a new type of educational product. The purposes of the research are: (1) to develop the distance learning methodology and technique for a linguistic master's course; (2) to elaborate the internal structure of the project; (3) to demonstrate which vocational, language and speech competencies are to appear as the result of the project; (4) to describe the algorithm for delivering the full-time lecture course in linguistics in a distance format; (5) to conduct a pedagogical experiment realizing distance learning in the master's linguistic course; (6) to prove the innovation and productivity of the elaborated master's course in linguistics. The research is based on: (1) the paper variant of the full-time lecture course; (2) the curriculum of the lecture course; (3) the concept of the master's course in linguistics; (4) the concept of the distance course in linguistics; (5) students' interviews; (6) virtual tools. The research methods are: (1) descriptive; (2) project; (3) comparative; (4) statistic methods. Conclusion. The novelty and productivity of the course have been proved, and they are manifested in: (1) the ability to develop the vocational, language and speech competences of the students; (2) the development of individual trajectories for the students; (3) the expansion of the sociocultural potential of the students; (4) the development of the sociocultural potential of the students; (5) the intensification of the education process. As a result of the experiment we can state that: (1) the methodology and technique of distance tools in projecting the master's course in linguistics are described; (2) the

  1. Methods and techniques for prediction of environmental impact

    International Nuclear Information System (INIS)

    1992-04-01

Environmental impact assessment (EIA) is the procedure that helps decision makers understand the environmental implications of their decisions. The prediction of environmental effects or impacts is an extremely important part of the EIA procedure, and improvements in existing capabilities are needed. Considerable attention is paid within environmental impact assessment and in handbooks on EIA to methods for identifying and evaluating environmental impacts. However, little attention is given to the distribution of information on impact prediction methods. Quantitative and qualitative methods for the prediction of environmental impacts appear to be the two basic approaches for incorporating environmental concerns into the decision-making process. Depending on the nature of the proposed activity and the environment likely to be affected, a combination of both quantitative and qualitative methods is used. Within environmental impact assessment, the accuracy of methods for the prediction of environmental impacts is of major importance, since it provides for sound and well-balanced decision making. Pertinent and effective action to deal with the problems of environmental protection, the rational use of natural resources and sustainable development is only possible given objective methods and techniques for the prediction of environmental impact. Therefore, the Senior Advisers to ECE Governments on Environmental and Water Problems decided to set up a task force, with the USSR as lead country, on methods and techniques for the prediction of environmental impacts, in order to undertake a study to review and analyse existing methodological approaches and to elaborate recommendations to ECE Governments. The work of the task force was completed in 1990 and the resulting report, with all relevant background material, was approved by the Senior Advisers to ECE Governments on Environmental and Water Problems in 1991. The present report reflects the situation, state of

  2. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed methods reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
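The chi-square comparisons of proportions reported above can be reproduced in miniature with a hand-rolled 2x2 test. The counts below are invented for illustration, not the study's data:

```python
# Pearson chi-square test for a 2x2 contingency table, as used to compare
# the proportion of methodological components across article types.
# All counts are hypothetical.

def chi2_2x2(a, b, c, d):
    """Chi-square statistic for the 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical: a component present in 22/100 reports of one article type
# versus 47/100 of another.
stat = chi2_2x2(22, 78, 47, 53)
critical = 3.841  # chi-square critical value for df = 1, alpha = 0.05
print(round(stat, 2), stat > critical)
```

Here the statistic exceeds the df = 1 critical value, so the difference in proportions would be declared statistically significant at the 5% level.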

  3. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems for nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination becomes more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination on the basis of available resources. This research evaluated the currently applicable software safety analysis techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; it then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plans. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost

  4. Development of a high-order finite volume method with multiblock partition techniques

    Directory of Open Access Journals (Sweden)

    E. M. Lemos

    2012-03-01

Full Text Available This work deals with a new numerical methodology to solve the Navier-Stokes equations based on a finite volume method applied to structured meshes with co-located grids. High-order schemes used to approximate advective, diffusive and non-linear terms, combined with multiblock partition techniques, are the main contributions of this paper. The combination of these two techniques resulted in a computer code that achieves high accuracy due to the high-order schemes and great flexibility to generate locally refined meshes based on the multiblock approach. This computer code has been able to obtain results with accuracy higher than or equal to that of results obtained using classical procedures, with considerably less computational effort.
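As a toy illustration of the conservative finite-volume update underlying this kind of method (deliberately not the paper's high-order, multiblock scheme), a first-order upwind solver for 1D linear advection on a periodic structured grid can be written as:

```python
import numpy as np

# First-order upwind finite-volume solver for 1D linear advection
# q_t + u q_x = 0 on a periodic, structured, co-located grid. The update
# q_i^{n+1} = q_i - dt/dx * (F_{i+1/2} - F_{i-1/2}) conserves the total
# integral of q exactly. All parameters are illustrative.

nx, u, cfl = 50, 1.0, 0.5
dx = 1.0 / nx
dt = cfl * dx / u
x = (np.arange(nx) + 0.5) * dx            # cell centers
q = np.exp(-100.0 * (x - 0.5) ** 2)       # initial cell averages
total0 = float(q.sum()) * dx              # total conserved quantity

for _ in range(100):
    # Upwind face flux for u > 0: F_{i-1/2} = u * q_{i-1} (periodic wrap).
    flux_left = u * np.roll(q, 1)
    flux_right = u * q
    q = q - dt / dx * (flux_right - flux_left)

print(float(q.sum()) * dx - total0)       # conservation check
```

The high-order schemes in the paper replace the upwind face flux with higher-order reconstructions, and the multiblock partitioning splits the grid into locally refined blocks, but the conservative flux-difference update shown here is the common core.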

  5. GENESIS OF METHODOLOGY OF MANAGEMENT BY DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    Full Text Available This article investigates the genesis of the methodology of managing organizational development, understood as the set of methodological approaches and methods in use. The results of a comparative analysis of methodological approaches to the management of organizational development are presented. The traditional methodological approaches are complemented by strategic experiment and case-study methodology. Approaches to the formation of a new methodology and technique for investigating the sources of an organization's competitive advantages are considered.

  6. Methodological aspects and development of techniques for neutron activation analysis of microcomponents in materials of geologic origin

    International Nuclear Information System (INIS)

    Cohen, I.M.

    1982-01-01

    Some aspects of the activation analysis methodology applied to geological samples activated in nuclear reactors were studied, and techniques were developed for the determination of various elements in different types of matrices, using gamma spectrometry for the measurement of the products. The consideration of the methodological aspects includes the study of the working conditions, the preparation of samples and standards, irradiations, treatment of the irradiated material, radiochemical separation and measurement. Experiments were carried out on reproducibility and errors in relation to the behaviour of the measurement equipment and of the methods of peak-area calculation (total area, Covell and Wasson), as well as on the effects of geometry variations on the results of the measurements, the RA-3 reactor's flux variations, and the homogeneity of the samples and standards. Also studied were: the selection of the conditions of determination, including the irradiation and decay times; the irradiation with thermal and epithermal neutrons; the measurement with the use of absorbers; and the resolution of complex peaks. Both non-destructive and radiochemical separation techniques were developed for the analysis of five types of geological materials. These methods were applied to the following determinations: a) In, Cd, Mn, Ga and Co in blende; b) La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu in fluorites; c) La, Ca, Eu, Tb, Yb, Se and Th in barites and celestites; d) Cu and Zn in soils. The spectral interferences and those due to nuclear reactions were studied and evaluated by mathematical calculation. (M.E.L.) [es
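    The peak-area estimators compared in this record (total area and Covell) can be sketched on a synthetic spectrum. All channel numbers, baseline and peak parameters below are invented for illustration; the Covell estimate subtracts the trapezoid defined by the two boundary channels of a window centered on the peak.

    ```python
    import numpy as np

    def total_area(counts, lo, hi):
        """Sum of counts in [lo, hi] minus a straight-line background drawn
        between the averages of a few channels on either side of the peak."""
        bg_left = counts[lo - 3:lo].mean()
        bg_right = counts[hi + 1:hi + 4].mean()
        n = hi - lo + 1
        return counts[lo:hi + 1].sum() - 0.5 * (bg_left + bg_right) * n

    def covell_area(counts, centroid, m):
        """Covell estimate: counts within +/- m channels of the centroid minus
        the trapezoid defined by the two boundary channels of the window."""
        region = counts[centroid - m:centroid + m + 1]
        return region.sum() - 0.5 * (region[0] + region[-1]) * len(region)

    # synthetic spectrum: linear baseline + Gaussian peak of known area 5000
    chan = np.arange(200)
    true_area = 5000.0
    peak = true_area / (np.sqrt(2 * np.pi) * 3.0) * np.exp(-((chan - 100.0) ** 2) / (2 * 3.0 ** 2))
    spectrum = 50.0 + 0.1 * chan + peak
    ```

    On this noiseless example both estimators come close to the true area of 5000; the Covell window deliberately includes some peak tail in its boundary channels, which is the source of its well-known small negative bias.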

  7. [A comparative study of blood culture conventional method vs. a modified lysis/centrifugation technique for the diagnosis of fungemias].

    Science.gov (United States)

    Santiago, Axel Rodolfo; Hernández, Betsy; Rodríguez, Marina; Romero, Hilda

    2004-12-01

    The purpose of this work was to compare the efficacy of the conventional blood culture method vs. a modified lysis/centrifugation technique. Out of 450 blood specimens received in one year, 100 were chosen for this comparative study: 60 from patients with AIDS, 15 from leukemic patients, ten from febrile neutropenic patients, five from patients with respiratory infections, five from diabetics and five from septicemic patients. The specimens were processed simultaneously according to both methodologies, with daily inspections for fungal growth in order to obtain the final identification of the causative agent. The number of isolates recovered (40) was the same with both methods, and included: 18 Candida albicans (45%), ten Candida spp. (25%), ten Histoplasma capsulatum (25%), and two Cryptococcus neoformans (5%). When fungal growth times were compared, growth was more rapid with the modified lysis/centrifugation technique than with the conventional method, and statistical analysis revealed a significant difference (p < 0.05). The modified lysis/centrifugation technique thus proved more efficacious than the conventional one, and the implementation of this methodology is therefore highly recommended for the isolation of fungi from blood.

  8. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae) for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka

    OpenAIRE

    Nayana Gunathilaka; Tharaka Ranathunge; Lahiru Udayanga; Wimaladharma Abeyewickreme

    2017-01-01

    Introduction Selection of the artificial membrane feeding technique and blood meal source has been recognized as a key consideration in mass rearing of vectors. Methodology Artificial membrane feeding techniques, namely, glass plate, metal plate, and Hemotek membrane feeding method, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. Significance of the variations among blood feeding methods was investigated by one-way ANOVA, cluster analysis of variance (ANOSIM), and principal coordinates (PCO) analysis.

  9. A methodology for semiautomatic taxonomy of concepts extraction from nuclear scientific documents using text mining techniques

    International Nuclear Information System (INIS)

    Braga, Fabiane dos Reis

    2013-01-01

    This thesis presents a text mining method for the semi-automatic extraction of a taxonomy of concepts from a textual corpus composed of scientific papers related to the nuclear area. Text classification is a natural human practice and a crucial task for working with large repositories. Document clustering techniques provide a logical and understandable framework that facilitates organization, browsing and searching. Most clustering algorithms use the bag-of-words model to represent the content of a document. This model generates high-dimensional data, ignores the fact that different words can have the same meaning, and does not consider the relationships between words, assuming that they are independent of one another. The methodology presented combines a concept-based model of document representation with a hierarchical document clustering method that uses the frequency of co-occurring concepts, and a technique for labeling clusters with their most representative concepts, with the objective of producing a taxonomy of concepts that reflects the structure of the knowledge domain. It is hoped that this work will contribute to the conceptual mapping of the scientific production of the nuclear area and thus support the management of research activities in this area. (author)
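    The clustering pipeline described in this record can be sketched with a toy corpus. The four "documents", the vocabulary, and the two-cluster cut below are invented, and SciPy's off-the-shelf agglomerative clustering stands in for the thesis's own concept-based method.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    # Tiny "documents" represented as bag-of-words count vectors, then clustered
    # hierarchically by cosine similarity of their term vectors.
    docs = [
        "neutron flux reactor core",
        "reactor core neutron moderator",
        "text mining document clustering",
        "document clustering taxonomy concepts",
    ]
    vocab = sorted({w for d in docs for w in d.split()})
    X = np.array([[d.split().count(w) for w in vocab] for d in docs], dtype=float)

    Z = linkage(X, method="average", metric="cosine")   # agglomerative tree
    labels = fcluster(Z, t=2, criterion="maxclust")     # cut into 2 clusters
    ```

    On this toy corpus the two reactor-themed documents and the two text-mining documents end up in separate clusters; the thesis replaces raw words with extracted concepts precisely to overcome the synonymy and independence limitations of this bag-of-words representation.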

  10. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. All empirical articles (n = 1,651) published between 2003 and 2007 in four top-ranked health services journals were considered. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed methods reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ²(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively), but no statistically significant difference (χ²(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies.
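    The chi-square test of independence used in this study can be sketched by hand. The 2x2 counts below (components present/absent in two article types) are hypothetical, not the study's data.

    ```python
    import numpy as np

    # Chi-square statistic for a 2x2 contingency table: compare observed counts
    # with the counts expected under independence of rows and columns.
    def chi_square_2x2(table):
        table = np.asarray(table, dtype=float)
        row = table.sum(axis=1, keepdims=True)
        col = table.sum(axis=0, keepdims=True)
        expected = row @ col / table.sum()
        return ((table - expected) ** 2 / expected).sum()   # df = 1 for 2x2

    # hypothetical counts: [present, absent] for two article types
    stat = chi_square_2x2([[40, 60], [70, 30]])
    significant = stat > 3.841   # 5% critical value of chi-square with 1 df
    ```

    With these invented counts the statistic is about 18.2, well above the critical value, mirroring the kind of between-type comparison reported above.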

  11. Evaluation and assessment of nuclear power plant seismic methodology

    International Nuclear Information System (INIS)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-01-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology

  12. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.
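    The kind of Monte Carlo scheme mentioned in the two records above can be sketched as a simple demand-versus-capacity simulation: sample an uncertain seismic demand and an uncertain component capacity, and count how often demand exceeds capacity. The lognormal parameters are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    # Monte Carlo estimate of a failure probability: P(demand > capacity).
    rng = np.random.default_rng(0)
    n = 200_000
    demand = rng.lognormal(mean=0.0, sigma=0.5, size=n)     # e.g. peak response
    capacity = rng.lognormal(mean=1.0, sigma=0.3, size=n)   # component capacity
    p_fail = np.mean(demand > capacity)
    ```

    For these lognormal parameters the analytic answer is Phi(-1/sqrt(0.25 + 0.09)), roughly 4 percent, which the sample estimate approaches as n grows; system-analysis techniques would then combine such component probabilities through the plant's fault logic.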

  13. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO Methodology, how it can be used, and benefits of using it in the analysis of complex systems

  14. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  15. Methodologies of Knowledge Discovery from Data and Data Mining Methods in Mechanical Engineering

    Directory of Open Access Journals (Sweden)

    Rogalewicz Michał

    2016-12-01

    Full Text Available The paper contains a review of methodologies for the process of knowledge discovery from data and of the methods of data exploration (Data Mining) most frequently used in mechanical engineering. The methodologies describe various scenarios of data exploration, within which DM methods are applied. The paper shows the premises for the use of DM methods in industry, as well as their advantages and disadvantages. The development of methodologies for knowledge discovery from data is also presented, along with a classification of the most widespread Data Mining methods, divided by the type of task they perform. The paper concludes with a presentation of selected Data Mining applications in mechanical engineering.

  16. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    Science.gov (United States)

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  17. In vitro cumulative gas production techniques: history, methodological considerations and challenges

    NARCIS (Netherlands)

    Rymer, C.; Huntington, J.A.; Williams, B.A.; Givens, D.I.

    2005-01-01

    Methodology used to measure in vitro gas production is reviewed to determine impacts of sources of variation on resultant gas production profiles (GPP). Current methods include measurement of gas production at constant pressure (e.g., use of gas tight syringes), a system that is inexpensive, but may

  18. Adaptation of Agile Project Management Methodology for Project Team

    Directory of Open Access Journals (Sweden)

    Rasnacis Arturs

    2015-12-01

    Full Text Available A project management methodology that defines basic processes, tools, techniques, methods, resources and procedures used to manage a project is necessary for effective and successful IT project management. Each company needs to define its own methodology or adapt some of the existing ones. The purpose of the research is to evaluate the possibilities of adapting IT project development methodology according to the company, company employee characteristics and their mutual relations. The adaptation process will be illustrated with a case study at an IT company in Latvia where the developed methodology is based on Agile Scrum, one of the most widespread Agile methods.

  19. Acceleration techniques for the discrete ordinate method

    International Nuclear Information System (INIS)

    Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego; Trautmann, Thomas

    2013-01-01

    In this paper we analyze several acceleration techniques for the discrete ordinate method with matrix exponential and the small-angle modification of the radiative transfer equation. These techniques include the left-eigenvector-matrix approach for computing the inverse of the right-eigenvector matrix, the telescoping technique, and the method of false discrete ordinate. The numerical simulations have shown that, on average, the relative speedups of the left-eigenvector-matrix approach and the telescoping technique are about 15% and 30%, respectively. -- Highlights: ► We presented the left eigenvector matrix approach. ► We analyzed the method of false discrete ordinate. ► The telescoping technique is applied for the matrix operator method. ► The considered techniques accelerate the computations by 20% on average.
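    The left-eigenvector idea can be demonstrated numerically on a generic matrix: for a diagonalizable A, the suitably normalized left eigenvectors form the inverse of the right-eigenvector matrix, so no explicit matrix inversion is needed. This is a minimal sketch, not the paper's radiative transfer code.

    ```python
    import numpy as np

    # For diagonalizable A, left and right eigenvectors satisfy W @ V = D
    # (diagonal); normalizing the rows of W makes W = V^{-1} without inverting.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((5, 5))

    lam_r, V = np.linalg.eig(A)      # columns of V: right eigenvectors
    lam_l, U = np.linalg.eig(A.T)    # columns of U: left eigenvectors of A

    # align the two decompositions by sorting both eigenvalue sets the same way
    V = V[:, np.argsort(lam_r)]
    U = U[:, np.argsort(lam_l)]

    W = U.T                          # rows: left eigenvectors
    W = W / np.diag(W @ V)[:, None]  # normalize so that W @ V = I
    ```

    The biorthogonality that makes this work (a left and a right eigenvector with different eigenvalues are orthogonal) holds whenever the eigenvalues are distinct, which is almost surely the case for a random matrix.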

  20. SCIENTIFIC METHODOLOGY FOR THE APPLIED SOCIAL SCIENCES: CRITICAL ANALYSES ABOUT RESEARCH METHODS, TYPOLOGIES AND CONTRIBUTIONS FROM MARX, WEBER AND DURKHEIM

    Directory of Open Access Journals (Sweden)

    Mauricio Corrêa da Silva

    2015-06-01

    Full Text Available This study aims to discuss the importance of the scientific method for conducting and disseminating research in the applied social sciences, to review research typologies, and to highlight the contributions of Marx, Weber and Durkheim to scientific methodology. To reach this objective, we conducted a review of the literature on the term research, the scientific method, research techniques and scientific methodologies. The results of the investigation revealed that it is fundamental for the academic investigator to use a scientific method to conduct and publicize academic work in the applied social sciences, just as in the biochemical or computer sciences. Regarding the contributions to scientific methodology: Marx contributed the dialectical method, a striking, explicative analysis of social phenomena, and the need to understand phenomena as historical and concrete totalities; Weber, the distinction between "facts" and "value judgments" that gives objectivity to the social sciences; and Durkheim, the need to conceptualize the object of study very well, to reject sensible data, and to be imbued with the spirit of discovery and of being surprised by the results.

  1. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  2. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    International Nuclear Information System (INIS)

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-01

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  3. Approaches to qualitative research in mathematics education examples of methodology and methods

    CERN Document Server

    Bikner-Ahsbahs, Angelika; Presmeg, Norma

    2014-01-01

    This volume documents a range of qualitative research approaches that have emerged within mathematics education over the last three decades, whilst at the same time revealing their underlying methodologies. Continuing the discussion begun in the two 2003 ZDM issues dedicated to qualitative empirical methods, this book presents a state-of-the-art overview of qualitative research in mathematics education and beyond. The structure of the book allows the reader to use it as an actual guide for the selection of an appropriate methodology, on the basis of both theoretical depth and practical implications. The methods and examples illustrate how different methodologies come to life when applied to a specific question in a specific context. Many of the methodologies described are also applicable outside mathematics education, but the examples provided are chosen so as to situate the approach in a mathematical context.

  4. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka

    Directory of Open Access Journals (Sweden)

    Nayana Gunathilaka

    2017-01-01

    Full Text Available Introduction. Selection of the artificial membrane feeding technique and blood meal source has been recognized as a key consideration in mass rearing of vectors. Methodology. Artificial membrane feeding techniques, namely, glass plate, metal plate, and Hemotek membrane feeding method, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. Significance of the variations among blood feeding methods was investigated by one-way ANOVA, cluster analysis of variance (ANOSIM), and principal coordinates (PCO) analysis. Results. Feeding rates of Ae. aegypti significantly differed among the membrane feeding techniques, as suggested by one-way ANOVA (p < 0.05). Conclusions. The metal plate method could be recommended as the most effective membrane feeding technique for mass rearing of Ae. aegypti, due to its high feeding rate and cost effectiveness. Cattle blood could be recommended for mass rearing Ae. aegypti.
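    The one-way ANOVA step used in this study can be sketched with hypothetical replicate feeding rates; the numbers below are invented, and only the test procedure matches the study.

    ```python
    from scipy.stats import f_oneway

    # One-way ANOVA: do mean feeding rates differ across the three techniques?
    # Replicate feeding-rate percentages per technique (illustrative values).
    glass = [62.1, 58.4, 60.3, 59.8]
    metal = [81.5, 84.2, 79.9, 82.7]
    hemotek = [70.2, 68.9, 72.4, 71.1]

    stat, p = f_oneway(glass, metal, hemotek)
    significant = p < 0.05
    ```

    With group means this far apart relative to within-group scatter, the F statistic is large and the null hypothesis of equal means is rejected, analogous to the significant differences reported above.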

  5. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    Science.gov (United States)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures, together with algorithms, to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with Q and nonlinear T² damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28%.
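    The Q-index part of this methodology can be sketched with ordinary linear PCA on synthetic data; the paper's hierarchical nonlinear PCA, T² index, and clustering-based optimal baseline selection are not reproduced here.

    ```python
    import numpy as np

    # Q (squared prediction error) damage index: build a PCA model from pristine
    # strain snapshots, then flag snapshots whose residual outside the retained
    # subspace is unusually large.
    rng = np.random.default_rng(2)
    n_sensors, n_snapshots = 8, 300
    latent = rng.standard_normal((n_snapshots, 2))           # two load "modes"
    mixing = rng.standard_normal((2, n_sensors))
    baseline = latent @ mixing + 0.01 * rng.standard_normal((n_snapshots, n_sensors))

    mean = baseline.mean(axis=0)
    _, _, Vt = np.linalg.svd(baseline - mean, full_matrices=False)
    P = Vt[:2].T                                             # retained directions

    def q_index(x):
        residual = (x - mean) - (x - mean) @ P @ P.T         # part outside model
        return float(residual @ residual)

    threshold = np.quantile([q_index(x) for x in baseline], 0.99)
    damaged = baseline[0] + 0.5 * rng.standard_normal(n_sensors)  # altered pattern
    ```

    A "damaged" snapshot whose strain pattern no longer fits the baseline subspace produces a Q value far above the baseline-derived threshold; the OBS step in the paper chooses which baseline model to compare against when several load conditions are possible.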

  6. Priority research directions in the area of qualitative methodology

    OpenAIRE

    Melnikova, Olga; Khoroshilov, Dmitry

    2010-01-01

    The basic directions of modern theoretical and practical research in the area of qualitative methodology in Russia are discussed in the article. The complexity of research is considered from three points of view: the development of the methodology of qualitative analysis, qualitative methods, and verbal and nonverbal projective techniques. The authors present an integrative model of qualitative analysis, research on the specificity of the use of the discourse-analysis method, and projective techniques.

  7. Determination of plutonium isotopic abundances by gamma-ray spectrometry. Interim report on the status of methods and techniques developed by the Lawrence Livermore Laboratory

    International Nuclear Information System (INIS)

    Gunnink, R.

    1980-03-01

    This report presents an overview of methods and techniques developed by the Lawrence Livermore Laboratory for determining plutonium isotopic abundances from gamma-ray spectra that have been measured with germanium detectors. The methodology of fitting the spectral features includes discussions of algorithms for gamma-ray and x-ray peak shape fitting and generation of response spectra profiles characteristic of specific isotopes. Applications of the techniques developed at government, commercial, and Japanese reprocessing plants are described. Current development of the methodology for the nondestructive analysis of samples containing nondescript solid materials is also presented
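    The peak-shape fitting step described in this record can be illustrated by fitting a Gaussian plus a linear background to a synthetic photopeak. Real germanium-detector peak models add low-energy tailing terms and x-ray line shapes; those are omitted in this sketch, and all channel data are invented.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Fit a Gaussian photopeak plus linear background to synthetic channel data.
    def peak_model(x, amp, centroid, sigma, b0, b1):
        return amp * np.exp(-((x - centroid) ** 2) / (2 * sigma ** 2)) + b0 + b1 * x

    chan = np.arange(80.0, 121.0)
    rng = np.random.default_rng(3)
    counts = peak_model(chan, 500.0, 100.0, 2.5, 20.0, 0.05) + rng.normal(0, 3.0, chan.size)

    popt, _ = curve_fit(peak_model, chan, counts, p0=(400, 99, 2, 10, 0))
    net_area = popt[0] * abs(popt[2]) * np.sqrt(2 * np.pi)   # area under the Gaussian
    ```

    The fitted net peak area is what isotopic-abundance calculations consume; in practice, overlapping peaks from several plutonium isotopes are fitted simultaneously with shared shape parameters.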

  8. Statistical methods of evaluating and comparing imaging techniques

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, there has developed a need to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted and many reports have appeared in the literature. These studies cover a range of clinical situations. Likewise, the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested

  9. Profile of research methodology and statistics training of ...

    African Journals Online (AJOL)

    The aim of this study was to determine the profile of research methodology and ... Method: Respondents for this descriptive study were persons responsible for the ..... universities: all study designs, all sampling techniques, incidence and.

  10. Dating method by electron spin resonance at the 'Museum National d'Histoire Naturelle'. Twenty years of methodological researches and geochronological applications

    International Nuclear Information System (INIS)

    Bahain, J.J.

    2007-12-01

    The Electron Spin Resonance (ESR) dating method has evolved considerably since its first uses in France in the early 1980s. The samples classically used until the mid-1990s, carbonates and bones, were progressively forsaken and replaced by teeth and bleached quartz. Analytical progress and methodological developments related to the study of these two materials have considerably increased, on the one hand, the precision and accuracy of the results obtained, allowing comparison with those derived from other geochronological techniques, and, on the other hand, the chronological range of application of the ESR method. Even if the ESR method still undergoes many methodological developments today, it has an undeniable geochronological potential, and it is one of the rare methods permitting the direct dating of Lower and Middle Pleistocene layers in non-volcanic areas. Its application is therefore of considerable importance for the study of the first human settlements of Eurasia, and the possibility of dating various types of materials collected from the same archaeological level, jointly by ESR and several other geochronological techniques, offers in addition a tool to evaluate the reliability of the obtained ages by allowing an intercalibration of the results. Some of the most significant results obtained at the Department of Prehistory of the National Museum of Natural History are presented in this report and illustrate both the potential and the current limits of this method. (author)
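    At its core, the method rests on the basic ESR age equation: age equals the equivalent (accumulated) dose recorded by the sample divided by the annual dose rate it received. A trivial sketch with invented numbers follows; real dose-rate evaluation is far more involved (internal and external components, uranium uptake models, cosmic contribution).

    ```python
    # Basic ESR age equation with illustrative, made-up values.
    equivalent_dose_gy = 250.0       # palaeodose estimated from the ESR signal, Gy
    annual_dose_gy = 1.25e-3         # total annual dose rate, Gy per year

    age_years = equivalent_dose_gy / annual_dose_gy
    ```

    With these numbers the sample would date to about 200,000 years, squarely in the Middle Pleistocene range the record emphasizes.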

  11. Computational techniques of the simplex method

    CERN Document Server

    Maros, István

    2003-01-01

    Computational Techniques of the Simplex Method is a systematic treatment focused on the computational issues of the simplex method. It provides a comprehensive coverage of the most important and successful algorithmic and implementation techniques of the simplex method. It is a unique source of essential, never discussed details of algorithmic elements and their implementation. On the basis of the book the reader will be able to create a highly advanced implementation of the simplex method which, in turn, can be used directly or as a building block in other solution algorithms.
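    As a usage sketch of the kind of problem the simplex method solves, a small linear program can be handed to SciPy's linprog; this uses its 'highs' solver, not the book's own revised-simplex internals.

    ```python
    from scipy.optimize import linprog

    # Small LP:  maximize 3x + 2y  s.t.  x + y <= 4,  x + 3y <= 6,  x, y >= 0.
    # linprog minimizes, so the objective coefficients are negated.
    res = linprog(c=[-3, -2],
                  A_ub=[[1, 1], [1, 3]],
                  b_ub=[4, 6],
                  bounds=[(0, None), (0, None)])
    x, y = res.x
    optimum = -res.fun               # optimal value of the original maximization
    ```

    The optimum sits at the vertex (4, 0) with objective value 12, illustrating the vertex-to-vertex logic whose efficient implementation (basis factorization, pricing, pivoting) is the book's subject.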

  12. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  13. Study of input variables in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2013-01-01

    The Group Method of Data Handling (GMDH) is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has been traditionally determined using a layer-by-layer pruning process based on a pre-selected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on the underlying assumption that the data can be modeled by an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and ANN methodologies and applied to the IPEN research reactor IEA-1. The system performs monitoring by comparing the GMDH- and ANN-calculated values with measured ones. As GMDH is a self-organizing methodology, the choice of input variables is made automatically. On the other hand, the results of the ANN methodology are strongly dependent on which variables are used as neural network inputs. (author)
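    One GMDH layer can be sketched as follows: fit the quadratic Kolmogorov-Gabor polynomial on every pair of candidate inputs over a training split, and keep the pair with the lowest error on a separate selection split (the external criterion that drives the self-organizing input choice mentioned above). Everything below is synthetic.

    ```python
    import numpy as np
    from itertools import combinations

    # One GMDH layer: for each input pair (xi, xj), fit
    #   y = a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2
    # on a training split; rank pairs by error on a held-out selection split.
    def node_features(xi, xj):
        return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi ** 2, xj ** 2])

    rng = np.random.default_rng(4)
    X = rng.standard_normal((200, 4))                       # 4 candidate inputs
    y = 1.0 + 2.0 * X[:, 1] * X[:, 2] + X[:, 2] ** 2        # depends on inputs 1, 2

    train, select = slice(0, 120), slice(120, 200)
    best = None
    for i, j in combinations(range(X.shape[1]), 2):
        F = node_features(X[:, i], X[:, j])
        coef, *_ = np.linalg.lstsq(F[train], y[train], rcond=None)
        err = np.mean((F[select] @ coef - y[select]) ** 2)  # external criterion
        if best is None or err < best[0]:
            best = (err, (i, j), coef)

    err, pair, coef = best
    ```

    Because the target here depends only on inputs 1 and 2, the external criterion selects exactly that pair; a full GMDH network would feed the surviving node outputs into further layers until the criterion stops improving.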

  14. Methodology for the application of probabilistic safety assessment techniques (PSA) to the cobalt-therapy units in Cuba

    International Nuclear Information System (INIS)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Troncoso Fleitas, M.; Lozano Lima, B.; Fuente Puch, A. de la; Perez Reyes, Y.; Dumenigo Gonzalez, C.

    2001-01-01

    The application of PSA techniques in nuclear power plants over the last two decades, and the positive results obtained in safety-related decision making as a complement to deterministic methods, have increased their use in other nuclear applications. A large set of documents from international institutions now summarizes the investigations carried out in this field and promotes their use in radioactive facilities. Although not yet mandatory, the new radiological safety regulations also promote the complete or partial application of PSA techniques in the safety assessment of radiological practices. The IAEA, through various programs in which Cuba participates, is likewise taking steps to encourage the nuclear community to apply probabilistic risk methods in safety evaluations and decision making. However, since no complete PSA study has yet been carried out in a radioactive installation, certain methodological aspects still need to be improved and modified before these techniques can be applied. This work presents the main elements for the use of PSA in the evaluation of the safety of cobalt-therapy units in Cuba. Also presented, as part of the results of the first stage of the study, are the guidelines being applied in a Research Contract with the Agency by the authors, who belong to the CNSN, together with other specialists from the Cuban Ministry of Public Health. (author) [es

  15. Floating Node Method and Virtual Crack Closure Technique for Modeling Matrix Cracking-Delamination Migration

    Science.gov (United States)

    DeCarvalho, Nelson V.; Chen, B. Y.; Pinho, Silvestre T.; Baiz, P. M.; Ratcliffe, James G.; Tay, T. E.

    2013-01-01

    A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.

  16. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest in medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to these sophisticated mathematical tools. Yet many such techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and Simplex, which are expected to speed up the optimization process while giving more insight into the relationships among the dependent variables controlling the response.

  17. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology includes the design, analysis, testing, fabrication, procurement, and certification of Type B containers, allowing their use in support of United States Department of Energy programs. While all aspects of packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques to enhance the design and provide a basis for certification. The methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis, and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments, and how these responses influence the effectiveness of the packaging, requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs.

  18. VIII. THE PAST, PRESENT, AND FUTURE OF DEVELOPMENTAL METHODOLOGY.

    Science.gov (United States)

    Little, Todd D; Wang, Eugene W; Gorrall, Britt K

    2017-06-01

    This chapter selectively reviews the evolution of quantitative practices in the field of developmental methodology. The chapter begins with an overview of the past in developmental methodology, discussing the implementation and dissemination of latent variable modeling and, in particular, longitudinal structural equation modeling. It then turns to the present state of developmental methodology, highlighting current methodological advances in the field. Additionally, this section summarizes ample quantitative resources, ranging from key quantitative methods journal articles to the various quantitative methods training programs and institutes. The chapter concludes with the future of developmental methodology and puts forth seven future innovations in the field. The innovations discussed span the topics of measurement, modeling, temporal design, and planned missing data designs. Lastly, the chapter closes with a brief overview of advanced modeling techniques such as continuous time models, state space models, and the application of Bayesian estimation in the field of developmental methodology. © 2017 The Society for Research in Child Development, Inc.

  19. Diuresis renography in children: methodological aspects

    International Nuclear Information System (INIS)

    Bonnin, F.; Le Stanc, E.; Busquet, G.; Saidi, L.; Lyonnet, F.

    1995-01-01

    In paediatrics, diuresis renography is used as a method to guide clinical management of hydronephrosis or hydro-uretero-nephrosis. Various pitfalls in the technique and other errors exist and may lead to a misinterpretation of the test. The methodology for performing and interpreting the diuresis renography is discussed. (authors). 12 refs., 4 figs

  20. An Improved Clutter Suppression Method for Weather Radars Using Multiple Pulse Repetition Time Technique

    Directory of Open Access Journals (Sweden)

    Yingjie Yu

    2017-01-01

    This paper describes the implementation of an improved clutter suppression method for the multiple pulse repetition time (PRT) technique, based on simulated radar data. The suppression method is constructed using maximum likelihood methodology in the time domain and is called the parametric time domain method (PTDM). The procedure relies on the assumption that the precipitation and clutter signal spectra follow a Gaussian functional form. The four interleaved pulse repetition frequencies (PRFs) used in this work are 952, 833, 667, and 513 Hz. Radar simulations show that the new method provides accurate retrieval of Doppler velocity even in the case of strong clutter contamination. The obtained velocity is nearly unbiased over the whole Nyquist velocity interval. The performance of the method is also illustrated on simulated radar data for a plan position indicator (PPI) scan. Compared with staggered 2-PRT transmission schemes with PTDM, the proposed method presents better estimation accuracy under certain clutter situations.
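
    The Gaussian spectral assumption underlying PTDM can be illustrated with a toy spectrum (illustrative parameters, not the paper's implementation): clutter is a narrow Gaussian at zero Doppler velocity, precipitation a broader Gaussian at the wind velocity, and once the clutter region is excluded the first spectral moment recovers the precipitation velocity.

    ```python
    # Toy version of the Gaussian spectral model assumed by PTDM
    # (illustrative parameters, not the paper's implementation).
    import numpy as np

    def gaussian_psd(v, power, mean, width):
        """Power spectral density of one Gaussian component over velocity v."""
        return power / (np.sqrt(2 * np.pi) * width) * np.exp(-0.5 * ((v - mean) / width) ** 2)

    v = np.linspace(-16, 16, 2049)                               # Doppler velocity axis, m/s
    clutter = gaussian_psd(v, power=100.0, mean=0.0, width=0.3)  # ground clutter at 0 m/s
    weather = gaussian_psd(v, power=1.0, mean=7.5, width=2.0)    # precipitation at 7.5 m/s
    spectrum = clutter + weather + 1e-4                          # plus a white-noise floor

    # Excluding the clutter region (here simply |v| < 2 m/s), the first
    # spectral moment recovers the precipitation velocity despite clutter
    # that is 20 dB stronger than the weather signal.
    mask = np.abs(v) > 2.0
    v_hat = float(np.sum(v[mask] * spectrum[mask]) / np.sum(spectrum[mask]))
    print(round(v_hat, 1))  # close to the 7.5 m/s precipitation velocity
    ```

    The actual PTDM works on the time-domain autocovariances of the multi-PRT samples rather than on an explicit spectrum, but the Gaussian shape assumption plays the same role there.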

  1. Assessment of change in knowledge about research methods among delegates attending research methodology workshop

    Directory of Open Access Journals (Sweden)

    Manisha Shrivastava

    2018-01-01

    Conclusion: There was an increase in the delegates' knowledge after attending the research methodology workshops. Participatory research methodology workshops are a good method of imparting knowledge, although the long-term effects need to be evaluated.

  2. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    Science.gov (United States)

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  3. Variable identification in group method of data handling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Iraci Martinez, E-mail: martinez@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Bueno, Elaine Inacio [Instituto Federal de Educacao, Ciencia e Tecnologia, Guarulhos, SP (Brazil)

    2011-07-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on an underlying assumption that the data can be modeled by using an approximation of the Volterra series, or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on the GMDH and Artificial Neural Network (ANN) methodologies and applied to the IPEN research reactor IEA-R1. The GMDH was used to find the best set of variables with which to train an ANN, yielding the best estimate of each monitored variable. The system performs the monitoring by comparing these estimated values with measured ones. The IPEN reactor data acquisition system comprises 58 variables (process and nuclear variables). As GMDH is a self-organizing methodology, the choice of input variables is made automatically, and the actual input variables used in the Monitoring and Diagnosis System do not appear in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, identifying the variables that compose the best Monitoring and Diagnosis Model. (author)

  4. Variable identification in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2011-01-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on an underlying assumption that the data can be modeled by using an approximation of the Volterra series, or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on the GMDH and Artificial Neural Network (ANN) methodologies and applied to the IPEN research reactor IEA-R1. The GMDH was used to find the best set of variables with which to train an ANN, yielding the best estimate of each monitored variable. The system performs the monitoring by comparing these estimated values with measured ones. The IPEN reactor data acquisition system comprises 58 variables (process and nuclear variables). As GMDH is a self-organizing methodology, the choice of input variables is made automatically, and the actual input variables used in the Monitoring and Diagnosis System do not appear in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, identifying the variables that compose the best Monitoring and Diagnosis Model. (author)

  5. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    Science.gov (United States)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate, and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox, are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and, based on that condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models is statistically combined to generate a discrepancy signal, which is post-processed to infer the condition of the gearbox. The discrepancy signal is further combined with statistical methods for automatic fault detection and localisation and for fault trending over time. The proposed methodology is validated on experimental data, and a tacholess order tracking methodology is used to enhance its cost-effectiveness.
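
    The discrepancy idea can be reduced to a toy sketch (plain Gaussians standing in for the paper's hidden Markov models; the feature values and condition names are invented): features measured on the healthy gearbox define a reference density per operating condition, and the discrepancy signal is the negative log-likelihood of new measurements under that reference.

    ```python
    # Toy discrepancy signal (Gaussians standing in for the paper's HMMs;
    # feature values and condition names are invented for illustration).
    import math

    def gaussian_nll(x, mean, std):
        """Negative log-likelihood of x under N(mean, std**2)."""
        return 0.5 * math.log(2 * math.pi * std**2) + 0.5 * ((x - mean) / std) ** 2

    # Healthy-gearbox reference per inferred operating condition.
    reference = {"low_load": (1.0, 0.1), "high_load": (2.0, 0.2)}

    def discrepancy(feature, condition):
        mean, std = reference[condition]
        return gaussian_nll(feature, mean, std)

    healthy = discrepancy(1.05, "low_load")  # near the healthy reference
    damaged = discrepancy(1.60, "low_load")  # drifted feature, large discrepancy
    print(healthy < damaged)  # True: the discrepancy flags the drifted feature
    ```

    Conditioning the reference on the inferred operating state is what keeps a load fluctuation from being mistaken for a fault.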

  6. Assessment of proliferation resistances of aqueous reprocessing techniques using the TOPS methodology

    International Nuclear Information System (INIS)

    Åberg Lindell, M.; Grape, S.; Håkansson, A.; Jacobsson Svärd, S.

    2013-01-01

    Highlights: • Proliferation resistances of three possible LFR fuel cycles are assessed. • The TOPS methodology has been chosen for the PR assessment. • Reactor operation, reprocessing and fuel fabrication are examined. • Purex, Ganex, and a combination of Purex, Diamex and Sanex, are compared. • The safeguards analysis speaks in favor of Ganex as opposed to the Purex process. - Abstract: The aim of this study is to assess and compare the proliferation resistances (PR) of three possible Generation IV lead-cooled fast reactor fuel cycles, involving the reprocessing techniques Purex, Ganex and a combination of Purex, Diamex and Sanex, respectively. The examined fuel cycle stages are reactor operation, reprocessing and fuel fabrication. The TOPS methodology has been chosen for the PR assessment, and the only threat studied is the case where a technically advanced state diverts nuclear material covertly. According to the TOPS methodology, the facilities have been divided into segments, here roughly representing the different forms of nuclear material occurring in each examined fuel cycle stage. For each segment, various proliferation barriers have been assessed. The results make it possible to pinpoint where the facilities can be improved. The results show that the proliferation resistance of a fuel cycle involving recycling of minor actinides is higher than for the traditional Purex reprocessing cycle. Furthermore, for the purpose of nuclear safeguards, group actinide extraction should be preferred over reprocessing options where pure plutonium streams occur. This is due to the fact that a solution containing minor actinides is less attractive to a proliferator than a pure Pu solution. Thus, the safeguards analysis speaks in favor of Ganex as opposed to the Purex process

  7. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    Science.gov (United States)

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Throughout the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  8. Valuation of micro and small enterprises using the methodology multicriteria and method of discounted cash flow

    Directory of Open Access Journals (Sweden)

    Marcus Vinicius Andrade de Lima

    2010-01-01

    This paper presents a contribution to the discounted cash flow method using multicriteria decision aid. The methodology incorporates qualitative and subjective variables into the traditional discounted cash flow method used in company valuation. To illustrate the proposed method, a descriptive, exploratory multi-case study was carried out. The intervention took place in micro and small enterprises (MSEs) from the chemical, pharmaceutical, and tourism sectors. As a result, the appraiser set the price of the business taking into account the outcome of combining the two methodologies.
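
    For context, the discounted cash flow core that the multicriteria adjustments build on can be sketched as follows (illustrative figures, not the paper's case data): firm value is the sum of projected cash flows discounted at rate r, plus a Gordon-growth terminal value.

    ```python
    # Minimal discounted-cash-flow core (illustrative figures, not the
    # paper's case data; the multicriteria step would adjust the inputs to
    # this formula, not the formula itself).
    def dcf_value(cash_flows, r, g):
        """PV of explicit-period cash flows plus a Gordon-growth terminal value."""
        pv = sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))
        terminal = cash_flows[-1] * (1 + g) / (r - g)  # value at end of last year
        return pv + terminal / (1 + r) ** len(cash_flows)

    value = dcf_value([100.0, 110.0, 120.0], r=0.15, g=0.02)  # discount 15%, grow 2%
    print(round(value, 2))  # most of the value sits in the terminal term
    ```

    In the paper's proposal, the qualitative multicriteria assessment informs inputs such as the discount rate rather than replacing the discounting itself.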

  9. Essential ultrasound techniques of the pediatric airway

    DEFF Research Database (Denmark)

    Stafrace, Samuel; Engelhardt, Thomas; Teoh, Wendy H

    2016-01-01

    Ultrasound of the airways is a technique which has been described in a number of recent articles and reviews highlighting the diagnostic possibilities and simple methodology. However, there is a paucity of information focusing specifically on such methods in children where equipment, technique, a...

  10. Are There Two Methods of Grounded Theory? Demystifying the Methodological Debate

    Directory of Open Access Journals (Sweden)

    Cheri Ann Hernandez, RN, Ph.D., CDE

    2008-06-01

    Grounded theory is an inductive research method for the generation of substantive or formal theory, using qualitative or quantitative data generated from research interviews, observation, or written sources, or some combination thereof (Glaser & Strauss, 1967). In recent years there has been much controversy over the etiology of its discovery, as well as the exact way in which grounded theory research is to be operationalized. Unfortunately, this situation has resulted in much confusion, particularly among novice researchers who wish to utilize this research method. In this article, the historical, methodological, and philosophical roots of grounded theory are delineated in a beginning effort to demystify this methodological debate. Grounded theory variants such as feminist grounded theory (Wuest, 1995) or constructivist grounded theory (Charmaz, 1990) are beyond the scope of this discussion.

  11. Identifying plant cell-surface receptors: combining 'classical' techniques with novel methods.

    Science.gov (United States)

    Uebler, Susanne; Dresselhaus, Thomas

    2014-04-01

    Cell-cell communication during development and reproduction in plants depends largely on a few phytohormones and many diverse classes of polymorphic secreted peptides. The peptide ligands are bound at the cell surface of target cells by their membranous interaction partners representing, in most cases, either receptor-like kinases or ion channels. Although knowledge of both the extracellular ligand and its corresponding receptor(s) is necessary to describe the downstream signalling pathway(s), to date only a few ligand-receptor pairs have been identified. Several methods, such as affinity purification and yeast two-hybrid screens, have been used very successfully to elucidate interactions between soluble proteins, but most of these methods cannot be applied to membranous proteins. Experimental obstacles such as low concentration and poor solubility of membrane receptors, as well as unstable transient interactions, often hamper the use of these 'classical' approaches. However, over the last few years, much progress has been made to overcome these problems by combining classical techniques with new methodologies. In the present article, we review the most promising recent methods in identifying cell-surface receptor interactions, with an emphasis on success stories outside the field of plant research.

  12. An accurate method for determining residual stresses with magnetic non-destructive techniques in welded ferromagnetic steels

    International Nuclear Information System (INIS)

    Vourna, P

    2016-01-01

    The scope of the present research work was to investigate proper selection criteria for developing a methodology for the accurate determination of residual stresses in welded parts. Magnetic non-destructive testing was carried out using two techniques: measurement of the magnetic Barkhausen noise and evaluation of the magnetic hysteresis loop parameters. The spatial distribution of residual stresses in welded metal parts was determined by both non-destructive magnetic methods and by two diffraction methods. Conducting the magnetic measurements required an initial calibration of the ferromagnetic steels. Based on the examined volume of the sample, the methods used fall into two large categories: the first is related to the determination of surface residual stress, the second to bulk residual stress determination. The first category includes the magnetic Barkhausen noise and the X-ray diffraction measurements, while the second includes the magnetic permeability and the neutron diffraction data. The residual stresses determined by the magnetic techniques were in good agreement with the diffraction results. (paper)

  13. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    Science.gov (United States)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

    An integrated geophysical investigation was performed at S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key component of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were complemented by resistivity logging, used to identify internal erosion that had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection tests, and rock quality designation.

  14. Quantifying biopsychosocial aspects in everyday contexts: an integrative methodological approach from the behavioral sciences

    Science.gov (United States)

    Portell, Mariona; Anguera, M Teresa; Hernández-Mendo, Antonio; Jonsson, Gudberg K

    2015-01-01

    Contextual factors are crucial for evaluative research in psychology, as they provide insights into what works, for whom, in what circumstances, in what respects, and why. Studying behavior in context, however, poses numerous methodological challenges. Although a comprehensive framework for classifying methods seeking to quantify biopsychosocial aspects in everyday contexts was recently proposed, this framework does not contemplate contributions from observational methodology. The aim of this paper is to justify and propose a more general framework that includes observational methodology approaches. Our analysis is rooted in two general concepts: ecological validity and methodological complementarity. We performed a narrative review of the literature on research methods and techniques for studying daily life and describe their shared properties and requirements (collection of data in real time, on repeated occasions, and in natural settings) and classification criteria (eg, variables of interest and level of participant involvement in the data collection process). We provide several examples that illustrate why, despite their higher costs, studies of behavior and experience in everyday contexts offer insights that complement findings provided by other methodological approaches. We urge that observational methodology be included in classifications of research methods and techniques for studying everyday behavior and advocate a renewed commitment to prioritizing ecological validity in behavioral research seeking to quantify biopsychosocial aspects. PMID:26089708

  15. Tenon hospital 3-D dosimetric methodology for radiosurgery of complex AVMs

    International Nuclear Information System (INIS)

    Lefkopoulos, D.; Schlienger, M.; Plazas, M.C.; Laugier, A.

    1990-01-01

    This paper presents the methodology of irradiation treatment planning for the calculation of the 3-D dose distribution, developed at the Tenon Hospital over the past four years. This dosimetric method is independent of the Linac irradiation technique, thus it can be used with any other type of radiosurgery technique. (author)

  16. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    Preloading technique was used as a means of an accelerated testing methodology in constant stress-rate (dynamic fatigue) testing for two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for glass-ceramic and CRT glass in room temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics including SiC whisker-reinforced composite silicon nitride and 96 wt% alumina were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism that is associated with such a phenomenon of considerable strength increase or decrease.
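
    The arithmetic behind the acceleration is simple (illustrative numbers; the fatigue-strength analysis itself is in the paper): in a constant stress-rate test the stress ramps as sigma = rate * t, so starting the ramp from a preload equal to a fraction alpha of the expected fatigue strength removes that same fraction of the test time.

    ```python
    # Back-of-envelope time saving from preloading (illustrative numbers,
    # not the paper's data): stress grows as sigma = rate * t, so a preload
    # at alpha * sigma_f removes the fraction alpha of the ramp time.
    def ramp_time(sigma_f, rate, preload_fraction=0.0):
        """Seconds to reach the fatigue strength sigma_f from the preload level."""
        return sigma_f * (1.0 - preload_fraction) / rate

    t_plain = ramp_time(sigma_f=400.0, rate=0.004)                         # MPa, MPa/s
    t_preload = ramp_time(sigma_f=400.0, rate=0.004, preload_fraction=0.8)
    print(t_plain / 3600.0, t_preload / 3600.0)  # ~27.8 h shrinks to ~5.6 h
    ```

    The saving matters most at the slow stress rates used to probe elevated-temperature failure mechanisms, where an unpreloaded test can run for days.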

  17. FORMATION OF COGNITIVE INTEREST AT ENGLISH LANGUAGE LESSONS IN PRIMARY SCHOOL: TECHNOLOGIES, METHODS, TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Kotova, E.G.

    2017-09-01

    In the modern methodology of teaching foreign languages there are many didactic and technological methods and techniques that shape and develop the cognitive interest of primary school students. The use of various forms of gaming interaction, problem assignments, and information and communication technologies in teaching primary school students makes it possible to diversify the teaching of a foreign language and contributes to the development of their creative and cognitive activity. The use of health-saving technologies ensures a psychologically and emotionally supportive atmosphere in the lesson, which is an essential condition for acquiring new knowledge and maintaining a stable cognitive interest among students while learning a foreign language.

  18. An Effective Vacuum Assisted Extraction Method for the Optimization of Labdane Diterpenoids from Andrographis paniculata by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Ya-Qi Wang

    2014-12-01

    An effective vacuum assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycle, and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkably higher extraction efficiency, indicating that a certain degree of vacuum gives the solvent better penetration into the pores and between the matrix particles, enhancing mass transfer. The present results demonstrate that VAE is an efficient, simple, and fast method for extracting bioactive components from A. paniculata, with great potential for becoming an alternative technique for industrial scale-up applications.

  19. An effective vacuum assisted extraction method for the optimization of labdane diterpenoids from Andrographis paniculata by response surface methodology.

    Science.gov (United States)

    Wang, Ya-Qi; Wu, Zhen-Feng; Ke, Gang; Yang, Ming

    2014-12-31

    An effective vacuum assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycle, and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkably higher extraction efficiency, indicating that a certain degree of vacuum gives the solvent better penetration into the pores and between the matrix particles, enhancing mass transfer. The present results demonstrate that VAE is an efficient, simple, and fast method for extracting bioactive components from A. paniculata, with great potential for becoming an alternative technique for industrial scale-up applications.
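
    The response-surface step of such an optimization can be sketched generically (synthetic yields in coded factor levels, not the paper's measurements): fit a second-order model to the observed response and read off the stationary point of the fitted quadratic as the candidate optimum.

    ```python
    # Generic response-surface step (synthetic yields in coded levels, not
    # the paper's data): fit a second-order model, solve for its stationary
    # point.
    import numpy as np

    def design(T, t):
        return np.column_stack([np.ones_like(T), T, t, T * t, T**2, t**2])

    # 3x3 full factorial in coded levels (-1, 0, +1), e.g. temperature and time.
    levels = np.array([(a, b) for a in (-1, 0, 1) for b in (-1, 0, 1)], dtype=float)
    T, t = levels[:, 0], levels[:, 1]
    response = 80.0 + 2.0 * T - 5.0 * T**2 - 3.0 * t**2  # synthetic yield surface

    beta, *_ = np.linalg.lstsq(design(T, t), response, rcond=None)
    b0, bT, bt, bTt, bTT, btt = beta
    # Stationary point: gradient of the fitted quadratic set to zero.
    H = np.array([[2 * bTT, bTt], [bTt, 2 * btt]])
    optimum = np.linalg.solve(H, -np.array([bT, bt]))
    print(np.round(optimum, 3))  # coded optimum near T = 0.2, t = 0
    ```

    In practice the coded optimum is decoded back to physical units (here temperature and extraction time) and verified with confirmation runs, as RSM studies such as this one do.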

  20. Research Methods in Education

    Science.gov (United States)

    Check, Joseph; Schutt, Russell K.

    2011-01-01

    "Research Methods in Education" introduces research methods as an integrated set of techniques for investigating questions about the educational world. This lively, innovative text helps students connect technique and substance, appreciate the value of both qualitative and quantitative methodologies, and make ethical research decisions.…

  1. A proposed descriptive methodology for environmental geologic (envirogeologic) site characterization

    International Nuclear Information System (INIS)

    Schwarz, D.L.; Snyder, W.S.

    1994-01-01

We propose a descriptive methodology for use in environmental geologic (envirogeologic) site characterization. The method uses traditional sedimentologic descriptions augmented by environmental data needs, and facies analysis. Most other environmental methodologies for soil and sediment characterization use soil engineering and engineering geology techniques that classify by texture and engineering properties. Such techniques are inadequate for envirogeologic characterization of sediments, in part because of differences in grain size between the Unified Soil Classification and the Udden-Wentworth scales. Use of the soil grain-size classification could easily cause confusion when attempting to relate descriptions based on this classification to our basic understanding of sedimentary depositional systems. The proposed envirogeologic method uses descriptive parameters to characterize a sediment sample, suggests specific tests on samples for adequate characterization, and provides guidelines for subsurface facies analysis, based on data retrieved from shallow boreholes, that will allow better predictive models to be developed. This methodology should allow for a more complete site assessment and provide sufficient data for selection of the appropriate remediation technology, including bioremediation. 50 refs

  2. New methodologies for living material imaging. Compilation of summaries

    International Nuclear Information System (INIS)

    Barabino, Gabriele; Beaurepaire, Emmanuel; Betrouni, Nacim; Montagnat, Johan; Moonen, Chrit; Olivo-Marin, Jean-Christophe; Paul-Gilloteaux, Perrine; Tillement, Olivier; Barbier, Emmanuel; Beuf, Olivier; Chamot, Christophe; Clarysse, Patrick; Coll, Jean-Luc; Dojat, Michel; Lartizien, Carole; Peyrin, Francoise; Ratiney, Helene; Texier-Nogues, Isabelle; Usson, Yves; Vial, Jean-Claude; Gaillard, Sophie; Aubry, Jean-Francois; Barillot, Christian; Betrouni, Nacim; Beloeil, Jean-Claude; Bernard, Monique; Bridal, Lori; Coll, Jean-Luc; Cozzone, Patrick; Cuenod, Charles-Andre; Darrasse, Luc; Franconi, Jean-Michel; Frapart, Yves-Michel; Grenier, Nicolas; Guilloteau, Denis; Laniece, Philippe; Guilloteau, Denis; Laniece, Philippe; Lethimonnier, Franck; Moonen, Chrit; Pain, Frederic; Patat, Frederic; Tanter, Mickael; Trebossen, Regine; Van Beers, Bernard; Visvikis, Dimitris; Buvat, Irene; Carrault, Guy; Frouin, Frederique; Kouame, Denis; Meste, Olivier; Peyrin, Francoise; Brasse, David; Buvat, Irene; Dauvergne, Denis; Haddad, Ferid; Menard, Laurent; Ouadi, Ali; Olivo-Marin, Jean-Christophe; Pansu, Robert; Peyrieras, Nadine; Salamero, Jean; Usson, Yves; Werts, Martin; Beaurepaire, Emmanuel; Blanchoin, Laurent; Boltze, Frederic; Cavalli, Giacomo; Choquet, Daniel; Coppey, Maite; Dahan, Maxime; Dieterlen, Alain; Ducommun, Bernard; Favard, Cyril; Fort, Emmanuel; Gadal, Olivier; Heliot, Laurent; Hofflack, Bernard; Kervrann, Charles; Langowski, Jorg; LeBivic, Andre; Leveque-Fort, Sandrine; Matthews, Cedric; Monneret, Serge; Mordon, Serge; Mely, Yves

    2012-12-01

Living material imaging, which is essential to medical diagnosis and therapy methods as well as to fundamental and applied biology, is necessarily pluri-disciplinary, at the intersection of physics, (bio)chemistry and pharmacy, and requires mathematical and computer processing of signals and images. Image processing techniques may be applied at different levels (molecular, cellular or tissue level) or using various modes (optics, X rays, NMR, PET, US). This conference therefore presents recent methodological developments addressing the study of living material. The program of the conference started with a plenary session (multimode non-linear microscopy of tissues and embryonic morphogenesis) followed by 6 sessions whose titles are: (1) new microscopies applied to living materials, (2) agents for molecular and functional imaging, (3) recent developments in methodologies and instrumentation, (4) image processing methods and techniques, (5) image-aided diagnosis, therapy and medical surveillance, (6) heterogeneous data bases and distributed computations

  3. Thresholding: A Pixel-Level Image Processing Methodology Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing the archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for Optical Character Recognition System for Brahmi Script. The image preprocessing technique covered in this paper is thresholding. We also try to analyze the results obtained by the pixel-level processing algorithms.
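As a concrete example of pixel-level thresholding, Otsu's global method is a common choice for binarizing document and inscription images; the sketch below is a generic illustration, not necessarily the exact algorithm the paper employs:

```python
import numpy as np

# Otsu's method: choose the threshold that maximizes between-class variance
# of the grayscale histogram. Generic sketch; the test image is synthetic.
def otsu_threshold(img):
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                    # class-0 cumulative probability
    mu = np.cumsum(prob * np.arange(256))      # class-0 cumulative mean mass
    mu_t = mu[-1]                              # global mean
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b2 = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b2[np.isnan(sigma_b2)] = 0.0         # undefined where a class is empty
    return int(np.argmax(sigma_b2))

# Synthetic bimodal "document": dark ink (40) on a light background (200)
img = np.concatenate([np.full(500, 40), np.full(500, 200)]).astype(np.uint8)
t = otsu_threshold(img.reshape(20, 50))
binary = img > t                               # True = background, False = ink
```

On a strongly bimodal histogram like this the chosen threshold falls between the two modes, cleanly separating ink pixels from background.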

  4. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
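The proxy-model idea can be illustrated, very loosely, with a one-hidden-layer feedforward network trained by backpropagation on invented data (the study itself uses multilayer cascade feedforward networks trained on commercial-simulator output; everything below is a toy stand-in):

```python
import numpy as np

# Toy proxy model: 2 invented design parameters -> 1 invented response
# (e.g. a recovery-factor-like quantity), fit by plain batch gradient descent.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(200, 2))
y = (0.4 * X[:, 0] + 0.3 * np.sin(3 * X[:, 1]))[:, None]

n_hidden = 8
W1 = rng.normal(0, 0.5, (2, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(5000):
    H = np.tanh(X @ W1 + b1)           # hidden layer
    pred = H @ W2 + b2                 # linear output
    err = pred - y
    # Backpropagate the mean-squared-error gradient
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = (err @ W2.T) * (1 - H**2)
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

Once trained, such a surrogate evaluates in microseconds, which is what lets a screening tool sweep design parameters far faster than running the reservoir simulator for each scenario.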

  5. LWR design decision methodology. Phase III. Final report

    International Nuclear Information System (INIS)

    Bertucio, R.; Held, J.; Lainoff, S.; Leahy, T.; Prather, W.; Rees, D.; Young, J.

    1982-01-01

Traditionally, management decisions regarding design options have been made using quantitative cost information and qualitative safety information. A Design Decision Methodology, which utilizes probabilistic risk assessment techniques, including event trees and fault trees, along with systems engineering and standard cost estimation methods, has been developed so that a quantitative safety measure may be obtained as well. The report documents the development of this Design Decision Methodology, a demonstration of the methodology on a current licensing issue with the cooperation of the Washington Public Power Supply System (WPPSS), and a discussion of how the results of the demonstration may be used in addressing the various issues associated with a licensing position on the issue
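The quantitative safety measure obtained from fault trees reduces, under the usual independence assumption, to simple gate algebra; a minimal sketch with invented event probabilities:

```python
# Fault-tree gate algebra for independent basic events (generic sketch;
# the events and probabilities below are invented for illustration).
def or_gate(*p):
    """At least one input event occurs: 1 - prod(1 - p_i)."""
    q = 1.0
    for x in p:
        q *= (1.0 - x)
    return 1.0 - q

def and_gate(*p):
    """All input events occur: prod(p_i)."""
    q = 1.0
    for x in p:
        q *= x
    return q

# Hypothetical top event: pump train fails AND (power fails OR valve fails)
p_pump, p_power, p_valve = 0.01, 0.001, 0.005
p_top = and_gate(p_pump, or_gate(p_power, p_valve))
# p_top = 0.01 * (1 - 0.999 * 0.995) = 5.995e-05
```

A design option that, say, adds a redundant valve changes only one subtree, so alternative designs can be compared by recomputing the top-event probability alongside their estimated costs.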

  6. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

Full Text Available The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of this article is to analyze the influence of methodological changes in the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. Evaluation of scientific works on analysis made it possible to identify the following problems in the methodology of strategic analysis: the lack of consideration of the features of strategic analysis in the formation of its methods, which often leads to confusion with the methods of financial (economic, thrifty analysis; the failure to use the fact that strategic analysis contains, besides the methods of analyzing the internal and external environment, methods of forecast analysis aimed at forming the strategy for the development of the enterprise; the conflation of the concepts «image», «reception» and «method» of analysis; the multidirectionality and indistinctness of the classification criteria for methods of strategic analysis; and the blind copying of foreign techniques and methods of strategic analysis without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved, which allows combining methodology as a science of methods (a broad approach to the methods of strategic analysis with methodology as a set of applied techniques and methods of analysis (a narrow approach to methodology. The use of the system approach allowed us to distinguish three levels of the methodology of strategic analysis. The first and second levels of methodology correspond to the level of science, the third level to practice. When developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  7. Methodological study of volcanic glass dating by fission track method

    International Nuclear Information System (INIS)

    Araya, A.M.O.

    1987-01-01

After a description of the method and an analysis of the age equation, we show the methodology used in plotting the correction curve and the results of the study of correction curves and corrected ages. From a study of the size correction method we see that the effect of reactor irradiation on the curve is negligible and that the correction curve is independent of the thermal treatment, but depends on the chemical treatment and the sample. Comparing the corrected ages obtained from both correction methods with the ages given by other authors, we can conclude that they are in agreement; as for the plateau method, both the isothermal and isochronal plateaus give the same results. (author) [pt

  8. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

    Science.gov (United States)

    Jaspers, Monique W M

    2009-05-01

Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but requires high skills and usability experience of the evaluators to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation, with a stronger focus on the learnability of a computer application. A major drawback of the cognitive walkthrough is the level of detail of task and user background descriptions required for an adequate application of the latest version of the technique. The think aloud is a very direct method to gain deep insight into the problems end users encounter in interaction with a system, but data analysis is extensive and requires a high level of expertise in both the cognitive ergonomics and the computer system application domains. Each of the three usability evaluation methods has shown its usefulness and has its own advantages and disadvantages; no single method has revealed significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that complement one another should preferably be used, as their collective application will be more powerful than any applied in isolation. Innovative mobile and automated solutions to support end-user testing have

  9. A review on fault classification methodologies in power transmission systems: Part-II

    Directory of Open Access Journals (Sweden)

    Avagaddi Prasad

    2018-05-01

Full Text Available The vast extent of power systems and their applications requires the development of suitable techniques for fault classification in power transmission systems, to increase the efficiency of the systems and to avoid major damage. For this purpose, the technical literature proposes a large number of methods. The paper analyzes the technical literature, summarizing the most important methods that can be applied to fault classification in power transmission systems. Part 2 of the article is titled “A review on fault classification methodologies in power transmission systems”. In this part we discuss the advanced technologies developed by various researchers for fault classification in power transmission systems. Keywords: Transmission line protection, Protective relaying, Soft computing techniques
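As a toy illustration of the kind of data-driven fault classification such surveys cover (not any specific method from the reviewed literature), consider a nearest-centroid classifier on invented per-phase current-deviation features:

```python
import numpy as np

# Hypothetical sketch: classify transmission-line fault types (AG = phase A to
# ground, ABG = phases A and B to ground, ABC = three-phase) from invented
# per-phase current-deviation features using nearest-centroid matching.
rng = np.random.default_rng(1)

templates = {"AG": [1, 0, 0], "BG": [0, 1, 0], "ABG": [1, 1, 0], "ABC": [1, 1, 1]}
X, labels = [], []
for name, t in templates.items():               # synthetic noisy training data
    X.append(np.array(t) + rng.normal(0, 0.05, (50, 3)))
    labels += [name] * 50
X = np.vstack(X)

centroids = {name: X[[i for i, l in enumerate(labels) if l == name]].mean(0)
             for name in templates}

def classify(x):
    return min(centroids, key=lambda n: np.linalg.norm(x - centroids[n]))

pred = classify(np.array([0.97, 0.02, -0.01]))  # looks like a phase-A-to-ground fault
```

The soft-computing methods the review discusses (neural networks, fuzzy systems, support vector machines) replace this simple distance rule with learned decision boundaries, but the input/output framing is the same.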

  10. Simplified dose calculation method for mantle technique

    International Nuclear Information System (INIS)

    Scaff, L.A.M.

    1984-01-01

A simplified dose calculation method for the mantle technique is described. In the routine treatment of lymphomas using this technique, the daily doses at the midpoints of five anatomical regions are different because the thicknesses are not equal. (Author) [pt

  11. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    Energy Technology Data Exchange (ETDEWEB)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  12. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    International Nuclear Information System (INIS)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed

  13. CASE METHOD. ACTIVE LEARNING METHODOLOGY TO ACQUIRE SIGNIFICANT IN CHEMISTRY

    Directory of Open Access Journals (Sweden)

    Clotilde Pizarro

    2015-09-01

Full Text Available In this paper the case methodology is applied to first-year students of the Engineering in Risk Prevention and Environment program. For this purpose a real case of contamination that occurred at a school in the region of Valparaiso called "La Greda" is presented. The activity began with the delivery of an extract of the information collected from the media, together with a brief induction on the methodology to be applied. A plenary session was then held, in which possible solutions to the problem were debated and a relationship was established between the case and the chemistry program. It is concluded that the application of the case method was a fruitful tool in the results obtained by the students, since the approval rate was 75%, which is considerably higher than in previous years.

  14. DEMATEL Technique: A Systematic Review of the State-of-the-Art Literature on Methodologies and Applications

    Directory of Open Access Journals (Sweden)

    Sheng-Li Si

    2018-01-01

    Full Text Available Decision making trial and evaluation laboratory (DEMATEL is considered as an effective method for the identification of cause-effect chain components of a complex system. It deals with evaluating interdependent relationships among factors and finding the critical ones through a visual structural model. Over the recent decade, a large number of studies have been done on the application of DEMATEL and many different variants have been put forward in the literature. The objective of this study is to review systematically the methodologies and applications of the DEMATEL technique. We reviewed a total of 346 papers published from 2006 to 2016 in the international journals. According to the approaches used, these publications are grouped into five categories: classical DEMATEL, fuzzy DEMATEL, grey DEMATEL, analytical network process- (ANP- DEMATEL, and other DEMATEL. All papers with respect to each category are summarized and analyzed, pointing out their implementing procedures, real applications, and crucial findings. This systematic and comprehensive review holds valuable insights for researchers and practitioners into using the DEMATEL in terms of indicating current research trends and potential directions for further research.
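The classical DEMATEL steps summarized above (normalize the direct-influence matrix, form the total-relation matrix, then compute prominence and relation) can be sketched compactly; the direct-influence matrix below is invented for illustration:

```python
import numpy as np

# Classical DEMATEL on a made-up 4-factor direct-influence matrix A,
# where A[i, j] is the rated influence of factor i on factor j (0-4 scale).
A = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

# Step 1: normalize by the larger of the max row sum and max column sum
s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
N = A / s

# Step 2: total-relation matrix T = N (I - N)^-1 (sums direct + indirect paths)
T = N @ np.linalg.inv(np.eye(len(A)) - N)

# Step 3: prominence (D + R) and relation (D - R)
D = T.sum(axis=1)          # total influence each factor dispatches
R = T.sum(axis=0)          # total influence each factor receives
prominence, relation = D + R, D - R   # relation > 0 marks a "cause" factor
```

Factors with positive relation values form the cause group and those with negative values the effect group; plotting (prominence, relation) pairs gives the visual cause-effect diagram the method is known for.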

  15. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has been becoming more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants; the GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu, in order to examine its applicability to piping systems. Through this analysis, the authors found some disadvantages of the GO methodology. In the GO methodology, a signal is either on-to-off or off-to-on; therefore GO finds the time point at which the state of a system changes, and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modeling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given
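Without reproducing the GO-FLOW operator definitions, the kind of time-dependent system success probability such a signal-flow analysis produces can be sketched generically; the failure rates and system structure below are invented:

```python
import math

# Generic time-dependent reliability sketch: exponential components combined
# through AND (series) and OR (redundancy) logic on a time grid, the sort of
# quantity a GO-FLOW-style chart evaluates. Rates and structure are invented.
LAMBDA_PUMP, LAMBDA_VALVE = 1e-3, 5e-4   # failure rates, per hour (assumed)

def reliability(lam, t):
    """Survival probability of an exponential component at time t."""
    return math.exp(-lam * t)

times = [0.0, 100.0, 500.0, 1000.0]      # hours
system = []
for t in times:
    r_pump = reliability(LAMBDA_PUMP, t)
    r_valve = reliability(LAMBDA_VALVE, t)
    r_pumps = 1.0 - (1.0 - r_pump) ** 2  # two redundant pumps (OR of successes)
    system.append(r_pumps * r_valve)     # feeding one valve (AND)
```

A single pass over the time grid yields the whole time-dependent profile, which is exactly the convenience the abstract attributes to GO-FLOW over repeated GO runs.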

  16. Detectors for LEP: methods and techniques

    International Nuclear Information System (INIS)

    Fabjan, C.

    1979-01-01

This note surveys detection methods and techniques of relevance to the LEP physics programme. The basic principles of detector physics are sketched, as recent improvements in understanding point towards both improvements and limitations in performance. The development and present status of large detector systems are presented, permitting some conservative extrapolations. State-of-the-art techniques and technologies are presented and their potential use in the LEP physics programme assessed. (Auth.)

  17. Developments in FT-ICR MS instrumentation, ionization techniques, and data interpretation methods for petroleomics.

    Science.gov (United States)

    Cho, Yunju; Ahmed, Arif; Islam, Annana; Kim, Sunghwan

    2015-01-01

    Because of the increasing importance of heavy and unconventional crude oil as an energy source, there is a growing need for petroleomics: the pursuit of more complete and detailed knowledge of the chemical compositions of crude oil. Crude oil has an extremely complex nature; hence, techniques with ultra-high resolving capabilities, such as Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), are necessary. FT-ICR MS has been successfully applied to the study of heavy and unconventional crude oils such as bitumen and shale oil. However, the analysis of crude oil with FT-ICR MS is not trivial, and it has pushed analysis to the limits of instrumental and methodological capabilities. For example, high-resolution mass spectra of crude oils may contain over 100,000 peaks that require interpretation. To visualize large data sets more effectively, data processing methods such as Kendrick mass defect analysis and statistical analyses have been developed. The successful application of FT-ICR MS to the study of crude oil has been critically dependent on key developments in FT-ICR MS instrumentation and data processing methods. This review offers an introduction to the basic principles, FT-ICR MS instrumentation development, ionization techniques, and data interpretation methods for petroleomics and is intended for readers having no prior experience in this field of study. © 2014 Wiley Periodicals, Inc.
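The Kendrick mass defect analysis mentioned above has a simple core: rescale IUPAC masses so that a CH2 repeat unit weighs exactly 14, after which members of a homologous series share the same mass defect and line up in a plot. A minimal sketch (the masses are invented, and conventions for computing the nominal Kendrick mass vary):

```python
# Kendrick mass defect (KMD), CH2-based, as used to group homologous series
# in complex petroleum mass spectra. Masses below are invented examples.
CH2 = 14.01565  # exact mass of a CH2 repeat unit

def kendrick_mass_defect(iupac_mass):
    kendrick_mass = iupac_mass * 14.0 / CH2   # rescale so CH2 weighs exactly 14
    return round(kendrick_mass) - kendrick_mass  # nominal KM minus Kendrick mass

# Two members of the same homologous series differ by CH2 and share one KMD
m1 = 310.2897          # invented monoisotopic mass
m2 = m1 + CH2          # next member of the series
kmd1 = kendrick_mass_defect(m1)
kmd2 = kendrick_mass_defect(m2)
```

Because adding CH2 shifts the Kendrick mass by exactly 14, every member of a series collapses onto one horizontal line in a KMD-versus-mass plot, which is how tens of thousands of peaks become visually sortable.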

  18. Scalable Techniques for Formal Verification

    CERN Document Server

    Ray, Sandip

    2010-01-01

This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues

  19. The Sine Method: An Alternative Height Measurement Technique

    Science.gov (United States)

    Don C. Bragg; Lee E. Frelich; Robert T. Leverett; Will Blozan; Dale J. Luthringer

    2011-01-01

    Height is one of the most important dimensions of trees, but few observers are fully aware of the consequences of the misapplication of conventional height measurement techniques. A new approach, the sine method, can improve height measurement by being less sensitive to the requirements of conventional techniques (similar triangles and the tangent method). We studied...
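The contrast between the two approaches can be sketched with invented field numbers: the tangent method needs a horizontal distance and assumes the treetop sits vertically above the base, while the sine method uses direct laser distances to top and base, so an offset top cannot inflate the estimate:

```python
import math

# Tangent vs. sine height methods on invented field measurements.
d = 20.0                         # m, horizontal distance to trunk (assumed)
top_deg, base_deg = 50.0, -5.0   # clinometer angles to top and base

# Tangent method: h = d * (tan(angle_top) - tan(angle_base))
h_tan = d * (math.tan(math.radians(top_deg)) - math.tan(math.radians(base_deg)))

# Sine method: h = L_top * sin(angle_top) - L_base * sin(angle_base),
# using direct laser-rangefinder distances to the top and base (assumed values)
L_top, L_base = 31.1, 20.1       # m
h_sin = (L_top * math.sin(math.radians(top_deg))
         - L_base * math.sin(math.radians(base_deg)))
```

With a truly vertical tree the two estimates agree closely; when the highest point leans toward or away from the observer, only the tangent result is biased, because it implicitly assigns the top the trunk's horizontal distance.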

  20. Three-dimensional RAMA fluence methodology benchmarking

    International Nuclear Information System (INIS)

    Baker, S. P.; Carter, R. G.; Watkins, K. E.; Jones, D. B.

    2004-01-01

    This paper describes the benchmarking of the RAMA Fluence Methodology software, that has been performed in accordance with U. S. Nuclear Regulatory Commission Regulatory Guide 1.190. The RAMA Fluence Methodology has been developed by TransWare Enterprises Inc. through funding provided by the Electric Power Research Inst., Inc. (EPRI) and the Boiling Water Reactor Vessel and Internals Project (BWRVIP). The purpose of the software is to provide an accurate method for calculating neutron fluence in BWR pressure vessels and internal components. The Methodology incorporates a three-dimensional deterministic transport solution with flexible arbitrary geometry representation of reactor system components, previously available only with Monte Carlo solution techniques. Benchmarking was performed on measurements obtained from three standard benchmark problems which include the Pool Criticality Assembly (PCA), VENUS-3, and H. B. Robinson Unit 2 benchmarks, and on flux wire measurements obtained from two BWR nuclear plants. The calculated to measured (C/M) ratios range from 0.93 to 1.04 demonstrating the accuracy of the RAMA Fluence Methodology in predicting neutron flux, fluence, and dosimetry activation. (authors)

  1. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures and are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  2. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifications in order to find one suitable for application to nuclear power plant operators' tasks. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in specific domains. Operators' tasks in NPPs are performed strictly according to written operational procedures and are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own unique characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  3. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques; the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare ways of evaluation by which a lay panel (consumers can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards, but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanisms are easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping.

  4. Methodological exploratory study applied to occupational epidemiology

    Energy Technology Data Exchange (ETDEWEB)

Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)]. E-mail: janetegc@ipen.br

    2007-07-01

The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of occupational epidemiology, presenting the main characteristics of the analysis methods used in epidemiology and investigating the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. Knowledge of the composition of this reference population is based on sex, age, educational level, marital status and occupation, with the aim of understanding the relation between health-aggravating factors and these variables. The methodology used refers to non-experimental research based on a theoretical-methodological approach. The work performed has an exploratory character, aiming at a later survey of indicators in the health area in order to analyze possible correlations related to epidemiologic issues. (author)

  5. Methodological exploratory study applied to occupational epidemiology

    International Nuclear Information System (INIS)

    Carneiro, Janete C.G. Gaburo; Vasques, Monica Heloisa B.; Fontinele, Ricardo S.; Sordi, Gian Maria A.

    2007-01-01

    The utilization of epidemiologic methods and techniques has been the object of practical experimentation and theoretical-methodological reflection in the health planning and programming process. Occupational Epidemiology is the study of the causes and prevention of diseases and injuries arising from exposures and risks in the work environment. In this context, there is no intention to exhaust such a complex theme, but rather to deal with basic concepts of Occupational Epidemiology, presenting the main characteristics of the analysis methods used in epidemiology and investigating the possible determinants of exposure (chemical, physical and biological agents). For this study, the socio-demographic profile of the IPEN-CNEN/SP work force was used. The composition of this reference population is described in terms of sex, age, educational level, marital status and occupation, with the aim of understanding the relation between health-aggravating factors and these variables. The methodology refers to non-experimental research based on a theoretical-methodological practice. The work performed has an exploratory character, aiming at a later survey of health indicators in order to analyze possible correlations related to epidemiologic issues. (author)

  6. Uncertainty quantification in reactor physics using adjoint/perturbation techniques and adaptive spectral methods

    NARCIS (Netherlands)

    Gilli, L.

    2013-01-01

    This thesis presents the development and the implementation of an uncertainty propagation algorithm based on the concept of spectral expansion. The first part of the thesis is dedicated to the study of uncertainty propagation methodologies and to the analysis of spectral techniques. The concepts

  7. Non-perturbative methodologies for low-dimensional strongly-correlated systems: From non-Abelian bosonization to truncated spectrum methods.

    Science.gov (United States)

    James, Andrew J A; Konik, Robert M; Lecheminant, Philippe; Robinson, Neil J; Tsvelik, Alexei M

    2018-02-26

    We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb-Liniger model, 1  +  1D quantum chromodynamics, as well as Landau-Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.

  8. Non-perturbative methodologies for low-dimensional strongly-correlated systems: From non-Abelian bosonization to truncated spectrum methods

    Science.gov (United States)

    James, Andrew J. A.; Konik, Robert M.; Lecheminant, Philippe; Robinson, Neil J.; Tsvelik, Alexei M.

    2018-04-01

    We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb–Liniger model, 1  +  1D quantum chromodynamics, as well as Landau–Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.

  9. Methodology for ranking restoration options

    DEFF Research Database (Denmark)

    Jensen, Per Hedemann

    1999-01-01

    techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated...

  10. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work that has been performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, verification of the ray-tracing indexing scheme that was developed to represent the MCNP geometry in the MOC, and verification of the prototype 2-D MOC flux solver. (author)
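
The transport sweep at the heart of an MOC flux solver can be illustrated with the step-characteristic kernel; the sketch below is a minimal illustration under simplifying assumptions (one energy group, flat isotropic source per region), not the authors' implementation.

```python
import math

def moc_segment_sweep(psi_in, sigma_t, source, length):
    """Step-characteristic kernel: attenuate the incoming angular flux
    psi_in across one ray segment of the given length through a region
    with total cross section sigma_t and a flat source term."""
    att = math.exp(-sigma_t * length)
    return psi_in * att + (source / sigma_t) * (1.0 - att)

def sweep_ray(psi_in, segments):
    """Sweep one characteristic ray through a sequence of
    (sigma_t, source, length) segments produced by ray tracing."""
    psi = psi_in
    for sigma_t, q, s in segments:
        psi = moc_segment_sweep(psi, sigma_t, q, s)
    return psi

# In a homogeneous medium the angular flux saturates toward Q / sigma_t:
psi = sweep_ray(0.0, [(1.0, 0.5, 10.0)] * 5)
```

In a full solver, such sweeps over many rays and angles would be accumulated into scalar-flux tallies per flat-source region and iterated with the source.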

  11. SKILLS-BASED ECLECTIC TECHNIQUES MATRIX FOR ELT MICROTEACHINGS

    Directory of Open Access Journals (Sweden)

    İskender Hakkı Sarıgöz

    2016-10-01

    Full Text Available Foreign language teaching undergoes constant change due to methodological improvement. This progress may be examined in two parts: the methods era and the post-methods era. It is not pragmatic today to propose a particular language teaching method and its techniques for all purposes; the holistic inflexibility of mid-century methods is long gone. In the present day, constructivist foreign language teaching trends attempt to see the learner as a whole person and as an individual who may differ from the other students in many respects. At the same time, individual differences should not keep the learners away from group harmony. For this reason, current teacher training programs require eclectic teaching matrices for unit design that take mixed-ability student groups into account. These matrices can be prepared in a multidimensional fashion, because there are many functional techniques in different methods, and other new techniques can be created freely by instructors in accordance with the teaching aims. The hypothesis in this argument is that the collection of foreign language teaching techniques compiled in ELT microteachings for a particular group of learners has to be arranged eclectically in order to keep the teaching process up to date. Nevertheless, designing a teaching format of this sort is a demanding and highly criticized task. This study briefly discusses eclecticism in the language-skills-based methodological debate from the perspective of ELT teacher education.

  12. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    Science.gov (United States)

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
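
The compile-then-summate flow described in the abstract can be sketched as follows. The step parameters and the process-cycle-efficiency metric are illustrative stand-ins, since the abstract does not name specific fields or formulas.

```python
from dataclasses import dataclass

# Hypothetical per-step characterization parameters (illustrative only).
@dataclass
class ProcessStep:
    name: str
    cycle_time: float   # minutes of value-adding work per unit
    queue_time: float   # minutes a unit waits before this step
    batch_size: int

def evaluate(steps):
    """Compile process step data for each step, sum it, and calculate
    overall process metrics, mirroring the receive/compile/calculate
    sequence described in the abstract."""
    total_touch = sum(s.cycle_time for s in steps)
    total_lead = sum(s.cycle_time + s.queue_time for s in steps)
    return {
        "lead_time": total_lead,
        "touch_time": total_touch,
        # Process-cycle efficiency: share of lead time that adds value,
        # a common lean metric used here as an example.
        "pce": total_touch / total_lead,
    }

steps = [ProcessStep("cut", 2.0, 30.0, 50), ProcessStep("weld", 5.0, 60.0, 50)]
metrics = evaluate(steps)
```

Comparing such metrics computed under batch versus lean process parameters would support the transition decision the second method describes.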

  13. Alteration of Box-Jenkins methodology by implementing genetic algorithm method

    Science.gov (United States)

    Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad

    2015-02-01

    A time series is a set of values sequentially observed through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking and using integrated autoregressive moving average (ARIMA) time series models for forecasting. The Box-Jenkins method is appropriate for medium to long time series (at least 50 observations). When modeling such series, the difficulty arises in choosing the accurate order at the model identification stage and in discovering the right parameter estimates. This paper presents the development of a Genetic Algorithm heuristic method for solving the identification and estimation problems in Box-Jenkins modeling. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecast results generated from the proposed model outperformed the single traditional Box-Jenkins model.
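
A minimal sketch of the idea: a genetic algorithm searching over ARIMA(p, d, q) orders. This is not the authors' implementation; the fitness function here is a synthetic stand-in for the AIC of a model fitted to the series (e.g. tourist arrivals), so that the sketch stays self-contained, and the minimum at (2, 1, 1) is arbitrary.

```python
import random

random.seed(42)

# Stand-in for model-fit quality; in practice: fit ARIMA(p, d, q) and
# return its AIC.  Synthetic surface with minimum at order (2, 1, 1).
def fitness(order):
    p, d, q = order
    return (p - 2) ** 2 + (d - 1) ** 2 + (q - 1) ** 2

BOUNDS = [(0, 5), (0, 2), (0, 5)]        # search ranges for p, d, q

def random_order():
    return tuple(random.randint(lo, hi) for lo, hi in BOUNDS)

def crossover(a, b):
    # Uniform crossover: pick each gene from either parent.
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(order):
    # Resample one randomly chosen gene within its bounds.
    i = random.randrange(len(order))
    lo, hi = BOUNDS[i]
    out = list(order)
    out[i] = random.randint(lo, hi)
    return tuple(out)

def ga(pop_size=20, generations=30):
    pop = [random_order() for _ in range(pop_size)]
    best = min(pop, key=fitness)
    history = [fitness(best)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness)[: pop_size // 2]
        pop = [mutate(crossover(random.choice(parents), random.choice(parents)))
               for _ in range(pop_size - 1)]
        pop.append(best)                 # elitism: the best survives unchanged
        best = min(pop, key=fitness)
        history.append(fitness(best))
    return best, history

best_order, history = ga()
```

With elitism, the best fitness is non-increasing across generations, which is what makes the heuristic a safe wrapper around the identification step.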

  14. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    Science.gov (United States)

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural detail that gives a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017.

  15. The Art of Hardware Architecture Design Methods and Techniques for Digital Circuits

    CERN Document Server

    Arora, Mohit

    2012-01-01

    This book highlights the complex issues, tasks and skills that must be mastered by an IP designer in order to design an optimized and robust digital circuit to solve a problem. The techniques and methodologies described can serve as a bridge between specifications that are known to the designer and the RTL code that is the final outcome, significantly reducing the time it takes to convert initial ideas and concepts into right-first-time silicon. Coverage focuses on real problems rather than theoretical concepts, with an emphasis on design techniques across various aspects of chip design. Describes techniques to help IP designers get it right the first time, creating designs optimized in terms of power, area and performance; focuses on practical aspects of chip design and minimizes theory; covers chip design in a consistent way, starting with basics and gradually developing advanced concepts such as electromagnetic compatibility (EMC) design techniques and low-power design techniques such as dynamic voltage...

  16. Radioisotope methodology course radioprotection aspects

    International Nuclear Information System (INIS)

    Bergoc, R.M.; Caro, R.A.; Menossi, C.A.

    1996-01-01

    The advancement of knowledge in molecular and cell biology, biochemistry, medicine and pharmacology that has taken place during the last 50 years, since the end of World War II, is really outstanding. It can safely be said that this is principally due to the application of radioisotope techniques. Research on metabolism, biodistribution of pharmaceuticals, pharmacodynamics, etc., is mostly carried out by means of techniques employing radioactive materials. Radioisotopes and radiation are frequently used in medicine both as diagnostic and therapeutic tools. Radioimmunoassay is today a routine method in endocrinology and in general clinical medicine. Receptor determination and characterization is a steadily growing methodology used in clinical biochemistry, pharmacology and medicine. The use of radiopharmaceuticals and radiation of different origins for therapeutic purposes should not be overlooked. For these reasons, the importance of teaching radioisotope methodology is steadily growing. This is principally the case for specialization at the post-graduate level, but in the pre-graduate curriculum it is worthwhile to give some elementary theoretical and practical notions on this subject. These observations are justified by more than 30 years of teaching experience at both levels at the School of Pharmacy and Biochemistry of the University of Buenos Aires, Argentina. In 1960 we began to teach Physics III, an obligatory pre-graduate course for biochemistry students, in which some elementary notions of radioactivity and measurement techniques were given. Successive modifications of the biochemistry pre-graduate curriculum incorporated radiochemistry as an elective subject and, since 1978, radioisotope methodology as an obligatory subject for biochemistry students. This subject is given at the radioisotope laboratory during the first semester of each year and its objective is to provide theoretical and practical knowledge to the biochemistry students, even

  17. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques ranging over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments, and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  18. Methods and techniques of nuclear in-core fuel management

    International Nuclear Information System (INIS)

    Jong, A.J. de.

    1992-04-01

    Review of methods of nuclear in-core fuel management (the minimal critical mass problem, minimal power peaking) and calculational techniques: reactor-physics calculations (point reactivity models, continuous refueling, empirical methods, depletion perturbation theory, nodal computer programs); optimization techniques (stochastic search, linear programming, heuristic parameter optimization). (orig./HP)

  19. Designs, Techniques, and Reporting Strategies in Geography Education: A Review of Research Methods

    Science.gov (United States)

    Zadrozny, Joann; McClure, Caroline; Lee, Jinhee; Jo, Injeong

    2016-01-01

    A wide variety of research is being completed and published in geography education. The purpose of this article is to provide a general overview of the different types of methodologies, research designs, and techniques used by geography education researchers. Analyzing three geography education journals, we found 191 research articles published…

  20. Proposed Methodology for Establishing Area of Applicability

    International Nuclear Information System (INIS)

    Broadhead, B.L.; Hopper, C.M.; Parks, C.V.

    1999-01-01

    This paper presents the application of sensitivity and uncertainty (S/U) analysis methodologies to the data validation tasks of a criticality safety computational study. The S/U methods presented are designed to provide a formal means of establishing the area (or range) of applicability for criticality safety data validation studies. The development of parameters that are analogous to the standard trending parameters forms the key to the technique. These parameters are the so-called D parameters, which represent the differences by energy group of S/U-generated sensitivity profiles, and the c parameters, which are correlation coefficients for k, each of which gives information relative to the similarity between pairs of selected systems. The use of a Generalized Linear Least-Squares Methodology (GLLSM) tool is also described in this paper. These methods and guidelines are applied to a sample validation for uranium systems with enrichments greater than 5 wt %
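
The c parameters can be sketched as follows, assuming the common definition of the correlation coefficient between two systems' computed-k uncertainties induced by shared cross-section covariance data; the sensitivity vectors and the 2-group covariance matrix below are purely illustrative.

```python
import math

def ck(s_a, s_b, cov):
    """Correlation coefficient between the k uncertainties of two
    systems: c = (Sa C Sb) / sqrt((Sa C Sa)(Sb C Sb)), where Sa, Sb
    are sensitivity profiles (dk/k per dsigma/sigma, by energy group)
    and C is the relative cross-section covariance matrix."""
    def quad(u, v):
        return sum(u[i] * cov[i][j] * v[j]
                   for i in range(len(u)) for j in range(len(v)))
    return quad(s_a, s_b) / math.sqrt(quad(s_a, s_a) * quad(s_b, s_b))

cov = [[0.04, 0.01],
       [0.01, 0.09]]                    # illustrative 2-group covariance

ck_same = ck([0.3, 0.5], [0.3, 0.5], cov)   # identical systems
ck_diff = ck([0.9, 0.1], [0.1, 0.9], cov)   # dissimilar systems
```

Identical sensitivity profiles give c = 1 by the Cauchy-Schwarz inequality, while systems whose uncertainties are driven by different energy groups score lower, which is the sense in which c trends with similarity.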

  1. Auditing organizational communication: evaluating the methodological strengths and weaknesses of the critical incident technique, network analysis, and the communication satisfaction questionnaire

    NARCIS (Netherlands)

    Koning, K.H.

    2016-01-01

    This dissertation focuses on the methodology of communication audits. In the context of three Dutch high schools, we evaluated several audit instruments. The first study in this dissertation focuses on the question whether the rationale of the critical incident technique (CIT) still applies when it

  2. IS THERE A NEED FOR THE POST-NON-CLASSICAL METHODOLOGY IN PEDAGOGY?

    Directory of Open Access Journals (Sweden)

    Vladislav L. Benin

    2014-01-01

    Full Text Available The publication continues the discussion started by Yu.V. Larin in «Education in Search of the Congruity Principle» concerning the modern methodology of pedagogical science, and identifies the criteria of the given principle along with the limitations of post-non-classical approaches to the humanities. Methods: The methodology involves the analysis of existing viewpoints, formalization of the characteristics of post-non-classical science, and reflection on the pedagogical principle of cultural conformity. Results: The research outcomes demonstrate that the gradual undermining of fundamental science results in the erosion of its methodological background. In the case of interdisciplinary subjects, a researcher is forced to integrate different methods and techniques, which provokes further disruption of the methodology. Scientific novelty: The author classifies the main characteristics of post-non-classical science and extrapolates them to the humanities, and concludes that the quality of researchers' training is gradually declining due to the lack of methodological clarity and aggressive forms of science vulgarization, leading to the spontaneous development of a clipping methodology. Practical significance: Implementation of the research findings can activate both theoretical and methodological aspects of teacher training and self-education.

  3. Construction techniques and management methods for BWR plants

    International Nuclear Information System (INIS)

    Shimizu, Yohji; Tateishi, Mizuo; Hayashi, Yoshishige

    1989-01-01

    Toshiba is constantly striving for safer and more efficient plant construction to realize high-quality BWR plants within a short construction period. To achieve these aims, Toshiba has developed and improved a large number of construction techniques and construction management methods. In the area of installation, various techniques have been applied such as the modularization of piping and equipment, shop installation of reactor internals, etc. Further, installation management has been upgraded by the use of pre-installation review programs, the development of installation control systems, etc. For commissioning, improvements in commissioning management have been achieved through the use of computer systems, and testing methods have also been upgraded by the development of computer systems for the recording and analysis of test data and the automatic adjustment of controllers in the main control system of the BWR. This paper outlines these construction techniques and management methods. (author)

  4. A technology-assessment methodology for electric utility planning: With application to nuclear power plant decommissioning

    International Nuclear Information System (INIS)

    Lough, W.T.

    1987-01-01

    Electric utilities and public service commissions have not taken full advantage of the many proven methodologies and techniques available for evaluating complex technological issues. In addition, the evaluations performed are deficient in their use of (1) methods for evaluating public attitudes and (2) formal methods of analysis for decision making. These oversights are substantiated through an examination of the literature relevant to electric utility planning. The assessment process known as technology assessment (TA) is proposed, and a TA model is developed for routine use in planning by electric utilities and state regulatory commissions. Techniques to facilitate public participation and techniques to aid decision making are integral to the proposed model and are described in detail. Criteria are provided for selecting an appropriate technique on a case-by-case basis. The TA model proved to be an effective methodology for evaluating technological issues associated with electric utility planning, such as decommissioning nuclear power plants. Through the use of the nominal group technique, the attitudes of a group of residential ratepayers were successfully identified and included in the decision-making process

  5. Determination of the aerosol filters efficiency by means of the tracer techniques

    International Nuclear Information System (INIS)

    Hirling, J.

    1978-01-01

    An assessment of nonradioactive methods of filter efficiency determination and of tracer techniques is given. The methods are stated and descriptions of the instrumentation for estimation of filter efficiency are given, in particular: the methodology of production of radioactive synthetic test aerosols by means of disperse and steam-condensation aerosol generators; the radioisotope method of aerosol filter investigation; and the methodology of filtration efficiency determination. The results of the radioisotope investigations of filters are given: properties of the artificial radioactive test aerosols and characteristics of filters determined by the tracer techniques. Curves are given for the filtration efficiency of viscose filtering nozzles of different density depending on the filter load. (I.T.) [ru
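
The filtration-efficiency determination reduces to comparing tracer activity measured upstream and downstream of the filter. A minimal sketch, assuming background-corrected count rates proportional to aerosol activity (the function names and numbers are illustrative, not from the report):

```python
def filter_efficiency(counts_up, counts_down, background=0.0):
    """Filtration efficiency from tracer count rates measured upstream
    and downstream of the filter, after background subtraction:
    E = (A_up - A_down) / A_up."""
    a_up = counts_up - background
    a_down = counts_down - background
    return (a_up - a_down) / a_up

def decontamination_factor(efficiency):
    """Equivalent figure of merit often quoted for filters: DF = 1/(1-E)."""
    return 1.0 / (1.0 - efficiency)

eff = filter_efficiency(120_500.0, 380.0, background=120.0)
```

In practice, identical counting geometry on both sides of the filter is what lets the count-rate ratio stand in for the activity ratio.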

  6. Human error risk management for engineering systems: a methodology for design, safety assessment, accident investigation and training

    International Nuclear Information System (INIS)

    Cacciabue, P.C.

    2004-01-01

    The objective of this paper is to tackle methodological issues associated with the inclusion of cognitive and dynamic considerations in Human Reliability methods. A methodology called Human Error Risk Management for Engineering Systems is presented that offers a 'roadmap' for selecting and consistently applying Human Factors approaches in different areas of application, and also contains a 'body' of possible methods and techniques of its own. Two types of possible application are discussed to demonstrate practical applications of the methodology. Specific attention is dedicated to the issue of data collection and definition from specific field assessment

  7. Alternative containment integrity test methods, an overview of possible techniques

    International Nuclear Information System (INIS)

    Spletzer, B.L.

    1986-01-01

    A study is being conducted to develop and analyze alternative methods for testing of containment integrity. The study is focused on techniques for continuously monitoring containment integrity to provide rapid detection of existing leaks, thus providing greater certainty of the integrity of the containment at any time. The study is also intended to develop techniques applicable to the currently required Type A integrated leakage rate tests. A brief discussion of the range of alternative methods currently being considered is presented. The methods include applicability to all major containment types, operating and shutdown plant conditions, and quantitative and qualitative leakage measurements. The techniques are analyzed in accordance with the current state of knowledge of each method. The bulk of the techniques discussed are in the conceptual stage, have not been tested in actual plant conditions, and are presented here as a possible future direction for evaluating containment integrity. Of the methods considered, no single method provides optimum performance for all containment types. Several methods are limited in the types of containment for which they are applicable. The results of the study to date indicate that techniques for continuous monitoring of containment integrity exist for many plants and may be implemented at modest cost

  8. Inventory differences: An evaluation methodology

    International Nuclear Information System (INIS)

    Heinberg, C.L.; Roberts, N.J.

    1987-01-01

    This paper discusses an evaluation methodology which is used for inventory differences at the Los Alamos National Laboratory. It is recognized that various methods can be, and are being, used to evaluate process inventory differences at DOE facilities. The purpose of this paper is to share our thoughts on the subject, and our techniques, with those who are responsible for the evaluation of inventory differences at their facility. One of the most dangerous aspects of any evaluation technique, especially one as complex as most inventory difference evaluations tend to be, is failing to treat the tools being used as indicators only. There is a tendency to regard the result of an evaluation by one technique as an absolute. At the Los Alamos National Laboratory, several tools are used, and the final evaluation is based on a combination of the observed results of a many-faceted evaluation. The tools used and some examples are presented

  9. A study on the advanced statistical core thermal design methodology

    International Nuclear Information System (INIS)

    Lee, Seung Hyuk

    1992-02-01

    A statistical core thermal design methodology for generating the limit DNBR and the nominal DNBR is proposed and used in assessing the best-estimate thermal margin in a reactor core. Firstly, the Latin Hypercube Sampling method is utilized instead of the conventional experimental design technique as the input sampling method for a regression analysis, in order to evaluate its sampling efficiency. Secondly, and as the main topic, the Modified Latin Hypercube Sampling and Hypothesis Test Statistics method is proposed as a substitute for the current statistical core thermal design method. This new methodology adopts a 'Modified Latin Hypercube Sampling Method', which uses the mean values of each interval of the input variables instead of random values, to avoid the extreme cases that arise in the tail areas of some parameters. Next, the independence of the input variables is verified through a 'Correlation Coefficient Test' for statistical treatment of their uncertainties, and the distribution type of the DNBR response is determined through a 'Goodness of Fit Test'. Finally, the limit DNBR with one-sided 95% probability at a 95% confidence level, DNBR 95/95, is estimated. The advantage of this methodology over the conventional statistical method using Response Surface and Monte Carlo simulation techniques lies in its simplicity of analysis procedure, while maintaining the same level of confidence in the limit DNBR result. This methodology is applied to two cases of DNBR margin calculation. The first case is the application to the determination of the limit DNBR, where the DNBR margin is determined by the difference between the nominal DNBR and the limit DNBR. The second case is the application to the determination of the nominal DNBR, where the DNBR margin is determined by the difference between the lower limit value of the nominal DNBR and the CHF correlation limit being used. From this study, it is deduced that the proposed methodology gives good agreement in the DNBR results
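
The 'Modified Latin Hypercube Sampling Method' described above, with interval midpoints instead of random points within the intervals, can be sketched as follows; the variable ranges are illustrative stand-ins for core thermal-hydraulic inputs.

```python
import random

random.seed(1)

def modified_lhs(n_samples, ranges):
    """Modified Latin Hypercube Sampling: each variable's range is cut
    into n_samples equal intervals and the interval MIDPOINTS are used
    (rather than random points within each interval), avoiding the
    extreme tail values; the midpoints are then shuffled independently
    per variable and zipped into sample vectors."""
    columns = []
    for lo, hi in ranges:
        width = (hi - lo) / n_samples
        mids = [lo + (i + 0.5) * width for i in range(n_samples)]
        random.shuffle(mids)             # random pairing across variables
        columns.append(mids)
    return list(zip(*columns))

# Two illustrative input variables, e.g. normalized power and inlet
# temperature (units and ranges are hypothetical):
samples = modified_lhs(4, [(0.0, 1.0), (560.0, 600.0)])
```

Each variable's marginal distribution is covered by exactly one midpoint per interval, which is the stratification property that makes the regression inputs efficient.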

  10. Large sample NAA facility and methodology development

    International Nuclear Information System (INIS)

    Roth, C.; Gugiu, D.; Barbos, D.; Datcu, A.; Aioanei, L.; Dobrea, D.; Taroiu, I. E.; Bucsa, A.; Ghinescu, A.

    2013-01-01

    A Large Sample Neutron Activation Analysis (LSNAA) facility has been developed at the TRIGA Annular Core Pulsed Reactor (ACPR) operated by the Institute for Nuclear Research in Pitesti, Romania. The central irradiation cavity of the ACPR core can accommodate a large irradiation device. The ACPR neutron flux characteristics are well known, and spectrum adjustment techniques have been successfully applied to enhance the thermal component of the neutron flux in the central irradiation cavity. An analysis methodology was developed using the MCNP code in order to estimate counting efficiency and correction factors for the major perturbing phenomena. Test experiments, comparisons with classical instrumental neutron activation analysis (INAA) methods and an international inter-comparison exercise have been performed to validate the new methodology. (authors)

  11. Diffusion tensor trace mapping in normal adult brain using single-shot EPI technique: A methodological study of the aging brain

    International Nuclear Information System (INIS)

    Chen, Z.G.; Hindmarsh, T.; Li, T.Q.

    2001-01-01

    Purpose: To quantify age-related changes of the average diffusion coefficient value in normal adult brain using orientation-independent diffusion tensor trace mapping and to address the methodological influences on diffusion quantification. Material and Methods: Fifty-four normal subjects (aged 20-79 years) were studied on a 1.5-T whole-body MR medical unit using a diffusion-weighted single-shot echo-planar imaging technique. Orientation-independent diffusion tensor trace maps were constructed for each subject from diffusion-weighted MR measurements in four different directions using a tetrahedral gradient combination pattern. The global average (including cerebrospinal fluid) and the tissue average of the diffusion coefficients in adult brains were determined by analyzing the diffusion coefficient distribution histogram for the entire brain. Methodological influences on the measured diffusion coefficient were also investigated by comparing the results obtained using different experimental settings. Results: Both global and tissue averages of the diffusion coefficient are significantly correlated with age (p<0.03). The global average of the diffusion coefficient increases 3% per decade after the age of 40, whereas the increase in the tissue average of the diffusion coefficient is about 1% per decade. Experimental settings for self-diffusion measurements, such as data acquisition methods and the number of b-values, can slightly influence the statistical distribution histogram of the diffusion tensor trace and its average value. Conclusion: The increased average diffusion coefficient in adult brains with aging is consistent with findings regarding structural changes in the brain that have been associated with aging. The study also demonstrates that it is desirable to use the same experimental parameters for diffusion coefficient quantification when comparing between different subjects and groups of interest.
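
The orientation independence of the tetrahedral trace scheme can be illustrated with a small numerical sketch (hypothetical values, not the study's actual acquisition or processing code). For any symmetric diffusion tensor D, averaging the apparent diffusion coefficients measured along the four tetrahedral directions yields trace(D)/3:

```python
import numpy as np

# Four tetrahedral gradient directions (unit vectors). For any symmetric
# diffusion tensor D, averaging g @ D @ g over these directions gives
# trace(D)/3, which makes the scheme orientation-independent.
TETRAHEDRAL = np.array([[1, 1, 1], [1, -1, -1],
                        [-1, 1, -1], [-1, -1, 1]]) / np.sqrt(3.0)

def mean_diffusivity(s0, signals, b):
    """Average diffusion coefficient from diffusion-weighted signals
    along the four tetrahedral directions: ADC_i = -ln(S_i/S0)/b."""
    adc = -np.log(np.asarray(signals) / s0) / b
    return adc.mean()

# synthetic check with an anisotropic diagonal tensor (mm^2/s)
D = np.diag([2.0e-3, 0.7e-3, 0.3e-3])
b = 1000.0  # s/mm^2
signals = [np.exp(-b * (g @ D @ g)) for g in TETRAHEDRAL]
md = mean_diffusivity(1.0, signals, b)
# md equals trace(D)/3 = 1.0e-3 mm^2/s for this tensor
```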

  12. Classification of assembly techniques for micro products

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Tosello, Guido; Gegeckaite, Asta

    2005-01-01

    of components and level of integration are made. This paper describes a systematic characterization of micro assembly methods. This methodology offers the opportunity of a cross comparison among different techniques to gain a choosing principle of the favourable micro assembly technology in a specific case...

  13. Failure modes induced by natural radiation environments on DRAM memories: study, test methodology and mitigation technique

    International Nuclear Information System (INIS)

    Bougerol, A.

    2011-05-01

    DRAMs are frequently used in space and aeronautic systems. Their sensitivity to cosmic radiation has to be known in order to satisfy reliability requirements for critical applications. These evaluations are traditionally done with particle accelerators. However, devices become more complex with technology integration, so new effects appear, inducing longer and more expensive tests. There is a complementary solution: the pulsed laser, which triggers effects similar to those of particles. Thanks to these two test tools, the main DRAM radiation failure modes were studied: SEUs (Single Event Upset) in memory blocks, and SEFIs (Single Event Functional Interrupt) in peripheral circuits. This work demonstrates the influence of test patterns on SEU and SEFI sensitivities depending on the technology used. In addition, this study identifies the origin of the most frequent type of SEFIs. Moreover, laser techniques were developed to quantify the sensitive areas associated with the different effects. This work led to a new test methodology for industry, in order to optimize test cost and efficiency using both pulsed laser beams and particle accelerators. Finally, a new fault-tolerance technique is proposed: based on the radiation immunity of a DRAM cell when discharged, this technique allows correction of all the bits of a logic word. (author)

  14. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    International Nuclear Information System (INIS)

    Okrent, D.

    1989-01-01

    This final report summarizes the accomplishments of a two-year research project entitled 'Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants'. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  15. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled 'Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants'. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  16. Epidemiology of intestinal parasitosis in Italy between 2005 and 2008: diagnostic techniques and methodologies

    Directory of Open Access Journals (Sweden)

    Daniele Crotti

    2013-04-01

    Full Text Available The aim of the study was to provide an up-to-date picture, for the period 2005-2008, of the diagnostic techniques and methodologies used for intestinal parasites, so that the specific epidemiology could be known and more rational and efficacious guidelines suggested. All members of AMCLI were invited to take part in a retrospective study of bowel parasites, helminths and protozoa. The participating laboratories were asked how O&P examination was performed, whether specific tests for E. vermicularis and S. stercoralis were carried out, and whether the recommended permanent stains were used for the identification of D. fragilis, Entamoeba histolytica/dispar and Cryptosporidium spp. 23 laboratories gave their assent, but the data from a smaller number could be used for analysis and evaluation. For O&P, only some laboratories performed permanent stains: Giemsa for D. fragilis, antigen detection and/or Trichrome stain for E. histolytica/dispar, antigen detection and/or acid-fast stain for Cryptosporidium spp. Not all laboratories search specifically for S. stercoralis. The epidemiology is therefore differentiated, and related more to the adequacy of the techniques used than to the cohorts of the examined populations. The overall positivity for parasites ranged from 0% to 18.7%; for protozoa (pathogenic or not) from 0% to 14.7%; for nematodes from 0% to 3.7%; for cestodes from 0% to 1.0%; for trematodes from 0% to 1.0%. Among helminths, E. vermicularis, followed by S. stercoralis, was the most frequent, also in O&P. The specific search for S. stercoralis gave a positivity from 0% to 33.3%; the cellophane tape test was positive for E. vermicularis in from 0% to 21.9% of cases. Among pathogenic protozoa, D. fragilis, when permanent stains were applied, prevailed, from 0% to 16.6%; G. duodenalis ranged from 0.8% to 4.3%; E. histolytica/dispar, using a permanent stain or antigen detection, was identified in from 0% to 20.6%. Coccidia were very rare, with Cryptosporidium spp observed in from 0% to 5.2% of cases. These are our conclusions.

  17. Forecasting in Intelligence: Indications and Warning Methodology in Modern Practice

    Directory of Open Access Journals (Sweden)

    Marina Gennadievna Vlasova

    2015-12-01

    Full Text Available Today the effectiveness of a national security system depends seriously on the professional analysis of information and timely forecasts. Efficient methods of forecasting in the sphere of international relations are thus of current importance for modern intelligence services. The Indications and Warning technique, which was a key element of forecasting methodology in intelligence until the end of the Cold War, is assessed in the present article. Is this method still relevant in the contemporary world, with its new international order, new security challenges and technological revolution in data collection and processing? The main conclusion, based on an overview of current research and known intelligence practice, is that the indicators technique is still relevant for the early warning of national security threats but requires some adaptation to today's issues. The most important directions of adaptation are the creation of the broadest possible spectrum of threat scenarios, as well as research on current strategic threats and their corresponding indicators. Appropriate software that automates the use of the indications technique by security services is also important. The author believes that cooperation between intelligence services and the academic community can increase the efficiency of the Indications Methodology and of strategic forecasting as well.

  18. The selection of probabilistic safety assessment techniques for non-reactor nuclear facilities

    International Nuclear Information System (INIS)

    Vail, J.

    1992-01-01

    Historically, the probabilistic safety assessment (PSA) methodology of choice has been the well-known event tree/fault tree inductive technique. For reactor facilities it has stood the test of time. Some non-reactor nuclear facilities, however, have found inductive methodologies difficult to apply. The stand-alone fault tree deductive technique has been used effectively to analyze risk in nuclear chemical processing facilities and waste handling facilities. A comparison between the two choices suggests benefits from the use of the deductive method for non-reactor facilities.

  19. A methodology to measure the degree of managerial innovation

    OpenAIRE

    Ayhan, Mustafa Batuhan; Oztemel, Ercan

    2014-01-01

    Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions.Design/methodology/approach: The methodology mainly focuses on the different techniques used for each management function, namely Planning, Organizing, Leading, Controlling and Coordinating. These functions are studied and the...

  20. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    Science.gov (United States)

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More than Qualitative Methods

    Science.gov (United States)

    Bowleg, Lisa

    2017-01-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with…

  2. Application of an integrated multi-criteria decision making AHP-TOPSIS methodology for ETL software selection.

    Science.gov (United States)

    Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik

    2016-01-01

    A wide range of ETL (Extract, Transform and Load) software is available, constituting a major investment market. Each ETL tool uses its own techniques for extracting, transforming and loading data into a data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right ETL software is critical to the success or failure of any Business Intelligence project. As there are many factors impacting the selection of ETL software, the selection process is considered a complex multi-criteria decision making (MCDM) problem. In this study, a decision-making methodology that employs two well-known MCDM techniques, namely the Analytic Hierarchy Process (AHP) and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), is designed. In this approach, AHP is used to analyze the structure of the ETL software selection problem and obtain the weights of the selected criteria. The TOPSIS technique is then used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype demonstrating both methods is implemented.
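
A minimal sketch of the AHP-TOPSIS combination described above. The numbers are purely illustrative, and the geometric-mean weight approximation is one common AHP variant, not necessarily the one used by the authors:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise-comparison matrix, using the
    geometric-mean approximation of the principal eigenvector."""
    gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[1])
    return gm / gm.sum()

def topsis(decision, weights, benefit):
    """TOPSIS closeness scores: higher means closer to the ideal solution.
    benefit[j] is True if criterion j is to be maximized."""
    norm = decision / np.linalg.norm(decision, axis=0)  # vector-normalize columns
    v = norm * weights                                  # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)

# hypothetical example: 3 ETL tools rated on cost (minimize) and two benefit criteria
pairwise = np.array([[1, 3, 5], [1 / 3, 1, 3], [1 / 5, 1 / 3, 1]])
w = ahp_weights(pairwise)
scores = topsis(np.array([[250.0, 7.0, 8.0], [180.0, 6.0, 9.0], [300.0, 9.0, 6.0]]),
                w, benefit=np.array([False, True, True]))
ranking = np.argsort(scores)[::-1]  # best alternative first
```

The alternative with the highest closeness score is the recommended ETL tool under the chosen criteria weights.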

  3. Modeling and optimization of effective parameters on the size of synthesized Fe{sub 3}O{sub 4} superparamagnetic nanoparticles by coprecipitation technique using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ghazanfari, Mohammad Reza, E-mail: Ghazanfari.mr@gmail.com [Department of Materials Science and Engineering, Ferdowsi University of Mashhad, 9177948974 Mashhad (Iran, Islamic Republic of); Kashefi, Mehrdad, E-mail: m-kashefi@um.ac.ir [Department of Materials Science and Engineering, Ferdowsi University of Mashhad, 9177948974 Mashhad (Iran, Islamic Republic of); Jaafari, Mahmoud Reza [Biotechnology Research Center, Nanotechnology Research Center, School of Pharmacy, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2016-05-01

    Statistical methods are appropriate techniques for studying the trends of a process. In the current research, Fe{sub 3}O{sub 4} superparamagnetic nanoparticles were synthesized by the coprecipitation method. In order to investigate the size properties of the synthesized particles, the experimental design was carried out using the central composite design (CCD) of response surface methodology (RSM), with the temperature, pH, and cation ratio of the reaction selected as influential factors. After particle synthesis based on the designed runs, different responses such as the hydrodynamic size of the particles (both freeze-dried and air-dried), size distribution, crystallite size, magnetic size, and zeta potential were evaluated by different techniques, i.e. dynamic light scattering (DLS), X-ray diffraction (XRD), and vibrating sample magnetometry (VSM). Based on these results, a quadratic polynomial model that could predict the response amounts was fitted for each response. A study of the factor effects then showed that temperature, pH, and their interactions were the most influential. Finally, optimization showed that the minimum particle size (10.15 nm) and size distribution (13.01 nm) were reached at the minimum temperature (70 °C) and cation ratio (0.5) and the maximum pH (10.5). Moreover, the characterizations showed that the particle size was about 10 nm, while the values of M{sub s}, H{sub c}, and M{sub r} were 60 (emu/g), 0.2 (Oe) and 0.22 (emu/g), respectively. - Highlights: • The Fe{sub 3}O{sub 4} nanoparticles were successfully synthesized by the coprecipitation method. • Using the RSM technique, predictive models were presented for the particle size. • Temperature, pH and their interactions had the greatest effect on the particle size. • The drying technique can affect the size properties.
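
The quadratic (second-order) response-surface model fitted in such a CCD study can be sketched as an ordinary least-squares fit of intercept, linear, interaction and squared terms. The factor values and the response below are synthetic; they do not reproduce the paper's data:

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Model matrix for a full second-order (quadratic) response surface:
    intercept, linear, two-factor interaction, and squared terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, j] for j in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    cols += [X[:, j] ** 2 for j in range(k)]
    return np.column_stack(cols)

# hypothetical coded factors: temperature, pH, cation ratio (20 runs)
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(20, 3))
# synthetic "particle size" response built from known coefficients
y = 10 + 2 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 0] * X[:, 1] + 1.2 * X[:, 2] ** 2
A = quadratic_design_matrix(X)
beta, *_ = np.linalg.lstsq(A, y, rcond=None)  # least-squares fit
y_hat = A @ beta
```

Because the synthetic response is itself a quadratic in the factors, the fitted model recovers it exactly; with real measurements the residuals would quantify lack of fit.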

  4. Measurement of the porosity of amorphous materials by gamma ray transmission methodology

    International Nuclear Information System (INIS)

    Pottker, Walmir Eno; Appoloni, Carlos Roberto

    2000-01-01

    This work presents the measurement of the total porosity of TRe soil, Berea sandstone rock and porous ceramic samples. For the determination of the total porosity, the Archimedes (conventional) method and the gamma-ray transmission methodology were employed. The porosity measurement using the gamma methodology has a significant advantage with respect to the conventional method, because the determination is fast and non-destructive and because it supplies results with finer characterization at small scales in relation to the heterogeneity of the porosity. The conventional methodology presents good results only for homogeneous samples. The experimental set-up for the gamma-ray transmission technique consisted of a 241Am source (59.53 keV), a NaI(Tl) scintillation detector, collimators, an XYZ micrometric table and standard gamma spectrometry electronics connected to a multichannel analyser. (author)
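
Porosity determination by gamma-ray transmission rests on the Beer-Lambert attenuation law. A minimal sketch with assumed (hypothetical) count rates, mass attenuation coefficient and matrix density, not the paper's measured values:

```python
import numpy as np

def total_porosity(I0, I, mu_mass, rho_solid, thickness_cm):
    """Total porosity from gamma-ray transmission (Beer-Lambert law):
    I = I0 * exp(-mu_mass * rho_bulk * x). The bulk density recovered
    from the attenuation is compared with the solid (matrix) density:
    porosity = 1 - rho_bulk / rho_solid."""
    rho_bulk = np.log(I0 / I) / (mu_mass * thickness_cm)
    return 1.0 - rho_bulk / rho_solid

# hypothetical values for a ceramic sample at the 59.53 keV 241Am line
phi = total_porosity(I0=12000.0, I=5500.0,  # counts without / with sample
                     mu_mass=0.25,          # cm^2/g, assumed
                     rho_solid=2.65,        # g/cm^3, assumed matrix density
                     thickness_cm=2.0)
# phi is about 0.41 for these assumed values
```

Scanning the collimated beam across the sample with the XYZ table gives a local porosity at each position, which is what makes the method sensitive to heterogeneity.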

  5. An Empirical Review of Research Methodologies and Methods in Creativity Studies (2003-2012)

    Science.gov (United States)

    Long, Haiying

    2014-01-01

    Based on the data collected from 5 prestigious creativity journals, research methodologies and methods of 612 empirical studies on creativity, published between 2003 and 2012, were reviewed and compared to those in gifted education. Major findings included: (a) Creativity research was predominantly quantitative and psychometrics and experiment…

  6. Development of methodology and direction of practice administrative neuromarketing

    OpenAIRE

    Glushchenko V.; Glushchenko I.

    2018-01-01

    The development of the methodology and practical aspects of the application of administrative neuromarketing is the subject of this work; the subject of the article is administrative neuromarketing in the organization. The article investigates the concept and content of administrative neuromarketing, its philosophy, culture, functions, tasks and principles, and the technique of the logical analysis of the possibility of applying administrative neuromarketing methods for incre...

  7. An Overview of Short-term Statistical Forecasting Methods

    DEFF Research Database (Denmark)

    Elias, Russell J.; Montgomery, Douglas C.; Kulahci, Murat

    2006-01-01

    An overview of statistical forecasting methodology is given, focusing on techniques appropriate to short- and medium-term forecasts. Topics include basic definitions and terminology, smoothing methods, ARIMA models, regression methods, dynamic regression models, and transfer functions. Techniques for evaluating and monitoring forecast performance are also summarized.
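
Of the smoothing methods surveyed, simple exponential smoothing is the most basic. A self-contained sketch in generic textbook form, not tied to the paper:

```python
def simple_exponential_smoothing(series, alpha):
    """One-step-ahead forecasts via simple exponential smoothing:
    level_t = alpha * y_t + (1 - alpha) * level_{t-1}."""
    if not 0 < alpha <= 1:
        raise ValueError("alpha must be in (0, 1]")
    level = series[0]
    forecasts = [level]          # forecast for t=1 is the initial level
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
        forecasts.append(level)  # forecast for the next period
    return forecasts

fc = simple_exponential_smoothing([10.0, 12.0, 11.0, 13.0], alpha=0.5)
# fc[-1] is the forecast for the next (unobserved) period
```

The smoothing constant alpha trades responsiveness for stability; monitoring one-step forecast errors (e.g. via a tracking signal) tells you when alpha, or the model itself, needs revisiting.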

  8. A Methodology for the Selection of Multi-Criteria Decision Analysis Methods in Real Estate and Land Management Processes

    Directory of Open Access Journals (Sweden)

    Maria Rosaria Guarini

    2018-02-01

    Full Text Available Real estate and land management are characterised by a complex, elaborate combination of technical, regulatory and governmental factors. In Europe, Public Administrators must address complex decision-making problems while also acting in consideration of the expectations of the different stakeholders involved in settlement transformation. In complex situations (e.g., with different aspects to be considered and multilevel actors involved), decision-making processes are often used to carry out multidisciplinary and multidimensional analyses, which support the choices of those making the decision. Multi-Criteria Decision Analysis (MCDA) methods are included among the examination and evaluation techniques considered useful by the European Community. Such analyses and techniques are performed using methods that aim to reach a synthesis of the various forms of input data needed to define decision-making problems of a similar complexity, so that the conclusions reached allow for informed, well-thought-out, strategic decisions. According to the technical literature on MCDA, numerous methods are applicable in different decision-making situations; however, guidance for selecting the method most appropriate to the specific field of application and problem has not been thoroughly investigated. In land and real estate management, numerous queries regarding evaluations often arise. In brief, the objective of this paper is to outline a procedure with which to select the method best suited to the specific queries of evaluation that commonly arise while addressing decision-making problems in land and real estate management, i.e. the so-called “settlement sector”. The procedure follows a theoretical-methodological approach by formulating a taxonomy of the endogenous and exogenous variables of the multi-criteria analysis methods.

  9. Glycoprotein Enrichment Analytical Techniques: Advantages and Disadvantages.

    Science.gov (United States)

    Zhu, R; Zacharias, L; Wooding, K M; Peng, W; Mechref, Y

    2017-01-01

    Protein glycosylation is one of the most important posttranslational modifications. Numerous biological functions are related to protein glycosylation. However, analytical challenges remain in glycoprotein analysis. To overcome these challenges, many analytical techniques have been developed in recent years. Enrichment methods are used to improve the sensitivity of detection, while HPLC and mass spectrometry methods have been developed to facilitate the separation of glycopeptides/proteins and to enhance detection, respectively. Fragmentation techniques applied in modern mass spectrometers allow the structural interpretation of glycopeptides/proteins, while automated software tools have started replacing manual processing to improve the reliability and throughput of the analysis. In this chapter, the current methodologies of glycoprotein analysis are discussed. Multiple analytical techniques are compared, and the advantages and disadvantages of each technique are highlighted. © 2017 Elsevier Inc. All rights reserved.

  10. Alternative method of highway traffic safety analysis for developing countries using delphi technique and Bayesian network.

    Science.gov (United States)

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae

    2016-08-01

    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occur in developing countries. There are many risk factors associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi Technique is a suitable alternative method that can be exploited to generate highway traffic accident data through which the major accident causes can be identified. In order to authenticate the technique used, Korea, a country that underwent similar problems when it was in its early stages of development, and for which excellent highway safety records are available, is chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi Technique, in combination with the Bayesian Network Model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries of research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is part of the 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application through a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA, and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system; the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROCAEC due to a concern about Anticipated Transient Without Scram; the reduction of the boron injection tank concentration and the elimination of the heat tracing; and the reduction of reactor coolant system flow. (author)

  12. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  13. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. Hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  14. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    Science.gov (United States)

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence-based medicine, research is an essential part of the medical profession, whether clinical or academic. A research methodology workshop intends to help participants who are new to the research field as well as those who are already doing empirical research. The present study was conducted to assess the changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four-day research methodology workshop was planned. Before the commencement of the workshop, the participants were given a structured questionnaire (pre-test) containing 20 multiple-choice questions (Q1-Q20) related to the topics to be covered, and after its completion they were given a similar post-test questionnaire. The mean values of the pre- and post-test scores were calculated and the results were analyzed and compared. Of the 153 delegates, 45 (29%) were male and 108 (71%) were female. 92 (60%) participants consented to fill in the pre-test questionnaire and 68 (44%) filled in the post-test questionnaire. The mean pre-test and post-test scores at the 95% confidence interval were 7.62 (SD ±3.220) and 9.66 (SD ±2.477), respectively. The differences were found to be significant using the paired-sample t-test (P ...). Participatory research methodology workshops are good methods of imparting knowledge; the long-term effects still need to be evaluated.

  15. Methodology of sustainability accounting

    Directory of Open Access Journals (Sweden)

    O.H. Sokil

    2017-03-01

    Full Text Available Modern challenges of the theory and methodology of accounting are realized through the formation and implementation of new concepts whose purpose is to meet the needs of users for standard and unique information. The development of a methodology for sustainability accounting is a key aspect of the management of an economic entity. The purpose of the article is to form the methodological bases of accounting for sustainable development and to determine its goals, objectives, object, subject, methods, functions and key aspects. The author analyzes the theoretical bases of the definition and considers the components of the traditional accounting methodology. A generalized structural diagram of the methodology for accounting for sustainable development is offered in the article. The complex of methods and principles of sustainable development accounting has been systematized for both standard and non-standard provisions. The new system of theoretical and methodological provisions of accounting for sustainable development is justified in the context of determining its purpose, objective, subject, object, methods, functions and key aspects.

  16. Methodology for assessing the probability of corrosion in concrete structures on the basis of half-cell potential and concrete resistivity measurements.

    Science.gov (United States)

    Sadowski, Lukasz

    2013-01-01

    In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly, the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, namely the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.
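
    As a minimal illustration (not taken from the paper), half-cell potential readings are often interpreted against the ASTM C876 thresholds for potentials measured versus a Cu/CuSO4 reference electrode; the sketch below encodes that commonly cited interpretation:

    ```python
    def corrosion_probability(potential_mv):
        """Interpret a half-cell potential reading (mV vs. Cu/CuSO4)
        using the commonly cited ASTM C876 thresholds."""
        if potential_mv > -200:
            return "low (<10% probability of corrosion)"
        if potential_mv >= -350:
            return "uncertain"
        return "high (>90% probability of corrosion)"
    ```

    Combining such a reading with a concrete resistivity measurement, as the paper proposes, would refine the "uncertain" band in particular.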

  17. Methodology for Assessing the Probability of Corrosion in Concrete Structures on the Basis of Half-Cell Potential and Concrete Resistivity Measurements

    Directory of Open Access Journals (Sweden)

    Lukasz Sadowski

    2013-01-01

    Full Text Available In recent years, the corrosion of steel reinforcement has become a major problem in the construction industry. Therefore, much attention has been given to developing methods of predicting the service life of reinforced concrete structures. The progress of corrosion cannot be visually assessed until a crack or a delamination appears. The corrosion process can be tracked using several electrochemical techniques. Most commonly, the half-cell potential measurement technique is used for this purpose. However, it is generally accepted that it should be supplemented with other techniques. Hence, a methodology for assessing the probability of corrosion in concrete slabs by means of a combination of two methods, namely the half-cell potential method and the concrete resistivity method, is proposed. An assessment of the probability of corrosion in reinforced concrete structures carried out using the proposed methodology is presented. 200 mm thick, 750 mm × 750 mm reinforced concrete slab specimens were investigated. The potential Ecorr and the concrete resistivity ρ were measured at each point of the applied grid. The experimental results indicate that the proposed methodology can be successfully used to assess the probability of corrosion in concrete structures.

  18. Teaching research methods in nursing using Aronson's Jigsaw Technique. A cross-sectional survey of student satisfaction.

    Science.gov (United States)

    Leyva-Moral, Juan M; Riu Camps, Marta

    2016-05-01

    To adapt nursing studies to the European Higher Education Area, new teaching methods have been included that assign maximum importance to student-centered learning and collaborative work. The Jigsaw Technique is based on collaborative learning, and everyone in the group must play their part because each student's mark depends on the other students. Home group members are given the responsibility to become experts in a specific area of knowledge. Experts meet together to reach an agreement and improve skills. Finally, experts return to their home groups to share all their findings. The aim of this study was to evaluate nursing student satisfaction with the Jigsaw Technique used in the context of a compulsory course in research methods for nursing. A cross-sectional study was conducted using a self-administered anonymous questionnaire given to students who completed the Research Methods course during the 2012-13 and 2013-14 academic years. The questionnaire was developed taking into account the learning objectives, competencies and skills that students should acquire, as described in the course syllabus. The responses were compared by age group (younger or older than 22 years). A total of 89.6% of nursing students under 22 years believed that this methodology helped them to develop teamwork, while this figure was 79.6% in older students. Nursing students also believed it helped them to work independently, with differences according to age, 79.7% and 58% respectively (p=0.010). Students disagreed with the statement "The Jigsaw Technique involves little workload", with percentages of 88.5% in the group under 22 years and 80% in older students. Most believed that this method should not be employed in upcoming courses, although there were differences by age, with 44.3% of the younger group being against and 62% of the older group (p=0.037). The method was not highly valued by students, mainly those older than 22 years, who concluded that they did not learn

  19. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an introduction and 11 independent chapters, which are devoted to various new approaches to intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to theoretical aspects while the others present practical aspects and the...

  20. Numerical solution of large nonlinear boundary value problems by quadratic minimization techniques

    International Nuclear Information System (INIS)

    Glowinski, R.; Le Tallec, P.

    1984-01-01

    The objective of this paper is to describe the numerical treatment of large, highly nonlinear two- or three-dimensional boundary value problems by quadratic minimization techniques. In all the different situations where these techniques were applied, the methodology remains the same and is organized as follows: 1) derive a variational formulation of the original boundary value problem, and approximate it by Galerkin methods; 2) transform this variational formulation into a quadratic minimization problem (least squares methods) or into a sequence of quadratic minimization problems (augmented lagrangian decomposition); 3) solve each quadratic minimization problem by a conjugate gradient method with preconditioning, the preconditioning matrix being sparse, positive definite, and fixed once and for all in the iterative process. This paper illustrates the methodology above on two different examples: the description of least squares solution methods and their application to the solution of the unsteady Navier-Stokes equations for incompressible viscous fluids; and the description of augmented lagrangian decomposition techniques and their application to the solution of equilibrium problems in finite elasticity
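
    Step 3 above — a conjugate gradient method with a sparse, positive definite preconditioner fixed once and for all — can be sketched generically; this is a standard Jacobi-preconditioned CG, not the authors' implementation:

    ```python
    import numpy as np

    def preconditioned_cg(A, b, tol=1e-10, max_iter=200):
        """Solve A x = b (A symmetric positive definite) by conjugate gradients
        with a fixed Jacobi (diagonal) preconditioner M = diag(A)."""
        m_inv = 1.0 / np.diag(A)          # preconditioner fixed for all iterations
        x = np.zeros(len(b))
        r = b - A @ x                     # residual
        z = m_inv * r                     # preconditioned residual
        p = z.copy()                      # search direction
        rz = r @ z
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rz / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            if np.linalg.norm(r) < tol:
                break
            z = m_inv * r
            rz_new = r @ z
            p = z + (rz_new / rz) * p     # direction update with beta = rz_new / rz
            rz = rz_new
        return x
    ```

    Because the preconditioner is diagonal, each iteration adds only a cheap elementwise scaling to the plain CG cost, which is what makes a fixed, sparse preconditioner attractive for the large problems described here.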

  1. Waste Package Design Methodology Report

    Energy Technology Data Exchange (ETDEWEB)

    D.A. Brownson

    2001-09-28

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report.

  2. Waste Package Design Methodology Report

    International Nuclear Information System (INIS)

    D.A. Brownson

    2001-01-01

    The objective of this report is to describe the analytical methods and processes used by the Waste Package Design Section to establish the integrity of the various waste package designs, the emplacement pallet, and the drip shield. The scope of this report shall be the methodology used in criticality, risk-informed, shielding, source term, structural, and thermal analyses. The basic features and appropriateness of the methods are illustrated, and the processes are defined whereby input values and assumptions flow through the application of those methods to obtain designs that ensure defense-in-depth as well as satisfy requirements on system performance. Such requirements include those imposed by federal regulation, from both the U.S. Department of Energy (DOE) and U.S. Nuclear Regulatory Commission (NRC), and those imposed by the Yucca Mountain Project to meet repository performance goals. The report is to be used, in part, to describe the waste package design methods and techniques to be used for producing input to the License Application Report

  3. The digital subtraction technique in lateral urethrocystography. Die laterale Urethrozystographie in digitaler Subtraktionstechnik

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, K. (Radiologische Abt., Caritas-Krankenhaus, Bad Mergentheim (Germany)); Peterseim, H. (Frauenklinik, Caritas-Krankenhaus, Bad Mergentheim (Germany)); Grehn, S. (Radiologische Abt., Caritas-Krankenhaus, Bad Mergentheim (Germany))

    1994-09-01

    The first application of the digital subtraction technique to lateral urethrocystography is described. This methodology facilitates gynecological-urological operations by providing reliable investigation results and an unambiguous image interpretation. Methods and first experiences with the digital subtraction technique in gynecological-urological diagnosis are reported. (orig.)

  4. A methodology for on line fatigue life monitoring : rainflow cycle counting method

    International Nuclear Information System (INIS)

    Mukhopadhyay, N.K.; Dutta, B.K.; Kushwaha, H.S.

    1992-01-01

    The Green's function technique is used in on-line fatigue life monitoring because it converts plant data to stress-versus-time data most efficiently. To compute the fatigue usage factor, the actual number of cycles experienced by the component must first be determined from the stress-versus-time data; the fatigue usage factor is then computed from the number of cycles using the material fatigue properties. Generally, the stress response is very irregular in nature. To convert an irregular stress history into a stress frequency spectrum, the rainflow cycle counting method is used. This method has proved superior to other counting methods and yields the best fatigue estimates. A code has been developed which computes the number of cycles experienced by the component from the stress time history using the rainflow cycle counting method. This postprocessor also computes the accumulated fatigue usage factor from the material fatigue properties. The present report describes the development of a code to compute the fatigue usage factor using the rainflow cycle counting technique and presents a real-life case study. (author). 10 refs., 10 figs
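
    A simplified three-point rainflow counter can be sketched as follows. It counts only closed cycles and ignores residual and boundary half-cycles, so it is a simplification of the full ASTM E1049 procedure, not the code described in the report:

    ```python
    def turning_points(series):
        """Reduce a stress history to its reversals (local peaks and valleys)."""
        pts = [series[0]]
        for x in series[1:]:
            if x == pts[-1]:
                continue                                      # drop repeats
            if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
                pts[-1] = x                                   # same direction: extend excursion
            else:
                pts.append(x)                                 # direction change: new reversal
        return pts

    def rainflow_ranges(series):
        """Simplified three-point rainflow count: returns closed-cycle stress ranges."""
        stack, cycles = [], []
        for pt in turning_points(series):
            stack.append(pt)
            while len(stack) >= 3:
                x = abs(stack[-1] - stack[-2])                # newest range
                y = abs(stack[-2] - stack[-3])                # previous range
                if x < y:
                    break                                     # no cycle closed yet
                cycles.append(y)                              # inner excursion closes a cycle
                del stack[-3:-1]                              # remove its two reversals
        return cycles                                         # residual half-cycles stay on stack
    ```

    For the stress history [-2, 1, -3, 5, -1, 3, -4, 4, -2], this simplified counter extracts the closed-cycle ranges [3, 4, 8]; the remaining reversals are the residual, which a full implementation would count as half-cycles.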

  5. Development of a novel methodology for indoor emission source identification

    DEFF Research Database (Denmark)

    Han, K.H.; Zhang, J.S.; Knudsen, H.N.

    2011-01-01

    The objective of this study was to develop and evaluate a methodology to identify individual sources of emissions based on the measurements of mixed air samples and the emission signatures of individual materials previously determined by Proton Transfer Reaction-Mass Spectrometry (PTR-MS), an on-line analytical device. The methodology based on signal processing principles was developed by employing the method of multiple regression least squares (MRLS) and a normalization technique. Samples of nine typical building materials were tested individually and in combination, including carpet, ceiling material...... experiments and investigation are needed for cases where the relative emission rates among different compounds may change over a long-term period....
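
    The core MRLS step — estimating individual source strengths from a mixed-air measurement and previously determined signatures — can be sketched with invented numbers (the signature matrix and rates below are purely illustrative, not data from the study):

    ```python
    import numpy as np

    # Hypothetical emission signatures: rows = PTR-MS ion channels, columns = materials
    S = np.array([[0.9, 0.1, 0.0],
                  [0.1, 0.8, 0.2],
                  [0.0, 0.1, 0.7],
                  [0.2, 0.0, 0.3]])
    true_rates = np.array([2.0, 0.5, 1.0])     # assumed per-material source strengths
    mixed = S @ true_rates                     # simulated mixed-air measurement

    # Multiple regression least squares: find rates r minimizing ||S r - mixed||
    rates, *_ = np.linalg.lstsq(S, mixed, rcond=None)
    relative = rates / rates.sum()             # normalization to relative contributions
    ```

    In this noise-free toy case the recovered rates match the assumed ones exactly; with real measurements, the residual of the fit indicates how well the library of signatures explains the mixed sample.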

  6. A methodology to measure the degree of managerial innovation

    Directory of Open Access Journals (Sweden)

    Mustafa Batuhan Ayhan

    2014-01-01

    Full Text Available Purpose: The main objective of this study is to introduce the concept of managerial innovation and to propose a quantitative methodology to measure the degree of managerial innovation capability by analyzing the evolution of the techniques used for management functions. Design/methodology/approach: The methodology mainly focuses on the different techniques used for each of the management functions, namely planning, organizing, leading, controlling and coordinating. These functions are studied and the different techniques used for them are listed. Since the techniques used for these management functions evolve over time due to technological and social changes, a methodology is required to measure the degree of managerial innovation capability. This competency is measured through an analysis performed to point out which techniques are used for each of these functions. Findings: To check the validity and applicability of this methodology, it is implemented in a manufacturing company. Based on the results of the implementation, enhancements are suggested to the company for each function to survive in the changing managerial conditions. Research limitations/implications: The primary limitation of this study is the implementation area. Although the study is implemented in just a single manufacturing company, the same methodology can be applied to measure the managerial innovation capabilities of other manufacturing companies. Moreover, the model is ready to be adapted to different sectors, although it was mainly prepared for the manufacturing sector. Originality/value: Although innovation management is widely studied, managerial innovation is a new concept, introduced to measure the capability to respond to the changes that occur in managerial functions. In brief, this methodology aims to be a pioneer in the field of managerial innovation regarding the evolution of management functions. It is therefore expected to lead to more studies that inspect the progress of

  7. SEMANTIC NETWORKS: THEORETICAL, TECHNICAL, METHODOLOGIC AND ANALYTICAL ASPECTS

    Directory of Open Access Journals (Sweden)

    José Ángel Vera Noriega

    2005-09-01

    Full Text Available This work is a review of the methodological procedures and precautions for the measurement of connotative meanings to be used in the elaboration of instruments with ethnic validity. Starting from the techniques originally proposed by Figueroa et al. (1981) and later described by Lagunes (1993), the intention is to offer a didactic panorama of how to carry out measurement by semantic networks, introducing some recommendations derived from studies performed with this method.

  8. Variance Reduction Techniques in Monte Carlo Methods

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.

    2010-01-01

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the
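
    As one concrete example of a VRT, antithetic variates pair each uniform draw u with 1 - u so that estimation errors partially cancel; a minimal sketch, not tied to any particular model in the chapter:

    ```python
    import math
    import random

    def mc_estimate(f, n, antithetic=False, seed=0):
        """Plain or antithetic Monte Carlo estimate of E[f(U)], U ~ Uniform(0, 1)."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n):
            u = rng.random()
            # The antithetic pair (u, 1 - u) is negatively correlated for monotone f,
            # so averaging the pair reduces the variance of the estimator.
            total += 0.5 * (f(u) + f(1.0 - u)) if antithetic else f(u)
        return total / n
    ```

    For a monotone integrand such as f(u) = e^u, whose exact mean is e - 1, the antithetic estimator typically achieves a much smaller error than the plain estimator at the same number of samples.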

  9. Empowerment methods and techniques for sport managers

    Directory of Open Access Journals (Sweden)

    THANOS KRIEMADIS

    2006-01-01

    Full Text Available We live in a globalized economic, social and technological environment where organizations can be successful only if they have the required resources (material resources, facilities and equipment, and human resources). Managers and organizations should empower and enable employees to accomplish their work in meaningful ways. Empowerment has been described as a means to enable employees to make decisions and as a personal phenomenon where individuals take responsibility for their own actions. The aim of the present study was to present effective methods and techniques of employee empowerment which constitute a source of competitive advantage for the organization. The paper presents and explains empowerment methods and techniques such as: (a) organizational culture, (b) vision statements, (c) organizational values, (d) teamwork, (e) the role of the manager - leadership, (f) devolving responsibility and accountability, (g) information sharing, (h) continuous training, (i) appraisal and rewards, (j) goal setting, and (k) the performance appraisal process.

  10. Applying Qualitative Research Methods to Narrative Knowledge Engineering

    OpenAIRE

    O'Neill, Brian; Riedl, Mark

    2014-01-01

    We propose a methodology for knowledge engineering for narrative intelligence systems, based on techniques used to elicit themes in qualitative methods research. Our methodology uses coding techniques to identify actions in natural language corpora, and uses these actions to create planning operators and procedural knowledge, such as scripts. In an iterative process, coders create a taxonomy of codes relevant to the corpus, and apply those codes to each element of that corpus. These codes can...

  11. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools

    Science.gov (United States)

    Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)

    2002-01-01

    This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no single standard methodology or technique used by NASA or the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix whose three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.

  12. Digital processing methodology applied to exploring of radiological images

    International Nuclear Information System (INIS)

    Oliveira, Cristiane de Queiroz

    2004-01-01

    In this work, digital image processing is applied as an automatic computational method for exploring radiological images. An automatic routine was developed, based on segmentation and post-processing techniques, for radiological images acquired from an arrangement consisting of an X-ray tube, a molybdenum target and filter of 0.4 mm and 0.03 mm, respectively, and a CCD detector. The efficiency of the developed methodology is shown in this work through a case study, where internal injuries in mangoes are automatically detected and monitored. This methodology is a possible tool to be introduced in the post-harvest process in packing houses. A dichotomic test was applied to evaluate the efficiency of the method. The results show a success rate of 87.7% for correct diagnoses and a failure rate of 12.3%, with a sensitivity of 93% and a specificity of 80%. (author)
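
    The reported figures follow the standard dichotomic (confusion-matrix) definitions; the counts below are hypothetical, chosen only to show rates of the same form as those reported:

    ```python
    def dichotomic_metrics(tp, fn, tn, fp):
        """Sensitivity, specificity and overall success rate from a 2x2 dichotomic test."""
        sensitivity = tp / (tp + fn)              # true positives among actual positives
        specificity = tn / (tn + fp)              # true negatives among actual negatives
        success = (tp + tn) / (tp + fn + tn + fp) # fraction of correct diagnoses overall
        return sensitivity, specificity, success
    ```

    With hypothetical counts tp=93, fn=7, tn=80, fp=20, this yields a sensitivity of 0.93 and a specificity of 0.80, matching the form of the reported results.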

  13. Deliverable 4.1 Homogeneous LCA methodology agreed by NEPTUNE and INNOWATECH

    DEFF Research Database (Denmark)

    Larsen, Henrik Fred; Hauschild, Michael Zwicky; Wenzel, Henrik

    2007-01-01

    In order to do a life cycle assessment (LCA) of a waste water treatment technique, a system to handle the mapped inventory data and a life cycle impact assessment (LCIA) method/model is needed. Besides NEPTUNE, another EU-funded project has the same methodology need, namely INNOWATECH (contract No. 036882), running in parallel with NEPTUNE but focusing on industrial waste water. With the aim of facilitating cooperation between the two projects, a common LCA methodology framework has been worked out and is described in the following. This methodology work has been done as a joint effort between...... NEPTUNE WP4 and INNOWATECH WP4, represented by the WP4 lead partner IVL. The aim of the co-operation is to establish common methodologies and/or LCA models and/or tools in order to achieve a homogeneous approach in INNOWATECH and NEPTUNE. Further, the aim is to facilitate possibilities of data exchange...

  14. Soft Systems Methodology

    Science.gov (United States)

    Checkland, Peter; Poulter, John

    Soft systems methodology (SSM) is an approach for tackling problematical, messy situations of all kinds. It is an action-oriented process of inquiry into problematic situations in which users learn their way from finding out about the situation, to taking action to improve it. The learning emerges via an organised process in which the situation is explored using a set of models of purposeful action (each built to encapsulate a single worldview) as intellectual devices, or tools, to inform and structure discussion about a situation and how it might be improved. This paper, written by the original developer Peter Checkland and practitioner John Poulter, gives a clear and concise account of the approach that covers SSM's specific techniques, the learning cycle process of the methodology and the craft skills which practitioners develop. This concise but theoretically robust account nevertheless includes the fundamental concepts, techniques, core tenets described through a wide range of settings.

  15. Situating methodology within qualitative research.

    Science.gov (United States)

    Kramer-Kile, Marnie L

    2012-01-01

    Qualitative nurse researchers are required to make deliberate and sometimes complex methodological decisions about their work. Methodology in qualitative research is a comprehensive approach in which theory (ideas) and method (doing) are brought into close alignment. It can be difficult, at times, to understand the concept of methodology. The purpose of this research column is to: (1) define qualitative methodology; (2) illuminate the relationship between epistemology, ontology and methodology; (3) explicate the connection between theory and method in qualitative research design; and (4) highlight relevant examples of methodological decisions made within cardiovascular nursing research. Although there is no "one set way" to do qualitative research, all qualitative researchers should account for the choices they make throughout the research process and articulate their methodological decision-making along the way.

  16. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. Leakage localisation methodology is based on pressure sensitivity matrix. Sensitivity is normalised and binarised using a common threshold for all nodes, so a signatures matrix is obtained. A pressure sensor optimal distribution methodology is developed too, but it is not used in the real test. To validate this...
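
    The binarised signatures matrix can be illustrated with a toy example; the sensitivity values, threshold and network size below are invented, whereas a real test would use a calibrated hydraulic simulation model:

    ```python
    import numpy as np

    # Hypothetical pressure-sensitivity matrix: rows = pressure sensors, cols = candidate leak nodes
    S = np.array([[0.9, 0.2, 0.1],
                  [0.3, 0.8, 0.2],
                  [0.1, 0.3, 0.7]])

    threshold = 0.5                              # common threshold for all nodes
    signatures = (S >= threshold).astype(int)    # binarised signatures matrix

    def localise_leak(residuals):
        """Return the candidate node whose binary signature best matches the
        binarised pressure residuals observed at the sensors."""
        matches = (signatures == residuals[:, None]).sum(axis=0)
        return int(np.argmax(matches))
    ```

    A residual pattern that activates mainly the second sensor then points to the second candidate node, because its signature column agrees with the observed pattern at every sensor.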

  17. Study of possibility using LANL PSA-methodology for accident probability RBMK researches

    International Nuclear Information System (INIS)

    Petrin, S.V.; Yuferev, V.Y.; Zlobin, A.M.

    1995-01-01

    The probabilistic safety analysis (PSA) methodologies used for reactor facilities at LANL (USA) and NIKIET (RF) are considered. The methodologies are compared in order to reveal their similarities and differences, and to determine the possibility of using the LANL technique for RBMK-type reactor safety analysis. It is found that at the PSA-1 level the methodologies practically do not differ. At LANL, the PHA and HAZOP hazard analysis methods are used for a more complete specification of the accounted initiating event list, which can also be useful when performing PSA for RBMK. Exchange of information regarding the methodology for detecting dependent faults and accounting for human factor impact on reactor safety is reasonable. It is considered useful to carry out a comparative analysis of results for test problems or PSA fragments using the various computer programs employed at NIKIET and LANL

  18. Volatility in the California power market: source, methodology and recommendations

    International Nuclear Information System (INIS)

    Dahlgren, R.W.; Liu, C.-C.; Lawarree, J.

    2001-01-01

    Extreme short-term price volatility in competitive electricity markets creates the need for price risk management by electric utilities. Recent experience in California provides lessons that can be applied to other markets worldwide. Value-at-Risk (VAR), a method for quantifying risk exposure in the financial industry, is introduced as a technique applicable to quantifying price risk exposure in power systems. The methodology applies VAR to changes in prices from corresponding hours in previous periods, in order to understand the hourly exposure of an entity that is obligated to serve a load but does not have a contract for supply. The VAR methodology introduced is then applied to a sample company in California serving a 100 MW load. Proposed remedies for the problems observed in the competitive California electric power industry are introduced. (Author)
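
    A minimal historical-simulation VAR over hourly price changes can be sketched as follows; this is illustrative only, and the paper's exact formulation, horizon and confidence level may differ:

    ```python
    def historical_var(price_changes, confidence=0.95):
        """One-period Value-at-Risk by historical simulation:
        the loss level not exceeded with the given confidence."""
        losses = sorted(-c for c in price_changes)   # sign flip: positive = loss
        idx = int(confidence * len(losses))          # empirical quantile index
        return losses[min(idx, len(losses) - 1)]
    ```

    An entity serving load without a supply contract would apply this to the distribution of price changes observed at corresponding hours of previous periods, as described above.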

  19. A fresh recipe for designers: HCI approach to explore the nexus between design techniques and formal methods in software development

    Directory of Open Access Journals (Sweden)

    Julian Galindo Losada

    2016-11-01

    Full Text Available Emerging companies involved in the design and implementation of innovative products demand multidisciplinary teams to be competitive in the market. This need requires designers to extend their knowledge not only of the User Interface elements of the design process but also of software methodologies, to cover the lack of resources and expertise in start-ups. It raises the question of how designers can align HCI techniques with best practices in software development while preserving usability and ease-of-use principles. To explore this gap, this paper proposes an approach which combines existing technology and methods by studying the nexus between HCI prototyping and software engineering. The approach is applied in a case study on the design of a virtual shop, harmonizing the use of storyboards and the spiral model. A comprehensive analysis is performed using a Technology Acceptance Model (TAM) with respect to two variables: usability and ease of use. The present findings underline the positive integration of HCI techniques and formal methods without compromising user satisfaction, with a potential benefit for small companies in a formation stage.

  20. Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches

    Science.gov (United States)

    Farassat, Fereidoun; Casper, Jay H.

    2006-01-01

    In this paper, we present a critical survey of current airframe noise (AFN) prediction methodologies. Four methodologies are recognized: the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method, and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on recent high-quality acoustic databases is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need further development for this method to become useful. Nonetheless, the authors propose that methods based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.

  1. Research methods for English language teachers

    CERN Document Server

    McDonough, Jo

    2014-01-01

    This book offers a lively introduction to the research methods and techniques available to English language teachers who wish to investigate aspects of their own practice. It covers qualitative and quantitative methodology and includes sections on observation, introspection, diary studies, experiments, interviews, questionnaires, numerical techniques and case study research. Each method is illustrated with examples in language teaching contexts, and techniques of data collection and analysis are introduced. The authors focus particularly on research in the classroom, on tests, materials, the

  2. Investigating surety methodologies for cognitive systems.

    Energy Technology Data Exchange (ETDEWEB)

    Caudell, Thomas P. (University of New Mexico, Albuquerque, NM); Peercy, David Eugene; Mills, Kristy (University of New Mexico, Albuquerque, NM); Caldera, Eva (University of New Mexico, Albuquerque, NM)

    2006-11-01

    Advances in cognitive science provide a foundation for new tools that promise to advance human capabilities with significant positive impacts. As with any new technology breakthrough, associated technical and non-technical risks are involved. Sandia has mitigated both technical and non-technical risks by applying advanced surety methodologies in such areas as nuclear weapons, nuclear reactor safety, nuclear materials transport, and energy systems. In order to apply surety to the development of cognitive systems, we must understand the concepts and principles that characterize the certainty of a system's operation as well as the risk areas of cognitive sciences. This SAND report documents a preliminary spectrum of risks involved with cognitive sciences, and identifies some surety methodologies that can be applied to potentially mitigate such risks. Some potential areas for further study are recommended. In particular, a recommendation is made to develop a cognitive systems epistemology framework for more detailed study of these risk areas and applications of surety methods and techniques.

  3. Methodological issues in the study of risk perception and human behavior

    International Nuclear Information System (INIS)

    Rathbun, P.F.

    1983-01-01

    The purpose of this paper is to provide a broad perspective on the use of the methods and techniques of the behavioral and social sciences as they pertain to the work of the Nuclear Regulatory Commission, particularly in issues of risk perception. Four major topics or themes are discussed: (1) a brief overview of the classic theories of risk perception; (2) current contractor work in the area of risk perception and cognitive psychology; (3) other uses of the social and behavioral sciences in the Agency; and (4) methodological considerations in using the techniques

  4. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More Than Qualitative Methods.

    Science.gov (United States)

    Bowleg, Lisa

    2017-10-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with this premise, I address four themes in this commentary. First, I criticize the ubiquitous and uncritical use of the term health disparities in U.S. public health. Next, I advocate for the increased use of qualitative methodologies-namely, photovoice and critical ethnography-that, pursuant to critical approaches, prioritize dismantling social-structural inequities as a prerequisite to health equity. Thereafter, I discuss epistemological stance and its influence on all aspects of the research process. Finally, I highlight my critical discourse analysis HIV prevention research based on individual interviews and focus groups with Black men, as an example of a critical health equity research approach.

  5. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two-year research project entitled ``Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants.'' The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis, and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  6. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls – A review

    International Nuclear Information System (INIS)

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J. Pablo; Garcia-Jares, Carmen

    2016-01-01

    Cosmetic products placed on the market and their ingredients must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005–2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their possibilities of fulfilling the current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and in the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches.

  7. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls – A review

    Energy Technology Data Exchange (ETDEWEB)

    Lores, Marta, E-mail: marta.lores@usc.es; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J. Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market and their ingredients must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005–2015) used for the extraction and the analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant analytical methods along with their possibilities of fulfilling the current regulatory requirements. The cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and in the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. In recent years, research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches.

  8. Validation of methodologies for the analysis of lead and methyl-ether in gasoline, using the techniques of atomic emission with plasma source coupled inductively and micellar liquid chromatography

    International Nuclear Information System (INIS)

    Redondo Escalante, M.

    1995-01-01

    This study established and optimized the experimental variables for lead quantification in aqueous media by the ICP-AES technique. A comparative study was made of several methods proposed in the literature for the extraction of lead from gasoline into aqueous media; it was determined that this procedure cannot be carried out using the hydrolysis reaction of tetraethyl lead. The optimum conditions for lead quantification in gasoline were established using methyl isobutyl ketone and also ethanol as solvents. The conditions of the proposed methodologies were optimized, and the analytical performance variables were defined. It was demonstrated that lead standard solutions in organic media can be prepared starting from inorganic salts of this metal. Gas chromatography and high-performance liquid chromatography were compared for the analysis of methyl tert-butyl ether (MTBE). It was demonstrated that MTBE can be quantified by the HPLC technique, and it was found that 'micellar' liquid chromatography. (author) [es

  9. Methodological optimization of tinnitus assessment using prepulse inhibition of the acoustic startle reflex.

    Science.gov (United States)

    Longenecker, R J; Galazyuk, A V

    2012-11-16

    Recently, prepulse inhibition of the acoustic startle reflex (ASR) has become a popular technique for tinnitus assessment in laboratory animals. This method confers a significant advantage over the previously used, time-consuming behavioral approaches utilizing basic mechanisms of conditioning. Although this technique has been used successfully to assess tinnitus in different laboratory animals, many of the finer details of this methodology have not been described in enough detail to be replicated, yet are critical for tinnitus assessment. Here we provide a detailed description of key procedures and methodological issues, offering guidance for newcomers learning to correctly apply gap detection techniques for tinnitus assessment in laboratory animals. The major categories of these issues include: refinement of hardware for best performance, optimization of stimulus parameters, behavioral considerations, and identification of optimal strategies for data analysis. This article is part of a Special Issue entitled: Tinnitus Neuroscience. Copyright © 2012. Published by Elsevier B.V.

  10. Methodological issues affecting the study of fish parasites. II. Sampling method affects ectoparasite studies

    Czech Academy of Sciences Publication Activity Database

    Kvach, Yuriy; Ondračková, Markéta; Janáč, Michal; Jurajda, Pavel

    2016-01-01

    Roč. 121, č. 1 (2016), s. 59-66 ISSN 0177-5103 R&D Projects: GA ČR GBP505/12/G112 Institutional support: RVO:68081766 Keywords : Parasite community * Fish sampling method * Methodology * Parasitological examination * Rutilus rutilus Subject RIV: EG - Zoology Impact factor: 1.549, year: 2016

  11. Methodological proposal for studying suicide as a complex phenomenon

    Directory of Open Access Journals (Sweden)

    Minayo Maria Cecília de Souza

    2006-01-01

    Full Text Available The authors present a methodological proposal for studying suicide and suicide attempts from a combined socio-anthropological, epidemiological, and psychosocial perspective. This interdisciplinary and complex research model simultaneously examined individual, socioeconomic, historical/cultural, and population data as few studies have succeeded in doing to date. Considering that the present study was conducted in a specific social reality, the authors created a methodological approach to comprehend the effects of a crisis in an industrial restructuring process in a mining company town in the State of Minas Gerais, Brazil, that was associated with unusually high suicide rates. Since it referred to a small geographic area (with only 100,000 inhabitants), the research is considered an ideal case study. The authors created different strategies to trace the local epidemiological profile, adapted a psychosocial autopsy technique to elucidate suicide cases and a psychosocial harm assessment technique to comprehend suicide attempts, and conducted a local analysis of the socio-cultural context. The methods proposed here (with their advantages and limitations) proved productive for elucidating the study hypothesis.

  12. Methodologies and Methods for User Behavioral Research.

    Science.gov (United States)

    Wang, Peiling

    1999-01-01

    Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…

  13. A Method to Extract the Intrinsic Mechanical Properties of Soft Metallic Thin Films Based on Nanoindentation Continuous Stiffness Measurement Technique

    International Nuclear Information System (INIS)

    Zhou, X Y; Jiang, Z D; Wang, H R; Zhu, Q

    2006-01-01

    In order to accurately determine the intrinsic hardness of a soft metallic thin film on a hard substrate using nanoindentation, a methodology is described that is insensitive to several important effects that complicate the Oliver-Pharr method. First, the original analysis data, such as the load, P, and contact stiffness, S, as a function of the indentation depth, h, are acquired by means of the continuous stiffness measurement (CSM) technique. With CSM, the complicating effects of the indentation creep behaviour of metallic materials as well as thermal drift on the measured results are effectively avoided. Then, the hardness of the film alone is calculated via a material characteristic parameter, P/S², which is independent of the contact area, A, based on the constant-modulus assumption. In this way, the influences of the substrate contribution and material pile-up behaviour need not be accounted for. Guided by the above ideas, a 504 nm Au film on a glass substrate was chosen for study. The results show that the hardness of the Au thin film is 1.6±1 GPa, which agrees well with the literature, while the composite hardness measured by the Oliver-Pharr method is between 2 and 3 GPa, which is clearly an overestimate. This implies the present methodology is a more accurate and simpler way of extracting the true hardness of soft metallic thin films.
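The P/S² route described in the abstract can be sketched in a few lines: under the constant-modulus assumption, Sneddon's stiffness relation lets the contact area cancel out of the hardness calculation. The numerical values below (load, stiffness, reduced modulus) are illustrative assumptions, not data from the paper.

```python
import math

def film_hardness(P, S, E_r, beta=1.0):
    """Hardness from the area-independent parameter P/S^2.

    Under the constant-modulus assumption, Sneddon's relation
    S = 2*beta/sqrt(pi) * E_r * sqrt(A) gives A = pi*S^2 / (4*beta^2*E_r^2),
    so H = P/A = (4*beta^2*E_r^2 / pi) * (P/S^2) -- no contact area needed.
    """
    return 4.0 * beta**2 * E_r**2 / math.pi * (P / S**2)

# Illustrative (hypothetical) CSM data point for a soft metallic film:
# load P = 0.08 mN, contact stiffness S = 2.0e4 N/m, reduced modulus E_r = 80 GPa
H = film_hardness(P=0.08e-3, S=2.0e4, E_r=80e9)
print(f"H ≈ {H / 1e9:.2f} GPa")
```

Because P/S² is area-independent, neither pile-up nor the substrate's contribution to the contact area enters the estimate, which is the point of the method.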

  14. Proposal of a method for evaluating tsunami risk using response-surface methodology

    Science.gov (United States)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are inefficient and their calculation cost is high, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and the tsunami inundation depth (y) as the response variable. Subsequently, tsunami risk could be evaluated by conducting a Monte Carlo simulation, assuming that the generation of earthquakes follows a Poisson distribution, that the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and that the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a·x1 + b·x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probabilistic distributions for earthquake generation, inundation height, and vulnerability. Based on these probabilistic distributions, we conducted Monte Carlo simulations over 1,000,000 years. We found that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface.
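The Monte Carlo scheme described in the abstract can be sketched as follows. The response-surface coefficients are the ones quoted above; the earthquake rate, the fault-parameter ranges, and the log-normal fragility parameters (`median`, `beta`) are hypothetical placeholders, so the resulting damage probability will not reproduce the paper's 22.5%.

```python
import math
import random

# Response surface fitted in the abstract: y = a*x1 + b*x2 + c
a, b, c = 0.2615, 3.1763, -1.1802

def inundation_depth(x1, x2):
    """Tsunami inundation depth y from fault depth x1 and slip x2."""
    return a * x1 + b * x2 + c

def fragility(y, median=2.0, beta=0.5):
    """Log-normal damage probability; median/beta are illustrative assumptions."""
    if y <= 0:
        return 0.0
    z = (math.log(y) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def simulate(years=100000, rate=0.01, seed=1):
    """Conditional damage probability given an earthquake: Poisson occurrence,
    uncertain fault parameters -> response surface -> log-normal fragility."""
    rng = random.Random(seed)
    damaged = trials = 0
    for _ in range(years):
        # Sample the Poisson(rate) event count for this year (Knuth's method)
        n, p, L = 0, 1.0, math.exp(-rate)
        while True:
            p *= rng.random()
            if p <= L:
                break
            n += 1
        for _ in range(n):
            x1 = rng.uniform(5, 15)    # fault depth, assumed range
            x2 = rng.uniform(0.5, 2)   # fault slip, assumed range
            trials += 1
            if rng.random() < fragility(inundation_depth(x1, x2)):
                damaged += 1
    return damaged / trials if trials else 0.0

print(f"conditional damage probability ≈ {simulate():.3f}")
```

The appeal of the approach is that each Monte Carlo draw evaluates the cheap linear surface instead of rerunning a tsunami simulation.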

  15. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid to long term. Earthquakes and tsunamis are considered the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue of current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, one of the methodologies that should be enhanced step by step for the improvement and maturity of PRA techniques. The AESJ standard for the procedure of seismic PRA for nuclear power plants (2015) provides the basic concept of the methodology; however, details of its application to an actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study identifies issues that require further research. (author)

  16. Innovative research methods for studying treatments for rare diseases: methodological review.

    Science.gov (United States)

    Gagne, Joshua J; Thompson, Lauren; O'Keefe, Kelly; Kesselheim, Aaron S

    2014-11-24

    To examine methods for generating evidence on health outcomes in patients with rare diseases. Methodological review of existing literature. PubMed, Embase, and Academic Search Premier searched for articles describing innovative approaches to randomized trial design and analysis methods and methods for conducting observational research in patients with rare diseases. We assessed information related to the proposed methods, the specific rare disease being studied, and outcomes from the application of the methods. We summarize methods with respect to their advantages in studying health outcomes in rare diseases and provide examples of their application. We identified 46 articles that proposed or described methods for studying patient health outcomes in rare diseases. Articles covered a wide range of rare diseases and most (72%) were published in 2008 or later. We identified 16 research strategies for studying rare disease. Innovative clinical trial methods minimize sample size requirements (n=4) and maximize the proportion of patients who receive active treatment (n=2), strategies crucial to studying small populations of patients with limited treatment choices. No studies describing unique methods for conducting observational studies in patients with rare diseases were identified. Though numerous studies apply unique clinical trial designs and considerations to assess patient health outcomes in rare diseases, less attention has been paid to innovative methods for studying rare diseases using observational data. © Gagne et al 2014.

  17. Methodological Challenges in Sustainability Science: A Call for Method Plurality, Procedural Rigor and Longitudinal Research

    Directory of Open Access Journals (Sweden)

    Henrik von Wehrden

    2017-02-01

    Full Text Available Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: (1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed-methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; (2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and (3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research, to effectively address wicked problems as well as methodological challenges in sustainability science.

  18. Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

    Science.gov (United States)

    Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-29

    Article summaries' information and structure may influence researchers/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and Cochrane database. For each review, quality, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate-regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low-bias risk showed higher total PRISMA-A values than reviews with high-bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability. Reviews with a

  19. The intersections between TRIZ and forecasting methodology

    Directory of Open Access Journals (Sweden)

    Georgeta BARBULESCU

    2010-12-01

    Full Text Available The authors’ intention is to correlate the basic knowledge in using the TRIZ methodology (Theory of Inventive Problem Solving; in Russian, Teoriya Resheniya Izobretatelskikh Zadatch) as a problem-solving tool meant to help decision makers perform more significant forecasting exercises. The idea is to identify the TRIZ features and instruments (the 40 inventive principles, for instance) for putting in evidence the noise-and-signal problem, for trend identification (qualitative and quantitative tendencies), and as support tools in technological forecasting, to make decision-makers able to refine and increase the level of confidence in the forecasting results. The interest in connecting TRIZ to forecasting methodology nowadays relates to the massive application of TRIZ methods and techniques for engineering system development world-wide and to the growing application of TRIZ’s concepts and paradigms for the improvement of non-engineering systems (including business and economic applications).

  20. The fairness of the PPS reimbursement methodology.

    Science.gov (United States)

    Gianfrancesco, F D

    1990-01-01

    In FY 1984 the Medicare program implemented a new method of reimbursing hospitals for inpatient services, the Prospective Payment System (PPS). Under this system, hospitals are paid a predetermined amount per Medicare discharge, which varies according to certain patient and hospital characteristics. This article investigates the presence of systematic biases and other potential imperfections in the PPS reimbursement methodology as revealed by its effects on Medicare operating ratios. The study covers the first three years of the PPS (approximately 1984-1986) and is based on hospital data from the Medicare cost reports and other related sources. Regression techniques were applied to these data to determine how Medicare operating ratios were affected by specific aspects of the reimbursement methodology. Several possible imbalances were detected. The potential undercompensation relating to these can be harmful to certain classes of hospitals and to the Medicare populations that they serve. PMID:2109738

  1. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Directory of Open Access Journals (Sweden)

    Vatutin Eduard

    2017-12-01

    Full Text Available The article analyzes the effectiveness of heuristic methods with limited depth-first search techniques for obtaining decisions in the test problem of finding the shortest path in a graph. It briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and the depth of the analyzed subtree used to solve the problem. The methodology for comparing experimental data to estimate the quality of solutions, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs using the BOINC platform, is considered. A description of the experimental results is also given, identifying the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective in the selected problem and significantly inferior in solution quality to the ant colony optimization method and its modification with combinatorial returns.

  2. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Science.gov (United States)

    Vatutin, Eduard

    2017-12-01

    The article analyzes the effectiveness of heuristic methods with limited depth-first search techniques for obtaining decisions in the test problem of finding the shortest path in a graph. It briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and the depth of the analyzed subtree used to solve the problem. The methodology for comparing experimental data to estimate the quality of solutions, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs using the BOINC platform, is considered. A description of the experimental results is also given, identifying the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the strength of the constraints. It is shown that the considered pair of methods is ineffective in the selected problem and significantly inferior in solution quality to the ant colony optimization method and its modification with combinatorial returns.
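A minimal interpretation of the two limits named in the abstract (the number of branches kept per node and the depth of the analyzed subtree) might look like the sketch below, compared against exact Dijkstra on a toy graph. The graph and the limit values are illustrative assumptions, not material from the article.

```python
import heapq

def dijkstra(adj, s, t):
    """Exact shortest-path cost, for reference."""
    dist = {s: 0.0}
    pq = [(0.0, s)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == t:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

def limited_dfs(adj, u, t, depth, branches):
    """Heuristic cost: depth-first search keeping only the `branches`
    cheapest outgoing edges per node and cutting off at `depth` levels --
    one reading of the two limits described in the abstract."""
    if u == t:
        return 0.0
    if depth == 0:
        return float("inf")
    best = float("inf")
    for v, w in sorted(adj.get(u, []), key=lambda e: e[1])[:branches]:
        best = min(best, w + limited_dfs(adj, v, t, depth - 1, branches))
    return best

# Toy graph where the locally cheapest edge 0->1 is a trap:
adj = {0: [(1, 1), (2, 2)], 1: [(3, 10)], 2: [(3, 1)]}
exact = dijkstra(adj, 0, 3)                            # optimal: 0->2->3 = 3
narrow = limited_dfs(adj, 0, 3, depth=3, branches=1)   # forced through 0->1
wide = limited_dfs(adj, 0, 3, depth=3, branches=2)     # recovers the optimum
print(exact, narrow, wide)
```

The toy run illustrates the article's point: with too strict a branch limit the heuristic cost degrades sharply, while a looser limit recovers the exact answer at higher search cost.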

  3. A comparison of partial order technique with three methods of multi-criteria analysis for ranking of chemical substances.

    Science.gov (United States)

    Lerche, Dorte; Brüggemann, Rainer; Sørensen, Peter; Carlsen, Lars; Nielsen, Ole John

    2002-01-01

    An alternative to the often cumbersome and time-consuming risk assessment of chemical substances could be more reliable and advanced priority-setting methods. An elaboration of the simple scoring methods is provided by the Hasse Diagram Technique (HDT) and/or Multi-Criteria Analysis (MCA). The present study provides an in-depth evaluation of HDT relative to three MCA techniques. The new and main methodological step in the comparison is the use of probability concepts based on mathematical tools such as linear extensions of partially ordered sets and Monte Carlo simulations. A data set consisting of 12 High Production Volume Chemicals (HPVCs) is used for illustration. A guiding premise of this investigation is that the need for external input (often subjective weightings of criteria) should be minimized and that transparency should be maximized in any multi-criteria prioritisation. The study illustrates that the Hasse Diagram Technique needs the least external input, is the most transparent, and is the least subjective. However, HDT has some weaknesses if there are criteria which exclude each other; then weighting is needed. Multi-Criteria Analysis (i.e. the utility function approach, PROMETHEE, and concordance analysis) can deal with such mutual exclusions because their formalisms for quantifying preferences allow participation, e.g. weighting of criteria. Consequently, MCA involves more subjectivity and loses transparency. The recommendation arising from this study is to run HDT as the first step in decision making and possibly one of the MCA algorithms as the second step.
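The partial-order core of the Hasse Diagram Technique, which ranks one substance above another only when it scores at least as high on every criterion and strictly higher on at least one, can be sketched as follows. The substances and their (persistence, bioaccumulation, toxicity) scores are hypothetical, not the 12 HPVCs of the study.

```python
def dominated_by(a, b):
    """True if b >= a on every criterion and > on at least one
    (higher score = higher hazard), i.e. a < b in the partial order."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical scores (persistence, bioaccumulation, toxicity) for five substances
chems = {"A": (3, 2, 5), "B": (1, 1, 2), "C": (4, 4, 5),
         "D": (2, 3, 1), "E": (1, 2, 2)}

# All ordered pairs (p, q) with p strictly below q in the partial order
below = {(p, q) for p in chems for q in chems
         if p != q and dominated_by(chems[p], chems[q])}
# Maximal elements: nothing dominates them -> top-priority candidates
maximal = sorted(p for p in chems if not any((p, q) in below for q in chems))
# Incomparable pairs: neither dominates the other (no ranking without weighting)
incomparable = sorted((p, q) for p in chems for q in chems
                      if p < q and (p, q) not in below and (q, p) not in below)
print("maximal elements (top priority):", maximal)
print("incomparable pairs:", incomparable)
```

The incomparable pairs are exactly where HDT stops and MCA methods would introduce subjective weights; HDT itself needs no external input to produce the partial ranking.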

  4. Applying Mixed Methods Techniques in Strategic Planning

    Science.gov (United States)

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  5. Combining heuristic and statistical techniques in landslide hazard assessments

    Science.gov (United States)

    Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni

    2014-05-01

    As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that the highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
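
    The abstract only says the combination rule is "based on a normalization of the results of each method". One minimal reading of that (an assumption on our part, not the authors' published rule) is min-max normalization of each method's per-cell scores followed by averaging:

```python
def normalize(scores):
    # Min-max rescale one method's scores to [0, 1].
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def combine(methods):
    # methods: one score list per technique (heuristic, landslide
    # index, weights of evidence), aligned cell by cell.
    norm = [normalize(m) for m in methods]
    n = len(methods)
    return [sum(col) / n for col in zip(*norm)]
```

    Normalizing first keeps a method with a larger numeric range from dominating the combined hazard map.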

  6. Practical implementation of a methodology for digital images authentication using forensics techniques

    OpenAIRE

    Francisco Rodríguez-Santos; Guillermo Delgado-Gutierréz; Leonardo Palacios-Luengas; Rubén Vázquez Medina

    2015-01-01

    This work presents a forensics analysis methodology implemented to detect modifications in JPEG digital images by analyzing the image’s metadata, thumbnail, camera traces and compression signatures. Best practices related with digital evidence and forensics analysis are considered to determine if the technical attributes and the qualities of an image are consistent with each other. This methodology is defined according to the recommendations of the Good Practice Guide for Computer-Based Elect...

  7. New design methods for computer aided architectural design methodology teaching

    NARCIS (Netherlands)

    Achten, H.H.

    2003-01-01

    Architects and architectural students are exploring new ways of design using Computer Aided Architectural Design software. This exploration is seldom backed up from a design methodological viewpoint. In this paper, a design methodological framework for reflection on innovative design processes by

  8. Teaching Research Methodology Using a Project-Based Three Course Sequence Critical Reflections on Practice

    Science.gov (United States)

    Braguglia, Kay H.; Jackson, Kanata A.

    2012-01-01

    This article presents a reflective analysis of teaching research methodology through a three course sequence using a project-based approach. The authors reflect critically on their experiences in teaching research methods courses in an undergraduate business management program. The introduction of a range of specific techniques including student…

  9. Meta-Analytical Studies in Transport Economics. Methodology and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Brons, M.R.E.

    2006-05-18

    Vast increases in the external costs of transport in the late twentieth century have caused national and international governmental bodies to worry about the sustainability of their transport systems. In this thesis we use meta-analysis as a research method to study various topics in transport economics that are relevant for sustainable transport policymaking. Meta-analysis is a research methodology that is based on the quantitative summarisation of a body of previously documented empirical evidence. In several fields of economics, meta-analysis has become a well-accepted research tool. Despite the appeal of the meta-analytical approach, there are methodological difficulties that need to be acknowledged. We study a specific methodological problem which is common in meta-analysis in economics, viz., within-study dependence caused by multiple sampling techniques. By means of Monte Carlo analysis we investigate the effect of such dependence on the performance of various multivariate estimators. In the applied part of the thesis we use and develop meta-analytical techniques to study the empirical variation in indicators of the price sensitivity of demand for aviation transport, the price sensitivity of demand for gasoline, the efficiency of urban public transport and the valuation of the external costs of noise from rail transport. We focus on the estimation of mean values for these indicators and on the identification of the impact of conditioning factors.
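
    The basic quantitative summarisation step behind any such meta-analysis (not the thesis's multivariate estimators, which address within-study dependence) is inverse-variance pooling of study-level estimates:

```python
def fixed_effect_mean(estimates, variances):
    # Fixed-effect meta-analysis: weight each study estimate by the
    # inverse of its sampling variance; return the pooled mean and
    # the variance of the pooled estimate.
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    mean = sum(w * e for w, e in zip(weights, estimates)) / total
    return mean, 1.0 / total
```

    Precise studies (small variance) dominate the pooled mean, and the pooled variance shrinks as evidence accumulates; this is the baseline against which the dependence problems studied in the thesis arise.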

  10. Radiological Characterization Methodology for INEEL-Stored Remote-Handled Transuranic (RH TRU) Waste from Argonne National Laboratory-East

    International Nuclear Information System (INIS)

    Kuan, P.; Bhatt, R.N.

    2003-01-01

    An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK of the composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor physics principles are used to obtain these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-based characterization methodology offers a superior alternative to traditional waste assay techniques. Using the methodology, it is shown that the radiological parameters of a test batch of ANL-E waste are well within the proposed WIPP Waste Acceptance Criteria limits.

  11. Evaluation of the effectiveness of the three-dimensional residual stresses method based on the eigenstrain methodology via x-ray measurements

    International Nuclear Information System (INIS)

    Ogawa, Masaru; Ishii, Takehiro; Furusako, Seiji

    2015-01-01

    In order to prevent fractures caused by fatigue or stress corrosion cracking in welded structures, it is important to predict crack propagation for cracks observed during in-service inspections. However, it is difficult to evaluate three-dimensional welding residual stresses non-destructively. Today, X-ray diffraction makes it possible to measure residual stresses, but only at the surface. Neutron diffraction can measure welding residual stresses non-destructively even in the thickness direction, but it is available only in special irradiation facilities, so it cannot be used as an on-site measurement technique. As a non-destructive method for evaluating three-dimensional welding residual stresses based on the eigenstrain methodology, the bead flush method has been proposed. In this method, three-dimensional welding residual stresses are calculated by an elastic FEM (Finite Element Method) analysis from eigenstrain distributions, which are estimated by an inverse analysis from the strains released at strain gauges when the weld reinforcement is removed. Since the removal of the excess metal contributes to the inhibition of crack initiation, the bead flush method is essentially a non-destructive technique. However, the estimation accuracy of this method becomes relatively poor when processing strains are added on the machined surface. The first author has developed the bead flush method so as to be free from the influence of the processing strains: eigenstrains are estimated not from released strains but from residual strains measured on the surface by X-ray diffraction. In this study, welding residual stresses on the bottom surface of an actual welded plate are estimated from elastic strains measured on the top surface using this method. To evaluate the estimation accuracy, the estimated residual stresses on the bottom surface are compared with residual stresses measured by X-ray diffraction.
Here, eigenstrain distributions not only in the welding
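
    The inverse-analysis step of an eigenstrain approach (estimating eigenstrain coefficients from measured strains through a linear elastic response matrix) can be sketched generically. The matrix A, the basis size and the ridge term are illustrative assumptions, not the authors' formulation; in practice A would come from unit-eigenstrain FEM runs.

```python
def estimate_eigenstrain(A, measured, ridge=0.0):
    # A[i][k]: strain at measurement point i caused by a unit value
    # of eigenstrain basis function k; measured: observed strains.
    # Solve the (optionally ridge-regularized) normal equations
    # (A^T A + ridge*I) e = A^T y by Gaussian elimination.
    m, n = len(A), len(A[0])
    lhs = [[sum(A[i][r] * A[i][c] for i in range(m)) + (ridge if r == c else 0.0)
            for c in range(n)] for r in range(n)]
    rhs = [sum(A[i][r] * measured[i] for i in range(m)) for r in range(n)]
    for col in range(n):  # forward elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(lhs[r][col]))
        lhs[col], lhs[piv] = lhs[piv], lhs[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            f = lhs[r][col] / lhs[col][col]
            for c in range(col, n):
                lhs[r][c] -= f * lhs[col][c]
            rhs[r] -= f * rhs[col]
    e = [0.0] * n
    for r in range(n - 1, -1, -1):  # back substitution
        e[r] = (rhs[r] - sum(lhs[r][c] * e[c] for c in range(r + 1, n))) / lhs[r][r]
    return e
```

    Once the eigenstrain coefficients are recovered, the full residual stress field follows from a forward elastic analysis, which is how the bottom-surface stresses can be inferred from top-surface measurements.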

  12. Emission computed tomography: methodology and applications

    International Nuclear Information System (INIS)

    Reivich, M.; Alavi, A.; Greenberg, J.; Fowler, J.; Christman, D.; Rosenquist, A.; Rintelmann, W.; Hand, P.; MacGregor, R.; Wolf, A.

    1980-01-01

    A technique for the determination of local cerebral glucose metabolism using positron emission computed tomography is described as an example of the development and use of this methodology for the study of these parameters in man. The method for the determination of local cerebral glucose metabolism utilizes ¹⁸F-2-fluoro-2-deoxyglucose ([¹⁸F]-FDG). In this method, [¹⁸F]-FDG is used as a tracer for the exchange of glucose between plasma and brain and for its phosphorylation by hexokinase in the tissue. The labelled product of metabolism, [¹⁸F]-FDG phosphate, is essentially trapped in the tissue over the time course of the measurement. The studies demonstrate the potential usefulness of emission computed tomography for the measurement of various biochemical and physiological parameters in man. (Auth.)

  13. Methods and experimental techniques in computer engineering

    CERN Document Server

    Schiaffonati, Viola

    2014-01-01

    Computing and science reveal a synergic relationship. On the one hand, it is widely evident that computing plays an important role in the scientific endeavor. On the other hand, the role of scientific method in computing is getting increasingly important, especially in providing ways to experimentally evaluate the properties of complex computing systems. This book critically presents these issues from a unitary conceptual and methodological perspective by addressing specific case studies at the intersection between computing and science. The book originates from, and collects the experience of, a course for PhD students in Information Engineering held at the Politecnico di Milano. Following the structure of the course, the book features contributions from some researchers who are working at the intersection between computing and science.

  14. An Investigation of Science Teachers’ Teaching Methods and Techniques: Amasya Case

    Directory of Open Access Journals (Sweden)

    Orhan KARAMUSTAFAOĞLU

    2014-10-01

    Full Text Available The purpose of this study is to determine the methods and techniques science teachers mostly employ in their classrooms. To collect data, the researchers administered a survey to 60 science teachers and randomly selected 6 of them to observe in real classroom situations. Furthermore, the researchers invited 154 students taught by the selected 6 teachers for focus group interviews. After analyzing the collected data, the researchers found that teachers in this study (1) were more likely to use the narrative method, (2) supported their teaching with question-and-answer, demonstration, case study, and problem solving methods and techniques, and (3) rarely employed student-centered discussion, laboratory practice, role playing and project-based learning methods in their classrooms. Consequently, there exist some differences between theory and practice regarding the teaching methods and techniques of the teachers in this study.

  15. Methodological peculiarities of the Braun-Blanquet method used for marine bottom vegetation classification

    Directory of Open Access Journals (Sweden)

    AFANASYEV Dmitry F.

    2012-09-01

    Full Text Available The features of applying the Braun-Blanquet method to the classification of the bottom phytocenoses of the Black and Azov Seas are discussed. Special attention is given to the following methodological questions: the duration of observations necessary for revealing associations, the size of the relevé area, the features of geobotanical underwater exploration technology, the description of bottom communities with epiphytes, and the peculiarities of the syntaxonomic analysis of bottom vegetation.

  16. Methodology of shooting training using modern IT techniques

    Science.gov (United States)

    Gudzbeler, Grzegorz; Struniawski, Jarosław

    2017-08-01

    Mastering, improving, shaping and preserving the skills of safe, efficient and effective use of a firearm requires an appropriate methodology of conducting shooting training. However, the reality of police training does not usually allow for intensive shooting training with live ammunition. An alternative solution is the use of modern training technologies. An example of this is the "Virtual system of improvement tactics of intervention services responsible for security and shooting training." Introducing the simulator into police training will enable complete staff preparation for their tasks, creating a potential of knowledge and experience in many areas far exceeding the capabilities of conventional training.

  17. Measurements of Gluconeogenesis and Glycogenolysis: A Methodological Review.

    Science.gov (United States)

    Chung, Stephanie T; Chacko, Shaji K; Sunehag, Agneta L; Haymond, Morey W

    2015-12-01

    Gluconeogenesis is a complex metabolic process that involves multiple enzymatic steps regulated by myriad factors, including substrate concentrations, the redox state, activation and inhibition of specific enzyme steps, and hormonal modulation. At present, the most widely accepted technique to determine gluconeogenesis is by measuring the incorporation of deuterium from the body water pool into newly formed glucose. However, several techniques using radioactive and stable-labeled isotopes have been used to quantitate the contribution and regulation of gluconeogenesis in humans. Each method has its advantages, methodological assumptions, and set of propagated errors. In this review, we examine the strengths and weaknesses of the most commonly used stable isotope methods to measure gluconeogenesis in vivo. We discuss the advantages and limitations of each method and summarize the applicability of these measurements in understanding normal and pathophysiological conditions. © 2015 by the American Diabetes Association. Readers may use this article as long as the work is properly cited, the use is educational and not for profit, and the work is not altered.

  18. Methodology for the design of the method of siliceous sandstones operation using special software

    Directory of Open Access Journals (Sweden)

    Luis Ángel Lara-González

    2014-12-01

    Full Text Available The methodology used for designing a method of sandstone exploitation by descending stepped benches, using specialized software tools, is reported. The data analyzed were collected in the field for operating license 14816 in Melgar, Tolima. The characterization of the rock mass was based on physical and mechanical tests performed on cylindrical specimens in order to obtain the maximum strength and the elastic modulus of the rock. The direction and dip of the sandstone package were determined by using stereographic projection with the DIPS® software, and the safety factor of the slope with the established benches was obtained with SLIDE®. The benches are 8 meters high and 8 meters wide with a tilt angle of 60°, which generated a safety factor of 2.1. The design of the mining method was carried out with GEOVIA SURPAC®, with an early stage of development ascending to level 11 of the exploitation, to then start mining in descending order to control the stability of the slopes. The results obtained allow proposing a general methodology for the development of projects to optimize the process of evaluation and selection of a mining method by using specialized design tools.

  19. Methodologies and applications for critical infrastructure protection: State-of-the-art

    International Nuclear Information System (INIS)

    Yusta, Jose M.; Correa, Gabriel J.; Lacal-Arantegui, Roberto

    2011-01-01

    This work provides an update of the state-of-the-art on energy security relating to critical infrastructure protection. For this purpose, this survey is based upon the conceptual view of OECD countries, and specifically in accordance with EU Directive 114/08/EC on the identification and designation of European critical infrastructures, and on the 2009 US National Infrastructure Protection Plan. The review discusses the different definitions of energy security, critical infrastructure and key resources, and shows some of the experiences in countries considered international references on the subject, including some information-sharing issues. In addition, the paper carries out a complete review of current methodologies, software applications and modelling techniques around critical infrastructure protection in accordance with their functionality in a risk management framework. The study of threats and vulnerabilities in critical infrastructure systems shows two important trends in methodologies and modelling. The first trend relates to the identification of methods, techniques, tools and diagrams to describe the current state of infrastructure. The other trend captures the dynamic behaviour of the infrastructure systems by means of simulation techniques including system dynamics, Monte Carlo simulation, multi-agent systems, etc. - Highlights: → We examine critical infrastructure protection experiences, systems and applications. → Some international experiences are reviewed, including the EU EPCIP Plan and the US NIPP programme. → We discuss current methodologies and applications on critical infrastructure protection, with emphasis on electric networks.

  20. Protein engineering techniques gateways to synthetic protein universe

    CERN Document Server

    Poluri, Krishna Mohan

    2017-01-01

    This brief provides a broad overview of protein-engineering research, offering a glimpse of the most common experimental methods. It also presents various computational programs with applications that are widely used in directed evolution, computational and de novo protein design. Further, it sheds light on the advantages and pitfalls of existing methodologies and future perspectives of protein engineering techniques.

  1. A New Simulation Technique for Study of Collisionless Shocks: Self-Adaptive Simulations

    International Nuclear Information System (INIS)

    Karimabadi, H.; Omelchenko, Y.; Driscoll, J.; Krauss-Varban, D.; Fujimoto, R.; Perumalla, K.

    2005-01-01

    The traditional technique for simulating physical systems modeled by partial differential equations is by means of time-stepping methodology where the state of the system is updated at regular discrete time intervals. This method has inherent inefficiencies. In contrast to this methodology, we have developed a new asynchronous type of simulation based on a discrete-event-driven (as opposed to time-driven) approach, where the simulation state is updated on a 'need-to-be-done-only' basis. Here we report on this new technique, show an example of particle acceleration in a fast magnetosonic shockwave, and briefly discuss additional issues that we are addressing concerning algorithm development and parallel execution
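
    The event-driven idea (the state advances only when an event is due, not at fixed time steps) can be illustrated with a generic event queue; this is a toy sketch, not the authors' plasma simulation, and all names are invented.

```python
import heapq

def run(events, handlers, until):
    # events: initial (time, kind) tuples; handlers map an event kind
    # to a function that, given the event time, returns follow-up
    # events. Unlike time-stepping, nothing is computed between events.
    pq = list(events)
    heapq.heapify(pq)
    log = []
    while pq and pq[0][0] <= until:
        t, kind = heapq.heappop(pq)
        log.append((t, kind))
        for nt, nk in handlers[kind](t):
            heapq.heappush(pq, (nt, nk))
    return log
```

    A particle far from the shock front would schedule its next update far in the future and consume no work meanwhile, which is the efficiency gain over updating every particle at every global time step.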

  2. Using a hybrid methodology of dasyametric mapping and data ...

    African Journals Online (AJOL)

    Using a hybrid methodology of dasyametric mapping and data interpolation techniques ... AFRICAN JOURNALS ONLINE (AJOL) · Journals · Advanced Search ... the value and accuracy of the developed methodology is that of the 2011 census ...

  3. Clinical trial methodology

    National Research Council Canada - National Science Library

    Peace, Karl E; Chen, Ding-Geng

    2011-01-01

    "Now viewed as its own scientific discipline, clinical trial methodology encompasses the methods required for the protection of participants in a clinical trial and the methods necessary to provide...

  4. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available The system development methodologies that have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches, each of which has its limitations, specific interests, coverage, etc. In this paper, we perform a comparative study of those methodologies which are popular and commonly used in banking and commercial environments. We try to determine the objectives, scope, tools and other features of the methodologies, as well as how, and to what extent, they incorporate facilities such as project management, cost-benefit analysis and documentation. One of the most important aspects of our study was the question of how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The system helps analysts and designers choose suitable tools and obtain guidelines on what to do in a particular situation. The prototype was tested during the development of software for an ATM (Automated Teller Machine) by selecting and applying the SASD methodology during software development. This resulted in a high-quality and well-documented software system.

  5. From systems biology to dynamical neuropharmacology: proposal for a new methodology.

    Science.gov (United States)

    Erdi, P; Kiss, T; Tóth, J; Ujfalussy, B; Zalányi, L

    2006-07-01

    The concepts and methods of systems biology are extended to neuropharmacology in order to test and design drugs for the treatment of neurological and psychiatric disorders. Computational modelling that integrates compartmental neural modelling techniques with detailed kinetic descriptions of the pharmacological modulation of transmitter-receptor interaction is offered as a method to test the electrophysiological and behavioural effects of putative drugs. Moreover, an inverse method is suggested for controlling a neural system so as to realise a prescribed temporal pattern. In particular, as an application of the proposed new methodology, a computational platform is offered to analyse the generation and pharmacological modulation of the theta rhythm related to anxiety.

  6. Application of the Delphi technique in healthcare maintenance.

    Science.gov (United States)

    Njuangang, Stanley; Liyanage, Champika; Akintoye, Akintola

    2017-10-09

    Purpose The purpose of this paper is to examine the research design, issues and considerations in the application of the Delphi technique to identify, refine and rate the critical success factors and performance measures in maintenance-associated infections. Design/methodology/approach An in-depth literature review, through the application of open and axial coding, was used to formulate the interview and research questions. These were used to conduct an exploratory case study of two healthcare maintenance managers, randomly selected from two National Health Service Foundation Trusts in England. The results of the exploratory case study provided the rationale for the application of the Delphi technique in this research. The different processes in the application of the Delphi technique in healthcare research are examined thoroughly. Findings This research demonstrates the need to apply and integrate different research methods to enhance the validity of the Delphi technique. The rationale for the application of the Delphi technique in this research is that some healthcare maintenance managers lack knowledge of the basic infection control (IC) principles needed to make hospitals safe for patient care. The result of the first round of the Delphi exercise is a useful contribution in its own right. It identified a number of salient issues and differences in the opinions of the Delphi participants, most noticeably between healthcare maintenance managers and members of the infection control team. It also resulted in useful suggestions and comments to improve the quality and presentation of the second- and third-round Delphi instruments. Practical implications This research provides a research methodology that can be adopted by researchers investigating new and emerging issues in the healthcare sector. As this research demonstrates, the Delphi technique is relevant in soliciting expert knowledge and opinion to identify performance measures to control maintenance-associated infections in

  7. Resection methodology for PSP data processing: Recent ...

    Indian Academy of Sciences (India)


    Abstract. PSP data processing, which primarily involves image alignment and image analysis, is a crucial element in obtaining accurate PSP results. There are two broad approaches to image alignment: the algebraic transformation technique, often called the image-warping technique, and the resection methodology, which uses ...

  8. Selecting a Sustainable Disinfection Technique for Wastewater Reuse Projects

    Directory of Open Access Journals (Sweden)

    Jorge Curiel-Esparza

    2014-09-01

    Full Text Available This paper presents an application of the Analytical Hierarchy Process (AHP integrating a Delphi process for selecting the best sustainable disinfection technique for wastewater reuse projects. The proposed methodology provides project managers with a tool to evaluate problems with multiple criteria and multiple alternatives which involve non-commensurable decision criteria, with expert opinions playing a major role in the selection of these treatment technologies. Five disinfection techniques for wastewater reuse have been evaluated against each of nine criteria weighted according to the opinions of the consulted experts. Finally, the VIKOR method has been applied to determine a compromise solution and to establish the stability of the results. Therefore, the expert system proposed to select the optimal disinfection alternative is a hybrid method combining the AHP with the Delphi method and the VIKOR technique, which is shown to be appropriate in realistic scenarios where multiple stakeholders are involved in the selection of a sustainable disinfection technique for wastewater reuse projects.
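
    The final VIKOR step can be sketched with the standard Q index. This is a generic textbook form, not the paper's data: benefit criteria are assumed (higher is better), each criterion must have distinct best and worst values, and the weights here are toy values rather than the nine expert-derived criteria of the study.

```python
def vikor(matrix, weights, v=0.5):
    # matrix[i][j]: score of alternative i on benefit criterion j.
    # S = weighted group utility, R = worst individual regret,
    # Q = compromise index (lower Q = closer to the compromise).
    ncrit = len(weights)
    best = [max(row[j] for row in matrix) for j in range(ncrit)]
    worst = [min(row[j] for row in matrix) for j in range(ncrit)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (best[j] - row[j]) / (best[j] - worst[j])
                 for j in range(ncrit)]
        S.append(sum(terms))
        R.append(max(terms))
    s_lo, s_hi, r_lo, r_hi = min(S), max(S), min(R), max(R)
    Q = [v * (S[i] - s_lo) / (s_hi - s_lo)
         + (1 - v) * (R[i] - r_lo) / (r_hi - r_lo)
         for i in range(len(matrix))]
    return Q
```

    The parameter v weights group utility against individual regret; v = 0.5 is the usual consensus setting.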

  9. Methodological concerns for determining power output in the jump squat.

    Science.gov (United States)

    Cormie, Prue; Deane, Russell; McBride, Jeffrey M

    2007-05-01

    The purpose of this study was to investigate the validity of power measurement techniques during the jump squat (JS) utilizing various combinations of a force plate and linear position transducer (LPT) devices. Nine men with at least 6 months of prior resistance training experience participated in this acute investigation. One repetition maximums (1RM) in the squat were determined, followed by JS testing under 2 loading conditions (30% of 1RM [JS30] and 90% of 1RM [JS90]). Three different techniques were used simultaneously in data collection: (a) 1 linear position transducer (1-LPT); (b) 1 linear position transducer and a force plate (1-LPT + FP); and (c) 2 linear position transducers and a force plate (2-LPT + FP). Vertical velocity-, force-, and power-time curves were calculated for each lift using these methodologies and were compared. Peak force and peak power were overestimated by 1-LPT in both JS30 and JS90 compared with 2-LPT + FP and 1-LPT + FP. Power output in the jump squat thus varies according to the measurement technique utilized; the 1-LPT methodology is not a valid means of determining power output in the jump squat. Furthermore, the 1-LPT + FP method may not accurately represent power output in free-weight movements that involve a significant amount of horizontal motion.
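
    The combined LPT + force-plate computation reduces to power = force × velocity, with velocity differentiated from the transducer's displacement record. A minimal sketch (central differences; the sampling scheme and names are illustrative, not the study's processing pipeline):

```python
def power_series(positions, forces, dt):
    # positions: LPT vertical displacement samples (m),
    # forces: vertical ground reaction force samples (N), both at
    # interval dt (s). Central-difference velocity drops the two
    # endpoint samples, so forces are trimmed to match.
    velocity = [(positions[i + 1] - positions[i - 1]) / (2 * dt)
                for i in range(1, len(positions) - 1)]
    return [f * v for f, v in zip(forces[1:-1], velocity)]
```

    A 1-LPT-only system must instead infer force by differentiating position twice (adding system mass and gravity), which is where the overestimation reported above can enter.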

  10. THE MEASUREMENT METHODOLOGY IMPROVEMENT OF THE HORIZONTAL IRREGULARITIES IN PLAN

    Directory of Open Access Journals (Sweden)

    O. M. Patlasov

    2015-08-01

    Full Text Available Purpose. Across the track superstructure (TSS) there are structures for which the standard approach to deciding on their future operation is not entirely correct or acceptable. In particular, this concerns track sections that change their geometric parameters quite quickly: the radius of curvature, the angle of rotation, and the like. Examples of such portions of the TSS are crossovers, part of which lies within the so-called connecting part, which substantially changes curvature over a rather short length. Estimating the position of such a structure in plan on the basis of the existing technique (the difference in adjacent versines) is virtually impossible. It is therefore proposed to supplement and improve the methodology for assessing the position of a curve in plan based on the difference in adjacent versines. Methodology. The possible options for measuring horizontal curves in plan were analyzed. The most adequate method, one that does not conflict with established standards and takes the ease of measurement and calculation into account, was determined. Findings. Qualitative and quantitative verification of the proposed and existing methods showed very good agreement of the measurement results. This gives grounds to assert that the methodology can be recommended to track-facility workers for assessing horizontal irregularities in plan, not only in curves but also within the connecting part of switch crossovers. Originality. The existing method of evaluating the geometric position of curves in plan was improved; it does not create new regulations, and all results are evaluated against existing norms. Practical value. The proposed technique makes it possible, without creating a new regulatory framework, to build on the existing one while expanding the boundaries of its application.
This method can be used not only for ordinary curves
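
    The versine relations underlying both the existing and the improved assessment can be sketched generically; the mid-chord approximation R ≈ c²/(8v) and the neighbour-difference versine are textbook track-geometry relations, not the paper's specific algorithm, and the sample values below are invented.

```python
def radius_from_versine(chord, versine):
    # Circular-curve approximation: a chord of length c with
    # mid-chord offset (versine) v implies radius R ~= c^2 / (8 v).
    return chord ** 2 / (8 * versine)

def versines(offsets):
    # Versine at each interior point of a measured offset series:
    # the point's offset relative to the mean of its two neighbours.
    return [offsets[i] - (offsets[i - 1] + offsets[i + 1]) / 2
            for i in range(1, len(offsets) - 1)]
```

    In a constant-radius curve the versines are constant, so differences of adjacent versines flag irregularities; in a connecting part the curvature itself changes over a few chords, which is why the plain difference criterion breaks down there.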

  11. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
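
    The control-chart analysis described can be illustrated with a generic individuals (XmR) chart; the constant 2.66 is the standard XmR factor (3/d₂ with d₂ = 1.128 for moving ranges of two), and the data in the test are invented, not the trial's length-of-stay series.

```python
def xmr_limits(values):
    # Individuals chart: centre line from the mean, control limits
    # from the average moving range of successive observations.
    mean = sum(values) / len(values)
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    return mean - 2.66 * mr_bar, mean, mean + 2.66 * mr_bar

def signals(values):
    # Indices of points outside the control limits -- the simplest
    # "special cause" rule; run rules can detect shifts earlier.
    lo, _, hi = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lo or v > hi]
```

    Because the chart preserves the time order of the data, a genuine process change shows up as soon as enough post-change points accumulate, which is the mechanism behind the earlier detection reported above.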

  12. Comparative study on software development methodologies

    OpenAIRE

    Mihai Liviu DESPA

    2014-01-01

    This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager han...

  13. Isotopic dilution methods to determine the gross transformation rates of nitrogen, phosphorus, and sulfur in soil: a review of the theory, methodologies, and limitations

    International Nuclear Information System (INIS)

    Di, H. J.; Cameron, K. C.; McLaren, R. G.

    2000-01-01

    The rates at which nutrients are released to, and removed from, the mineral nutrient pool are important in regulating the nutrient supply to plants. These nutrient transformation rates need to be taken into account when developing nutrient management strategies for economical and sustainable production. A method that is gaining popularity for determining the gross transformation rates of nutrients in the soil is the isotopic dilution technique. The technique involves labelling a soil mineral nutrient pool, e.g. NH₄⁺, NO₃⁻, PO₄³⁻, or SO₄²⁻, and monitoring the changes with time of the size of the labelled nutrient pool and the excess tracer abundance (atom %, if a stable isotope tracer is used) or specific activity (if a radioisotope is used) in the nutrient pool. Because of the complexity of the concepts and procedures involved, the method has sometimes been used incorrectly, and results misinterpreted. This paper discusses the isotopic dilution technique, including the theoretical background, the methodologies to determine the gross flux rates of nitrogen, phosphorus, and sulfur, and the limitations of the technique. The assumptions, conceptual models, experimental procedures, and confounding factors are discussed. Possible effects on the results of factors such as the uniformity of tracer distribution in the soil, changes in soil moisture content, substrate concentration, and aeration status, and the duration of the experiment are also discussed. The influx and out-flux transformation rates derived from this technique often reflect the contributions of several processes acting simultaneously, and thus cannot always be attributed to a particular nutrient transformation process. Despite the various constraints and possible confounding factors, the technique is a valuable tool that can provide important quantitative information on nutrient dynamics in the soil-plant system. Copyright (2000) CSIRO Publishing
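    The pool-dilution calculation can be sketched as follows (an assumption on my part: the abstract states no equations, so the classic Kirkham–Bartholomew analytical solution is used here, which presumes constant rates and uniform tracer distribution over the interval; the input values are invented).

```python
# Gross nutrient transformation rates from an isotopic pool-dilution
# experiment, using the Kirkham-Bartholomew solution for a declining pool.
import math

def gross_rates(m0, mt, h0, ht, t):
    """m0, mt: pool sizes at time 0 and t; h0, ht: excess tracer abundance
    (atom % excess) at time 0 and t; t: elapsed time.
    Returns (gross influx, gross out-flux), e.g. gross mineralization and
    gross consumption for an NH4+ pool."""
    if math.isclose(m0, mt):
        raise ValueError("constant-pool case needs the alternative K-B equation")
    influx = (m0 - mt) / t * math.log(h0 / ht) / math.log(m0 / mt)
    outflux = influx - (mt - m0) / t   # mass balance closes the out-flux
    return influx, outflux

# Invented example: pool shrinks from 100 to 80 units over 10 days while the
# tracer is diluted from 5 to 2 atom % excess.
m, c = gross_rates(m0=100.0, mt=80.0, h0=5.0, ht=2.0, t=10.0)
```

    Note how the tracer dilution lets both gross rates be resolved even though the net pool change alone (here −2 units per day) would conflate them.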

  14. Biological indication in aquatic ecosystems. Biological indication in limnic and coastal ecosystems - fundamentals, techniques, methodology

    International Nuclear Information System (INIS)

    Gunkel, G.

    1994-01-01

    Biological methods of water quality evaluation today form an integral part of environmental monitoring and permit to continuously monitor the condition of aquatic ecosystems. They indicate both improvements in water quality following redevelopment measures, and the sometimes insidious deterioration of water quality. This book on biological indication in aquatic ecosystems is a compendium of measurement and evaluation techniques for limnic systems by means of biological parameters. At present, however, an intense discussion of biological evaluation techniques is going on, for one thing as a consequence of the German reunification and the need to unify evaluation techniques, and for another because of harmonizations within the European Community. (orig./EF) [de

  15. Convergence studies of deterministic methods for LWR explicit reflector methodology

    International Nuclear Information System (INIS)

    Canepa, S.; Hursin, M.; Ferroukhi, H.; Pautz, A.

    2013-01-01

    The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are produced a priori with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified as potentially constituting one of the main sources of error for core analyses of the Swiss operating LWRs, all of which are of GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is first to recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor CASMO-5 is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is conducted, and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)

  16. Selected methods of waste monitoring using modern analytical techniques

    International Nuclear Information System (INIS)

    Hlavacek, I.; Hlavackova, I.

    1993-11-01

    Issues of the inspection and control of bituminized and cemented waste are discussed, and some methods of their nondestructive testing are described. Attention is paid to the inspection techniques, non-nuclear spectral techniques in particular, as employed for quality control of the wastes, waste concentrates, spent waste leaching solutions, as well as for the examination of environmental samples (waters and soils) from the surroundings of nuclear power plants. Some leaching tests used abroad for this purpose and practical analyses by the ICP-AES technique are given by way of example. The ICP-MS technique, which is unavailable in the Czech Republic, is routinely employed abroad for alpha nuclide measurements; examples of such analyses are also given. The next topic discussed includes the monitoring of organic acids and complexants to determine the degree of their thermal decomposition during the bituminization of wastes on an industrial line. All of the methods and procedures highlighted can be used as technical support during the monitoring of radioactive waste properties in industrial conditions, in the chemical and radiochemical analyses of wastes and related matter, in the calibration of nondestructive testing instrumentation, in the monitoring of contamination of the surroundings of nuclear facilities, and in trace analysis. (author). 10 tabs., 1 fig., 14 refs

  17. Reflective Methodology: The Beginning Teacher

    Science.gov (United States)

    Templeton, Ronald K.; Siefert, Thomas E.

    1970-01-01

    Offers a variety of specific techniques which will help the beginning teacher to implement reflective methodology and create an inquiry-centered classroom atmosphere, at the same time meeting the many more pressing demands of first-year teaching. (JES)

  18. Mixing Methods in Organizational Ethics and Organizational Innovativeness Research : Three Approaches to Mixed Methods Analysis

    OpenAIRE

    Riivari, Elina

    2015-01-01

    This chapter discusses three categories of mixed methods analysis techniques: variable-oriented, case-oriented, and process/experience-oriented. All three categories combine qualitative and quantitative approaches to research methodology. The major differences among the categories are the focus of the study, the available analysis techniques, and the temporal aspect of the study. In variable-oriented analysis, the study focus is relationships between the research phenomena. In case-oriente...

  19. A methodology for producing small scale rural land use maps in semi-arid developing countries using orbital imagery

    Science.gov (United States)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. Results have shown that it is feasible to design a methodology that can provide suitable guidelines for operational production of small scale rural land use maps of semiarid developing regions from LANDSAT MSS imagery, using inexpensive and unsophisticated visual techniques. The suggested methodology provides immediate practical benefits to map makers attempting to produce land use maps in countries with limited budgets and equipment. Many preprocessing and interpretation techniques were considered, but rejected on the grounds that they were inappropriate mainly due to the high cost of imagery and/or equipment, or due to their inadequacy for use in operational projects in the developing countries. Suggested imagery and interpretation techniques, consisting of color composites and monocular magnification proved to be the simplest, fastest, and most versatile methods.

  20. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and to highlight the lessons learned in applying the two techniques, while providing high-level results. This includes the global benefits and deficiencies of the methods and practical problems encountered during the implementation of each technique.

  1. Prioritization methodology for chemical replacement

    Science.gov (United States)

    Cruit, Wendy; Goldberg, Ben; Schutzenhofer, Scott

    1995-01-01

    Since United States federal legislation banned the production of ozone-depleting chemicals (Class 1 and 2), the National Aeronautics and Space Administration (NASA) and industry have been required to find other chemicals and methods to replace these target chemicals. This project was initiated to develop a prioritization methodology suitable for assessing and ranking existing processes for replacement 'urgency.' The methodology was produced in the form of a workbook (NASA Technical Paper 3421). The final workbook contains two tools, one for evaluation and one for prioritization. The two tools are interconnected in that they were developed from one central theme - chemical replacement due to imposed laws and regulations. This workbook provides matrices, detailed explanations of how to use them, and a detailed methodology for prioritization of replacement technology. The main objective is to provide a GUIDELINE to help direct the research for replacement technology. The approach to prioritization called for a system which would result in a numerical rating for the chemicals and processes being assessed. A Quality Function Deployment (QFD) technique was used to determine numerical values corresponding to the concerns raised and their respective importance to the process. This workbook defines the approach and the application of the QFD matrix. This technique: (1) provides a standard database for technology that can be easily reviewed, and (2) provides a standard format for information when requesting resources for further research for chemical replacement technology. Originally, this workbook was to be used for Class 1 and Class 2 chemicals, but it was specifically designed to be flexible enough to be used for any chemical used in a process (if the chemical and/or process needs to be replaced).
The methodology consists of comparison matrices (and the smaller comparison components) which allow replacement technology
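    The QFD-style scoring the workbook describes can be made concrete with a minimal numerical sketch (the concern names, importance weights, and scores below are invented for illustration, not taken from the NASA workbook): each concern gets an importance weight, each process is scored against each concern, and the weighted sum yields a single replacement-urgency rating.

```python
# QFD-style prioritization sketch: importance-weighted scores summed into a
# single replacement-urgency number per process.  Concerns, weights (classic
# QFD 9/5/3 scale), and scores are illustrative only.

CONCERNS = {"regulatory pressure": 9, "worker exposure": 5, "process criticality": 3}

def urgency(scores):
    """scores: concern -> 1..5 rating for one chemical/process."""
    return sum(CONCERNS[c] * s for c, s in scores.items())

processes = {
    "vapor degreasing (CFC-113)": {"regulatory pressure": 5, "worker exposure": 4, "process criticality": 2},
    "foam blowing (HCFC-141b)":   {"regulatory pressure": 3, "worker exposure": 2, "process criticality": 4},
}
ranked = sorted(processes, key=lambda p: urgency(processes[p]), reverse=True)
```

    Because the output is a single comparable number per process, the matrix doubles as the "standard database" the workbook aims for.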

  2. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
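    The staged structure of such an analysis (generation, transport, impact, damage) can be sketched with a toy Monte Carlo simulation. To be clear, every distribution and number below is an invented placeholder, not a TURMIS model or input:

```python
# Toy Monte Carlo sketch of a staged missile analysis: sample a trajectory,
# test a strike criterion, then apply a conditional damage probability.
import random

def simulate(n_trials, seed=0):
    rng = random.Random(seed)
    hits = damages = 0
    for _ in range(n_trials):
        # Transport: ejection angle (degrees) with a crude strike criterion.
        angle = rng.uniform(0.0, 360.0)
        if 80.0 <= angle <= 100.0:        # target structure subtends 20 degrees
            hits += 1
            if rng.random() < 0.3:        # conditional damage probability
                damages += 1
    return hits / n_trials, damages / n_trials

p_strike, p_damage = simulate(100_000)
```

    Multiplying the simulated conditional probabilities by a missile-generation frequency would give the plant damage frequency; sensitivity analysis then amounts to rerunning the simulation with perturbed inputs.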

  3. Cochrane Qualitative and Implementation Methods Group guidance series-paper 3: methods for assessing methodological limitations, data extraction and synthesis, and confidence in synthesized qualitative findings.

    Science.gov (United States)

    Noyes, Jane; Booth, Andrew; Flemming, Kate; Garside, Ruth; Harden, Angela; Lewin, Simon; Pantoja, Tomas; Hannes, Karin; Cargo, Margaret; Thomas, James

    2018-05-01

    The Cochrane Qualitative and Implementation Methods Group develops and publishes guidance on the synthesis of qualitative and mixed-method implementation evidence. Choice of appropriate methodologies, methods, and tools is essential when developing a rigorous protocol and conducting the synthesis. Cochrane authors who conduct qualitative evidence syntheses have thus far used a small number of relatively simple methods to address similarly written questions. Cochrane has invested in methodological work to develop new tools and to encourage the production of exemplar reviews to show the value of more innovative methods that address a wider range of questions. In this paper, in the series, we report updated guidance on the selection of tools to assess methodological limitations in qualitative studies and methods to extract and synthesize qualitative evidence. We recommend application of Grades of Recommendation, Assessment, Development, and Evaluation-Confidence in the Evidence from Qualitative Reviews to assess confidence in qualitative synthesized findings. This guidance aims to support review authors to undertake a qualitative evidence synthesis that is intended to be integrated subsequently with the findings of one or more Cochrane reviews of the effects of similar interventions. The review of intervention effects may be undertaken concurrently with or separate to the qualitative evidence synthesis. We encourage further development through reflection and formal testing. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. A micro focus with macro impact: Exploration of initial abstraction coefficient ratio (λ) in Soil Conservation Curve Number (CN) methodology

    International Nuclear Information System (INIS)

    Ling, L; Yusop, Z

    2014-01-01

    Researchers began to cross-examine the United States Department of Agriculture (USDA) Soil Conservation Service (SCS) Curve Number (CN) methodology after the technique produced inconsistent results throughout the world. Field data from recent decades increasingly contradict the initial abstraction coefficient ratio value proposed by SCS in 1954. Physiographic conditions were identified as vital influencing factors to be considered under this methodology, and practitioners are encouraged to validate and derive region-specific relationships and to employ the method with caution
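    The quantity under scrutiny can be made concrete with the standard SCS-CN runoff equations (the standard formulation is shown; the paper's own regional λ values are not reproduced here): S = 25400/CN − 254 in mm, Ia = λS, and Q = (P − Ia)²/(P − Ia + S) for P > Ia, with λ = 0.2 being the 1954 SCS assumption.

```python
# SCS Curve Number runoff sketch showing where the initial abstraction
# coefficient ratio (lambda) enters.  SCS assumed lambda = 0.2 in 1954; the
# studies cited argue for validating lambda regionally instead.

def scs_runoff(p_mm: float, cn: float, lam: float = 0.2) -> float:
    """Direct runoff Q (mm) for rainfall P (mm) and curve number CN."""
    s = 25400.0 / cn - 254.0          # potential maximum retention (mm)
    ia = lam * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Same storm and CN, two lambda choices: the assumed ratio changes the answer.
q_020 = scs_runoff(100.0, 80.0, lam=0.2)
q_005 = scs_runoff(100.0, 80.0, lam=0.05)
```

    The comparison illustrates the "micro focus with macro impact" of the title: a change in λ alone shifts the predicted runoff for the same storm and watershed.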

  5. Evaluation of a Delphi technique based expert judgement method for LCA valuation - DELPHI II

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, Y.; Torkkeli, S. [VTT Chemical Technology, Espoo (Finland). Environmental Technology; Wilson, B. [Landbank Environmental Research and Consulting, London (United Kingdom)

    1999-07-01

    Because of the complexity and trade-offs between different points of the life cycles of the analysed systems, a method which measures the environmental damage caused by each intervention is needed in order to make a choice between the products. However, there is no commonly agreed methodology for this particular purpose. In most of the methods the valuation is implicitly or explicitly based on economic criteria. For various reasons, however, economically obtained criteria do not necessarily reflect ecological arguments correctly. Thus, there is a need for new, ecologically based valuation methods. One such approach is the expert judgement method, based on the Delphi technique, which rejects the economic basis in favour of the judgements of a group of environmental experts. However, it is not self evident that the expert judgement based environmental rating of interventions will be essentially more correct and certain than other methods. In this study the method was evaluated at different points of the procedure in order to obtain a picture of the quality of the indexes produced. The evaluation was based on an actual Delphi study made in 1995-1996 in Finland, Sweden and Norway. The main questions addressed were the significance of the results and the operational quality of the Delphi procedure. The results obtained by applying the expert method indexes were also compared with the results obtained with other valuation methods for the background life cycle inventory of the case study. Additional material included feedback data from panellists of the case study, collected with a questionnaire. The questionnaire data was analysed to identify major dimensions in the criteria for evaluating interventions and correlation of the final indexes of the Delphi I study with these dimensions. The rest of the questionnaire material was used to document panellists' opinions and experiences of the Delphi process, familiarity with the environmental impacts of various

  6. Evaluation of a Delphi technique based expert judgement method for LCA valuation - DELPHI II

    International Nuclear Information System (INIS)

    Virtanen, Y.; Torkkeli, S.

    1999-01-01

    Because of the complexity and trade-offs between different points of the life cycles of the analysed systems, a method which measures the environmental damage caused by each intervention is needed in order to make a choice between the products. However, there is no commonly agreed methodology for this particular purpose. In most of the methods the valuation is implicitly or explicitly based on economic criteria. For various reasons, however, economically obtained criteria do not necessarily reflect ecological arguments correctly. Thus, there is a need for new, ecologically based valuation methods. One such approach is the expert judgement method, based on the Delphi technique, which rejects the economic basis in favour of the judgements of a group of environmental experts. However, it is not self evident that the expert judgement based environmental rating of interventions will be essentially more correct and certain than other methods. In this study the method was evaluated at different points of the procedure in order to obtain a picture of the quality of the indexes produced. The evaluation was based on an actual Delphi study made in 1995-1996 in Finland, Sweden and Norway. The main questions addressed were the significance of the results and the operational quality of the Delphi procedure. The results obtained by applying the expert method indexes were also compared with the results obtained with other valuation methods for the background life cycle inventory of the case study. Additional material included feedback data from panellists of the case study, collected with a questionnaire. The questionnaire data was analysed to identify major dimensions in the criteria for evaluating interventions and correlation of the final indexes of the Delphi I study with these dimensions. The rest of the questionnaire material was used to document panellists' opinions and experiences of the Delphi process, familiarity with the environmental impacts of various interventions

  7. Photo-Elicitation and Visual Semiotics: A Unique Methodology for Studying Inclusion for Children with Disabilities

    Science.gov (United States)

    Stockall, Nancy

    2013-01-01

    The methodology in this paper discusses the use of photographs as an elicitation strategy that can reveal the thinking processes of participants in a qualitatively rich manner. Photo-elicitation techniques combined with a Peircean semiotic perspective offer a unique method for creating a frame of action for later participant analysis. Illustrative…

  8. Comparative study on software development methodologies

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2014-12-01

    Full Text Available This paper focuses on the current state of knowledge in the field of software development methodologies. It aims to set the stage for the formalization of a software development methodology dedicated to innovation orientated IT projects. The paper starts by depicting specific characteristics in software development project management. Managing software development projects involves techniques and skills that are proprietary to the IT industry. Also the software development project manager handles challenges and risks that are predominantly encountered in business and research areas that involve state of the art technology. Conventional software development stages are defined and briefly described. Development stages are the building blocks of any software development methodology so it is important to properly research this aspect. Current software development methodologies are presented. Development stages are defined for every showcased methodology. For each methodology a graphic representation is illustrated in order to better individualize its structure. Software development methodologies are compared by highlighting strengths and weaknesses from the stakeholder's point of view. Conclusions are formulated and a research direction aimed at formalizing a software development methodology dedicated to innovation orientated IT projects is enunciated.

  9. The need for standardization of methodology and components in commercial radioimmunoassay kits

    International Nuclear Information System (INIS)

    Wood, W.G.; Marschner, I.; Scriba, P.C.

    1978-01-01

    The problems arising from the increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. These problems differ according to the substance under test. The quality of individual reagents is often good, but the methodology is often not optimal and may contain short-cuts which, although commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modifications in the methodology often lead to big improvements in sensitivity and precision. This has been demonstrated in three digoxin kits employing antibody-coated tube techniques and in four kits for thyrotropin (TSH) using different techniques. It has also been noted that with many quality-control sera imported from the USA no values are ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The study underlines the need to standardize kit components and assay methods to enable the results obtained by different laboratories with different kits to be compared. (author)

  10. Design Methodology - Design Synthesis

    DEFF Research Database (Denmark)

    Andreasen, Mogens Myrup

    2003-01-01

    Design Methodology is part of our practice and our knowledge about designing, and it has been strongly supported by the establishment and work of a design research community. The aim of this article is to broaden the reader's view of designing and Design Methodology. This is done by sketching the development of Design Methodology through time and sketching some important approaches and methods. The development is mainly forced by changing industrial conditions, by the growth of IT support for designing, but also by the growth of insight into designing created by design researchers. ABSTRACT Design Methodology shall be seen as our understanding of how to design; it is an early (emerging in the late 60ies) and original articulation of teachable and learnable methodics. The insight is based upon two sources: the nature of the designed artefacts and the nature of human designing. Today...

  11. Survey of Dynamic PSA Methodologies

    International Nuclear Information System (INIS)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung; Kim, Taewan

    2015-01-01

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned should be dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are great, it seems less interesting from the industrial and regulatory viewpoint. The authors expect this can contribute to better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems a backbone for most of the methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering
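    The discrete dynamic event tree (DDET) idea can be sketched as exhaustive branching over component states at successive branch times (a toy model; real DDET tools advance a plant physics simulator between branchings, whereas here the "physics" is just a failure count, and all probabilities are invented):

```python
# Toy discrete dynamic event tree: at each branching time a safety function
# either succeeds or fails; every branch combination is enumerated with its
# probability, preserving the order of events along each sequence.
from itertools import product

SUCCESS_PROBS = [0.9, 0.8, 0.95]   # illustrative per-branch success probabilities

def ddet_damage_probability():
    p_damage = 0.0
    for outcome in product([True, False], repeat=len(SUCCESS_PROBS)):
        p = 1.0
        for ok, ps in zip(outcome, SUCCESS_PROBS):
            p *= ps if ok else (1.0 - ps)
        if outcome.count(False) >= 2:   # toy core-damage criterion
            p_damage += p
    return p_damage
```

    Unlike a static ET, the branch probabilities here could be made functions of the simulated plant state at each branching time, which is what gives the dynamic approach its time- and condition-dependence.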

  12. Survey of Dynamic PSA Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Hansul; Kim, Hyeonmin; Heo, Gyunyoung [Kyung Hee University, Yongin (Korea, Republic of); Kim, Taewan [KEPCO International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2015-05-15

    Event Tree (ET)/Fault Tree (FT) analysis is a significant methodology in Probabilistic Safety Assessment (PSA) for Nuclear Power Plants (NPPs). The ET/FT methodology has the advantage that users can easily learn and model with it, and it enables better communication between engineers engaged in the same field. However, conventional methodologies have difficulty coping with dynamic behavior (e.g. operation mode changes or sequence-dependent failures) and with integrated situations of mechanical failure and human error. Meanwhile, new possibilities are emerging for improved PSA by virtue of the dramatic development of digital hardware, software, information technology, and data analysis. More specifically, the computing environment has greatly improved compared to the past, so we are able to conduct risk analysis with the large amount of data actually available. One method which can take the technological advantages aforementioned should be dynamic PSA, in which conventional ET/FT can have time- and condition-dependent behaviors in accident scenarios. In this paper, we investigated the various enabling techniques for dynamic PSA. Even though its history and academic achievements are great, it seems less interesting from the industrial and regulatory viewpoint. The authors expect this can contribute to better understanding of dynamic PSA in terms of algorithm, practice, and applicability. In this paper, an overview of dynamic PSA was conducted. Most of the methodologies share similar concepts. Among them, DDET seems a backbone for most of the methodologies, since it can be applied to large problems. The common characteristics sharing the concept of DDET are as follows: • Both deterministic and stochastic approaches • Improves the identification of PSA success criteria • Helps to limit detrimental effects of sequence binning (normally adopted in PSA) • Helps to avoid defining non-optimal success criteria that may distort the risk • Framework for comprehensively considering

  13. The evaluation framework for business process management methodologies

    Directory of Open Access Journals (Sweden)

    Sebastian Lahajnar

    2016-06-01

    Full Text Available In an intense competition in the global market, organisations seek to take advantage of all their internal and external potentials, advantages, and resources. It has been found that, in addition to competitive products and services, a good business also requires an effective management of business processes, which is the discipline of the business process management (BPM. The introduction of the BPM in the organisation requires a thoughtful selection of an appropriate methodological approach, since the latter will formalize activities, products, applications and other efforts of the organisation in this field. Despite many technology-driven solutions of software companies, recommendations of consulting companies, techniques, good practices and tools, the decision on what methodology to choose is anything but simple. The aim of this article is to simplify the adoption of such decisions by building a framework for the evaluation of BPM methodologies according to a qualitative multi-attribute decision-making method. The framework defines a hierarchical decision-making model, formalizes the decision-making process and thus contributes significantly to an independent, credible final decision that is the most appropriate for a specific organisation.
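    A minimal sketch of the hierarchical multi-attribute evaluation such a framework formalizes (the attribute tree, weights, and scores below are invented for illustration, not the framework's actual criteria): leaf attributes are scored, and inner nodes aggregate their children by weight into an overall suitability figure per BPM methodology.

```python
# Hierarchical weighted evaluation sketch: leaf attributes are scored 0..1,
# inner nodes aggregate their weighted children.  Two-level tree for brevity;
# all names and numbers are illustrative only.

TREE = {
    "method fit":   (0.5, {"process coverage": 0.6, "tool support": 0.4}),
    "organisation": (0.5, {"skills required": 0.3, "cost of adoption": 0.7}),
}

def evaluate(scores):
    """scores: leaf attribute -> score in [0, 1]; returns overall suitability."""
    total = 0.0
    for _, (weight, leaves) in TREE.items():
        total += weight * sum(lw * scores[leaf] for leaf, lw in leaves.items())
    return total

bpm_methodology_a = evaluate({"process coverage": 0.8, "tool support": 0.6,
                              "skills required": 0.5, "cost of adoption": 0.9})
```

    Qualitative methods in this family typically replace the weighted sums with utility rules over ordinal scales, but the hierarchical decomposition is the same.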

  14. Selecting a software development methodology. [of digital flight control systems

    Science.gov (United States)

    Jones, R. E.

    1981-01-01

    The state-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are documented.

  15. VIKOR Technique: A Systematic Review of the State of the Art Literature on Methodologies and Applications

    Directory of Open Access Journals (Sweden)

    Abbas Mardani

    2016-01-01

    Full Text Available The main objective of this paper is to present a systematic review of the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method in several application areas such as sustainability and renewable energy. This study reviewed a total of 176 papers, published from 2004 to 2015, from 83 high-ranking journals; most were related to operational research, management sciences, decision making, sustainability and renewable energy, and were extracted from the Web of Science and Scopus databases. Papers were classified into 15 main application areas. Furthermore, papers were categorized based on the nationalities of authors, dates of publication, techniques and methods, types of study, journal names and study purposes. The results of this study indicated that more papers on the VIKOR technique were published in 2013 than in any other year. In addition, 13 papers were published in the sustainability and renewable energy fields. Furthermore, the VIKOR and fuzzy VIKOR methods ranked first in use. Additionally, Expert Systems with Applications was the most significant journal in this study, with 27 publications on the topic. Finally, of the 22 nationalities that used the VIKOR technique, Taiwan ranked first.
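    The VIKOR computation itself is well documented: from the decision matrix one derives the group utility S, the individual regret R, and the compromise index Q. A minimal sketch for benefit-type criteria follows; the alternatives, weights, and scores are illustrative, not drawn from the review:

```python
def vikor(matrix, weights, v=0.5):
    """Rank alternatives with the VIKOR method (all criteria treated as
    benefit criteria). matrix[i][j] is the performance of alternative i on
    criterion j. Returns the compromise index Q per alternative
    (lower is better); v weights group utility against individual regret."""
    m, n = len(matrix), len(matrix[0])
    f_best = [max(row[j] for row in matrix) for j in range(n)]
    f_worst = [min(row[j] for row in matrix) for j in range(n)]
    S, R = [], []
    for row in matrix:
        terms = [weights[j] * (f_best[j] - row[j]) / (f_best[j] - f_worst[j])
                 for j in range(n)]
        S.append(sum(terms))   # group utility
        R.append(max(terms))   # individual regret
    s_best, s_worst = min(S), max(S)
    r_best, r_worst = min(R), max(R)
    return [v * (S[i] - s_best) / (s_worst - s_best)
            + (1 - v) * (R[i] - r_best) / (r_worst - r_best)
            for i in range(m)]

# Illustrative example: 3 alternatives, 2 benefit criteria, equal weights.
Q = vikor([[8, 7], [5, 5], [2, 3]], [0.5, 0.5])
best = Q.index(min(Q))   # alternative 0 dominates on both criteria
```

    A full application would also check VIKOR's two compromise-solution conditions (acceptable advantage and acceptable stability), omitted here for brevity.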

  16. Monte Carlo Techniques for the Comprehensive Modeling of Isotopic Inventories in Future Nuclear Systems and Fuel Cycles. Final Report

    International Nuclear Information System (INIS)

    Paul P.H. Wilson

    2005-01-01

    The development of Monte Carlo techniques for isotopic inventory analysis has been explored in order to facilitate the modeling of systems with flowing streams of material through varying neutron irradiation environments. This represents a novel application of Monte Carlo methods to a field that has traditionally relied on deterministic solutions to systems of first-order differential equations. The Monte Carlo techniques were based largely on the known modeling techniques of Monte Carlo radiation transport, but with important differences, particularly in the area of variance reduction and efficiency measurement. The software that was developed to implement and test these methods now provides a basis for validating approximate modeling techniques that are available to deterministic methodologies. The Monte Carlo methods have been shown to be effective in reproducing the solutions of simple problems that are possible using both stochastic and deterministic methods. The Monte Carlo methods are also effective for tracking flows of materials through complex systems including the ability to model removal of individual elements or isotopes in the system. Computational performance is best for flows that have characteristic times that are large fractions of the system lifetime. As the characteristic times become short, leading to thousands or millions of passes through the system, the computational performance drops significantly. Further research is underway to determine modeling techniques to improve performance within this range of problems. This report describes the technical development of Monte Carlo techniques for isotopic inventory analysis. The primary motivation for this solution methodology is the ability to model systems of flowing material being exposed to varying and stochastically varying radiation environments. 
The methodology was developed in three stages: analog methods which model each atom with true reaction probabilities (Section 2), non-analog methods
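    The analog stage mentioned at the end of the abstract — modelling each atom with its true reaction probabilities — can be sketched as a toy depletion problem and checked against the deterministic expectation. The atom count, per-step reaction probability, and step count below are illustrative:

```python
import random

def analog_depletion(n_atoms, p_react, n_steps, seed=1):
    """Analog Monte Carlo depletion: each atom undergoes a reaction with
    its true per-step probability; returns the surviving fraction after
    n_steps. No variance reduction is applied (fully analog)."""
    rng = random.Random(seed)
    survivors = 0
    for _ in range(n_atoms):
        alive = True
        for _ in range(n_steps):
            if rng.random() < p_react:
                alive = False
                break
        survivors += alive
    return survivors / n_atoms

# Check the stochastic answer against the deterministic one, (1 - p)^n.
frac = analog_depletion(n_atoms=20000, p_react=0.01, n_steps=50)
expected = (1.0 - 0.01) ** 50
```

    The non-analog methods the report goes on to develop replace these true probabilities with biased ones plus statistical weights, which is where the variance-reduction and efficiency questions discussed above arise.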

  17. PROBLEM SOLVING TECHNIQUES AS A PART OF IMPLEMENTATION OF SIX SIGMA METHODOLOGY IN TIRE PRODUCTION. CASE STUDY

    Directory of Open Access Journals (Sweden)

    Maciej WOJTASZAK

    2015-07-01

    Full Text Available Problem-solving methods are an indispensable part of the management and improvement of production. Over recent decades, with the development of industry, specific techniques have been implemented and refined by the leaders in this field, such as Toyota, GE and Motorola. The foundation of problem solving is to find the real root cause of the problem as soon as possible, to understand it, and to implement appropriate solutions that ensure the problem does not occur again. This paper provides an overview of methods and techniques used to solve problems in the manufacturing plant of Trelleborg Wheel Systems Sri Lanka, which produces pneumatic tires for light agricultural machinery. These techniques are implemented as part of the Lean Six Sigma program.

  18. Evaluation of safeguards procedures: a summary of a methodology

    International Nuclear Information System (INIS)

    Salisbury, J.D.; Savage, J.W.

    1979-01-01

    A methodology for the evaluation of safeguards procedures is described. As presently conceptualized, the methodology will consist of the following steps: (1) expansion of the general protection requirements that are contained in the NRC regulations into more detailed but still generic requirements for use at the working level; (2) development of techniques and formats for using the working-level requirements in an evaluation; (3) development of a technique for converting specific facility protection procedures into a format that will allow comparison with the working-level requirements; (4) development of an evaluation technique for comparing the facility protection procedures to determine if they meet the protection requirements

  19. Analysis Planning Methodology: For Thesis, Joint Applied Project, & MBA Research Reports

    OpenAIRE

    Naegle, Brad R.

    2010-01-01

    Acquisition Research Handbook Series Purpose: This guide provides the graduate student researcher—you—with techniques and advice on creating an effective analysis plan, and it provides methods for focusing the data-collection effort based on that analysis plan. As a side benefit, this analysis planning methodology will help you to properly scope the research effort and will provide you with insight for changes in that effort. The information presented herein was supported b...

  20. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    Science.gov (United States)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to the current situation in which the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on this subset of parameters and are hence less computationally demanding. While the examination of the convergence of calibration and uncertainty methods is state-of-the-art, the convergence of the sensitivity methods is usually not checked. At most, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indexes. Bootstrapping, however, can itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We, therefore, present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indexes without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indexes. To demonstrate that the convergence test is independent of the SA method, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991), and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indexes of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA.
The results show that the new frugal method is able to test the convergence and therefore the reliability of SA results in an
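    One of the two SA methods the convergence test was applied to, Morris Elementary Effects, can be sketched in a few lines. This is a simplified one-at-a-time version on the unit hypercube, and the test function is illustrative:

```python
import random

def elementary_effects(f, n_params, n_base_points, delta=0.1, seed=0):
    """Morris screening (simplified one-at-a-time form): for random base
    points in [0, 1 - delta]^n, perturb one parameter at a time by delta
    and average the absolute elementary effects (the mu* measure)."""
    rng = random.Random(seed)
    mu_star = [0.0] * n_params
    for _ in range(n_base_points):
        x = [rng.uniform(0.0, 1.0 - delta) for _ in range(n_params)]
        fx = f(x)
        for j in range(n_params):
            xp = list(x)
            xp[j] += delta               # perturb only parameter j
            mu_star[j] += abs(f(xp) - fx) / delta
    return [m / n_base_points for m in mu_star]

# Illustrative model: x[0] dominates, x[1] is weak, x[2] is inert.
mu = elementary_effects(lambda x: 2.0 * x[0] + 0.1 * x[1], 3, 10)
# For this linear function mu* is (2.0, 0.1, 0.0) regardless of sampling.
```

    The convergence question the abstract raises is precisely whether such mu* estimates are stable for the chosen number of base points; for nonlinear models that stability must be checked rather than assumed.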

  1. CIAU methodology and BEPU applications

    International Nuclear Information System (INIS)

    Petruzzi, A.; D'Auria, F.

    2009-01-01

    Best-Estimate calculation results from complex thermal-hydraulic system codes (like Relap5, Cathare, Athlet, Trace, etc.) are affected by unavoidable approximations that are unpredictable without the use of computational tools that account for the various sources of uncertainty. Therefore the use of best-estimate codes within reactor technology, either for design or safety purposes, implies understanding and accepting the limitations and deficiencies of those codes. Uncertainties may have different origins, ranging from the approximation of the models, to the approximation of the numerical solution, to the lack of precision of the values adopted for boundary and initial conditions. The amount of uncertainty that affects a calculation may strongly depend upon the codes and the modeling techniques (i.e. the code's users). A consistent and robust uncertainty methodology must be developed taking into consideration all the above aspects. The CIAU (Code with the capability of Internal Assessment of Uncertainty) and the UMAE (Uncertainty Methodology based on Accuracy Evaluation) methods have been developed by the University of Pisa (UNIPI) in the framework of a long-lasting research activity started in the 1980s and involving several researchers. CIAU is extensively discussed in the available technical literature, Refs. [1, 2, 3, 4, 5, 6, 7], and tens of additional relevant papers, which provide comprehensive details about the method, can be found in the bibliography lists of the above references. Therefore, the present paper supplies only 'spot information' about CIAU and focuses mostly on applications to some cases of industrial interest. In particular the application of CIAU to the OECD BEMUSE (Best Estimate Methods Uncertainty and Sensitivity Evaluation, [8, 9]) project is discussed, along with a critical comparison with other uncertainty methods (in relation to items like sources of uncertainties, selection of the input parameters and quantification of

  2. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, a function for common cause failure analysis, a function for uncertainty analysis, a function for common cause failure analysis with uncertainty, and a system for printing out the results of GO-FLOW analysis in the form of figures or tables. The above functions are explained by analyzing sample systems, such as a PWR AFWS and a BWR ECCS. In the appendices, the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs are described. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis, and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool in a living PSA. (author) 54 refs
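    GO-FLOW propagates success probabilities through its own operator set, which is not reproduced here. As a heavily hedged illustration of the underlying success-oriented idea only, consider probability propagation through generic AND/OR logic for a hypothetical two-train system (these gates and numbers are not GO-FLOW operators or data):

```python
def and_gate(*probs):
    """Success requires all (independent) input signals to succeed."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    """Success requires at least one (independent) input signal."""
    fail = 1.0
    for p in probs:
        fail *= 1.0 - p
    return 1.0 - fail

# Hypothetical system: two redundant trains, each needing its pump AND
# valve to succeed; either train alone is sufficient.
train_a = and_gate(0.98, 0.99)        # pump and valve of train A
train_b = and_gate(0.98, 0.99)        # pump and valve of train B
system = or_gate(train_a, train_b)    # system success probability
```

    GO-FLOW extends this basic picture with time points and signal types so that phased missions and operational sequences can be represented, which plain AND/OR logic cannot capture.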

  3. Test cases for interface tracking methods: methodology and current status

    International Nuclear Information System (INIS)

    Lebaigue, O.; Jamet, D.; Lemonnier, E.

    2004-01-01

    Full text of publication follows: In the past decade, a large number of new methods have been developed to deal with interfaces in the numerical simulation of two-phase flows. We have collected a set of 36 test cases, which can be seen as a tool to help engineers and researchers select the most appropriate method(s) for their specific fields of application. This set can be used: - To perform an initial evaluation of the capabilities of available methods with regard to the specificity of the final application and the most important features to be recovered from the simulation. - To measure the maximum mesh size to be used for a given physical problem in order to obtain a sufficiently accurate solution. - To assess and quantify the performance of a selected method equipped with its set of physical models. The computation of a well-documented test case allows estimating the error due to the numerical technique by comparison with reference solutions. This process is compulsory to gain confidence and credibility in the prediction capabilities of a numerical method and its physical models. - To broaden the capabilities of a given numerical technique. The test cases may be used to identify the need for improvement of the overall numerical scheme or to determine the physical part of the model which is responsible for the observed limitations. Each test case falls within one of the following categories: - Analytical solutions of well-known sets of equations corresponding to simple geometrical situations. - Reference numerical solutions of moderately complex problems, produced by accurate methods (e.g., the boundary-fitted coordinate method) on refined meshes. - Separate-effects analytical experiments. The presentation will suggest how to use the test cases for assessing the physical models and the numerical methods.
The expected fallout of using test cases is indeed on the one hand to identify the merits of existing methods and on the other hand to orient further research towards

  4. On the methodology of feeding ecology in fish

    Directory of Open Access Journals (Sweden)

    Saikia Surjya Kumar

    2016-06-01

    Full Text Available Feeding ecology explains predators' preference for some prey over others in their habitat and the competition that results. The subject, as a branch of functional and applied biology, is highly neglected, and in the case of fish a uniform and consistent methodology is absent. The currently practiced methods are largely centred on mathematical indices and are error-prone because of their non-uniform outcomes. The subject therefore requires a relook in order to elucidate functional contributions and to make it a more comparable and comprehensive science. In this article, approachable methodological strategies are put forward in three hierarchical steps, namely food occurrence, feeding biology and interpretative ecology. All these steps involve a wide range of techniques, within the scope of ecology but not limited to it, and traverse from narrative to functional evolutionary ecology. The first step is an assumption-observation practice to assess the food of fish; it is followed by feeding biology, which links morphological, histological, cytological, bacteriological or enzymological correlations to preferred food in the environment. Interpretative ecology is the higher level of analysis, in which the outcomes are tested and discussed against evolutionary theories. A description of possible pedagogy for the methods of feeding ecological studies is also put forward.

  5. The Methodological Dynamism of Grounded Theory

    Directory of Open Access Journals (Sweden)

    Nicholas Ralph

    2015-11-01

    Full Text Available Variations in grounded theory (GT interpretation are the subject of ongoing debate. Divergences of opinion, genres, approaches, methodologies, and methods exist, resulting in disagreement on what GT methodology is and how it comes to be. From the postpositivism of Glaser and Strauss, to the symbolic interactionist roots of Strauss and Corbin, through to the constructivism of Charmaz, the field of GT methodology is distinctive in the sense that those using it offer new ontological, epistemological, and methodological perspectives at specific moments in time. We explore the unusual dynamism attached to GT’s underpinnings. Our view is that through a process of symbolic interactionism, in which generations of researchers interact with their context, moments are formed and philosophical perspectives are interpreted in a manner congruent with GT’s essential methods. We call this methodological dynamism, a process characterized by contextual awareness and moment formation, contemporaneous translation, generational methodology, and methodological consumerism.

  6. EPA Method 245.2: Mercury (Automated Cold Vapor Technique)

    Science.gov (United States)

    Method 245.2 describes procedures for preparation and analysis of drinking water samples for analysis of mercury using acid digestion and cold vapor atomic absorption. Samples are prepared using an acid digestion technique.

  7. Methodology for teaching facial filling with hyaluronic acid.

    Science.gov (United States)

    De Oliveira Ruiz, R; Laruccia, M M; Gerenutti, M

    2014-01-01

    This paper shows the importance of a systematic methodology for teaching facial dermal filling in the training of physicians who intend to work, or are already working, in the area of facial aesthetics. The methodology is based on the procedures performed at the Iz Clinic of Plastic Surgery from 2007 to 2010, where the results of the use of dermal filling products were observed. We chose hyaluronic acid as the basis for this teaching methodology. Even though it is a safe procedure, dermal filling needs to be done by trained professionals because some complications may occur. The theoretical discussion of facial anatomy, physiology and classification of aging, rheological characteristics of products, and application techniques underpins the practical part, in which a live demonstration or supervision of the procedure is performed. The design of the classes, both theoretical and practical, proposed in this work proved to be of great value in teaching physicians. The success of this method can be seen in the results achieved by students and in the drop in reports of adverse effects. After learning the techniques of facial dermal filling with products based on hyaluronic acid, a doctor may perform this therapy with other fillers, with harmonious results.

  8. Need for standardization of methodology and components in commercial radioimmunoassay kits

    Energy Technology Data Exchange (ETDEWEB)

    Wood, W G; Marschner, I; Scriba, P C [Muenchen Univ. (Germany, F.R.). Medizinische Klinik Innenstadt

    1977-01-01

    The problems arising from the increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. The problems arising in various RIAs differ according to the substance under test. The quality of individual components is often good, although the methodology is often not optimal and contains short-cuts which, while commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modification of the methodology often leads to major improvements in sensitivity and precision; this has been demonstrated in the case of three digoxin kits employing antibody-coated tube techniques and in four kits for thyrotropin (TSH) using different techniques. It has also been noted that in many imported quality-control sera from the USA no values have been ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The deduction from this study is that standardization of kit components and assay methods is desirable in order to allow comparison of results between laboratories using different kits.

  9. Development of a methodology for automated assessment of the quality of digitized images in mammography

    International Nuclear Information System (INIS)

    Santana, Priscila do Carmo

    2010-01-01

    The process of evaluating the quality of radiographic images in general, and mammography in particular, can be much more accurate, practical and fast with the help of computer analysis tools. The purpose of this study is to develop a computational methodology to automate the process of assessing the quality of mammography images through digital image processing (PDI) techniques, using an existing image processing environment (ImageJ). With the application of PDI techniques it was possible to extract geometric and radiometric characteristics of the evaluated images. The evaluated parameters include spatial resolution, high-contrast detail, low-contrast threshold, linear detail of low contrast, tumor masses, contrast ratio and background optical density. The results obtained by this method were compared with the results of the visual evaluations performed by the Health Surveillance of Minas Gerais. Through this comparison it was possible to demonstrate that the automated methodology is a promising alternative for reducing or eliminating the subjectivity of the visual assessment methodology currently in use. (author)

  10. Drifter technique: a new method to obtain metaphases in Hep-2 cell line cultures

    Directory of Open Access Journals (Sweden)

    Eleonidas Moura Lima

    2005-07-01

    Full Text Available The Hep-2 cell line is derived from laryngeal carcinoma cells and is often utilized as a model in carcinogenesis and mutagenesis tests. To evaluate the proliferative potential of this line, we developed a cytogenetic methodology (drifter technique) to obtain metaphases from cells that lose cellular adhesion when they undergo mitosis in culture. By this procedure, 2000 cells were counted, resulting in a mitotic index (MI) of 22.2%. Although this MI was not statistically different from the one obtained using either a classical cytogenetic method or a cell synchronization technique, the drifter technique has the advantage of not requiring the use of some reagents for obtaining metaphases and also of diminishing the consumption of maintenance reagents for this cell line.

  11. Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations

    International Nuclear Information System (INIS)

    Martens, Hans-Juergen von

    2010-01-01

    The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques have proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s²). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, were less than 1% in all cases.

  12. Design of an Integrated Methodology for Analytical Design of Complex Supply Chains

    Directory of Open Access Journals (Sweden)

    Shahid Rashid

    2012-01-01

    Full Text Available A literature review and gap analysis identifies key limitations of industry best practice when modelling supply chains. To address these limitations, the paper reports on the conception and development of an integrated modelling methodology designed to underpin the analytical design of complex supply chains. The methodology is based upon a systematic deployment of EM, CLD, and SM techniques; their integration is achieved via common modelling concepts and decomposition principles. Thereby the methodology facilitates: (i) graphical representation and description of key “processing”, “resourcing” and “work flow” properties of supply chain configurations; (ii) behavioural exploration of currently configured supply chains, to facilitate reasoning about uncertain demand impacts on supply, make, delivery, and return processes; (iii) predictive quantification of the relative performances of alternative complex supply chain configurations, including risk assessments. Guidelines for the application of each step of the methodology are described, as are recommended data-collection methods and expected modelling outcomes for each step. The methodology is being extensively case-tested to quantify its potential benefits and costs relative to current best industry practice. The paper reflects on preliminary benefits gained during industry-based case study modelling and identifies areas of potential improvement.

  13. Techniques and methods in nuclear materials traceability

    International Nuclear Information System (INIS)

    Persiani, P.J.

    1996-01-01

    The nonproliferation community is currently addressing concerns that access to special nuclear materials may increase illicit trafficking in weapons-usable materials from civil and/or weapons material stores and/or fuel cycle systems. Illicit nuclear traffic usually involves small quantities of nuclear materials, perhaps as samplings of a potential protracted diversionary flow from sources to users. Countering illicit nuclear transactions requires the development of techniques and methods for nuclear material traceability as an important phase of a broad forensic analysis capability. This report discusses how isotopic signatures and correlation methods were applied to determine the origins of Highly Enriched Uranium (HEU) and plutonium samples reported in the illicit trafficking of nuclear materials

  14. Thresholding methods for PET imaging: A review

    International Nuclear Information System (INIS)

    Dewalle-Vignion, A.S.; Betrouni, N.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; Dewalle-Vignion, A.S.; Hossein-Foucher, C.; Huglo, D.; Vermandel, M.; El Abiad, A.

    2010-01-01

    This work deals with positron emission tomography segmentation methods for tumor volume determination. We present a state of the art of techniques based on fixed or adaptive thresholds. Methods found in the literature are analysed objectively with respect to their methodology, advantages and limitations. Finally, a comparative study is presented. (authors)
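    The two families of methods reviewed — fixed thresholding and adaptive (background-corrected) thresholding — can be sketched on a toy 1-D uptake profile. The threshold fractions and profile values below are illustrative, not recommendations from the review:

```python
def fixed_threshold(values, fraction=0.42):
    """Segment at a fixed fraction of the maximum uptake."""
    t = fraction * max(values)
    return [v >= t for v in values]

def adaptive_threshold(values, fraction=0.5, background=0.0):
    """Background-corrected threshold: t = bg + fraction * (max - bg)."""
    t = background + fraction * (max(values) - background)
    return [v >= t for v in values]

# Toy 1-D uptake profile: a lesion peaking at 10 on a background of 2.
profile = [2, 2, 5, 6, 9, 10, 8, 4, 2, 2]
seg_fixed = fixed_threshold(profile)             # threshold 4.2 -> 5 voxels
seg_adapt = adaptive_threshold(profile, 0.5, 2)  # threshold 6.0 -> 4 voxels
```

    The example shows why the choice matters: on the same data the two rules recover different volumes, which is exactly the kind of methodological difference the review compares.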

  15. Is There a Consensus on Consensus Methodology? Descriptions and Recommendations for Future Consensus Research.

    Science.gov (United States)

    Waggoner, Jane; Carline, Jan D; Durning, Steven J

    2016-05-01

    The authors of this article reviewed the methodology of three common consensus methods: nominal group process, consensus development panels, and the Delphi technique. The authors set out to determine how a majority of researchers are conducting these studies, how they are analyzing results, and subsequently the manner in which they are reporting their findings. The authors conclude with a set of guidelines and suggestions designed to aid researchers who choose to use the consensus methodology in their work.Overall, researchers need to describe their inclusion criteria. In addition to this, on the basis of the current literature the authors found that a panel size of 5 to 11 members was most beneficial across all consensus methods described. Lastly, the authors agreed that the statistical analyses done in consensus method studies should be as rigorous as possible and that the predetermined definition of consensus must be included in the ultimate manuscript. More specific recommendations are given for each of the three consensus methods described in the article.
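    The recommendation that the definition of consensus be fixed before data collection can be made concrete with a small sketch. The rule used here — at least 70% of panellists within one point of the panel median — is an illustrative choice, not the article's prescription:

```python
from statistics import median

def consensus_reached(ratings, within=1, agreement=0.7):
    """Apply a consensus rule defined before data collection: consensus
    holds if at least `agreement` of panellists rate within `within`
    points of the panel median."""
    med = median(ratings)
    close = sum(1 for r in ratings if abs(r - med) <= within)
    return close / len(ratings) >= agreement

# Hypothetical 7-member panel rating one item on a 9-point scale.
round_a = [7, 8, 8, 9, 7, 3, 8]   # 6 of 7 within 1 point of the median (8)
round_b = [2, 9, 5, 8, 1, 7, 4]   # widely dispersed ratings
reached_a = consensus_reached(round_a)   # True
reached_b = consensus_reached(round_b)   # False
```

    In a Delphi study this check would be applied after each round, with items failing it fed back to the panel for re-rating.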

  16. Unstructured characteristic method embedded with variational nodal method using domain decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Girardi, E.; Ruggieri, J.M. [CEA Cadarache (DER/SPRC/LEPH), 13 - Saint-Paul-lez-Durance (France). Dept. d' Etudes des Reacteurs; Santandrea, S. [CEA Saclay, Dept. Modelisation de Systemes et Structures DM2S/SERMA/LENR, 91 - Gif sur Yvette (France)

    2005-07-01

    This paper describes a recently-developed extension of our 'Multi-methods, multi-domains' (MM-MD) method for the solution of the multigroup transport equation. Based on a domain decomposition technique, our approach allows us to treat the one-group equation by cooperatively employing several numerical methods together. In this work, we describe the coupling between the Method of Characteristics (integro-differential equation, unstructured meshes) and the Variational Nodal Method (even-parity equation, cartesian meshes). The coupling method is then applied to the benchmark model of the Phebus experimental facility (CEA Cadarache). Our domain decomposition method gives us the capability to employ a very fine mesh for describing a particular fuel bundle with an appropriate numerical method (MOC), while using a much larger mesh size in the rest of the core, in conjunction with a coarse-mesh method (VNM). This application shows the benefits of our MM-MD approach in terms of accuracy and computing time: the domain decomposition method allows us to reduce the CPU time while preserving a good accuracy of the neutronic indicators: reactivity, core-to-bundle power coupling coefficient and flux error. (authors)

  17. Unstructured characteristic method embedded with variational nodal method using domain decomposition techniques

    International Nuclear Information System (INIS)

    Girardi, E.; Ruggieri, J.M.

    2005-01-01

    This paper describes a recently-developed extension of our 'Multi-methods, multi-domains' (MM-MD) method for the solution of the multigroup transport equation. Based on a domain decomposition technique, our approach allows us to treat the one-group equation by employing several numerical methods cooperatively. In this work, we describe the coupling of the Method of Characteristics (integro-differential equation, unstructured meshes) with the Variational Nodal Method (even-parity equation, Cartesian meshes). The coupling method is then applied to the benchmark model of the Phebus experimental facility (CEA Cadarache). Our domain decomposition method gives us the capability to describe a particular fuel bundle with a very fine mesh and an appropriate numerical method (MOC), while using a much larger mesh size in the rest of the core in conjunction with a coarse-mesh method (VNM). This application shows the benefits of our MM-MD approach in terms of accuracy and computing time: the domain decomposition method allows us to reduce the CPU time while preserving good accuracy of the neutronic indicators: reactivity, core-to-bundle power coupling coefficient and flux error. (authors)
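The core idea of such a domain decomposition coupling, two different solvers exchanging interface values on overlapping subdomains until they agree, can be sketched with a toy overlapping Schwarz iteration. The 1-D problem u'' = 0, the subdomain boundaries, and the linear-interpolant "solvers" standing in for MOC and VNM are all illustrative assumptions, not the authors' actual scheme.

```python
def subdomain_solve(xl, xr, ul, ur):
    """Exact solver for u'' = 0 on [xl, xr]: the linear interpolant.
    Stands in for the per-domain solvers (e.g. MOC, VNM) in this sketch."""
    def u(x):
        return ul + (ur - ul) * (x - xl) / (xr - xl)
    return u

def schwarz(iters=30):
    """Overlapping Schwarz iteration coupling two subdomain solvers for
    u'' = 0 on [0, 1] with u(0) = 0 and u(1) = 1 (exact solution u(x) = x).
    Subdomains [0, 0.6] and [0.4, 1] overlap on [0.4, 0.6]."""
    guess = 0.0  # initial guess for u at the interface point x = 0.6
    for _ in range(iters):
        uA = subdomain_solve(0.0, 0.6, 0.0, guess)   # left-domain solve
        uB = subdomain_solve(0.4, 1.0, uA(0.4), 1.0) # right-domain solve
        guess = uB(0.6)                              # update interface value
    return uA, uB
```

The interface value converges geometrically to the exact u(0.6) = 0.6, illustrating how two independently meshed solvers can cooperate through boundary exchange.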

  18. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach was proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear medicine measure parameters for which these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as activity concentration values from different organs of the same patient. The proposed technique was evaluated using rigorous numerical experiments and data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately by the proposed NGS technique even when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.

  19. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach was proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear medicine measure parameters for which these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as activity concentration values from different organs of the same patient. The proposed technique was evaluated using rigorous numerical experiments and data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately by the proposed NGS technique even when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest.
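Under the linear NGS model, each method m's estimates relate to the true value T as a_m·T + b_m plus zero-mean noise with standard deviation σ_m, and the noise-to-slope ratio σ_m/a_m ranks methods by precision. A minimal sketch follows; the parameter values in the usage example are hypothetical, and estimating a_m and σ_m without ground truth is the hard part the paper actually addresses:

```python
def noise_to_slope_ratio(slope, noise_sd):
    """NSR = sigma_m / a_m for a method whose estimates follow
    a_m * T + b_m + N(0, sigma_m^2); a lower NSR means better precision."""
    if slope <= 0:
        raise ValueError("slope must be positive")
    return noise_sd / slope

def rank_by_precision(params):
    """params: {method name: (slope, noise_sd)} -> names, most precise first."""
    return sorted(params, key=lambda m: noise_to_slope_ratio(*params[m]))
```

For example, with hypothetical parameters {'OSEM': (0.9, 0.09), 'FBP': (1.0, 0.25)}, OSEM ranks first (NSR 0.1 versus 0.25).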

  20. Methodological issues affecting the study of fish parasites. III. Effect of fish preservation method

    Czech Academy of Sciences Publication Activity Database

    Kvach, Yuriy; Ondračková, Markéta; Janáč, Michal; Jurajda, Pavel

    2018-01-01

    Roč. 127, č. 3 (2018), s. 213-224 ISSN 0177-5103 R&D Projects: GA ČR(CZ) GBP505/12/G112 Institutional support: RVO:68081766 Keywords : flounder Paralichthys-olivaceus * Neoheterobothrium-hirame * community structure * infection levels * Baltic sea * Odontobutidae * ectoparasites * Perciformes * collection * ecology * Parasite community * Preservation methods * Perca fluviatilis * Rhodeus amarus * Methodology * Parasitological examination Subject RIV: GL - Fishing OBOR OECD: Fishery Impact factor: 1.549, year: 2016

  1. Preferences of Teaching Methods and Techniques in Mathematics with Reasons

    Science.gov (United States)

    Ünal, Menderes

    2017-01-01

    In this descriptive study, the goal was to determine teachers' preferred pedagogical methods and techniques in mathematics. Qualitative research methods were employed, primarily case studies. 40 teachers were randomly chosen from various secondary schools in Kirsehir during the 2015-2016 educational terms, and data were gathered via…

  2. A Methodology for Integrating Maintainability Using Software Metrics

    OpenAIRE

    Lewis, John A.; Henry, Sallie M.

    1989-01-01

    Maintainability must be integrated into software early in the development process. But for practical use, the techniques used must be as unobtrusive to the existing software development process as possible. This paper defines a methodology for integrating maintainability into large-scale software and describes an experiment which implemented the methodology into a major commercial software development environment.

  3. Onto-clust--a methodology for combining clustering analysis and ontological methods for identifying groups of comorbidities for developmental disorders.

    Science.gov (United States)

    Peleg, Mor; Asbeh, Nuaman; Kuflik, Tsvi; Schertz, Mitchell

    2009-02-01

    Children with developmental disorders usually exhibit multiple developmental problems (comorbidities). Hence, diagnosis needs to revolve around groups of developmental disorders. Our objective is to systematically identify developmental disorder groups and represent them in an ontology. We developed a methodology that combines two methods: (1) a literature-based ontology that we created, which represents developmental disorders and potential developmental disorder groups, and (2) clustering for detecting comorbid developmental disorders in patient data. The ontology is used to interpret and improve clustering results, and the clustering results are used to validate the ontology and suggest directions for its development. We evaluated our methodology by applying it to data from 1175 patients of a child development clinic. We demonstrated that the ontology improves clustering results, bringing them closer to an expert-generated gold standard. We have shown that our methodology successfully combines an ontology with a clustering method to support systematic identification and representation of developmental disorder groups.
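The clustering half of such a methodology can be illustrated with a minimal 1-D k-means; the study's actual clustering algorithm and feature space are not specified here, and the ontology-based interpretation step is not shown:

```python
def kmeans_1d(values, k, iters=50):
    """Minimal 1-D k-means: seed centers evenly over the sorted data,
    then alternate nearest-center assignment and center updates."""
    vals = sorted(values)
    if k == 1:
        centers = [vals[len(vals) // 2]]
    else:
        centers = [vals[i * (len(vals) - 1) // (k - 1)] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vals:
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters
```

In the real methodology the resulting clusters would then be checked against, and interpreted through, the disorder ontology.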

  4. A review of methods for assessment of trace element bioavailability in humans

    International Nuclear Information System (INIS)

    Ahmad, T.; Bilal, R.

    2001-01-01

    Deficiency of micronutrients is widespread among the low socio-economic strata of the population. Different intervention strategies are used to eradicate these deficiencies. The most important step in confirming the efficacy of an intervention is assessing bioavailability. There are a number of methods for determining bioavailability, involving both nuclear and non-nuclear techniques. Traditionally, the bioavailability of different micronutrients was determined using the chemical balance method, that is, the amount excreted subtracted from the amount ingested. More recently, methodologies incorporating the use of isotopes have been developed for measuring the bioavailability of different trace elements. The isotopic techniques are very accurate and highly specific. This paper summarizes the various methodologies available, with special emphasis on nuclear methods. (author)
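The chemical balance calculation mentioned above is simple arithmetic; a sketch, with illustrative units and variable names:

```python
def balance_bioavailability(ingested_mg, excreted_mg):
    """Apparent absorption by the chemical balance method:
    fraction retained = (intake - excretion) / intake."""
    if ingested_mg <= 0:
        raise ValueError("intake must be positive")
    return (ingested_mg - excreted_mg) / ingested_mg
```

For example, 10 mg ingested with 7.5 mg recovered in excreta gives an apparent bioavailability of 0.25; the isotopic techniques refine this by tracing the labelled element specifically.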

  5. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision based techniques and spectral signature are described. The vision instruments for food analysis as well as datasets of the food items...... used in this thesis are described. The methodological strategies are outlined including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis and linear versus non-linear approaches. One supervised feature selection algorithm...... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods together with some other state-of-the-art statistical and mathematical analysis techniques are applied on datasets of different food items; meat, diaries, fruits...

  6. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for better characterizing the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity inherent in the FDI technique, which results from the use of only two adjacent frequencies. Different methods commonly used in antenna array processing are first described within the context of application to the FII technique. These methods are Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems very promising.
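Of the methods listed, the Fourier-based imaging estimator is the simplest to sketch: the complex signals received at the N carrier frequencies are phase-shifted to a trial altitude and summed coherently. The phase model below (two-way path, exp(-j4πfz/c)) and the frequency values in the test are illustrative assumptions, not the MU radar's actual processing:

```python
import cmath

C = 3.0e8  # speed of light, m/s

def fourier_brightness(signals, freqs_hz, z_m):
    """Fourier-based FII brightness at trial altitude z_m.
    signals: complex samples V_i per carrier frequency; each is assumed to
    carry a phase exp(-1j*4*pi*f*z0/c) from a scatterer at altitude z0,
    so multiplying by exp(+1j*4*pi*f*z_m/c) aligns phases when z_m = z0."""
    s = sum(v * cmath.exp(1j * 4 * cmath.pi * f * z_m / C)
            for v, f in zip(signals, freqs_hz))
    return abs(s) ** 2 / len(signals) ** 2  # normalized so a point target peaks at 1
```

Scanning z_m over the range gate then yields a brightness profile whose peaks locate the scattering layers.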

  7. A Critique of Methodological Dualism in Education

    Science.gov (United States)

    Yang, Jeong A.; Yoo, Jae-Bong

    2018-01-01

    This paper aims to critically examine the paradigm of methodological dualism and explore whether methodologies in social science currently are appropriate for educational research. There are two primary methodologies for studying education: quantitative and qualitative methods. This is what we mean by "methodological dualism". Is…

  8. A Method to Test Model Calibration Techniques: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-09-01

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique, 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
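The surrogate-data idea can be sketched with a deliberately tiny energy model: the simulation generates "utility bills" from known inputs, a calibration routine tunes a parameter to fit them, and the calibrated model's predicted retrofit savings are compared with the known true savings. The linear degree-day model, the grid-search calibration, and all numbers are illustrative assumptions, not the paper's method:

```python
def energy_model(base_load, heating_slope, hdd):
    """Toy monthly bills: base load plus a heating-degree-day term."""
    return [base_load + heating_slope * h for h in hdd]

def calibrate_slope(bills, base_load, hdd, candidates):
    """Grid-search 'calibration technique': best least-squares fit to bills."""
    def sse(slope):
        return sum((p - b) ** 2
                   for p, b in zip(energy_model(base_load, slope, hdd), bills))
    return min(candidates, key=sse)

def predicted_savings(slope, retrofit_factor, hdd):
    """Savings from a retrofit that scales the heating slope by retrofit_factor."""
    return (1.0 - retrofit_factor) * slope * sum(hdd)
```

Because the surrogate truth is known, all three figures of merit can be scored exactly: closure on the true slope, goodness of fit to the surrogate bills, and accuracy of the post-retrofit savings prediction.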

  9. THE FUTURE OF LANGUAGE TEACHING METHODOLOGY

    OpenAIRE

    Ted Rodgers

    1998-01-01

    Abstract: This paper reviews the current state of ELT methodology, particularly with respect to a number of current views suggesting that the profession is now in a "post-methods" era, in which previous attention to Methods (Total Physical Response, Silent Way, Natural Approach, etc.) has given way to a more generic approach to ELT methodology. Ten potential future courses of ELT methodology are outlined and three of these are considered in some detail. Particular consideration is given as to ho...

  10. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    Science.gov (United States)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world that tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and effective technical and human implementation of computer-based systems (ETHICS). We examined the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four different aspects are analyzed: social versus technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using the quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are suitable for co-design and offers some suggestions for carrying it out.

  11. Variance-to-mean method generalized by linear difference filter technique

    International Nuclear Information System (INIS)

    Hashimoto, Kengo; Ohsaki, Hiroshi; Horiguchi, Tetsuo; Yamane, Yoshihiro; Shiroya, Seiji

    1998-01-01

    The conventional variance-to-mean method (Feynman-α method) suffers seriously from divergence of the variance under transient conditions such as a reactor power drift. Strictly speaking, then, use of the Feynman-α method is restricted to a steady state. To extend the method to more practical uses, it is desirable to overcome this difficulty. For this purpose, we propose the use of a higher-order difference filter technique to reduce the effect of reactor power drift, and we derive several new formulae taking the filtering into account. The capability of the proposed formulae was demonstrated through experiments in the Kyoto University Critical Assembly. The experimental results indicate that the divergence of the variance can be effectively suppressed by the filtering technique, and that a higher-order filter becomes necessary with increasing variation rate in power.
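The drift-suppression idea can be sketched as follows: apply a d-th order successive difference filter to the gate counts before forming the variance-to-mean statistic. The C(2d, d) normalization, which keeps Y = 0 for ideal Poisson counts, is an assumption of this sketch, not necessarily the paper's exact formula:

```python
from math import comb
from statistics import mean, pvariance

def diff_filter(counts, order):
    """Apply an order-th successive difference filter to gate counts."""
    for _ in range(order):
        counts = [b - a for a, b in zip(counts, counts[1:])]
    return counts

def feynman_y(counts, order=0):
    """Variance-to-mean (Feynman-Y) statistic; with order > 0, a difference
    filter suppresses slow drift.  Dividing by C(2*order, order) keeps
    Y = 0 for uncorrelated Poisson counts (assumed normalization)."""
    m = mean(counts)
    filt = diff_filter(list(counts), order)
    return pvariance(filt) / (comb(2 * order, order) * m) - 1.0
```

On drifting counts the unfiltered statistic is inflated by the trend, while the first-order-filtered statistic is not, which is the effect the paper exploits.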

  12. Comparison of economic evaluation methodology between levelized method and the evaluation system in China

    International Nuclear Information System (INIS)

    Zhang Shengli

    2005-01-01

    Different methodologies yield different results. This paper introduces the levelized discounted generation cost methodology as well as the Chinese evaluation system. In general, there are two key indices in the Chinese evaluation system: generation cost and the electricity sales price to the grid. This paper contains a description of the cost breakdown and calculation procedure for each index. A comparison between the two methods and their primary differences are also included. For the first time, equations for calculating generation cost and the selling price to the grid based on the Chinese system have been derived, and their accuracy has been shown by running a special computer program. The two systems differ in many aspects. Firstly, levelized generation cost is always calculated with a discounting method, which is absent from the Chinese system. Secondly, levelized generation cost is a single, constant value that does not change over the economic life, while generation cost in the Chinese system is estimated on a year-by-year basis. Thirdly, the makeup of generation cost in the Chinese system differs from that of the levelized system, since taxes and the dividend share are removed. Finally, the electricity sales price in the Chinese system is more similar to the levelized generation cost. (authors)
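The levelized (discounted) generation cost divides the discounted annual cost stream by the discounted annual energy stream. A minimal sketch, with illustrative streams and discount rate:

```python
def levelized_cost(costs, energies, rate):
    """Levelized generation cost: sum of discounted annual costs divided by
    sum of discounted annual energy (years indexed 1..n)."""
    disc_c = sum(c / (1 + rate) ** t for t, c in enumerate(costs, 1))
    disc_e = sum(e / (1 + rate) ** t for t, e in enumerate(energies, 1))
    return disc_c / disc_e
```

With constant annual streams the discounting cancels and the levelized cost equals total cost over total energy; with front-loaded costs, a higher discount rate raises it, which is one reason the discounted and year-by-year systems diverge.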

  13. La interpretacion consecutiva: metodologia y tecnicas (Consecutive Interpretation: Methodology and Techniques).

    Science.gov (United States)

    Drallny, Ines

    1987-01-01

    Describes the purpose and appropriate methodology for various levels of interpreter training, for both consecutive and simultaneous interpretation. The importance of relating the intent of the text to the explicit language forms through which that intent is realized is discussed, and appropriate criteria for evaluation of student interpreters are…

  14. Developing "Personality" Taxonomies: Metatheoretical and Methodological Rationales Underlying Selection Approaches, Methods of Data Generation and Reduction Principles.

    Science.gov (United States)

    Uher, Jana

    2015-12-01

    Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".

  15. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

    Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the RES and nRES cohorts at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiological differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) was measured as a function of: (1) the number of synergies computed; (2) the EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) the synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial to trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
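MSA variants commonly extract synergies with non-negative matrix factorization of the EMG matrix, EMG ≈ W·H, where W holds the time-independent synergy vectors and H the time-varying neural commands. A minimal multiplicative-update NMF sketch (Lee-Seung rules; the paper's exact algorithm, normalization choices, and stopping criteria are not reproduced):

```python
import random

def nmf(X, k, iters=500, eps=1e-9, seed=0):
    """Minimal multiplicative-update NMF: X (m x n, nonnegative) ~ W @ H."""
    rng = random.Random(seed)
    m, n = len(X), len(X[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(m)]
    H = [[rng.random() + 0.1 for _ in range(n)] for _ in range(k)]

    def matmul(A, B):
        Bt = list(zip(*B))
        return [[sum(a * b for a, b in zip(row, col)) for col in Bt] for row in A]

    def transpose(A):
        return [list(r) for r in zip(*A)]

    for _ in range(iters):
        # H <- H * (W^T X) / (W^T W H)
        Wt = transpose(W)
        num, den = matmul(Wt, X), matmul(Wt, matmul(W, H))
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(n)]
             for i in range(k)]
        # W <- W * (X H^T) / (W H H^T)
        Ht = transpose(H)
        num, den = matmul(X, Ht), matmul(matmul(W, H), Ht)
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(m)]
    return W, H
```

The methodological choices studied in the paper (pre-factorization EMG normalization, fixed versus trial-varying W, post hoc output normalization) all wrap around this core factorization step.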

  16. Intelligent Search Method Based ACO Techniques for a Multistage Decision Problem EDP/LFP

    Directory of Open Access Journals (Sweden)

    Mostefa RAHLI

    2006-07-01

    The implementation of a numerical library for optimization-based calculation in electrical supply networks is at the centre of current research orientations; our project is centred on the development of the NMSS platform. It is a software environment intended to save considerable effort in load calculations, curve smoothing, loss calculation and economic planning of the generated powers [23]. Operational research [17] on the one hand, and industrial practice on the other, show that the means and processes of simulation have reached a very appreciable level of reliability and mathematical confidence [4, 5, 14]. It is from this observation that many processes place confidence in the results of simulation. The handicap of this approach is that it bases its judgments and manipulations on simplified assumptions and constraints whose influence was deliberately neglected against the cost to be spent [14]. By combining simulation methods with artificial intelligence techniques, the set of numerical methods acquires an optimal reliability whose assurance leaves little room for doubt. The NMSS software environment [23] can bring together simulation techniques and electric network calculation via a graphic interface; the same software integrates an AI capability via an expert system module. Our problem is a multistage case whose stages are completely dependent and cannot be performed separately. For a multistage problem [21, 22], the results obtained from a credible (large-size) problem calculation raise the following question: could a choice of numerical methods make the calculation of a complete problem, using more than two treatment levels, yield the smallest possible total error? It is well known, from an algorithmic standpoint, that each treatment can be characterized by a function called its mathematical complexity. This complexity is in fact a cost (a weight overloading

  17. A Survey of 2D Face Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Mejda Chihaoui

    2016-09-01

    Despite the existence of various biometric techniques, such as fingerprints, iris scans and hand geometry, the most efficient and most widely used one is face recognition, because it is inexpensive, non-intrusive and natural. Therefore, researchers have developed dozens of face recognition techniques over the last few years. These techniques can generally be divided into three categories, based on the face data processing methodology: methods that use the entire face as input data for the proposed recognition system, methods that do not consider the whole face but only some features or areas of the face, and methods that use global and local face characteristics simultaneously. In this paper, we present an overview of some well-known methods in each of these categories. First, we expose the benefits of, as well as the challenges to, the use of face recognition as a biometric tool. Then, we present a detailed survey of the well-known methods by expressing each method’s principle. After that, a comparison between the three categories of face recognition techniques is provided. Furthermore, the databases used in face recognition are mentioned, and some results of the applications of these methods on face recognition databases are presented. Finally, we highlight some new promising research directions that have recently appeared.

  18. Application of a methodology for retouching

    Directory of Open Access Journals (Sweden)

    Ana Bailão

    2010-11-01

    Between November 2006 and January 2010, an investigation into retouching methodologies was carried out. The aim of this paper is to describe, in four steps, the retouching methodology for a contemporary painting. The four steps are: chromatic and formal study, considering the use of Gestalt theory and the phenomena of contrast and assimilation; selection of the technique; choice of the materials; and retouching practice.

  19. Linking the Organizational Forms Teachers and Teaching Methods in a Class Instructional Methodology

    Directory of Open Access Journals (Sweden)

    Graciela Nápoles-Quiñones

    2016-05-01

    A descriptive study was conducted to show the link between teachers' organizational forms and teaching methods, to expound the pedagogical theory, and to deepen the teaching-learning process through the methodological class. The main content of teachers' work is preparation and professional development, which requires the selection and use of working methods, means and procedures in accordance with the real and objective conditions of the staff receiving the action, and conducive to teaching work. Teachers should be aware that they need to master the content they teach, know the level of development of their students and the specific characteristics of the group and of each student, and be competent to connect the content they teach with reality.

  20. An LWR design decision Methodology

    International Nuclear Information System (INIS)

    Leahy, T.J.; Rees, D.C.; Young, J.

    1982-01-01

    While all parties involved in nuclear plant regulation endeavor to make decisions which optimize the considerations of plant safety and financial impacts, these decisions are generally made without the benefit of a systematic and rigorous approach to the questions confronting the decision makers. A Design Decision Methodology has been developed which provides such a systematic approach. By employing this methodology, which makes use of currently accepted probabilistic risk assessment techniques and cost estimation, informed decisions may be made against a background of comparisons between the relative levels of safety and costs associated with various design alternatives
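A risk-informed comparison of design alternatives can be sketched as minimizing expected cost: implementation cost plus probability-weighted consequences from a PRA. The data structure and numbers below are illustrative placeholders, not the methodology's actual models:

```python
def expected_cost(design):
    """Implementation cost plus sum of p_i * consequence_i over PRA scenarios."""
    return design["cost"] + sum(p * c for p, c in design["risks"])

def preferred_design(designs):
    """Choose the alternative with the lowest expected cost."""
    return min(designs, key=lambda name: expected_cost(designs[name]))
```

For example, a cheaper baseline with a higher accident frequency can lose to a costlier upgrade once probability-weighted consequences are included.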

  1. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial-scale heat exchangers. A Galerkin-based numerical method with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to an in-situ transient test carried out at a nuclear power plant. The method, however, is applicable to any transient testing application
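The principle of transient testing, recovering a heat transfer parameter from time-dependent temperatures rather than from a steady state, can be sketched with a lumped single-node model. The actual method uses a Galerkin spectral solution of the full heat exchanger equations; the lumped model, the UA recovery from the decay rate, and all numbers here are simplifying assumptions:

```python
from math import log

def outlet_temps(ua, m_cp, t_in, t0, dt, steps):
    """Euler simulation of a lumped transient model:
    dT/dt = -(UA / m_cp) * (T - t_in)."""
    temps, t = [t0], t0
    for _ in range(steps):
        t += dt * (-(ua / m_cp) * (t - t_in))
        temps.append(t)
    return temps

def estimate_ua(temps, m_cp, t_in, dt):
    """Recover UA from the exponential decay rate of (T - t_in)."""
    n = len(temps) - 1
    decay = log((temps[0] - t_in) / (temps[-1] - t_in)) / (n * dt)
    return decay * m_cp
```

In this idealized setting the recovered UA matches the value used to generate the transient, which is the consistency check a real transient test performs against fouled-versus-clean expectations.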

  2. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...
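
The reinforcement-learning side of simulation-based optimization can be sketched with tabular Q-learning on a toy two-state problem; the MDP, rewards and hyperparameters below are invented for illustration:

```python
import random

random.seed(0)
n_states, n_actions = 2, 2
Q = [[0.0] * n_actions for _ in range(n_states)]

def step(s, a):
    # Deterministic toy dynamics: action 1 always earns the unit reward.
    reward = 1.0 if a == 1 else 0.0
    return (s + a) % n_states, reward

alpha, gamma, eps = 0.1, 0.9, 0.2
s = 0
for _ in range(5000):
    # epsilon-greedy action selection over the simulated system
    if random.random() < eps:
        a = random.randrange(n_actions)
    else:
        a = max(range(n_actions), key=lambda x: Q[s][x])
    s2, r = step(s, a)
    # Q-learning update toward the sampled Bellman target
    Q[s][a] += alpha * (r + gamma * max(Q[s2]) - Q[s][a])
    s = s2

greedy = [max(range(n_actions), key=lambda a: Q[st][a]) for st in range(n_states)]
```

The learned greedy policy picks the rewarding action in both states, mirroring how the book's dynamic-optimization algorithms improve a control policy purely from simulated transitions.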

  3. Tornado missile simulation and design methodology. Volume 1: simulation methodology, design applications, and TORMIS computer code. Final report

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.

    1981-08-01

    A probabilistic methodology has been developed to predict the probabilities of tornado-propelled missiles impacting and damaging nuclear power plant structures. Mathematical models of each event in the tornado missile hazard have been developed and sequenced to form an integrated, time-history simulation methodology. The models are data based where feasible. The data include documented records of tornado occurrence, field observations of missile transport, results of wind tunnel experiments, and missile impact tests. Probabilistic Monte Carlo techniques are used to estimate the risk probabilities. The methodology has been encoded in the TORMIS computer code to facilitate numerical analysis and plant-specific tornado missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and risk has been assessed for a hypothetical nuclear power plant design case study
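
The event-sequence sampling TORMIS performs can be caricatured as a Monte Carlo loop: sample a tornado intensity, then the conditional missile events. The distributions and probabilities below are made-up placeholders, not TORMIS data:

```python
import random

random.seed(42)

def one_history():
    """One sampled tornado event: intensity, missile injection, target hit."""
    wind = random.choice([40.0, 60.0, 90.0])  # m/s, sampled intensity class
    p_injection = min(1.0, wind / 200.0)      # missile becomes airborne
    p_hit = 0.01                              # strikes the target, given airborne
    return random.random() < p_injection * p_hit

n = 200_000
hits = sum(one_history() for _ in range(n))
p_hat = hits / n  # estimated per-event impact probability
```

Repeating many histories and averaging is exactly the probabilistic Monte Carlo step the abstract mentions; the real code layers on trajectory physics and plant geometry.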

  4. An overview of farming system typology methodologies and its use in the study of pasture-based farming system: a review

    Energy Technology Data Exchange (ETDEWEB)

    Madry, W.; Mena, Y.; Roszkowska, B.; Gozdowski, D.; Hryniewski, R.; Castel, J. M.

    2013-06-01

    The main objective of this paper is to critically review the use of typology methodologies within pasture-based farming systems (PBFS), especially those situated in less favoured areas, showing in each case the most relevant variables or indicators determining the farming system classification. A further objective is to give an overview of the most widely used farming system typology methodologies in general. First, some considerations about the concept of a farming system and approaches to its study are presented. Next, farming system typology methodologies are described for farming systems in general, but with emphasis on PBFS. The different tools integrated in these methodologies are considered: sampling methods, sources of data, variables or indicators obtained from the available data, and techniques of analysis (statistical or not). Methods for farming system classification are presented (expert methods, analytical methods, or a combination of both). Among the statistical methods, multivariate analysis is treated in most depth, including principal component analysis and cluster analysis. Finally, the use of farming system typology methodologies on different pasture-based farming systems is presented. The most important aspects considered are: the main objective of the typology, the main animal species, the classification methods employed, and the main variables involved in the classification. (Author) 56 refs.
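
The multivariate classification step the review describes (cluster analysis on farm indicators) can be sketched with a minimal k-means on two invented indicators; real typologies use many more variables, typically after principal component analysis:

```python
import random

def kmeans(points, k, iters=50, seed=1):
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            # assign each farm to its nearest centre (squared Euclidean)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            groups[i].append(p)
        # recompute centres as coordinate-wise means (keep old centre if empty)
        centers = [tuple(sum(x) / len(g) for x in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers, groups

# Invented indicators: (herd size, pasture share of feed)
farms = [(10, 0.9), (12, 0.8), (11, 0.85),      # small, pasture-based
         (120, 0.2), (130, 0.15), (125, 0.1)]   # large, intensive
centers, groups = kmeans(farms, k=2)
sizes = sorted(len(g) for g in groups)
```

The two recovered groups correspond to the pasture-based and intensive farm types, which is the kind of data-driven typology the reviewed methods formalize.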

  5. International Conference eXtended Discretization MethodS

    CERN Document Server

    Benvenuti, Elena

    2016-01-01

    This book gathers selected contributions on emerging research work presented at the International Conference eXtended Discretization MethodS (X-DMS), held in Ferrara in September 2015. It highlights the most relevant advances made at the international level in the context of expanding classical discretization methods, like finite elements, to the numerical analysis of a variety of physical problems. The improvements are intended to achieve higher computational efficiency and to account for special features of the solution directly in the approximation space and/or in the discretization procedure. The methods described include, among others, partition of unity methods (meshfree, XFEM, GFEM), virtual element methods, fictitious domain methods, and special techniques for static and evolving interfaces. The uniting feature of all contributions is the direct link between computational methodologies and their application to different engineering areas.

  6. Ridge Preservation with Modified “Socket-Shield” Technique: A Methodological Case Series

    Directory of Open Access Journals (Sweden)

    Markus Glocker

    2014-01-01

    Full Text Available After tooth extraction, the alveolar bone undergoes a remodeling process, which leads to horizontal and vertical bone loss. These resorption processes complicate dental rehabilitation, particularly in connection with implants. Various methods of guided bone regeneration (GBR have been described to retain the original dimension of the bone after extraction. Most procedures use filler materials and membranes to support the buccal plate and soft tissue, to stabilize the coagulum and to prevent epithelial ingrowth. It has also been suggested that resorption of the buccal bundle bone can be avoided by leaving a buccal root segment (socket shield technique in place, because the biological integrity of the buccal periodontium (bundle bone remains untouched. This method has also been described in connection with immediate implant placement. The present case report describes three consecutive cases in which a modified method was applied as part of a delayed implantation. The latter was carried out after six months, and during re-entry the new bone formation in the alveolar bone and the residual ridge was clinically evaluated as proof of principle. It was demonstrated that the bone was clinically preserved with this method. Possibilities and limitations are discussed and directions for future research are disclosed.

  7. A Methods and procedures to apply probabilistic safety Assessment (PSA) techniques to the cobalt-therapy process. Cuban experience

    International Nuclear Information System (INIS)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Lozano Lima, B; De la Fuente Puch, A.; Dumenigo Gonzalez, C.; Troncoso Fleitas, M.; Perez Reyes, Y.

    2003-01-01

    This paper presents the results of the Probabilistic Safety Analysis (PSA) of the cobalt therapy process, which was performed as part of the International Atomic Energy Agency's Coordinated Research Project (CRP) to Investigate Appropriate Methods and Procedures to Apply Probabilistic Safety Assessment (PSA) Techniques to Large Radiation Sources. The primary methodological tools used in the analysis were Failure Modes and Effects Analysis (FMEA), event trees and fault trees. These tools were used to evaluate occupational, public and medical exposures during cobalt therapy treatment. The emphasis of the study was on the radiological protection of patients. During the course of the PSA, several findings concerning the cobalt treatment process were analysed. Regarding the probabilities of undesired events, the lowest exposure probabilities correspond to public exposures during the treatment process (Z21), around 10^-10 per year, with worker exposures (Z11) around 10^-4 per year. Regarding the patient, the Z33 (undesired dose to normal tissue) and Z34 (portion of the target volume not irradiated) probabilities prevail. Patient accidental exposures are also classified in terms of the extent to which the error is likely to affect individual treatments, individual patients, or all the patients treated on a specific unit. Sensitivity analyses were performed to determine the influence of certain tasks or critical stages on the results. As a conclusion, the study establishes that PSA techniques may effectively and reasonably determine the risk associated with the cobalt therapy treatment process, though there are some weaknesses in their methodological application to this kind of study requiring further research. These weaknesses are due to the fact that traditional PSA has mainly been applied to complex hardware systems designed to operate with a high level of automation, whilst cobalt therapy treatment is a relatively simple hardware system with a
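
The fault-tree quantification used in such a PSA combines basic-event probabilities through AND/OR gates under an independence assumption. The gate structure and probabilities below are illustrative, not taken from the study:

```python
def AND(*ps):
    """Gate output fails only if every input fails (independent events)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def OR(*ps):
    """Gate output fails if at least one input fails: 1 - prod(1 - p)."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Illustrative basic-event probabilities (per year; not from the study)
p_interlock_fails = 1e-3
p_timer_fails = 1e-3
p_operator_error = 1e-2

# Hypothetical top event: undesired exposure requires the operator error
# AND the failure of either the interlock or the backup timer.
p_top = AND(p_operator_error, OR(p_interlock_fails, p_timer_fails))
```

Event trees then sequence such gate outputs along accident progressions to yield the per-year frequencies (e.g. the Z11, Z21 figures) quoted above.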

  8. Development of a calculation methodology for potential flow over irregular topographies

    International Nuclear Information System (INIS)

    Del Carmen, Alejandra F.; Ferreri, Juan C.; Boutet, Luis I.

    2003-01-01

    Full text: Computer codes for the calculation of potential flow fields over surfaces with irregular topographies have been developed. The flows past multiple simple obstacles and past the region neighbouring the Embalse Nuclear Power Station have been considered. The codes developed allow the calculation of velocities very near the surface, which in turn required the development of high-accuracy techniques. The Boundary Element Method, using a linear approximation on plane triangular elements and an analytical integration methodology, has been applied. A particularly efficient technique for the calculation of the solid angle at each node vertex was also developed. The results so obtained will be applied to predict the dispersion of passive pollutants coming from discontinuous emissions. (authors)
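
One plausible building block for the vertex solid-angle computation mentioned above is the standard Van Oosterom-Strackee formula for the solid angle subtended by a plane triangle; this sketch is offered under that assumption, not as the authors' actual implementation:

```python
import math

def solid_angle(r1, r2, r3):
    """Solid angle subtended at the origin by the plane triangle (r1, r2, r3),
    via the Van Oosterom-Strackee formula."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])
    n1, n2, n3 = (math.sqrt(dot(r, r)) for r in (r1, r2, r3))
    num = dot(r1, cross(r2, r3))                 # scalar triple product
    den = (n1 * n2 * n3 + dot(r1, r2) * n3
           + dot(r1, r3) * n2 + dot(r2, r3) * n1)
    return abs(2.0 * math.atan2(num, den))

# Check: half of one cube face, seen from the cube centre, subtends
# (4*pi/6)/2 = pi/3 steradians.
omega = solid_angle((1, 1, 1), (-1, 1, 1), (-1, -1, 1))
```

Summing such triangle contributions over the facets meeting at a node gives the solid angle needed in the boundary-element collocation equations.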

  9. A new methodology based on the two-region model and microscopic noise analysis techniques for absolute measurements of β_eff, Λ and β_eff/Λ of the IPEN-MB-01 reactor

    International Nuclear Information System (INIS)

    Kuramoto, Renato Yoichi Ribeiro

    2007-01-01

    A new method for the absolute measurement of the effective delayed neutron fraction, β_eff, based on microscopic noise experiments and the Two-Region Model was developed at the IPEN/MB-01 research reactor facility. In contrast to other techniques such as the Modified Bennett Method, the Nelson Number Method and the 252Cf Source Method, the main advantage of this new methodology is that it obtains the effective delayed neutron parameters in a purely experimental way, eliminating all parameters that are difficult to measure or calculate. Rossi-α and Feynman-α experiments for the validation of this method were performed at the IPEN/MB-01 facility and, adopting the present approach, β_eff was measured with 0.67% uncertainty. In addition, the prompt neutron generation time, Λ, and other parameters were also obtained in an absolute, experimental way. In general, the final results agree well with values from frequency analysis experiments. The theory-experiment comparison reveals that JENDL-3.3 shows a deviation for β_eff lower than 1%, which meets the desired accuracy for the theoretical determination of this parameter. This work supports the reduction of the 235U thermal yield as proposed by Okajima and Sakurai. (author)
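
The noise observables behind such measurements can be illustrated with the Feynman-Y statistic, the variance-to-mean ratio of gated detector counts minus one, which departs from zero when counts are correlated by fission chains. The count series below are synthetic:

```python
def feynman_y(counts):
    """Variance-to-mean ratio minus one for a series of gated counts."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / (n - 1)
    return var / mean - 1.0

# Synthetic gate counts: a deliberately regular (uncorrelated) series and a
# clustered (correlated, fission-chain-like) series.
regular = [4, 5, 3, 6, 4, 5, 5, 4, 6, 3, 5, 4]
clustered = [2, 9, 1, 12, 2, 10, 1, 11, 3, 9, 2, 10]

y_regular = feynman_y(regular)      # near or below zero
y_clustered = feynman_y(clustered)  # clearly positive
```

In Feynman-α experiments, Y is evaluated as a function of gate width and fitted to extract the prompt decay constant, from which ratios such as β_eff/Λ follow.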

  10. Mixing methodologies in ESL: Cumulative or contradictory?

    Directory of Open Access Journals (Sweden)

    Doug Absalom

    2013-02-01

    Full Text Available It is often assumed that an eclectic approach to teaching a second language is desirable, as no single method can be regarded as universally ideal. Such a mixed-methodologies approach was adopted in an ESL course at the University of Newcastle in Australia, deliberately incorporating CALL techniques to enhance the success of a variety of other methods. Surprisingly, students who reported having enjoyed the CALL classes and testified to their practical usefulness fared worse in the examinations than students in the previous five years, when CALL techniques were not used. The writer speculates on the possible reasons for this unexpected finding, and cites a further example of the lack of success of a mixed-methodologies approach. It is often assumed that an eclectic approach to second-language teaching is desirable, since no single teaching method can be regarded as universally ideal. With this assumption in mind, a mixed-methodologies approach was used in presenting an English second-language course at the University of Newcastle in Australia. Over and above the usual second-language teaching methods, the course also made use of CALL techniques (i.e. computer-assisted language learning) with the intention of promoting the language-acquisition process in this way. Contrary to expectation, students who indicated that the computer classes had been enjoyable and valuable performed worse in the final examination than the average student who had received no CALL instruction in the same course over the previous five years. The writer speculates on the possible reasons for this unexpectedly poor performance and points to a further example of unsuccessful results following the use of a mixed-methods approach.

  11. The Methodology of Investigation of Intercultural Rhetoric applied to SFL

    Directory of Open Access Journals (Sweden)

    David Heredero Zorzo

    2016-12-01

    Full Text Available Intercultural rhetoric is a discipline which studies written discourse among individuals from different cultures. It is a very strong field in the Anglo-Saxon scientific world, especially with reference to English as a second language, but in Spanish as a foreign language (SFL) it is not as prominent. Intercultural rhetoric has provided applied linguistics with important methods of investigation, so applying it to SFL could introduce interesting new perspectives on the subject. In this paper, we present the methodology of investigation of intercultural rhetoric, which is based on the use of different types of corpora for analysing genres, and follows the precepts of tertium comparationis. In addition, it uses techniques of ethnographic investigation. The purpose of this paper is to show the applications of this methodology to SFL and to outline future investigations in the same field.

  12. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures at occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical 'signal-to-noise' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs
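
Aberration ("biological") dosimetry typically inverts a linear-quadratic calibration curve relating dicentric yield per cell to dose; the coefficients below are order-of-magnitude assumptions for illustration only:

```python
import math

def dose_from_yield(y, c=0.001, a=0.03, b=0.06):
    """Invert Y = c + a*D + b*D**2 for the positive dose root (Gy).
    c is the spontaneous background yield; a, b are illustrative, not a
    measured calibration."""
    disc = a * a - 4.0 * b * (c - y)
    return (-a + math.sqrt(disc)) / (2.0 * b)

observed_yield = 25 / 500  # e.g. 25 dicentrics scored in 500 cells
dose = dose_from_yield(observed_yield)
```

The spontaneous-background term c is precisely why very low doses are hard to resolve: at small D the radiation-induced yield disappears into the background's statistical noise.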

  13. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results
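
One element of such a survey design — allocating a fixed number of detectors across regions in proportion to dwelling counts, so that the pooled result is population-representative — can be sketched as follows; the regions and counts are invented:

```python
def proportional_allocation(dwellings_by_region, total_detectors):
    """Allocate detectors to strata in proportion to their dwelling counts."""
    total = sum(dwellings_by_region.values())
    return {region: round(total_detectors * n / total)
            for region, n in dwellings_by_region.items()}

regions = {"North": 200_000, "Centre": 500_000, "South": 300_000}
allocation = proportional_allocation(regions, total_detectors=1000)
```

Within each stratum, dwellings would then be drawn at random; unequal response rates across strata are one of the representativeness biases the publication warns about.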

  15. Nucleon-nucleon interactions via Lattice QCD: Methodology. HAL QCD approach to extract hadronic interactions in lattice QCD

    Science.gov (United States)

    Aoki, Sinya

    2013-07-01

    We review the potential method in lattice QCD, which has recently been proposed to extract nucleon-nucleon interactions via numerical simulations. We focus on the methodology of this approach by emphasizing the strategy of the potential method, the theoretical foundation behind it, and special numerical techniques. We compare the potential method with the standard finite-volume method in lattice QCD, in order to make the pros and cons of the approach clear. We also present several numerical results for nucleon-nucleon potentials.

  16. Development of techniques and methods for evaluation of quality of scanned image in mammography

    International Nuclear Information System (INIS)

    Carmo Santana, P. do; Nogueira, M.S.

    2008-01-01

    Cancer is the second leading cause of death in the Brazilian female population, and breast cancer is the most frequent neoplasm among women. Mammography is an essential tool for the diagnosis and early detection of this disease, but to be effective it must be of good quality. The Brazilian College of Radiology (CBR), the National Agency for Health Surveillance (ANVISA) and international bodies recommend standards of practice for mammography. Because of the risk from ionizing radiation, techniques that minimize dose and optimize image quality are essential to ensure that all women undergo high-quality mammography procedures for the detection of breast cancer. In this research, components of digital image processing were analysed, and methods and techniques of analysis were developed for the detection of structures relevant to medical diagnosis, reducing variations due to subjectivity. The free software ImageJ was used to evaluate the information contained in the scanned images. Scanned calibration images of a breast phantom were used to calibrate ImageJ, so that the program could correctly convert gray-scale values into optical density values, with the standard deviation reported for each measurement. Applying Student's t-test showed that the values obtained with the digital system for contrast and spatial resolution are consistent with the results obtained subjectively, since there was no significant difference (p < 0.05) for any of the comparisons evaluated. This methodology is therefore recommended for routine evaluations in mammography services. (author)
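
The gray-level to optical-density conversion described above follows, in the simplest reading, OD = log10(I0/I), with the gray value taken as proportional to transmitted intensity; the reference level of 255 is an assumption for illustration, not the paper's calibration:

```python
import math

def optical_density(gray, gray_unattenuated=255.0):
    """OD = log10(I0/I), taking the gray level as proportional to
    transmitted light; 255 as the unattenuated level is an assumption."""
    return math.log10(gray_unattenuated / gray)

# Each factor-of-10 drop in transmitted gray level adds one OD unit.
od = [round(optical_density(g), 3) for g in (255, 128, 25.5, 2.55)]
```

In practice the mapping is fitted from scanned step-wedge or phantom calibration images rather than assumed linear, which is exactly the calibration step the abstract describes.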

  17. Comparison Of Irms Delhi Methodology With Who Methodology On Immunization Coverage

    Directory of Open Access Journals (Sweden)

    Singh Padam

    1996-01-01

    Full Text Available Research question: What are the merits of the IRMS model over the WHO model for coverage evaluation surveys? Which method is superior and appropriate for coverage evaluation surveys of immunization in our setting? Objective: To compare the IRMS Delhi methodology with the WHO methodology on immunization coverage. Study design: Cross-sectional. Setting: Both urban and rural. Participants: Mothers and children. Sample size: 300 children aged 1-2 years and 300 mothers in rural areas, and 75 children and 75 mothers in urban areas. Study variables: Rural, urban, caste group, size of the stratum, literacy, sex and cost effectiveness. Outcome variables: Coverage level of immunization. Analysis: Routine statistical analysis. Results: The IRMS-developed methodology scores a better rating than the WHO methodology, especially when coverage evaluation is attempted in medium-size villages with socio-economic segregation, which remains a main characteristic of Indian villages.
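
A step shared by cluster-survey designs such as the WHO 30-cluster method is systematic selection of clusters with probability proportional to size (PPS); the village populations below are invented:

```python
def pps_systematic(populations, n_clusters, start):
    """Return indices of clusters chosen by systematic PPS sampling.
    `start` is the random start, drawn from [0, interval) in a real survey."""
    cum, total = [], 0
    for p in populations:
        total += p
        cum.append(total)              # cumulative population
    interval = total / n_clusters      # sampling interval
    chosen = []
    for i in range(n_clusters):
        point = start + i * interval
        for idx, c in enumerate(cum):
            if point < c:
                chosen.append(idx)
                break
    return chosen

villages = [500, 1500, 800, 3200, 1000, 3000]  # invented populations, total 10000
clusters = pps_systematic(villages, n_clusters=4, start=1000.0)
```

Larger villages can be hit more than once, which is what makes the design self-weighting; the IRMS-versus-WHO comparison above turns partly on how well such schemes handle stratified, segregated villages.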

  18. Double-lock technique: a simple method to secure abdominal wall closure

    International Nuclear Information System (INIS)

    Jategaonkar, P.A.; Yadav, S.P.

    2013-01-01

    Secure closure of a laparotomy incision remains an important aspect of any abdominal operation, with the aim of avoiding postoperative morbidity and hastening the patient's recovery. Depending on the operator's preference and experience, it may be done by continuous or interrupted methods, using either a non-absorbable or delayed-absorbable suture. We describe a simple, secure and quick technique of abdominal wall closure in which a continuous suture is inter-locked doubly after every third bite. This simple and easily mastered mass-closure technique can be performed by any member of the surgical team and does not need an assistant. It amalgamates the advantages of both the continuous and the interrupted methods of closure. To our knowledge, such a technique has not been reported in the literature. (author)

  19. Microautoradiographic methods and their applications in biology

    International Nuclear Information System (INIS)

    Benes, L.

    1978-01-01

    A survey of microautoradiographic methods and their applications in biology is given. The current state of biological microautoradiography is described, focusing on the efficiency of techniques and on special problems arising in autoradiographic investigations in biology. Four more or less independent fields of autoradiography are considered. In describing autoradiographic techniques, two methodological tasks are emphasized: the further development of labelling techniques for metabolic studies, and the instrumentation and automation of autoradiograph evaluation. (author)

  20. Methodological proposal for the definition of improvement strategies in logistics of SME

    Directory of Open Access Journals (Sweden)

    Yeimy Liseth Becerra

    2014-12-01

    Full Text Available A methodological proposal for defining improvement strategies in the logistics of SMEs is presented. It fulfils a specific objective of the project 'Methodological design on storage logistics, acquisition, and appropriation of information and communication systems for Colombian SMEs, bakery subsector', currently run by the SEPRO research group of the Universidad Nacional de Colombia and supported by Colciencias. The work corresponds to the completion of the last stage of the base project and aims to implement the corresponding objective raised in the research project that the SEPRO group has been developing. To do this, the methodology used during the execution of the base project was reviewed, along with the state of the art of techniques used in similar research for the evaluation and definition of improvement strategies in SME logistics. The reviewed techniques were compared and a proposed methodology was configured, consisting of the techniques that offered the greatest advantages for the development of the research.

  1. Formalizing the ISDF Software Development Methodology

    Directory of Open Access Journals (Sweden)

    Mihai Liviu DESPA

    2015-01-01

    Full Text Available The paper is aimed at depicting the ISDF software development methodology by emphasizing quality management and the software development lifecycle. The ISDF methodology was built especially for innovative software development projects. The ISDF methodology was developed empirically, by trial and error, in the process of implementing multiple innovative projects. The research process began by analysing key concepts like innovation and software development and by settling the important dilemma of what makes a web application innovative. Innovation in software development is presented from the end-user's, project owner's and project manager's points of view. The main components of a software development methodology are identified. Thus a software development methodology should account for people, roles, skills, teams, tools, techniques, processes, activities, standards, quality measuring tools, and team values. Current software development models are presented and briefly analysed. The need for a dedicated innovation-oriented software development methodology is emphasized by highlighting the shortcomings of current software development methodologies when tackling innovation. The ISDF methodology is presented in the context of developing an actual application. The ALHPA application is used as a case study for emphasizing the characteristics of the ISDF methodology. The development life cycle of the ISDF methodology includes research, planning, prototyping, design, development, testing, setup and maintenance. Artefacts generated by the ISDF methodology are presented. Quality is managed in the ISDF methodology by assessing compliance, usability, reliability, repeatability, availability and security. In order to properly assess each quality component, a dedicated indicator is built. A template for interpreting each indicator is provided. Conclusions are formulated and new related research topics are submitted for debate.

  2. Positive lists of cosmetic ingredients: Analytical methodology for regulatory and safety controls - A review.

    Science.gov (United States)

    Lores, Marta; Llompart, Maria; Alvarez-Rivera, Gerardo; Guerra, Eugenia; Vila, Marlene; Celeiro, Maria; Lamas, J Pablo; Garcia-Jares, Carmen

    2016-04-07

    Cosmetic products placed on the market, and their ingredients, must be safe under reasonable conditions of use, in accordance with the current legislation. Therefore, regulated and allowed chemical substances must meet the regulatory criteria to be used as ingredients in cosmetics and personal care products, and adequate analytical methodology is needed to evaluate the degree of compliance. This article reviews the most recent methods (2005-2015) used for the extraction and analytical determination of the ingredients included in the positive lists of the European Regulation of Cosmetic Products (EC 1223/2009): colorants, preservatives and UV filters. It summarizes the analytical properties of the most relevant methods along with their ability to meet current regulatory requirements. Cosmetic legislation is frequently updated; consequently, the analytical methodology must be constantly revised and improved to meet safety requirements. The article highlights the most important advances in analytical methodology for cosmetics control, both in relation to sample pretreatment and extraction and in the different instrumental approaches developed to solve this challenge. Cosmetics are complex samples, and most of them require a sample pretreatment before analysis. Recent research covering this aspect has tended toward the use of green extraction and microextraction techniques. Analytical methods were generally based on liquid chromatography with UV detection, and on gas and liquid chromatographic techniques hyphenated with single or tandem mass spectrometry; but some interesting proposals based on electrophoresis have also been reported, together with some electroanalytical approaches. Regarding the number of ingredients considered for analytical control, single-analyte methods have been proposed, although the most useful ones in real-life cosmetic analysis are the multianalyte approaches.

  3. The need for standardisation of methodology and components in commercial radioimmunoassay kits

    International Nuclear Information System (INIS)

    Wood, W.G.; Marschner, I.; Scriba, P.C.

    1977-01-01

    The problems arising from the increasing use of commercial kits in radioimmunoassay (RIA) and related fields are discussed. The problems arising in various RIAs differ according to the substance under test. The quality of individual components is often good, although methodology is often not optimal and contains short-cuts which, although commercially attractive, can lead to erroneous values and poor sensitivity and precision. Minor modification of methodology often leads to major improvements in sensitivity and precision, as demonstrated in the case of three digoxin kits employing antibody-coated tube techniques and in four kits for thyrotropin (TSH) using different techniques. It has also been noted that in many imported quality control sera from the USA, no values have been ascribed to European kits for the components listed, thus reducing these sera to the function of precision control. The deduction from this study is that standardisation of kit components and assay methods is desirable in order to allow comparison of results between laboratories using different kits. (orig.)

  4. Novel methods for tendon investigations

    DEFF Research Database (Denmark)

    Kjær, Michael; Langberg, Henning; Bojsen-Møller, J.

    2008-01-01

    Purpose. Tendon structures have been studied for decades, but over the last decade methodological developments and renewed interest in metabolic, circulatory and tissue protein turnover in tendon tissue have resulted in a rising number of investigations. Method. This paper details the various modern investigative techniques available to study tendons. Results. A variety of investigative methods are available to study the correlations between mechanics and biology in tendons. Conclusion. The available methodologies not only allow potential insight into physiological and pathophysiological mechanisms in tendon tissue, but also, to some extent, allow more elaborate studies of the intact human tendon.

  5. Engineering radioecology: Methodological considerations

    International Nuclear Information System (INIS)

    Nechaev, A.F.; Projaev, V.V.; Sobolev, I.A.; Dmitriev, S.A.

    1995-01-01

    The term "radioecology" has been widely recognized in scientific and technical societies. At the same time, this scientific school (radioecology) does not have a precise, generally acknowledged structure, a unified methodical basis, fixed subjects of investigation, etc. In other words, radioecology is a vast, important but rather amorphous conglomerate of various ideas, amalgamated mostly by their involvement in biospheric effects of ionizing radiation and some conceptual stereotypes. This paradox was acceptable up to a certain time. However, with the termination of the Cold War and because of remarkable political changes in the world, it has become possible to recast the problem of environmental restoration from the scientific sphere into particularly practical terms. Already the first steps clearly showed the imperfection of existing technologies and managerial and regulatory schemes; a lack of qualified specialists, relevant methods and techniques; uncertainties in the methodology of decision-making, etc. Thus, building up (or maybe structuring) a special scientific and technological basis, which the authors call "engineering radioecology", seems to be an important task. In this paper they endeavor to substantiate this thesis and to suggest some preliminary ideas concerning the subject matter of engineering radioecology

  6. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Full Text Available Humans are one of the important factors in the assessment of accidents, particularly marine accidents; hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed, classified into a first and a second generation according to their differing viewpoints on problem-solving. Accident analysis can be performed using three types of techniques: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological category. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, which are applied to assess marine accidents. The MOP model can effectively describe the relationships among the other factors that affect accidents, whereas the HEART methodology focuses only on human factors.
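    The quantification step of HEART can be sketched from its published general form: a nominal human error probability (HEP) for a generic task type is multiplied by a factor for each Error Producing Condition (EPC). The task type, EPC multipliers and assessed proportions below are illustrative values, not taken from the study above.

```python
# Sketch of the HEART quantification step, based on its general form:
#   HEP = nominal_HEP * product(((EPC_i - 1) * APOA_i) + 1)
# where EPC_i is the maximum multiplier of each Error Producing
# Condition and APOA_i is the analyst's assessed proportion of affect
# (0..1). The example task and EPC values are illustrative only.

def heart_hep(nominal_hep, epcs):
    """epcs: list of (max_effect, assessed_proportion) pairs."""
    hep = nominal_hep
    for max_effect, proportion in epcs:
        hep *= ((max_effect - 1.0) * proportion) + 1.0
    return min(hep, 1.0)  # a probability cannot exceed 1

# Example: a fairly simple task performed rapidly (nominal HEP 0.09 in
# the HEART generic task tables) with two hypothetical EPCs: time
# shortage (x11, 40% affect) and inexperience (x3, 50% affect).
hep = heart_hep(0.09, [(11, 0.4), (3, 0.5)])
print(round(hep, 4))  # → 0.9
```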

  7. Survey as a group interactive teaching technique

    Directory of Open Access Journals (Sweden)

    Ana GOREA

    2017-03-01

    Full Text Available The smooth running of the educational process and its results depend a great deal on the methods used. The methodology of teaching offers a great variety of techniques that the teacher can make use of in the teaching/learning process. Such techniques as brainstorming, the cube, KWL, case study, and the Venn diagram, among many others, are familiar to teachers, and they use them effectively in the classroom. The present article proposes a technique called 'survey', which has been successfully used by the author as a student-centered speaking activity in foreign language classes. It has certain advantages, especially when used in large groups, and it can be adapted to any other discipline when the teacher wishes to offer the students space for cooperative activity and creativity.

  8. Independent random sampling methods

    CERN Document Server

    Martino, Luca; Míguez, Joaquín

    2018-01-01

    This book systematically addresses the design and analysis of efficient techniques for independent random sampling. Both general-purpose approaches, which can be used to generate samples from arbitrary probability distributions, and tailored techniques, designed to efficiently address common real-world practical problems, are introduced and discussed in detail. In turn, the monograph presents fundamental results and methodologies in the field, elaborating and developing them into the latest techniques. The theory and methods are illustrated with a varied collection of examples, which are discussed in detail in the text and supplemented with ready-to-run computer code. The main problem addressed in the book is how to generate independent random samples from an arbitrary probability distribution with the weakest possible constraints or assumptions in a form suitable for practical implementation. The authors review the fundamental results and methods in the field, address the latest methods, and emphasize the li...
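    A minimal sketch of one classic general-purpose independent-sampling method of the kind the book covers: rejection sampling from an arbitrary density using a uniform proposal. The target density (a Beta(2,2)) and the envelope constant are our own illustrative choices, not an example from the book.

```python
import random

# Minimal rejection sampler: draws independent samples from a target
# density p(x) on [0, 1] using a uniform proposal q(x) = 1 and an
# envelope constant M satisfying p(x) <= M * q(x). The target
# p(x) = 6x(1-x) is the Beta(2,2) density, whose maximum is 1.5,
# so M = 1.5 is the tightest valid envelope.

def target_pdf(x):
    return 6.0 * x * (1.0 - x)

def rejection_sample(n, m=1.5):
    samples = []
    while len(samples) < n:
        x = random.random()          # proposal draw, q = Uniform(0,1)
        u = random.random()          # acceptance test variable
        if u * m <= target_pdf(x):   # accept with prob p(x) / (M q(x))
            samples.append(x)
    return samples

random.seed(42)
xs = rejection_sample(10000)
print(round(sum(xs) / len(xs), 2))  # mean of Beta(2,2) is 0.5
```

    A tighter envelope constant M raises the acceptance rate (here 1/M ≈ 67%), which is exactly the efficiency concern such books analyze.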

  9. Challenges and Opportunities for Harmonizing Research Methodology

    DEFF Research Database (Denmark)

    van Hees, V. T.; Thaler-Kall, K.; Wolf, K. H.

    2016-01-01

    Objectives: Raw accelerometry is increasingly being used in physical activity research, but diversity in sensor design, attachment and signal processing challenges the comparability of research results. Therefore, efforts are needed to harmonize the methodology. In this article we reflect on how increased methodological harmonization may be achieved. Methods: The authors of this work convened for a two-day workshop (March 2014) themed on the methodological harmonization of raw accelerometry. The discussions at the workshop were used as the basis for this review. Results: Key stakeholders were identified as manufacturers, method developers, method users (application), publishers, and funders. To facilitate methodological harmonization in raw accelerometry the following action points were proposed: i) Manufacturers are encouraged to provide a detailed specification of their sensors, ii) Each fundamental step...

  10. Applying Statistical Process Quality Control Methodology to Educational Settings.

    Science.gov (United States)

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as control charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X̄ (mean), R (range), X (individual observations), and MR (moving…
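    The X̄ and R charts mentioned above use standard Shewhart table constants to set control limits from subgroup data; for subgroups of size 5 these are A2 = 0.577, D3 = 0 and D4 = 2.114. The subgroup data below (e.g. weekly quiz scores for groups of five students) are invented for illustration.

```python
# Sketch of Shewhart X-bar and R control-chart limits for subgroups of
# size n = 5, using the standard table constants A2 = 0.577, D3 = 0,
# D4 = 2.114. The subgroup data are invented example values.

A2, D3, D4 = 0.577, 0.0, 2.114

def control_limits(subgroups):
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    grand_mean = sum(xbars) / len(xbars)   # center line of the X-bar chart
    rbar = sum(ranges) / len(ranges)       # center line of the R chart
    return {
        "xbar_center": grand_mean,
        "xbar_ucl": grand_mean + A2 * rbar,
        "xbar_lcl": grand_mean - A2 * rbar,
        "r_center": rbar,
        "r_ucl": D4 * rbar,
        "r_lcl": D3 * rbar,
    }

data = [[72, 75, 70, 74, 73], [68, 71, 70, 69, 74], [75, 73, 72, 70, 71]]
limits = control_limits(data)
print(round(limits["xbar_ucl"], 2), round(limits["r_ucl"], 2))  # → 74.88 11.27
```

    Points falling outside these limits on later subgroups signal that the process (here, hypothetically, class performance) has shifted.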

  11. Techniques of radiation dosimetry

    International Nuclear Information System (INIS)

    Mahesk, K.

    1985-01-01

    A text and reference with an interdisciplinary approach to physics, atomic energy, radiochemistry, and radiobiology. Chapters examine basic principles, experimental techniques, the methodology of dose experiments, and applications. Treats 14 different dosimetric techniques, including ionization chamber, thermoluminescence, and lyoluminescence. Considers the conceptual aspects and characteristic features of radiation

  12. Mixed methods research in tobacco control with youth and young adults: A methodological review of current strategies.

    Directory of Open Access Journals (Sweden)

    Craig S Fryer

    Full Text Available Tobacco use among young people is a complex and serious global dilemma that demands innovative and diverse research approaches. The purpose of this methodological review was to examine the current use of mixed methods research in tobacco control with youth and young adult populations and to develop practical recommendations for tobacco control researchers interested in this methodology. Using PubMed, we searched five peer-reviewed journals that publish tobacco control empirical literature for the use of mixed methods research to study young populations, age 12-25 years. Our team analyzed the features of each article in terms of tobacco control topic, population, youth engagement strategies, and several essential elements of mixed methods research. We identified 23 mixed methods studies published by authors from five different countries reported between 2004 and 2015. These 23 articles examined various topics that included tobacco use behavior, tobacco marketing and branding, and cessation among youth and young adults. The most common mixed methods approach involved variations of the concurrent design in which the qualitative and quantitative strands were administered at the same time and given equal priority. This review documented several innovative applications of mixed methods research as well as challenges in the reporting of the complex research designs. The use of mixed methods research in tobacco control has great potential for advancing the understanding of complex behavioral and sociocultural issues for all groups, especially youth and young adults.

  13. Mixed methods research in tobacco control with youth and young adults: A methodological review of current strategies.

    Science.gov (United States)

    Fryer, Craig S; Seaman, Elizabeth L; Clark, Rachael S; Plano Clark, Vicki L

    2017-01-01

    Tobacco use among young people is a complex and serious global dilemma that demands innovative and diverse research approaches. The purpose of this methodological review was to examine the current use of mixed methods research in tobacco control with youth and young adult populations and to develop practical recommendations for tobacco control researchers interested in this methodology. Using PubMed, we searched five peer-reviewed journals that publish tobacco control empirical literature for the use of mixed methods research to study young populations, age 12-25 years. Our team analyzed the features of each article in terms of tobacco control topic, population, youth engagement strategies, and several essential elements of mixed methods research. We identified 23 mixed methods studies published by authors from five different countries reported between 2004 and 2015. These 23 articles examined various topics that included tobacco use behavior, tobacco marketing and branding, and cessation among youth and young adults. The most common mixed methods approach involved variations of the concurrent design in which the qualitative and quantitative strands were administered at the same time and given equal priority. This review documented several innovative applications of mixed methods research as well as challenges in the reporting of the complex research designs. The use of mixed methods research in tobacco control has great potential for advancing the understanding of complex behavioral and sociocultural issues for all groups, especially youth and young adults.

  14. The Technique to Prevent the Data Leakage using Covert Channels

    Directory of Open Access Journals (Sweden)

    Anna Vasilievna Arkhangelskaya

    2013-12-01

    Full Text Available The purpose of the article is to analyze a technique for preventing information leakage through covert channels. The main steps are covert channel identification, data throughput estimation, elimination or limitation of the channel, and audit and detection. Three identification schemes are analyzed: the shared resources methodology, the covert flow tree method, and the message sequence diagram method. Ways of guaranteeing information delivery from systems with a low security level to systems with a high security level have also been investigated.
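    The shared-resources identification idea mentioned above can be sketched as a toy analysis: a potential covert channel exists when some shared attribute can be modified by a higher-security-level subject and read by a lower-level one. The subjects, security levels and access matrix below are invented examples, not the article's analysis.

```python
# Toy sketch of shared-resource covert channel identification: flag a
# potential channel when a shared attribute can be Modified ('M') by a
# high-level subject and Referenced ('R') by a lower-level one. All
# subjects, levels and access entries are invented placeholders.

def find_covert_channels(access, levels):
    """access: {attribute: {subject: set of 'R'/'M' operations}}."""
    channels = []
    for attribute, subjects in access.items():
        for writer, w_ops in subjects.items():
            for reader, r_ops in subjects.items():
                if ("M" in w_ops and "R" in r_ops
                        and levels[writer] > levels[reader]):
                    channels.append((attribute, writer, reader))
    return channels

access = {
    "file_lock":  {"proc_high": {"M"}, "proc_low": {"R"}},
    "disk_quota": {"proc_high": {"R"}, "proc_low": {"R", "M"}},
}
levels = {"proc_high": 1, "proc_low": 0}
print(find_covert_channels(access, levels))
# → [('file_lock', 'proc_high', 'proc_low')]
```

    Only the lock attribute is flagged: the high-level process can toggle it and the low-level process can observe it, creating a downward signaling path; the quota attribute is written only by the low-level process and so leaks nothing downward.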

  15. First principles calculations using density matrix divide-and-conquer within the SIESTA methodology

    International Nuclear Information System (INIS)

    Cankurtaran, B O; Gale, J D; Ford, M J

    2008-01-01

    The density matrix divide-and-conquer technique for the solution of Kohn-Sham density functional theory has been implemented within the framework of the SIESTA methodology. Implementation details are provided, with a focus on the scaling of the computation time and memory use in both the serial and parallel versions. We demonstrate the linear-scaling capabilities of the technique by providing ground state calculations of moderately large insulating, semiconducting and (near-)metallic systems. This linear-scaling technique has made it feasible to calculate the ground state properties of quantum systems consisting of tens of thousands of atoms with relatively modest computing resources. A comparison with the existing order-N functional minimization (Kim-Mauri-Galli) method is made for the insulating and semiconducting systems

  16. Methodology for the U.S. Food and Drug Administration's radionuclides in foods program

    International Nuclear Information System (INIS)

    Baratta, E.J.

    1998-01-01

    The U.S. Food and Drug Administration (FDA) is responsible for the wholesomeness of the nation's food supply. The FDA modified its food monitoring program in January 1973 to include radioactive isotopes. The methods used to analyze these food products are taken from standard-setting societies such as AOAC International, the American Society for Testing and Materials, and the American Public Health Association's Standard Methods. In addition, methods not tested by these societies are taken from the literature or from Department of Energy manuals, such as those of the Health and Safety Laboratory, and from Environmental Protection Agency, Public Health Service, and Food and Agriculture Organization manuals. These include methods for long-lived radionuclides such as tritium, strontium-90, cesium-137 and plutonium; short-lived radionuclides such as iodine-131, radiocesium, radiocerium and radioruthenium; and naturally occurring radionuclides such as radium and uranium isotopes. The activity concentrations of gamma-emitters such as radiocesium, iodine-131 and radioruthenium are determined by gamma-ray spectrometry, using intrinsic germanium detectors with the appropriate hardware and software. The alpha- and 'pure' beta-emitters are determined by various radiochemical methods and techniques. The radiochemical methodology and equipment used in analyzing these radionuclides are described and discussed, and the methodology and equipment for the gamma-emitters are described in more detail. The limits of detection of the methods used are also discussed. (author)
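    The gamma-spectrometry determination mentioned above rests on a standard counting relation (not specific to the FDA program): activity equals net peak counts divided by detection efficiency, gamma emission probability, and counting live time. All numeric values below are invented for illustration.

```python
# Standard gamma-spectrometry counting relation (generic, not taken
# from the FDA program description): activity A (Bq) =
# net peak counts / (efficiency * emission probability * live time).
# Dividing by the sample mass gives a concentration in Bq/kg.
# All numbers below are invented illustrative values.

def activity_bq_per_kg(net_counts, efficiency, emission_prob,
                       live_time_s, sample_mass_kg):
    activity_bq = net_counts / (efficiency * emission_prob * live_time_s)
    return activity_bq / sample_mass_kg

# Hypothetical Cs-137 peak (661.7 keV, emission probability ~0.851):
# 5000 net counts, 2% detector efficiency, 1 h live time, 1 kg sample.
conc = activity_bq_per_kg(5000, 0.02, 0.851, 3600.0, 1.0)
print(round(conc, 1))  # → 81.6  (Bq/kg)
```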

  17. C-E setpoint methodology. C-E local power density and DNB LSSS and LCO setpoint methodology for analog protection systems

    International Nuclear Information System (INIS)

    1976-04-01

    A description is presented of the methodology presently in use by Combustion Engineering to calculate Limiting Safety System Setting (LSSS) for the Local Power Density and Thermal Margin Trip Systems and Limiting Conditions for Operation (LCO) to assure that the specified acceptable fuel design limits are not exceeded during the design basis anticipated operational occurrences. The C-E Nuclear Steam Supply Systems for which the report is applicable are those incorporating the analog reactor protection system and licensed under the requirements of 10CFR50, Appendix A. The design basis events to be accommodated by the subject LSSS and LCO are discussed, and the methods to assure the required protection system response and initial required margin are described. The calculational techniques used to represent the specified acceptable fuel design limits in terms of monitored reactor parameters are provided. Using the resultant limits as a base, the methodology to synthesize the subject LSSS and LCO in terms of the parameters processed by the protection and monitoring systems is described

  18. Diuresis renography in children: methodological aspects; Nephrogramme isotopique avec epreuve d'hyperdiurese chez l'enfant: aspects methodologiques

    Energy Technology Data Exchange (ETDEWEB)

    Bonnin, F.; Le Stanc, E. [Hopital Beaujon, 92 - Clichy (France); Busquet, G.; Saidi, L. [Hopital Mignot, 78 - Versailles (France); Lyonnet, F. [Hopital Lapeyronie, 34 -Montpellier (France)

    1995-12-31

    In paediatrics, diuresis renography is used as a method to guide clinical management of hydronephrosis or hydro-uretero-nephrosis. Various pitfalls in the technique and other errors exist and may lead to a misinterpretation of the test. The methodology for performing and interpreting the diuresis renography is discussed. (authors). 12 refs., 4 figs.

  19. An efficient preconditioning technique using Krylov subspace methods for 3D characteristics solvers

    International Nuclear Information System (INIS)

    Dahmani, M.; Le Tellier, R.; Roy, R.; Hebert, A.

    2005-01-01

    The Generalized Minimal RESidual (GMRES) method, using a Krylov subspace projection, is adapted and implemented to accelerate a 3D iterative transport solver based on the method of characteristics. Another acceleration technique, called the self-collision rebalancing (SCR) technique, can also be used to accelerate the solution or as a left preconditioner for GMRES. The GMRES method is usually used to solve a linear algebraic system (Ax=b); it uses the Krylov subspace K(r(0), A) as the projection subspace and AK(r(0), A) for the orthogonalization of the residual. This paper compares the performance of these two combined methods on various problems. To implement the GMRES iterative method, the characteristics equations are derived in linear algebra formalism by using the equivalence between the method of characteristics and the method of collision probability, ending up with a linear algebraic system involving fluxes and currents. Numerical results show good performance of the GMRES technique, especially for cases presenting large material heterogeneity with a scattering ratio close to 1. Similarly, the SCR preconditioning slightly increases the GMRES efficiency
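    The GMRES projection idea can be illustrated on a tiny dense system: search for the iterate in x0 + span{r0, A·r0} (the Krylov subspace K2(r0, A)) that minimizes the residual norm. A production GMRES uses Arnoldi orthogonalization and Givens rotations; this two-vector least-squares version via normal equations is only a sketch on an invented 2x2 example.

```python
# Minimal illustration of the GMRES projection idea: minimize
# ||b - A x|| over x in x0 + K_2(r0, A), where K_2 = span{r0, A r0}.
# For a nonsingular 2x2 system with independent Krylov vectors this
# recovers the exact solution. Real GMRES implementations use Arnoldi
# orthogonalization instead of the raw normal equations shown here.

def matvec(a_mat, v):
    return [sum(a_mat[i][j] * v[j] for j in range(len(v)))
            for i in range(len(v))]

def gmres_step2(a_mat, b, x0):
    r0 = [bi - axi for bi, axi in zip(b, matvec(a_mat, x0))]
    basis = [r0, matvec(a_mat, r0)]          # Krylov basis of K_2
    w = [matvec(a_mat, v) for v in basis]    # images A*v of basis vectors
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    # Solve the 2x2 normal equations (W^T W) y = W^T r0 by Cramer's rule.
    g00, g01, g11 = dot(w[0], w[0]), dot(w[0], w[1]), dot(w[1], w[1])
    f0, f1 = dot(w[0], r0), dot(w[1], r0)
    det = g00 * g11 - g01 * g01
    y0 = (f0 * g11 - f1 * g01) / det
    y1 = (g00 * f1 - g01 * f0) / det
    return [x0[i] + y0 * basis[0][i] + y1 * basis[1][i]
            for i in range(len(x0))]

a_mat = [[4.0, 1.0], [1.0, 3.0]]
x = gmres_step2(a_mat, b=[1.0, 2.0], x0=[0.0, 0.0])
print([round(v, 6) for v in x])  # exact solution [1/11, 7/11]
```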

  20. Which Methodology Works Better? English Language Teachers' Awareness of the Innovative Language Learning Methodologies

    Science.gov (United States)

    Kurt, Mustafa

    2015-01-01

    The present study investigated whether English language teachers were aware of the innovative language learning methodologies in language learning, how they made use of these methodologies and the learners' reactions to them. The descriptive survey method was employed to disclose the frequencies and percentages of 175 English language teachers'…

  1. Sleeve Push Technique: A Novel Method of Space Gaining.

    Science.gov (United States)

    Verma, Sanjeev; Bhupali, Nameksh Raj; Gupta, Deepak Kumar; Singh, Sombir; Singh, Satinder Pal

    2018-01-01

    Space gaining is frequently required in orthodontics. Multiple loops were initially used for space gaining and alignment. The most commonly used mechanics for space gaining involve nickel-titanium open coil springs. The disadvantage of nickel-titanium coil springs is that they cannot be used until the arches are well aligned to receive the stiffer stainless steel wires. Therefore, a new method of gaining space during initial alignment and leveling has been developed, named the sleeve push technique (SPT). The 0.012-inch and 0.014-inch nickel-titanium wires, along with an archwire sleeve (protective tubing), can be used in a modified way to gain space during alignment. This method helps in gaining space from day 1 of treatment. The archwire sleeve and nickel-titanium wire in this new SPT act as a mutually synergistic combination and provide the orthodontist with a completely new technique for space opening.

  2. Sleeve push technique: A novel method of space gaining

    Directory of Open Access Journals (Sweden)

    Sanjeev Verma

    2018-01-01

    Full Text Available Space gaining is frequently required in orthodontics. Multiple loops were initially used for space gaining and alignment. The most commonly used mechanics for space gaining involve nickel–titanium open coil springs. The disadvantage of nickel–titanium coil springs is that they cannot be used until the arches are well aligned to receive the stiffer stainless steel wires. Therefore, a new method of gaining space during initial alignment and leveling has been developed, named the sleeve push technique (SPT). The 0.012-inch and 0.014-inch nickel–titanium wires, along with an archwire sleeve (protective tubing), can be used in a modified way to gain space during alignment. This method helps in gaining space from day 1 of treatment. The archwire sleeve and nickel–titanium wire in this new SPT act as a mutually synergistic combination and provide the orthodontist with a completely new technique for space opening.

  3. Advanced methodology for generation expansion planning including interconnected systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, M; Yokoyama, R; Yasuda, K [Tokyo Metropolitan Univ. (Japan); Sasaki, H [Hiroshima Univ. (Japan); Ogimoto, K [Electric Power Development Co. Ltd., Tokyo (Japan)

    1994-12-31

    This paper reviews advanced methodology for generation expansion planning, including interconnected systems, developed in Japan, with a focus on flexibility and efficiency in practical applications. First, criteria for evaluating the flexibility of generation planning under uncertainty are introduced. Secondly, the flexible generation mix problem is formulated as a multi-objective optimization with more than two objective functions. The multi-objective optimization problem is then transformed into a single-objective problem by the weighting method to obtain a Pareto optimal solution, and solved by a dynamic programming technique. Thirdly, a new approach for the generation expansion planning of interconnected systems is presented, based on the Benders decomposition technique: the large-scale generation problem is decomposed into a master problem, constituted by the general economic load dispatch problem, and several subproblems composed of smaller-scale isolated-system generation expansion plans. Finally, a generation expansion plan solved by an artificial neural network is presented. In conclusion, the advantages and disadvantages of these methods from the viewpoint of flexibility and applicability to practical generation expansion planning are presented. (author) 29 refs., 10 figs., 4 tabs.
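    The weighting method mentioned above scalarizes multiple objectives into a single weighted sum and sweeps the weight to trace out Pareto-optimal candidates. The two toy objectives below (an expansion-cost proxy and an inflexibility-penalty proxy over the share x of one technology) are invented for illustration, not the paper's model.

```python
# Sketch of the weighting method for multi-objective optimization:
# minimize w*f1(x) + (1-w)*f2(x) for a sweep of weights w, collecting
# one Pareto-optimal candidate per weight. The quadratic objectives
# below are invented placeholders for cost and inflexibility.

def cost(x):            # hypothetical expansion-cost objective
    return (x - 0.2) ** 2

def inflexibility(x):   # hypothetical flexibility-risk objective
    return (x - 0.8) ** 2

def weighting_method(weights, grid_n=1000):
    candidates = [i / grid_n for i in range(grid_n + 1)]
    pareto = []
    for w in weights:
        best = min(candidates,
                   key=lambda x: w * cost(x) + (1 - w) * inflexibility(x))
        pareto.append((w, round(best, 3)))
    return pareto

# Sweeping w from 0 to 1 moves the optimum from the flexibility-
# preferred mix (x = 0.8) to the cost-preferred mix (x = 0.2).
print(weighting_method([0.0, 0.25, 0.5, 0.75, 1.0]))
```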

  4. Method for statistical data analysis of multivariate observations

    CERN Document Server

    Gnanadesikan, R

    1997-01-01

    A practical guide for multivariate statistical techniques-- now updated and revised In recent years, innovations in computer technology and statistical methodologies have dramatically altered the landscape of multivariate data analysis. This new edition of Methods for Statistical Data Analysis of Multivariate Observations explores current multivariate concepts and techniques while retaining the same practical focus of its predecessor. It integrates methods and data-based interpretations relevant to multivariate analysis in a way that addresses real-world problems arising in many areas of inte

  5. New Teaching Techniques to Improve Critical Thinking. The Diaprove Methodology

    Science.gov (United States)

    Saiz, Carlos; Rivas, Silvia F.

    2016-01-01

    The objective of this research is to ascertain whether new instructional techniques can improve critical thinking. To achieve this goal, two different instruction techniques (ARDESOS--group 1--and DIAPROVE--group 2--) were studied and a pre-post assessment of critical thinking in various dimensions such as argumentation, inductive reasoning,…

  6. Data Collection and Analysis Techniques for Evaluating the Perceptual Qualities of Auditory Stimuli

    Energy Technology Data Exchange (ETDEWEB)

    Bonebright, T.L.; Caudell, T.P.; Goldsmith, T.E.; Miner, N.E.

    1998-11-17

    This paper describes a general methodological framework for evaluating the perceptual properties of auditory stimuli. The framework provides analysis techniques that can ensure the effective use of sound for a variety of applications including virtual reality and data sonification systems. Specifically, we discuss data collection techniques for the perceptual qualities of single auditory stimuli including identification tasks, context-based ratings, and attribute ratings. In addition, we present methods for comparing auditory stimuli, such as discrimination tasks, similarity ratings, and sorting tasks. Finally, we discuss statistical techniques that focus on the perceptual relations among stimuli, such as Multidimensional Scaling (MDS) and Pathfinder Analysis. These methods are presented as a starting point for an organized and systematic approach for non-experts in perceptual experimental methods, rather than as a complete manual for performing the statistical techniques and data collection methods. It is our hope that this paper will help foster further interdisciplinary collaboration among perceptual researchers, designers, engineers, and others in the development of effective auditory displays.

  7. A method for predicting monthly rainfall patterns

    International Nuclear Information System (INIS)

    Njau, E.C.

    1987-11-01

    A brief survey is made of previous methods that have been used to predict rainfall trends or drought spells in different parts of the earth. The basic methodologies or theoretical strategies used in these methods are compared with contents of a recent theory of Sun-Weather/Climate links (Njau, 1985a; 1985b; 1986; 1987a; 1987b; 1987c) which point towards the possibility of practical climatic predictions. It is shown that not only is the theoretical basis of each of these methodologies or strategies fully incorporated into the above-named theory, but also this theory may be used to develop a technique by which future monthly rainfall patterns can be predicted in further and finer details. We describe the latter technique and then illustrate its workability by means of predictions made on monthly rainfall patterns in some East African meteorological stations. (author). 43 refs, 11 figs, 2 tabs

  8. Advanced kinetics for calorimetric techniques and thermal stability screening of sulfide minerals

    International Nuclear Information System (INIS)

    Iliyas, Abduljelil; Hawboldt, Kelly; Khan, Faisal

    2010-01-01

    Thermal methods of analysis such as differential scanning calorimetry (DSC) provide a powerful methodology for the study of solid reactions. This paper proposes an improved thermal analysis methodology for the thermal stability investigation of complex solid-state reactions. The proposed methodology is based on a differential iso-conversional approach and involves peak separation, individual peak analysis and the combination of isothermal and non-isothermal DSC measurements for kinetic analysis and prediction. The proposed thermal analysis, coupled with the Mineral Liberation Analyzer (MLA) technique, was employed to investigate the thermal behavior of sulfide mineral oxidation. The effects of various experimental variables such as particle size, heating rate and atmosphere were investigated and discussed. The information gained from such an advanced thermal analysis method is useful for scale-up processes, with the potential of significant savings in plant operations, as well as for mitigating the adverse environmental and safety issues arising from the handling and storage of sulfide minerals.
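    The differential iso-conversional idea underlying the paper's approach can be sketched as the classic Friedman analysis: at a fixed conversion, ln(dα/dt) is linear in 1/T with slope −Ea/R, so regressing ln(rate) against 1/T across several runs yields the activation energy. The "measurements" below are synthetic, generated from a known Ea and then recovered; no real DSC data are used.

```python
import math

# Sketch of differential (Friedman) iso-conversional analysis: at a
# fixed conversion alpha, ln(rate) = const - Ea/(R*T), so a linear fit
# of ln(rate) vs 1/T gives Ea from the slope. The rates below are
# synthetic, generated from an assumed Ea = 120 kJ/mol.

R_GAS = 8.314  # J/(mol K)

def friedman_ea(temps_k, rates):
    xs = [1.0 / t for t in temps_k]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope * R_GAS  # activation energy Ea in J/mol

ea_true = 120e3
temps = [600.0, 620.0, 640.0, 660.0]  # temperatures at fixed conversion
rates = [1e6 * math.exp(-ea_true / (R_GAS * t)) for t in temps]
print(round(friedman_ea(temps, rates) / 1000.0))  # → 120  (kJ/mol)
```

    Because the method needs no assumed reaction model f(α), it suits the complex, overlapping solid-state reactions the abstract describes.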

  9. Critical infrastructure systems of systems assessment methodology.

    Energy Technology Data Exchange (ETDEWEB)

    Sholander, Peter E.; Darby, John L.; Phelan, James M.; Smith, Bryan; Wyss, Gregory Dane; Walter, Andrew; Varnado, G. Bruce; Depoy, Jennifer Mae

    2006-10-01

    Assessing the risk of malevolent attacks against large-scale critical infrastructures requires modifications to existing methodologies that separately consider physical security and cyber security. This research has developed a risk assessment methodology that explicitly accounts for both physical and cyber security, while preserving the traditional security paradigm of detect, delay, and respond. This methodology also accounts for the condition that a facility may be able to recover from or mitigate the impact of a successful attack before serious consequences occur. The methodology uses evidence-based techniques (which are a generalization of probability theory) to evaluate the security posture of the cyber protection systems. Cyber threats are compared against cyber security posture using a category-based approach nested within a path-based analysis to determine the most vulnerable cyber attack path. The methodology summarizes the impact of a blended cyber/physical adversary attack in a conditional risk estimate where the consequence term is scaled by a "willingness to pay" avoidance approach.

  10. Mixed-Methods Research Methodologies

    Science.gov (United States)

    Terrell, Steven R.

    2012-01-01

    Mixed-Method studies have emerged from the paradigm wars between qualitative and quantitative research approaches to become a widely used mode of inquiry. Depending on choices made across four dimensions, mixed-methods can provide an investigator with many design choices which involve a range of sequential and concurrent strategies. Defining…

  11. A Model of Risk Analysis in Analytical Methodology for Biopharmaceutical Quality Control.

    Science.gov (United States)

    Andrade, Cleyton Lage; Herrera, Miguel Angel De La O; Lemes, Elezer Monte Blanco

    2018-01-01

    One key quality control parameter for biopharmaceutical products is the analysis of residual cellular DNA. To determine small amounts of DNA (around 100 pg) that may be in a biologically derived drug substance, an analytical method should be sensitive, robust, reliable, and accurate. In principle, three techniques have the ability to measure residual cellular DNA: radioactive dot-blot, a type of hybridization; threshold analysis; and quantitative polymerase chain reaction. Quality risk management is a systematic process for the evaluation, control, and reporting of risks that may affect method capabilities, and it supports a scientific and practical approach to decision making. This paper evaluates, by quality risk management, an alternative approach to assessing the performance risks associated with quality control methods used with biopharmaceuticals, using the hazard analysis and critical control points tool. This tool makes it possible to find the steps in an analytical procedure with the highest impact on method performance. By applying these principles to DNA analysis methods, we conclude that the radioactive dot-blot assay has the largest number of critical control points, followed by quantitative polymerase chain reaction, and threshold analysis. From the analysis of hazards (i.e., points of method failure) and the associated method procedure critical control points, we conclude that the analytical methodology with the lowest risk for performance failure for residual cellular DNA testing is quantitative polymerase chain reaction. LAY ABSTRACT: In order to mitigate the risk of adverse events caused by residual cellular DNA that is not completely cleared from downstream production processes, regulatory agencies have required the industry to guarantee a very low level of DNA in biologically derived pharmaceutical products. The technique historically used was radioactive blot hybridization. However, the technique is a challenging method to implement in a quality

  12. Mass Movement Hazards in the Mediterranean; A review on applied techniques and methodologies

    Science.gov (United States)

    Ziade, R.; Abdallah, C.; Baghdadi, N.

    2012-04-01

    Growing populations and the expansion of settlements and life-lines over hazardous areas in the Mediterranean region have greatly increased the impact of Mass Movements (MM) in both industrialized and developing countries. This trend is expected to continue in the next decades due to increased urbanization and development, continued deforestation and increased regional precipitation in MM-prone areas resulting from changing climatic patterns. Consequently, over the past few years, monitoring of MM has acquired great importance for the scientific community as well as for civil society. This article begins with a discussion of MM classification and of the different topographic, geologic, hydrologic and environmental impacting factors. The intrinsic (preconditioning) variables determine the susceptibility to MM, while extrinsic (triggering) factors determine the probability of MM occurrence. The evolution of slope instability studies is charted from geodetic or observational techniques, through geotechnical field-based approaches, to recent higher levels of data acquisition through Remote Sensing (RS) and Geographic Information System (GIS) techniques. Since MM detection and zoning is difficult in remote areas, RS and GIS have enabled regional studies to predominate over site-based ones, as they provide multi-temporal images and hence greatly facilitate MM monitoring. The unusually broad spectrum of MM makes it difficult to define a single methodology for establishing MM hazard. Since the probability of occurrence of MM is one of the key components in making rational decisions for the management of MM risk, scientists and engineers have developed physical parameters, equations and environmental process models that can be used as assessment tools for management, education, planning and legislative purposes. Assessment of MM is attained through various modeling approaches mainly divided into three main sections: quantitative/Heuristic (1:2.000-1:10.000), semi-quantitative/Statistical (1
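    The susceptibility mapping idea in this record, combining preconditioning factors into one index per map unit, is often implemented as a heuristic weighted overlay. A minimal sketch, in which the factor list, weights and 0-1 factor scores are all assumed values for illustration:

```python
# Minimal weighted-overlay sketch of a mass-movement susceptibility index:
# each grid cell scores its preconditioning factors on a 0-1 scale, and a
# weighted sum yields the susceptibility index. Weights and factor scores
# are hypothetical, not calibrated values.
FACTOR_WEIGHTS = {"slope": 0.4, "lithology": 0.3, "land_cover": 0.2, "drainage": 0.1}

def susceptibility(cell):
    """Weighted sum of normalized factor scores for one grid cell."""
    return sum(w * cell[f] for f, w in FACTOR_WEIGHTS.items())

steep_weak_rock = {"slope": 0.9, "lithology": 0.8, "land_cover": 0.6, "drainage": 0.7}
gentle_forested = {"slope": 0.1, "lithology": 0.3, "land_cover": 0.1, "drainage": 0.2}

print(susceptibility(steep_weak_rock))  # higher index -> more susceptible
print(susceptibility(gentle_forested))
```

    In a GIS workflow the same weighted sum is evaluated per raster cell; statistical approaches instead fit the weights to an inventory of past events.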

  13. Web survey methodology

    CERN Document Server

    Callegaro, Mario; Vehovar, Asja

    2015-01-01

    Web Survey Methodology guides the reader through the past fifteen years of research in web survey methodology. It both provides practical guidance on the latest techniques for collecting valid and reliable data and offers a comprehensive overview of research issues. Core topics from preparation to questionnaire design, recruitment testing to analysis and survey software are all covered in a systematic and insightful way. The reader will be exposed to key concepts and key findings in the literature, covering measurement, non-response, adjustments, paradata, and cost issues. The book also discusses the hottest research topics in survey research today, such as internet panels, virtual interviewing, mobile surveys and the integration with passive measurements, e-social sciences, mixed modes and business intelligence. The book is intended for students, practitioners, and researchers in fields such as survey and market research, psychological research, official statistics and customer satisfaction research.

  14. Methodologies for optimizing ROP detector layout for CANDU (registered) reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kastanya, Doddy, E-mail: kastanyd@aecl.c [Reactor Core Physics Branch, Atomic Energy of Canada Limited, 2251 Speakman Drive, Mississauga, ON, L5K 1B2 (Canada); Caxaj, Victor [Reactor Core Physics Branch, Atomic Energy of Canada Limited, 2251 Speakman Drive, Mississauga, ON, L5K 1B2 (Canada)

    2011-01-15

    The regional overpower protection (ROP) systems protect CANDU (registered) reactors against overpower in the fuel that would reduce the safety margin-to-dryout. Either a localized power peak within the core (for example, as a result of a certain reactivity device configuration) or a general increase in the core power level during a slow-loss-of-regulation (SLOR) event could cause overpower in the fuel. This overpower could lead to fuel sheath dryout. In the CANDU (registered) 600 MW (CANDU 6) design, there are two ROP systems in the core, one for each of the two fast-acting shutdown systems. Each ROP system includes a number of fast-responding, self-powered flux detectors suitably distributed throughout the core within vertical and horizontal assemblies. Traditionally, the placement of these detectors was done using a method called detector layout optimization (DLO). A new methodology for designing the detector layout for the ROP system has recently been developed. The new method, called the DETPLASA algorithm, utilizes the simulated annealing (SA) technique to optimize the placement of the detectors in the core. Both methodologies are discussed in detail in this paper. Numerical examples are employed to illustrate how each method works. Results from sensitivity studies on three SA parameters are also presented.
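    The simulated-annealing idea behind a DETPLASA-style search can be sketched on a toy problem: pick k detector sites from a set of candidate positions so that no region of a 1-D "core axis" is left uncovered. The objective below (minimizing the largest gap between adjacent detectors) is a stand-in for the actual ROP margin calculation, and all parameters are invented.

```python
import math
import random

# Toy simulated-annealing sketch in the spirit of DETPLASA: choose k detector
# sites out of n candidate axial positions so as to minimize the largest gap
# between adjacent detectors along a 1-D core axis of length 100.
def largest_gap(sites, length=100.0):
    pts = sorted(sites)
    gaps = [pts[0]] + [b - a for a, b in zip(pts, pts[1:])] + [length - pts[-1]]
    return max(gaps)

def anneal(candidates, k, t0=10.0, cooling=0.995, steps=4000, seed=1):
    rng = random.Random(seed)
    current = rng.sample(candidates, k)
    best = list(current)
    t = t0
    for _ in range(steps):
        # neighbor move: swap one chosen site for an unused candidate
        neighbor = list(current)
        unused = [c for c in candidates if c not in current]
        neighbor[rng.randrange(k)] = rng.choice(unused)
        delta = largest_gap(neighbor) - largest_gap(current)
        # Metropolis acceptance: always take improvements, sometimes take worse moves
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = neighbor
            if largest_gap(current) < largest_gap(best):
                best = list(current)
        t *= cooling
    return best

candidates = [i * 2.5 for i in range(1, 40)]  # candidate axial positions
layout = anneal(candidates, k=8)
print(sorted(layout), largest_gap(layout))
```

    The real problem is three-dimensional and the objective involves trip-setpoint margins rather than geometric gaps, but the move/accept/cool loop has this shape.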

  15. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared side by side using data from two major customer service research projects. A central concern is what conclusions, if any, might differ due solely to the analysis...
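    The two protocols named in this record can be sketched side by side on made-up 5-point ratings: gap analysis scores performance minus importance per attribute, while IP analysis places each attribute in a quadrant relative to the scale midpoints. Attributes, ratings, and the midpoint are illustrative assumptions.

```python
# Sketch of the two protocols compared in the record, on invented 5-point
# ratings: gap analysis (GA) scores performance minus importance per
# attribute; importance-performance (IP) analysis places each attribute in
# a quadrant relative to the scale midpoints.
ratings = {  # attribute: (importance, performance), hypothetical data
    "restroom cleanliness": (4.6, 3.1),
    "trail signage": (4.2, 4.4),
    "parking availability": (3.0, 2.5),
}

def gap_scores(data):
    return {a: p - i for a, (i, p) in data.items()}

def ip_quadrant(importance, performance, midpoint=3.0):
    hi_i, hi_p = importance >= midpoint, performance >= midpoint
    if hi_i and not hi_p:
        return "concentrate here"
    if hi_i and hi_p:
        return "keep up the good work"
    if hi_p:
        return "possible overkill"
    return "low priority"

for attr, (imp, perf) in ratings.items():
    print(attr, round(gap_scores(ratings)[attr], 2), ip_quadrant(imp, perf))
```

    The comparison in the record turns on cases like "restroom cleanliness" above: the gap score flags it (large negative gap) while the IP quadrant does not, since its performance still sits above the midpoint.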

  16. A methodology for the extraction of quantitative information from electron microscopy images at the atomic level

    International Nuclear Information System (INIS)

    Galindo, P L; Pizarro, J; Guerrero, E; Guerrero-Lebrero, M P; Scavello, G; Yáñez, A; Sales, D L; Herrera, M; Molina, S I; Núñez-Moraleda, B M; Maestre, J M

    2014-01-01

    In this paper we describe a methodology developed at the University of Cadiz (Spain) in the past few years for the extraction of quantitative information from electron microscopy images at the atomic level. This work is based on a coordinated and synergic activity of several research groups that have been working together over the last decade in two different and complementary fields: Materials Science and Computer Science. The aim of our joint research has been to develop innovative high-performance computing techniques and simulation methods in order to address computationally challenging problems in the analysis, modelling and simulation of materials at the atomic scale, providing significant advances with respect to existing techniques. The methodology involves several fundamental areas of research including the analysis of high resolution electron microscopy images, materials modelling, image simulation and 3D reconstruction using quantitative information from experimental images. These techniques for analysis, modelling and simulation make it possible to optimize the control and functionality of devices developed using the materials under study, and they have been tested using data obtained from experimental samples.

  17. Decision-making methodology for management of hazardous waste

    International Nuclear Information System (INIS)

    Philbin, J.S.; Cranwell, R.M.

    1988-01-01

    A decision-making methodology is presented that combines systems and risk analysis techniques to evaluate hazardous waste management practices associated with DOE weapon production operations. The methodology provides a systematic approach to examining waste generation and waste handling practices in addition to the more visible disposal practices. Release-exposure scenarios for hazardous waste operations are identified and operational risk is determined. Comparisons may be made between existing and alternative waste management practices (and processes) on the basis of overall risk, cost and compliance with regulations. Managers can use this methodology to make and defend resource allocation decisions and to prioritize research needs.
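    The comparison step in this record can be sketched by aggregating operational risk over release-exposure scenarios as frequency times consequence. All scenario numbers below are invented for illustration, and real assessments would also weigh cost and regulatory compliance.

```python
# Sketch of the comparison step: operational risk of a waste-management
# practice aggregated over its release-exposure scenarios as
# frequency (events/yr) * consequence (hypothetical dose units).
# All scenario numbers are invented for illustration.
def operational_risk(scenarios):
    """Sum of frequency * consequence over all scenarios."""
    return sum(freq * cons for freq, cons in scenarios)

existing_practice = [(1e-2, 50.0), (1e-3, 400.0), (1e-5, 5000.0)]
alternative_practice = [(5e-3, 50.0), (1e-3, 150.0), (1e-5, 5000.0)]

print(operational_risk(existing_practice))     # about 0.95
print(operational_risk(alternative_practice))  # about 0.45, the lower-risk option
```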

  18. Determination of radium isotopes in environmental samples by gamma spectrometry, liquid scintillation counting and alpha spectrometry: a review of analytical methodology

    International Nuclear Information System (INIS)

    Jia, Guogang; Jia, Jing

    2012-01-01

    Radium (Ra) isotopes are important from the viewpoints of radiation protection and environmental protection. Their high toxicity has stimulated continuing interest in methodology research for the determination of Ra isotopes in various media. In this paper, the three most routinely used analytical techniques for Ra isotope determination in biological and environmental samples, i.e. low-background γ-spectrometry, liquid scintillation counting and α-spectrometry, were reviewed, with emphasis on new methodological developments in sample preparation, preconcentration, separation, purification, source preparation and measurement techniques. The accuracy, selectivity, traceability, applicability and minimum detectable activity (MDA) of the three techniques were discussed. It was concluded that the MDA (0.1 mBq L −1 ) of the α-spectrometry technique coupled with chemical separation is about two orders of magnitude lower than that of the low-background HPGe γ-spectrometry and LSC techniques. Therefore, when maximum sensitivity is required, the α-spectrometry technique remains the first choice. - Highlights: ► A review is made of the determination of Ra isotopes in environmental samples. ► Gamma spectrometry, LSC and α-spectrometry are the main radiometric approaches concerned. ► Sample preparation, preconcentration, separation and source preparation are discussed. ► The methods can analyse air, water, seawater, soil, sediment and foodstuff samples. ► Some new data recently obtained in our laboratory for Ra method studies are included.
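    The roughly two-orders-of-magnitude MDA advantage quoted for α-spectrometry can be illustrated with a Currie-style detection-limit calculation. The formula is the standard Currie expression for paired counting; all input values (backgrounds, efficiencies, yields, counting times) are assumed typical numbers, not figures from the review.

```python
import math

# Currie-style minimum detectable activity (MDA) sketch for a counting
# measurement: MDA = (2.71 + 4.65 * sqrt(B)) / (eff * yield_ * t * m),
# with B the background counts accumulated in counting time t (s),
# eff the counting efficiency, yield_ the chemical recovery and m the
# sample mass (kg). Input values below are illustrative only.
def mda_bq_per_kg(background_counts, eff, yield_, t_s, m_kg):
    detection_limit_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit_counts / (eff * yield_ * t_s * m_kg)

# alpha spectrometry: very low background, modest efficiency, long count
alpha = mda_bq_per_kg(background_counts=2, eff=0.25, yield_=0.8, t_s=250000, m_kg=1.0)
# HPGe gamma spectrometry: much higher background continuum under the peak
gamma = mda_bq_per_kg(background_counts=4000, eff=0.03, yield_=1.0, t_s=250000, m_kg=1.0)
print(alpha, gamma)  # the alpha MDA comes out about two orders of magnitude lower
```

    The driver is the background term: α-spectrometry's near-zero background keeps the numerator small, which is why it wins whenever maximum sensitivity is required.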

  19. Application of a distributed optical fiber sensing technique in monitoring the stress of precast piles

    International Nuclear Information System (INIS)

    Lu, Y; Shi, B; Wei, G Q; Zhang, D; Chen, S E

    2012-01-01

    Due to its ability to provide long-distance, distributed sensing, the optical fiber sensing technique based on a Brillouin optical time domain reflectometer (BOTDR) has a unique advantage in monitoring the stability and safety of linear structures. This paper describes the application of a BOTDR-based technique to measure the stress within precast piles. The principle behind the BOTDR and the technique for embedding the sensing optical fiber in precast piles are first introduced, and then the analysis method and the deformation and stress calculations based on distributed strain data are given. Finally, a methodology for using a BOTDR-based monitoring workflow for in situ monitoring of precast piles is introduced, together with a practical example. The methodology requires implantation of optical fibers prior to pile placement. Field experimental results show that the optical fiber implantation method with slotting, embedding, pasting and jointing is feasible, and that the axial force, side friction, end-bearing resistance and bearing behaviour of the precast pile can be accurately determined from the strain measurement data. (paper)
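    The strain-evaluation step behind a BOTDR measurement rests on the Brillouin frequency shift varying linearly with fiber strain, so strain follows from the measured shift and the pile's axial force from strain via E·A. The strain coefficient (~0.05 MHz per microstrain near 1550 nm), the unstrained Brillouin frequency, and the pile properties below are typical assumed values, not those of the paper.

```python
# Sketch of the BOTDR strain evaluation: the Brillouin frequency shift is
# linear in fiber strain, so strain follows from the measured shift, and
# the pile's axial force from strain via N = E * A * strain.
# Coefficient and pile properties are typical assumed values.
STRAIN_COEFF_MHZ_PER_USTRAIN = 0.05   # ~0.05 MHz per microstrain near 1550 nm
NU_B0_MHZ = 10_850.0                  # Brillouin frequency of the unstrained fiber

def microstrain(nu_b_mhz):
    """Convert a measured Brillouin frequency (MHz) to microstrain."""
    return (nu_b_mhz - NU_B0_MHZ) / STRAIN_COEFF_MHZ_PER_USTRAIN

def axial_force_kn(eps_micro, e_gpa=35.0, area_m2=0.125):
    """Axial force N = E * A * strain, for an assumed concrete pile section."""
    return e_gpa * 1e6 * area_m2 * eps_micro * 1e-6  # kN

eps = microstrain(10_860.0)  # a 10 MHz shift -> about 200 microstrain
print(eps, axial_force_kn(eps))
```

    Differencing the axial-force profile along the pile then yields the side friction per segment, which is how the distributed measurement recovers the load-transfer behaviour.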

  20. The prosa methodology for scenario development

    International Nuclear Information System (INIS)

    Grupa, J.B.

    2001-01-01

    In this paper a methodology for scenario development is proposed. The method was developed in an effort to convince ourselves (and others) that all conceivable future developments of a waste repository have been covered. To be able to assess all conceivable future developments, the method needs to be comprehensive. To convince ourselves and others, the method should be structured in such a way that the treatment of each conceivable future development is traceable. The methodology is currently being applied to two Dutch disposal designs. Preliminary results show that the elaborated method functions better than the original method. However, some elements of the method will need further refinement. (author)

  1. Teens, Food Choice, and Health: How Can a Multi-Method Research Methodology Enhance the Study of Teen Food Choice and Health Messaging?

    OpenAIRE

    Wiseman, Kelleen

    2011-01-01

    This research report compares alternative approaches to analyzing the complex factors that influence teenagers' food choice. Specifically, a multi-method approach, which involves the integration of qualitative and quantitative research methodologies, data and analysis, is compared to a single methodological approach, which involves the use of either a quantitative or a qualitative methodology.

  2. Application opportunities of agile methodology in service company management

    OpenAIRE

    Barauskienė, Diana

    2017-01-01

    Application Opportunities of Agile Methodology in Service Company Management. The main purpose of this master thesis is to identify which methods (or their modified versions) of the Agile methodology can be applied in service company management. This master thesis consists of the following parts: a scientific literature analysis, the author's research methodology (research methods, the author's research model, essential elements used in the research on the application of the Agile methodology), and the research itself (prelimina...

  3. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E{sup 3}-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
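    At the heart of any multi-objective evolutionary algorithm like the one this record describes is the Pareto dominance test used to filter candidate solutions. A minimal sketch, minimizing three stand-in E3 objectives (energy, economy, ecology) over invented design points:

```python
# Pareto-filtering step at the heart of a multi-objective EA: keep the
# designs not dominated by any other, minimizing all three E3 objectives
# (energy, economy, ecology). Objective vectors are toy values.
def dominates(a, b):
    """a dominates b if a is <= b in every objective and < b in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

designs = [
    (3.0, 9.0, 2.0),  # low energy, high cost
    (5.0, 4.0, 3.0),
    (6.0, 5.0, 4.0),  # dominated by (5.0, 4.0, 3.0)
    (8.0, 2.0, 6.0),  # low cost, high energy
]
front = pareto_front(designs)
print(front)
```

    An EA wraps this filter in a loop of variation and selection; the decomposition and constraint-handling machinery the paper proposes shapes which candidate designs reach this test.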

  4. A methodological approach for designing a usable ontology-based GUI in healthcare.

    Science.gov (United States)

    Lasierra, N; Kushniruk, A; Alesanco, A; Borycki, E; García, J

    2013-01-01

    This paper presents a methodological approach to the design and evaluation of an interface for an ontology-based system used for designing care plans for monitoring patients at home. In order to define the care plans, physicians need a tool for creating instances of the ontology and configuring some rules. Our purpose is to develop an interface that allows clinicians to interact with the ontology. Although ontology-driven applications do not necessarily present the ontology in the user interface, it is our hypothesis that showing selected parts of the ontology in a "usable" way could enhance clinicians' understanding and make the definition of the care plans easier. Based on prototyping and iterative testing, this methodology combines visualization techniques and usability methods. Preliminary results obtained after a formative evaluation indicate the effectiveness of the suggested combination.

  5. Production methodologies of polymeric and hydrogel particles for drug delivery applications.

    Science.gov (United States)

    Lima, Ana Catarina; Sher, Praveen; Mano, João F

    2012-02-01

    Polymeric particles are ideal vehicles for controlled delivery applications due to their ability to encapsulate a variety of substances, namely low- and high-molecular mass therapeutics, antigens or DNA. Micro- and nanoscale spherical materials have been developed as carriers for therapies, using appropriate methodologies, in order to achieve prolonged and controlled drug administration. This paper reviews the methodologies used for the production of polymeric micro/nanoparticles. Emulsions, phase separation, spray drying, ionic gelation, polyelectrolyte complexation and supercritical fluid precipitation are all widely used processes for polymeric micro/nanoencapsulation. This paper also discusses the recent developments and patents reported in this field. Other less conventional methodologies are also described, such as the use of superhydrophobic substrates to produce hydrogel and polymeric particulate biomaterials. Polymeric drug delivery systems have gained increased importance due to the need to improve the efficiency and versatility of existing therapies. This allows the development of innovative concepts that could create more efficient systems, which in turn may address many healthcare needs worldwide. The existing methods of producing polymeric release systems have some critical drawbacks, which compromise the efficiency of these techniques. Improvements and the development of new methodologies could be achieved by using multidisciplinary approaches and tools taken from other subjects, including nanotechnologies, biomimetics, tissue engineering, polymer science or microfluidics.

  6. METHODOLOGICAL APPROACHES AND CHALLENGES IN ASSESSING THE VALUE OF INTELLECTUAL CAPITAL

    Directory of Open Access Journals (Sweden)

    E. D. Katulskij

    2016-01-01

    Full Text Available The knowledge resources present in an enterprise determine its capacity for sustainable and competitive development. The set of knowledge, skills and abilities possessed by operational and management personnel, including those transformed into intangible and other assets, is considered the intellectual capital of the enterprise. Empirically, the presence of intellectual capital can be identified by the enterprise's success in the market and its ability to generate a product with high value added. However, from a scientific and methodological point of view, approaches to assessing intellectual capital are currently not standardized and do not provide an objective valuation of this capital. This paper presents an overview of methodological approaches to the valuation of the intellectual capital of companies and discusses the problems of using these approaches in analytical procedures. On this basis, a conclusion is drawn about the need for further development of methods for evaluating the intellectual capital of enterprises. Purpose/goal. The purpose of this article is to study the specifics of the basic methodological approaches to the valuation of the intellectual capital of enterprises. The main tasks include an analysis of the techniques most frequently used in Russian and international practice for assessing intellectual capital. Methodology. The article is a content analysis of theoretical and methodological positions describing the key and most frequently used Russian and international approaches to the evaluation of the intellectual capital of enterprises. Conclusions/relevance. The practical significance of this paper lies in identifying the main issues that arise in the evaluation of the intellectual capital of enterprises, which determines the need for further scientific development and complementing of the evaluation methods currently in use.

  7. WE-B-BRC-01: Current Methodologies in Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rath, F. [University of Wisconsin Madison (United States)

    2016-06-15

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry Discussion of radiation oncology
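    The FMEA step this session describes is usually scored by rating each failure mode for severity, occurrence, and (lack of) detectability, then ranking by the risk priority number RPN = S x O x D. The failure modes and ratings below are hypothetical examples, not TG-100 data.

```python
# Sketch of the FMEA scoring step: each failure mode is rated 1-10 for
# severity (S), occurrence (O) and lack of detectability (D), and ranked
# by risk priority number RPN = S * O * D. Modes and ratings are
# hypothetical examples, not TG-100 data.
failure_modes = [
    ("wrong patient plan selected", 9, 2, 5),
    ("MLC calibration drift", 6, 4, 3),
    ("incorrect CT density table", 8, 2, 7),
]

def rpn(severity, occurrence, detectability):
    return severity * occurrence * detectability

ranked = sorted(failure_modes, key=lambda m: rpn(*m[1:]), reverse=True)
for name, s, o, d in ranked:
    print(name, rpn(s, o, d))
```

    The highest-RPN modes become the focus of mitigation; fault tree analysis then traces how those modes can arise, which is the FMEA/FTA pairing described above.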

  8. WE-B-BRC-01: Current Methodologies in Risk Assessment

    International Nuclear Information System (INIS)

    Rath, F.

    2016-01-01

    Prospective quality management techniques, long used by engineering and industry, have become a growing aspect of efforts to improve quality management and safety in healthcare. These techniques are of particular interest to medical physics as scope and complexity of clinical practice continue to grow, thus making the prescriptive methods we have used harder to apply and potentially less effective for our interconnected and highly complex healthcare enterprise, especially in imaging and radiation oncology. An essential part of most prospective methods is the need to assess the various risks associated with problems, failures, errors, and design flaws in our systems. We therefore begin with an overview of risk assessment methodologies used in healthcare and industry and discuss their strengths and weaknesses. The rationale for use of process mapping, failure modes and effects analysis (FMEA) and fault tree analysis (FTA) by TG-100 will be described, as well as suggestions for the way forward. This is followed by discussion of radiation oncology specific risk assessment strategies and issues, including the TG-100 effort to evaluate IMRT and other ways to think about risk in the context of radiotherapy. Incident learning systems, local as well as the ASTRO/AAPM ROILS system, can also be useful in the risk assessment process. Finally, risk in the context of medical imaging will be discussed. Radiation (and other) safety considerations, as well as lack of quality and certainty all contribute to the potential risks associated with suboptimal imaging. The goal of this session is to summarize a wide variety of risk analysis methods and issues to give the medical physicist access to tools which can better define risks (and their importance) which we work to mitigate with both prescriptive and prospective risk-based quality management methods. Learning Objectives: Description of risk assessment methodologies used in healthcare and industry Discussion of radiation oncology

  9. The methods for generating tomographic images using transmission, emission and nuclear magnetic resonance techniques. II. Fourier method and iterative methods

    International Nuclear Information System (INIS)

    Ursu, I.; Demco, D.E.; Gligor, T.D.; Pop, G.; Dollinger, R.

    1987-01-01

    In a wide variety of applications it is necessary to infer the structure of a multidimensional object from a set of its projections. Computed tomography is at present widely used in the medical field, but its industrial applications may ultimately far exceed the medical ones. Two techniques for reconstructing objects from their projections are presented: Fourier methods and iterative techniques. The paper also contains a brief comparative study of the reconstruction algorithms. (authors)
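    The iterative family of reconstruction techniques can be sketched with the algebraic reconstruction technique (ART, the Kaczmarz method): each projection equation is enforced in turn by projecting the current image estimate onto its hyperplane. The 2x2 "image" and its row, column and diagonal sums below stand in for real projection data.

```python
import numpy as np

# Minimal sketch of iterative reconstruction (ART / Kaczmarz): each
# projection equation a_i . x = b_i is enforced in turn by projecting the
# current estimate onto its hyperplane.
def art(A, b, n_sweeps=200, relax=1.0):
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for a_i, b_i in zip(A, b):
            x += relax * (b_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

# Unknown 2x2 image flattened as [p00, p01, p10, p11]
true_img = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([
    [1, 1, 0, 0],   # row 0 sum
    [0, 0, 1, 1],   # row 1 sum
    [1, 0, 1, 0],   # column 0 sum
    [0, 1, 0, 1],   # column 1 sum
    [1, 0, 0, 1],   # diagonal sum, needed to pin down a unique solution
], dtype=float)
b = A @ true_img                 # simulated noiseless projections
recon = art(A, b)
print(recon.round(3))            # should approach [1, 2, 3, 4]
```

    Fourier methods instead exploit the projection-slice theorem and reconstruct in one pass; ART trades speed for flexibility with incomplete or irregular projection data.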

  10. Discourse Analysis of the Documentary Method as "Key" to Self-Referential Communication Systems? Theoretic-Methodological Basics and Empirical Vignettes

    Directory of Open Access Journals (Sweden)

    Gian-Claudio Gentile

    2010-09-01

    Full Text Available Niklas LUHMANN is well known for his deliberate departure from the classical focus on studying individual actions, directing attention instead to the actors' relatedness through so-called (autopoietic) communication systems. In contrast to the gain of a new perspective of observation, his focus on autopoietic systems is simultaneously the biggest methodological obstacle to its use in the social and management sciences. The present contribution considers the above shift on a theoretical level and with a specific qualitative method. It argues for a deeper understanding of systemic sense-making and its enactment in a systematic and comprehensible way. Central to this approach is its focus on groups. Using group discussions as the method of data collection, and the "documentary method" of Ralf BOHNSACK (2003) as the method of data analysis, the article describes a methodologically grounded way to record the self-referential systems proposed by LUHMANN's system theory. The theoretical considerations of the paper are illustrated by empirical vignettes derived from a research project conducted in Switzerland concerning the social responsibility of business. URN: urn:nbn:de:0114-fqs1003156

  11. Boundary methods for mode estimation

    Science.gov (United States)

    Pierson, William E., Jr.; Ulug, Batuhan; Ahalt, Stanley C.

    1999-08-01

    This paper investigates the use of Boundary Methods (BMs), a collection of tools used for distribution analysis, as a method for estimating the number of modes associated with a given data set. Model order information of this type is required by several pattern recognition applications. The BM technique provides a novel approach to this parameter estimation problem and is comparable, in terms of both accuracy and computation, to other popular mode estimation techniques currently found in the literature and in automatic target recognition applications. This paper explains the methodology used in the BM approach to mode estimation. It also briefly reviews other common mode estimation techniques and describes the empirical investigation used to explore the relationship of the BM technique to them. Specifically, the accuracy and computational efficiency of the BM technique are compared quantitatively to a mixture-of-Gaussians (MOG) approach and a k-means approach to model order estimation. The stopping criterion of the MOG and k-means techniques is the Akaike Information Criterion (AIC).

  12. Problem-solving and developing quality management methods and techniques on the example of automotive industry

    OpenAIRE

    Jacek Łuczak; Radoslaw Wolniak

    2015-01-01

    Knowledge of the methods and techniques of quality management, together with their effective use, can definitely be regarded as an indication of a high organisational culture. Using such methods and techniques effectively can be attributed to a certain level of maturity of the quality management system in an organisation. The paper presents an analysis of problem-solving methods and techniques of quality management in the automotive sector in Poland. The survey wa...

  13. On process optimization considering LCA methodology.

    Science.gov (United States)

    Pieragostini, Carla; Mussati, Miguel C; Aguirre, Pío

    2012-04-15

    The goal of this work is to research the state-of-the-art in process optimization techniques and tools based on LCA, focused on the process engineering field. A collection of methods, approaches, applications, specific software packages, and insights regarding experiences and progress made in applying the LCA methodology coupled to optimization frameworks is provided, and general trends are identified. The "cradle-to-gate" concept to define the system boundaries is the most used approach in practice, instead of the "cradle-to-grave" approach. Normally, the relationship between inventory data and impact category indicators is linearly expressed by the characterization factors; then, synergic effects of the contaminants are neglected. Among the LCIA methods, the eco-indicator 99, which is based on the endpoint category and the panel method, is the most used in practice. A single environmental impact function, resulting from the aggregation of environmental impacts, is formulated as the environmental objective in most analyzed cases. SimaPro is the most used software for LCA applications in the literature analyzed. The multi-objective optimization is the most used approach for dealing with this kind of problems, where the ε-constraint method for generating the Pareto set is the most applied technique. However, a renewed interest in formulating a single economic objective function in optimization frameworks can be observed, favored by the development of life cycle cost software and progress made in assessing costs of environmental externalities. Finally, a trend to deal with multi-period scenarios in integrated LCA-optimization frameworks can be distinguished, providing more accurate results upon data availability. Copyright © 2011 Elsevier Ltd. All rights reserved.
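    The ε-constraint method highlighted in the review generates Pareto points by minimizing one objective (cost) subject to the other (environmental impact) being at most ε, then sweeping ε. A sketch over a discrete set of invented process designs (cost, eco-indicator score):

```python
# Sketch of the epsilon-constraint method: minimize cost subject to
# environmental impact <= eps, sweeping eps to trace the Pareto set.
# Candidate designs (name, cost, eco-indicator score) are invented.
designs = [
    ("base case", 100.0, 80.0),
    ("heat integration", 110.0, 55.0),
    ("solvent recycle", 125.0, 40.0),
    ("full retrofit", 170.0, 30.0),
    ("gold plating", 200.0, 38.0),   # dominated: dearer and dirtier than retrofit
]

def eps_constraint(points, eps):
    """Cheapest design whose impact does not exceed eps (None if infeasible)."""
    feasible = [p for p in points if p[2] <= eps]
    return min(feasible, key=lambda p: p[1]) if feasible else None

pareto = []
for eps in (30.0, 40.0, 55.0, 80.0):   # sweep the impact bound
    best = eps_constraint(designs, eps)
    if best and best not in pareto:
        pareto.append(best)
print([name for name, _, _ in pareto])
```

    In continuous LCA-optimization formulations the inner step is an NLP or MILP solve rather than a lookup, but the sweep over ε is the same; dominated designs like "gold plating" never appear in the resulting Pareto set.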

  14. Methodology for ranking restoration options

    International Nuclear Information System (INIS)

    Hedemann Jensen, Per

    1999-04-01

    The work described in this report has been performed as a part of the RESTRAT Project FI4P-CT95-0021a (PL 950128) co-funded by the Nuclear Fission Safety Programme of the European Commission. The RESTRAT project has the overall objective of developing generic methodologies for ranking restoration techniques as a function of contamination and site characteristics. The project includes analyses of existing remediation methodologies and contaminated sites, and is structured in the following steps: characterisation of relevant contaminated sites; identification and characterisation of relevant restoration techniques; assessment of the radiological impact; development and application of a selection methodology for restoration options; formulation of generic conclusions and development of a manual. The project is intended to apply to situations in which sites with nuclear installations have been contaminated with radioactive materials as a result of the operation of these installations. The areas considered for remedial measures include contaminated land areas, rivers and sediments in rivers, lakes, and sea areas. Five contaminated European sites have been studied. Various remedial measures have been envisaged with respect to the optimisation of the protection of the populations being exposed to the radionuclides at the sites. Cost-benefit analysis and multi-attribute utility analysis have been applied for optimisation. Health, economic and social attributes have been included and weighting factors for the different attributes have been determined by the use of scaling constants. (au)
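    The multi-attribute utility step described above can be sketched as a weighted sum: each restoration option gets a 0-1 utility per attribute, attribute weights come from scaling constants, and options are ranked by overall utility. Options, utilities and weights below are hypothetical, not RESTRAT results.

```python
# Sketch of the multi-attribute utility ranking step: each restoration
# option gets a 0-1 utility per attribute (health, economic, social), and
# options are ranked by the weighted sum. All numbers are hypothetical.
WEIGHTS = {"health": 0.5, "economic": 0.3, "social": 0.2}  # scaling constants

options = {
    "topsoil removal": {"health": 0.9, "economic": 0.3, "social": 0.6},
    "deep ploughing": {"health": 0.6, "economic": 0.7, "social": 0.7},
    "no action": {"health": 0.1, "economic": 1.0, "social": 0.4},
}

def overall_utility(utilities):
    return sum(WEIGHTS[a] * u for a, u in utilities.items())

ranking = sorted(options, key=lambda o: overall_utility(options[o]), reverse=True)
print([(o, round(overall_utility(options[o]), 2)) for o in ranking])
```

    Cost-benefit analysis enters by expressing the economic attribute in monetary terms; the weighted-sum form is what makes the trade-off between averted dose and remediation cost explicit.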

  15. Procedure Redesign Methods : E3-Control: a redesign methodology for control procedures

    NARCIS (Netherlands)

    Liu, J.; Hofman, W.J.; Tan, Y.H.

    2011-01-01

    This chapter highlights the core research methodology, e3-control, that is applied throughout the ITAIDE project for the purpose of control procedure redesign. We present the key concept of the e3-control methodology and its technical guidelines. Based on the output of this chapter, domain experts

  16. Performance of clustering techniques for solving multi depot vehicle routing problem

    Directory of Open Access Journals (Sweden)

    Eliana M. Toro-Ocampo

    2016-01-01

    Full Text Available The vehicle routing problem with multiple depots (MDVRP) is classified as NP-hard. The MDVRP simultaneously determines the routes of a set of vehicles so as to serve a set of clients with known demands. The objective is to minimize the total distance traveled by the routes, given that all customers must be served while respecting capacity constraints on depots and vehicles. This paper presents a hybrid methodology that combines agglomerative clustering techniques, used to generate initial solutions, with an iterated local search (ILS) algorithm. Although clustering methods have been proposed in previous studies as strategies for generating initial solutions, in this work the search is intensified using the information generated by the clustering step. In addition, an extensive analysis of the performance of the techniques and of their effect on the final solution is carried out. The proposed methodology proves feasible and effective for solving the problem with respect to solution quality and computational times on instances from the literature.
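    The cluster-first idea behind such hybrids can be sketched in its simplest form: assign each customer to its nearest depot, then build a route per depot with a nearest-neighbour heuristic. Coordinates, the single-vehicle-per-depot simplification and the absence of capacity checks are all illustrative assumptions; the paper's method additionally improves such initial solutions with ILS.

```python
import math

depots = {"D1": (0.0, 0.0), "D2": (10.0, 10.0)}
customers = {"c1": (1.0, 1.0), "c2": (2.0, 0.0),
             "c3": (9.0, 9.0), "c4": (10.0, 8.0)}

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

# Cluster step: each customer joins its nearest depot.
clusters = {d: [] for d in depots}
for c, pos in customers.items():
    nearest = min(depots, key=lambda d: dist(depots[d], pos))
    clusters[nearest].append(c)

# Route step: nearest-neighbour tour starting and ending at the depot.
def route_length(depot, names):
    pos, length, todo = depots[depot], 0.0, set(names)
    while todo:
        nxt = min(todo, key=lambda c: dist(pos, customers[c]))
        length += dist(pos, customers[nxt])
        pos = customers[nxt]
        todo.remove(nxt)
    return length + dist(pos, depots[depot])

total = sum(route_length(d, cs) for d, cs in clusters.items())
```

An ILS layer would then repeatedly perturb and locally re-optimize these routes, accepting improvements, which is where the intensification described in the abstract happens.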

  17. Chapter three: methodology of exposure modeling

    CSIR Research Space (South Africa)

    Moschandreas, DJ

    2002-12-01

    Full Text Available methodologies and models are reviewed. Three exposure/measurement methodologies are assessed. Estimation methods focus on source evaluation and attribution, sources include those outdoors and indoors as well as in occupational and in-transit environments. Fate...

  18. Methodological aspects of EEG and Body dynamics measurements during motion.

    Directory of Open Access Journals (Sweden)

    Pedro eReis

    2014-03-01

    Full Text Available EEG involves recording, analysis, and interpretation of voltages recorded on the human scalp originating from brain grey matter. EEG is one of the favorite methods to study and understand processes that underlie behavior. This is because EEG is relatively cheap, easy to wear, lightweight, and has high temporal resolution. In terms of behavior, this encompasses actions, such as movements, that are performed in response to the environment. However, there are methodological difficulties when recording EEG during movement, such as movement artifacts. Thus, most studies about the human brain have examined activations during static conditions. This article attempts to compile and describe relevant methodological solutions that emerged in order to measure body and brain dynamics during motion. These descriptions cover suggestions of how to avoid and reduce motion artifacts, hardware, software and techniques for synchronously recording EEG, EMG, kinematics, kinetics and eye movements during motion. Additionally, we present various recording systems, EEG electrodes, caps and methods for determination of real/custom electrode positions. We conclude that it is possible to record and analyze synchronized brain and body dynamics related to movement or exercise tasks.

  19. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low-level radiation exposures lie mainly in the statistical "signal to noise" problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes at the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  20. Project-Based Learning and Agile Methodologies in Electronic Courses: Effect of Student Population and Open Issues

    Directory of Open Access Journals (Sweden)

    Marina Zapater

    2013-12-01

    Full Text Available Project-Based Learning (PBL) and Agile methodologies have proven to be very interesting instructional strategies in Electronics and Engineering education, because they provide practical learning skills that help students understand the basis of electronics. In this paper we analyze two courses, one belonging to a Master in Electronic Engineering and one to a Bachelor in Telecommunication Engineering, that apply Agile-PBL methodologies, and compare the results obtained in both courses with a traditional laboratory course. Our results support previous work stating that Agile-PBL methodologies increase student satisfaction. However, we also highlight some open issues that negatively affect the implementation of these methodologies, such as planning overhead or accidental complexity. Moreover, we show how differences in the student population, mostly related to the time spent on-campus, their commitment to the course or part-time dedication, have an impact on the benefits of Agile-PBL methods. In these cases, Agile-PBL methodologies by themselves are not enough and need to be combined with other techniques to increase student motivation.

  1. Plasticity models of material variability based on uncertainty quantification techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Reese E. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Rizzi, Francesco [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Boyce, Brad [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Templeton, Jeremy Alan [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Ostien, Jakob [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2017-11-01

    The advent of fabrication techniques like additive manufacturing has focused attention on the considerable variability of material response due to defects and other micro-structural aspects. This variability motivates the development of an enhanced design methodology that incorporates inherent material variability to provide robust predictions of performance. In this work, we develop plasticity models capable of representing the distribution of mechanical responses observed in experiments using traditional plasticity models of the mean response and recently developed uncertainty quantification (UQ) techniques. Lastly, we demonstrate that the new method provides predictive realizations that are superior to more traditional ones, and how these UQ techniques can be used in model selection and assessing the quality of calibrated physical parameters.
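    The core UQ idea described here can be sketched by treating a material parameter as a random variable and propagating samples through a constitutive model, yielding a distribution of responses instead of a single mean curve. The toy 1D linear-hardening model and all numbers below are illustrative assumptions, not values from the report.

```python
import random
import statistics

random.seed(0)

def stress(strain, yield_stress, E=200e3, H=2e3):
    # 1D monotonic elastic/linear-hardening response (units: MPa, strain dimensionless)
    elastic = E * strain
    if elastic <= yield_stress:
        return elastic
    eps_y = yield_stress / E
    return yield_stress + H * (strain - eps_y)

# Sample the yield stress from an assumed normal distribution and propagate.
samples = [random.gauss(250.0, 15.0) for _ in range(2000)]   # MPa
responses = [stress(0.01, sy) for sy in samples]

mean_r = statistics.mean(responses)
std_r = statistics.stdev(responses)
```

The spread of `responses` is the kind of distribution the paper's calibrated plasticity models aim to reproduce, and comparing such predicted spreads against experimental scatter is one way UQ supports model selection.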

  2. POSSIBILITY OF IMPROVING EXISTING STANDARDS AND METHODOLOGIES FOR AUDITING INFORMATION SYSTEMS TO PROVIDE E-GOVERNMENT SERVICES

    Directory of Open Access Journals (Sweden)

    Евгений Геннадьевич Панкратов

    2014-03-01

    Full Text Available This article analyzes the existing methods of e-government systems audit and examines their shortcomings. Approaches to improve the existing techniques and adapt them to the specific characteristics of e-government systems are suggested. The paper describes a methodology providing for integrated assessment of information systems. This methodology uses systems maturity models and can be used in the construction of e-government rankings, as well as in the audit of their implementation process. The maturity models are based on the COBIT and COSO methodologies and on models of e-government developed by the relevant committee of the UN. The methodology was tested during an audit of the information systems involved in the payment of temporary disability benefits. The audit was carried out during the analysis of the outcome of a pilot project for the abolition of the principle of crediting payments for disability benefits. DOI: http://dx.doi.org/10.12731/2218-7405-2014-2-5

  3. Calculation of t8/5 by response surface methodology for electric arc welding applications

    Directory of Open Access Journals (Sweden)

    Meseguer-Valdenebro José Luis

    2014-01-01

    Full Text Available One of the greatest difficulties traditionally found in stainless steel constructions has been the execution of welded parts in them. At the present time, the available technology allows us to use arc welding processes for that application without any disadvantage. Response surface methodology is used to optimise a process in which the variables that take part in it are not related to each other by a mathematical law; therefore, an empirical model must be formulated. With this methodology, a selected variable may be optimised. In this work, the cooling time from 800 to 500 °C, t8/5, after a TIG welding operation, is modelled by the response surface method. The arc power, the welding velocity and the thermal efficiency factor are considered as the variables that influence the t8/5 value. Different cooling times, t8/5, for different combinations of values for the variables are previously determined by a numerical method. The input values for the variables have been experimentally established. The results indicate that response surface methodology may be considered a valid technique for these purposes.
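    The response-surface step can be sketched as a polynomial least-squares fit of t8/5 against the process variables. The synthetic "data" below follow a simple heat-input relation t8/5 ∝ η·P/v purely for illustration; the paper's data come from a numerical thermal model, and the variable ranges here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
P = rng.uniform(2000, 4000, 40)    # arc power, W (assumed range)
v = rng.uniform(2.0, 6.0, 40)      # welding speed, mm/s (assumed range)
eta = rng.uniform(0.6, 0.8, 40)    # thermal efficiency (assumed range)

# Synthetic response standing in for the numerically computed cooling times.
t85 = 5e-3 * eta * P / v + rng.normal(0, 0.05, 40)

# Second-order response surface: intercept, linear, interaction, quadratic terms.
X = np.column_stack([np.ones_like(P), P, v, eta,
                     P * v, P * eta, v * eta,
                     P**2, v**2, eta**2])
coef, *_ = np.linalg.lstsq(X, t85, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((t85 - pred) ** 2) / np.sum((t85 - t85.mean()) ** 2)
```

Once fitted, the polynomial surface replaces the expensive numerical model, so t8/5 can be predicted (or optimised) cheaply over the whole variable range.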

  4. Agile Methods from the Viewpoint of Information

    Directory of Open Access Journals (Sweden)

    Eder Junior Alves

    2017-10-01

    Full Text Available Introduction: Since Paul M. G. Otlet highlighted the term documentation in 1934, proposing how to collect and organize the world's knowledge, much scientific research has turned its attention to the study of Information Science. Methods and techniques have emerged that adopt a world view from the perspective of information, and agile methods follow this trend. Objective: The purpose is to analyze the relevance of information flow to organizations adopting agile methods, and to understand how the innovation process is influenced by this practice. Methodology: This is a bibliometric study grounded in a Systematic Literature Review (SLR); the integration of the SLR technique with the Summarize tool is a new methodological proposal. Results: Scrum appears with the highest number of publications in SPELL. In comparison, results from Google Scholar pointed to the importance of practices and team behaviors, while in the Science Direct repository critical success factors in project management and software development are highlighted. Conclusions: It was evident that agile methods are being used as process innovations, and their benefits and advantages are evident in both internal and external information flows. Due to its prevalence in the literature, Scrum deserves attention from firms.

  5. Generative Algorithmic Techniques for Architectural Design

    DEFF Research Database (Denmark)

    Larsen, Niels Martin

    2012-01-01

    Architectural design methodology is expanded through the ability to create bespoke computational methods as integrated parts of the design process. The rapid proliferation of digital production techniques within building industry provides new means for establishing seamless flows between digital ... form-generation and the realisation process. A tendency in recent practice shows an increased focus on developing unique tectonic solutions as a crucial ingredient in the design solution. These converging trajectories form the contextual basis for this thesis. In architectural design, digital tools ... The principles are further developed to form new modes of articulation in architectural design. Certain methods are contributions, which suggest a potential for future use and development. Thus, a method is directed towards bottom-up generation of surface topology through the use of an agentbased logic. Another...

  6. Methodologies for nuclear material accounting and control: challenges and expectations

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2007-01-01

    Nuclear Material Accounting and Control (NUMAC) represents one of the most important and indispensable responsibilities of any nuclear installation. The emphasis is to ensure that the nuclear material being handled in the nuclear installation is properly accounted for with the expected accuracy and confidence levels. A number of analytical methods based on both destructive and non-destructive assay techniques are available at the disposal of nuclear analytical scientists for this purpose, and they have been enumerated extensively in the literature. Instead of recounting the analytical methodologies available, an attempt has been made in this paper to highlight some of the challenges. (author)

  7. New analytical methods for materials characterization using the techniques of nuclear activation reactions induced by thermal neutrons and accelerated ion beams, coupled to gamma-ray spectroscopy

    International Nuclear Information System (INIS)

    Cincu, Emanuela

    1999-01-01

    This thesis is a comprehensive collection of the author's works in the field of 'Nuclear Activation Techniques with accelerated Charged Particles and Thermal Neutrons' carried out within the framework of the research contracts the author initiated and performed in the period 1990 - 1999. The work's objective was to achieve a consistent and complete methodological and instrumental assembly for accurate elemental analysis of technological samples of interest for industry, medicine, and monitoring of environmental radioactivity. The experiments were carried out using the IFIN-HH facilities: the U-120 Cyclotron, the 8 MV Tandem Van de Graaff accelerator, and the WWR-S nuclear reactor. Part of the reported works were initiated and performed in collaboration with partners from the chemical and metallurgical industries wishing to employ sensitive nuclear analytical techniques, which can reveal simultaneously major elements, minor elements, and impurities in the investigated samples. The engagement with these challenging topics and with the characteristics of some of the investigated technological samples generated the studies, with both theoretical and experimental features, presented in this thesis, as well as the original analytical and methodological solutions. The thesis structure has two parts: The 1st part (Chapter 1) is a survey of the literature until 1999 that concerns the theory of nuclear activation reactions with accelerated charged particles (CPAA) and thermal neutrons (NAA), evidencing the analytical performance of both techniques; details are also given about the 'critical' phenomena encountered in CPAA, whose origin is still under discussion in the literature.
The 2nd part of the thesis contains the original contributions of the author in the theoretical, methodological, and software fields (Chapters 2-8), the experimental results obtained, and the nuclear database software based on the 'Fox-Pro' operation system, conceived for processing the experimental

  8. IMSF: Infinite Methodology Set Framework

    Science.gov (United States)

    Ota, Martin; Jelínek, Ivan

    Software development is usually an integration task in enterprise environment - few software applications work autonomously now. It is usually a collaboration of heterogeneous and unstable teams. One serious problem is lack of resources, a popular result being outsourcing, ‘body shopping’, and indirectly team and team member fluctuation. Outsourced sub-deliveries easily become black boxes with no clear development method used, which has a negative impact on supportability. Such environments then often face the problems of quality assurance and enterprise know-how management. The used methodology is one of the key factors. Each methodology was created as a generalization of a number of solved projects, and each methodology is thus more or less connected with a set of task types. When the task type is not suitable, it causes problems that usually result in an undocumented ad-hoc solution. This was the motivation behind formalizing a simple process for collaborative software engineering. Infinite Methodology Set Framework (IMSF) defines the ICT business process of adaptive use of methods for classified types of tasks. The article introduces IMSF and briefly comments on its meta-model.

  9. Micrometeorological Technique for Monitoring of Geological Carbon Capture, Utilization and Storage: Methodology, Workflow and Resources

    Science.gov (United States)

    Burba, G. G.; Madsen, R.; Feese, K.

    2013-12-01

    The eddy covariance (EC) method is a micrometeorological technique for direct high-speed measurements of the transport of gases and energy between land or water surfaces and the atmosphere [1]. This method allows observations of gas transport at time scales ranging from 20-40 measurements per second to multiple years, represents gas exchange integrated over a large area, from hundreds of square meters to tens of square kilometres, and corresponds to gas exchange from the entire surface, including canopy, and soil or water layers. Gas fluxes, emission and exchange rates are characterized from single-point in situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Presently, over 600 eddy covariance stations are in operation in over 120 countries [1]. EC is now recognized as an effective method in regulatory and industrial applications, including CCUS [2-10]. Emerging projects utilize EC to continuously monitor large areas before and after the injections, to locate and quantify leakages where CO2 may escape from the subsurface, to improve storage efficiency, and for other CCUS characterizations [5-10]. Although EC is one of the most direct and defensible micrometeorological techniques measuring gas emission and transport, and complete automated stations and processing are readily available, the method is mathematically complex, and requires careful setup and execution specific to the site and project. With this in mind, step-by-step instructions were created in [1] to introduce a novice to the EC method, and to assist in further understanding of the method through more advanced references. In this presentation we provide brief highlights of the eddy covariance method, its application to geological carbon capture, utilization and storage, key requirements, instrumentation and software, and review educational resources particularly useful for carbon sequestration research. References: [1] Burba G. Eddy Covariance Method
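    The core eddy covariance computation can be sketched in a few lines: with Reynolds decomposition (w = W + w′, c = C + c′), the vertical turbulent flux of a gas is the time-averaged covariance of the fluctuations of vertical wind speed and gas concentration. The data below are synthetic, and real processing adds coordinate rotation, despiking, spectral and density (WPL) corrections, none of which are shown.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 18000                         # e.g. 10 Hz sampling for a 30-min averaging period
w = rng.normal(0.0, 0.3, n)       # vertical wind fluctuations, m/s (synthetic)
c = 400.0 + 2.0 * w + rng.normal(0.0, 1.0, n)  # gas concentration, correlated with w

# Reynolds decomposition: subtract the period means to get the fluctuations.
w_prime = w - w.mean()
c_prime = c - c.mean()

# The turbulent flux is the mean product of the fluctuations (the covariance).
flux = np.mean(w_prime * c_prime)
```

For CCUS leak monitoring, a sustained positive CO2 flux of this kind over an injection site, beyond the baseline biological exchange, is the signal such stations look for.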

  10. Radon emanometric technique for 226Ra estimation

    International Nuclear Information System (INIS)

    Mandakini Maharana; Sengupta, D.; Eappen, K.P.

    2010-01-01

    Studies on natural background radiation show that the major contribution to the radiation dose received by the population is through the inhalation pathway, via radon (222Rn) gas. The immediate parent of radon being radium (226Ra), it is imperative that the radium content be measured in the various matrices present in the environment. Among the various methods available for the measurement of radium, gamma spectrometry and the radiochemical method are the two most extensively used. In comparison with these two methods, the radon emanometric technique described here is simple and convenient. The paper gives details of sample processing, the radon bubbler, the Lucas cell and the methodology used in the emanometric method. A comparison of the emanometric method with gamma spectrometry has also been undertaken, and the results for a few soil samples are given. The results show fairly good agreement between the two methods. (author)
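    The calculation underlying emanometry can be sketched from the ingrowth relation: radon sealed with the sample grows in toward equilibrium with its parent, so A_Rn(t) = A_Ra·(1 − e^(−λt)), and the 226Ra activity follows from the radon activity measured after a known ingrowth time. The function and numbers below are a generic illustration of this relation, not the paper's specific procedure (which also involves counting efficiency and cell background).

```python
import math

HALF_LIFE_RN222_DAYS = 3.8235
LAM = math.log(2) / HALF_LIFE_RN222_DAYS   # 222Rn decay constant, 1/day

def radium_activity(radon_activity_bq, ingrowth_days):
    """Infer the 226Ra activity (Bq) from the 222Rn activity measured
    after a known ingrowth time, using A_Rn(t) = A_Ra * (1 - exp(-lam*t))."""
    return radon_activity_bq / (1.0 - math.exp(-LAM * ingrowth_days))

# After ~4 radon half-lives the ingrowth correction is already small:
a_ra = radium_activity(9.0, 15.0)   # illustrative measured value and delay
```

In practice the measured quantity is a count rate in the Lucas cell, converted to radon activity via the counting efficiency before applying this correction.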

  11. CULTUROLOGICAL APPROACH AS METHODOLOGICAL BASIS OF MATHEMATICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    Ye. A. Perminov

    2017-01-01

    Full Text Available Introduction. Today, in the era of the mathematization of science and the total expansion of digital technologies, mass mathematical education becomes a necessary part of the culture of every person. However, there are some serious obstacles to the formation and development of a general mathematical culture: insufficient understanding of its importance by society and the state; the fragmentary 'clip' consciousness emerging among representatives of the younger generation under the influence of the Internet, which prevents the formation of a complete picture of the modern world; the traditional system of disjointed subjects and courses in school, secondary vocational and higher mathematics education; and the non-cognitive (automatic) transferring into training of approaches, principles, technologies and techniques that are not specific to the course being mastered. Development of the sociological, axiological and especially culturological aspects of mathematical methodology is required for the solution of the urgent problems of methodology in mathematical education. The aim of the publication is to discuss methodological aspects of the realization of the culturological approach in mathematical education. Methodology and research methods. The theoretical scientific methods of the present article involve analysis and synthesis of the content of philosophical, mathematical, pedagogical, methodological literature and normative documents; comparative, culturological and logical types of analysis of mathematical education; systematic, competence-based, practice-oriented and personal-activity methodological approaches were used to understand the concept of mathematical education. Results and scientific novelty. The practicability and leading role of the culturological approach in promoting mathematical knowledge is proved from historical, philosophical and pedagogical positions. It is stated that objective conceptualization of progressive ideas and new methods of mathematical science and mathematical

  12. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 6: A low-cost method for land use mapping using simple visual techniques of interpretation. [Spain

    Science.gov (United States)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.

  13. Observational methodology in sport sciences

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2013-11-01

    Full Text Available This paper reviews the conceptual framework, the key literature and the methods (observation tools, such as category systems and field formats, coding software, etc.) that should be followed when conducting research from the perspective of observational methodology. The observational designs used by the authors’ research group over the last twenty years are discussed, and the procedures for analysing data and assessing their quality are described. Mention is also made of the latest methodological trends in this field, such as the use of mixed methods.

  14. Regional cerebral blood flow (rCBF) in psychiatry: Methodological issues

    International Nuclear Information System (INIS)

    Prohovnik, I.

    1984-01-01

    Traditionally, measurements of regional cerebral blood flow (rCBF) have been confined to neurology and nuclear medicine. Only one laboratory had concentrated on using this technique in psychiatric studies. Recently, however, rCBF has been increasingly used in psychiatry, and it seems appropriate at this time to examine the value and limitations of this method. The present article reviews selected methodological issues that may complicate the performance and interpretation of rCBF studies, with the aim of providing some means to evaluate published work and to plan further psychiatric research. In this paper, the term rCBF refers only to the two-dimensional, noninvasive methods that rely on inhalation or intravenous injection of xenon-133. The growing interest in rCBF within psychiatry stems mostly from the fact that this technique can indirectly map cerebral metabolism and, by inference, neural activity or information processing. Regional metabolism and blood flow are closely coupled in the human brain in the absence of gross pathology, and since psychiatric patients rarely present acute neurological abnormalities that might disrupt this coupling, one may infer regional metabolism from flow

  15. Reliability-Based Stability Analysis of Rock Slopes Using Numerical Analysis and Response Surface Method

    Science.gov (United States)

    Dadashzadeh, N.; Duzgun, H. S. B.; Yesiloglu-Gultekin, N.

    2017-08-01

    While advanced numerical techniques in slope stability analysis are successfully used in deterministic studies, they have so far found limited use in probabilistic analyses due to their high computation cost. The first-order reliability method (FORM) is one of the most efficient probabilistic techniques to perform probabilistic stability analysis by considering the associated uncertainties in the analysis parameters. However, it is not possible to directly use FORM in numerical slope stability evaluations as it requires definition of a limit state performance function. In this study, an integrated methodology for probabilistic numerical modeling of rock slope stability is proposed. The methodology is based on response surface method, where FORM is used to develop an explicit performance function from the results of numerical simulations. The implementation of the proposed methodology is performed by considering a large potential rock wedge in Sumela Monastery, Turkey. The accuracy of the developed performance function to truly represent the limit state surface is evaluated by monitoring the slope behavior. The calculated probability of failure is compared with Monte Carlo simulation (MCS) method. The proposed methodology is found to be 72% more efficient than MCS, while the accuracy is decreased with an error of 24%.
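    The reliability-index idea behind FORM can be sketched on a linear limit state g = R − S with independent normal resistance R and load S, for which FORM is exact: β = (μR − μS)/√(σR² + σS²) and Pf = Φ(−β). The wedge-stability performance function in the paper is of course nonlinear (hence the response-surface step); the distributions and numbers below are illustrative assumptions used only to compare FORM with Monte Carlo simulation.

```python
import math
import random

# Assumed resistance and load distributions (normal, independent).
muR, sR = 10.0, 1.5
muS, sS = 6.0, 1.0

# FORM for a linear Gaussian limit state: closed-form reliability index.
beta = (muR - muS) / math.sqrt(sR**2 + sS**2)
Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
pf_form = Phi(-beta)

# Crude Monte Carlo estimate of the same failure probability P(g < 0).
random.seed(3)
n = 200_000
fails = sum(random.gauss(muR, sR) - random.gauss(muS, sS) < 0.0
            for _ in range(n))
pf_mcs = fails / n
```

The two estimates agree here because the limit state is linear and Gaussian; for the nonlinear numerical slope model, the response surface supplies the explicit g needed before FORM can be applied, which is the efficiency gain over MCS the paper quantifies.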

  16. Immunodiagnosis of parasitic infections using nuclear techniques

    International Nuclear Information System (INIS)

    1985-07-01

    This report documents the recommendations of the ''Advisory Group on Immunodiagnosis of Parasitic Infections Using Nuclear Techniques'' with a focus on malaria, schistosomiasis and filariasis. Radionuclide tracers are considered an important component of present and future immunological methods for the assessment of the host's humoral and cellular immunity to the parasite and the detection of parasite antigen(s) in human body fluids. The Advisory Group has concluded that there is a continuing need for the development and application of immunodiagnostic methods in parasitic diseases. This report concerns methods which are currently or potentially applicable to immunodiagnostic investigations in parasitic diseases. Reference is made, where appropriate, to recent developments in research which may lead to improvement and standardization of methods now available and the development of new methodology. Separate abstracts on various papers presented were prepared

  17. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  18. METHODOLOGY FOR DETERMINATION OF SOUND INSULATION OF APARTMENTS’ ENCLOSING STRUCTURES TO MEET NOISE PROTECTION REQUIREMENTS

    Directory of Open Access Journals (Sweden)

    Giyasov Botir Iminzhonovich

    2017-10-01

    Full Text Available Subject: an important task in the design of internal enclosing structures of apartments is the establishment of their required soundproofing ability. At present, there is no reliable method for determining the required sound insulation, and in this regard internal enclosures are designed without proper justification for noise protection. Research objectives: development of a technique for determining the required sound insulation of an apartment's internal enclosures to ensure an acceptable noise regime in the apartment's rooms under the action of intra-apartment noise sources. Materials and methods: the methodology was developed on the basis of a statistical method for noise calculation in apartments, treated as systems of acoustically coupled proportionate rooms, and with the help of a computer program that implements this method. Results: the technique makes it possible to generate, with the use of computer technologies, a targeted selection of the apartment's internal enclosures to meet their soundproofing requirements. Conclusions: the technique proposed in the article can be used at the design stage of apartments when determining the required soundproofing of partitions and doors. Using this technique, it is possible to harmonize the sound insulation of individual elements with one another and thereby guarantee a selection of internal structures that is both acoustically and economically efficient.

  19. Method for plant operation guidance by knowledge engineering technique

    International Nuclear Information System (INIS)

    Kiguchi, Takashi; Yoshida, Kenichi; Motoda, Hiroshi; Kobayashi, Setsuo

    1983-01-01

    A method for plant operation guidance has been developed by using the Knowledge Engineering technique. The method is characterized by its capability of handling plant dynamics. The knowledge-base includes plant simulation programs as tools to evaluate dynamic behaviors, as well as production rules of the "if..., then..." type. The inference engine is thus capable of predicting plant dynamics and making decisions in accordance with time progress. The performance of the guidance method was evaluated by simulation tests assuming various abnormal situations of a BWR power plant. It was shown that the method can detect each of the abnormal events along the course of their occurrence, and provide the guidance for corrective actions. The operation guidance method proposed in this paper is general and is applicable not only to nuclear power plants but also to other plants such as chemical production plants and fossil power plants. (author)

  20. The laddering method in service innovation research

    DEFF Research Database (Denmark)

    Grünbaum, Niels Nolsøe

    2017-01-01

    The laddering method is a qualitative interview technique applied in a situation with one interviewer and one informant with the aim of creating an understanding of the value that business-to-consumer (B2C) customers extract from product attributes. Thus, this methodology aims to depict a mental ...

  1. Self-cleaning Foliar Surfaces Characterization using RIMAPS Technique and Variogram Method

    International Nuclear Information System (INIS)

    Rosi, Pablo E.

    2002-01-01

    Along the last ten years many important studies about characterization of self-cleaning foliar surfaces have been done and focused new interest on this kind of surfaces. These studies were possible due to the development of a novel preparation technique for this biological material that lets us observe the delicate structures of a foliar surface under a scanning electron microscope (S.E.M.). This technique consists of replacing the natural water of the specimen by glycerol. Digital S.E.M. images from both self-cleaning and non-self-cleaning foliar surfaces were obtained and analyzed using the RIMAPS technique and the Variogram method. Our results revealed the existence of a common and exclusive geometrical pattern that is found in species which present self-cleaning foliar surfaces. This pattern combines at least nine different directions. The results from the Variogram method showed that the stomata play a key role in the determination of foliar surface roughness. In addition, spectra from the RIMAPS technique constitute a fingerprint of a foliar surface, so they can be used to find evolutionary relationships among species. Further studies will provide more detailed information to fully elucidate the self-cleaning pattern, so it might be possible to reproduce it on an artificial surface and make it self-cleaning.
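
As a rough sketch of the kind of calculation behind the Variogram method used here, the empirical semivariogram of surface heights can be computed as follows (a generic implementation for illustration, unrelated to the authors' software):

```python
import math

def empirical_variogram(points, values, lag, n_lags):
    """Empirical semivariogram: gamma(h) is the mean of 0.5*(z_i - z_j)**2
    over all pairs of points whose separation distance falls in each lag
    bin; bins with no pairs yield None."""
    sums = [0.0] * n_lags
    counts = [0] * n_lags
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            b = int(d // lag)
            if b < n_lags:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]

# Toy transect: heights rising linearly along a line of points
pts = [(0, 0), (1, 0), (2, 0), (3, 0)]
vals = [0, 1, 2, 3]
print(empirical_variogram(pts, vals, 1.5, 3))  # → [0.5, 2.0, 4.5]
```

The steadily growing gamma(h) reflects a surface whose height differences increase with distance, i.e. large-scale roughness.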

  2. Methodology for the evaluation process in directors' preparation in education.

    Directory of Open Access Journals (Sweden)

    Humberto Clemente Calderón Echevarría

    2014-03-01

    Full Text Available The work presented here proposes a methodology for evaluating the process of directors' preparation, which may contribute to the improvement of the program. It explains the need for the evaluation, the activity itself, the indicators to evaluate, the methods and techniques to be used, and the steps in which they have to be carried out. Until now, no methodology has existed for evaluating the process of directors' preparation in the educational sector. The development of this methodology builds on results obtained through different investigations made in the Provincial Post Office that were later applied in the Provincial Department of Education. It is now being refined at the Pedagogical University "Capitán Silverio Blanco Núñez", which opens opportunities for its use in similar processes at other entities. The proposed methodology conceives the evaluation of the process of directors' preparation as flowing in a cyclical, continuous, flexible, and interactive manner, away from the traditional linear formula, which is rigid and schematic. From this idea, four stages, with their relevant procedures, can be identified for the evaluation of the process of directors' preparation in education.

  3. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
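
The core idea, ranking inputs by how much variance a nonparametric smooth of the output against each input explains, can be sketched as follows; a simple moving-average smooth stands in here for LOESS, and the data are synthetic:

```python
import random
import statistics

def smooth_r2(x, y, window=21):
    """Fraction of the variance in y explained by a moving-average smooth
    of y over sorted x -- a crude stand-in for a LOESS fit."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    ys = [y[i] for i in order]
    half = window // 2
    fitted = [statistics.mean(ys[max(0, k - half):k + half + 1])
              for k in range(len(ys))]
    resid_var = statistics.pvariance([a - b for a, b in zip(ys, fitted)])
    return 1 - resid_var / statistics.pvariance(ys)

random.seed(0)
x1 = [random.uniform(-1, 1) for _ in range(500)]
x2 = [random.uniform(-1, 1) for _ in range(500)]
y = [a * a + 0.1 * random.gauss(0, 1) for a in x1]  # y depends on x1 only

# The quadratic x1-y relationship has near-zero linear correlation, yet the
# smooth correctly ranks x1 above the inert x2.
print(smooth_r2(x1, y) > smooth_r2(x2, y))
```

This is exactly the situation the abstract describes: a nonlinear input-output relationship that linear or rank regression would miss, but that a smoothing-based procedure detects.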

  4. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining such approaches to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both ...

  5. A Methodology for Anatomic Ultrasound Image Diagnostic Quality Assessment

    DEFF Research Database (Denmark)

    Hemmsen, Martin Christian; Lange, Theis; Brandt, Andreas Hjelm

    2017-01-01

    This paper discusses methods for assessment of ultrasound image quality based on our experiences with evaluating new methods for anatomic imaging. It presents a methodology to ensure a fair assessment between competing imaging methods using clinically relevant evaluations. Earlier uses of the methodology have shown that it ensures validity of the assessment, as it separates the influences between developer, investigator, and assessor once a research protocol has been established. This separation reduces confounding influences on the result from the developer ... to properly reveal the clinical value. The paper exemplifies the methodology using recent studies of Synthetic Aperture Sequential Beamforming tissue harmonic imaging.

  6. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts who are experienced in these techniques, the analyses can be focused on certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs.

  7. A review on fault classification methodologies in power transmission systems: Part—I

    Directory of Open Access Journals (Sweden)

    Avagaddi Prasad

    2018-05-01

    Full Text Available This paper presents a survey on different fault classification methodologies in transmission lines. Efforts have been made to include almost all the techniques and philosophies of transmission lines reported in the literature. Fault classification is necessary for reliable and high speed protective relaying followed by digital distance protection. Hence, a suitable review of these methods is needed. The contribution consists of two parts; this is Part 1 of the two-part series. Part 1 provides a brief introduction to faults in transmission lines and reviews the scope of various established approaches in this field. Part 2 will focus on and present newly developed approaches in this field. Keywords: Fault, Fault classification, Protection, Soft computing techniques, Transmission lines
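
At the simplest end of the spectrum the survey covers are threshold/rule-based classifiers; a toy sketch with invented per-unit thresholds (not one of the surveyed methods) shows the shape of such logic:

```python
def classify_fault(ia, ib, ic, i0, pickup=2.0, zero_seq=0.1):
    """Toy rule-based fault classifier: flag phases whose current
    magnitude (p.u.) exceeds a pickup value, and use the zero-sequence
    current i0 to separate grounded from ungrounded faults.
    All thresholds are illustrative only."""
    phases = [p for p, i in zip("ABC", (ia, ib, ic)) if i > pickup]
    grounded = i0 > zero_seq
    if len(phases) == 1 and grounded:
        return phases[0] + "-G"                              # line-to-ground
    if len(phases) == 2:
        return "".join(phases) + ("-G" if grounded else "")  # LL or LL-G
    if len(phases) == 3:
        return "ABC"                                         # three-phase
    return "unclassified"

print(classify_fault(5.0, 1.0, 1.0, 0.8))  # → A-G
```

Soft-computing techniques (the focus of Part 2) replace these hand-set thresholds with learned decision boundaries.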

  8. Analytical methodologies for broad metabolite coverage of exhaled breath condensate.

    Science.gov (United States)

    Aksenov, Alexander A; Zamuruyev, Konstantin O; Pasamontes, Alberto; Brown, Joshua F; Schivo, Michael; Foutouhi, Soraya; Weimer, Bart C; Kenyon, Nicholas J; Davis, Cristina E

    2017-09-01

    Breath analysis has been gaining popularity as a non-invasive technique that is amenable to a broad range of medical uses. One of the persistent problems hampering the wide application of the breath analysis method is measurement variability of metabolite abundances stemming from differences in both sampling and analysis methodologies used in various studies. Mass spectrometry has been a method of choice for comprehensive metabolomic analysis. For the first time in the present study, we juxtapose the most commonly employed mass spectrometry-based analysis methodologies and directly compare the resultant coverages of detected compounds in exhaled breath condensate in order to guide methodology choices for exhaled breath condensate analysis studies. Four methods were explored to broaden the range of measured compounds across both the volatile and non-volatile domain. Liquid phase sampling with polyacrylate Solid-Phase MicroExtraction fiber, liquid phase extraction with a polydimethylsiloxane patch, and headspace sampling using Carboxen/Polydimethylsiloxane Solid-Phase MicroExtraction (SPME) followed by gas chromatography mass spectrometry were tested for the analysis of volatile fraction. Hydrophilic interaction liquid chromatography and reversed-phase chromatography high performance liquid chromatography mass spectrometry were used for analysis of non-volatile fraction. We found that liquid phase breath condensate extraction was notably superior compared to headspace extraction and differences in employed sorbents manifested altered metabolite coverages. The most pronounced effect was substantially enhanced metabolite capture for larger, higher-boiling compounds using polyacrylate SPME liquid phase sampling. The analysis of the non-volatile fraction of breath condensate by hydrophilic and reverse phase high performance liquid chromatography mass spectrometry indicated orthogonal metabolite coverage by these chromatography modes. 
We found that the metabolite coverage

  9. Compendium of Greenhouse Gas Emissions Estimation Methodologies for the Oil and Gas Industry

    Energy Technology Data Exchange (ETDEWEB)

    Shires, T.M.; Loughran, C.J. [URS Corporation, Austin, TX (United States)

    2004-02-01

    This document is a compendium of currently recognized methods and provides details for all oil and gas industry segments to enhance consistency in emissions estimation. This Compendium aims to accomplish the following goals: Assemble an expansive collection of relevant emission factors for estimating GHG emissions, based on currently available public documents; Outline detailed procedures for conversions between different measurement unit systems, with particular emphasis on implementation of oil and gas industry standards; Provide descriptions of the multitude of oil and gas industry operations, in its various segments, and the associated emissions sources that should be considered; and Develop emission inventory examples, based on selected facilities from the various segments, to demonstrate the broad applicability of the methodologies. The overall objective of developing this document is to promote the use of consistent, standardized methodologies for estimating GHG emissions from petroleum industry operations. The resulting Compendium documents recognized calculation techniques and emission factors for estimating GHG emissions for oil and gas industry operations. These techniques cover the calculation or estimation of emissions from the full range of industry operations - from exploration and production through refining, to the marketing and distribution of products. The Compendium presents and illustrates the use of preferred and alternative calculation approaches for carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) emissions for all common emission sources, including combustion, vented, and fugitive. Decision trees are provided to guide the user in selecting an estimation technique based on considerations of materiality, data availability, and accuracy. API will provide (free of charge) a calculation tool based on the emission estimation methodologies described herein. The tool will be made available at http://ghg.api.org/.
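
The basic pattern underlying such methodologies, activity data times emission factor, rolled up to CO2-equivalent, can be sketched as follows (the factor and GWP values below are illustrative; a real inventory should take them from the applicable standard):

```python
# 100-year global warming potentials (illustrative values; check the
# reporting standard in force before using them)
GWP = {"CO2": 1, "CH4": 25, "N2O": 298}

def co2e_tonnes(emissions):
    """Roll up a gas-by-gas inventory (tonnes of each gas) into
    tonnes of CO2-equivalent."""
    return sum(t * GWP[gas] for gas, t in emissions.items())

# Hypothetical combustion source: activity data x emission factor,
# e.g. ~0.0531 t CO2 per MMBtu of natural gas (assumed factor)
inventory = {
    "CO2": 10_000 * 0.0531,  # 10,000 MMBtu burned
    "CH4": 1.2,              # vented + fugitive methane, tonnes
    "N2O": 0.08,
}
print(round(co2e_tonnes(inventory), 1))  # → 584.8
```

The Compendium's decision trees then govern which factor (preferred vs. alternative) feeds each such term, based on materiality, data availability, and accuracy.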

  10. Techniques for detecting explosives and contraband

    International Nuclear Information System (INIS)

    Vourvopoulos, G.

    1994-01-01

    Because terrorism continues to be a societal threat, scientists are still searching for ways to identify concealed weapons that can be used in terrorist attacks. Explosives are singled out for particular attention because they can easily be shaped to look innocuous, and are still hard to detect. At present, there are three methods under development for the detection of explosives: X-ray imaging, vapour detection and nuclear techniques, and this article will concentrate on the latter. Since there is no single technology that can address all the questions concerning the detection of explosives and other illicit contraband, the philosophy that emerges is that of an integral system combining methodologies. Such a system could contain a nuclear technology device, a vapour detector, and an X-ray imaging device, all backed by an intelligence gathering system. In this paper methods are suggested for identifying explosives which may be used in terrorist attacks and for detecting concealed drugs. Techniques discussed are X-ray imaging, combining high and low energy x-ray machines, vapour detection using a ''sniffer'' to collect vapour samples then analysing the vapour by gas chromatography, chemiluminescence and mass spectroscopy and nuclear techniques. Nuclear techniques, such as neutron activation analysis, are discussed in detail but it is stressed that they need to be carried out at speed to eliminate disruption and delay at airports etc. (UK)

  11. Design Methodologies: Industrial and Educational Applications

    NARCIS (Netherlands)

    Tomiyama, T.; Gul, P.; Jin, Y.; Lutters, Diederick; Kind, Ch.; Kimura, F.

    2009-01-01

    The field of Design Theory and Methodology has a rich collection of research results that has been taught at educational institutions as well as applied to design practices. First, this keynote paper describes some methods to classify them. It then illustrates individual theories and methodologies

  12. 24 CFR 904.205 - Training methodology.

    Science.gov (United States)

    2010-04-01

    ... Training methodology. Equal in importance to the content of the pre- and post-occupancy training is the training methodology. Because groups vary, there should be adaptability in the communication and learning experience. Methods to be utilized may include group presentations, small discussion groups, special classes...

  13. Microemulsion extrusion technique: a new method to produce lipid nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, Marcelo Bispo de, E-mail: dejesusmb@gmail.com; Radaic, Allan [University of Campinas-UNICAMP, Department of Biochemistry, Institute of Biology (Brazil); Zuhorn, Inge S. [University of Groningen, Department of Membrane Cell Biology, University Medical Center (Netherlands); Paula, Eneida de [University of Campinas-UNICAMP, Department of Biochemistry, Institute of Biology (Brazil)

    2013-10-15

    Solid lipid nanoparticles (SLN) and nanostructured lipid carriers (NLC) have been intensively investigated for different applications, including their use as drug and gene delivery systems. Different techniques have been employed to produce lipid nanoparticles, of which high pressure homogenization is the standard technique that is adopted nowadays. Although this method has a high efficiency, does not require the use of organic solvents, and allows large-scale production, some limitations impede its application at laboratory scale: the equipment is expensive, there is a need of huge amounts of surfactants and co-surfactants during the preparation, and the operating conditions are energy intensive. Here, we present the microemulsion extrusion technique as an alternative method to prepare lipid nanoparticles. The parameters to produce lipid nanoparticles using microemulsion extrusion were established, and the lipid particles produced (SLN, NLC, and liposomes) were characterized with regard to size (from 130 to 190 nm), zeta potential, and drug (mitoxantrone) and gene (pDNA) delivery properties. In addition, the particles' in vitro co-delivery capacity (to carry mitoxantrone plus pDNA encoding the phosphatase and tensin homologue, PTEN) was tested in normal (BALB 3T3 fibroblast) and cancer (PC3 prostate and MCF-7 breast) cell lines. The results show that the microemulsion extrusion technique is fast, inexpensive, reproducible, free of organic solvents, and suitable for small volume preparations of lipid nanoparticles. Its application is particularly interesting when using rare and/or costly drugs or ingredients (e.g., cationic lipids for gene delivery or labeled lipids for nanoparticle tracking/diagnosis)

  14. Microemulsion extrusion technique: a new method to produce lipid nanoparticles

    International Nuclear Information System (INIS)

    Jesus, Marcelo Bispo de; Radaic, Allan; Zuhorn, Inge S.; Paula, Eneida de

    2013-01-01

    Solid lipid nanoparticles (SLN) and nanostructured lipid carriers (NLC) have been intensively investigated for different applications, including their use as drug and gene delivery systems. Different techniques have been employed to produce lipid nanoparticles, of which high pressure homogenization is the standard technique that is adopted nowadays. Although this method has a high efficiency, does not require the use of organic solvents, and allows large-scale production, some limitations impede its application at laboratory scale: the equipment is expensive, there is a need of huge amounts of surfactants and co-surfactants during the preparation, and the operating conditions are energy intensive. Here, we present the microemulsion extrusion technique as an alternative method to prepare lipid nanoparticles. The parameters to produce lipid nanoparticles using microemulsion extrusion were established, and the lipid particles produced (SLN, NLC, and liposomes) were characterized with regard to size (from 130 to 190 nm), zeta potential, and drug (mitoxantrone) and gene (pDNA) delivery properties. In addition, the particles’ in vitro co-delivery capacity (to carry mitoxantrone plus pDNA encoding the phosphatase and tensin homologue, PTEN) was tested in normal (BALB 3T3 fibroblast) and cancer (PC3 prostate and MCF-7 breast) cell lines. The results show that the microemulsion extrusion technique is fast, inexpensive, reproducible, free of organic solvents, and suitable for small volume preparations of lipid nanoparticles. Its application is particularly interesting when using rare and/or costly drugs or ingredients (e.g., cationic lipids for gene delivery or labeled lipids for nanoparticle tracking/diagnosis)

  15. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
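
The parameter-screening step (step 2) is often done with one-at-a-time elementary effects; a crude sketch of that idea on a toy linear model (a generic illustration, not the PSUADE implementation):

```python
import random

def elementary_effects(model, n_params, n_trajectories=50, delta=0.1, seed=1):
    """Crude one-at-a-time screening: mean absolute elementary effect per
    parameter, estimated from random base points in the unit hypercube."""
    rng = random.Random(seed)
    totals = [0.0] * n_params
    for _ in range(n_trajectories):
        base = [rng.random() * (1 - delta) for _ in range(n_params)]
        f0 = model(base)
        for i in range(n_params):
            pert = list(base)
            pert[i] += delta  # perturb one input, hold the rest fixed
            totals[i] += abs(model(pert) - f0) / delta
    return [t / n_trajectories for t in totals]

# Toy model: x0 is strong, x1 weak, x2 inert
mu = elementary_effects(lambda x: 10 * x[0] + x[1] + 0 * x[2], 3)
print([round(m, 2) for m in mu])  # → [10.0, 1.0, 0.0]
```

Parameters with small mean effects (x2 here) would be frozen, and the quantitative sensitivity analysis of step 3 run only on the survivors.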

  16. 34 CFR 429.1 - What is the Bilingual Vocational Materials, Methods, and Techniques Program?

    Science.gov (United States)

    2010-07-01

    ... techniques for bilingual vocational training for individuals with limited English proficiency. (Authority..., and Techniques Program? 429.1 Section 429.1 Education Regulations of the Offices of the Department of... MATERIALS, METHODS, AND TECHNIQUES PROGRAM General § 429.1 What is the Bilingual Vocational Materials...

  17. Design methodology for integrated downstream separation systems in an ethanol biorefinery

    Science.gov (United States)

    Mohammadzadeh Rohani, Navid

    and obtaining energy security. On the other hand, Process Integration (PI), defined by Natural Resources Canada as the combination of activities which aim at improving process systems, their unit operations and their interactions in order to maximize the efficiency of using water, energy and raw materials, can also help biorefineries lower their energy consumption and improve their economics. Energy integration techniques such as pinch analysis, adopted by different industries over the years, have ensured using heat sources within a plant to supply the demand internally and decrease the external utility consumption. Therefore, adopting energy integration is one of the ways biorefinery technology owners can consider in their process development as well as their business model in order to improve their overall economics. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated for their energy consumption and economics in three different scenarios: stand-alone without energy integration, stand-alone with internal energy integration, and integrated with Kraft. The energy consumption and capital cost of the separation techniques are assessed in each scenario, the cost and benefit of integration are determined, and finally the best alternative is found through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool which provides insights on decreasing energy consumption by modifying the process conditions. The pivot point of this work is the use of a novel energy integration method called Bridge analysis. This systematic method, which is originally intended for retrofit situations, is used here for integration with the Kraft process. Integration potentials are identified through this method and savings are presented for each design. In stand-alone with
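
Pinch analysis, mentioned above, rests on the problem-table (heat-cascade) algorithm; a sketch under textbook assumptions follows (this is the standard pinch calculation, not the Bridge analysis used in the thesis):

```python
def problem_table(streams, dt_min=10.0):
    """Problem-table algorithm: minimum hot and cold utility targets for
    streams given as (t_supply, t_target, CP), CP in kW/K.  Hot streams
    cool (t_supply > t_target); cold streams heat."""
    # Shift hot streams down and cold streams up by dt_min/2
    shifted = []
    for ts, tt, cp in streams:
        shift = -dt_min / 2 if ts > tt else dt_min / 2
        shifted.append((ts + shift, tt + shift, cp))
    temps = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)
    heat = 0.0
    cascade = [0.0]
    for hi, lo in zip(temps, temps[1:]):
        net = 0.0  # interval heat surplus (+) or deficit (-), kW
        for ts, tt, cp in shifted:
            if min(ts, tt) <= lo and max(ts, tt) >= hi:
                net += cp * (hi - lo) * (1 if ts > tt else -1)
        heat += net
        cascade.append(heat)
    # Make the most negative cascade entry zero by importing hot utility
    hot_utility = -min(0.0, min(cascade))
    cold_utility = hot_utility + cascade[-1]
    return hot_utility, cold_utility

# Classic four-stream textbook example, dt_min = 10 K
qh, qc = problem_table([(250, 40, 0.15), (200, 80, 0.25),
                        (20, 180, 0.20), (140, 230, 0.30)])
print(round(qh, 2), round(qc, 2))  # → 7.5 10.0
```

Any heat supplied above these targets is, by definition, avoidable through better internal integration; retrofit methods such as Bridge analysis aim at closing that gap in existing plants.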

  18. An Integrated Methodology for Emulsified Formulated Product Design

    DEFF Research Database (Denmark)

    Mattei, Michele

    The consumer-oriented chemical-based products are used every day by millions of people. They are structured products constituted of numerous chemicals, and many of them, especially household and personal care products, are emulsions where active ingredients, solvents, additives and surfactants are mixed together to determine the desired emulsified product. They are still mainly designed and analysed through trial-and-error based experimental techniques; therefore a systematic approach, integrating model-based as well as experiment-based techniques, for the design of these products could significantly reduce both time and cost connected to product development by doing only the necessary experiments, and ensuring chances for innovation. The main contribution of this project is the development of an integrated methodology for the design of emulsified formulated products. The methodology ...

  19. Proper Methodology and Methods of Collecting and Analyzing Slavery Data: An Examination of the Global Slavery Index

    Directory of Open Access Journals (Sweden)

    Andrew Guth

    2014-11-01

    Full Text Available The Global Slavery Index aims to, among other objectives, recognize the forms, size, and scope of slavery worldwide as well as the strengths and weaknesses of individual countries. An analysis of the Index’s methods exposes significant and critical weaknesses and raises questions into its replicability and validity. The Index may prove more valuable in the future if proper methods are implemented, but the longer improper methods are used the more damage is done to the public policy debate on slavery by advancing data and policy that is not based on sound methodology. To implement proper methods, a committee of sophisticated methodologists needs to develop measurement tools and constantly analyze and refine these methods over the years as data is collected.

  20. The comparison of MCNP perturbation technique with MCNP difference method in critical calculation

    International Nuclear Information System (INIS)

    Liu Bin; Lv Xuefeng; Zhao Wei; Wang Kai; Tu Jing; Ouyang Xiaoping

    2010-01-01

    For a nuclear fission system, we calculated Δk_eff, which arises from system material composition changes, by two different approaches: the MCNP perturbation technique and the MCNP difference method. For every material composition change, we made four different runs, each run with different cycles or each cycle generating different neutrons, and then we compared the two Δk_eff values obtained by the two approaches. When a material composition change in any particular cell of the nuclear fission system is small compared to the material compositions in the whole nuclear fission system, in other words, when this composition change can be treated as a small perturbation, the Δk_eff results obtained from the MCNP perturbation technique are much quicker, much more efficient and more reliable than the results from the MCNP difference method. When a material composition change in any particular cell of the nuclear fission system is significant compared to the material compositions in the whole nuclear fission system, both the MCNP perturbation technique and the MCNP difference method can give satisfactory results. But for runs with the same cycles and each cycle generating the same neutrons, the results obtained from the MCNP perturbation technique are systematically less than the results obtained from the MCNP difference method. To further confirm our calculation results from MCNP4C, we ran the exact same MCNP4C input file in MCNP5; the calculation results from MCNP5 are the same as the calculation results from MCNP4C. Caution is needed when using the MCNP perturbation technique to calculate Δk_eff when the material composition change is large compared to the material compositions in the whole nuclear fission system, even though the material composition changes of any particular cell of the fission system still meet the criteria of the MCNP perturbation technique.
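
The qualitative behavior described here, a first-order perturbation estimate agreeing with a direct difference only while the change is small, can be illustrated on a toy response function (a generic Taylor-expansion analogy, not MCNP itself; the model k(rho) below is invented):

```python
def first_order_perturbation(f, x, dx, h=1e-6):
    """First-order (Taylor) estimate of f(x + dx) - f(x), analogous in
    spirit to a differential-operator perturbation estimate."""
    return (f(x + h) - f(x - h)) / (2 * h) * dx

def direct_difference(f, x, dx):
    """Brute-force analogue of the difference method: evaluate twice
    and subtract."""
    return f(x + dx) - f(x)

k = lambda rho: 1.0 / (1.0 - rho)  # toy stand-in for k_eff vs. composition
for dx in (0.001, 0.2):
    pert = first_order_perturbation(k, 0.5, dx)
    diff = direct_difference(k, 0.5, dx)
    # small dx: the two estimates agree; large dx: they diverge
    print(dx, round(pert, 4), round(diff, 4))
```

For the small change the two estimates are nearly identical, while for the large change the first-order estimate falls well short of the true difference, mirroring the caution in the abstract.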

  1. Assessment of heterogeneous geological environment using geostatistical techniques

    International Nuclear Information System (INIS)

    Toida, Masaru; Suyama, Yasuhiro; Shiogama, Yukihiro; Atsumi, Hiroyuki; Abe, Yasunori; Furuichi, Mitsuaki

    2003-02-01

    'Geoscientific' research at Tono is developing site investigation and assessment techniques for the geological environment. One of its important themes is to establish a rational methodology to reduce uncertainties associated with the understanding of the geological environment, which often exhibits significant heterogeneity. The purpose of this study is to identify and evaluate uncertainties associated with the understanding of the geological environment, because doing so is useful for guiding the design of effective site investigation techniques to reduce the uncertainty. For this, a methodology for uncertainty analysis concerning the heterogeneous geological environment has been developed. In this report the methodology has also been tested through an exercise in the Tono area to demonstrate its applicability. This report is summarized as follows: 1) The exercise shows that the methodology, which considers 'variability' and 'ignorance', demonstrates its applicability in a three-dimensional case. 2) The exercise shows that the methodology can identify and evaluate uncertainties concerning groundwater flow associated with performance assessment. 3) Based on sensitivity analyses, the methodology can support the design of following-stage investigations to reduce the uncertainties efficiently. (author)

  2. The Stonehenge technique. A method for aligning coherent bremsstrahlung radiators

    International Nuclear Information System (INIS)

    Livingston, Ken

    2009-01-01

    This paper describes a technique for the alignment of crystal radiators used to produce high energy, linearly polarized photons via coherent bremsstrahlung scattering at electron beam facilities. In these experiments the crystal is mounted on a goniometer which is used to adjust its orientation relative to the electron beam. The angles and equations which relate the crystal lattice, goniometer and electron beam direction are presented here, and the method of alignment is illustrated with data taken at MAMI (the Mainz microtron). A practical guide to setting up a coherent bremsstrahlung facility and installing new crystals using this technique is also included.

  3. The Stonehenge technique. A method for aligning coherent bremsstrahlung radiators

    Energy Technology Data Exchange (ETDEWEB)

    Livingston, Ken [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom)], E-mail: k.livingston@physics.gla.ac.uk

    2009-05-21

    This paper describes a technique for the alignment of crystal radiators used to produce high energy, linearly polarized photons via coherent bremsstrahlung scattering at electron beam facilities. In these experiments the crystal is mounted on a goniometer which is used to adjust its orientation relative to the electron beam. The angles and equations which relate the crystal lattice, goniometer and electron beam direction are presented here, and the method of alignment is illustrated with data taken at MAMI (the Mainz microtron). A practical guide to setting up a coherent bremsstrahlung facility and installing new crystals using this technique is also included.

  4. The Stonehenge technique. A method for aligning coherent bremsstrahlung radiators

    Science.gov (United States)

    Livingston, Ken

    2009-05-01

    This paper describes a technique for the alignment of crystal radiators used to produce high energy, linearly polarized photons via coherent bremsstrahlung scattering at electron beam facilities. In these experiments the crystal is mounted on a goniometer which is used to adjust its orientation relative to the electron beam. The angles and equations which relate the crystal lattice, goniometer and electron beam direction are presented here, and the method of alignment is illustrated with data taken at MAMI (the Mainz microtron). A practical guide to setting up a coherent bremsstrahlung facility and installing new crystals using this technique is also included.

  5. Performance of neutron activation analysis in the evaluation of bismuth iodide purification methodology

    International Nuclear Information System (INIS)

    Armelin, Maria Jose A.; Ferraz, Caue de Mello; Hamada, Margarida M.

    2015-01-01

    Bismuth tri-iodide (BiI3) is an attractive material for use as a semiconductor. In this paper, BiI3 crystals have been grown by the vertical Bridgman technique using commercially available powder. The impurities were evaluated by instrumental neutron activation analysis (INAA). The results show that INAA is an analytical method appropriate for monitoring the impurities Ag, As, Br, Cr, K, Mo, Na and Sb in the various stages of the BiI3 purification methodology. (author)

  6. A Modeling methodology for NoSQL Key-Value databases

    Directory of Open Access Journals (Sweden)

    Gerardo ROSSEL

    2017-08-01

    Full Text Available In recent years, there has been an increasing interest in the field of non-relational databases. However, far too little attention has been paid to design methodology. Key-value data stores are an important component of a class of non-relational technologies that are grouped under the name of NoSQL databases. The aim of this paper is to propose a design methodology for this type of database that allows overcoming the limitations of the traditional techniques. The proposed methodology leads to a clean design that also allows for better data management and consistency.
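The composite-key style of design that key-value modeling methodologies formalize can be illustrated with a toy sketch. This is not the paper's actual methodology; `make_key`, `put` and `get` are hypothetical helpers, and a plain dict stands in for the key-value store.

```python
# Toy key-value modeling sketch: entities and attributes are encoded into
# composite string keys, the standard pattern in key-value stores.
store = {}

def make_key(*parts):
    """Build a composite key, e.g. 'user:42:name'."""
    return ":".join(str(p) for p in parts)

def put(entity, entity_id, attribute, value):
    store[make_key(entity, entity_id, attribute)] = value

def get(entity, entity_id, attribute):
    return store.get(make_key(entity, entity_id, attribute))

put("user", 42, "name", "Ada")
put("user", 42, "email", "ada@example.com")
```

A design methodology of the kind the abstract describes would then reason about which access patterns each key layout supports, since a key-value store can only look up by exact key.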

  7. Adjustment technique without explicit formation of normal equations /conjugate gradient method/

    Science.gov (United States)

    Saxena, N. K.

    1974-01-01

    For a simultaneous adjustment of a large geodetic triangulation system, a semiiterative technique is modified and used successfully. In this semiiterative technique, known as the conjugate gradient (CG) method, original observation equations are used, and thus the explicit formation of normal equations is avoided, 'huge' computer storage space being saved in the case of triangulation systems. This method is suitable even for very poorly conditioned systems where solution is obtained only after more iterations. A detailed study of the CG method for its application to large geodetic triangulation systems was done that also considered constraint equations with observation equations. It was programmed and tested on systems as small as two unknowns and three equations up to those as large as 804 unknowns and 1397 equations. When real data (573 unknowns, 965 equations) from a 1858-km-long triangulation system were used, a solution vector accurate to four decimal places was obtained in 2.96 min after 1171 iterations (i.e., 2.0 times the number of unknowns).
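The core idea of avoiding explicit normal equations can be sketched with the standard CGLS variant of the conjugate gradient method, which works with the original observation equations A x = b and only ever multiplies by A and its transpose. This is a generic illustration, not the paper's geodetic code, and the tiny test system is invented.

```python
# CGLS: conjugate gradients applied to A^T A x = A^T b without forming A^T A.
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def matvec_t(A, y):  # product with the transpose of A
    return [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]

def cgls(A, b, iters=50):
    x = [0.0] * len(A[0])
    r = b[:]                       # residual b - A x, with x = 0
    s = matvec_t(A, r)             # gradient direction A^T r
    p = s[:]
    gamma = sum(si * si for si in s)
    for _ in range(iters):
        q = matvec(A, p)
        alpha = gamma / sum(qi * qi for qi in q)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * qi for ri, qi in zip(r, q)]
        s = matvec_t(A, r)
        gamma_new = sum(si * si for si in s)
        p = [si + (gamma_new / gamma) * pi for si, pi in zip(s, p)]
        gamma = gamma_new
        if gamma < 1e-24:          # residual gradient vanished: converged
            break
    return x

# Overdetermined system: 3 observation equations, 2 unknowns, solution (2, 1).
A = [[1.0, 1.0], [1.0, -1.0], [2.0, 1.0]]
b = [3.0, 1.0, 5.0]
x = cgls(A, b)
```

Because only matrix-vector products are needed, the (huge, dense) normal-equation matrix never has to be stored, which is precisely the storage saving the abstract describes for large triangulation systems.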

  8. Integrating FMEA in a Model-Driven Methodology

    Science.gov (United States)

    Scippacercola, Fabio; Pietrantuono, Roberto; Russo, Stefano; Esper, Alexandre; Silva, Nuno

    2016-08-01

    Failure Mode and Effects Analysis (FMEA) is a well-known technique for evaluating the effects of potential failures of components of a system. FMEA demands engineering methods and tools able to support the time-consuming tasks of the analyst. We propose to make FMEA part of the design of a critical system, by integrating it into a model-driven methodology. We show how to conduct the analysis of failure modes, propagation and effects from SysML design models, by means of custom diagrams, which we name FMEA Diagrams. They offer an additional view of the system, tailored to FMEA goals. The enriched model can then be exploited to automatically generate the FMEA worksheet and to conduct qualitative and quantitative analyses. We present a case study from a real-world project.

  9. Method for Controlling Space Transportation System Life Cycle Costs

    Science.gov (United States)

    McCleskey, Carey M.; Bartine, David E.

    2006-01-01

    A structured, disciplined methodology is required to control major cost-influencing metrics of space transportation systems during design and continuing through the test and operations phases. This paper proposes controlling key space system design metrics that specifically influence life cycle costs. These are inclusive of flight and ground operations, test, and manufacturing and infrastructure. The proposed technique builds on today's configuration and mass properties control techniques and takes on all the characteristics of a classical control system. While the paper does not lay out a complete math model, key elements of the proposed methodology are explored and explained with both historical and contemporary examples. Finally, the paper encourages modular design approaches and technology investments compatible with the proposed method.

  10. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. However, too many effective exploits and tools exist, and are easily accessible to anyone with an Internet connection, minimal technical skills, and a greatly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing the areas in which cutting-edge exploit development and remediation techniques are occurring can provide significant intelligence on control system network exploitation and defense, and a means of assessing threat without identifying the specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within the black-hat and white-hat communities.
Once a solid understanding of the cutting-edge security research is established, emerging trends in attack methodology can be identified and the gap between

  11. 3D Integrated Methodologies for the Documentation and the Virtual Reconstruction of an Archaeological Site

    Science.gov (United States)

    Balletti, C.; Guerra, F.; Scocca, V.; Gottardi, C.

    2015-02-01

    Highly accurate documentation and 3D reconstructions are fundamental for analyses and further interpretations in archaeology. In recent years the integrated digital survey (ground-based survey methods and UAV photogrammetry) has confirmed its central role in the documentation and comprehension of excavation contexts, thanks to instrumental and methodological developments in on-site data acquisition. The specific aim of the project reported in this paper, realized by the Laboratory of Photogrammetry of the IUAV University of Venice, is to check different acquisition systems and test their effectiveness, considering each methodology individually or in integration. This research builds on the awareness that integrating different survey methodologies can in fact increase the representative efficacy of the final representations, which are then based on a wider and verified set of georeferenced metric data. In particular, integrating methods makes it possible to reduce or neutralize the issues raised by surveying composite and complex objects, since the most appropriate tools and techniques can be chosen for the characteristics of each part of an archaeological site (i.e. urban structures, architectural monuments, small findings). This paper describes the experience in several sites of the municipality of Sepino (Molise, Italy), where the 3D digital acquisition of cities and monument structures, sometimes hard to reach, was realized using active and passive techniques (range-based and image-based methods). The acquisition was planned to obtain not only the basic support for interpretation analysis, but also models of the actual state of conservation of the site on which reconstructive hypotheses can be based. Laser scanning data were merged with point clouds from Structure from Motion techniques in the same reference system, given by a topographical and GPS survey. 
These 3D models are not only the final results of the metric

  12. A Novel Technique to Enhance Demand Responsiveness

    DEFF Research Database (Denmark)

    Farashbashi-Astaneh, Seyed-Mostafa; Bhattarai, Bishnu Prasad; Bak-Jensen, Birgitte

    2015-01-01

    In this study, a new pricing approach is proposed to increase demand responsiveness. The proposed approach builds on two well-known demand side management techniques, namely peak shaving and valley filling, by incentivising consumers through a magnified price difference between peak and off-peak hours. The usefulness of the suggested method is then investigated in combination with an electric vehicle optimal scheduling methodology which captures both economic valuation and grid technical constraints. This case is chosen in this study to address network congestion issues, namely under...
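The peak/off-peak magnification idea can be sketched as a simple transform. This is a toy illustration with invented prices and scale factor, not the paper's actual pricing model.

```python
# Toy price-magnification sketch: scale each hourly price's deviation from the
# daily mean by k > 1, so peak hours become relatively more expensive and
# off-peak hours cheaper, while the mean price is preserved.
def magnify_prices(prices, k=2.0):
    mean = sum(prices) / len(prices)
    return [mean + k * (p - mean) for p in prices]

base = [30.0, 28.0, 50.0, 80.0]      # illustrative EUR/MWh, off-peak to peak
shaped = magnify_prices(base, k=2.0)
```

Preserving the mean keeps the consumer's cost neutral for an unchanged load profile; only shifting consumption toward the cheapened off-peak hours yields savings, which is the responsiveness incentive.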

  13. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    Science.gov (United States)

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  14. Quantitative renal cinescintigraphy with iodine-123 hippuran methodological aspects, kit for labeling of hippuran

    International Nuclear Information System (INIS)

    Mehdaoui, A.; Pecking, A.; Delorme, G.; Mathonnat, F.; Debaud, B.; Bardy, A.; Coornaert, S.; Merlin, L.; Vinot, J.M.; Desgrez, A.; Gambini, D.; Vernejoul, P. de.

    1981-08-01

    The development of an extemporaneous kit for the labeling of ortho-iodo-hippuric acid (Hippuran) with iodine 123 allows the performance of a routine quantitative renal cinescintigraphy providing in 20 minutes, and in an absolutely non-traumatic way, a very complete renal morphofunctional study including: a cortical renal scintigraphy, sequential scintigraphies of excretory tract, renal functional curves, tubular, global, and separate clearances for each kidney. This functional quantitative investigation method should take a preferential place in the routine renal balance. The methodology of the technique is explained and compared to classical methods for estimation of tubular, global and separate clearances [fr

  15. Thematic Analysis of the Children's Drawings on Museum Visit: Adaptation of the Kuhn's Method

    Science.gov (United States)

    Kisovar-Ivanda, Tamara

    2014-01-01

    Researchers are using techniques that allow children to express their perspectives. In 2003, Kuhn developed a method of data collection and analysis that combined thematic drawing with a focused, episodic interview. In this article, Kuhn's method is adapted using the draw-and-write technique as a research methodology. Reflections on the…

  16. Methodology for Evaluating Safety System Operability using Virtual Parameter Network

    International Nuclear Information System (INIS)

    Park, Sukyoung; Heo, Gyunyoung; Kim, Jung Taek; Kim, Tae Wan

    2014-01-01

    KAERI (Korea Atomic Energy Research Institute) and UTK (University of Tennessee Knoxville) are working on an I-NERI project to address this problem. This research, together with KAERI, proposes a methodology that provides an alternative signal when the reliability of some instrumentation cannot be guaranteed. The proposed methodology assumes that several instruments are working normally under the power supply conditions, because instrument survivability itself is not considered. The concept of the Virtual Parameter Network (VPN) is thus used to identify the associations between plant parameters. This paper is an extended version of the paper submitted at the last KNS meeting, with a revised methodology and the results of a case study added. In previous research, an Artificial Neural Network (ANN) inferential technique was used as the estimation model, but that model produced a different estimate on every run because of random bias. An Auto-Associative Kernel Regression (AAKR) model, which has the same number of inputs and outputs, is therefore used for estimation. Moreover, the importance measure in the previous method depended on the estimation model, whereas the importance measure of the improved method is independent of the estimation model. In this study, we proposed a methodology to identify the internal state of a power plant when a severe accident happens, and it has been validated through a case study. An SBLOCA, which contributes strongly to severe accidents, is considered as the initiating event, and the relationships among parameters have been identified. The VPN is able to identify which parameters have to be observed and which parameters can act as alternatives to a missing parameter when some instruments fail in a severe accident. Through the results we have identified that parameters 2, 3 and 4 commonly have high connectivity while
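The AAKR estimator named in the abstract can be sketched generically as a kernel-weighted average of fault-free memory vectors. This is a textbook-style illustration with invented data, not the authors' implementation; the bandwidth `h` is an assumed parameter.

```python
import math

def aakr(memory, query, h=1.0):
    """Auto-associative kernel regression: the estimate has the same number of
    inputs and outputs, and is a Gaussian-kernel-weighted average of the
    stored (fault-free) memory vectors nearest to the query."""
    weights = []
    for row in memory:
        d2 = sum((q - r) ** 2 for q, r in zip(query, row))
        weights.append(math.exp(-d2 / (2 * h * h)))
    total = sum(weights)
    n = len(query)
    return [sum(w * row[j] for w, row in zip(weights, memory)) / total
            for j in range(n)]

# Toy memory of fault-free 3-parameter plant states.
memory = [[1.0, 2.0, 3.0], [1.1, 2.1, 3.1], [5.0, 6.0, 7.0]]
est = aakr(memory, [1.05, 2.05, 3.05], h=0.5)
```

Because the output dimension equals the input dimension, the estimate for a failed instrument's channel can be read off from the reconstructed vector, which is how an alternative signal can be supplied.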

  17. Methodology for Evaluating Safety System Operability using Virtual Parameter Network

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sukyoung; Heo, Gyunyoung [Kyung Hee Univ., Yongin (Korea, Republic of); Kim, Jung Taek [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kim, Tae Wan [Kepco International Nuclear Graduate School, Ulsan (Korea, Republic of)

    2014-05-15

    KAERI (Korea Atomic Energy Research Institute) and UTK (University of Tennessee Knoxville) are working on an I-NERI project to address this problem. This research, together with KAERI, proposes a methodology that provides an alternative signal when the reliability of some instrumentation cannot be guaranteed. The proposed methodology assumes that several instruments are working normally under the power supply conditions, because instrument survivability itself is not considered. The concept of the Virtual Parameter Network (VPN) is thus used to identify the associations between plant parameters. This paper is an extended version of the paper submitted at the last KNS meeting, with a revised methodology and the results of a case study added. In previous research, an Artificial Neural Network (ANN) inferential technique was used as the estimation model, but that model produced a different estimate on every run because of random bias. An Auto-Associative Kernel Regression (AAKR) model, which has the same number of inputs and outputs, is therefore used for estimation. Moreover, the importance measure in the previous method depended on the estimation model, whereas the importance measure of the improved method is independent of the estimation model. In this study, we proposed a methodology to identify the internal state of a power plant when a severe accident happens, and it has been validated through a case study. An SBLOCA, which contributes strongly to severe accidents, is considered as the initiating event, and the relationships among parameters have been identified. The VPN is able to identify which parameters have to be observed and which parameters can act as alternatives to a missing parameter when some instruments fail in a severe accident. Through the results we have identified that parameters 2, 3 and 4 commonly have high connectivity while

  18. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  19. Methodology for Clustering High-Resolution Spatiotemporal Solar Resource Data

    Energy Technology Data Exchange (ETDEWEB)

    Getman, Dan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dyson, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2015-09-01

    In this report, we introduce a methodology to achieve multiple levels of spatial resolution reduction of solar resource data, with minimal impact on data variability, for use in energy systems modeling. The selection of an appropriate clustering algorithm, parameter selection including cluster size, methods of temporal data segmentation, and methods of cluster evaluation are explored in the context of a repeatable process. In describing this process, we illustrate the steps in creating a reduced resolution, but still viable, dataset to support energy systems modeling, e.g. capacity expansion or production cost modeling. This process is demonstrated through the use of a solar resource dataset; however, the methods are applicable to other resource data represented through spatiotemporal grids, including wind data. In addition to energy modeling, the techniques demonstrated in this paper can be used in a novel top-down approach to assess renewable resources within many other contexts that leverage variability in resource data but require reduction in spatial resolution to accommodate modeling or computing constraints.
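One plausible instance of the clustering step is plain k-means over per-cell resource profiles, with each cluster represented by its mean profile. The report does not prescribe this exact algorithm or data; everything below is an invented toy.

```python
import random

# Reduce spatial resolution by clustering grid cells with similar resource
# profiles, then keeping one representative (mean) profile per cluster.
def kmeans(profiles, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(profiles, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in profiles:
            # assign each cell to the nearest center (squared distance)
            j = min(range(k), key=lambda c: sum((a - b) ** 2
                                                for a, b in zip(p, centers[c])))
            groups[j].append(p)
        # recompute each center as the mean profile of its group
        centers = [[sum(col) / len(g) for col in zip(*g)] if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers

# Toy capacity-factor profiles for 6 grid cells, two obvious regimes.
cells = [[0.2, 0.6], [0.21, 0.61], [0.19, 0.59],
         [0.7, 0.3], [0.71, 0.31], [0.69, 0.29]]
centers = kmeans(cells, k=2)
```

The report's additional concerns, choice of algorithm, cluster count, temporal segmentation and cluster evaluation, sit around exactly this kernel: the aim is that the retained cluster means preserve the variability the energy model needs.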

  20. A design methodology for unattended monitoring systems

    International Nuclear Information System (INIS)

    SMITH, JAMES D.; DELAND, SHARON M.

    2000-01-01

    The authors presented a high-level methodology for the design of unattended monitoring systems, focusing on a system to detect diversion of nuclear materials from a storage facility. The methodology is composed of seven, interrelated analyses: Facility Analysis, Vulnerability Analysis, Threat Assessment, Scenario Assessment, Design Analysis, Conceptual Design, and Performance Assessment. The design of the monitoring system is iteratively improved until it meets a set of pre-established performance criteria. The methodology presented here is based on other, well-established system analysis methodologies and hence they believe it can be adapted to other verification or compliance applications. In order to make this approach more generic, however, there needs to be more work on techniques for establishing evaluation criteria and associated performance metrics. They found that defining general-purpose evaluation criteria for verifying compliance with international agreements was a significant undertaking in itself. They finally focused on diversion of nuclear material in order to simplify the problem so that they could work out an overall approach for the design methodology. However, general guidelines for the development of evaluation criteria are critical for a general-purpose methodology. A poor choice in evaluation criteria could result in a monitoring system design that solves the wrong problem

  1. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  2. Application of Response Surface Methodology in Development of Sirolimus Liposomes Prepared by Thin Film Hydration Technique

    Directory of Open Access Journals (Sweden)

    Saeed Ghanbarzadeh

    2013-04-01

    Full Text Available Introduction: The present investigation aimed to optimize the formulation process of sirolimus liposomes prepared by the thin film hydration method. Methods: In this study, a 3² factorial design was used to investigate the influence of two independent variables on the preparation of sirolimus liposomes. The dipalmitoylphosphatidylcholine (DPPC)/cholesterol (Chol) and dioleoyl phosphoethanolamine (DOPE)/DPPC molar ratios were selected as the independent variables. Particle size (PS) and encapsulation efficiency (EE % were selected as the dependent variables. Dialysis was used to separate the un-encapsulated drug, and drug analysis was performed with a validated RP-HPLC method. Results: Using response surface methodology, and based on the coefficient values obtained for the independent variables in the regression equations, it was clear that the DPPC/Chol molar ratio was the major variable contributing to particle size and EE %. The statistical approach allowed the individual and interaction effects of the influencing parameters to be assessed, in order to obtain liposomes with the desired properties and to determine the optimum experimental conditions leading to enhanced characteristics. In the prediction of PS and EE % values, the average percent errors were found to be 3.59% and 4.09%. These values are sufficiently low to confirm the high predictive power of the model. Conclusion: The experimental results show that the observed responses were in close agreement with the predicted values, demonstrating the reliability of the optimization procedure in predicting PS and EE % in sirolimus liposome preparation.
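The 3² factorial design itself is easy to sketch: with the two factors coded to levels -1, 0, +1, the design columns are orthogonal, so the main-effect coefficients of a fitted model reduce to simple ratios. This is illustrative only; the study's data and full quadratic response-surface model are not reproduced, and the response values below are invented.

```python
from itertools import product

# All 9 runs of a 3^2 factorial design in coded units (-1, 0, +1).
design = list(product([-1, 0, 1], repeat=2))

def main_effects(design, y):
    """For orthogonal coded columns, b_j = sum(x_j * y) / sum(x_j ** 2)."""
    b0 = sum(y) / len(y)                      # intercept = mean response
    bs = []
    for j in range(2):
        col = [run[j] for run in design]
        bs.append(sum(c * yi for c, yi in zip(col, y)) /
                  sum(c * c for c in col))
    return b0, bs

# Fake linear response y = 10 + 3*x1 - 2*x2 (think: EE % vs the two ratios).
y = [10 + 3 * x1 - 2 * x2 for x1, x2 in design]
b0, (b1, b2) = main_effects(design, y)
```

A full response-surface fit would add quadratic and interaction terms via least squares; the orthogonality shown here is why coded factorial designs make those coefficient estimates independent of one another.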

  3. Methodology For Determination Of Space Control For 3D Reconstruction In Statscan Digital X-Ray Radiology Using Static Frame Model

    Directory of Open Access Journals (Sweden)

    Jacinta S. Kimuyu

    2015-08-01

    Full Text Available The methodology was designed to employ two positioning techniques to determine the three-dimensional control space of target points on a static metal frame model, to be used as space control data in 3D reconstruction from Statscan digital X-ray imaging. These techniques were digital close-range photogrammetry and precise theodolite positioning. The space coordinates of the target points were determined in 3D using both techniques. A point positioning accuracy of 0.5 mm in root mean square error of the X, Y and Z space coordinates was achieved. The results obtained from the two methods agreed with satisfactory accuracy, supporting further use of the control space data in Statscan imaging and 3D reconstruction.
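The quoted accuracy amounts to a per-axis root-mean-square-error check between the two coordinate sets, which can be sketched as below. The coordinates are invented for illustration, not taken from the study.

```python
import math

# Per-axis RMSE between two 3D coordinate solutions of the same targets.
def rmse_per_axis(coords_a, coords_b):
    n = len(coords_a)
    return [math.sqrt(sum((a[k] - b[k]) ** 2
                          for a, b in zip(coords_a, coords_b)) / n)
            for k in range(3)]

# Hypothetical target coordinates (metres) from the two techniques.
photo = [(0.0, 0.0, 0.0), (100.0, 50.0, 10.0)]
theo  = [(0.0003, -0.0004, 0.0), (100.0003, 49.9996, 10.0)]
rmse = rmse_per_axis(photo, theo)
ok = all(r <= 0.0005 for r in rmse)   # within 0.5 mm per axis
```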

  4. Methodological basis for formation of uninterruptible education content for future specialists of atomic-nuclear complex

    International Nuclear Information System (INIS)

    Burtebayev, N.; Burtebayeva, J.T.; Basharuly, R.; Altynsarin, Y.

    2009-01-01

    being able to self-orientate in the nonstandard situations that arise on a global, regional or specifically national scale. Otherwise, that is, in breach of the dialectical unity of the 'common', the 'special' and the 'single', any system or model formed without one of these components is only an empty abstraction [2] that, as Hegel warned, is of no benefit to practice. Naturally, this conclusion has deep scientific-methodological significance for forming the content of any national system of uninterruptible education. On the other hand, the science-wide principles of system analysis, such as the continuity principle, the integrity principle, the principle of functional adequacy and the rules of system sub-optimization, form the main methodological basis for the formation of education content in general, and for specialists of the atomic-nuclear profile in particular. Based on these principles of system analysis, which stand at the second level of the science-wide methodological approach, the content of the uninterrupted education system is classified into traditional general education, polytechnic (pre-profile and profile) education, and professional education [3]. These components are present at all levels and stages of the system, in different proportions. This shows that the main point of the investigation is not only determining the education content for specialists of the nuclear profile, but also giving it an optimal form at all levels and stages of general, polytechnic (pre-profile and profile) and professional education. The laws, principles and provisions of the specific sciences, representing the particular-science level of the methodological approach, must at least be involved for these aims. 
The laws, principles and provisions of physics, aimed at determining the education content, and those of pedagogy and psychology, aimed at the aims, forms, methods and techniques of 'methods system' teaching, are mainly used for the determination of education

  5. Formation factor logging in-situ by electrical methods. Background and methodology

    International Nuclear Information System (INIS)

    Loefgren, Martin; Neretnieks, Ivars

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor can also be measured by electrical methods, which are faster and less expensive. Previously this has only been done quantitatively in the laboratory, on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity can be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This can be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has sufficient vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the
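The in-situ calculation described above reduces to a ratio of resistivities. Under the report's convention the formation factor depends only on pore geometry, so it is taken here as pore-water resistivity over rock resistivity (the reciprocal of Archie's classical formation factor); the values are invented for illustration.

```python
# Electrical estimate of the (geometric) formation factor in saline groundwater:
# only the rock resistivity (from a downhole log) and the pore-water
# resistivity are needed.
def formation_factor(rho_rock_ohm_m, rho_water_ohm_m):
    return rho_water_ohm_m / rho_rock_ohm_m

rho_rock = 10000.0   # ohm*m, hypothetical value from a resistivity log
rho_water = 0.5      # ohm*m, hypothetical saline groundwater
Ff = formation_factor(rho_rock, rho_water)   # dimensionless, of order 1e-5
```

Values of this order are typical of tight crystalline rock, which is why high vertical resolution matters: a single water-bearing fracture in the measurement volume can dominate the apparent rock resistivity.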

  6. Formation factor logging in-situ by electrical methods. Background and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Loefgren, Martin; Neretnieks, Ivars [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Chemical Engineering and Technology

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing matrix diffusion are the formation factor, surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter that depends only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor can also be measured by electrical methods, which are faster and less expensive. Previously this has only been done quantitatively in the laboratory, on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity can be obtained with a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This can be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has sufficient vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the
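The electrical logging principle described above reduces to a ratio of resistivities: with saline pore water (negligible surface conduction) the formation factor is the pore-water resistivity divided by the rock resistivity, and it scales the free-water diffusivity down to an effective matrix diffusivity. A minimal sketch, with illustrative resistivity and diffusivity values that are not taken from the report:

```python
def formation_factor(rock_resistivity_ohm_m, pore_water_resistivity_ohm_m):
    """Electrical formation factor F = rho_w / rho_r.

    Under the report's definition F depends only on pore geometry, so the
    effective diffusivity of a non-sorbing species follows as D_e = F * D_w.
    Valid only when surface conduction is negligible (saline groundwater).
    """
    return pore_water_resistivity_ohm_m / rock_resistivity_ohm_m

# Illustrative numbers: 10 000 ohm*m granite, 1 ohm*m saline pore water
F = formation_factor(10_000, 1.0)   # 1e-4, dimensionless
D_w = 2.0e-9                        # assumed free-water diffusivity, m^2/s
D_e = F * D_w                       # effective matrix diffusivity, m^2/s
```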

  7. Methodology to calculate wall thickness in metallic pipes

    International Nuclear Information System (INIS)

    Ramirez, G.F.; Feliciano, H.J.

    1992-01-01

    The principal objective in industrial activities is to carry out work efficiently and productively, which necessarily implies knowing the best working conditions of the equipment and installations concerned. Radioisotope techniques have long served as useful tools in several fields of human work. For example, in the Petroleos Mexicanos petrochemical complexes, for safety reasons and to minimise losses, the operating regimes of the pipelines carrying hydrocarbons must be known as precisely as possible, so that defective or worn pieces can be replaced in time. The Mexican Petroleum Institute is carrying out a project whose objective is to develop a methodology, based on the use of radioisotopes, for determining the average wall thickness of metallic pipes covered with thermal insulation, with a precision of ±0.127 mm (±5 thousandths of an inch). The method is based on the use of radiation emitted by Cs-137 sources. This work describes the development of the methodology as well as the principal results obtained. (Author)

  8. A Preconditioning Technique for First-Order Primal-Dual Splitting Method in Convex Optimization

    Directory of Open Access Journals (Sweden)

    Meng Wen

    2017-01-01

    Full Text Available We introduce a preconditioning technique for the first-order primal-dual splitting method. The primal-dual splitting method offers a very general framework for solving a large class of optimization problems arising in image processing. The key idea of the preconditioning technique is that the constant iterative parameters are updated self-adaptively in the iteration process. We also give a simple and easy way to choose the diagonal preconditioners while the convergence of the iterative algorithm is maintained. The efficiency of the proposed method is demonstrated on an image denoising problem. Numerical results show that the preconditioned iterative algorithm performs better than the original one.
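A minimal sketch (our own, not the authors' code) of the kind of scheme the abstract describes: a first-order primal-dual iteration for 1-D total-variation denoising in which the diagonal preconditioners are computed from row and column sums of the operator, so the step sizes adapt to the operator instead of being hand-tuned constants:

```python
import numpy as np

def preconditioned_pdhg(f, lam=1.0, n_iter=200):
    """Diagonally preconditioned primal-dual (PDHG) iteration for 1-D
    total-variation denoising: min_x 0.5||x - f||^2 + lam*||Dx||_1.
    Step sizes follow the diagonal rule tau_j = 1/sum_i |K_ij| and
    sigma_i = 1/sum_j |K_ij|, which guarantees convergence without
    estimating the operator norm ||K||."""
    f = np.asarray(f, dtype=float)
    n = f.size
    # forward-difference operator D: (n-1) x n
    D = np.zeros((n - 1, n))
    idx = np.arange(n - 1)
    D[idx, idx] = -1.0
    D[idx, idx + 1] = 1.0
    tau = 1.0 / np.abs(D).sum(axis=0)      # primal steps, one per column
    sigma = 1.0 / np.abs(D).sum(axis=1)    # dual steps, one per row
    x = f.copy()
    x_bar = f.copy()
    y = np.zeros(n - 1)
    for _ in range(n_iter):
        # dual step: ascent then projection onto {|y_i| <= lam}
        y = np.clip(y + sigma * (D @ x_bar), -lam, lam)
        # primal step: descent then prox of 0.5||x - f||^2
        x_new = (x - tau * (D.T @ y) + tau * f) / (1.0 + tau)
        x_bar = 2.0 * x_new - x
        x = x_new
    return x
```

The same structure extends to 2-D image denoising by replacing `D` with a discrete image gradient.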

  9. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    The human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that combine to produce an error in the human tasks performed under normal operating conditions and in those carried out after an abnormal event. Additionally, the analysis of various accidents in history has found the human component to be a contributing cause. The need to understand the forms and probability of human error led, in the 1960s, to the collection of generic data that resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed that include in their models additional performance shaping factors and the interactions between them. Thus, by the mid 1990s, what are considered the second generation methodologies emerged. Among these is A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependencies between actions. That is, the generic human failure event first required an independent evaluation of the two related human failure events. Thus, obtaining the new human error probabilities involves quantifying the nominal scenario and the cases of significant deviations, selected for their potential impact on the analyzed human failure events. As in probabilistic safety analysis, the analysis of the sequences yielded the more specific factors with the highest contribution to the human error probabilities. (Author)

  10. Application of Taguchi methodology to improve the functional quality of a mechanical device

    International Nuclear Information System (INIS)

    Regeai, Awatef Omar

    2005-01-01

    Manufacturing and quality control are recognized branches of engineering management. Special attention has been paid to improving the tools and methods used to raise product quality and to resolve obstacles and problems arising during the production process. Taguchi methodology is one of the most powerful techniques for improving product and manufacturing process quality at low cost. It is a strategic and practical method that aims to assist managers and industrial engineers in tackling manufacturing quality problems in a systematic and structured manner. The potential benefit of Taguchi methodology lies in its ease of use, its emphasis on reducing variability to give more economical products, and hence its accessibility to the engineering fraternity for solving real-life quality problems. This study applies Taguchi methodology to improve the functional quality of a locally made chain gear through a proposed heat treatment process. The hardness of steel is generally a function not of its composition alone, but rather of its heat treatment. The study investigates the effects of various heat treatment parameters, including ramp rate of heating, normalizing holding time, normalizing temperature, annealing holding time, annealing temperature, hardening holding time, hardening temperature, quenching media, tempering temperature and tempering holding time, upon the hardness, which is a measure of resistance to plastic deformation. Both the analysis of means (ANOM) and the signal-to-noise ratio (S/N) have been used to determine the optimal condition of the process. A significant improvement of the functional quality characteristic (hardness) by more than 32% was obtained. The scanning electron microscopy technique was used in this study to obtain visual evidence of the quality and continuous improvement of the heat-treated samples. (author)
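The S/N and ANOM steps mentioned above can be sketched in a few lines. Since hardness is a "larger-the-better" response, the Taguchi S/N ratio is -10·log10(mean(1/y²)); ANOM then averages S/N per factor level and picks the level with the highest mean. All run data and level labels below are hypothetical, not from the study:

```python
import math

def sn_larger_is_better(ys):
    """Taguchi signal-to-noise ratio for a larger-the-better response such
    as hardness: S/N = -10*log10(mean(1/y^2)); higher S/N is better."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

def anom(levels, sn_values):
    """Analysis of means (ANOM): average S/N per factor level; the level
    with the highest mean S/N is chosen as the optimal setting."""
    means = {}
    for lvl, sn in zip(levels, sn_values):
        means.setdefault(lvl, []).append(sn)
    return {lvl: sum(v) / len(v) for lvl, v in means.items()}

# Hypothetical hardness replicates (HRC) from four runs of one two-level factor
runs = [[52, 54, 53], [49, 50, 48], [55, 56, 54], [47, 46, 48]]
run_levels = ["A1", "A2", "A1", "A2"]
sn = [sn_larger_is_better(r) for r in runs]
level_means = anom(run_levels, sn)
best_level = max(level_means, key=level_means.get)
```

In a full study the same averaging is repeated for every factor column of the orthogonal array.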

  11. Star identification methods, techniques and algorithms

    CERN Document Server

    Zhang, Guangjun

    2017-01-01

    This book summarizes the research advances in star identification that the author’s team has made over the past 10 years, systematically introducing the principles of star identification, general methods, key techniques and practicable algorithms. It also offers examples of hardware implementation and performance evaluation for the star identification algorithms. Star identification is the key step for celestial navigation and greatly improves the performance of star sensors, and as such the book includes the fundamentals of star sensors and celestial navigation, the processing of the star catalog and star images, star identification using modified triangle algorithms, star identification using star patterns and using neural networks, rapid star tracking using star matching between adjacent frames, as well as implementation hardware and performance tests for star identification. It is not only valuable as a reference book for star sensor designers and researchers working in pattern recognition and othe...

  12. Models of expected returns on the brazilian market: Empirical tests using predictive methodology

    Directory of Open Access Journals (Sweden)

    Adriano Mussa

    2009-01-01

    Full Text Available Predictive methodologies for testing expected returns models are widely diffused in the international academic environment. However, these methods have not been used in Brazil in a systematic way. Generally, empirical studies conducted with Brazilian stock market data are concentrated only in the first step of these methodologies. The purpose of this article was to test and compare the CAPM, 3-factor and 4-factor models using a predictive methodology, considering two steps – time-series and cross-section regressions – with standard errors obtained by the techniques of Fama and MacBeth (1973). The results indicated the superiority of the 4-factor model as compared to the 3-factor model, and the superiority of the 3-factor model as compared to the CAPM, but none of the tested models was sufficient to explain Brazilian stock returns. Contrary to some empirical evidence that does not use predictive methodology, the size and momentum effects seem not to exist in the Brazilian capital markets, but there is evidence of the value effect and of the relevance of the market factor in explaining expected returns. These findings raise some questions, mainly due to the novelty of the methodology in the local market and to the fact that this subject is still incipient and polemic in the Brazilian academic environment.
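The two-step procedure referred to above can be sketched generically: a first pass of time-series regressions estimates each asset's factor betas, and a second pass runs one cross-sectional regression per period, with the reported premia and standard errors coming from the time series of cross-sectional coefficients. This is a generic Fama-MacBeth implementation, not the authors' code, and the array shapes are our own convention:

```python
import numpy as np

def fama_macbeth(returns, factors):
    """Sketch of the Fama-MacBeth (1973) two-pass procedure.
    returns: (T, N) excess returns for N assets; factors: (T, K) factor returns.
    Pass 1: time-series OLS per asset -> betas, shape (N, K).
    Pass 2: cross-sectional OLS of each period's returns on the betas ->
    (T, K+1) premia; estimates and standard errors come from their time
    series, which is what corrects for cross-sectional correlation."""
    T, N = returns.shape
    X = np.column_stack([np.ones(T), factors])                  # (T, K+1)
    betas = np.linalg.lstsq(X, returns, rcond=None)[0][1:].T    # (N, K)
    Z = np.column_stack([np.ones(N), betas])                    # (N, K+1)
    lambdas = np.array([np.linalg.lstsq(Z, returns[t], rcond=None)[0]
                        for t in range(T)])                     # (T, K+1)
    est = lambdas.mean(axis=0)                  # [intercept, premia...]
    se = lambdas.std(axis=0, ddof=1) / np.sqrt(T)   # Fama-MacBeth s.e.
    return est, se
```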

  13. Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM) and genetic algorithm method (GA)

    Science.gov (United States)

    Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    In this study, computer-aided engineering was used for injection moulding simulation. Design of experiments (DOE) was utilized according to a Latin square orthogonal array. The relationships between the injection moulding parameters and warpage were identified from the experimental data. Response surface methodology (RSM) was used to validate the model accuracy. The RSM and GA methods were then combined to find the optimum injection moulding process parameters. The optimisation of injection moulding is thereby largely improved, and the results show increased accuracy and reliability. The proposed method of combining RSM and GA also contributes to minimising the warpage that occurs.
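The RSM-plus-GA combination can be sketched as: fit a polynomial response surface to the DOE results, then let a genetic algorithm search that surface for the warpage-minimising setting. The sketch below is a one-factor toy version with synthetic data and a minimal real-coded GA, not the authors' multi-parameter moulding model:

```python
import random
import numpy as np

def fit_response_surface(xs, ys):
    """RSM step (1-D illustration): least-squares quadratic
    y ~ c*x^2 + b*x + a. Real warpage studies fit a multi-factor surface
    to the DOE runs."""
    return np.polyfit(xs, ys, 2)            # returns [c, b, a]

def ga_minimise(coeffs, lo, hi, pop=30, gens=60, seed=1):
    """Minimal real-coded GA: tournament selection, blend crossover and
    Gaussian mutation, searching the fitted surface for its minimiser."""
    rng = random.Random(seed)
    c, b, a = coeffs
    f = lambda x: a + b * x + c * x * x
    P = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        children = []
        for _ in range(pop):
            p1 = min(rng.sample(P, 3), key=f)     # tournament winner 1
            p2 = min(rng.sample(P, 3), key=f)     # tournament winner 2
            child = 0.5 * (p1 + p2) + rng.gauss(0.0, 0.05 * (hi - lo))
            children.append(min(max(child, lo), hi))  # clip to bounds
        P = children
    return min(P, key=f)

# Hypothetical DOE: warpage (mm) versus one coded process parameter in [0, 6]
xs = [0, 1, 2, 3, 4, 5, 6]
ys = [(x - 3.0) ** 2 + 2.0 for x in xs]    # synthetic response, minimum at 3
best_setting = ga_minimise(fit_response_surface(xs, ys), 0.0, 6.0)
```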

  14. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
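The combination of building protection with population distribution described above can be reduced to a population-weighted sum of reciprocal protection factors. A minimal sketch with entirely hypothetical occupancy fractions and protection factors, not values from the report:

```python
def regionally_averaged_dose(outdoor_dose, occupancy):
    """Combine building protection factors (PF) with the fraction of the
    population in each location, in the spirit of a Regional Shelter
    Analysis: mean received dose = outdoor dose * sum_i f_i / PF_i.
    occupancy: list of (population_fraction, protection_factor) pairs."""
    assert abs(sum(f for f, _ in occupancy) - 1.0) < 1e-9
    return outdoor_dose * sum(f / pf for f, pf in occupancy)

# Hypothetical daytime posture: 60% in offices (PF 10), 30% in houses (PF 3),
# 10% outdoors (PF 1)
dose = regionally_averaged_dose(100.0, [(0.6, 10), (0.3, 3), (0.1, 1)])
# = 100 * (0.06 + 0.1 + 0.1)
```

Separate occupancy tables for night versus day, or warned versus unwarned postures, plug into the same function.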

  15. A Clustering Methodology of Web Log Data for Learning Management Systems

    Science.gov (United States)

    Valsamidis, Stavros; Kontogiannis, Sotirios; Kazanidis, Ioannis; Theodosiou, Theodosios; Karakos, Alexandros

    2012-01-01

    Learning Management Systems (LMS) collect large amounts of data. Data mining techniques can be applied to analyse their web data log files. The instructors may use this data for assessing and measuring their courses. In this respect, we have proposed a methodology for analysing LMS courses and students' activity. This methodology uses a Markov…

  16. Field Application of Cable Tension Estimation Technique Using the h-SI Method

    Directory of Open Access Journals (Sweden)

    Myung-Hyun Noh

    2015-01-01

    Full Text Available This paper investigates the field applicability of a new system identification technique for estimating the tensile force in cables of long-span bridges. The newly proposed h-SI method, which combines a sensitivity updating algorithm with an advanced hybrid microgenetic algorithm, not only avoids the trap of local minima at the initial searching stage but also finds the optimal solution with better numerical efficiency than existing methods. First, this paper overviews the procedure of tension estimation through a theoretical formulation. Secondly, the validity of the proposed technique is numerically examined using a set of dynamic data obtained from benchmark numerical samples considering the effect of sag extensibility and bending stiffness of a sag-cable system. Finally, the feasibility of the proposed method is investigated through actual field data extracted from the cable-stayed Seohae Bridge. The test results show that the existing methods require precise initial data in advance, whereas the proposed method is not affected by such initial information. In particular, the proposed method improves accuracy and the convergence rate toward final values. Consequently, the proposed method can be more effective than existing methods in characterizing the tensile force variation of cable structures.

  17. Headspace mass spectrometry methodology: application to oil spill identification in soils

    Energy Technology Data Exchange (ETDEWEB)

    Perez Pavon, J.L.; Garcia Pinto, C.; Moreno Cordero, B. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Guerrero Pena, A. [Universidad de Salamanca, Departamento de Quimica Analitica, Nutricion y Bromatologia, Facultad de Ciencias Quimicas, Salamanca (Spain); Laboratorio de Suelos, Plantas y Aguas, Campus Tabasco, Colegio de Postgraduados, Cardenas, Tabasco (Mexico)

    2008-05-15

    In the present work we report the results obtained with a methodology based on direct coupling of a headspace generator to a mass spectrometer for the identification of different types of petroleum crudes in polluted soils. With no prior treatment, the samples are subjected to the headspace generation process and the volatiles generated are introduced directly into the mass spectrometer, thereby obtaining a fingerprint of volatiles in the sample analysed. The mass spectrum corresponding to the mass/charge ratios (m/z) contains the information related to the composition of the headspace and is used as the analytical signal for the characterization of the samples. The signals obtained for the different samples were treated by chemometric techniques to obtain the desired information. The main advantage of the proposed methodology is that no prior chromatographic separation and no sample manipulation are required. The method is rapid, simple and, in view of the results, highly promising for the implementation of a new approach for oil spill identification in soils. (orig.)

  18. Constructivism: a naturalistic methodology for nursing inquiry.

    Science.gov (United States)

    Appleton, J V; King, L

    1997-12-01

    This article will explore the philosophical underpinnings of the constructivist research paradigm. Despite its increasing popularity in evaluative health research studies there is limited recognition of constructivism in popular research texts. Lincoln and Guba's original approach to constructivist methodology is outlined and a detailed framework for nursing research is offered. Fundamental issues and concerns surrounding this methodology are debated and differences between method and methodology are highlighted.

  19. Analysing Medieval Urban Space; a methodology

    Directory of Open Access Journals (Sweden)

    Marlous L. Craane MA

    2007-08-01

    Full Text Available This article has been written in reaction to recent developments in medieval history and archaeology, to study not only the buildings in a town but also the spaces that hold them together. It discusses a more objective and interdisciplinary approach for analysing urban morphology and use of space. It proposes a 'new' methodology by combining town plan analysis and space syntax. This methodology was trialled on the city of Utrecht in the Netherlands. By comparing the results of this 'new' methodology with the results of previous, more conventional, research, this article shows that space syntax can be applied successfully to medieval urban contexts. It does this by demonstrating a strong correlation between medieval economic spaces and the most integrated spaces, just as is found in the study of modern urban environments. It thus provides a strong basis for the use of this technique in future research of medieval urban environments.

  20. An Annotated Bibliography of the Gestalt Methods, Techniques, and Therapy

    Science.gov (United States)

    Prewitt-Diaz, Joseph O.

    The purpose of this annotated bibliography is to provide the reader with a guide to relevant research in the area of Gestalt therapy, techniques, and methods. The majority of the references are journal articles written within the last 5 years or documents easily obtained through interlibrary loans from local libraries. These references were…

  1. Strategic Management Tools and Techniques Usage: a Qualitative Review

    Directory of Open Access Journals (Sweden)

    Albana Berisha Qehaja

    2017-01-01

    Full Text Available This paper is one of the few studies to review the empirical literature on strategic management tools and techniques usage. There are many techniques, tools and methods, models, frameworks, approaches and methodologies, available to support strategic managers in decision making. They are developed and designed to support managers in all stages of strategic management process to achieve better performance. Management schools provide knowledge of these tools. But their use in organizations should be seen in practice‑based context. Consequently, some questions arise: Do they use these strategic tools and techniques in their workplace? Which strategic tools and techniques are used more in organizations? To answer these questions we have made a review of empirical studies using textual narrative synthesis method. Initially, this study presents a tabulation with a summary of empirical research for the period 1990–2015. The included studies are organized clustering them by enterprise size and sector and by country level development. A synopsis of the ten most used strategic tools and techniques worldwide resulted as follows: SWOT analysis, benchmarking, PEST analysis, “what if” analysis, vision and mission statements, Porter’s five forces analysis, business financial analysis, key success factors analysis, cost‑benefit analysis and customer satisfaction.

  2. The study of insect blood-feeding behaviour: 2. Recording techniques and the use of flow charts

    Directory of Open Access Journals (Sweden)

    J. J. B. Smith

    1987-01-01

    Full Text Available This paper continues a discussion of approaches and methodologies we have used in our studies of feeding in haematophagous insects. Described are techniques for directly monitoring behaviour: electrical recording of feeding behaviour via resistance changes in the food canal, optical methods for monitoring mouthpart activity, and a computer technique for behavioural event recording. Also described is the use of "flow charts" or "decision diagrams" to model interrelated sequences of behaviours.

  3. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data.

    Science.gov (United States)

    Vanegas, Fernando; Bratanov, Dmitry; Powell, Kevin; Weiss, John; Gonzalez, Felipe

    2018-01-17

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a UAV remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used-the sensors, the UAV, and the flight operations-the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications.

  4. A Novel Methodology for Improving Plant Pest Surveillance in Vineyards and Crops Using UAV-Based Hyperspectral and Spatial Data

    Science.gov (United States)

    Vanegas, Fernando; Weiss, John; Gonzalez, Felipe

    2018-01-01

    Recent advances in remote sensed imagery and geospatial image processing using unmanned aerial vehicles (UAVs) have enabled the rapid and ongoing development of monitoring tools for crop management and the detection/surveillance of insect pests. This paper describes a (UAV) remote sensing-based methodology to increase the efficiency of existing surveillance practices (human inspectors and insect traps) for detecting pest infestations (e.g., grape phylloxera in vineyards). The methodology uses a UAV integrated with advanced digital hyperspectral, multispectral, and RGB sensors. We implemented the methodology for the development of a predictive model for phylloxera detection. In this method, we explore the combination of airborne RGB, multispectral, and hyperspectral imagery with ground-based data at two separate time periods and under different levels of phylloxera infestation. We describe the technology used—the sensors, the UAV, and the flight operations—the processing workflow of the datasets from each imagery type, and the methods for combining multiple airborne with ground-based datasets. Finally, we present relevant results of correlation between the different processed datasets. The objective of this research is to develop a novel methodology for collecting, processing, analysing and integrating multispectral, hyperspectral, ground and spatial data to remote sense different variables in different applications, such as, in this case, plant pest surveillance. The development of such methodology would provide researchers, agronomists, and UAV practitioners reliable data collection protocols and methods to achieve faster processing techniques and integrate multiple sources of data in diverse remote sensing applications. PMID:29342101

  5. Methodological issues about techniques for the spiking of standard OECD soil with nanoparticles: evidence of different behaviours

    International Nuclear Information System (INIS)

    Miglietta, Maria Lucia; Rametta, Gabriella; Manzo, Sonia; Salluzzo, Antonio; Rimauro, Juri; Francia, Girolamo Di

    2015-01-01

    The aim of this study is to investigate to what extent the results of standard nanoparticle (NP) toxicity testing methodologies are affected by the different exposure procedures on soil organisms. In this view, differences in the physicochemical properties of ZnO NPs (<100 nm), ZnO bulk (<200 nm) and ionic zinc (ZnCl2) and their ecotoxicological potential toward Lepidium sativum were investigated with respect to three different spiking methods. Results show that the spiking procedures give a homogeneous distribution of the testing nanomaterial in soil, but the physicochemical and ecotoxicological properties of the testing species differ according to the spiking procedure. Dry spiking produced the highest ZnO solubility, whereas spiking through dispersions of ZnO in water and in aqueous soil extracts produced the lowest. At the same time, the ecotoxic effects showed different trends with regard to the spiking route. The need for a definition of agreed methods concerning the NP spiking procedures is, therefore, urgent.

  6. Methodological issues about techniques for the spiking of standard OECD soil with nanoparticles: evidence of different behaviours

    Energy Technology Data Exchange (ETDEWEB)

    Miglietta, Maria Lucia, E-mail: mara.miglietta@enea.it; Rametta, Gabriella; Manzo, Sonia; Salluzzo, Antonio; Rimauro, Juri; Francia, Girolamo Di [ENEA, Portici Technical Unit, C.R. Portici (Italy)

    2015-07-15

    The aim of this study is to investigate to what extent the results of standard nanoparticle (NP) toxicity testing methodologies are affected by the different exposure procedures on soil organisms. In this view, differences in the physicochemical properties of ZnO NPs (<100 nm), ZnO bulk (<200 nm) and ionic zinc (ZnCl{sub 2}) and their ecotoxicological potential toward Lepidium sativum were investigated with respect to three different spiking methods. Results show that the spiking procedures give a homogeneous distribution of the testing nanomaterial in soil, but the physicochemical and ecotoxicological properties of the testing species differ according to the spiking procedure. Dry spiking produced the highest ZnO solubility, whereas spiking through dispersions of ZnO in water and in aqueous soil extracts produced the lowest. At the same time, the ecotoxic effects showed different trends with regard to the spiking route. The need for a definition of agreed methods concerning the NP spiking procedures is, therefore, urgent.

  7. CATEGORICAL IMAGE COMPONENTS IN THE FORMING SYSTEM OF A MARKETING TECHNIQUES MANAGER’S IMAGE CULTURE

    Directory of Open Access Journals (Sweden)

    Anna Borisovna Cherednyakova

    2015-08-01

    Full Text Available Based on an understanding of how the image culture of managers of marketing techniques is formed, as representatives of the social and communication interaction of public structures, the categorical apparatus of image culture was analyzed with an emphasis on the etymology of image as an integral component of image culture. The categorical components of image are presented from the standpoint of image culture as a personal formation and an integral part of the professional activity of the marketing techniques manager: the object-communicative, subject-activity, personality-oriented and value-acmeological categorical components. The aim is to identify and justify the image categorical components as a component of the image culture of the marketing techniques manager. Method and methodology of work: a general scientific research approach reflecting the scientific apparatus of the research. Results: the categorical components of image as a component of the image culture of the marketing techniques manager were defined. Practical implication of the results: the theoretical part of the «Imageology» course, the special course «Image culture of the manager of marketing techniques», and the theoretical and methodological study and formation of image culture.

  8. Improving the sterile sperm identification method for its implementation in the area-wide sterile insect technique program against Ceratitis capitata (Diptera: Tephritidae) in Spain

    International Nuclear Information System (INIS)

    Juan-Blasco, M.; Urbaneja, A.; San Andrés, V.; Sabater-Muñoz, B.; Castañera, P.

    2014-01-01

    The success of sterile males in area-wide sterile insect technique (aw-SIT) programs against Ceratitis capitata (Wiedemann) is currently measured by indirect methods such as the wild:sterile male ratio captured in monitoring traps. In the past decade, molecular techniques have been used to improve these methods. The development of a polymerase chain reaction-restriction fragment-length polymorphism-based method to identify the transfer of sterile sperm to wild females, the target of SIT, was considered a significant step in this direction. This method relies on identification of sperm by detecting the presence of Y chromosomes in spermathecae DNA extract, complemented by identification of the genetic origin of this sperm: Vienna-8 males or wild haplotype. However, the application of this protocol to aw-SIT programs is limited by handling time and personnel cost. The objective of this work was to obtain a high-throughput protocol to facilitate the routine measurement of sterile sperm presence in wild females of a pest population. The polymerase chain reaction-restriction fragment-length polymorphism markers previously developed were validated on Mediterranean fruit fly samples collected from various locations worldwide. A previously published laboratory protocol was modified to allow the analysis of more samples at the same time. Preservation methods and preservation times commonly used for Mediterranean fruit fly female samples were assessed for their influence on the correct molecular detection of sterile sperm. This high-throughput methodology, as well as the results of sample management presented here, provides a robust, efficient, fast and economical sterile sperm identification method ready to be used in all Mediterranean fruit fly SIT programs. (author)

  9. Single Case Method in Psychology: How to Improve as a Possible Methodology in Quantitative Research.

    Science.gov (United States)

    Krause-Kjær, Elisa; Nedergaard, Jensine I

    2015-09-01

    Awareness of including the Single-Case Method (SCM) as a possible methodology in quantitative research in psychology has been argued to be useful, e.g., by Hurtado-Parrado and López-López (IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). Their article introduces a historical and conceptual analysis of SCMs and proposes changing the often prevailing tendency to neglect SCM as an alternative to Null Hypothesis Significance Testing (NHST). This article contributes by casting new light on SCM as an equally important methodology in psychology. The intention of the present article is to elaborate this point of view further by discussing one of the most fundamental requirements, as well as main characteristics, of SCM regarding temporality: namely, that "…performance is assessed continuously over time and under different conditions…" (Hurtado-Parrado and López-López, IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). When defining principles for particular units of analysis, both synchronic (spatial) and diachronic (temporal) elements should be incorporated. In this article misunderstandings of SCM are addressed, and temporality is described in order to propose how SCM could gain broader usability in psychological research. How to implement SCM in psychological methodology is discussed further. One suggested solution is to reconsider the notion of time in psychological research to cover more than a control variable, and in this respect also to include the notion of time as an irreversible unity within life.

  10. Thinning: A Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for an Optical Character Recognition system for the Brahmi script. The image preprocessing techniques covered in this paper include the thinning method. We also analyze the results obtained by the pixel-level processing algorithms.
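    The thinning step mentioned above reduces character strokes to skeletons of roughly one-pixel width before recognition. The abstract does not say which thinning algorithm is used; the sketch below implements the classic two-subiteration Zhang-Suen algorithm as one plausible choice, on a binary image given as a list of 0/1 rows.

```python
def zhang_suen_thin(image):
    """Thin a binary image (list of rows of 0/1) to a roughly one-pixel-wide
    skeleton using the two-subiteration Zhang-Suen algorithm. Foreground
    pixels are 1; the outermost border is assumed to be background."""
    h, w = len(image), len(image[0])
    img = [row[:] for row in image]  # work on a copy

    def neighbours(y, x):
        # P2..P9: the 8 neighbours, clockwise from the pixel directly above.
        return [img[y - 1][x], img[y - 1][x + 1], img[y][x + 1],
                img[y + 1][x + 1], img[y + 1][x], img[y + 1][x - 1],
                img[y][x - 1], img[y - 1][x - 1]]

    changed = True
    while changed:
        changed = False
        for step in (0, 1):
            to_clear = []
            for y in range(1, h - 1):
                for x in range(1, w - 1):
                    if img[y][x] != 1:
                        continue
                    p = neighbours(y, x)
                    b = sum(p)  # number of foreground neighbours
                    # a = number of 0 -> 1 transitions in the circular sequence P2..P9.
                    a = sum(p[k] == 0 and p[(k + 1) % 8] == 1 for k in range(8))
                    if not (2 <= b <= 6 and a == 1):
                        continue
                    if step == 0:
                        ok = p[0] * p[2] * p[4] == 0 and p[2] * p[4] * p[6] == 0
                    else:
                        ok = p[0] * p[2] * p[6] == 0 and p[0] * p[4] * p[6] == 0
                    if ok:
                        to_clear.append((y, x))
            for y, x in to_clear:  # delete only after the full scan
                img[y][x] = 0
                changed = True
    return img
```

    The deferred deletion per subiteration is what preserves stroke connectivity: decisions in one pass are all made against the same snapshot of the image.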

  11. Double contrast barium enema: technique, indications, results and limitations of a conventional imaging methodology in the MDCT virtual endoscopy era.

    Science.gov (United States)

    Rollandi, Gian Andrea; Biscaldi, Ennio; DeCicco, Enzo

    2007-03-01

    The double contrast barium enema of the colon remains a widespread conventional radiological technique and allows for the diagnosis of neoplastic and inflammatory pathology. Since the 1970s a major effort has been undertaken to simplify, perfect, and codify the method of the double contrast barium enema: Altaras in Germany, Miller in the USA, and Cittadini in Italy are responsible for the perfection of this technique over the last 30 years. Tailored patient preparation, a perfect technique of execution, and precise radiological documentation are essential steps to obtain a reliable examination. The main limitation of the double contrast enema is that it assesses pathology only from the mucosal surface. In the evaluation of neoplastic pathology the main limitation concerns staging of the "T" parameter, and evaluation of the "N" and "M" parameters is even more limited. Today the double contrast technique continues to be a refined, sensitive, and specific diagnostic method; however, its diagnostic results cannot compete with the new multislice CT techniques (CT-enteroclysis and virtual colonoscopy), which can examine both the lumen and the wall of the colon. The double contrast enema is a cheap and simple examination, but in the near future a progressive replacement of conventional radiology by the new multislice techniques is to be expected, because cross-sectional imaging is more frequently able to detect the cause of symptoms, whether of colonic or non-colonic origin.

  12. Double contrast barium enema: Technique, indications, results and limitations of a conventional imaging methodology in the MDCT virtual endoscopy era

    International Nuclear Information System (INIS)

    Rollandi, Gian Andrea; Biscaldi, Ennio; DeCicco, Enzo

    2007-01-01

    The double contrast barium enema of the colon remains a widespread conventional radiological technique and allows for the diagnosis of neoplastic and inflammatory pathology. Since the 1970s a major effort has been undertaken to simplify, perfect, and codify the method of the double contrast barium enema: Altaras in Germany, Miller in the USA, and Cittadini in Italy are responsible for the perfection of this technique over the last 30 years. Tailored patient preparation, a perfect technique of execution, and precise radiological documentation are essential steps to obtain a reliable examination. The main limitation of the double contrast enema is that it assesses pathology only from the mucosal surface. In the evaluation of neoplastic pathology the main limitation concerns staging of the 'T' parameter, and evaluation of the 'N' and 'M' parameters is even more limited. Today the double contrast technique continues to be a refined, sensitive, and specific diagnostic method; however, its diagnostic results cannot compete with the new multislice CT techniques (CT-enteroclysis and virtual colonoscopy), which can examine both the lumen and the wall of the colon. The double contrast enema is a cheap and simple examination, but in the near future a progressive replacement of conventional radiology by the new multislice techniques is to be expected, because cross-sectional imaging is more frequently able to detect the cause of symptoms, whether of colonic or non-colonic origin.

  13. Changing methodologies in TESOL

    CERN Document Server

    Spiro, Jane

    2013-01-01

    Covering core topics from vocabulary and grammar to teaching writing, speaking and listening, this textbook shows you how to link research to practice in TESOL methodology. It emphasises how current understandings have impacted on the language classroom worldwide and investigates the meaning of 'methods' and 'methodology' and the importance of these for the teacher, as well as the underlying assumptions and beliefs teachers bring to bear in their practice. By introducing you to language teaching approaches, you will explore the way these are influenced by developments in our understanding of l

  14. A new technique in the excavation of ground-nest bee burrows (Hymenoptera: Apoidea)

    Directory of Open Access Journals (Sweden)

    Diego Marinho

    2018-01-01

    Full Text Available Bees have a diversified natural history, and thus the methods applied to study such diversity are varied. In studies of nesting biology, bees that nest in pre-existing cavities have been reasonably well studied since researchers started using trap-nests. However, bees whose nests are built underground are poorly studied, owing to the difficulty of finding natural nesting areas and the absence of a method that facilitates nest excavation. The latter is evidenced by the lack of accurate descriptions in the literature of how nests are excavated. In this study we tested cylindrical rubber refills from eraser pens as a new material to be used as a tracer of underground nest galleries in a natural nesting area of two species of Epicharis Klug, 1807 (Apidae). We compared this technique directly with powdered plaster mixed with water, and compared our results with other methodological studies describing alternative methods and materials. The rubber refill technique overcame the main issues presented by materials such as plaster, molten metal alloys, and bioplastic, namely death of the organisms by high temperatures and/or formation of plugs, and materials unduly following roots inside the galleries. Keywords: Apidae, Apoidea, Brood cells, Methodology, Solitary bees

  15. Building block method: a bottom-up modular synthesis methodology for distributed compliant mechanisms

    Directory of Open Access Journals (Sweden)

    G. Krishnan

    2012-03-01

    Full Text Available Existing approaches to synthesizing topologies of compliant mechanisms are based on rigid-link kinematic designs or completely automated optimization techniques. These designs yield mechanisms that match the kinematic specifications as a whole, but seldom give the user insight into how each constituent member contributes to the overall mechanism performance. This paper reviews recent developments in building-block-based design of compliant mechanisms. A key aspect of such a methodology is formulating a representation of compliance (i) at a single unique point of interest, in terms of geometric quantities such as ellipses and vectors, and (ii) between distinct input(s) and output(s), in terms of load flow. This geometric representation provides a direct mapping between mechanism geometry and behavior, and is used to characterize simple deformable members that form a library of building blocks. The design space spanned by the building-block library guides the decomposition of a given problem specification into tractable sub-problems, each of which can be solved from an entry in the library. This geometric representation aids user insight in design and enables the discovery of trends and guidelines for obtaining practical conceptual designs.
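    The compliance ellipse described above can be computed in closed form for a planar point of interest. The sketch below assumes a symmetric 2x2 compliance matrix [[a, b], [b, c]] mapping force to displacement (u = C f) and recovers the ellipse's principal compliances and orientation from the matrix's eigendecomposition; the interface is illustrative, not the paper's actual formulation.

```python
import math

def compliance_ellipse(a, b, c):
    """Principal compliances and orientation of the compliance ellipse of a
    symmetric 2x2 compliance matrix [[a, b], [b, c]]: unit forces applied in
    all directions produce displacements lying on an ellipse whose semi-axes
    are the eigenvalues of the matrix."""
    mean = 0.5 * (a + c)
    half_diff = 0.5 * (a - c)
    radius = math.hypot(half_diff, b)      # Mohr's-circle radius
    c_max = mean + radius                  # largest compliance (most flexible direction)
    c_min = mean - radius                  # smallest compliance (stiffest direction)
    theta = 0.5 * math.atan2(2.0 * b, a - c)  # orientation of the major axis
    return c_max, c_min, theta
```

    A strongly eccentric ellipse flags a member that is much more compliant in one direction than the other, which is exactly the geometric insight the building-block library exploits.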

  16. Slot technique - an alternative method of scatter reduction in radiography

    International Nuclear Information System (INIS)

    Panzer, W.; Widenmann, L.

    1983-01-01

    The most common method of scatter reduction in radiography is the use of an antiscatter grid. Its disadvantages are the absorption of a certain percentage of the primary radiation in the lead strips of the grid, and the fact that, owing to the limited thickness of the lead strips, their scatter absorption is also limited. A way of avoiding these disadvantages is offered by the so-called slot technique, i.e., successive exposure of the subject with a narrow fan beam provided by slots in rather thick lead plates. The results of a comparison between the grid and slot techniques with regard to patient dose, scatter reduction, image quality, and the effect of automatic exposure control are reported. (author)
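    The trade-off discussed above can be made concrete with the standard contrast improvement factor: scatter dilutes low-contrast detail by 1/(1 + SPR), and a device transmitting a fraction Tp of the primary beam and Ts of the scatter changes the scatter-to-primary ratio seen by the detector. The numeric values in the test below are illustrative assumptions, not measurements from this study.

```python
def contrast_improvement(spr, t_primary, t_scatter):
    """Contrast improvement factor of a scatter-rejection device.

    spr is the scatter-to-primary ratio behind the patient; the device
    transmits fractions t_primary and t_scatter of primary and scattered
    radiation. Returned is the ratio of image contrast with the device
    to contrast without it: (1 + SPR) / (1 + SPR * Ts / Tp)."""
    spr_at_detector = spr * t_scatter / t_primary
    return (1.0 + spr) / (1.0 + spr_at_detector)
```

    With an assumed SPR of 4, a grid transmitting 70% of primary and 10% of scatter improves contrast less than a hypothetical slot with negligible primary loss and stronger scatter rejection, which is the advantage the abstract attributes to the slot technique.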

  17. Probabilistic method/techniques of evaluation/modeling that permits to optimize/reduce the necessary resources

    International Nuclear Information System (INIS)

    Florescu, G.; Apostol, M.; Farcasiu, M.; Luminita Bedreaga, M.; Nitoi, M.; Turcu, I.

    2004-01-01

    The fault tree/event tree modeling approach is widely used for modeling and simulating the behavior of nuclear structures, systems and components (NSSCs) under different operating conditions. Probabilistic evaluation of NSSC reliability, availability, risk or safety during operation is also in wide use. Advances in computing have opened new possibilities for designing, processing and using large NSSC models. There are situations where large, complex and correct NSSC models must yield rapid results/solutions/decisions, or must support multiple processing runs to obtain specific results. Large fault/event trees are difficult to develop, review and process, and during NSSC operation time in particular is an important factor in decision making. The paper presents a probabilistic method for the evaluation/modeling of NSSCs that aims to solve the above problems by adopting appropriate techniques. The method is stated for special applications and is based on specific PSA analysis steps, information, algorithms, criteria and relations, in correspondence with fault tree/event tree modeling and similar techniques, in order to obtain appropriate results for NSSC model analysis. A special classification of NSSCs is introduced to reflect aspects of the method's use, and common reliability databases form part of the information needed to complete the analysis; special data and information bases also contribute to the proposed method/techniques. The paper also presents the specific steps of the method, its applicability, its main advantages and the problems still to be studied. The method permits optimization/reduction of the resources needed to perform PSA activities. (author)
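    In its simplest form, fault tree processing reduces to propagating basic-event probabilities through AND/OR gates. The sketch below evaluates such a tree under the assumption of independent basic events that each appear only once in the tree (a real PSA code would instead work from minimal cut sets); the event names are hypothetical.

```python
def top_event_probability(node, basic):
    """Evaluate a small fault tree given as nested tuples.

    A node is either a basic-event name (a string, looked up in `basic`)
    or a tuple ("AND" | "OR", [children]). Assuming all basic events are
    independent and non-repeated, an AND gate is the product of its
    children's probabilities and an OR gate is 1 - prod(1 - p)."""
    if isinstance(node, str):
        return basic[node]
    gate, children = node
    probs = [top_event_probability(child, basic) for child in children]
    if gate == "AND":
        result = 1.0
        for p in probs:
            result *= p
        return result
    if gate == "OR":
        miss = 1.0
        for p in probs:
            miss *= (1.0 - p)
        return 1.0 - miss
    raise ValueError("unknown gate type: %r" % gate)
```

    For example, a top event defined as (A AND B) OR C with P(A)=0.1, P(B)=0.2, P(C)=0.01 evaluates to 1 - (1 - 0.02)(1 - 0.01) = 0.0298.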

  18. A dose to curie conversion methodology

    International Nuclear Information System (INIS)

    Stowe, P.A.

    1987-01-01

    Development of the computer code RadCAT (Radioactive waste Classification And Tracking) has led to a simple dose rate to curie content conversion methodology for containers with internally distributed radioactive material. It was determined early on that, if possible, the computerized dose rate to curie evaluation model employed in RadCAT should yield the same results as the hand method utilized and specified in plant procedures. A review of current industry practices indicated that two distinct types of computational methodology are presently in use. The most common methods are computer-based calculations utilizing complex mathematical models specifically established for various container geometries. This type of evaluation is tedious, however, and does not lend itself to repetition by hand. The second method of evaluation is simplified expressions that sacrifice accuracy for ease of computation and generally overestimate container curie content. To meet the aforementioned criterion, current computer-based models were deemed unacceptably complex and hand computational methods too inaccurate for serious consideration. The contact dose rate/curie content analysis methodology presented herein provides an equation that is easy to use in hand calculations yet offers accuracy equivalent to other computer-based computations.
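    The simplest hand relation behind any dose-to-curie conversion is the point-source approximation D = Γ·A/d², inverted for activity. The sketch below is only that textbook approximation, not the RadCAT equation, which additionally accounts for sources distributed within a container; the gamma constant used in the test is an illustrative round value.

```python
def activity_from_dose_rate(dose_rate, distance, gamma_constant):
    """Invert the unshielded point-source relation D = Gamma * A / d**2
    to estimate activity from a measured dose rate.

    Units must be consistent: e.g. dose_rate in R/h, distance in m, and
    gamma_constant in R*m^2/(h*Ci) give the activity in Ci."""
    return dose_rate * distance ** 2 / gamma_constant
```

    With an illustrative gamma constant of 0.33 R·m²/(h·Ci), a reading of 0.033 R/h at 1 m corresponds to roughly 0.1 Ci; a container with distributed activity and self-shielding requires the more elaborate models the abstract discusses.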

  19. Application of the PISC results and methodology to assess the effectiveness of NDT techniques applied on non nuclear components

    International Nuclear Information System (INIS)

    Maciga, G.; Papponetti, M.; Crutzen, S.; Jehenson, P.

    1990-01-01

    Performance demonstration for NDT has been an active topic for several years. Interest came to the fore in the early 1980s, when several institutions began to propose the use of realistic training assemblies and the formal approach of validation centers. These steps were justified, for example, by the results of the PISC exercises, which concluded that there was a need for performance demonstration starting with capability assessment of techniques and procedures as they were routinely applied. Although the PISC programme falls under a general 'nuclear motivation', the PISC methodology could be extended to structural components in general, such as those in conventional power plants and in the chemical, aerospace and offshore industries, where integrity and safety are regarded as being of great importance. Some themes of NDT inspection of fossil power plant and offshore components that could be the object of validation studies are illustrated. (author)

  20. Shielding methods development in the United States

    International Nuclear Information System (INIS)

    Mynatt, F.R.

    1977-01-01

    A generalized shielding methodology has been developed in the U.S.A. that is adaptable to the shielding analyses of all reactor types. Thus far used primarily for liquid-metal fast breeder reactors, the methodology includes several component activities: (1) developing methods for calculating radiation transport through reactor-shield systems; (2) processing cross-section libraries; (3) performing design calculations for specific systems; (4) performing and analyzing pertinent integral experiments; (5) performing sensitivity studies on both the design calculations and the experimental analyses; and, finally, (6) calculating shield design parameters and their uncertainties. The criteria for the methodology are 5 to 10 percent accuracy for responses at locations near the core and a factor of 2 accuracy for responses at distant locations. The methodology has been successfully adapted to most in-vessel and ex-vessel problems encountered in the shield analyses of the Fast Flux Test Facility and the Clinch River Breeder Reactor; however, improved techniques are needed for calculating regions in which radiation streaming is dominant. Areas of the methodology in which significant progress has recently been made are those involving the development of cross-section libraries, sensitivity analysis methods, and transport codes.
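    The transport codes mentioned above solve the full problem, but the zeroth-order hand estimate behind any slab-shield analysis is exponential attenuation corrected by a buildup factor. The sketch below implements that point-kernel relation; the attenuation coefficient and buildup value in the test are illustrative assumptions, not data from the methodology described.

```python
import math

def shielded_dose_rate(unshielded, mu, thickness, buildup=1.0):
    """Point-kernel estimate of the dose rate behind a slab shield:
    D = B * D0 * exp(-mu * t), where mu is the linear attenuation
    coefficient, t the slab thickness, and B >= 1 a buildup factor
    correcting the narrow-beam result for in-scattered photons."""
    return buildup * unshielded * math.exp(-mu * thickness)
```

    The buildup factor is precisely the kind of scattered-radiation correction that fails in streaming-dominated regions, which is why the abstract singles those regions out as needing improved techniques.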