WorldWideScience

Sample records for methodologies methods techniques

  1. Artificial Intelligence Techniques and Methodology

    OpenAIRE

    Carbonell, Jaime G.; Sleeman, Derek

    1982-01-01

    Two closely related aspects of artificial intelligence that have received comparatively little attention in the recent literature are research methodology, and the analysis of computational techniques that span multiple application areas. We believe both issues to be increasingly significant as Artificial Intelligence matures into a science and spins off major application efforts. It is imperative to analyze the repertoire of AI methods with respect to past experience, utility in new domains,...

  2. METHODOLOGY OF TECHNIQUE PREPARATION FOR LOW VISION JAVELIN THROWERS

    Directory of Open Access Journals (Sweden)

    Milan Matić

    2013-07-01

    Javelin throwing for athletes with disabilities has been expanding for several years, and world records keep improving year after year. The essential part of preparing javelin throwers with low vision is mastering the elements of technique that are crucial for achieving better results. Methods of theoretical analysis and descriptive and comparative survey methods were applied. Relevant knowledge about javelin throwers with low vision was analyzed and systematized, then interpreted theoretically and applied to a top javelin thrower, which served as a basis for an innovative approach to methodology and practice with disabled athletes. Because of the visual impairment, coordination and balance are challenged, and this limitation is what drives the methodological differences explained in this article. Apart from goals focused on improving condition and competition results, more specialized goals should be considered, e.g. improving orientation, balance and the socialization process for people with low vision. The special approach used in technique preparation brought a significant improvement in the technique of our famous Paralympian Grlica Miloš; in addition to the improved technique, he achieved better results at major competitions and won several valuable international prizes. Sport for people with disabilities is still insufficiently present in the practice of sports professionals, and more articles and scientific surveys on this topic are needed to further improve work and results with these athletes.

  3. Mixed-Methods Research Methodologies

    Science.gov (United States)

    Terrell, Steven R.

    2012-01-01

    Mixed-Method studies have emerged from the paradigm wars between qualitative and quantitative research approaches to become a widely used mode of inquiry. Depending on choices made across four dimensions, mixed-methods can provide an investigator with many design choices which involve a range of sequential and concurrent strategies. Defining…

  4. Methodology for developing new test methods

    Directory of Open Access Journals (Sweden)

    A. I. Korobko

    2017-06-01

    The paper describes a methodology for developing new test methods and for forming decisions during their development. The methodology rests on individual elements of the systems and process approaches, which support an effective research strategy for the object under test, the study of interrelations, and the synthesis of an adequate model of the test method. The effectiveness of the developed test method is determined by the correct choice of the set of concepts, their interrelations and their mutual influence; this is what allows the assigned tasks to be solved and the goal to be achieved. The methodology is based on the use of fuzzy cognitive maps, and the choice of the method underlying the decision-forming model is considered. The methodology records a model of a new test method as a finite set of objects, namely the characteristics that are significant for the test method. A causal relationship is then established between the objects, after which the values of the fitness indicators, the observability of the method, and the metrological tolerance for each indicator are set. The work is aimed at the overall goal of ensuring test quality by improving the methodology for developing test methods.
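
    A minimal Python sketch of the fuzzy cognitive map idea the abstract relies on; the concept names, causal weights and update rule are illustrative assumptions, not values from the paper. Concept activations are propagated through a signed weight matrix until the map settles, which is how such a model can be queried when forming decisions about a new test method.

      import numpy as np

      # Illustrative concepts for a test-method model (assumed, not from the paper).
      concepts = ["test fitness", "observability", "metrological tolerance", "test quality"]

      # W[i, j] = causal influence of concept i on concept j, in [-1, 1].
      W = np.array([
          [0.0, 0.3, 0.0, 0.6],
          [0.0, 0.0, 0.2, 0.5],
          [0.0, 0.0, 0.0, 0.4],
          [0.0, 0.0, 0.0, 0.0],
      ])

      def sigmoid(x, lam=1.0):
          return 1.0 / (1.0 + np.exp(-lam * x))

      # Iterate activations until the map reaches a steady state.
      a = np.array([0.8, 0.5, 0.4, 0.0])        # initial activation of each concept
      for _ in range(50):
          a_next = sigmoid(a @ W + a)           # keep each concept's own memory term
          if np.allclose(a_next, a, atol=1e-6):
              break
          a = a_next

      print(dict(zip(concepts, a.round(3))))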

  5. TECHNIQUE AND METHODOLOGY OF TRAINING IN SWIMMING CRAWL

    Directory of Open Access Journals (Sweden)

    Selim Alili

    2013-07-01

    The paper presents the technique and the training methodology for the crawl swimming stroke. It covers the position of the head and body, leg work, arm movements, exercises for training the leg kick, training drills, and exercises for improving coordination of the technique on dry land and in the water. A swimmer who masters this technique can swim fast, and the crawl is the fastest swimming discipline; it is therefore a favourite way of swimming and a pleasure to watch on the big stage.

  6. Information System Design Methodology Based on PERT/CPM Networking and Optimization Techniques.

    Science.gov (United States)

    Bose, Anindya

    The dissertation attempts to demonstrate that the program evaluation and review technique (PERT)/Critical Path Method (CPM) or some modified version thereof can be developed into an information system design methodology. The methodology utilizes PERT/CPM which isolates the basic functional units of a system and sets them in a dynamic time/cost…
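
    Since the record only sketches the idea, here is a minimal critical path method (CPM) pass in Python over a made-up four-activity network (the activity names, durations and dependencies are illustrative assumptions, not from the dissertation). The forward/backward passes isolate the activities with zero float, i.e. the critical path that such a design methodology would schedule around.

      # Hypothetical activity network: durations and predecessor lists.
      durations = {"A": 3, "B": 2, "C": 4, "D": 2}
      preds = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}
      order = ["A", "B", "C", "D"]              # a topological order of the activities

      # Forward pass: earliest start / earliest finish.
      ES, EF = {}, {}
      for act in order:
          ES[act] = max((EF[p] for p in preds[act]), default=0)
          EF[act] = ES[act] + durations[act]

      # Backward pass: latest start / latest finish.
      project_end = max(EF.values())
      LS, LF = {}, {}
      for act in reversed(order):
          succs = [s for s in order if act in preds[s]]
          LF[act] = min((LS[s] for s in succs), default=project_end)
          LS[act] = LF[act] - durations[act]

      critical = [act for act in order if ES[act] == LS[act]]
      print("project duration:", project_end, "critical path:", critical)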

  7. Quantitative assessments of distributed systems methodologies and techniques

    CERN Document Server

    Bruneo, Dario

    2015-01-01

    Distributed systems employed in critical infrastructures must fulfill dependability, timeliness, and performance specifications. Since these systems most often operate in an unpredictable environment, their design and maintenance require quantitative evaluation of deterministic and probabilistic timed models. This need gave birth to an abundant literature devoted to formal modeling languages combined with analytical and simulative solution techniques. The aim of the book is to provide an overview of techniques and methodologies dealing with such specific issues in the context of distributed

  8. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    Energy Technology Data Exchange (ETDEWEB)

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  9. Carving TechniqueMethodical Perspectives

    Directory of Open Access Journals (Sweden)

    Adela BADAU

    2015-09-01

    Alpine skiing has undergone major changes and adjustments due both to technological innovations in materials and to updates of theoretical and methodological concepts at all levels of specific training. Purpose: the introduction of technological innovation in materials specific to carving skis calls for a review of methodology, aiming at bringing the execution technique to superior indices in order to obtain positive results. The study took place in Poiana Brasov between December 2014 and March 2015, on an 800 m long slope, and comprised a single experimental group of four males and four females in the cadet category who carried out two lessons per day. The tests targeted the technique level for slalom and giant slalom skiing according to four criteria: leg work, pelvis movement, torso position and arm work. As a result of the research and of the statistical-mathematical analysis of the individual values, the giant slalom registered an average improvement of 3.5 points between tests, while the slalom registered 4 points. In conclusion, the use of a specific, scientifically applied methodology, which aims to select the most efficient means of action specific to children's skiing, determines technical improvement at an advanced level.

  10. Adaptability of laser diffraction measurement technique in soil physics methodology

    Science.gov (United States)

    Barna, Gyöngyi; Szabó, József; Rajkai, Kálmán; Bakacsi, Zsófia; Koós, Sándor; László, Péter; Hauk, Gabriella; Makó, András

    2016-04-01

    There are intentions around the world to harmonize soil particle size distribution (PSD) data obtained by laser diffraction measurements (LDM) with those of the sedimentation techniques (pipette or hydrometer methods). Unfortunately, depending on the applied methodology (e.g. type of pre-treatment, kind of dispersant, etc.), PSDs from the sedimentation methods (which follow different standards) are themselves dissimilar and can hardly be harmonized with each other. A need therefore arose to build a database containing PSD values measured by the pipette method according to the Hungarian standard (MSZ-08.0205:1978) and by LDM according to a widespread and widely used procedure. In this publication the first results of the statistical analysis of the new and growing PSD database are presented: 204 soil samples measured with the pipette method and LDM (Malvern Mastersizer 2000, HydroG dispersion unit) were compared. Applying the usual size limits to the LDM data, the clay fraction was strongly underestimated and the silt fraction overestimated compared to the pipette method; consequently, the soil texture classes determined from the LDM measurements differ significantly from the pipette results. Following previous surveys, and in order to optimize the relation between the two datasets, the clay/silt boundary of the LDM data was changed. Comparing the PSDs of the pipette method to those of LDM, the modified size limits gave higher similarities for the clay and silt fractions: extending the upper size limit of the clay fraction from 0.002 to 0.0066 mm, and correspondingly shifting the lower size limit of the silt fraction, makes the pipette method and LDM more readily comparable. With the modified limit, higher correlations were also found between clay content and water vapor adsorption and specific surface area, and the texture classes were less dissimilar. The difference between the results of the two kinds of PSD measurement methods could be further reduced by knowing other routinely analyzed soil parameters
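
    To make the boundary shift concrete, the following Python sketch recomputes clay/silt/sand fractions from a laser-diffraction cumulative PSD curve, once with the conventional 0.002 mm clay limit and once with the modified 0.0066 mm limit discussed above. The example curve and the 0.05 mm silt/sand boundary are assumptions for illustration, not data from the study.

      import numpy as np

      # Cumulative percent finer (by volume) at selected particle sizes (invented curve).
      sizes_mm  = np.array([0.0005, 0.001, 0.002, 0.005, 0.0066, 0.02, 0.05, 0.063, 0.2, 2.0])
      cum_finer = np.array([8, 12, 17, 26, 30, 45, 62, 67, 85, 100])

      def fractions(clay_limit_mm, silt_limit_mm=0.05):
          """Return (clay, silt, sand) percentages for the given size limits."""
          clay = np.interp(clay_limit_mm, sizes_mm, cum_finer)
          silt = np.interp(silt_limit_mm, sizes_mm, cum_finer) - clay
          sand = 100.0 - clay - silt
          return clay, silt, sand

      print("0.002 mm clay limit :", fractions(0.002))
      print("0.0066 mm clay limit:", fractions(0.0066))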

  11. Methodology for attainment of density and effective atomic number through dual energy technique using microtomographic images

    International Nuclear Information System (INIS)

    Alves, H.; Lima, I.; Lopes, R.T.

    2014-01-01

    The dual energy technique for computerized microtomography is a promising method for identifying the mineralogy of geological samples of heterogeneous composition. It can also assist in differentiating objects that are very similar in attenuation coefficient, which are usually not separable during the image processing and analysis of microtomographic data. Therefore, the development of a feasible and applicable dual energy methodology for the analysis of microtomographic images was sought. - Highlights: • Dual energy technique is promising for identification of distribution of minerals. • A feasible methodology of dual energy in analysis of tomographic images was sought. • The dual energy technique is efficient for density and atomic number identification. • Simulation showed that the proposed methodology agrees with theoretical data. • Nondestructive characterization of distribution of density and chemical composition
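
    The record does not give the decomposition equations, so the Python sketch below shows one common way such a dual-energy inversion can work, under an assumed two-term attenuation model with invented calibration constants: the ratio of the low- and high-energy attenuation coefficients fixes the effective atomic number, after which the density follows. It is a sketch under those assumptions, not the authors' implementation.

      import numpy as np

      # Assumed model: mu(E) = rho * (a(E) + b(E) * Zeff**3), a simplified
      # Compton + photoelectric parameterisation with hypothetical constants.
      a = np.array([0.18, 0.15])       # Compton-like term at (low, high) energy, cm^2/g
      b = np.array([2.0e-4, 4.0e-5])   # photoelectric-like term at (low, high) energy, cm^2/g

      def solve_voxel(mu_low, mu_high):
          """Return (rho, Zeff) from two measured attenuation coefficients."""
          ratio = mu_low / mu_high                       # density cancels in the ratio
          z3 = (a[0] - ratio * a[1]) / (ratio * b[1] - b[0])
          zeff = z3 ** (1.0 / 3.0)
          rho = mu_low / (a[0] + b[0] * z3)
          return rho, zeff

      print(solve_voxel(mu_low=0.40, mu_high=0.31))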

  12. A methodology for small scale rural land use mapping in semi-arid developing countries using orbital imagery. Part 6: A low-cost method for land use mapping using simple visual techniques of interpretation. [Spain]

    Science.gov (United States)

    Vangenderen, J. L. (Principal Investigator); Lock, B. F.

    1976-01-01

    The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.

  13. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique

    Directory of Open Access Journals (Sweden)

    Diana Guzys

    2015-05-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  14. Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.

    Science.gov (United States)

    Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever

    2015-01-01

    In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.

  15. Methodologies and Methods for User Behavioral Research.

    Science.gov (United States)

    Wang, Peiling

    1999-01-01

    Discusses methodological issues in empirical studies of information-related behavior in six specific research areas: information needs and uses; information seeking; relevance judgment; online searching (including online public access catalog, online database, and the Web); human-system interactions; and reference transactions. (Contains 191…

  16. Methodology for using root locus technique for mobile robots path planning

    Directory of Open Access Journals (Sweden)

    Mario Ricardo Arbulú Saavedra

    2015-11-01

    This paper presents the analysis and implementation methodology of the dynamic-system root locus technique used in obstacle-free path planning for mobile robots. First, the analysis and identification of the morphological behavior of the paths as a function of root location in the complex plane are performed, identifying the path types and their attraction and repulsion features in the presence of other roots, similar to those obtained with artificial potential fields. An implementation methodology for this mobile robot path planning technique is then proposed, starting from three different methods of locating roots for the obstacles in the scene; these methods differ in the obstacle key points selected as roots, such as borders, crossing points with the original path, center, and vertices. Finally, a behavior analysis of the general technique and of the effectiveness of each method is performed, with 20 tests for each one, obtaining a value of 65% for the selected method. Modifications and possible improvements to this methodology are also proposed.

  17. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  18. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Directory of Open Access Journals (Sweden)

    Muhammad Nurul Zhafirah

    2017-01-01

    Increased demand for internet of things (IoT) applications has inadvertently forced a move towards higher-complexity integrated circuits supporting SoC designs. Such an increase in complexity poses complicated validation challenges and has led researchers to propose various methodologies to overcome the problem, notably dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the verification process of an SoC in order to reduce time consumption and achieve a fast time to market. In this paper we therefore focus on verification methodology that can be applied at the register transfer level (RTL) of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier path to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.

  19. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    Science.gov (United States)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for internet of things (IoT) applications has inadvertently forced a move towards higher-complexity integrated circuits supporting SoC designs. Such an increase in complexity poses complicated validation challenges and has led researchers to propose various methodologies to overcome the problem, notably dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the verification process of an SoC in order to reduce time consumption and achieve a fast time to market. In this paper we therefore focus on verification methodology that can be applied at the register transfer level (RTL) of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier path to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market. Thus, OVM is proposed in this paper as the verification method for larger designs, to avert bottlenecks in the validation platform.

  20. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, which highlights the advantages of dynamic analytical techniques over static methods. The surface free energy of a material illustrates how contact angle analysis is capable of providing supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementary methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.

  1. New Teaching Techniques to Improve Critical Thinking. The Diaprove Methodology

    Science.gov (United States)

    Saiz, Carlos; Rivas, Silvia F.

    2016-01-01

    The objective of this research is to ascertain whether new instructional techniques can improve critical thinking. To achieve this goal, two different instruction techniques (ARDESOS--group 1--and DIAPROVE--group 2--) were studied and a pre-post assessment of critical thinking in various dimensions such as argumentation, inductive reasoning,…

  2. Simplified dose calculation method for mantle technique

    International Nuclear Information System (INIS)

    Scaff, L.A.M.

    1984-01-01

    A simplified dose calculation method for the mantle technique is described. In the routine treatment of lymphomas using this technique, the daily doses at the midpoints of five anatomical regions are different because the thicknesses are not equal. (Author) [pt

  3. Methodological integrative review of the work sampling technique used in nursing workload research.

    Science.gov (United States)

    Blay, Nicole; Duffield, Christine M; Gallagher, Robyn; Roche, Michael

    2014-11-01

    To critically review the work sampling technique used in nursing workload research. Work sampling is a technique frequently used by researchers and managers to explore and measure nursing activities; however, the work sampling methods used are diverse, making comparisons of results between studies difficult. Methodological integrative review. Four electronic databases were systematically searched for peer-reviewed articles published between 2002-2012. Manual scanning of reference lists and Rich Site Summary feeds from contemporary nursing journals were other sources of data. Articles published in the English language between 2002-2012 reporting on research which used work sampling to examine nursing workload were included. Eighteen articles were reviewed. The review identified that the work sampling technique lacks a standardized approach, which may have an impact on the sharing or comparison of results. Specific areas needing a shared understanding included the training of observers and subjects who self-report, standardization of the techniques used to assess observer inter-rater reliability, sampling methods and reporting of outcomes. Work sampling is a technique that can be used to explore the many facets of nursing work. Standardized reporting measures would enable greater comparison between studies and contribute to knowledge more effectively. The authors' suggestions for the reporting of results may act as guidelines for researchers considering work sampling as a research method. © 2014 John Wiley & Sons Ltd.
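
    As a concrete illustration of what a standardized report of work-sampling outcomes could contain, the short Python sketch below estimates activity proportions and normal-approximation confidence intervals from invented observation counts; the categories and numbers are assumptions, not data from the reviewed studies.

      import numpy as np

      # Hypothetical tallies of randomly timed observations by activity category.
      observations = {"direct care": 410, "documentation": 270, "medication": 120, "other": 200}

      n = sum(observations.values())
      z = 1.96                                     # 95% confidence level
      for activity, count in observations.items():
          p = count / n                            # estimated proportion of time
          half_width = z * np.sqrt(p * (1 - p) / n)
          print(f"{activity:14s} {p:6.1%} +/- {half_width:5.1%}")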

  4. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV). Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining 1.31% mean absolute CPA error, with 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
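
    A much-simplified Python sketch of the clustering stages described above (using scikit-learn): the stain-colour heuristics and the random stand-in image are assumptions, and the classification stage that excludes non-liver regions in the published methodology is omitted here.

      import numpy as np
      from sklearn.cluster import KMeans

      def cpa_from_image(rgb):
          """Rough CPA sketch: rgb is an (H, W, 3) float array of a stained section."""
          pixels = rgb.reshape(-1, 3)

          # Stage 1: two clusters separate the bright background from tissue.
          bg_tissue = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
          bright = np.argmax(bg_tissue.cluster_centers_.mean(axis=1))
          tissue_mask = bg_tissue.labels_ != bright

          # Stage 3: within tissue, two clusters separate collagen from parenchyma
          # (collagen assumed to be the bluer cluster for a trichrome-like stain).
          tissue_px = pixels[tissue_mask]
          fib = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue_px)
          blue_score = fib.cluster_centers_[:, 2] - fib.cluster_centers_[:, 0]
          collagen_label = np.argmax(blue_score)

          return 100.0 * np.mean(fib.labels_ == collagen_label)   # % of tissue that is collagen

      # Example call with a random stand-in image (a real call would pass a biopsy scan).
      print(cpa_from_image(np.random.rand(64, 64, 3)))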

  5. Applications of mixed-methods methodology in clinical pharmacy research.

    Science.gov (United States)

    Hadi, Muhammad Abdul; Closs, S José

    2016-06-01

    Introduction: Mixed-methods methodology, as the name suggests, refers to the mixing of elements of both qualitative and quantitative methodologies in a single study. In the past decade, mixed-methods methodology has gained popularity among healthcare researchers as it promises to bring together the strengths of both qualitative and quantitative approaches. Methodology: A number of mixed-methods designs are available in the literature, and the four most commonly used designs in healthcare research are the convergent parallel design, the embedded design, the exploratory design, and the explanatory design. Each has its own unique advantages, challenges and procedures, and selection of a particular design should be guided by the research question. Guidance on designing, conducting and reporting mixed-methods research is available in the literature, so it is advisable to adhere to this to ensure methodological rigour. When to use: It is best suited when the research questions require triangulating findings from different methodologies to explain a single phenomenon; clarifying the results of one method using another method; informing the design of one method based on the findings of another method; developing a scale/questionnaire; or answering different research questions within a single study. Two case studies have been presented to illustrate possible applications of mixed-methods methodology. Limitations: Possessing the necessary knowledge and skills to undertake qualitative and quantitative data collection, analysis, interpretation and integration remains the biggest challenge for researchers conducting mixed-methods studies. Sequential study designs are often time consuming, being in two (or more) phases, whereas concurrent study designs may require more than one data collector to collect both qualitative and quantitative data at the same time.

  6. A dynamic systems engineering methodology research study. Phase 2: Evaluating methodologies, tools, and techniques for applicability to NASA's systems projects

    Science.gov (United States)

    Paul, Arthur S.; Gill, Tepper L.; Maclin, Arlene P.

    1989-01-01

    A study of NASA's Systems Management Policy (SMP) concluded that the primary methodology being used by the Mission Operations and Data Systems Directorate and its subordinate, the Networks Division, is very effective. Still, some unmet needs were identified. This study involved evaluating methodologies, tools, and techniques with the potential for resolving the previously identified deficiencies. Six preselected methodologies being used by other organizations with similar development problems were studied. The study revealed a wide range of significant differences in structure. Each system had some strengths, but none would satisfy all of the needs of the Networks Division. Areas for improvement of the methodology being used by the Networks Division are listed with recommendations for specific action.

  7. A comparison of results of empirical studies of supplementary search techniques and recommendations in review methodology handbooks: a methodological review.

    Science.gov (United States)

    Cooper, Chris; Booth, Andrew; Britten, Nicky; Garside, Ruth

    2017-11-28

    The purpose and contribution of supplementary search methods in systematic reviews is increasingly acknowledged. Numerous studies have demonstrated their potential in identifying studies or study data that would have been missed by bibliographic database searching alone. What is less certain is how supplementary search methods actually work, how they are applied, and the consequent advantages, disadvantages and resource implications of each search method. The aim of this study is to compare current practice in using supplementary search methods with methodological guidance. Four methodological handbooks informing systematic review practice in the UK were read and audited to establish current methodological guidance. Studies evaluating the use of supplementary search methods were identified by searching five bibliographic databases. Studies were included if they (1) reported practical application of a supplementary search method (descriptive), (2) examined the utility of a supplementary search method (analytical), or (3) identified or explored factors that impact on the utility of a supplementary method when applied in practice. Thirty-five studies were included in this review in addition to the four methodological handbooks. Studies were published between 1989 and 2016, and dates of publication of the handbooks ranged from 1994 to 2014. Five supplementary search methods were reviewed: contacting study authors, citation chasing, handsearching, searching trial registers and web searching. There is reasonable consistency between recommended best practice (handbooks) and current practice (methodological studies) as it relates to the application of supplementary search methods. The methodological studies provide useful information on the effectiveness of the supplementary search methods, often seeking to evaluate aspects of the method to improve effectiveness or efficiency. In this way, the studies advance the understanding of the supplementary search methods. Further

  8. Methodology of shooting training using modern IT techniques

    Science.gov (United States)

    Gudzbeler, Grzegorz; Struniawski, Jarosław

    2017-08-01

    Mastering, improving, shaping and preserving the skills of safe, efficient and effective use of firearms requires a relevant methodology for conducting shooting training. However, the reality of police training does not usually allow for intensive shooting training with live ammunition. An alternative solution is the use of modern training technologies, for example the "Virtual system for improving the intervention tactics of services responsible for security, and shooting training." Introducing the simulator into police training will enable complete staff preparation for their tasks, building a potential of knowledge and experience in many areas that far exceeds the capabilities of conventional training.

  9. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  10. Group techniques as a methodological strategy in acquiring teamwork abilities by college students

    Directory of Open Access Journals (Sweden)

    César Torres Martín

    2013-02-01

    Within the framework of the European Higher Education Area, an adaptation of the teaching-learning process is being promoted through pedagogical renewal, introducing into the classroom a larger number of active or participative methodologies in order to give students greater autonomy in that process. This requires incorporating basic skills into the university curriculum, especially teamwork. By means of group techniques, students can acquire interpersonal and cognitive skills, as well as abilities that will enable them to face different group situations throughout their academic and professional careers. These techniques are necessary not only as a methodological strategy in the classroom, but also as a reflection instrument with which students can assess their behavior in groups, with the aim of modifying the behavioral strategies through which relationships with others influence their learning process. Hence the importance of this ability for sensitizing students positively towards collective work. Using the action-research method in the academic classroom during one semester, with systematic interventions based on different group techniques, we present the results obtained through an analysis of the qualitative data, the selected instruments being group discussion and personal reflection.

  11. Acceleration techniques for the discrete ordinate method

    International Nuclear Information System (INIS)

    Efremenko, Dmitry; Doicu, Adrian; Loyola, Diego; Trautmann, Thomas

    2013-01-01

    In this paper we analyze several acceleration techniques for the discrete ordinate method with matrix exponential and the small-angle modification of the radiative transfer equation. These techniques include the left eigenvector matrix approach for computing the inverse of the right eigenvector matrix, the telescoping technique, and the method of false discrete ordinate. The numerical simulations have shown that, on average, the relative speedups of the left eigenvector matrix approach and the telescoping technique are about 15% and 30%, respectively. -- Highlights: ► We presented the left eigenvector matrix approach. ► We analyzed the method of false discrete ordinate. ► The telescoping technique is applied for the matrix operator method. ► The considered techniques accelerate the computations by 20% on average.
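
    A small numpy sketch of the left-eigenvector idea mentioned in the abstract, on a random stand-in matrix; the pairing and normalisation details below are assumptions about one standard way to realise it. Properly paired and scaled left eigenvectors give the inverse of the right-eigenvector matrix without an explicit matrix inversion.

      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.standard_normal((5, 5))

      w_r, V = np.linalg.eig(A)      # right eigenvectors: columns of V
      w_l, U = np.linalg.eig(A.T)    # left eigenvectors:  columns of U

      # Pair each left eigenvector with the right eigenvector of the same eigenvalue.
      order = np.array([np.argmin(np.abs(w_l - lam)) for lam in w_r])
      U = U[:, order]

      # Scale each left vector so that u_k . v_k = 1; then U^T is the inverse of V.
      U = U / np.einsum("ij,ij->j", U, V)
      V_inv = U.T

      print(np.max(np.abs(V_inv @ V - np.eye(5))))   # ~1e-14: numerically the identity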

  12. Dosimetric methodology for extremities of individuals occupationally exposed to beta radiation using the optically stimulated luminescence technique

    International Nuclear Information System (INIS)

    Pinto, Teresa Cristina Nathan Outeiro

    2010-01-01

    A dosimetric methodology was established for the determination of extremity doses of individuals occupationally exposed to beta radiation, using Al2O3:C detectors and the microStar (Landauer) optically stimulated luminescence (OSL) reader system. The main parts of the work were: characterization of the dosimetric material Al2O3:C using the OSL technique; establishment of the dose evaluation methodology; dose rate determination of beta radiation sources; application of the established method in a practical test with individuals occupationally exposed to beta radiation during a simulated calibration of clinical applicators; and validation of the methodology by comparing the dose results of the practical test obtained with the OSL and thermoluminescence (TL) techniques. The results show that both the Al2O3:C detectors and the OSL technique may be used for individual extremity monitoring in beta radiation fields. (author)

  13. New approaches in intelligent control techniques, methodologies and applications

    CERN Document Server

    Kountchev, Roumen

    2016-01-01

    This volume introduces new approaches in intelligent control area from both the viewpoints of theory and application. It consists of eleven contributions by prominent authors from all over the world and an introductory chapter. This volume is strongly connected to another volume entitled "New Approaches in Intelligent Image Analysis" (Eds. Roumen Kountchev and Kazumi Nakamatsu). The chapters of this volume are self-contained and include summary, conclusion and future works. Some of the chapters introduce specific case studies of various intelligent control systems and others focus on intelligent theory based control techniques with applications. A remarkable specificity of this volume is that three chapters are dealing with intelligent control based on paraconsistent logics.

  14. Theoretical Significance in Q Methodology: A Qualitative Approach to a Mixed Method

    Science.gov (United States)

    Ramlo, Susan

    2015-01-01

    Q methodology (Q) has offered researchers a unique scientific measure of subjectivity since William Stephenson's first article in 1935. Q's focus on subjectivity includes self-referential meaning and interpretation. Q is most often identified with its technique (Q-sort) and its method (factor analysis to group people); yet, it consists of a…

  15. Computational techniques of the simplex method

    CERN Document Server

    Maros, István

    2003-01-01

    Computational Techniques of the Simplex Method is a systematic treatment focused on the computational issues of the simplex method. It provides a comprehensive coverage of the most important and successful algorithmic and implementation techniques of the simplex method. It is a unique source of essential, never discussed details of algorithmic elements and their implementation. On the basis of the book the reader will be able to create a highly advanced implementation of the simplex method which, in turn, can be used directly or as a building block in other solution algorithms.

  16. Detectors for LEP: methods and techniques

    International Nuclear Information System (INIS)

    Fabjan, C.

    1979-01-01

    This note surveys detection methods and techniques of relevance for the LEP physics programme. The basic principles of the detector physics are sketched, as recent improvement in understanding points towards improvements and also limitations in performance. Development and present status of large detector systems is presented and permits some conservative extrapolations. State-of-the-art techniques and technologies are presented and their potential use in the LEP physics programme assessed. (Auth.)

  17. A methodology for semiautomatic taxonomy of concepts extraction from nuclear scientific documents using text mining techniques

    International Nuclear Information System (INIS)

    Braga, Fabiane dos Reis

    2013-01-01

    This thesis presents a text mining method for the semi-automatic extraction of a taxonomy of concepts from a textual corpus composed of scientific papers related to the nuclear area. Text classification is a natural human practice and a crucial task for working with large repositories. Document clustering techniques provide a logical and understandable framework that facilitates organization, browsing and searching. Most clustering algorithms use the bag-of-words model to represent the content of a document; this model generates high-dimensional data, ignores the fact that different words can have the same meaning, and does not consider relationships between words, assuming that words are independent of each other. The methodology presented combines a concept-based model of document representation with a hierarchical document clustering method using the frequency of co-occurrence of concepts, and a technique for labeling clusters with their most representative concepts, with the objective of producing a taxonomy of concepts that reflects the structure of the knowledge domain. It is hoped that this work will contribute to the conceptual mapping of the scientific production of the nuclear area and thus support the management of research activities in this area. (author)
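
    A compact Python sketch of the clustering step only (the concept extraction itself is not reproduced): concepts already assigned to documents are grouped by co-occurrence with an agglomerative method, which is the kind of hierarchy a taxonomy can be read from. The documents, concepts and parameters are invented for illustration.

      import numpy as np
      from scipy.cluster.hierarchy import linkage, fcluster
      from scipy.spatial.distance import squareform

      # Each "document" is the set of concepts extracted from it (invented examples).
      docs = [
          {"reactor", "neutron flux", "fuel element"},
          {"reactor", "fuel element", "burnup"},
          {"dosimetry", "TLD", "calibration"},
          {"dosimetry", "calibration", "neutron flux"},
      ]
      concepts = sorted(set().union(*docs))

      # Co-occurrence matrix: number of documents in which two concepts appear together.
      co = np.zeros((len(concepts), len(concepts)))
      for d in docs:
          idx = [concepts.index(c) for c in d]
          for i in idx:
              for j in idx:
                  co[i, j] += 1

      # Turn co-occurrence into a distance and cluster agglomeratively.
      sim = co / np.sqrt(np.outer(np.diag(co), np.diag(co)))
      dist = 1.0 - sim
      np.fill_diagonal(dist, 0.0)
      Z = linkage(squareform(dist, checks=False), method="average")

      for concept, label in zip(concepts, fcluster(Z, t=2, criterion="maxclust")):
          print(label, concept)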

  18. Variable identification in group method of data handling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Iraci Martinez, E-mail: martinez@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Bueno, Elaine Inacio [Instituto Federal de Educacao, Ciencia e Tecnologia, Guarulhos, SP (Brazil)

    2011-07-01

    The Group Method of Data Handling (GMDH) is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on the underlying assumption that the data can be modeled using an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and Artificial Neural Network (ANN) methodologies and applied to the IPEN research reactor IEA-R1. The GMDH was used to study the best set of variables to be used to train an ANN, resulting in the best estimates of the monitored variables. The system performs the monitoring by comparing these estimated values with measured ones. The IPEN reactor data acquisition system is composed of 58 variables (process and nuclear variables). As GMDH is a self-organizing methodology, the choice of input variables is made automatically, and the actual input variables used in the Monitoring and Diagnosis System were not shown in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, resulting in an identification of the variables that compose the best Monitoring and Diagnosis Model. (author)

  19. Variable identification in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2011-01-01

    The Group Method of Data Handling (GMDH) is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has traditionally been determined using a layer-by-layer pruning process based on a preselected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on the underlying assumption that the data can be modeled using an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and Artificial Neural Network (ANN) methodologies and applied to the IPEN research reactor IEA-R1. The GMDH was used to study the best set of variables to be used to train an ANN, resulting in the best estimates of the monitored variables. The system performs the monitoring by comparing these estimated values with measured ones. The IPEN reactor data acquisition system is composed of 58 variables (process and nuclear variables). As GMDH is a self-organizing methodology, the choice of input variables is made automatically, and the actual input variables used in the Monitoring and Diagnosis System were not shown in the final result. This work presents a study of variable identification in the GMDH methodology by means of an algorithm that works in parallel with the GMDH algorithm and traces the paths of the initial variables, resulting in an identification of the variables that compose the best Monitoring and Diagnosis Model. (author)
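
    The following Python sketch shows a single GMDH layer in the spirit described above; the data, split sizes and quadratic node form are illustrative assumptions, not the IPEN reactor variables. Each pair of inputs is fitted with an Ivakhnenko polynomial on a training split and ranked on a separate selection split, so the surviving pairs identify the most informative input variables.

      import numpy as np
      from itertools import combinations

      rng = np.random.default_rng(1)
      X = rng.standard_normal((200, 5))                    # 5 candidate input variables
      y = 2 * X[:, 0] - X[:, 3] + 0.5 * X[:, 0] * X[:, 3] + 0.1 * rng.standard_normal(200)

      train, select = slice(0, 140), slice(140, 200)       # fitting vs selection split

      def design(x1, x2):
          """Quadratic (Ivakhnenko) polynomial terms for one candidate node."""
          return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

      scores = []
      for i, j in combinations(range(X.shape[1]), 2):
          A = design(X[train, i], X[train, j])
          coef, *_ = np.linalg.lstsq(A, y[train], rcond=None)
          pred = design(X[select, i], X[select, j]) @ coef
          mse = np.mean((pred - y[select]) ** 2)
          scores.append((mse, (i, j)))

      # The lowest-error pairs identify the most informative input variables.
      for mse, pair in sorted(scores)[:3]:
          print(pair, round(mse, 4))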

  20. CASE METHOD. ACTIVE LEARNING METHODOLOGY TO ACQUIRE SIGNIFICANT LEARNING IN CHEMISTRY

    Directory of Open Access Journals (Sweden)

    Clotilde Pizarro

    2015-09-01

    In this paper the case method is applied to first-year students of the Engineering in Risk Prevention and Environment programme. For this purpose, a real case of contamination that occurred at a school called "La Greda" in the Valparaiso region is presented. The application starts by delivering an extract of the information collected from the media, together with a brief induction on the methodology to be applied. A plenary session is then held, in which possible solutions to the problem are debated and a relationship is established between the case and the content of the chemistry programme. It is concluded that the application of the case method was a fruitful tool in terms of the results obtained by the students, since the pass rate was 75%, which is considerably higher than in previous years.

  1. Variance Reduction Techniques in Monte Carlo Methods

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Ridder, A.A.N.; Rubinstein, R.Y.

    2010-01-01

    Monte Carlo methods are simulation algorithms to estimate a numerical quantity in a statistical model of a real system. These algorithms are executed by computer programs. Variance reduction techniques (VRT) are needed, even though computer speed has been increasing dramatically, ever since the
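
    As a hedged illustration of one such VRT, the Python sketch below applies antithetic variates to the toy integral E[exp(U)] with U ~ Uniform(0, 1); the example is invented for illustration and is not taken from the chapter. Pairing u with 1 - u makes the two evaluations negatively correlated, so the paired estimator has a lower variance for the same number of function evaluations.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      u = rng.random(n)
      plain = np.exp(u)                                             # crude Monte Carlo, n evaluations
      anti = 0.5 * (np.exp(u[: n // 2]) + np.exp(1 - u[: n // 2]))  # n//2 antithetic pairs, n evaluations

      print("true value          :", np.e - 1)
      print("crude estimate      :", plain.mean(), " estimator variance:", plain.var() / n)
      print("antithetic estimate :", anti.mean(),  " estimator variance:", anti.var() / (n // 2))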

  2. Applying Mixed Methods Techniques in Strategic Planning

    Science.gov (United States)

    Voorhees, Richard A.

    2008-01-01

    In its most basic form, strategic planning is a process of anticipating change, identifying new opportunities, and executing strategy. The use of mixed methods, blending quantitative and qualitative analytical techniques and data, in the process of assembling a strategic plan can help to ensure a successful outcome. In this article, the author…

  3. Methodological study of volcanic glass dating by fission track method

    International Nuclear Information System (INIS)

    Araya, A.M.O.

    1987-01-01

    After a description of the method, and starting from an analysis of the age equation, we show the methodology used to plot the correction curve, along with the results of the study of correction curves and corrected ages. A study of the size-correction method shows that the effect of reactor irradiation on the curve is negligible and that the correction curve is independent of the thermal treatment, although it depends on the chemical treatment and on the sample. Comparing the corrected ages obtained from both correction methods with the ages given by other authors, we conclude that they are in agreement; concerning the plateau method, both the isothermal and the isochronal plateau give the same results. (author) [pt

  4. Methodology for qualitative content analysis with the technique of mind maps using Nvivo and FreeMind softwares

    Directory of Open Access Journals (Sweden)

    José Leonardo Oliveira Lima

    2016-12-01

    Introduction: In a survey it is not enough to choose tools, resources and procedures; it is important to understand the method beyond the techniques and its relationship with philosophy, epistemology and methodology. Objective: To discuss theoretical and methodological concerns about qualitative research in Information Science and the process of Qualitative Content Analysis (QCA) in the user studies field, and to show a path for QCA integrated with the mind map technique for developing categories and indicators, using Qualitative Data Analysis Software (QDAS) and mind-map design tools. Methodology: The research was descriptive, methodological, bibliographical and field-based, conducted with open interviews that were processed using the QCA method with the support of the QDAS NVivo and the FreeMind software for mind map design. Results: The theory of qualitative research and QCA is presented, together with a methodological path for QCA using the techniques and software mentioned above. Conclusions: When it comes to qualitative research, the theoretical framework suggests the need for more dialogue between Information Science and other disciplines. The process of QCA evidenced a viable path that might help further related investigations using QDAS, as well as the contribution of mind maps and their design software to developing the indicators and categories of QCA.

  5. Empowerment methods and techniques for sport managers

    Directory of Open Access Journals (Sweden)

    THANOS KRIEMADIS

    2006-01-01

    We live in a globalized economic, social and technological environment where organizations can be successful only if they have the required resources (material resources, facilities and equipment, and human resources). Managers and organizations should empower and enable employees to accomplish their work in meaningful ways. Empowerment has been described as a means to enable employees to make decisions and as a personal phenomenon whereby individuals take responsibility for their own actions. The aim of the present study was to present effective methods and techniques of employee empowerment which constitute a source of competitive advantage for the organization. The paper presents and explains empowerment methods and techniques such as: (a) organizational culture, (b) vision statements, (c) organizational values, (d) teamwork, (e) the role of the manager - leadership, (f) devolving responsibility and accountability, (g) information sharing, (h) continuous training, (i) appraisal and rewards, (j) goal setting, and (k) the performance appraisal process.

  6. Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques

    Science.gov (United States)

    2017-11-01

    Methodology for Designing and Developing a New Ultra-Wideband Antenna Based on Bio-Inspired Optimization Techniques, by Canh Ly, Nghia Tran, and Ozlem Kilic (Research Laboratory). Approved for public release.

  7. Methods and experimental techniques in computer engineering

    CERN Document Server

    Schiaffonati, Viola

    2014-01-01

    Computing and science reveal a synergic relationship. On the one hand, it is widely evident that computing plays an important role in the scientific endeavor. On the other hand, the role of scientific method in computing is getting increasingly important, especially in providing ways to experimentally evaluate the properties of complex computing systems. This book critically presents these issues from a unitary conceptual and methodological perspective by addressing specific case studies at the intersection between computing and science. The book originates from, and collects the experience of, a course for PhD students in Information Engineering held at the Politecnico di Milano. Following the structure of the course, the book features contributions from some researchers who are working at the intersection between computing and science.

  8. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital instrumentation and control (I and C) systems in nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages; if two or more techniques can be incorporated in a complementary way, the resulting SSA combination is more acceptable. Consequently, if proper evaluation criteria are available, the analyst can choose an appropriate combination of techniques on the basis of the available resources. This research evaluated the software safety analysis techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined indexes reflecting their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts to choose the best SSA combination and arrange their own software safety plans. With the proposed method, analysts can evaluate various SSA combinations for specific purposes. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio; their disadvantages are completeness and complexity.

  9. Convergence studies of deterministic methods for LWR explicit reflector methodology

    International Nuclear Information System (INIS)

    Canepa, S.; Hursin, M.; Ferroukhi, H.; Pautz, A.

    2013-01-01

    The standard approach in modern 3-D core simulators, employed either for steady-state or transient simulations, is to use albedo coefficients or explicit reflectors at the core axial and radial boundaries. In the latter approach, few-group homogenized nuclear data are a priori produced with lattice transport codes using 2-D reflector models. Recently, the explicit reflector methodology of the deterministic CASMO-4/SIMULATE-3 code system was identified to potentially constitute one of the main sources of errors for core analyses of the Swiss operating LWRs, which all belong to the GII design. Considering that some of the new GIII designs will rely on very different reflector concepts, a review and assessment of the reflector methodology for various LWR designs appeared relevant. Therefore, the purpose of this paper is to first recall the concepts of the explicit reflector modelling approach as employed by CASMO/SIMULATE. Then, for selected reflector configurations representative of both GII and GIII designs, a benchmarking of the few-group nuclear data produced with the deterministic lattice code CASMO-4 and its successor CASMO-5 is conducted. On this basis, a convergence study with regard to geometrical requirements when using deterministic methods with 2-D homogeneous models is conducted, and the effect on the downstream 3-D core analysis accuracy is evaluated for a typical GII reflector design in order to assess the results against available plant measurements. (authors)

  10. A novelty detection diagnostic methodology for gearboxes operating under fluctuating operating conditions using probabilistic techniques

    Science.gov (United States)

    Schmidt, S.; Heyns, P. S.; de Villiers, J. P.

    2018-02-01

    In this paper, a fault diagnostic methodology is developed which is able to detect, locate and trend gear faults under fluctuating operating conditions when only vibration data from a single transducer, measured on a healthy gearbox are available. A two-phase feature extraction and modelling process is proposed to infer the operating condition and based on the operating condition, to detect changes in the machine condition. Information from optimised machine and operating condition hidden Markov models are statistically combined to generate a discrepancy signal which is post-processed to infer the condition of the gearbox. The discrepancy signal is processed and combined with statistical methods for automatic fault detection and localisation and to perform fault trending over time. The proposed methodology is validated on experimental data and a tacholess order tracking methodology is used to enhance the cost-effectiveness of the diagnostic methodology.
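    As a rough illustration of the discrepancy-signal idea, the sketch below scores new vibration-feature windows against a hidden Markov model trained on healthy data: the negative per-sample log-likelihood grows as the data departs from the healthy condition. It is only a minimal sketch assuming the hmmlearn library and synthetic features; the model structure, feature extraction and combination with operating-condition models used by the authors are not reproduced here.

        import numpy as np
        from hmmlearn.hmm import GaussianHMM   # assumes hmmlearn is installed

        # Train a "healthy machine" HMM on feature vectors extracted from healthy vibration data.
        rng = np.random.default_rng(0)
        healthy_features = rng.normal(0.0, 1.0, size=(2000, 3))          # placeholder features
        healthy_model = GaussianHMM(n_components=3, covariance_type="diag", n_iter=50)
        healthy_model.fit(healthy_features)

        def discrepancy(window):
            """Negative per-sample log-likelihood under the healthy model."""
            return -healthy_model.score(window) / len(window)

        # Score a healthy-like window and a shifted, fault-like window.
        healthy_window = rng.normal(0.0, 1.0, size=(200, 3))
        faulty_window = rng.normal(1.5, 1.2, size=(200, 3))
        print(discrepancy(healthy_window), discrepancy(faulty_window))   # second value is larger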

  11. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques are applied to reservoirs even before their natural energy drive is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, the already published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters, which can be used towards the optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the ranges of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. Different expert systems are developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
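    The proxy-model idea can be pictured with a generic feedforward regressor standing in for the cascade network described above: train on a set of simulator runs, then predict production for unseen combinations of rock, fluid and design parameters. The sketch below is an assumption-laden illustration using scikit-learn and synthetic data, not the authors' network architecture or training set.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for reservoir simulator runs: inputs are rock/fluid/design
        # parameters (e.g. porosity, permeability, thickness, viscosity, injection rate),
        # output is cumulative oil production in arbitrary units.
        rng = np.random.default_rng(1)
        X = rng.uniform(0.0, 1.0, size=(500, 5))
        y = 100 * X[:, 0] + 50 * np.log1p(X[:, 1]) + 20 * X[:, 2] * X[:, 4] + rng.normal(0, 2, 500)

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=1)
        proxy = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=1)
        proxy.fit(X_train, y_train)
        print("R^2 on held-out runs:", proxy.score(X_test, y_test))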

  12. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

    This research evaluated the applicable software hazard identification techniques nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis; and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. By this proposed method, the analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail and signal/noise ratio. However, their disadvantages are completeness, complexity and implementation cost. This evaluation method can be a platform to reach common consensus among the stakeholders. Following the evolution of software hazard identification techniques, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)

  13. Searching for rigour in the reporting of mixed methods population health research: a methodological review.

    Science.gov (United States)

    Brown, K M; Elliott, S J; Leatherdale, S T; Robertson-Wilson, J

    2015-12-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing rigour in quantitative and qualitative research, there is poor consensus regarding rigour in mixed methods. Using the empirical example of school-based obesity interventions, this methodological review examined how mixed methods have been used and reported, and how rigour has been addressed. Twenty-three peer-reviewed mixed methods studies were identified through a systematic search of five databases and appraised using the guidelines for Good Reporting of a Mixed Methods Study. In general, more detailed description of data collection and analysis, integration, inferences and justifying the use of mixed methods is needed. Additionally, improved reporting of methodological rigour is required. This review calls for increased discussion of practical techniques for establishing rigour in mixed methods research, beyond those for quantitative and qualitative criteria individually. A guide for reporting mixed methods research in population health should be developed to improve the reporting quality of mixed methods studies. Through improved reporting, mixed methods can provide strong evidence to inform policy and practice.

  14. Techniques and methods in nuclear materials traceability

    International Nuclear Information System (INIS)

    Persiani, P.J.

    1996-01-01

    The nonproliferation community is currently addressing concerns that the access to special nuclear materials may increase the illicit trafficking in weapons-usable materials from civil and/or weapons material stores and/or fuel cycle systems. Illicit nuclear traffic usually involves reduced quantities of nuclear materials, perhaps as samplings of a potential protracted diversionary flow from sources to users. To counter illicit nuclear transactions requires the development of techniques and methods in nuclear material traceability as an important phase of a broad forensic analysis capability. This report discusses how isotopic signatures and correlation methods were applied to determine the origins of Highly Enriched Uranium (HEU) and plutonium samples reported in cases of illicit trafficking in nuclear materials.

  15. Test cases for interface tracking methods: methodology and current status

    International Nuclear Information System (INIS)

    Lebaigue, O.; Jamet, D.; Lemonnier, E.

    2004-01-01

    Full text of publication follows: In the past decade, a large number of new methods have been developed to deal with interfaces in the numerical simulation of two-phase flows. We have collected a set of 36 test cases, which can be seen as a tool to help engineers and researchers selecting the most appropriate method(s) for their specific fields of application. This set can be used: - To perform an initial evaluation of the capabilities of available methods with regard to the specificity of the final application and the most important features to be recovered from the simulation. - To measure the maximum mesh size to be used for a given physical problem in order to obtain an accurate enough solution. - To assess and quantify the performances of a selected method equipped with its set of physical models. The computation of a well-documented test case allows estimating the error due to the numerical technique by comparison with reference solutions. This process is compulsory to gain confidence and credibility in the prediction capabilities of a numerical method and its physical models. - To broaden the capabilities of a given numerical technique. The test cases may be used to identify the need for improvement of the overall numerical scheme or to determine the physical part of the model which is responsible for the observed limitations. Each test case falls within one of the following categories: - Analytical solutions of well-known sets of equations corresponding to simple geometrical situations. - Reference numerical solutions of moderately complex problems, produced by accurate methods (e.g., boundary-fitted coordinate methods) on refined meshes. - Separate effects analytical experiments. The presentation will suggest how to use the test cases for assessing the physical models and the numerical methods. The expected fallout of using test cases is indeed on the one hand to identify the merits of existing methods and on the other hand to orient further research towards

  16. In vitro cumulative gas production techniques: history, methodological considerations and challenges

    NARCIS (Netherlands)

    Rymer, C.; Huntington, J.A.; Williams, B.A.; Givens, D.I.

    2005-01-01

    Methodology used to measure in vitro gas production is reviewed to determine impacts of sources of variation on resultant gas production profiles (GPP). Current methods include measurement of gas production at constant pressure (e.g., use of gas tight syringes), a system that is inexpensive, but may

  17. Star identification methods, techniques and algorithms

    CERN Document Server

    Zhang, Guangjun

    2017-01-01

    This book summarizes the research advances in star identification that the author’s team has made over the past 10 years, systematically introducing the principles of star identification, general methods, key techniques and practicable algorithms. It also offers examples of hardware implementation and performance evaluation for the star identification algorithms. Star identification is the key step for celestial navigation and greatly improves the performance of star sensors, and as such the book includes the fundamentals of star sensors and celestial navigation, the processing of the star catalog and star images, star identification using modified triangle algorithms, star identification using star patterns and using neural networks, rapid star tracking using star matching between adjacent frames, as well as implementation hardware and performance tests for star identification. It is not only valuable as a reference book for star sensor designers and researchers working in pattern recognition and othe...

  18. Issues in Learning About and Teaching Qualitative Research Methods and Methodology in the Social Sciences

    Directory of Open Access Journals (Sweden)

    Franz Breuer

    2007-01-01

    Full Text Available For many qualitative researchers in the social sciences, learning about and teaching qualitative research methods and methodology raises a number of questions. This topic was the focus of a symposium held during the Second Berlin Summer School for Qualitative Research Methods in July 2006. In this contribution, some of the issues discussed during the symposium are taken up and extended, and some basic dimensions underlying these issues are summarized. How qualitative research methods and methodology are taught is closely linked to the ways in which qualitative researchers in the social sciences conceptualize themselves and their discipline. In the following, we distinguish between a paradigmatic and a pragmatic view. From a pragmatic point of view, qualitative research methods are considered research strategies or techniques and can be taught in the sense of recipes with specific steps to be carried out. According to a paradigmatic point of view (strongly inspired by constructivism), qualitative research methods and methodology are conceptualized as a craft to be practiced together by a "master" and an "apprentice." Moreover, the teaching of qualitative research methods also depends heavily on the institutional standing of qualitative compared to quantitative research methods. Based on these considerations, five basic dimensions of learning about and teaching qualitative research methods are suggested: ways of teaching (ranging from the presentation of textbook knowledge to cognitive apprenticeship) and instructors' experience with these; institutional contexts, including their development and the teaching of qualitative research methods in other than university contexts; the "fit" between personality and method, including relevant personal skills and talents; and, as a special type of instructional context that increasingly has gained importance, distance learning and its implications for learning about and teaching qualitative research methods

  19. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1992-01-01

    The Analysis and Testing Group (WX-11) of the Design Engineering Division at Los Alamos National Laboratory (LANL) is developing methodology for designing and providing a basis for certification of Type B shipping containers. This methodology will include design, analysis, testing, fabrication, procurement, and obtaining certification of the Type B containers, allowing usage in support of the United States Department of Energy programs. While all aspects of the packaging development are included in this methodology, this paper focuses on the use of analysis and testing techniques for enhancing the design and providing a basis for certification. This methodology is based on concurrent engineering principles. Multidisciplinary teams within LANL are responsible for the design and certification of specific Type B Radioactive Material Shipping Containers. These teams include personnel with the various backgrounds and areas of expertise required to support the design, testing, analysis and certification tasks. To demonstrate that a package can pass all the performance requirements, the design needs to be characterized as completely as possible. Understanding package responses to the various environments and how these responses influence the effectiveness of the packaging requires expertise in several disciplines. In addition to characterizing the shipping container designs, these multidisciplinary teams should be able to provide insight into improving new package designs

  20. New design methods for computer aided architectural design methodology teaching

    NARCIS (Netherlands)

    Achten, H.H.

    2003-01-01

    Architects and architectural students are exploring new ways of design using Computer Aided Architectural Design software. This exploration is seldom backed up from a design methodological viewpoint. In this paper, a design methodological framework for reflection on innovative design processes by

  1. Accelerated Testing Methodology in Constant Stress-Rate Testing for Advanced Structural Ceramics: A Preloading Technique

    Science.gov (United States)

    Choi, Sung R.; Gyekenyesi, John P.; Huebert, Dean; Bartlett, Allen; Choi, Han-Ho

    2001-01-01

    The preloading technique was used as a means of accelerated testing in constant stress-rate (dynamic fatigue) testing of two different brittle materials. The theory developed previously for fatigue strength as a function of preload was further verified through extensive constant stress-rate testing for glass-ceramic and CRT glass in room-temperature distilled water. The preloading technique was also used in this study to identify the prevailing failure mechanisms at elevated temperatures, particularly at lower test rates in which a series of mechanisms would be associated simultaneously with material failure, resulting in significant strength increase or decrease. Two different advanced ceramics, SiC whisker-reinforced composite silicon nitride and 96 wt% alumina, were used at elevated temperatures. It was found that the preloading technique can be used as an additional tool to pinpoint the dominant failure mechanism that is associated with such a phenomenon of considerable strength increase or decrease.
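    The time saving that motivates preloading is easy to see: in constant stress-rate testing the applied stress ramps linearly, so starting the ramp at a preload equal to a fraction of the expected strength removes that fraction of the test duration. The quick sketch below illustrates this with invented strength and stress-rate values; it does not reproduce the fatigue-strength theory verified in the paper.

        # Test duration in constant stress-rate testing: t = (sigma_f - sigma_preload) / rate.
        sigma_f = 300.0                                   # expected fracture strength, MPa (invented)
        for rate in (0.001, 0.01, 0.1):                   # stress rates, MPa/s
            for preload_fraction in (0.0, 0.5, 0.8):
                t = (1.0 - preload_fraction) * sigma_f / rate
                print(f"rate = {rate} MPa/s, preload = {preload_fraction:.0%}: {t / 3600:.1f} h")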

  2. Methods and techniques for prediction of environmental impact

    International Nuclear Information System (INIS)

    1992-04-01

    Environmental impact assessment (EIA) is the procedure that helps decision makers understand the environmental implications of their decisions. The prediction of environmental effects or impacts is an extremely important part of the EIA procedure, and improvements in existing capabilities are needed. Considerable attention is paid within environmental impact assessment and in handbooks on EIA to methods for identifying and evaluating environmental impacts. However, little attention is given to the distribution of information on impact prediction methods. Quantitative and qualitative methods for the prediction of environmental impacts appear to be the two basic approaches for incorporating environmental concerns into the decision-making process. Depending on the nature of the proposed activity and the environment likely to be affected, a combination of both quantitative and qualitative methods is used. Within environmental impact assessment, the accuracy of methods for the prediction of environmental impacts is of major importance, as it provides for sound and well-balanced decision making. Pertinent and effective action to deal with the problems of environmental protection, the rational use of natural resources and sustainable development is only possible given objective methods and techniques for the prediction of environmental impact. Therefore, the Senior Advisers to ECE Governments on Environmental and Water Problems decided to set up a task force, with the USSR as lead country, on methods and techniques for the prediction of environmental impacts in order to undertake a study to review and analyse existing methodological approaches and to elaborate recommendations to ECE Governments. The work of the task force was completed in 1990 and the resulting report, with all relevant background material, was approved by the Senior Advisers to ECE Governments on Environmental and Water Problems in 1991. The present report reflects the situation, state of

  3. Assessment of change in knowledge about research methods among delegates attending research methodology workshop

    Directory of Open Access Journals (Sweden)

    Manisha Shrivastava

    2018-01-01

    Conclusion: There was an increase in the knowledge of the delegates after attending the research methodology workshop. Participatory research methodology workshops are a good method of imparting knowledge, although the long-term effects still need to be evaluated.

  4. Systematization of types and methods of radiation therapy methods and techniques of irradiation of patients

    International Nuclear Information System (INIS)

    Vajnberg, M.Sh.

    1991-01-01

    The paper is concerned with the principles of systematization and classification of types and methods of radiation therapy, and with approaches to the regulation of its terminology. They are based on the distinction between the concepts of radiation therapy and irradiation of patients. The author gives a concise historical review of the improvement of the methodology of radiation therapy in the course of the development of its methods and facilities. Problems of terminology are discussed. There is a table of types and methods of radiation therapy, and of methods and techniques of irradiation. In the appendices one can find a table of typical legends and examples of graphic signs used to denote methods of irradiation. The potential for practical use of the system is described.

  5. Statistical methods of evaluating and comparing imaging techniques

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    Over the past 20 years several new methods of generating images of internal organs and the anatomy of the body have been developed and used to enhance the accuracy of diagnosis and treatment. These include ultrasonic scanning, radioisotope scanning, computerised X-ray tomography (CT) and magnetic resonance imaging (MRI). The new techniques have made a considerable impact on radiological practice in hospital departments, not least on the investigational process for patients suspected or known to have malignant disease. As a consequence of the increased range of imaging techniques now available, there has developed a need to evaluate and compare their usefulness. Over the past 10 years formal studies of the application of imaging technology have been conducted and many reports have appeared in the literature. These studies cover a range of clinical situations. Likewise, the methodologies employed for evaluating and comparing the techniques in question have differed widely. While not attempting an exhaustive review of the clinical studies which have been reported, this paper aims to examine the statistical designs and analyses which have been used. First a brief review of the different types of study is given. Examples of each type are then chosen to illustrate statistical issues related to their design and analysis. In the final sections it is argued that a form of classification for these different types of study might be helpful in clarifying relationships between them and bringing a perspective to the field. A classification based upon a limited analogy with clinical trials is suggested

  6. Particle fluxes above forests: Observations, methodological considerations and method comparisons

    International Nuclear Information System (INIS)

    Pryor, S.C.; Larsen, S.E.; Sorensen, L.L.; Barthelmie, R.J.

    2008-01-01

    This paper reports a study designed to test, evaluate and compare micro-meteorological methods for determining the particle number flux above forest canopies. Half-hour average particle number fluxes above a representative broad-leaved forest in Denmark derived using eddy covariance range from -7 × 10⁷ m⁻² s⁻¹ (1st percentile) to 5 × 10⁷ m⁻² s⁻¹ (99th percentile), and have a median value of -1.6 × 10⁶ m⁻² s⁻¹. The statistical uncertainties associated with the particle number flux estimates are larger than those for momentum fluxes and imply that in this data set approximately half of the particle number fluxes are not statistically different to zero. Particle number fluxes from relaxed eddy accumulation (REA) and eddy covariance are highly correlated and of almost identical magnitude. Flux estimates from the co-spectral and dissipation methods are also correlated with those from eddy covariance but exhibit higher absolute magnitude of fluxes. - Number fluxes of ultra-fine particles over a forest computed using four micro-meteorological techniques are highly correlated but vary in magnitude
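    In the eddy covariance method, the particle number flux over an averaging period is the covariance of the fluctuations of vertical wind speed and particle number concentration. The sketch below only illustrates that calculation on a synthetic half-hour series; the sampling rate, units and built-in anti-correlation are assumptions made for the example.

        import numpy as np

        # Synthetic 30-minute series at 10 Hz: vertical wind w (m/s) and particle
        # number concentration c (particles per m^3).
        rng = np.random.default_rng(2)
        n = 30 * 60 * 10
        w = rng.normal(0.0, 0.3, n)
        c = 5e9 + 2e8 * rng.normal(size=n) - 1e8 * w        # deposition-like anti-correlation

        # Eddy covariance flux: F = mean(w' c'), primes denoting deviations from the mean.
        flux = np.mean((w - w.mean()) * (c - c.mean()))     # particles m^-2 s^-1; negative = deposition
        print(f"particle number flux: {flux:.2e} m^-2 s^-1")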

  7. Theoretical and methodological basis of the comparative historical and legal method development

    Directory of Open Access Journals (Sweden)

    Д. А. Шигаль

    2015-05-01

    Full Text Available Problem setting. Development of any scientific method is always both a question of its structural and functional characteristics and place in the system of scientific methods, and a comment as for practicability of such methodological work. This paper attempts to give a detailed response to the major comments and objections arising in respect of the separation as an independent means of special and scientific knowledge of comparative historical and legal method. Recent research and publications analysis. Analyzing research and publications within the theme of the scientific article, it should be noted that attention to methodological issues of both general and legal science at the time was paid by such prominent foreign and domestic scholars as I. D. Andreev, Yu. Ya. Baskin, O. L. Bygych, M. A. Damirli, V. V. Ivanov, I. D. Koval'chenko, V. F. Kolomyitsev, D. V. Lukyanov, L. A. Luts, J. Maida, B. G. Mogilnytsky, N. M. Onishchenko, N. M. Parkhomenko, O. V. Petryshyn, S. P. Pogrebnyak, V. I. Synaisky, V. M. Syryh, O. F. Skakun, A. O. Tille, D. I. Feldman and others. It should be noted that, despite a large number of scientific papers in this field, the interest of research partnership in the methodology of history of state and law science still unfairly remains very low. Paper objective. The purpose of this scientific paper is theoretical and methodological rationale for the need of separation and development of comparative historical and legal method in the form of answers to more common questions and objections that arise in scientific partnership in this regard. Paper main body. Development of comparative historical and legal means of knowledge is quite justified because it meets the requirements of the scientific method efficiency, which criteria are the speed for achieving this goal, ease of use of one or another way of scientific knowledge, universality of research methods, convenience of techniques that are used and so on. Combining the

  8. Radiologic examination of orthopaedics. Methods and techniques

    International Nuclear Information System (INIS)

    Hafner, E.; Meuli, H.C.

    1976-01-01

    This volume describes in detail radiological examinations of the skeleton and modern procedures in orthopaedic surgery. Special emphasis is given to functional examination techniques based upon the authors' extensive work on standardized radiological examinations best suited to the needs of orthopaedic surgeons. These techniques were developed at the Radiodiagnostic Department of the Central Radiological Clinic, Bern University, in cooperation with the University Clinic of Orthopaedics and Surgery of the Locomotor System. Exposure techniques are explained concisely, yet with extraordinary precision and attention to detail. They have proved highly successful in teaching programs for X-ray technicians and as standard examination techniques for many hospitals, X-ray departments, orthopaedic units, and private clinics. Recommended for orthopaedic surgeons, radiologists, general surgeons, and X-ray technicians, this definitive treatise, with its superb X-ray reproductions and complementary line drawings, explains how to achieve improved diagnoses and standardized control with the least possible radiation exposure to the patient

  9. Biological indication in aquatic ecosystems. Biological indication in limnic and coastal ecosystems - fundamentals, techniques, methodology

    International Nuclear Information System (INIS)

    Gunkel, G.

    1994-01-01

    Biological methods of water quality evaluation today form an integral part of environmental monitoring and permit to continuously monitor the condition of aquatic ecosystems. They indicate both improvements in water quality following redevelopment measures, and the sometimes insidious deterioration of water quality. This book on biological indication in aquatic ecosystems is a compendium of measurement and evaluation techniques for limnic systems by means of biological parameters. At present, however, an intense discussion of biological evaluation techniques is going on, for one thing as a consequence of the German reunification and the need to unify evaluation techniques, and for another because of harmonizations within the European Community. (orig./EF) [de

  10. Towards the methodological optimization of the moss bag technique in terms of contaminants concentrations and replicability values

    Science.gov (United States)

    Ares, A.; Fernández, J. A.; Carballeira, A.; Aboal, J. R.

    2014-09-01

    The moss bag technique is a simple and economical environmental monitoring tool used to monitor air quality. However, routine use of the method is not possible because the protocols involved have not yet been standardized. Some of the most variable methodological aspects include (i) selection of moss species, (ii) ratio of moss weight to surface area of the bag, (iii) duration of exposure, and (iv) height of exposure. In the present study, the best option for each of these aspects was selected on the basis of the mean concentrations and data replicability of Cd, Cu, Hg, Pb and Zn measured during at least two exposure periods in environments affected by different degrees of contamination. The optimal choices for the studied aspects were the following: (i) Sphagnum denticulatum, (ii) 5.68 mg of moss tissue for each cm² of bag surface, (iii) 8 weeks of exposure, and (iv) 4 m height of exposure. Duration of exposure and height of exposure accounted for most of the variability in the data. The aim of this methodological study was to provide data to help establish a standardized protocol that will enable use of the moss bag technique by public authorities.

  11. Failure modes induced by natural radiation environments on DRAM memories: study, test methodology and mitigation technique

    International Nuclear Information System (INIS)

    Bougerol, A.

    2011-05-01

    DRAMs are frequently used in space and aeronautic systems. Their sensitivity to cosmic radiation has to be known in order to satisfy reliability requirements for critical applications. These evaluations are traditionally done with particle accelerators. However, devices become more complex with technology integration, so new effects appear, inducing longer and more expensive tests. There is a complementary solution: the pulsed laser, which triggers effects similar to those of particles. Thanks to these two test tools, the main DRAM radiation failure modes were studied: SEUs (Single Event Upset) in memory blocks, and SEFIs (Single Event Functional Interrupt) in peripheral circuits. This work demonstrates the influence of test patterns on SEU and SEFI sensitivities depending on the technology used. In addition, this study identifies the origin of the most frequent type of SEFI. Moreover, laser techniques were developed to quantify the sensitive surfaces of the different effects. This work led to a new test methodology for industry, in order to optimize test cost and efficiency using both pulsed laser beams and particle accelerators. Finally, a new fault-tolerance technique is proposed: based on the radiation immunity of DRAM cells when discharged, this technique allows all bits of a logic word to be corrected. (author)

  12. METHODOLOGICAL PRINCIPLES AND METHODS OF TERMS OF TRADE STATISTICAL EVALUATION

    Directory of Open Access Journals (Sweden)

    N. Kovtun

    2014-09-01

    Full Text Available The paper studies the methodological principles and guidance for the statistical evaluation of terms of trade under the United Nations classification model – the Harmonized Commodity Description and Coding System (HS). The practical implementation of the proposed three-stage model of index analysis and estimation of terms of trade for Ukraine's commodity-members for the period 2011-2012 is realized.

  13. Assessment of proliferation resistances of aqueous reprocessing techniques using the TOPS methodology

    International Nuclear Information System (INIS)

    Åberg Lindell, M.; Grape, S.; Håkansson, A.; Jacobsson Svärd, S.

    2013-01-01

    Highlights: • Proliferation resistances of three possible LFR fuel cycles are assessed. • The TOPS methodology has been chosen for the PR assessment. • Reactor operation, reprocessing and fuel fabrication are examined. • Purex, Ganex, and a combination of Purex, Diamex and Sanex, are compared. • The safeguards analysis speaks in favor of Ganex as opposed to the Purex process. - Abstract: The aim of this study is to assess and compare the proliferation resistances (PR) of three possible Generation IV lead-cooled fast reactor fuel cycles, involving the reprocessing techniques Purex, Ganex and a combination of Purex, Diamex and Sanex, respectively. The examined fuel cycle stages are reactor operation, reprocessing and fuel fabrication. The TOPS methodology has been chosen for the PR assessment, and the only threat studied is the case where a technically advanced state diverts nuclear material covertly. According to the TOPS methodology, the facilities have been divided into segments, here roughly representing the different forms of nuclear material occurring in each examined fuel cycle stage. For each segment, various proliferation barriers have been assessed. The results make it possible to pinpoint where the facilities can be improved. The results show that the proliferation resistance of a fuel cycle involving recycling of minor actinides is higher than for the traditional Purex reprocessing cycle. Furthermore, for the purpose of nuclear safeguards, group actinide extraction should be preferred over reprocessing options where pure plutonium streams occur. This is due to the fact that a solution containing minor actinides is less attractive to a proliferator than a pure Pu solution. Thus, the safeguards analysis speaks in favor of Ganex as opposed to the Purex process

  14. Method of insulin determination by radioimmunoassay technique

    Energy Technology Data Exchange (ETDEWEB)

    Kokot, F; Kuska, J [Slaska Akademia Medyczna, Katowice (Poland)

    1973-01-01

    Technical details of a radioimmunological method of insulin determination in blood serum have been presented. Clinical value of the method was checked in 31 healthy subjects following oral or intravenous glucose administration, or after pancreatic islet stimulation using tolbutamide.

  15. Methodology for the application of probabilistic safety assessment techniques (PSA) to the cobalt-therapy units in Cuba

    International Nuclear Information System (INIS)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Troncoso Fleitas, M.; Lozano Lima, B.; Fuente Puch, A. de la; Perez Reyes, Y.; Dumenigo Gonzalez, C.

    2001-01-01

    The application of PSA techniques in nuclear power plants during the last two decades, and the positive results obtained for decision making in relation to safety as a complement to deterministic methods, have increased their use in other nuclear applications. At present a large set of documents from international institutions can be found summarizing the investigations carried out in this field and promoting their use in radioactive facilities. Although still without a mandatory character, the new regulations on radiological safety also promote the complete or partial application of PSA techniques in the safety assessment of radiological practices. The IAEA, through various programmes in which Cuba has participated, is also taking actions to encourage the nuclear community to apply probabilistic risk methods in safety evaluations and decision making. However, since no radioactive installation has yet been the subject of a complete PSA study, certain methodological aspects need to be improved and modified before these techniques can be applied. This work presents the main elements for the use of PSA in the evaluation of the safety of cobalt-therapy units in Cuba. Also presented, as part of the results of the first stage of the study, are the guidelines that are being applied in a Research Contract with the Agency by the authors themselves, who belong to the CNSN, together with other specialists from the Cuban Ministry of Public Health. (author) [es

  16. Covariance methodology applied to 35S disintegration rate measurements by the CIEMAT/NIST method

    International Nuclear Information System (INIS)

    Koskinas, M.F.; Nascimento, T.S.; Yamazaki, I.M.; Dias, M.S.

    2014-01-01

    The Nuclear Metrology Laboratory (LMN) at IPEN is carrying out measurements in an LSC (liquid scintillation counting) system, applying the CIEMAT/NIST method. In this context 35S is an important radionuclide for medical applications, and it is difficult to standardize by other primary methods due to its low beta-ray energy. The CIEMAT/NIST method is a standard technique used by most metrology laboratories in order to improve accuracy and speed up beta-emitter standardization. The focus of the present work was to apply the covariance methodology for determining the overall uncertainty in the 35S disintegration rate. All partial uncertainties involved in the measurements were considered, taking into account all possible correlations between each pair of them. - Highlights: ► 35S disintegration rate measured in a liquid scintillation system using the CIEMAT/NIST method. ► Covariance methodology applied to the overall uncertainty in the 35S disintegration rate. ► Monte Carlo simulation was applied to determine the 35S activity in the 4πβ(PC)-γ coincidence system
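    The covariance methodology referred to above is, in essence, first-order uncertainty propagation with a full covariance matrix of the input quantities, var(A) = Jᵀ Σ J. The sketch below shows that generic calculation for an invented activity expression with an assumed correlation between two inputs; it is not the laboratory's actual uncertainty budget.

        import numpy as np

        # Example quantity: activity A = R / (eps * m), with counting rate R,
        # detection efficiency eps and source mass m (all values invented).
        R, eps, m = 1.2e4, 0.92, 1.05e-3
        A = R / (eps * m)

        # Jacobian of A with respect to (R, eps, m).
        J = np.array([1.0 / (eps * m), -R / (eps**2 * m), -R / (eps * m**2)])

        # Covariance matrix of the inputs: variances on the diagonal plus an assumed
        # correlation of 0.3 between eps and m (e.g. a shared gravimetric step).
        s = np.array([60.0, 0.01, 1.0e-5])          # standard uncertainties of R, eps, m
        Sigma = np.diag(s**2)
        Sigma[1, 2] = Sigma[2, 1] = 0.3 * s[1] * s[2]

        var_A = J @ Sigma @ J
        print(f"A = {A:.3e} +/- {np.sqrt(var_A):.1e}")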

  17. Methods and Techniques of Enrollment Forecasting.

    Science.gov (United States)

    Brinkman, Paul T.; McIntyre, Chuck

    1997-01-01

    There is no right way to forecast college enrollments; in many instances, it will be prudent to use both qualitative and quantitative methods. Methods chosen must be relevant to questions addressed, policies and decisions at stake, and time and talent required. While it is tempting to start quickly, enrollment forecasting is an area in which…

  18. A Mixed Methods Sampling Methodology for a Multisite Case Study

    Science.gov (United States)

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  19. The Analysis of Classroom Talk: Methods and Methodologies

    Science.gov (United States)

    Mercer, Neil

    2010-01-01

    This article describes methods for analysing classroom talk, comparing their strengths and weaknesses. Both quantitative and qualitative methods are described and assessed for their strengths and weaknesses, with a discussion of the mixed use of such methods. It is acknowledged that particular methods are often embedded in particular…

  20. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    International Nuclear Information System (INIS)

    2013-01-01

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results

  1. National and Regional Surveys of Radon Concentration in Dwellings. Review of Methodology and Measurement Techniques

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    Reliable, comparable and 'fit for purpose' results are essential requirements for any decision based on analytical measurements. For the analyst, the availability of tested and validated sampling and analytical procedures is an extremely important tool for carrying out such measurements. For maximum utility, such procedures should be comprehensive, clearly formulated and readily available to both the analyst and the customer for reference. In the specific case of radon surveys, it is very important to design a survey in such a way as to obtain results that can reasonably be considered representative of a population. Since 2004, the Environment Programme of the IAEA has included activities aimed at the development of a set of procedures for the measurement of radionuclides in terrestrial environmental samples. The development of radon measurement procedures for national and regional surveys started with the collection and review of more than 160 relevant scientific papers. On the basis of this review, this publication summarizes the methodology and the measurement techniques suitable for a population representative national or regional survey on radon concentration in the indoor air of dwellings. The main elements of the survey design are described and discussed, such as the sampling scheme, the protocols, the questionnaire and the data analysis, with particular attention to the potential biases that can affect the representativeness of the results. Moreover, the main measurement techniques suitable for national surveys on indoor radon are reviewed, with particular attention to the elements that can affect the precision and accuracy of the results.

  2. Technique and methods in uterine leiomyoma embolization

    International Nuclear Information System (INIS)

    Helmberger, T.K.; Jakobs, T.F.; Reiser, M.F.

    2003-01-01

    Uterine leiomyomas are the most common benign tumors of the female urogenital tract. Besides the classic surgical treatment options, minimally invasive embolization therapy of leiomyomas is increasingly gaining importance worldwide. Technique, complications, and results of uterine leiomyoma embolization are presented. After careful evaluation of the indications for embolization, the procedure is mostly performed under conscious sedation. A single-sided femoral access route together with a cross-over technique generally allows for a flow-directed embolization via both uterine arteries. After embolizing the vessels supplying the tumor, the uterine arteries should still be patent. The success rate of embolization of uterine leiomyomas ranges between 85 and 100%, whereas a reduction in size of the tumors in 42 to 83% and a relief of symptoms in up to 96% of cases can be achieved. The total complication rate is about 10%, with mainly 'minor complications'. Worldwide, only three deaths following embolization of uterine leiomyomas have been reported. The high technical and clinical success rate together with a low complication rate make the embolization of uterine leiomyomas a minimally invasive alternative to the classic treatment. As long-term results are not yet available, the indication for embolization of uterine leiomyomas must be carefully established in consensus with gynecologists. (orig.) [de

  3. Cost Cutting in Hospitals. Innovative Methods & Techniques

    OpenAIRE

    Pandit, Abhijit

    2016-01-01

    There is an emerging need for hospitals to address efficiency issues confronting health care reforms in the present environment. The main objective is to investigate the cost efficiency of hospitals using various methods and variables, and to compare the results estimated by the different methods and variables. Reinforcing a common agenda between medical, paramedical and administrative staff, and sharing a common vision among professionals and decision makers in the planning of care, may be the grea...

  4. The basis spline method and associated techniques

    International Nuclear Information System (INIS)

    Bottcher, C.; Strayer, M.R.

    1989-01-01

    We outline the Basis Spline and Collocation methods for the solution of Partial Differential Equations. Particular attention is paid to the theory of errors, and the handling of non-self-adjoint problems which are generated by the collocation method. We discuss applications to Poisson's equation, the Dirac equation, and the calculation of bound and continuum states of atomic and nuclear systems. 12 refs., 6 figs
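    As a rough illustration of the collocation idea with B-splines, the sketch below solves a simple two-point boundary-value problem, u''(x) = -π² sin(πx) with u(0) = u(1) = 0, by requiring the equation to hold at the Greville points of a cubic B-spline basis. It assumes SciPy and is only a toy example; it does not reproduce the atomic and nuclear applications discussed in the book.

        import numpy as np
        from scipy.interpolate import BSpline

        # Cubic B-spline basis on [0, 1] with clamped (repeated) end knots.
        k = 3
        knots = np.concatenate(([0.0] * k, np.linspace(0.0, 1.0, 12), [1.0] * k))
        n_basis = len(knots) - k - 1
        # Greville abscissae: one collocation point per basis function.
        colloc = np.array([knots[i + 1:i + k + 1].mean() for i in range(n_basis)])

        f = lambda x: -np.pi**2 * np.sin(np.pi * x)      # exact solution is u = sin(pi x)

        # Collocation matrix: second derivative of each basis function at each point,
        # with the first and last rows replaced by the Dirichlet boundary conditions.
        A = np.zeros((n_basis, n_basis))
        for i in range(n_basis):
            c = np.zeros(n_basis); c[i] = 1.0
            b = BSpline(knots, c, k)
            A[:, i] = b.derivative(2)(colloc)
            A[0, i], A[-1, i] = b(0.0), b(1.0)
        rhs = f(colloc); rhs[0] = rhs[-1] = 0.0

        u = BSpline(knots, np.linalg.solve(A, rhs), k)
        print(abs(u(0.5) - np.sin(np.pi * 0.5)))         # small discretization error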

  5. Methodological aspects and development of techniques for neutron activation analysis of microcomponents in materials of geologic origin

    International Nuclear Information System (INIS)

    Cohen, I.M.

    1982-01-01

    Some aspects of the activation analysis methodology applied to geological samples activated in nuclear reactors were studied, and techniques were developed for the determination of various elements in different types of matrices, using gamma spectrometry for the measurement of the products. The consideration of the methodological aspects includes the study of the working conditions, the preparation of samples and standards, irradiations, treatment of the irradiated material, radiochemical separation and measurement. Experiments were carried out on reproducibility and errors in relation to the behaviour of the measurement equipment and that of the methods of area calculation (total area, Covell and Wasson), as well as on the effects of geometry variations on the results of the measurements, the RA-3 reactor's flux variations, and the homogeneity of the samples and standards. Also studied were: the selection of the conditions of determination, including the irradiation and decay times; the irradiation with thermal and epithermal neutrons; the measurement with the use of absorbers, and the resolution of complex peaks. Both non-destructive and radiochemical separation techniques were developed for the analysis of 5 types of geological materials. These methods were applied to the following determinations: a) In, Cd, Mn, Ga and Co in blende; b) La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb and Lu in fluorites; c) La, Ca, Eu, Tb, Yb, Se and Th in barites and celestites; d) Cu and Zn in soils. The spectral interferences or those due to nuclear reactions were studied and evaluated by mathematical calculation. (M.E.L.) [es
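    Of the peak-area methods mentioned (total area, Covell and Wasson), the simplest to picture is the total-area estimate: sum the counts in a region of interest around the photopeak and subtract a linear background interpolated from a few channels on either side. The sketch below illustrates only that method on a synthetic spectrum; the channel numbers, region width and uncertainty expression are assumptions for the example.

        import numpy as np

        def total_peak_area(counts, roi, n_bg=3):
            """Net peak area: gross counts in the ROI minus a linear background
            estimated from n_bg channels on each side of the ROI."""
            lo, hi = roi
            width = hi - lo + 1
            gross = counts[lo:hi + 1].sum()
            bg_per_channel = 0.5 * (counts[lo - n_bg:lo].mean() + counts[hi + 1:hi + 1 + n_bg].mean())
            background = bg_per_channel * width
            net = gross - background
            unc = np.sqrt(gross + background * width / (2.0 * n_bg))   # approximate Poisson uncertainty
            return net, unc

        # Synthetic spectrum: flat background of ~50 counts with a Gaussian peak at channel 100.
        rng = np.random.default_rng(3)
        ch = np.arange(200)
        counts = rng.poisson(50 + 400 * np.exp(-0.5 * ((ch - 100) / 3.0) ** 2))
        print(total_peak_area(counts, roi=(92, 108)))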

  6. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with licensing requirements of regulatory bodies risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis can be highlighted: CCA (Cause- Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  7. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with licensing requirements of regulatory bodies risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that help the process of hazard analysis can be highlighted: CCA (Cause- Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors like motivation of the analysis, available data, complexity of the process being analyzed, expertise available on hazard analysis, and initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the mentioned involved factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  8. A comparison of usability methods for testing interactive health technologies: methodological aspects and empirical evidence.

    Science.gov (United States)

    Jaspers, Monique W M

    2009-05-01

    Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the human-computer interaction field, we provide an overview of the methodological and empirical research available on the three usability inspection and testing methods most often used. We describe two 'expert-based' and one 'user-based' usability method: (1) the heuristic evaluation, (2) the cognitive walkthrough, and (3) the think aloud. All three usability evaluation methods are applied in laboratory settings. Heuristic evaluation is a relatively efficient usability evaluation method with a high benefit-cost ratio, but requires high skills and usability experience of the evaluators to produce reliable results. The cognitive walkthrough is a more structured approach than the heuristic evaluation, with a stronger focus on the learnability of a computer application. Major drawbacks of the cognitive walkthrough are the required level of detail of the task and user background descriptions for an adequate application of the latest version of the technique. The think aloud is a very direct method to gain deep insight into the problems end users encounter in interaction with a system, but data analysis is extensive and requires a high level of expertise both in cognitive ergonomics and in the computer system application domain. Each of the three usability evaluation methods has shown its usefulness and has its own advantages and disadvantages; no single method has revealed any significant results indicating that it is singularly effective in all circumstances. A combination of different techniques that complement one another should preferably be used, as their collective application will be more powerful than any of them applied in isolation. Innovative mobile and automated solutions to support end-user testing have

  9. Application of Response Surface Methodology in Development of Sirolimus Liposomes Prepared by Thin Film Hydration Technique

    Directory of Open Access Journals (Sweden)

    Saeed Ghanbarzadeh

    2013-04-01

    Full Text Available Introduction: The present investigation aimed to optimize the formulation process of sirolimus liposomes by the thin film hydration method. Methods: In this study, a 3² factorial design was used to investigate the influence of two independent variables on the preparation of sirolimus liposomes. The dipalmitoylphosphatidylcholine (DPPC)/Cholesterol (Chol) and dioleoyl phosphoethanolamine (DOPE)/DPPC molar ratios were selected as the independent variables. Particle size (PS) and Encapsulation Efficiency (EE %) were selected as the dependent variables. To separate the un-encapsulated drug, a dialysis method was used. Drug analysis was performed with a validated RP-HPLC method. Results: Using response surface methodology and based on the coefficient values obtained for the independent variables in the regression equations, it was clear that the DPPC/Chol molar ratio was the major contributing variable for particle size and EE %. The use of a statistical approach allowed individual and/or interaction effects of the influencing parameters to be identified, in order to obtain liposomes with the desired properties and to determine the optimum experimental conditions leading to enhanced characteristics. In the prediction of PS and EE % values, the average percent errors were found to be 3.59% and 4.09%. These values are sufficiently low to confirm the high predictive power of the model. Conclusion: Experimental results show that the observed responses were in close agreement with the predicted values, which demonstrates the reliability of the optimization procedure in predicting PS and EE % in sirolimus liposome preparation.
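
    The response-surface step described above can be illustrated with a minimal least-squares fit of a second-order model to a 3² factorial design in coded levels; the response values below are invented and the variable names are only placeholders for the DPPC/Chol and DOPE/DPPC ratios.

```python
# Minimal sketch of a response-surface fit for a 3^2 factorial design, assuming
# coded factor levels (-1, 0, +1). The response values below are invented for
# illustration; the actual DPPC/Chol and DOPE/DPPC data are in the paper.
import numpy as np

levels = [-1.0, 0.0, 1.0]
x1, x2 = np.meshgrid(levels, levels)            # coded DPPC/Chol and DOPE/DPPC ratios
x1, x2 = x1.ravel(), x2.ravel()
y = np.array([310, 290, 300, 270, 255, 265, 250, 240, 260], float)  # e.g. particle size (nm)

# Second-order model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b12, b11, b22 = coeffs
print("fitted coefficients:", np.round(coeffs, 2))

# Predicted response at an arbitrary point of the design space.
def predict(u1, u2):
    return b0 + b1*u1 + b2*u2 + b12*u1*u2 + b11*u1**2 + b22*u2**2

print("predicted PS at (x1=0.5, x2=-0.5):", round(predict(0.5, -0.5), 1), "nm")
```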

  10. Methodology to evaluate the impact of the erosion in cultivated floors applying the technique of the 137CS

    International Nuclear Information System (INIS)

    Gil Castillo, R.; Peralta Vital, J.L.; Carrazana, J.; Riverol, M.; Penn, F.; Cabrera, E.

    2004-01-01

    The present paper shows the results obtained in the framework of two nuclear projects on the application of nuclear techniques to evaluate erosion rates in cultivated soils. Building on investigations with the 137Cs technique carried out in the Province of Pinar del Rio, a methodology to evaluate the erosion impact in cropland was obtained and validated for the first time. The methodology includes all relevant stages for the adequate application of the 137Cs technique, from the initial step of area selection, through the soil sampling process and the selection of the models, to the final step of evaluating the results. During the validation of the methodology in soils of the Municipality of San Juan y Martinez, the erosion rates estimated by the methodology were successfully compared with the values obtained by watershed segment measurements (the traditional technique). The methodology is a technical guide for the adequate application of the 137Cs technique to estimate soil redistribution rates in cultivated soils

  11. Structural modeling techniques by finite element method

    International Nuclear Information System (INIS)

    Kang, Yeong Jin; Kim, Geung Hwan; Ju, Gwan Jeong

    1991-01-01

    This book includes an introduction and table of contents covering: Chapter 1, Finite element idealization (introduction, summary of the finite element method, equilibrium and compatibility in the finite element solution, degrees of freedom, symmetry and antisymmetry, modeling guidelines, local analysis, example, references); Chapter 2, Static analysis (structural geometry, finite element models, analysis procedure, modeling guidelines, references); and Chapter 3, Dynamic analysis (models for dynamic analysis, dynamic analysis procedures, modeling guidelines).

  12. Method and techniques of radioactive waste treatment

    International Nuclear Information System (INIS)

    Ghafar, M.; Aasi, N.

    2002-04-01

    This study describes the characterization of the radioactive wastes produced by the application of radioisotopes in industry and research. The treatment methods for such radioactive wastes, chemical co-precipitation and ion exchange, are described with reference to the technical state of the radioactive waste management facility in Syria. The safe disposal of conditioned radioactive wastes, including the disposal of radioactive sources, is discussed. The characteristics of the repository used to store conditioned radioactive wastes are also described. (author)

  13. Epidemiology of intestinal parasitosis in Italy between 2005 and 2008: diagnostic techniques and methodologies

    Directory of Open Access Journals (Sweden)

    Daniele Crotti

    2013-04-01

    Full Text Available The aim of the study was to obtain an accurate, up-to-date picture, covering 2005-2008, of the diagnostic techniques and methodologies used for intestinal parasites, so that their specific epidemiology could be established and more rational and effective guidelines suggested. All members of AMCLI were invited to take part in a retrospective study of bowel parasites, helminths and protozoa. Participating laboratories were asked how O&P was performed, whether specific searches for E. vermicularis and S. stercoralis were carried out, and whether the recommended specific permanent stains were used for the identification of D. fragilis, Entamoeba histolytica/dispar and Cryptosporidium spp. 23 laboratories agreed to take part, but data from a smaller number could be used for analysis and evaluation. For O&P, only some laboratories performed permanent stains: Giemsa for D. fragilis, antigen detection and/or Trichrome stain for E. histolytica/dispar, antigen detection and/or acid-fast stain for Cryptosporidium spp. Not all laboratories searched specifically for S. stercoralis. The observed epidemiology is therefore differentiated and relates more to the adequacy of the techniques used than to the cohorts of the examined populations. The overall positivity for parasites ranged from 0% to 18.7%; for protozoa (pathogenic or not) from 0% to 14.7%; for nematodes from 0% to 3.7%; for cestodes from 0% to 1.0%; and for trematodes from 0% to 1.0%. Among helminths, E. vermicularis, followed by S. stercoralis, was the most frequent, also in O&P. The specific search for S. stercoralis gave a positivity from 0% to 33.3%; the cellophane tape test was positive for E. vermicularis in 0% to 21.9% of cases. Among pathogenic protozoa, D. fragilis prevailed where permanent stains were applied, from 0% to 16.6%; G. duodenalis ranged from 0.8% to 4.3%; E. histolytica/dispar, using a permanent stain or antigen detection, was identified in 0% to 20.6%. Coccidia were very rare, with Cryptosporidium spp observed from 0% to 5.2%. These are our conclusions

  14. The Five Star Method: A Relational Dream Work Methodology

    Science.gov (United States)

    Sparrow, Gregory Scott; Thurston, Mark

    2010-01-01

    This article presents a systematic method of dream work called the Five Star Method. Based on cocreative dream theory, which views the dream as the product of the interaction between dreamer and dream, this creative intervention shifts the principal focus in dream analysis from the interpretation of static imagery to the analysis of the dreamer's…

  15. VIKOR Technique: A Systematic Review of the State of the Art Literature on Methodologies and Applications

    Directory of Open Access Journals (Sweden)

    Abbas Mardani

    2016-01-01

    Full Text Available The main objective of this paper is to present a systematic review of the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method in several application areas such as sustainability and renewable energy. This study reviewed a total of 176 papers, published from 2004 to 2015 in 83 high-ranking journals, most of which were related to Operational Research, Management Sciences, decision making, sustainability and renewable energy, and were extracted from the Web of Science and Scopus databases. Papers were classified into 15 main application areas. Furthermore, papers were categorized based on the nationalities of the authors, dates of publication, techniques and methods, type of study, the names of the journals, and the purposes of the studies. The results of this study indicated that more papers on the VIKOR technique were published in 2013 than in any other year. In addition, 13 papers were published in the sustainability and renewable energy fields. Furthermore, the VIKOR and fuzzy VIKOR methods ranked first in use. Additionally, Expert Systems with Applications was the most significant journal in this study, with 27 publications on the topic. Finally, Taiwan ranked first among the 22 nationalities that used the VIKOR technique.
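
    For readers unfamiliar with the technique being reviewed, the following is a minimal sketch of the classical VIKOR computation on an invented decision matrix; it is not drawn from any of the reviewed papers.

```python
# Minimal sketch of the classical VIKOR computation on an invented decision
# matrix (rows = alternatives, columns = criteria, all criteria to be maximised).
import numpy as np

F = np.array([[7.0, 6.0, 8.0],    # alternative A1
              [8.0, 4.0, 6.0],    # alternative A2
              [6.0, 9.0, 7.0]])   # alternative A3
w = np.array([0.4, 0.35, 0.25])   # criteria weights
v = 0.5                           # weight of the "majority of criteria" strategy

f_star = F.max(axis=0)            # best value per criterion
f_minus = F.min(axis=0)           # worst value per criterion

# Group utility S and individual regret R for each alternative.
d = w * (f_star - F) / (f_star - f_minus)
S = d.sum(axis=1)
R = d.max(axis=1)

# Compromise index Q (smaller is better).
Q = v * (S - S.min()) / (S.max() - S.min()) + (1 - v) * (R - R.min()) / (R.max() - R.min())

for i in np.argsort(Q):
    print(f"A{i+1}: S={S[i]:.3f}  R={R[i]:.3f}  Q={Q[i]:.3f}")
```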

  16. Analytical techniques for instrument design - matrix methods

    International Nuclear Information System (INIS)

    Robinson, R.A.

    1997-01-01

    We take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalisation to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, we discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix: diagonalisation (Moller-Nielsen method), coordinate changes (e.g. from (Δk_I, Δk_F) to (ΔE, ΔQ) and 2 dummy variables), integration of one or more variables (e.g. over such dummy variables), integration subject to linear constraints (e.g. Bragg's Law for analysers), inversion to give the variance-covariance matrix, and so on. We show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. We will argue that a generalised program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. We will also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question
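
    As an illustration of one item in such a toolbox, the sketch below shows, for a generic positive-definite matrix, that integrating a Gaussian resolution function over a subset of its variables can be handled either via the Schur complement or by inverting to the variance-covariance matrix, taking the relevant sub-block and inverting back; the matrix is random and the code is not the authors' implementation.

```python
# Generic sketch (not the authors' code) of one tool from such a matrix toolbox:
# integrating a Gaussian resolution function exp(-x^T M x / 2) over a subset of
# its variables. The result is governed by the Schur complement of M, or
# equivalently by the corresponding sub-block of the covariance matrix M^-1.
import numpy as np

rng = np.random.default_rng(0)
A = rng.normal(size=(6, 6))
M = A @ A.T + 6 * np.eye(6)        # a symmetric positive-definite 6x6 "resolution matrix"

keep = [0, 1, 2, 3]                # variables we keep (e.g. energy and momentum transfers)
drop = [4, 5]                      # dummy variables to integrate out

# Route 1: Schur complement of the dropped block.
M_kk = M[np.ix_(keep, keep)]
M_kd = M[np.ix_(keep, drop)]
M_dd = M[np.ix_(drop, drop)]
M_marginal = M_kk - M_kd @ np.linalg.inv(M_dd) @ M_kd.T

# Route 2: invert to the variance-covariance matrix, take the sub-block, invert back.
C = np.linalg.inv(M)
M_marginal_2 = np.linalg.inv(C[np.ix_(keep, keep)])

print(np.allclose(M_marginal, M_marginal_2))   # True: both routes agree
```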

  17. Analytical techniques for instrument design - Matrix methods

    International Nuclear Information System (INIS)

    Robinson, R.A.

    1997-01-01

    The authors take the traditional Cooper-Nathans approach, as has been applied for many years for steady-state triple-axis spectrometers, and consider its generalization to other inelastic scattering spectrometers. This involves a number of simple manipulations of exponentials of quadratic forms. In particular, they discuss a toolbox of matrix manipulations that can be performed on the 6-dimensional Cooper-Nathans matrix. They show how these tools can be combined to solve a number of important problems, within the narrow-band limit and the gaussian approximation. They will argue that a generalized program that can handle multiple different spectrometers could (and should) be written in parallel to the Monte-Carlo packages that are becoming available. They also discuss the complementarity between detailed Monte-Carlo calculations and the approach presented here. In particular, Monte-Carlo methods traditionally simulate the real experiment as performed in practice, given a model scattering law, while the Cooper-Nathans method asks the inverse question: given that a neutron turns up in a particular spectrometer configuration (e.g. angle and time of flight), what is the probability distribution of possible scattering events at the sample? The Monte-Carlo approach could be applied in the same spirit to this question

  18. Methodologies of Knowledge Discovery from Data and Data Mining Methods in Mechanical Engineering

    Directory of Open Access Journals (Sweden)

    Rogalewicz Michał

    2016-12-01

    Full Text Available The paper contains a review of methodologies for the process of knowledge discovery from data and of data exploration (Data Mining) methods, which are the most frequently used in mechanical engineering. The methodologies describe various scenarios of data exploration, within which DM methods are used. The paper presents the premises for the use of DM methods in industry, as well as their advantages and disadvantages. The development of methodologies of knowledge discovery from data is also presented, along with a classification of the most widespread Data Mining methods, divided by the type of tasks they perform. The paper is summarized by a presentation of selected Data Mining applications in mechanical engineering.

  19. Thought Suppression Research Methods: Paradigms, Theories, Methodological Concerns

    Directory of Open Access Journals (Sweden)

    Niczyporuk Aneta

    2016-12-01

    Full Text Available It is hard to provide an unequivocal answer to the question of whether or not thought suppression is effective. Two thought suppression paradigms - the “white bear” paradigm and the think/no-think paradigm - give mixed results. Generally, “white bear” experiments indicate that thought suppression is counterproductive, while experiments in the think/no-think paradigm suggest that it is possible to effectively suppress a thought. There are also alternative methods used to study thought suppression, for instance the directed forgetting paradigm or the Stroop task. In the article, I describe the research methods used to explore thought suppression efficacy. I focus on the “white bear” and the think/no-think paradigms and discuss theories proposed to explain the results obtained. I also consider the internal and external validity of the methods used.

  20. Complexity, Methodology and Method: Crafting a Critical Process of Research

    Science.gov (United States)

    Alhadeff-Jones, Michel

    2013-01-01

    This paper defines a theoretical framework aiming to support the actions and reflections of researchers looking for a "method" in order to critically conceive the complexity of a scientific process of research. First, it starts with a brief overview of the core assumptions framing Morin's "paradigm of complexity" and Le…

  1. Mass Movement Hazards in the Mediterranean; A review on applied techniques and methodologies

    Science.gov (United States)

    Ziade, R.; Abdallah, C.; Baghdadi, N.

    2012-04-01

    Growing populations and the expansion of settlements and lifelines over hazardous areas in the Mediterranean region have greatly increased the impact of Mass Movements (MM) in both industrialized and developing countries. This trend is expected to continue in the next decades due to increased urbanization and development, continued deforestation, and increased regional precipitation in MM-prone areas due to changing climatic patterns. Consequently, over the past few years, the monitoring of MM has acquired great importance for the scientific community as well as for civil society. This article begins with a discussion of MM classification and of the different topographic, geologic, hydrologic and environmental impacting factors. The intrinsic (preconditioning) variables determine the susceptibility to MM, while extrinsic (triggering) factors can induce the probability of MM occurrence. The evolution of slope instability studies is charted from geodetic or observational techniques, through geotechnical field-based origins, to recent higher levels of data acquisition through Remote Sensing (RS) and Geographic Information System (GIS) techniques. Since MM detection and zoning is difficult in remote areas, RS and GIS have enabled regional studies to predominate over site-based ones, as they provide multi-temporal images and hence greatly facilitate MM monitoring. The unusual extent of the spectrum of MM makes it difficult to define a single methodology to establish MM hazard. Since the probability of occurrence of MM is one of the key components in making rational decisions for the management of MM risk, scientists and engineers have developed physical parameters, equations and environmental process models that can be used as assessment tools for management, education, planning and legislative purposes. Assessment of MM is attained through various modeling approaches mainly divided into three main sections: quantitative/Heuristic (1:2.000-1:10.000), semi-quantitative/Statistical (1

  2. Innovative Mixed-Methods Research: Moving beyond Design Technicalities to Epistemological and Methodological Realizations

    Science.gov (United States)

    Riazi, A. Mehdi

    2016-01-01

    Mixed-methods research (MMR), as an inter-discourse (quantitative and qualitative) methodology, can provide applied linguistics researchers the opportunity to draw on and integrate the strengths of the two research methodological approaches in favour of making more rigorous inferences about research problems. In this article, the argument is made…

  3. Valuation of micro and small enterprises using the methodology multicriteria and method of discounted cash flow

    Directory of Open Access Journals (Sweden)

    Marcus Vinicius Andrade de Lima

    2010-01-01

    Full Text Available This paper presents a contribution to the discounted cash flow method using multicriteria decision aid. The proposed methodology incorporates qualitative and subjective variables into the traditional discounted cash flow method used in company valuation. To illustrate the proposed method, a descriptive, exploratory multi-case study was carried out. The intervention took place in Micro and Small Enterprises (MSE) from the chemical, pharmaceutical and tourism sectors. As a result, the appraiser set the price of the business taking into account the result of combining the two methodologies.
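
    A minimal sketch of the underlying discounted cash flow calculation is given below; the cash flows and the 'qualitative premium' standing in for the multicriteria adjustment are invented, and the paper's actual integration of MCDA is richer than this.

```python
# Minimal DCF sketch. The multicriteria step is represented only by a hypothetical
# risk premium added to the discount rate; the paper's actual MCDA integration is richer.
def discounted_cash_flow(cash_flows, discount_rate, terminal_growth=0.0):
    """Present value of explicit cash flows plus a Gordon-growth terminal value."""
    pv = sum(cf / (1 + discount_rate) ** t for t, cf in enumerate(cash_flows, start=1))
    terminal = cash_flows[-1] * (1 + terminal_growth) / (discount_rate - terminal_growth)
    pv += terminal / (1 + discount_rate) ** len(cash_flows)
    return pv

base_rate = 0.14                         # cost of capital for the small enterprise
qualitative_premium = 0.03               # hypothetical premium from the multicriteria evaluation
cash_flows = [120.0, 135.0, 150.0, 160.0, 170.0]   # projected free cash flows (thousands)

value = discounted_cash_flow(cash_flows, base_rate + qualitative_premium, terminal_growth=0.02)
print(f"Estimated enterprise value: {value:,.1f} thousand")
```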

  4. Methodological proposal for environmental impact evaluation since different specific methods

    International Nuclear Information System (INIS)

    Leon Pelaez, Juan Diego; Lopera Arango Gabriel Jaime

    1999-01-01

    Some conceptual and practical elements related to environmental impact evaluation are described in relation to the preparation of the technical reports (environmental impact studies and environmental management plans) to be presented to environmental authorities in order to obtain the environmental permits for development projects. In the first part of the document, the main regulatory aspects that support environmental impact studies in Colombia are summarized. We then propose a diagram for approaching and carrying out the environmental impact evaluation, which begins with the description of the project and of the environmental conditions in its area, proceeds to identify the impacts through a matrix method, and continues with their quantitative evaluation, for which we propose the use of the method developed by Arboleda (1994). We also propose to rate the activities of the project and the components of the environment according to their relative importance, by means of a method here called agglomerate evaluation, which allows the most impacting activities and the most impacted components to be identified. Lastly, some models for the elaboration and presentation of the environmental management plans, the follow-up programs and the environmental supervision programs are presented

  5. Methods and techniques of nuclear in-core fuel management

    International Nuclear Information System (INIS)

    Jong, A.J. de.

    1992-04-01

    Review of methods of nuclear in-core fuel management (the minimal critical mass problem, minimal power peaking) and calculational techniques: reactorphysical calculations (point reactivity models, continuous refueling, empirical methods, depletion perturbation theory, nodal computer programs); optimization techniques (stochastic search, linear programming, heuristic parameter optimization). (orig./HP)

  6. The Sine Method: An Alternative Height Measurement Technique

    Science.gov (United States)

    Don C. Bragg; Lee E. Frelich; Robert T. Leverett; Will Blozan; Dale J. Luthringer

    2011-01-01

    Height is one of the most important dimensions of trees, but few observers are fully aware of the consequences of the misapplication of conventional height measurement techniques. A new approach, the sine method, can improve height measurement by being less sensitive to the requirements of conventional techniques (similar triangles and the tangent method). We studied...
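
    A minimal sketch of the two calculations contrasted in the abstract, assuming a laser rangefinder for straight-line distances and a clinometer for angles, is given below; the example numbers are invented.

```python
# Minimal sketch of the two height calculations, assuming a laser rangefinder
# (straight-line distances) and a clinometer (angles above/below horizontal).
# Example numbers are invented.
import math

angle_top = math.radians(38.0)    # angle from eye level up to the tree top
angle_base = math.radians(6.0)    # angle from eye level down to the tree base
dist_top = 41.0                   # straight-line distance to the top (m), laser
dist_base = 30.2                  # straight-line distance to the base (m), laser
horiz_dist = 30.0                 # horizontal distance to the trunk (m)

# Tangent method: assumes the top is vertically above the measured base point.
h_tangent = horiz_dist * (math.tan(angle_top) + math.tan(angle_base))

# Sine method: uses direct distances to top and base, so a displaced or leaning
# top does not bias the result the way it does with the tangent method.
h_sine = dist_top * math.sin(angle_top) + dist_base * math.sin(angle_base)

print(f"tangent method: {h_tangent:.1f} m   sine method: {h_sine:.1f} m")
```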

  7. Methods and methodology of sexual dysfunctions with males. Chapter 5

    International Nuclear Information System (INIS)

    2000-01-01

    Examination of patients (liquidators of the Chernobyl accident) was carried out in the Republican Hospital for Great War Invalids. The sexual function of the patients was assessed with the help of G.V. Vasilenko's questionnaire 'Men's sexual function'. For the assessment of sexual dysfunctions, a scale for the qualitative estimation of the level of potency dysfunction was used. To reveal vegetative nervous system dysfunction in the examined patients, the Ashner effect was studied. The functional status of the pituitary gland - gonad system was studied through the content of hormones in blood (follicle stimulating and luteinizing hormones). The results of the investigations of both the sexual function and the hormone levels in blood were processed by the method of variation statistics. To determine the influence of each factor, among all acting factors, on sexual function, a multifactorial dispersion analysis was carried out

  8. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More than Qualitative Methods

    Science.gov (United States)

    Bowleg, Lisa

    2017-01-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with…

  9. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-01-01

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform’s mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument’s working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need to use a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform. PMID:27869722

  10. Virtual Distances Methodology as Verification Technique for AACMMs with a Capacitive Sensor Based Indexed Metrology Platform.

    Science.gov (United States)

    Acero, Raquel; Santolaria, Jorge; Brau, Agustin; Pueo, Marcos

    2016-11-18

    This paper presents a new verification procedure for articulated arm coordinate measuring machines (AACMMs) together with a capacitive sensor-based indexed metrology platform (IMP) based on the generation of virtual reference distances. The novelty of this procedure lies in the possibility of creating virtual points, virtual gauges and virtual distances through the indexed metrology platform's mathematical model taking as a reference the measurements of a ball bar gauge located in a fixed position of the instrument's working volume. The measurements are carried out with the AACMM assembled on the IMP from the six rotating positions of the platform. In this way, an unlimited number and types of reference distances could be created without the need to use a physical gauge, thereby optimizing the testing time, the number of gauge positions and the space needed in the calibration and verification procedures. Four evaluation methods are presented to assess the volumetric performance of the AACMM. The results obtained proved the suitability of the virtual distances methodology as an alternative procedure for verification of AACMMs using the indexed metrology platform.

  11. Micrometeorological Technique for Monitoring of Geological Carbon Capture, Utilization and Storage: Methodology, Workflow and Resources

    Science.gov (United States)

    Burba, G. G.; Madsen, R.; Feese, K.

    2013-12-01

    The eddy covariance (EC) method is a micrometeorological technique for direct high-speed measurements of the transport of gases and energy between land or water surfaces and the atmosphere [1]. This method allows for observations of gas transport at time scales ranging from 20-40 times per second to multiple years, represents gas exchange integrated over a large area, from hundreds of square meters to tens of square kilometres, and corresponds to gas exchange from the entire surface, including canopy, and soil or water layers. Gas fluxes, emission and exchange rates are characterized from single-point in situ measurements using permanent or mobile towers, or moving platforms such as automobiles, helicopters, airplanes, etc. Presently, over 600 eddy covariance stations are in operation in over 120 countries [1]. EC is now recognized as an effective method in regulatory and industrial applications, including CCUS [2-10]. Emerging projects utilize EC to continuously monitor large areas before and after the injections, to locate and quantify leakages where CO2 may escape from the subsurface, to improve storage efficiency, and for other CCUS characterizations [5-10]. Although EC is one of the most direct and defensible micrometeorological techniques for measuring gas emission and transport, and complete automated stations and processing are readily available, the method is mathematically complex and requires careful setup and execution specific to the site and project. With this in mind, step-by-step instructions were created in [1] to introduce a novice to the EC method, and to assist in further understanding of the method through more advanced references. In this presentation we provide brief highlights of the eddy covariance method, its application to geological carbon capture, utilization and storage, key requirements, instrumentation and software, and review educational resources particularly useful for carbon sequestration research. References: [1] Burba G. Eddy Covariance Method
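
    The core of the EC computation is the mean covariance of the fluctuations of vertical wind speed and gas concentration over an averaging period; the sketch below shows only this step on synthetic data and omits the coordinate rotation, despiking and density corrections required in practice.

```python
# Minimal sketch of the core eddy covariance computation: the flux is the mean
# covariance of the fluctuations of vertical wind speed (w) and gas concentration (c)
# over an averaging period. Real EC processing adds coordinate rotation, despiking,
# density (WPL) corrections, etc.; synthetic data are used here.
import numpy as np

rng = np.random.default_rng(1)
n = 20 * 60 * 30                       # 30 minutes of 20 Hz data
w = rng.normal(0.0, 0.3, n)            # vertical wind speed (m s^-1)
c = 400.0 + 5.0 * w + rng.normal(0.0, 2.0, n)   # gas concentration, correlated with w (illustrative)

w_prime = w - w.mean()                 # Reynolds decomposition: fluctuations about the mean
c_prime = c - c.mean()
flux = np.mean(w_prime * c_prime)      # kinematic flux for the averaging period

print(f"flux over the 30-min period: {flux:.3f} (units of w * c)")
```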

  12. Ridge Preservation with Modified “Socket-Shield” Technique: A Methodological Case Series

    Directory of Open Access Journals (Sweden)

    Markus Glocker

    2014-01-01

    Full Text Available After tooth extraction, the alveolar bone undergoes a remodeling process, which leads to horizontal and vertical bone loss. These resorption processes complicate dental rehabilitation, particularly in connection with implants. Various methods of guided bone regeneration (GBR have been described to retain the original dimension of the bone after extraction. Most procedures use filler materials and membranes to support the buccal plate and soft tissue, to stabilize the coagulum and to prevent epithelial ingrowth. It has also been suggested that resorption of the buccal bundle bone can be avoided by leaving a buccal root segment (socket shield technique in place, because the biological integrity of the buccal periodontium (bundle bone remains untouched. This method has also been described in connection with immediate implant placement. The present case report describes three consecutive cases in which a modified method was applied as part of a delayed implantation. The latter was carried out after six months, and during re-entry the new bone formation in the alveolar bone and the residual ridge was clinically evaluated as proof of principle. It was demonstrated that the bone was clinically preserved with this method. Possibilities and limitations are discussed and directions for future research are disclosed.

  13. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  14. Techniques involving extreme environment, nondestructive techniques, computer methods in metals research, and data analysis

    International Nuclear Information System (INIS)

    Bunshah, R.F.

    1976-01-01

    A number of different techniques which range over several different aspects of materials research are covered in this volume. They are concerned with property evaluation at 4 K and below, surface characterization, coating techniques, techniques for the fabrication of composite materials, computer methods, data evaluation and analysis, statistical design of experiments and non-destructive test techniques. Topics covered in this part include internal friction measurements; nondestructive testing techniques; statistical design of experiments and regression analysis in metallurgical research; and measurement of surfaces of engineering materials

  15. Prosopography of social and political groups historically located: method or research technique?

    Directory of Open Access Journals (Sweden)

    Lorena Madruga Monteiro

    2014-06-01

    Full Text Available The prosopographical approach has been questioned in different disciplinary domains with regard to its scientific nature. The debate over whether prosopography is a technique, a research tool, an auxiliary science or a method runs through the scientific arguments of those who are dedicated to explaining the assumptions of prosopographical research. In the social sciences, for example, prosopography is seen not only as an instrument of research, but as a method associated with a theoretical construct for apprehending the social world. The historians who use prosopographic analysis, in turn, oscillate over whether the analysis of collective biography is a method or a research technique. Given this setting, in this article we discuss the prosopographical approach in terms of its different uses. The study presents a literature review, presenting prosopography as a technique of historical research and further as a method of sociological analysis, and then highlights its procedures and methodological limits.

  16. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    Science.gov (United States)

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
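
    As a hedged illustration of the kind of compilation the abstract describes (the patented evaluation engine itself is not reproduced here), the sketch below rolls invented process-step parameters up into simple process-level metrics.

```python
# Hedged sketch of the kind of compilation the abstract describes: roll process-step
# parameters up into process-level metrics. The step data, field names and metrics
# are invented for illustration; the patented evaluation engine is not reproduced here.
process_steps = [
    # name,            cycle time (min), queue time (min), value-adding?
    ("cut blanks",      4.0,              30.0,             True),
    ("deburr",          2.5,              45.0,             True),
    ("inspect",         1.5,              20.0,             False),
    ("assemble",        6.0,              60.0,             True),
]

total_cycle = sum(step[1] for step in process_steps)               # processing time
total_queue = sum(step[2] for step in process_steps)               # waiting time
lead_time = total_cycle + total_queue
value_added = sum(step[1] for step in process_steps if step[3])    # value-adding processing only

print(f"lead time         : {lead_time:.1f} min")
print(f"value-added ratio : {value_added / lead_time:.1%}  (low ratios flag lean improvement potential)")
```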

  17. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    Energy Technology Data Exchange (ETDEWEB)

    Jeffs, S.P., E-mail: s.p.jeffs@swansea.ac.uk [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Lancaster, R.J. [Institute of Structural Materials, Swansea University, Singleton Park SA2 8PP (United Kingdom); Garcia, T.E. [IUTA (University Institute of Industrial Technology of Asturias), University of Oviedo, Edificio Departamental Oeste 7.1.17, Campus Universitario, 33203 Gijón (Spain)

    2015-06-11

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as in the power generation industry and in particular nuclear where large static loads are applied, equally a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results.

  18. Creep lifing methodologies applied to a single crystal superalloy by use of small scale test techniques

    International Nuclear Information System (INIS)

    Jeffs, S.P.; Lancaster, R.J.; Garcia, T.E.

    2015-01-01

    In recent years, advances in creep data interpretation have been achieved either by modified Monkman–Grant relationships or through the more contemporary Wilshire equations, which offer the opportunity of predicting long term behaviour extrapolated from short term results. Long term lifing techniques prove extremely useful in creep dominated applications, such as in the power generation industry and in particular nuclear where large static loads are applied, equally a reduction in lead time for new alloy implementation within the industry is critical. The latter requirement brings about the utilisation of the small punch (SP) creep test, a widely recognised approach for obtaining useful mechanical property information from limited material volumes, as is typically the case with novel alloy development and for any in-situ mechanical testing that may be required. The ability to correlate SP creep results with uniaxial data is vital when considering the benefits of the technique. As such an equation has been developed, known as the k_SP method, which has been proven to be an effective tool across several material systems. The current work now explores the application of the aforementioned empirical approaches to correlate small punch creep data obtained on a single crystal superalloy over a range of elevated temperatures. Finite element modelling through ABAQUS software based on the uniaxial creep data has also been implemented to characterise the SP deformation and help corroborate the experimental results
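
    As a hedged illustration of the Wilshire-type prediction mentioned in the two records above, the sketch below solves a Wilshire-form equation for rupture time; the constants are invented, and neither the papers' fitted parameters nor the k_SP small-punch correlation are reproduced.

```python
# Hedged sketch of a Wilshire-equation rupture-life prediction of the form
# sigma/sigma_TS = exp(-k1 * [t_f * exp(-Q*/(R*T))]**u), solved for t_f.
# Parameter values (k1, u, Q*, sigma_TS) are invented; the papers' fitted constants
# and the k_SP small-punch correlation are not reproduced here.
import math

R = 8.314            # gas constant, J mol^-1 K^-1
k1, u = 45.0, 0.18   # hypothetical Wilshire constants for the alloy
Q_star = 300e3       # hypothetical apparent activation energy, J mol^-1
sigma_TS = 1100.0    # tensile strength at temperature, MPa

def rupture_time(sigma, T):
    """Predicted time to rupture (hours) at stress sigma (MPa) and temperature T (K)."""
    return (math.log(sigma_TS / sigma) / k1) ** (1.0 / u) * math.exp(Q_star / (R * T))

for sigma in (700.0, 500.0, 300.0):
    print(f"sigma = {sigma:5.0f} MPa  ->  t_f ≈ {rupture_time(sigma, T=1223.0):.2e} h")
```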

  19. Practical implementation of a methodology for digital images authentication using forensics techniques

    OpenAIRE

    Francisco Rodríguez-Santos; Guillermo Delgado-Gutierréz; Leonardo Palacios-Luengas; Rubén Vázquez Medina

    2015-01-01

    This work presents a forensics analysis methodology implemented to detect modifications in JPEG digital images by analyzing the image’s metadata, thumbnail, camera traces and compression signatures. Best practices related to digital evidence and forensics analysis are considered to determine if the technical attributes and the qualities of an image are consistent with each other. This methodology is defined according to the recommendations of the Good Practice Guide for Computer-Based Elect...
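
    A hedged sketch of two of the checks described (not the paper's implementation) is given below: reading EXIF metadata and hashing the JPEG quantization tables as a crude compression signature, using Pillow; the file name is hypothetical.

```python
# Hedged sketch (not the paper's implementation) of two of the checks described:
# reading EXIF metadata and hashing the JPEG quantization tables as a crude
# "compression signature" that can be compared against known camera/software profiles.
# Requires Pillow; the file path is hypothetical.
import hashlib
from PIL import Image
from PIL.ExifTags import TAGS

def inspect_jpeg(path):
    img = Image.open(path)

    # EXIF tags: inconsistencies between Make/Model/Software/DateTime can hint at editing.
    exif = img.getexif()
    for tag_id, value in exif.items():
        print(f"{TAGS.get(tag_id, tag_id)}: {value}")

    # Quantization tables (JPEG only): serialize and hash them as a compression signature.
    tables = getattr(img, "quantization", None)
    if tables:
        raw = repr(sorted(tables.items())).encode()
        print("compression signature:", hashlib.sha256(raw).hexdigest()[:16])

inspect_jpeg("evidence_photo.jpg")   # hypothetical file
```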

  20. Comparing photo modeling methodologies and techniques: the instance of the Great Temple of Abu Simbel

    Directory of Open Access Journals (Sweden)

    Sergio Di Tondo

    2013-10-01

    Full Text Available Fifty years after the salvage of the Abu Simbel temples, it has been possible to experiment with contemporary photo-modeling tools, starting from the original data of the photogrammetric survey carried out in the 1950s. This prompted a reflection on "Image Based" methods and modeling techniques, comparing rigorous 3D digital photogrammetry with the latest Structure From Motion (SFM) systems. The topographic survey data, the original photogrammetric stereo pairs, the point coordinates and their representation as contour lines made it possible to obtain a model of the monument in its configuration before the temples were moved. Since a direct survey could not be carried out, tourist photographs were used to create SFM models for geometric comparisons.

  1. Methodological Reporting in Qualitative, Quantitative, and Mixed Methods Health Services Research Articles

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-01-01

    Objectives Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. Data Sources All empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. Study Design All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Principal Findings Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ2(1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ2(1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Conclusion Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the
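
    The kind of proportion comparison reported above can be reproduced in outline with a chi-square test on a 2x2 contingency table; the counts below are invented and are not the study's data.

```python
# Minimal sketch of the kind of comparison reported: a chi-square test on the
# proportion of expected methodological components actually reported in two article
# types. The counts below are invented, not the study's data.
from scipy.stats import chi2_contingency

#                 reported, not reported
mixed_methods  = [ 90,       320]     # components present vs. absent in mixed methods papers
quantitative   = [470,       530]     # components present vs. absent in quantitative papers

chi2, p, dof, expected = chi2_contingency([mixed_methods, quantitative])
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")
```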

  2. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae) for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka

    OpenAIRE

    Nayana Gunathilaka; Tharaka Ranathunge; Lahiru Udayanga; Wimaladharma Abeyewickreme

    2017-01-01

    Introduction Selection of the artificial membrane feeding technique and blood meal source have been recognized as key considerations in mass rearing of vectors. Methodology Artificial membrane feeding techniques, namely, the glass plate, metal plate, and Hemotek membrane feeding methods, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. Significance in the variations among blood feeding was investigated by one...

  3. EPA Method 245.2: Mercury (Automated Cold Vapor Technique)

    Science.gov (United States)

    Method 245.2 describes procedures for preparation and analysis of drinking water samples for analysis of mercury using acid digestion and cold vapor atomic absorption. Samples are prepared using an acid digestion technique.

  4. Methodology or method? A critical review of qualitative case study reports

    Directory of Open Access Journals (Sweden)

    Nerida Hyett

    2014-05-01

    Full Text Available Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.

  5. Methodology or method? A critical review of qualitative case study reports

    Science.gov (United States)

    Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia

    2014-01-01

    Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980

  6. Approaches to qualitative research in mathematics education examples of methodology and methods

    CERN Document Server

    Bikner-Ahsbahs, Angelika; Presmeg, Norma

    2014-01-01

    This volume documents a range of qualitative research approaches that emerged within mathematics education over the last three decades, whilst at the same time revealing their underlying methodologies. Continuing the discussion as begun in the two 2003 ZDM issues dedicated to qualitative empirical methods, this book presents a state-of-the-art overview of qualitative research in mathematics education and beyond. The structure of the book allows the reader to use it as an actual guide for the selection of an appropriate methodology, on the basis of both theoretical depth and practical implications. The methods and examples illustrate how different methodologies come to life when applied to a specific question in a specific context. Many of the methodologies described are also applicable outside mathematics education, but the examples provided are chosen so as to situate the approach in a mathematical context.

  7. DEMATEL Technique: A Systematic Review of the State-of-the-Art Literature on Methodologies and Applications

    Directory of Open Access Journals (Sweden)

    Sheng-Li Si

    2018-01-01

    Full Text Available Decision making trial and evaluation laboratory (DEMATEL) is considered an effective method for the identification of cause-effect chain components of a complex system. It deals with evaluating interdependent relationships among factors and finding the critical ones through a visual structural model. Over the recent decade, a large number of studies have been done on the application of DEMATEL and many different variants have been put forward in the literature. The objective of this study is to review systematically the methodologies and applications of the DEMATEL technique. We reviewed a total of 346 papers published from 2006 to 2016 in international journals. According to the approaches used, these publications are grouped into five categories: classical DEMATEL, fuzzy DEMATEL, grey DEMATEL, analytical network process (ANP) DEMATEL, and other DEMATEL. All papers with respect to each category are summarized and analyzed, pointing out their implementing procedures, real applications, and crucial findings. This systematic and comprehensive review holds valuable insights for researchers and practitioners into using the DEMATEL in terms of indicating current research trends and potential directions for further research.
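
    For orientation, a minimal sketch of the classical DEMATEL steps on an invented four-factor direct-influence matrix is shown below; it is not taken from any of the reviewed papers.

```python
# Minimal sketch of the classical DEMATEL steps on an invented 4-factor
# direct-influence matrix (0 = no influence ... 4 = very high influence).
import numpy as np

D = np.array([[0, 3, 2, 1],
              [1, 0, 3, 2],
              [2, 1, 0, 3],
              [1, 2, 1, 0]], dtype=float)

# Step 1: normalise by the largest row/column sum.
s = max(D.sum(axis=1).max(), D.sum(axis=0).max())
N = D / s

# Step 2: total relation matrix T = N (I - N)^-1.
I = np.eye(len(D))
T = N @ np.linalg.inv(I - N)

# Step 3: prominence (r + c) and relation (r - c) identify central and causal factors.
r = T.sum(axis=1)      # total influence dispatched by each factor
c = T.sum(axis=0)      # total influence received by each factor
for i in range(len(D)):
    role = "cause" if r[i] - c[i] > 0 else "effect"
    print(f"factor {i+1}: prominence = {r[i] + c[i]:.3f}, relation = {r[i] - c[i]:.3f} ({role})")
```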

  8. Leakage localisation method in a water distribution system based on sensitivity matrix: methodology and real test

    OpenAIRE

    Pascual Pañach, Josep

    2010-01-01

    Leaks are present in all water distribution systems. In this paper a method for leakage detection and localisation is presented. It uses pressure measurements and simulation models. Leakage localisation methodology is based on pressure sensitivity matrix. Sensitivity is normalised and binarised using a common threshold for all nodes, so a signatures matrix is obtained. A pressure sensor optimal distribution methodology is developed too, but it is not used in the real test. To validate this...
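
    A minimal sketch of the signature-matching idea described in the abstract is given below; the sensitivity matrix, threshold and residuals are invented, and the paper's method additionally relies on simulation models and an optimal sensor placement step.

```python
# Minimal sketch of the signature-matching idea described: binarise the pressure
# sensitivity matrix (sensors x candidate leak nodes) with a common threshold and
# pick the node whose binary signature best matches the observed residual pattern.
# The matrix and residuals are invented.
import numpy as np

# Sensitivity of each pressure sensor (rows) to a leak at each node (columns).
S = np.array([[0.9, 0.2, 0.1],
              [0.7, 0.8, 0.2],
              [0.1, 0.6, 0.9],
              [0.2, 0.1, 0.8]])

threshold = 0.5
signatures = (S / S.max(axis=0) >= threshold).astype(int)   # normalised, then binarised

residuals = np.array([0.04, 0.35, 0.42, 0.30])              # observed pressure residuals
observed = (residuals / residuals.max() >= threshold).astype(int)

# Candidate leak node = column whose signature agrees with the observed pattern most often.
matches = (signatures == observed[:, None]).sum(axis=0)
print("most likely leak node:", int(np.argmax(matches)) + 1)
```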

  9. Reported credibility techniques in higher education evaluation studies that use qualitative methods: A research synthesis.

    Science.gov (United States)

    Liao, Hongjing; Hitchcock, John

    2018-06-01

    This synthesis study examined the reported use of credibility techniques in higher education evaluation articles that use qualitative methods. The sample included 118 articles published in six leading higher education evaluation journals from 2003 to 2012. Mixed methods approaches were used to identify key credibility techniques reported across the articles, document the frequency of these techniques, and describe their use and properties. Two broad sets of techniques were of interest: primary design techniques (i.e., basic), such as sampling/participant recruitment strategies, data collection methods, analytic details, and additional qualitative credibility techniques (e.g., member checking, negative case analyses, peer debriefing). The majority of evaluation articles reported use of primary techniques although there was wide variation in the amount of supporting detail; most of the articles did not describe the use of additional credibility techniques. This suggests that editors of evaluation journals should encourage the reporting of qualitative design details and authors should develop strategies yielding fuller methodological description. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. Alternative containment integrity test methods, an overview of possible techniques

    International Nuclear Information System (INIS)

    Spletzer, B.L.

    1986-01-01

    A study is being conducted to develop and analyze alternative methods for testing of containment integrity. The study is focused on techniques for continuously monitoring containment integrity to provide rapid detection of existing leaks, thus providing greater certainty of the integrity of the containment at any time. The study is also intended to develop techniques applicable to the currently required Type A integrated leakage rate tests. A brief discussion of the range of alternative methods currently being considered is presented. The methods include applicability to all major containment types, operating and shutdown plant conditions, and quantitative and qualitative leakage measurements. The techniques are analyzed in accordance with the current state of knowledge of each method. The bulk of the techniques discussed are in the conceptual stage, have not been tested in actual plant conditions, and are presented here as a possible future direction for evaluating containment integrity. Of the methods considered, no single method provides optimum performance for all containment types. Several methods are limited in the types of containment for which they are applicable. The results of the study to date indicate that techniques for continuous monitoring of containment integrity exist for many plants and may be implemented at modest cost

  11. La interpretacion consecutiva: metodologia y tecnicas (Consecutive Interpretation: Methodology and Techniques).

    Science.gov (United States)

    Drallny, Ines

    1987-01-01

    Describes the purpose and appropriate methodology for various levels of interpreter training, for both consecutive and simultaneous interpretation. The importance of relating the intent of the text to the explicit language forms through which that intent is realized is discussed, and appropriate criteria for evaluation of student interpreters are…

  12. Thresholding: A Pixel-Level Image Processing Methodology Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for an Optical Character Recognition system for the Brahmi script. The image preprocessing technique covered in this paper is thresholding. We also analyze the results obtained by the pixel-level processing algorithms.
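
    As an illustration of the thresholding step discussed, the sketch below implements Otsu's global threshold with NumPy on a synthetic grayscale image; a real OCR pipeline would operate on scanned inscriptions instead.

```python
# Minimal sketch of pixel-level thresholding with Otsu's method, implemented with
# NumPy on a synthetic grayscale image (a real OCR pipeline would load a scanned
# inscription instead).
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 8-bit image: dark "glyph" strokes on a brighter, noisy background.
img = rng.normal(180, 15, (64, 64))
img[20:44, 10:54] -= 110
img = np.clip(img, 0, 255).astype(np.uint8)

def otsu_threshold(image):
    """Return the threshold maximising between-class variance of the histogram."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_t, best_var = t, var_between
    return best_t

t = otsu_threshold(img)
binary = img < t          # foreground (dark strokes) marked True
print("Otsu threshold:", t, "foreground pixels:", int(binary.sum()))
```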

  13. An effective vacuum assisted extraction method for the optimization of labdane diterpenoids from Andrographis paniculata by response surface methodology.

    Science.gov (United States)

    Wang, Ya-Qi; Wu, Zhen-Feng; Ke, Gang; Yang, Ming

    2014-12-31

    An effective vacuum assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycle and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkably higher extraction efficiency, which indicated that a certain degree of vacuum gave the solvent better penetration into the pores and between the matrix particles, and enhanced the process of mass transfer. The present results demonstrated that VAE is an efficient, simple and fast method for extracting bioactive components from A. paniculata, which shows great potential for becoming an alternative technique for industrial scale-up applications.

  14. An Effective Vacuum Assisted Extraction Method for the Optimization of Labdane Diterpenoids from Andrographis paniculata by Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Ya-Qi Wang

    2014-12-01

    Full Text Available An effective vacuum assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycle and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkably higher extraction efficiency, which indicated that a certain degree of vacuum gave the solvent better penetration into the pores and between the matrix particles, and enhanced the process of mass transfer. The present results demonstrated that VAE is an efficient, simple and fast method for extracting bioactive components from A. paniculata, which shows great potential for becoming an alternative technique for industrial scale-up applications.

  15. Preferences of Teaching Methods and Techniques in Mathematics with Reasons

    Science.gov (United States)

    Ünal, Menderes

    2017-01-01

    In this descriptive study, the goal was to determine teachers' preferred pedagogical methods and techniques in mathematics. Qualitative research methods were employed, primarily case studies. 40 teachers were randomly chosen from various secondary schools in Kirsehir during the 2015-2016 educational terms, and data were gathered via…

  16. A Methodology for Conducting Integrative Mixed Methods Research and Data Analyses

    Science.gov (United States)

    Castro, Felipe González; Kellison, Joshua G.; Boyd, Stephen J.; Kopak, Albert

    2011-01-01

    Mixed methods research has gained visibility within the last few years, although limitations persist regarding the scientific caliber of certain mixed methods research designs and methods. The need exists for rigorous mixed methods designs that integrate various data analytic procedures for a seamless transfer of evidence across qualitative and quantitative modalities. Such designs can offer the strength of confirmatory results drawn from quantitative multivariate analyses, along with “deep structure” explanatory descriptions as drawn from qualitative analyses. This article presents evidence generated from over a decade of pilot research in developing an integrative mixed methods methodology. It presents a conceptual framework and methodological and data analytic procedures for conducting mixed methods research studies, and it also presents illustrative examples from the authors' ongoing integrative mixed methods research studies. PMID:22167325

  17. SCIENTIFIC METHODOLOGY FOR THE APPLIED SOCIAL SCIENCES: CRITICAL ANALYSES ABOUT RESEARCH METHODS, TYPOLOGIES AND CONTRIBUTIONS FROM MARX, WEBER AND DURKHEIM

    Directory of Open Access Journals (Sweden)

    Mauricio Corrêa da Silva

    2015-06-01

    Full Text Available This study aims to discuss the importance of the scientific method for conducting and disseminating research in the applied social sciences, to review research typologies, and to highlight the contributions of Marx, Weber and Durkheim to scientific methodology. To reach this objective, we conducted a review of the literature on the term research, the scientific method, research techniques and scientific methodologies. The results of the investigation revealed that it is fundamental that the academic investigator use a scientific method to conduct and disseminate his/her academic work in the applied social sciences, as in the biochemical or computer sciences and in the indicated literature. Regarding the contributions to scientific methodology, Marx offered dialectical analysis, a striking, explicative analysis of social phenomena, and the need to understand phenomena as historical and concrete totalities; Weber, the distinction between “facts” and “value judgments” to give objectivity to the social sciences; and Durkheim, the need to conceptualize the object of study very well, to reject sensible data, and to be imbued with the spirit of discovery and of being surprised by the results.

  18. Construction techniques and management methods for BWR plants

    International Nuclear Information System (INIS)

    Shimizu, Yohji; Tateishi, Mizuo; Hayashi, Yoshishige

    1989-01-01

    Toshiba is constantly striving for safer and more efficient plant construction to realize high-quality BWR plants within a short construction period. To achieve these aims, Toshiba has developed and improved a large number of construction techniques and construction management methods. In the area of installation, various techniques have been applied such as the modularization of piping and equipment, shop installation of reactor internals, etc. Further, installation management has been upgraded by the use of pre-installation review programs, the development of installation control systems, etc. For commissioning, improvements in commissioning management have been achieved through the use of computer systems, and testing methods have also been upgraded by the development of computer systems for the recording and analysis of test data and the automatic adjustment of controllers in the main control system of the BWR. This paper outlines these construction techniques and management methods. (author)

  19. Deuterium dilution technique for body composition assessment: resolving methodological issues in children with moderate acute malnutrition

    DEFF Research Database (Denmark)

    Fabiansen, Christian; Yaméogo, Charles W; Devi, Sarita

    2017-01-01

    Childhood malnutrition is highly prevalent and associated with high mortality risk. In observational and interventional studies among malnourished children, body composition is increasingly recognised as a key outcome. The deuterium dilution technique has generated high-quality data on body...... composition in studies of infants and young children in several settings, but its feasibility and accuracy in children suffering from moderate acute malnutrition requires further study. Prior to a large nutritional intervention trial among children with moderate acute malnutrition, we conducted pilot work...... quality when using the deuterium dilution technique in malnutrition studies in field conditions, and may encourage a wider use of isotope techniques....

  20. Assessment of change in knowledge about research methods among delegates attending research methodology workshop.

    Science.gov (United States)

    Shrivastava, Manisha; Shah, Nehal; Navaid, Seema

    2018-01-01

    In an era of evidence based medicine, research is an essential part of the medical profession, whether clinical or academic. A research methodology workshop intends to help participants who are newer to the research field or who are already doing empirical research. The present study was conducted to assess the changes in knowledge of the participants of a research methodology workshop through a structured questionnaire. With administrative and ethical approval, a four-day research methodology workshop was planned. The participants were given a structured questionnaire (pre-test) containing 20 multiple choice questions (Q1-Q20) related to the topics to be covered in the research methodology workshop before its commencement, and a similar post-test questionnaire after the completion of the workshop. The mean values of the pre- and post-test scores were calculated, and the results were analyzed and compared. Out of the total 153 delegates, 45 (29%) were males and 108 (71%) were females. 92 (60%) participants consented to fill the pre-test questionnaire and 68 (44%) filled the post-test questionnaire. The mean pre-test and post-test scores at 95% confidence interval were 7.62 (SD ±3.220) and 9.66 (SD ±2.477), respectively. The differences were found to be significant using the paired sample t-test (p < 0.05), indicating a gain in knowledge among participants of the research methodology workshop. Participatory research methodology workshops are good methods of imparting knowledge; the long term effects need to be evaluated.
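
    A minimal sketch of the pre/post comparison described above, using SciPy's paired-sample t-test; the scores are invented for illustration and are not the workshop data.

        import numpy as np
        from scipy import stats

        # Hypothetical pre- and post-workshop scores (out of 20) for the same ten participants.
        pre = np.array([6, 8, 7, 10, 5, 9, 8, 7, 6, 11])
        post = np.array([9, 10, 9, 12, 7, 11, 10, 9, 8, 13])

        t_stat, p_value = stats.ttest_rel(pre, post)
        print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")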

  1. Deuterium dilution technique for body composition assessment: resolving methodological issues in children with moderate acute malnutrition.

    Science.gov (United States)

    Fabiansen, Christian; Yaméogo, Charles W; Devi, Sarita; Friis, Henrik; Kurpad, Anura; Wells, Jonathan C

    2017-08-01

    Childhood malnutrition is highly prevalent and associated with high mortality risk. In observational and interventional studies among malnourished children, body composition is increasingly recognised as a key outcome. The deuterium dilution technique has generated high-quality data on body composition in studies of infants and young children in several settings, but its feasibility and accuracy in children suffering from moderate acute malnutrition requires further study. Prior to a large nutritional intervention trial among children with moderate acute malnutrition, we conducted pilot work to develop and adapt the deuterium dilution technique. We refined procedures for administration of isotope doses and collection of saliva. Furthermore, we established that equilibration time in local context is 3 h. These findings and the resulting standard operating procedures are important to improve data quality when using the deuterium dilution technique in malnutrition studies in field conditions, and may encourage a wider use of isotope techniques.
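
    As a sketch of the dilution principle behind the technique, the toy calculation below converts an isotope dose and a post-equilibration saliva enrichment into total body water; the numbers, and the roughly 4% correction for exchange with non-aqueous hydrogen, are illustrative assumptions rather than values from this study.

        # Dilution principle: the tracer distributes through body water, so the
        # deuterium space is dose / enrichment above baseline at the plateau.
        dose_mg = 5000.0                       # deuterium oxide given orally (illustrative)
        plateau_enrichment_mg_per_kg = 520.0   # saliva enrichment above baseline (illustrative)

        deuterium_space_kg = dose_mg / plateau_enrichment_mg_per_kg
        total_body_water_kg = deuterium_space_kg / 1.041   # approximate exchange correction
        print(f"total body water ~ {total_body_water_kg:.2f} kg")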

  2. Methodological reporting in qualitative, quantitative, and mixed methods health services research articles.

    Science.gov (United States)

    Wisdom, Jennifer P; Cavaleri, Mary A; Onwuegbuzie, Anthony J; Green, Carla A

    2012-04-01

    Methodologically sound mixed methods research can improve our understanding of health services by providing a more comprehensive picture of health services than either method can alone. This study describes the frequency of mixed methods in published health services research and compares the presence of methodological components indicative of rigorous approaches across mixed methods, qualitative, and quantitative articles. The sample comprised all empirical articles (n = 1,651) published between 2003 and 2007 from four top-ranked health services journals. All mixed methods articles (n = 47) and random samples of qualitative and quantitative articles were evaluated to identify reporting of key components indicating rigor for each method, based on accepted standards for evaluating the quality of research reports (e.g., use of p-values in quantitative reports, description of context in qualitative reports, and integration in mixed method reports). We used chi-square tests to evaluate differences between article types for each component. Mixed methods articles comprised 2.85 percent (n = 47) of empirical articles, quantitative articles 90.98 percent (n = 1,502), and qualitative articles 6.18 percent (n = 102). There was a statistically significant difference (χ(2) (1) = 12.20, p = .0005, Cramer's V = 0.09, odds ratio = 1.49 [95% confidence interval = 1.27, 1.74]) in the proportion of quantitative methodological components present in mixed methods compared to quantitative papers (21.94 versus 47.07 percent, respectively) but no statistically significant difference (χ(2) (1) = 0.02, p = .89, Cramer's V = 0.01) in the proportion of qualitative methodological components in mixed methods compared to qualitative papers (21.34 versus 25.47 percent, respectively). Few published health services research articles use mixed methods. The frequency of key methodological components is variable. Suggestions are provided to increase the transparency of mixed methods studies and
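
    A minimal sketch of the kind of chi-square comparison reported above, applied to an illustrative 2x2 table of methodological components present versus absent in two article types; the counts are hypothetical, not the study's data.

        import numpy as np
        from scipy.stats import chi2_contingency

        # Rows: component present / absent; columns: mixed methods / quantitative articles.
        table = np.array([[120, 430],
                          [427, 483]])

        chi2, p, dof, expected = chi2_contingency(table, correction=False)
        print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}")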

  3. Development of methodology for certification of Type B shipping containers using analytical and testing techniques

    International Nuclear Information System (INIS)

    Sharp, R.R.; Varley, D.T.

    1993-01-01

    The use of multidisciplinary teams to develop Type B shipping containers improves the quality and reliability of these reusable packagings. Including the people involved in all aspects of the design, certification and use of the package leads to more innovative, user-friendly containers. Concurrent use of testing and analysis allows engineers to more fully characterize a shipping container's responses to the environments given in the regulations, and provides a strong basis for certification. The combination of the input and output of these efforts should provide a general methodology that designers of Type B radioactive material shipping containers can utilize to optimize and certify their designs. (J.P.N.)

  4. An Empirical Review of Research Methodologies and Methods in Creativity Studies (2003-2012)

    Science.gov (United States)

    Long, Haiying

    2014-01-01

    Based on the data collected from 5 prestigious creativity journals, research methodologies and methods of 612 empirical studies on creativity, published between 2003 and 2012, were reviewed and compared to those in gifted education. Major findings included: (a) Creativity research was predominantly quantitative and psychometrics and experiment…

  5. Methodological issues affecting the study of fish parasites. II. Sampling method affects ectoparasite studies

    Czech Academy of Sciences Publication Activity Database

    Kvach, Yuriy; Ondračková, Markéta; Janáč, Michal; Jurajda, Pavel

    2016-01-01

    Roč. 121, č. 1 (2016), s. 59-66 ISSN 0177-5103 R&D Projects: GA ČR GBP505/12/G112 Institutional support: RVO:68081766 Keywords : Parasite community * Fish sampling method * Methodology * Parasitological examination * Rutilus rutilus Subject RIV: EG - Zoology Impact factor: 1.549, year: 2016

  6. An Annotated Bibliography of the Gestalt Methods, Techniques, and Therapy

    Science.gov (United States)

    Prewitt-Diaz, Joseph O.

    The purpose of this annotated bibliography is to provide the reader with a guide to relevant research in the area of Gestalt therapy, techniques, and methods. The majority of the references are journal articles written within the last 5 years or documents easily obtained through interlibrary loans from local libraries. These references were…

  7. Towards a Critical Health Equity Research Stance: Why Epistemology and Methodology Matter More Than Qualitative Methods.

    Science.gov (United States)

    Bowleg, Lisa

    2017-10-01

    Qualitative methods are not intrinsically progressive. Methods are simply tools to conduct research. Epistemology, the justification of knowledge, shapes methodology and methods, and thus is a vital starting point for a critical health equity research stance, regardless of whether the methods are qualitative, quantitative, or mixed. In line with this premise, I address four themes in this commentary. First, I criticize the ubiquitous and uncritical use of the term health disparities in U.S. public health. Next, I advocate for the increased use of qualitative methodologies-namely, photovoice and critical ethnography-that, pursuant to critical approaches, prioritize dismantling social-structural inequities as a prerequisite to health equity. Thereafter, I discuss epistemological stance and its influence on all aspects of the research process. Finally, I highlight my critical discourse analysis HIV prevention research based on individual interviews and focus groups with Black men, as an example of a critical health equity research approach.

  8. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    Science.gov (United States)

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Through the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  9. An Alternative Analog Circuit Design Methodology Employing Integrated Artificial Intelligence Techniques

    Science.gov (United States)

    Tuttle, Jeffery L.

    In consideration of the computer processing power now available to the designer, an alternative analog circuit design methodology is proposed. Computer memory capacities no longer require the reduction of the transistor operational characteristics to an imprecise formulation. Therefore, it is proposed that transistor modelling be abandoned in favor of fully characterized transistor data libraries. Secondly, availability of the transistor libraries would facilitate an automated selection of the most appropriate device(s) for the circuit being designed. More specifically, a preprocessor computer program to a more sophisticated circuit simulator (e.g. SPICE) is developed to assist the designer in developing the basic circuit topology and the selection of the most appropriate transistor. Once this is achieved, the circuit topology and selected transistor data library would be downloaded to the simulator for full circuit operational characterization and subsequent design modifications. It is recognized that the design process is enhanced by the use of heuristics as applied to iterative design results. Accordingly, an artificial intelligence (AI) interface is developed to assist the designer in applying the preprocessor results. To demonstrate the retrofitability of the AI interface to established programs, the interface is specifically designed to be as non-intrusive to the host code as possible. Implementation of the proposed methodology offers the potential to speed the design process, since the preprocessor both minimizes the required number of simulator runs and provides a higher acceptance potential of the initial and subsequent simulator runs. Secondly, part count reductions may be realizable since the circuit topologies are not as strongly driven by transistor limitations. Thirdly, the predicted results should more closely match actual circuit operations since the inadequacies of the transistor models have been virtually eliminated. Finally, the AI interface

  10. The Stonehenge technique. A method for aligning coherent bremsstrahlung radiators

    International Nuclear Information System (INIS)

    Livingston, Ken

    2009-01-01

    This paper describes a technique for the alignment of crystal radiators used to produce high energy, linearly polarized photons via coherent bremsstrahlung scattering at electron beam facilities. In these experiments the crystal is mounted on a goniometer which is used to adjust its orientation relative to the electron beam. The angles and equations which relate the crystal lattice, goniometer and electron beam direction are presented here, and the method of alignment is illustrated with data taken at MAMI (the Mainz microtron). A practical guide to setting up a coherent bremsstrahlung facility and installing new crystals using this technique is also included.

  11. The Stonehenge technique. A method for aligning coherent bremsstrahlung radiators

    Energy Technology Data Exchange (ETDEWEB)

    Livingston, Ken [Department of Physics and Astronomy, University of Glasgow, Glasgow G12 8QQ (United Kingdom)], E-mail: k.livingston@physics.gla.ac.uk

    2009-05-21

    This paper describes a technique for the alignment of crystal radiators used to produce high energy, linearly polarized photons via coherent bremsstrahlung scattering at electron beam facilities. In these experiments the crystal is mounted on a goniometer which is used to adjust its orientation relative to the electron beam. The angles and equations which relate the crystal lattice, goniometer and electron beam direction are presented here, and the method of alignment is illustrated with data taken at MAMI (the Mainz microtron). A practical guide to setting up a coherent bremsstrahlung facility and installing new crystals using this technique is also included.

  12. The Stonehenge technique. A method for aligning coherent bremsstrahlung radiators

    Science.gov (United States)

    Livingston, Ken

    2009-05-01

    This paper describes a technique for the alignment of crystal radiators used to produce high energy, linearly polarized photons via coherent bremsstrahlung scattering at electron beam facilities. In these experiments the crystal is mounted on a goniometer which is used to adjust its orientation relative to the electron beam. The angles and equations which relate the crystal lattice, goniometer and electron beam direction are presented here, and the method of alignment is illustrated with data taken at MAMI (the Mainz microtron). A practical guide to setting up a coherent bremsstrahlung facility and installing new crystals using this technique is also included.

  13. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    Energy Technology Data Exchange (ETDEWEB)

    Laborda, Francisco, E-mail: flaborda@unizar.es; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-21

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  14. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples

    International Nuclear Information System (INIS)

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T.; Jiménez, María S.; Pérez-Arantegui, Josefina; Castillo, Juan R.

    2016-01-01

    dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. - Highlights: • The challenge to analyze inorganic nanomaterials is described. • Techniques for detection, characterization and quantification of inorganic nanomaterials are presented. • Sample preparation methods for the analysis of nanomaterials in complex samples are presented. • Methodological approaches posed by stakeholders for solving nanometrological problems are discussed.

  15. In Situ Analytical Characterization of Contaminated Sites Using Nuclear Spectrometry Techniques. Review of Methodologies and Measurements

    International Nuclear Information System (INIS)

    2017-01-01

    Past and current human activities can result in the contamination of sites by radionuclides and heavy metals. The sources of contamination are various. The most important sources for radionuclide release include global fallout from nuclear testing, nuclear and radiological accidents, waste production from nuclear facilities, and activities involving naturally occurring radioactive material (NORM). Contamination of the environment by heavy metals mainly originates from industrial applications and mineralogical background concentration. Contamination of sites by radionuclides and heavy metals can present a risk to people and the environment. Therefore, the estimation of the contamination level and the identification of the source constitute important information for the national authorities with the responsibility to protect people and the environment from adverse health effects. In situ analytical techniques based on nuclear spectrometry are important tools for the characterization of contaminated sites. Much progress has been made in the design and implementation of portable systems for efficient and effective monitoring of radioactivity and heavy metals in the environment directly on-site. Accordingly, the IAEA organized a Technical Meeting to review the current status and trends of various applications of in situ nuclear spectrometry techniques for analytical characterization of contaminated sites and to support Member States in their national environmental monitoring programmes applying portable instrumentation. This publication represents a comprehensive review of the in situ gamma ray spectrometry and field portable X ray fluorescence analysis techniques for the characterization of contaminated sites. It includes papers on the use of these techniques, which provide useful background information for conducting similar studies, in the following Member States: Argentina, Australia, Brazil, Czech Republic, Egypt, France, Greece, Hungary, Italy, Lithuania

  16. Methodological Challenges in Sustainability Science: A Call for Method Plurality, Procedural Rigor and Longitudinal Research

    Directory of Open Access Journals (Sweden)

    Henrik von Wehrden

    2017-02-01

    Full Text Available Sustainability science encompasses a unique field that is defined through its purpose, the problem it addresses, and its solution-oriented agenda. However, this orientation creates significant methodological challenges. In this discussion paper, we conceptualize sustainability problems as wicked problems to tease out the key challenges that sustainability science is facing if scientists intend to deliver on its solution-oriented agenda. Building on the available literature, we discuss three aspects that demand increased attention for advancing sustainability science: (1) methods with higher diversity and complementarity are needed to increase the chance of deriving solutions to the unique aspects of wicked problems; for instance, mixed methods approaches are potentially better suited to allow for an approximation of solutions, since they cover wider arrays of knowledge; (2) methodologies capable of dealing with wicked problems demand strict procedural and ethical guidelines, in order to ensure their integration potential; for example, learning from solution implementation in different contexts requires increased comparability between research approaches while carefully addressing issues of legitimacy and credibility; and (3) approaches are needed that allow for longitudinal research, since wicked problems are continuous and solutions can only be diagnosed in retrospect; for example, complex dynamics of wicked problems play out across temporal patterns that are not necessarily aligned with the common timeframe of participatory sustainability research. Taken together, we call for plurality in methodologies, emphasizing procedural rigor and the necessity of continuous research to effectively address wicked problems as well as methodological challenges in sustainability science.

  17. Developing a methodology to assess the impact of research grant funding: a mixed methods approach.

    Science.gov (United States)

    Bloch, Carter; Sørensen, Mads P; Graversen, Ebbe K; Schneider, Jesper W; Schmidt, Evanthia Kalpazidou; Aagaard, Kaare; Mejlgaard, Niels

    2014-04-01

    This paper discusses the development of a mixed methods approach to analyse research funding. Research policy has taken on an increasingly prominent role in the broader political scene, where research is seen as a critical factor in maintaining and improving growth, welfare and international competitiveness. This has motivated growing emphasis on the impacts of science funding, and how funding can best be designed to promote socio-economic progress. Meeting these demands for impact assessment involves a number of complex issues that are difficult to fully address in a single study or in the design of a single methodology. However, they point to some general principles that can be explored in methodological design. We draw on a recent evaluation of the impacts of research grant funding, discussing both key issues in developing a methodology for the analysis and subsequent results. The case of research grant funding, involving a complex mix of direct and intermediate effects that contribute to the overall impact of funding on research performance, illustrates the value of a mixed methods approach to provide a more robust and complete analysis of policy impacts. Reflections on the strengths and weaknesses of the methodology are used to examine refinements for future work. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. Method for plant operation guidance by knowledge engineering technique

    International Nuclear Information System (INIS)

    Kiguchi, Takashi; Yoshida, Kenichi; Motoda, Hiroshi; Kobayashi, Setsuo

    1983-01-01

    A method for plant operation guidance has been developed by using the Knowledge Engineering technique. The method is characterized by its capability of handling plant dynamics. The knowledge-base includes plant simulation programs as tools to evaluate dynamic behaviors as well as production rules of the ''if..., then...'' type. The inference engine is thus capable of predicting plant dynamics and making decisions in accordance with time progress. The performance of the guidance method was evaluated by simulation tests assuming various abnormal situations of a BWR power plant. It was shown that the method can detect each of the abnormal events along the course of their occurrence, and provide guidance for corrective actions. The operation guidance method proposed in this paper is general and is applicable not only to nuclear power plants but also to other plants such as chemical production plants and fossil power plants. (author)

  19. Selected methods of waste monitoring using modern analytical techniques

    International Nuclear Information System (INIS)

    Hlavacek, I.; Hlavackova, I.

    1993-11-01

    Issues of the inspection and control of bituminized and cemented waste are discussed, and some methods of their nondestructive testing are described. Attention is paid to the inspection techniques, non-nuclear spectral techniques in particular, as employed for quality control of the wastes, waste concentrates, spent waste leaching solutions, as well as for the examination of environmental samples (waters and soils) from the surroundings of nuclear power plants. Some leaching tests used abroad for this purpose and practical analyses by the ICP-AES technique are given by way of example. The ICP-MS technique, which is unavailable in the Czech Republic, is routinely employed abroad for alpha nuclide measurements; examples of such analyses are also given. The next topic discussed includes the monitoring of organic acids and complexants to determine the degree of their thermal decomposition during the bituminization of wastes on an industrial line. All of the methods and procedures highlighted can be used as technical support during the monitoring of radioactive waste properties in industrial conditions, in the chemical and radiochemical analyses of wastes and related matter, in the calibration of nondestructive testing instrumentation, in the monitoring of contamination of the surroundings of nuclear facilities, and in trace analysis. (author). 10 tabs., 1 fig., 14 refs

  20. FORMATION OF COGNITIVE INTEREST AT ENGLISH LANGUAGE LESSONS IN PRIMARY SCHOOL: TECHNOLOGIES, METHODS, TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Kotova, E.G.

    2017-09-01

    Full Text Available There are a lot of didactic and technological methods and techniques that shape and develop cognitive interest of primary school students in modern methodology of teaching foreign languages. The use of various forms of gaming interaction, problem assignments, information and communication technologies in the teaching of primary school students allows diversifying the teaching of a foreign language, contributes to the development of their creative and cognitive activity. The use of health-saving technologies ensures the creation of a psychologically and emotionally supportive atmosphere at the lesson, which is an essential condition for acquiring new knowledge and maintaining stable cognitive interest among students while learning a foreign language.

  1. Floating Node Method and Virtual Crack Closure Technique for Modeling Matrix Cracking-Delamination Migration

    Science.gov (United States)

    DeCarvalho, Nelson V.; Chen, B. Y.; Pinho, Silvestre T.; Baiz, P. M.; Ratcliffe, James G.; Tay, T. E.

    2013-01-01

    A novel approach is proposed for high-fidelity modeling of progressive damage and failure in composite materials that combines the Floating Node Method (FNM) and the Virtual Crack Closure Technique (VCCT) to represent multiple interacting failure mechanisms in a mesh-independent fashion. In this study, the approach is applied to the modeling of delamination migration in cross-ply tape laminates. Delamination, matrix cracking, and migration are all modeled using fracture mechanics based failure and migration criteria. The methodology proposed shows very good qualitative and quantitative agreement with experiments.

  2. Efficacy of Blood Sources and Artificial Blood Feeding Methods in Rearing of Aedes aegypti (Diptera: Culicidae) for Sterile Insect Technique and Incompatible Insect Technique Approaches in Sri Lanka

    Directory of Open Access Journals (Sweden)

    Nayana Gunathilaka

    2017-01-01

    Full Text Available Introduction. Selection of the artificial membrane feeding technique and blood meal source has been recognized as a key consideration in mass rearing of vectors. Methodology. Artificial membrane feeding techniques, namely, the glass plate, metal plate, and Hemotek membrane feeding methods, and three blood sources (human, cattle, and chicken) were evaluated based on feeding rates, fecundity, and hatching rates of Aedes aegypti. Significance in the variations among blood feeding was investigated by one-way ANOVA, cluster analysis of variance (ANOSIM), and principal coordinates (PCO) analysis. Results. Feeding rates of Ae. aegypti significantly differed among the membrane feeding techniques as suggested by one-way ANOVA (p < 0.05). Conclusions. The metal plate method could be recommended as the most effective membrane feeding technique for mass rearing of Ae. aegypti, due to its high feeding rate and cost effectiveness. Cattle blood could be recommended for mass rearing Ae. aegypti.
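
    A minimal sketch of the one-way ANOVA used above to compare membrane feeding techniques; the feeding-rate values are invented for illustration and are not the trial data.

        from scipy import stats

        # Hypothetical feeding rates (%) for the three membrane feeding techniques.
        glass_plate = [62, 58, 65, 60, 63]
        metal_plate = [81, 85, 79, 83, 80]
        hemotek     = [70, 68, 74, 71, 69]

        f_stat, p_value = stats.f_oneway(glass_plate, metal_plate, hemotek)
        print(f"F = {f_stat:.2f}, p = {p_value:.4f}")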

  3. Distance learning methodology and technique in scientific and vocational communication (on the example of the master’s distance course in linguistics)

    Directory of Open Access Journals (Sweden)

    S. S. Khromov

    2016-01-01

    Full Text Available The article is devoted to the elaboration of the methodology and technique of a master’s distance course in linguistics for Russian students. The research novelty lies in the fact that the course presents the results of the methodological and scientific work of the teaching staff and students. Within the course framework we plan to transfer the communicative activity concept to distance forms of education and to model a new type of educational product. The purposes of the research are: (1) to develop the distance learning methodology and technique for a linguistic master’s course; (2) to elaborate an internal structure of the project; (3) to demonstrate which vocational, language and speech competencies are to appear as the result of the project; (4) to describe the algorithm for delivering the full-time lecture course in linguistics in a distance format; (5) to conduct a pedagogical experiment realizing distance learning in the master’s linguistic course; and (6) to prove the innovation and productivity of the elaborated master’s course in linguistics. The research is based on: (1) the paper variant of the full-time lecture course; (2) the curriculum of the lecture course; (3) the concept of the master’s course in linguistics; (4) the concept of the distance course in linguistics; (5) students’ interviews; and (6) virtual tools. The research methods are: (1) descriptive; (2) project; (3) comparative; and (4) statistical methods. Conclusion. The novelty and productivity of the course have been proved and are manifested in the following: (1) the ability to develop vocational, language and speech competences of the students; (2) developing individual trajectories of the students; (3) expanding the sociocultural potential of the students; (4) developing the sociocultural potential of the students; and (5) intensifying the education process. As a result of the experiment we can state that: (1) the methodology and technique of distance tools in projecting the master’s course in linguistics are described; (2) the

  4. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
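
    As an illustration of the first smoothing technique listed above (locally weighted regression), the sketch below smooths a scalar model output against one sampled input with statsmodels' LOWESS; the data are synthetic and the smoothing fraction is an arbitrary choice.

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        rng = np.random.default_rng(1)
        x = rng.uniform(0, 10, 200)                       # sampled model input
        y = np.sin(x) + 0.3 * rng.standard_normal(200)    # stand-in for a model output

        smoothed = lowess(y, x, frac=0.3)                 # columns: sorted x, fitted y
        print(smoothed[:5])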

  5. Sleeve Push Technique: A Novel Method of Space Gaining.

    Science.gov (United States)

    Verma, Sanjeev; Bhupali, Nameksh Raj; Gupta, Deepak Kumar; Singh, Sombir; Singh, Satinder Pal

    2018-01-01

    Space gaining is frequently required in orthodontics. Multiple loops were initially used for space gaining and alignment. The most commonly used mechanics for space gaining is the nickel-titanium open coil spring. The disadvantage of nickel-titanium coil springs is that they cannot be used until the arches are well aligned to receive the stiffer stainless steel wires. Therefore, a new method of gaining space during initial alignment and leveling has been developed and named the sleeve push technique (SPT). The nickel-titanium wires, i.e., 0.012 inch and 0.014 inch, along with an archwire sleeve (protective tubing), can be used in a modified way to gain space along with alignment. This method helps in gaining space right from day 1 of treatment. The archwire sleeve and nickel-titanium wire in this new SPT act as a mutually synergistic combination and provide the orthodontist with a completely new technique for space opening.

  6. Lyoluminescence technique as an identification method for irradiated food stuffs

    International Nuclear Information System (INIS)

    Chazhoor, J.S.

    1988-01-01

    The paper presents studies on the suitability of the lyoluminescence technique as an analytical method for the identification of irradiated foodstuffs. Powdered milk, cinnamon, cardamom, clove, red chilli, cocoa, pepper, tea, coffee, turmeric and coriander showed a lyoluminescence response when irradiated with a 10 kGy dose from a 60 Co source and dissolved in luminol solution. Various dosimetric parameters, such as the effect of storage time, the proportionality of the lyoluminescence response to dose, etc., were studied. (author). 1 tab., 3 figs.

  7. METHOD FOR SOLVING FUZZY ASSIGNMENT PROBLEM USING MAGNITUDE RANKING TECHNIQUE

    OpenAIRE

    D. Selvi; R. Queen Mary; G. Velammal

    2017-01-01

    Assignment problems have various applications in the real world because of their wide applicability in industry, commerce, management science, etc. Traditional classical assignment problems cannot be successfully used for real life problems, hence the use of fuzzy assignment problems is more appropriate. In this paper, the fuzzy assignment problem is formulated as a crisp assignment problem using the Magnitude Ranking technique, and the Hungarian method has been applied to find an optimal solution. The N...
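
    A minimal sketch of the overall workflow: triangular fuzzy costs are defuzzified to a ranking value (a simple graded-mean value stands in here for the paper's magnitude ranking) and the resulting crisp problem is solved by the Hungarian method via SciPy; the cost values are illustrative.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        # Triangular fuzzy costs (a, b, c) for a 3x3 assignment problem (illustrative).
        fuzzy_cost = np.array([
            [(2, 4, 6), (5, 7, 9), (1, 3, 5)],
            [(4, 6, 8), (2, 3, 4), (6, 8, 10)],
            [(3, 5, 7), (1, 2, 3), (4, 5, 6)],
        ], dtype=float)

        # Defuzzify each (a, b, c) with the graded-mean value (a + 4b + c) / 6.
        crisp = (fuzzy_cost[..., 0] + 4 * fuzzy_cost[..., 1] + fuzzy_cost[..., 2]) / 6

        rows, cols = linear_sum_assignment(crisp)          # Hungarian method
        print(list(zip(rows, cols)), "total cost:", crisp[rows, cols].sum())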

  8. Mixed methods in psychotherapy research: A review of method(ology) integration in psychotherapy science.

    Science.gov (United States)

    Bartholomew, Theodore T; Lockard, Allison J

    2018-06-13

    Mixed methods can foster depth and breadth in psychological research. However, its use remains in development in psychotherapy research. Our purpose was to review the use of mixed methods in psychotherapy research. Thirty-one studies were identified via the PRISMA systematic review method. Using Creswell & Plano Clark's typologies to identify design characteristics, we assessed each study for rigor and how each used mixed methods. Key features of mixed methods designs and these common patterns were identified: (a) integration of clients' perceptions via mixing; (b) understanding group psychotherapy; (c) integrating methods with cases and small samples; (d) analyzing clinical data as qualitative data; and (e) exploring cultural identities in psychotherapy through mixed methods. The review is discussed with respect to the value of integrating multiple data in single studies to enhance psychotherapy research. © 2018 Wiley Periodicals, Inc.

  9. Are There Two Methods of Grounded Theory? Demystifying the Methodological Debate

    Directory of Open Access Journals (Sweden)

    Cheri Ann Hernandez, RN, Ph.D., CDE

    2008-06-01

    Full Text Available Grounded theory is an inductive research method for the generation of substantive or formal theory, using qualitative or quantitative data generated from research interviews, observation, or written sources, or some combination thereof (Glaser & Strauss, 1967). In recent years there has been much controversy over the etiology of its discovery, as well as the exact way in which grounded theory research is to be operationalized. Unfortunately, this situation has resulted in much confusion, particularly among novice researchers who wish to utilize this research method. In this article, the historical, methodological and philosophical roots of grounded theory are delineated in a beginning effort to demystify this methodological debate. Grounded theory variants such as feminist grounded theory (Wuest, 1995) or constructivist grounded theory (Charmaz, 1990) are beyond the scope of this discussion.

  10. Methodological peculiarities of the Braun-Blanquet method used for marine bottom vegetation classification

    Directory of Open Access Journals (Sweden)

    AFANASYEV Dmitry F.

    2012-09-01

    Full Text Available The features of the application of the Braun-Blanquet method for the classification of the Black and Azov Seas bottom phytocenoses are discussed. Special attention is given to the following methodological questions: the term of observations necessary for revealing associations, the size of the relevance area, the features of geobotanical underwater exploration technology, the description of bottom communities with epiphytes, and the peculiarities of the syntaxonomic analysis of bottom vegetation.

  11. Methodological issues affecting the study of fish parasites. III. Effect of fish preservation method

    Czech Academy of Sciences Publication Activity Database

    Kvach, Yuriy; Ondračková, Markéta; Janáč, Michal; Jurajda, Pavel

    2018-01-01

    Roč. 127, č. 3 (2018), s. 213-224 ISSN 0177-5103 R&D Projects: GA ČR(CZ) GBP505/12/G112 Institutional support: RVO:68081766 Keywords: flounder Paralichthys-olivaceus * Neoheterobothrium-hirame * community structure * infection levels * Baltic sea * Odontobutidae * ectoparasites * Perciformes * collection * ecology * Parasite community * Preservation methods * Perca fluviatilis * Rhodeus amarus * Methodology * Parasitological examination Subject RIV: GL - Fishing OBOR OECD: Fishery Impact factor: 1.549, year: 2016

  12. A Method to Test Model Calibration Techniques: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, Ron; Polly, Ben; Neymark, Joel

    2016-09-01

    This paper describes a method for testing model calibration techniques. Calibration is commonly used in conjunction with energy retrofit audit models. An audit is conducted to gather information about the building needed to assemble an input file for a building energy modeling tool. A calibration technique is used to reconcile model predictions with utility data, and then the 'calibrated model' is used to predict energy savings from a variety of retrofit measures and combinations thereof. Current standards and guidelines such as BPI-2400 and ASHRAE-14 set criteria for 'goodness of fit' and assume that if the criteria are met, then the calibration technique is acceptable. While it is logical to use the actual performance data of the building to tune the model, it is not certain that a good fit will result in a model that better predicts post-retrofit energy savings. Therefore, the basic idea here is that the simulation program (intended for use with the calibration technique) is used to generate surrogate utility bill data and retrofit energy savings data against which the calibration technique can be tested. This provides three figures of merit for testing a calibration technique, 1) accuracy of the post-retrofit energy savings prediction, 2) closure on the 'true' input parameter values, and 3) goodness of fit to the utility bill data. The paper will also discuss the pros and cons of using this synthetic surrogate data approach versus trying to use real data sets of actual buildings.
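
    As a sketch of the 'goodness of fit' statistics such standards rely on, the following computes NMBE and CV(RMSE) between measured and simulated monthly energy use; the monthly values are illustrative, and the exact normalization conventions (e.g., n versus n - 1 in the denominator) vary between guidelines.

        import numpy as np

        measured  = np.array([820, 760, 640, 510, 430, 390, 420, 450, 500, 610, 720, 810], float)
        simulated = np.array([800, 770, 655, 495, 445, 380, 430, 440, 515, 600, 735, 790], float)

        resid = measured - simulated
        n = len(measured)
        nmbe = resid.sum() / ((n - 1) * measured.mean()) * 100                   # normalized mean bias error, %
        cvrmse = np.sqrt((resid ** 2).sum() / (n - 1)) / measured.mean() * 100   # CV of the RMSE, %
        print(f"NMBE = {nmbe:.2f} %, CV(RMSE) = {cvrmse:.2f} %")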

  13. Modern Methodology and Techniques Aimed at Developing the Environmentally Responsible Personality

    Science.gov (United States)

    Ponomarenko, Yelena V.; Zholdasbekova, Bibisara A.; Balabekov, Aidarhan T.; Kenzhebekova, Rabiga I.; Yessaliyev, Aidarbek A.; Larchenkova, Liudmila A.

    2016-01-01

    The article discusses the positive impact of an environmentally responsible individual as the social unit able to live in harmony with the natural world, himself/herself and other people. The purpose of the article is to provide theoretical substantiation of modern teaching methods. The authors considered the experience of philosophy, psychology,…

  14. Improved parallel solution techniques for the integral transport matrix method

    Energy Technology Data Exchange (ETDEWEB)

    Zerr, R. Joseph, E-mail: rjz116@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA (United States); Azmy, Yousry Y., E-mail: yyazmy@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, Burlington Engineering Laboratories, Raleigh, NC (United States)

    2011-07-01

    Alternative solution strategies to the parallel block Jacobi (PBJ) method for the solution of the global problem with the integral transport matrix method operators have been designed and tested. The most straightforward improvement to the Jacobi iterative method is the Gauss-Seidel alternative. The parallel red-black Gauss-Seidel (PGS) algorithm can improve on the number of iterations and reduce work per iteration by applying an alternating red-black color-set to the subdomains and assigning multiple sub-domains per processor. A parallel GMRES(m) method was implemented as an alternative to stationary iterations. Computational results show that the PGS method can improve on the PBJ method execution time by up to 10× when eight sub-domains per processor are used. However, compared to traditional source iterations with diffusion synthetic acceleration, it is still approximately an order of magnitude slower. The best-performing cases are optically thick because sub-domains decouple, yielding faster convergence. Further tests revealed that 64 sub-domains per processor was the best performing level of sub-domain division. An acceleration technique that improves the convergence rate would greatly improve the ITMM. The GMRES(m) method with a diagonal block preconditioner consumes approximately the same time as the PBJ solver but could be improved by an as yet undeveloped, more efficient preconditioner. (author)

  15. Improved parallel solution techniques for the integral transport matrix method

    International Nuclear Information System (INIS)

    Zerr, R. Joseph; Azmy, Yousry Y.

    2011-01-01

    Alternative solution strategies to the parallel block Jacobi (PBJ) method for the solution of the global problem with the integral transport matrix method operators have been designed and tested. The most straightforward improvement to the Jacobi iterative method is the Gauss-Seidel alternative. The parallel red-black Gauss-Seidel (PGS) algorithm can improve on the number of iterations and reduce work per iteration by applying an alternating red-black color-set to the subdomains and assigning multiple sub-domains per processor. A parallel GMRES(m) method was implemented as an alternative to stationary iterations. Computational results show that the PGS method can improve on the PBJ method execution time by up to 10× when eight sub-domains per processor are used. However, compared to traditional source iterations with diffusion synthetic acceleration, it is still approximately an order of magnitude slower. The best-performing cases are optically thick because sub-domains decouple, yielding faster convergence. Further tests revealed that 64 sub-domains per processor was the best performing level of sub-domain division. An acceleration technique that improves the convergence rate would greatly improve the ITMM. The GMRES(m) method with a diagonal block preconditioner consumes approximately the same time as the PBJ solver but could be improved by an as yet undeveloped, more efficient preconditioner. (author)
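
    To make the red-black ordering concrete, here is a minimal serial sketch of red-black Gauss-Seidel sweeps on a model 2-D Poisson problem; it is not the ITMM solver itself, only an illustration of the color-set update pattern that the parallel PGS variant exploits.

        import numpy as np

        def red_black_gauss_seidel(u, f, h, sweeps=100):
            # Gauss-Seidel sweeps for -laplacian(u) = f on a square grid,
            # updating all "red" points (i+j even) before all "black" points (i+j odd).
            for _ in range(sweeps):
                for color in (0, 1):
                    for i in range(1, u.shape[0] - 1):
                        for j in range(1, u.shape[1] - 1):
                            if (i + j) % 2 == color:
                                u[i, j] = 0.25 * (u[i - 1, j] + u[i + 1, j] +
                                                  u[i, j - 1] + u[i, j + 1] + h * h * f[i, j])
            return u

        n = 17
        h = 1.0 / (n - 1)
        u = np.zeros((n, n))          # zero Dirichlet boundary values held in the border entries
        f = np.ones((n, n))
        u = red_black_gauss_seidel(u, f, h)

    Because every point of one color depends only on points of the other color, each color sweep can be updated concurrently across subdomains, which is the property the PGS algorithm exploits.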

  16. Slot technique - an alternative method of scatter reduction in radiography

    International Nuclear Information System (INIS)

    Panzer, W.; Widenmann, L.

    1983-01-01

    The most common method of scatter reduction in radiography is the use of an antiscatter grid. Its disadvantage is the absorption of a certain percentage of primary radiation in the lead strips of the grid and the fact that due to the limited thickness of the lead strips their scatter absorption is also limited. A possibility for avoiding this disadvantage is offered by the so-called slot technique, i.e., the successive exposure of the subject with a narrow fan beam provided by slots in rather thick lead plates. The results of a comparison between the grid and slot techniques regarding dose to the patient, scatter reduction, image quality and the effect of automatic exposure control are reported. (author)

  17. [Hierarchy structuring for mammography technique by interpretive structural modeling method].

    Science.gov (United States)

    Kudo, Nozomi; Kurowarabi, Kunio; Terashita, Takayoshi; Nishimoto, Naoki; Ogasawara, Katsuhiko

    2009-10-20

    Participation in screening mammography is currently desired in Japan because of the increase in breast cancer morbidity. However, the pain and discomfort of mammography is recognized as a significant deterrent for women considering this examination. Thus quick procedures, sufficient experience, and advanced skills are required of radiologic technologists. The aim of this study was to make the key points of the imaging technique explicit and to help technologists understand the complicated procedure. We interviewed 3 technologists who were highly skilled in mammography, and 14 factors were retrieved using brainstorming and the KJ method. We then applied Interpretive Structural Modeling (ISM) to the factors and developed a hierarchical concept structure. The result was a six-layer hierarchy whose top node was explanation of the entire mammography procedure. Male technologists were identified as a negative factor. Factors concerned with explanation were at the upper nodes. We gave particular attention to X-ray techniques and considerations. The findings will help beginners improve their skills.
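
    As an illustration of the ISM step that turns pairwise influence judgments into a hierarchy, the sketch below forms a reachability matrix by transitive closure of a hypothetical 4-factor adjacency matrix; the factors and relations are invented, not the study's 14 factors.

        import numpy as np

        # adjacency[i, j] = 1 means "factor i influences factor j" (hypothetical judgments).
        adjacency = np.array([
            [0, 1, 0, 0],
            [0, 0, 1, 1],
            [0, 0, 0, 1],
            [0, 0, 0, 0],
        ])

        n = adjacency.shape[0]
        reach = adjacency | np.eye(n, dtype=int)        # every factor reaches itself
        for k in range(n):                              # Warshall transitive closure
            for i in range(n):
                for j in range(n):
                    reach[i, j] = reach[i, j] or (reach[i, k] and reach[k, j])
        print(reach)   # level partitioning of this matrix yields the ISM hierarchy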

  18. Microemulsion extrusion technique: a new method to produce lipid nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Jesus, Marcelo Bispo de, E-mail: dejesusmb@gmail.com; Radaic, Allan [University of Campinas-UNICAMP, Department of Biochemistry, Institute of Biology (Brazil); Zuhorn, Inge S. [University of Groningen, Department of Membrane Cell Biology, University Medical Center (Netherlands); Paula, Eneida de [University of Campinas-UNICAMP, Department of Biochemistry, Institute of Biology (Brazil)

    2013-10-15

    Solid lipid nanoparticles (SLN) and nanostructured lipid carriers (NLC) have been intensively investigated for different applications, including their use as drug and gene delivery systems. Different techniques have been employed to produce lipid nanoparticles, of which high pressure homogenization is the standard technique that is adopted nowadays. Although this method has a high efficiency, does not require the use of organic solvents, and allows large-scale production, some limitations impede its application at laboratory scale: the equipment is expensive, there is a need of huge amounts of surfactants and co-surfactants during the preparation, and the operating conditions are energy intensive. Here, we present the microemulsion extrusion technique as an alternative method to prepare lipid nanoparticles. The parameters to produce lipid nanoparticles using microemulsion extrusion were established, and the lipid particles produced (SLN, NLC, and liposomes) were characterized with regard to size (from 130 to 190 nm), zeta potential, and drug (mitoxantrone) and gene (pDNA) delivery properties. In addition, the particles' in vitro co-delivery capacity (to carry mitoxantrone plus pDNA encoding the phosphatase and tensin homologue, PTEN) was tested in normal (BALB 3T3 fibroblast) and cancer (PC3 prostate and MCF-7 breast) cell lines. The results show that the microemulsion extrusion technique is fast, inexpensive, reproducible, free of organic solvents, and suitable for small volume preparations of lipid nanoparticles. Its application is particularly interesting when using rare and/or costly drugs or ingredients (e.g., cationic lipids for gene delivery or labeled lipids for nanoparticle tracking/diagnosis)

  19. Microemulsion extrusion technique: a new method to produce lipid nanoparticles

    International Nuclear Information System (INIS)

    Jesus, Marcelo Bispo de; Radaic, Allan; Zuhorn, Inge S.; Paula, Eneida de

    2013-01-01

    Solid lipid nanoparticles (SLN) and nanostructured lipid carriers (NLC) have been intensively investigated for different applications, including their use as drug and gene delivery systems. Different techniques have been employed to produce lipid nanoparticles, of which high pressure homogenization is the standard technique that is adopted nowadays. Although this method has a high efficiency, does not require the use of organic solvents, and allows large-scale production, some limitations impede its application at laboratory scale: the equipment is expensive, there is a need of huge amounts of surfactants and co-surfactants during the preparation, and the operating conditions are energy intensive. Here, we present the microemulsion extrusion technique as an alternative method to prepare lipid nanoparticles. The parameters to produce lipid nanoparticles using microemulsion extrusion were established, and the lipid particles produced (SLN, NLC, and liposomes) were characterized with regard to size (from 130 to 190 nm), zeta potential, and drug (mitoxantrone) and gene (pDNA) delivery properties. In addition, the particles’ in vitro co-delivery capacity (to carry mitoxantrone plus pDNA encoding the phosphatase and tensin homologue, PTEN) was tested in normal (BALB 3T3 fibroblast) and cancer (PC3 prostate and MCF-7 breast) cell lines. The results show that the microemulsion extrusion technique is fast, inexpensive, reproducible, free of organic solvents, and suitable for small volume preparations of lipid nanoparticles. Its application is particularly interesting when using rare and/or costly drugs or ingredients (e.g., cationic lipids for gene delivery or labeled lipids for nanoparticle tracking/diagnosis)

  20. Study of input variables in group method of data handling methodology

    International Nuclear Information System (INIS)

    Pereira, Iraci Martinez; Bueno, Elaine Inacio

    2013-01-01

    The Group Method of Data Handling - GMDH is a combinatorial multi-layer algorithm in which a network of layers and nodes is generated using a number of inputs from the data stream being evaluated. The GMDH network topology has been traditionally determined using a layer-by-layer pruning process based on a pre-selected criterion of what constitutes the best nodes at each level. The traditional GMDH method is based on an underlying assumption that the data can be modeled by using an approximation of the Volterra series or Kolmogorov-Gabor polynomial. A Monitoring and Diagnosis System was developed based on GMDH and ANN methodologies, and applied to the IPEN research reactor IEA-R1. The system performs the monitoring by comparing the GMDH and ANN calculated values with measured ones. As the GMDH is a self-organizing methodology, the choice of input variables is made automatically. On the other hand, the results of the ANN methodology are strongly dependent on which variables are used as neural network inputs. (author)
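
    A minimal sketch of one GMDH layer: every pair of candidate inputs is fitted with the Kolmogorov-Gabor quadratic and the partial models are ranked by error; a full implementation would rank on a separate validation set and feed the selected outputs into further layers. The data and variable names here are synthetic.

        import numpy as np
        from itertools import combinations

        def fit_pair(x1, x2, y):
            # Partial model y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2.
            A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            return coef, A @ coef

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))                        # candidate input variables
        y = 2 * X[:, 0] * X[:, 1] + X[:, 2] ** 2 + 0.1 * rng.normal(size=200)

        scores = []
        for i, j in combinations(range(X.shape[1]), 2):
            coef, pred = fit_pair(X[:, i], X[:, j], y)
            scores.append(((i, j), float(np.mean((y - pred) ** 2))))
        scores.sort(key=lambda s: s[1])
        print("best input pairs:", scores[:2])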

  1. Co-design of RAD and ETHICS methodologies: a combination of information system development methods

    Science.gov (United States)

    Nasehi, Arezo; Shahriyari, Salman

    2011-12-01

    Co-design is a new trend in the social world which tries to capture different ideas in order to use the most appropriate features for a system. In this paper, the co-design of two information system methodologies is considered: rapid application development (RAD) and the effective technical and human implementation of computer-based systems (ETHICS). We examine the characteristics of these methodologies to assess the possibility of co-designing or combining them for developing an information system. To this end, four different aspects are analyzed: social or technical approach, user participation and user involvement, job satisfaction, and overcoming resistance to change. Finally, a case study using the quantitative method is analyzed in order to examine the possibility of co-design using these factors. The paper concludes that RAD and ETHICS are appropriate to be co-designed and offers some suggestions for the co-design.

  2. Development of a high-order finite volume method with multiblock partition techniques

    Directory of Open Access Journals (Sweden)

    E. M. Lemos

    2012-03-01

    This work deals with a new numerical methodology to solve the Navier-Stokes equations based on a finite volume method applied to structured meshes with co-located grids. High-order schemes used to approximate advective, diffusive and non-linear terms, combined with multiblock partition techniques, are the main contributions of this paper. The combination of these two techniques resulted in a computer code that achieves high accuracy, due to the high-order schemes, and great flexibility in generating locally refined meshes, based on the multiblock approach. This computer code has been able to obtain results with equal or higher accuracy in comparison with results obtained using classical procedures, with considerably less computational effort.

  3. Modeling and optimization of effective parameters on the size of synthesized Fe3O4 superparamagnetic nanoparticles by coprecipitation technique using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ghazanfari, Mohammad Reza, E-mail: Ghazanfari.mr@gmail.com [Department of Materials Science and Engineering, Ferdowsi University of Mashhad, 9177948974 Mashhad (Iran, Islamic Republic of); Kashefi, Mehrdad, E-mail: m-kashefi@um.ac.ir [Department of Materials Science and Engineering, Ferdowsi University of Mashhad, 9177948974 Mashhad (Iran, Islamic Republic of); Jaafari, Mahmoud Reza [Biotechnology Research Center, Nanotechnology Research Center, School of Pharmacy, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2016-05-01

    Statistical methods are appropriate techniques for studying process trends. In the current research, Fe3O4 superparamagnetic nanoparticles were synthesized by the coprecipitation method. In order to investigate the size properties of the synthesized particles, the experimental design was carried out using the central composite design (CCD) of response surface methodology (RSM), with the temperature, pH, and cation ratio of the reaction selected as influential factors. After particle synthesis based on the designed runs, different responses such as the hydrodynamic size of the particles (both freeze dried and air dried), size distribution, crystallite size, magnetic size, and zeta potential were evaluated by different techniques, i.e. dynamic light scattering (DLS), X-ray diffraction (XRD), and vibrating sample magnetometry (VSM). Based on these results, a quadratic polynomial model was fitted for each response that could predict the response values. A study of the factor effects showed that the temperature, pH, and their interactions had the greatest influence. Finally, optimization showed that the minimum particle size (10.15 nm) and size distribution (13.01 nm) were reached at the minimum temperature (70 °C) and cation ratio (0.5) and at the maximum pH (10.5). Moreover, the characterizations showed that the particle size was about 10 nm, while Ms, Hc, and Mr were equal to 60 emu/g, 0.2 Oe and 0.22 emu/g, respectively. - Highlights: • The Fe3O4 nanoparticles were successfully synthesized by the coprecipitation method. • By the RSM technique, predictive models were presented for the particle size. • Temperature, pH and their interactions had the greatest effect on the particle size. • The drying technique can affect the size properties.
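
    For readers unfamiliar with the RSM workflow, the sketch below (Python, assuming NumPy) builds a small face-centred central composite design for three coded factors and fits the quadratic response-surface model by least squares. The design and the synthetic response values are illustrative and are not the paper's data.

```python
import numpy as np
from itertools import product

# Face-centred central composite design (CCD) in coded units for three
# factors (e.g. temperature, pH, cation ratio): factorial + axial + centre runs.
factorial = np.array(list(product([-1, 1], repeat=3)), dtype=float)
axial = np.vstack([v for i in range(3) for v in (np.eye(3)[i], -np.eye(3)[i])])
center = np.zeros((3, 3))
D = np.vstack([factorial, axial, center])            # 17 runs, 3 coded factors

def quad_model_matrix(D):
    """Full quadratic RSM model: intercept, linear, interaction and square terms."""
    x1, x2, x3 = D.T
    return np.column_stack([np.ones(len(D)), x1, x2, x3,
                            x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

rng = np.random.default_rng(1)
# Synthetic "particle size" response with an interaction effect plus noise
y = 12 - 1.5 * D[:, 0] + 2.0 * D[:, 1] + 0.8 * D[:, 0] * D[:, 1] \
    + rng.normal(0, 0.3, len(D))

beta, *_ = np.linalg.lstsq(quad_model_matrix(D), y, rcond=None)
print("fitted quadratic RSM coefficients:", np.round(beta, 2))
```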

  4. Double contrast barium enema: technique, indications, results and limitations of a conventional imaging methodology in the MDCT virtual endoscopy era.

    Science.gov (United States)

    Rollandi, Gian Andrea; Biscaldi, Ennio; DeCicco, Enzo

    2007-03-01

    The double contrast barium enema of the colon remains a widespread conventional radiological technique that allows the diagnosis of neoplastic and inflammatory pathology. Since the 1970s, a major effort has been made to simplify, perfect and codify the method of the double contrast barium enema: Altaras in Germany, Miller in the USA and Cittadini in Italy have been responsible for perfecting this technique over the last 30 years. Tailored patient preparation, a rigorous technique of execution and precise radiological documentation are essential steps to obtain a reliable examination. The main limitation of the double contrast enema is that it assesses the pathology only from the mucosal surface. In the evaluation of neoplastic pathology the main limitation concerns staging of the "T" parameter, and evaluation of the "N" and "M" parameters is even more limited. Today the double contrast technique remains a refined, sensitive and specific diagnostic method; nevertheless, its diagnostic results cannot compete with the new multislice CT techniques (CT-enteroclysis and virtual colonoscopy), which can examine both the lumen and the wall of the colon. The double contrast enema is a cheap and simple examination, but a progressive replacement of conventional radiology by the new multislice techniques is foreseeable in the near future, because cross-sectional imaging is more often able to detect the cause of the symptoms, whether of colonic or non-colonic origin.

  5. Double contrast barium enema: Technique, indications, results and limitations of a conventional imaging methodology in the MDCT virtual endoscopy era

    International Nuclear Information System (INIS)

    Rollandi, Gian Andrea; Biscaldi, Ennio; DeCicco, Enzo

    2007-01-01

    The double contrast barium enema of the colon remains a widespread conventional radiological technique that allows the diagnosis of neoplastic and inflammatory pathology. Since the 1970s, a major effort has been made to simplify, perfect and codify the method of the double contrast barium enema: Altaras in Germany, Miller in the USA and Cittadini in Italy have been responsible for perfecting this technique over the last 30 years. Tailored patient preparation, a rigorous technique of execution and precise radiological documentation are essential steps to obtain a reliable examination. The main limitation of the double contrast enema is that it assesses the pathology only from the mucosal surface. In the evaluation of neoplastic pathology the main limitation concerns staging of the 'T' parameter, and evaluation of the 'N' and 'M' parameters is even more limited. Today the double contrast technique remains a refined, sensitive and specific diagnostic method; nevertheless, its diagnostic results cannot compete with the new multislice CT techniques (CT-enteroclysis and virtual colonoscopy), which can examine both the lumen and the wall of the colon. The double contrast enema is a cheap and simple examination, but a progressive replacement of conventional radiology by the new multislice techniques is foreseeable in the near future, because cross-sectional imaging is more often able to detect the cause of the symptoms, whether of colonic or non-colonic origin.

  6. Application of the PISC results and methodology to assess the effectiveness of NDT techniques applied on non nuclear components

    International Nuclear Information System (INIS)

    Maciga, G.; Papponetti, M.; Crutzen, S.; Jehenson, P.

    1990-01-01

    Performance demonstration for NDT has been an active topic for several years. Interest in it came to the fore in the early 1980s, when several institutions started to propose the use of realistic training assemblies and the formal approach of Validation Centers. These steps were justified, for example, by the results of the PISC exercises, which concluded that there was a need for performance demonstration starting with capability assessment of techniques and procedures as they were routinely applied. Although the PISC programme falls under a general ''nuclear motivation'', the PISC methodology could be extended to structural components in general, such as those in conventional power plants and in the chemical, aerospace and offshore industries, where integrity and safety are regarded as being of great importance. Some themes of NDT inspection of fossil power plant and offshore components that could be objects of validation studies are illustrated. (author)

  7. Connective Tissue Characteristics around Healing Abutments of Different Geometries: New Methodological Technique under Circularly Polarized Light.

    Science.gov (United States)

    Delgado-Ruiz, Rafael Arcesio; Calvo-Guirado, Jose Luis; Abboud, Marcus; Ramirez-Fernandez, Maria Piedad; Maté-Sánchez de Val, José Eduardo; Negri, Bruno; Gomez-Moreno, Gerardo; Markovic, Aleksa

    2015-08-01

    To describe contact, thickness, density, and orientation of connective tissue fibers around healing abutments of different geometries by means of a new method using coordinates. Following the bilateral extraction of mandibular premolars (P2, P3, and P4) from six fox hound dogs and a 2-month healing period, 36 titanium implants were inserted, onto which two groups of healing abutments of different geometry were screwed: Group A (concave abutments) and Group B (wider healing abutments). After 3 months the animals were sacrificed and samples extracted containing each implant and the surrounding soft and hard tissues. Histological analysis was performed without decalcifying the samples by means of circularly polarized light under an optical microscope and a system of vertical and horizontal coordinates across all the connective tissue in an area delimited by the implant/abutment, epithelium, and bone tissue. In no case had the connective tissue formed a connection to the healing abutment/implant in the internal zone; a space of 35 ± 10 μm separated the connective tissue fibers from the healing abutment surface. The total thickness of connective tissue in the horizontal direction was significantly greater in the medial zone in Group B than in Group A (p < 0.05 for connective tissue thickness). © 2013 Wiley Periodicals, Inc.

  8. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    Energy Technology Data Exchange (ETDEWEB)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed.

  9. Human factors analysis and design methods for nuclear waste retrieval systems. Human factors design methodology and integration plan

    International Nuclear Information System (INIS)

    Casey, S.M.

    1980-06-01

    The purpose of this document is to provide an overview of the recommended activities and methods to be employed by a team of human factors engineers during the development of a nuclear waste retrieval system. This system, as it is presently conceptualized, is intended to be used for the removal of storage canisters (each canister containing a spent fuel rod assembly) located in an underground salt bed depository. This document, and the others in this series, have been developed for the purpose of implementing human factors engineering principles during the design and construction of the retrieval system facilities and equipment. The methodology presented has been structured around a basic systems development effort involving preliminary development, equipment development, personnel subsystem development, and operational test and evaluation. Within each of these phases, the recommended activities of the human engineering team have been stated, along with descriptions of the human factors engineering design techniques applicable to the specific design issues. Explicit examples of how the techniques might be used in the analysis of human tasks and equipment required in the removal of spent fuel canisters have been provided. Only those techniques having possible relevance to the design of the waste retrieval system have been reviewed. This document is intended to provide the framework for integrating human engineering with the rest of the system development effort. The activities and methodologies reviewed in this document have been discussed in the general order in which they will occur, although the time frame (the total duration of the development program in years and months) in which they should be performed has not been discussed

  10. Alteration of Box-Jenkins methodology by implementing genetic algorithm method

    Science.gov (United States)

    Ismail, Zuhaimy; Maarof, Mohd Zulariffin Md; Fadzli, Mohammad

    2015-02-01

    A time series is a set of values sequentially observed through time. The Box-Jenkins methodology is a systematic method of identifying, fitting, checking, and using integrated autoregressive moving average time series models for forecasting. The Box-Jenkins method is appropriate for time series of medium to long length (at least 50 observations). When modeling such series, the difficulty lies in choosing the correct model order at the identification stage and in finding the right parameter estimates. This paper presents the development of a genetic algorithm heuristic to solve the identification and estimation problems in Box-Jenkins modeling. Data on international tourist arrivals to Malaysia were used to illustrate the effectiveness of the proposed method. The forecasts generated by the proposed model outperformed the single traditional Box-Jenkins model.
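
    A minimal sketch of this idea is given below (Python, assuming NumPy and statsmodels are available): a small genetic algorithm searches over ARIMA(p, d, q) orders, with the negative AIC of each fitted model as the fitness. The synthetic series and the GA settings are illustrative and are not the authors' configuration.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(size=200)) + 50        # synthetic "arrivals" series

def fitness(order):
    """Negative AIC of the fitted ARIMA model; -inf for infeasible orders."""
    try:
        return -ARIMA(y, order=order).fit().aic
    except Exception:
        return -np.inf

def random_order():
    return (int(rng.integers(0, 4)), int(rng.integers(0, 3)), int(rng.integers(0, 4)))

def mutate(order):
    o = list(order)
    i = int(rng.integers(3))
    o[i] = int(np.clip(o[i] + rng.choice([-1, 1]), 0, 3))
    return tuple(o)

pop = [random_order() for _ in range(12)]
for generation in range(10):
    scored = sorted(pop, key=fitness, reverse=True)
    parents = scored[:4]                        # truncation selection
    children = []
    while len(children) < len(pop) - len(parents):
        a, b = rng.choice(len(parents), 2, replace=False)
        cut = int(rng.integers(1, 3))           # one-point crossover
        children.append(mutate(parents[a][:cut] + parents[b][cut:]))
    pop = parents + children

print("best ARIMA order found:", max(pop, key=fitness))
```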

  11. PROBLEM SOLVING TECHNIQUES AS A PART OF IMPLEMENTATION OF SIX SIGMA METHODOLOGY IN TIRE PRODUCTION. CASE STUDY

    Directory of Open Access Journals (Sweden)

    Maciej WOJTASZAK

    2015-07-01

    Problem solving methods are an indispensable part of the management and improvement of production. At the turn of decades, with the development of industry, specific techniques have been implemented and refined by the leaders in this field, such as Toyota, GE and Motorola. The foundation of problem solving is to find the real root cause of the problem as soon as possible, to understand it, and to implement appropriate solutions that ensure the problem does not occur again. This paper provides an overview of the methods and techniques used to solve problems in the manufacturing plant of Trelleborg Wheel Systems Sri Lanka, which produces pneumatic tires for light agricultural machinery. These techniques are implemented as part of the Lean Six Sigma program.

  12. Innovative research methods for studying treatments for rare diseases: methodological review.

    Science.gov (United States)

    Gagne, Joshua J; Thompson, Lauren; O'Keefe, Kelly; Kesselheim, Aaron S

    2014-11-24

    To examine methods for generating evidence on health outcomes in patients with rare diseases. Methodological review of existing literature. PubMed, Embase, and Academic Search Premier were searched for articles describing innovative approaches to randomized trial design and analysis methods and methods for conducting observational research in patients with rare diseases. We assessed information related to the proposed methods, the specific rare disease being studied, and outcomes from the application of the methods. We summarize methods with respect to their advantages in studying health outcomes in rare diseases and provide examples of their application. We identified 46 articles that proposed or described methods for studying patient health outcomes in rare diseases. Articles covered a wide range of rare diseases and most (72%) were published in 2008 or later. We identified 16 research strategies for studying rare diseases. Innovative clinical trial methods minimize sample size requirements (n=4) and maximize the proportion of patients who receive active treatment (n=2), strategies crucial to studying small populations of patients with limited treatment choices. No studies describing unique methods for conducting observational studies in patients with rare diseases were identified. Though numerous studies apply unique clinical trial designs and considerations to assess patient health outcomes in rare diseases, less attention has been paid to innovative methods for studying rare diseases using observational data. © Gagne et al 2014.

  13. INTERNATIONAL PAYMENT METHODS AND TECHNIQUES FROM THE ACCOUNTING PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    MAGDALENA MIHAI

    2013-08-01

    The starting point in our study regarding international payment methods and techniques is the idea that international settlements are based on uniform fundamental rules set by the states that take part in international trade. As the world economy and especially international trade have evolved, these rules have been changed and adapted to the necessities of international trade. The importance of the topic lies in the fact that companies in our country are increasingly engaging in international trade activities. For this reason, in this paper we conceptually define the methods of international settlement and present the accounting consequences of settling international trade activity. It is necessary to study the accounting implications of managing collection and payment activities in intra- and extra-Community trade, since international trade, as well as European and international influences on national accounting regulations, has developed in our country.

  14. Methods for teaching effective patient communication techniques to radiography students.

    Science.gov (United States)

    Makely, S

    1990-07-01

    Teaching students to communicate effectively with patients has always been part of the radiography curriculum in the USA. However, developing these skills has become even more important in recent times due to several factors. Patients who have been well versed in what to expect from the examination being conducted are in a better position to co-operate with the radiographer. This increases the chances of producing optimal results from an examination at the first attempt, thus reducing radiation exposure, patient discomfort and the overall cost of conducting the procedure. Also, increased competition among health care providers has resulted in more emphasis being placed on patient, or customer, satisfaction. Radiographers are in the 'front line' of patient care. Patients often have more interaction with radiographers than with physicians or other medical specialists. Radiographers who practise effective communication techniques with their patients can alleviate anxiety and make an important contribution to the overall satisfaction of the patient with respect to the quality of service and care they receive. This article describes instructional methods being used in the USA to help develop effective patient communication techniques, and reports the findings of a study among radiography educators as to which of these methods are thought to be most successful.

  15. Sleeve push technique: A novel method of space gaining

    Directory of Open Access Journals (Sweden)

    Sanjeev Verma

    2018-01-01

    Space gaining is frequently required in orthodontics. Multiple loops were initially used for space gaining and alignment. The most commonly used mechanics for space gaining is the nickel–titanium open coil spring. The disadvantage of nickel–titanium coil springs is that they cannot be used until the arches are well aligned to receive the stiffer stainless steel wires. Therefore, a new method of gaining space during initial alignment and leveling has been developed and named the sleeve push technique (SPT). The nickel–titanium wires, i.e. 0.012 inch and 0.014 inch, along with an archwire sleeve (protective tubing), can be used in a modified way to gain space along with alignment. This method helps in gaining space right from day 1 of treatment. The archwire sleeve and nickel–titanium wire in this new SPT act as a mutually synergistic combination and provide the orthodontist with a completely new technique for space opening.

  16. Methodology for the design of the method of siliceous sandstones operation using special software

    Directory of Open Access Journals (Sweden)

    Luis Ángel Lara-González

    2014-12-01

    The methodology used for the design of a method for exploiting siliceous sandstones by descending staggered banks, using specialized software tools, is reported. The data analyzed were collected in the field for operating license 14816 in Melgar, Tolima. The rock mass was characterized by physical and mechanical tests performed on cylindrical specimens in order to obtain the maximum strength and the elastic modulus of the rock. The direction and dip of the sandstone package were determined by stereographic projection with the DIPS® software, and the safety factor of the slope with the established banks was obtained with SLIDE®. The banks are 8 meters high and 8 meters wide with a tilt angle of 60°, which yielded a safety factor of 2.1. The design of the mining method was carried out with GEOVIA SURPAC®, with an early stage of development ascending to level 11 of the exploitation, and then mining proceeding in descending order to control the stability of the slopes. The results obtained provide a general methodology for the development of projects that optimize the process of evaluating and selecting a mining method using specialized design tools.

  17. Proposal of a method for evaluating tsunami risk using response-surface methodology

    Science.gov (United States)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive, since they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we propose a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Subsequently, tsunami risk can be evaluated by conducting a Monte Carlo simulation, assuming that the occurrence of an earthquake follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, defined as y = a·x1 + b·x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We assumed appropriate probability distributions for earthquake generation, inundation height, and vulnerability. Based on these distributions, we conducted Monte Carlo simulations of 1,000,000 years. We clarified that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response surface.
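
    The sketch below (Python, assuming NumPy and SciPy) reproduces the structure of that risk chain: Poisson event occurrence, the fitted response surface y = 0.2615·x1 + 3.1763·x2 - 1.1802, and a log-normal fragility curve. The fault-parameter ranges, event rate and fragility parameters are illustrative assumptions, not the values used in the study.

```python
import numpy as np
from scipy.stats import lognorm

rng = np.random.default_rng(42)
a, b, c = 0.2615, 3.1763, -1.1802                  # coefficients from the abstract

def inundation_depth(x1, x2):
    """Response surface: x1 = fault depth, x2 = slip, y = inundation depth."""
    return a * x1 + b * x2 + c

def damage_probability(depth, median=2.0, beta=0.5):
    """Log-normal fragility curve for the wood building (parameters assumed)."""
    return lognorm.cdf(depth, s=beta, scale=median)

years = 1_000_000
annual_rate = 1 / 500                              # assumed Poisson event rate
n_events = rng.poisson(annual_rate * years)

x1 = rng.uniform(5, 25, n_events)                  # fault depth [km], assumed range
x2 = rng.uniform(0.5, 8.0, n_events)               # slip [m], assumed range
p_damage = damage_probability(inundation_depth(x1, x2))

print(f"{n_events} events simulated over {years} years")
print(f"expected damage probability given an event: {p_damage.mean():.3f}")
```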

  18. A survey on the task analysis methods and techniques for nuclear power plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifics in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in each specific domain. Operators' tasks in NPPs are supposed to be performed strictly according to written operational procedures and are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author).

  19. A survey on the task analysis methods and techniques for nuclear power plant operators

    International Nuclear Information System (INIS)

    Lee, Yong Heui; Chun, Se Woo; Suh, Sang Moon; Lee, Jung Woon

    1994-04-01

    We have surveyed techniques and methods of task analysis, from very traditional ones to recently developed ones that are being applied in various industrial fields. We compare them and analyse their fundamental characteristics and methodological specifics in order to find one suitable for application to the tasks of nuclear power plant operators. Generally, the fundamental process of task analysis is well understood, but its application in practice is not so simple, owing to the wide and varying range of applications in each specific domain. Operators' tasks in NPPs are supposed to be performed strictly according to written operational procedures and are well trained, so a method of task analysis for operators' tasks in NPPs can be established with its own characteristics, based on the operational procedures. 8 figs., 10 tabs., 18 refs. (Author)

  20. Robust Optimization in Simulation: Taguchi and Response Surface Methodology

    NARCIS (Netherlands)

    Dellino, G.; Kleijnen, J.P.C.; Meloni, C.

    2008-01-01

    Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a 'robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by response surface methodology.

  1. Development of a consistent Monte Carlo-deterministic transport methodology based on the method of characteristics and MCNP5

    International Nuclear Information System (INIS)

    Karriem, Z.; Ivanov, K.; Zamonsky, O.

    2011-01-01

    This paper presents work that has been performed to develop an integrated Monte Carlo-deterministic transport methodology in which the two methods make use of exactly the same general geometry and multigroup nuclear data. The envisioned application of this methodology is in reactor lattice physics methods development and in shielding calculations. The methodology will be based on the Method of Long Characteristics (MOC) and the Monte Carlo N-Particle transport code MCNP5. Important initial developments pertaining to ray tracing and the development of an MOC flux solver for the proposed methodology are described. Results showing the viability of the methodology are presented for two 2-D general geometry transport problems. The essential developments presented are the use of MCNP as a geometry construction and ray tracing tool for the MOC, the verification of the ray-tracing indexing scheme that was developed to represent the MCNP geometry in the MOC, and the verification of the prototype 2-D MOC flux solver. (author)
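
    To make the MOC flux solver concrete, the sketch below (Python, assuming NumPy) shows the flat-source transport sweep along a single characteristic track, which is the basic update such a solver repeats over all tracks and angles. The one-group segment lengths, cross sections and sources are illustrative, not data from the paper.

```python
import numpy as np

def sweep_track(psi_in, seg_lengths, sigma_t, q):
    """Propagate the angular flux along one track and accumulate the
    track-averaged segment fluxes (flat-source approximation)."""
    psi = psi_in
    seg_avg = np.zeros_like(seg_lengths)
    for k, (s, st, qk) in enumerate(zip(seg_lengths, sigma_t, q)):
        att = np.exp(-st * s)
        psi_out = psi * att + (qk / st) * (1.0 - att)     # segment exit flux
        # track-averaged flux over the segment (used to build the scalar flux)
        seg_avg[k] = (qk / st) + (psi - psi_out) / (st * s)
        psi = psi_out
    return psi, seg_avg

lengths = np.array([0.4, 1.2, 0.7])      # cm, segment lengths cut by the geometry
sigma_t = np.array([0.6, 1.1, 0.6])      # 1/cm, total cross sections per region
source  = np.array([0.2, 0.5, 0.2])      # isotropic flat source per region
exit_flux, seg_flux = sweep_track(0.0, lengths, sigma_t, source)
print("exit angular flux:", round(float(exit_flux), 4),
      "segment-averaged flux:", seg_flux.round(4))
```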

  2. Development of methodologies used in the areas of safeguards and nuclear forensics based on LA-HR-ICP-MS technique

    International Nuclear Information System (INIS)

    Marin, Rafael Coelho

    2013-01-01

    Environmental sampling by means of swipe samples is a methodology frequently employed by the International Atomic Energy Agency (IAEA) to verify whether the signatory States of the Safeguards Agreements are conducting unauthorized activities. Swipe sample analysis is complementary to the ordinary Safeguards procedures used to verify the information given by the States. In this work, a methodology intended to strengthen nuclear safeguards and nuclear forensics procedures is described. The proposal is to study and evaluate the laser ablation high resolution inductively coupled plasma mass spectrometry (LA-HR-ICP-MS) technique as an alternative for analyzing real-life swipe samples. The precision achieved through the standard (CRM - 125A) measurements, represented by the relative standard deviation (RSD), was 1.3%, 0.2% and 7.6%, respectively, for the 234U/238U, 235U/238U and 236U/238U isotope ratios. The percent uncertainties (u%), which cover the RSD, ranged from 3.5% to 29.8% for the 235U/238U measurements and from 16.6% to 42.9% for the 234U/238U isotope ratio. These results are compatible with former studies that used LA-HR-ICP-MS to analyze real-life swipe samples collected at a nuclear facility. Swipe samples collected from several points of the nuclear facility presented enrichment levels ranging from (2.3 ± 0.7)% (sample 3) to (17.3 ± 2.8)% (sample 18), and also allowed different enrichment levels to be detected within the facility. (author)

  3. Alternative method of highway traffic safety analysis for developing countries using delphi technique and Bayesian network.

    Science.gov (United States)

    Mbakwe, Anthony C; Saka, Anthony A; Choi, Keechoo; Lee, Young-Jae

    2016-08-01

    Highway traffic accidents all over the world result in more than 1.3 million fatalities annually. An alarming number of these fatalities occurs in developing countries. There are many risk factors that are associated with frequent accidents, heavy loss of lives, and property damage in developing countries. Unfortunately, poor record-keeping practices are a very difficult obstacle to overcome in striving to obtain near-accurate casualty and safety data. In light of the fact that there are numerous accident causes, any attempt to curb the escalating death and injury rates in developing countries must include the identification of the primary accident causes. This paper, therefore, seeks to show that the Delphi technique is a suitable alternative method that can be exploited for generating highway traffic accident data through which the major accident causes can be identified. In order to authenticate the technique used, Korea, a country that underwent similar problems when it was in its early stages of development and that has excellent highway safety records in its database, is chosen and utilized for this purpose. Validation of the methodology confirms that the technique is suitable for application in developing countries. Furthermore, the Delphi technique, in combination with the Bayesian network model, is utilized in modeling highway traffic accidents and forecasting accident rates in the countries studied. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. A Methodology for the Selection of Multi-Criteria Decision Analysis Methods in Real Estate and Land Management Processes

    Directory of Open Access Journals (Sweden)

    Maria Rosaria Guarini

    2018-02-01

    Real estate and land management are characterised by a complex, elaborate combination of technical, regulatory and governmental factors. In Europe, public administrators must address complex decision-making problems while also acting in consideration of the expectations of the different stakeholders involved in settlement transformation. In complex situations (e.g., with different aspects to be considered and multilevel actors involved), decision-making processes often rely on multidisciplinary and multidimensional analyses, which support the choices of those who are making the decision. Multi-Criteria Decision Analysis (MCDA) methods are included among the examination and evaluation techniques considered useful by the European Community. Such analyses are performed using methods that aim to synthesize the various forms of input data needed to define decision-making problems of this complexity, so that the conclusions reached allow for informed, well-thought-out, strategic decisions. According to the technical literature on MCDA, numerous methods are applicable in different decision-making situations; however, guidance for selecting the most appropriate method for the specific field of application and problem has not been thoroughly investigated. In land and real estate management, numerous evaluation questions often arise. In brief, the objective of this paper is to outline a procedure for selecting the method best suited to the specific evaluation questions that commonly arise when addressing decision-making problems, in particular issues of land and real estate management, representing the so-called "settlement sector". The procedure follows a theoretical-methodological approach by formulating a taxonomy of the endogenous and exogenous variables of the multi-criteria analysis methods.
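
    As a point of reference for readers new to MCDA, the sketch below (Python, assuming NumPy) applies a weighted-sum model, one of the simplest MCDA methods, to a toy choice problem. The alternatives, criteria, weights and performance values are purely illustrative and do not come from the paper, whose contribution is the taxonomy for choosing among such methods.

```python
import numpy as np

criteria = ["cost", "social acceptance", "technical feasibility"]
weights = np.array([0.5, 0.3, 0.2])           # criterion weights, summing to 1
# rows: alternatives, columns: criteria (raw performance values)
performance = np.array([
    [120.0, 6.0, 8.0],    # alternative A
    [ 90.0, 4.0, 7.0],    # alternative B
    [150.0, 9.0, 5.0],    # alternative C
])
benefit = np.array([False, True, True])       # cost is to be minimized

# Min-max normalization, flipping criteria that should be minimized
lo, hi = performance.min(axis=0), performance.max(axis=0)
norm = (performance - lo) / (hi - lo)
norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

scores = norm @ weights
for name, s in zip("ABC", scores):
    print(f"alternative {name}: score {s:.3f}")
print("preferred alternative:", "ABC"[int(scores.argmax())])
```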

  5. Comparison of economic evaluation methodology between levelized method and the evaluation system in China

    International Nuclear Information System (INIS)

    Zhang Shengli

    2005-01-01

    Different methodologies bring different results. This paper introduces the levelized discounted generation cost methodology as well as the Chinese evaluation system. In general, there are two key indices in the Chinese evaluation system: the generation cost and the electricity sales price to the grid. This paper contains a description of the cost breakdown and the calculation procedure for each index. A comparison between these two methods and their primary differences are also included. For the first time, equations for calculating the generation cost and the selling price to the grid based on the Chinese system have been derived, and their accuracy has been demonstrated by running a dedicated computer program. The two systems are quite different in many aspects. Firstly, the levelized generation cost is always calculated with a discounting method, which is excluded in the Chinese system. Secondly, the levelized generation cost is a single, constant value that does not change over the economic life, while the generation cost in the Chinese system is estimated on a year-by-year basis. Thirdly, the makeup of the generation cost in the Chinese system is different from that of the levelized system, since taxes and the dividend share are removed. Finally, the electricity sales price in the Chinese system is more similar to the levelized generation cost. (authors)
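
    For reference, the sketch below (Python, assuming NumPy) computes a levelized discounted generation cost in its usual form: total discounted costs divided by total discounted generation. The cash flows, generation profile and discount rate are illustrative numbers, not data from the paper.

```python
import numpy as np

def levelized_cost(costs, generation, rate):
    """costs[t] and generation[t] are per-year values, t = 0..N-1."""
    t = np.arange(len(costs))
    discount = (1.0 + rate) ** -t
    return np.sum(costs * discount) / np.sum(generation * discount)

years = 30
costs = np.full(years, 120.0)       # annual O&M + fuel, monetary units
costs[0] += 3000.0                  # overnight capital charged in year 0
generation = np.full(years, 800.0)  # annual net generation, e.g. GWh
generation[0] = 0.0                 # no output during the construction year

lgc = levelized_cost(costs, generation, rate=0.08)
print(f"levelized generation cost: {lgc:.2f} per unit of energy")
```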

  6. Skin sparing mastectomy: Technique and suggested methods of reconstruction

    International Nuclear Information System (INIS)

    Farahat, A.M.; Hashim, T.; Soliman, H.O.; Manie, T.M.; Soliman, O.M.

    2014-01-01

    To demonstrate the feasibility and accessibility of performing adequate mastectomy to extirpate the breast tissue, along with en bloc formal axillary dissection performed from within the same incision. We also compared different methods of immediate breast reconstruction used to fill the skin envelope to achieve the best aesthetic results. Methods: 38 patients with breast cancer underwent skin-sparing mastectomy with formal axillary clearance, through a circum-areolar incision. Immediate breast reconstruction was performed using different techniques to fill in the skin envelope. Two reconstruction groups were assigned; group 1: autologous tissue transfer only (n=24), and group 2: implant augmentation (n=14). Autologous tissue transfer: the techniques used included filling in the skin envelope using an extended latissimus dorsi (LD) flap (18 patients) and a pedicled TRAM flap (6 patients). Augmentation with implants: subpectoral implants (4 patients), a rounded implant placed under the pectoralis major muscle to augment an LD-reconstructed breast; LD pocket (10 patients), an anatomical implant placed over the pectoralis major muscle within a pocket created by the LD flap. No contralateral procedure was performed in any of the cases to achieve symmetry. Results: All cases underwent adequate excision of the breast tissue along with en bloc complete axillary clearance (when indicated), without the need for an additional axillary incision. Eighteen patients underwent reconstruction using extended LD flaps only, six had TRAM flaps, four had augmentation using implants placed below the pectoralis muscle along with LD flaps, and ten had implants placed within the LD pocket. Breast shape, volume and contour were successfully restored in all patients. An adequate degree of ptosis was achieved, to ensure maximal symmetry. Conclusions: Skin-sparing mastectomy through a circum-areolar incision has proven to be a safe and feasible option for the management of breast cancer in Egyptian patients.

  7. Using Mixed Methods to Evaluate a Community Intervention for Sexual Assault Survivors: A Methodological Tale.

    Science.gov (United States)

    Campbell, Rebecca; Patterson, Debra; Bybee, Deborah

    2011-03-01

    This article reviews current epistemological and design issues in the mixed methods literature and then examines the application of one specific design, a sequential explanatory mixed methods design, in an evaluation of a community-based intervention to improve postassault care for sexual assault survivors. Guided by a pragmatist epistemological framework, this study collected quantitative and qualitative data to understand how the implementation of a Sexual Assault Nurse Examiner (SANE) program affected prosecution rates of adult sexual assault cases in a large midwestern community. Quantitative results indicated that the program was successful in affecting legal systems change and the qualitative data revealed the mediating mechanisms of the intervention's effectiveness. Challenges of implementing this design are discussed, including epistemological and practical difficulties that developed from blending methodologies into a single project. © The Author(s) 2011.

  8. Linking the Organizational Forms Teachers and Teaching Methods in a Class Instructional Methodology

    Directory of Open Access Journals (Sweden)

    Graciela Nápoles-Quiñones

    2016-05-01

    A descriptive study was conducted to show the link between teachers' organizational forms and teaching methods, to expose the pedagogical theory, and to deepen the teaching-learning process through the methodological class. The main content of teachers' work is preparation and professional development, which requires the selection and use of working methods, means and procedures in accordance with the real and objective conditions of the staff receiving the action and conducive to teaching work. Teachers should be aware that they need to master the content they teach, know the level of development of their students and the specific characteristics of the group and of each student, and be competent in relating the content they teach to reality.

  9. Single Case Method in Psychology: How to Improve as a Possible Methodology in Quantitative Research.

    Science.gov (United States)

    Krause-Kjær, Elisa; Nedergaard, Jensine I

    2015-09-01

    Awareness of including the Single-Case Method (SCM) as a possible methodology in quantitative research in the field of psychology has been argued to be useful, e.g., by Hurtado-Parrado and López-López (IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). Their article introduces a historical and conceptual analysis of SCMs and proposes changing the often prevailing tendency of neglecting SCM as an alternative to Null Hypothesis Significance Testing (NHST). This article contributes by putting a new light on SCM as an equally important methodology in psychology. The intention of the present article is to elaborate this point of view further by discussing one of the most fundamental requirements as well as main characteristics of SCM regarding temporality, namely that "…performance is assessed continuously over time and under different conditions…" (Hurtado-Parrado and López-López, IPBS: Integrative Psychological & Behavioral Science, 49:2, 2015). When defining principles for particular units of analysis, both synchronic (spatial) and diachronic (temporal) elements should be incorporated. In this article, misunderstandings of the SCM are addressed, and temporality is further described in order to propose how the SCM could be of greater use in psychological research. It is further discussed how to implement SCM in psychological methodology. It is suggested that one solution might be to reconsider the notion of time in psychological research to cover more than a variable of control and, in this respect, also to include the notion of time as an irreversible unity within life.

  10. A methodology for on line fatigue life monitoring : rainflow cycle counting method

    International Nuclear Information System (INIS)

    Mukhopadhyay, N.K.; Dutta, B.K.; Kushwaha, H.S.

    1992-01-01

    The Green's function technique is used in on-line fatigue life monitoring to convert plant data into stress-versus-time data; it performs this conversion very efficiently. To compute the fatigue usage factor, the actual number of cycles experienced by the component must be determined from the stress-versus-time data. The fatigue usage factor is then computed from the number of cycles using the material fatigue properties. Generally, the stress response is very irregular in nature. To convert an irregular stress history into stress frequency spectra, the rainflow cycle counting method is used. This method has proved superior to other counting methods and yields the best fatigue estimates. A code has been developed which computes the number of cycles experienced by the component from the stress-time history using the rainflow cycle counting method. This postprocessor also computes the accumulated fatigue usage factor from the material fatigue properties. The present report describes the development of a code to compute the fatigue usage factor using the rainflow cycle counting technique and presents a real-life case study. (author). 10 refs., 10 figs
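
    A minimal sketch of the counting step is given below (Python, NumPy used only for the demo signal): turning points are extracted from the stress history and the classic three-point rainflow rule closes full cycles, with the residue counted as half cycles. This simplified version omits some details of the standardized (ASTM E1049-style) algorithm and is not the code described in the report.

```python
import numpy as np

def turning_points(series):
    """Reduce a stress history to its sequence of reversals (peaks/valleys)."""
    pts = [series[0]]
    for x in series[1:]:
        if x == pts[-1]:
            continue
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x                      # same direction: extend the excursion
        else:
            pts.append(x)
    return pts

def rainflow(series):
    """Return a list of (stress range, mean stress, count) tuples."""
    cycles, stack = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])   # range of the newest excursion
            y = abs(stack[-2] - stack[-3])   # range of the previous excursion
            if x < y:
                break
            cycles.append((y, 0.5 * (stack[-2] + stack[-3]), 1.0))  # full cycle
            del stack[-3:-1]                 # remove the two points that closed it
    for a, b in zip(stack, stack[1:]):       # leftover reversals: half cycles
        cycles.append((abs(a - b), 0.5 * (a + b), 0.5))
    return cycles

gen = np.random.default_rng(3)
stress = np.cumsum(gen.normal(size=200))     # irregular stress-like history
for srange, smean, count in rainflow(stress)[:5]:
    print(f"range {srange:6.2f}  mean {smean:6.2f}  count {count}")
```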

  11. Cell synchrony techniques. I. A comparison of methods

    Energy Technology Data Exchange (ETDEWEB)

    Grdina, D.J.; Meistrich, M.L.; Meyn, R.E.; Johnson, T.S.; White, R.A.

    1984-01-01

    Selected cell synchrony techniques, as applied to asynchronous populations of Chinese hamster ovary (CHO) cells, have been compared. Aliquots from the same culture of exponentially growing cells were synchronized using mitotic selection, mitotic selection plus hydroxyurea block, centrifugal elutriation, or an EPICS V cell sorter. Sorting of cells was achieved after staining the cells with Hoechst 33258. After synchronization by the various methods, the relative distribution of cells in the G1, S, or G2 + M phases of the cell cycle was determined by flow cytometry. Fractions of synchronized cells obtained from each method were replated and allowed to progress through a second cell cycle. Mitotic selection gave rise to relatively pure and unperturbed early G1 phase cells. While cell synchrony rapidly dispersed with time, cells progressed through the cell cycle in 12 hr. Sorting with the EPICS V on the modal G1 peak yielded a relatively pure but heterogeneous G1 population (i.e., early to late G1). Again, synchrony dispersed with time, but cell-cycle progression required 14 hr. With centrifugal elutriation, several different cell populations synchronized throughout the cell cycle could be rapidly obtained with a purity comparable to mitotic selection and cell sorting. It was concluded that, either alone or in combination with blocking agents such as hydroxyurea, elutriation and mitotic selection were both excellent methods for synchronizing CHO cells. Cell sorting exhibited limitations in sample size and in the time required for synchronizing CHO cells. Its major advantage is its ability to isolate cell populations that are unique with respect to selected cellular parameters. 19 references, 9 figures.

  12. MAIA - Method for Architecture of Information Applied: methodological construct of information processing in complex contexts

    Directory of Open Access Journals (Sweden)

    Ismael de Moura Costa

    2017-04-01

    Introduction: This paper presents the evolution of MAIA, the Method for Architecture of Information Applied, its structure, the results obtained, and three practical applications. Objective: To propose a methodological construct for the treatment of complex information, distinguishing information spaces and revealing the configurations inherent in those spaces. Methodology: The argument is elaborated from theoretical research of an analytical character, using distinction as a way to express concepts. Phenomenology is used as the philosophical position, which considers the correlation between Subject↔Object. The research also considers the notion of interpretation as an integrating element for the definition of concepts. With these postulates, the steps to transform the information spaces are formulated. Results: The article shows how the method is structured to process information in its contexts, starting from a succession of evolutionary cycles, divided into moments, which in turn evolve into transformation acts. Conclusions: Besides showing how the method is structured, the article presents its possible applications as a scientific method, as a configuration tool for information spaces, and as a generator of ontologies. Last but not least, it presents a brief summary of the analyses made by researchers who have already evaluated the method with respect to these three aspects.

  13. Sucevița Monastery – The overpainting from narthex and exonarthex. Technique of execution and the methodology of removal

    Directory of Open Access Journals (Sweden)

    Georgiana Zahariea

    2015-11-01

    The overpaintings found on the frescoes of Sucevița Monastery date from periods that remain uncertain owing to the scarcity of written documents. They were probably made because of degradations that occurred over time or because of changing tastes. These interventions were made in oil or in tempera, and for that reason they can be dated to around the 19th century, when there was a tendency to paint Orthodox churches in oil. The overpaintings are placed in key positions, on the lunettes, facilitating access from the exonarthex to the narthex or from the narthex to the tomb room. In terms of iconography, the overpaintings covered representations such as: the Anastasis / the Resurrection; the Holy Trinity of the New Testament; and the Virgin Mary with the thief represented in heaven (a detail from the Last Judgement). The present paper makes a comparison among the three overpainted surfaces, bringing technical arguments regarding the differences between them. At the same time, the paper presents details about the methodology applied to cleaning the overpaintings and highlights the original image, which can add nuance to the iconographic interpretation.

  14. Developments in FT-ICR MS instrumentation, ionization techniques, and data interpretation methods for petroleomics.

    Science.gov (United States)

    Cho, Yunju; Ahmed, Arif; Islam, Annana; Kim, Sunghwan

    2015-01-01

    Because of the increasing importance of heavy and unconventional crude oil as an energy source, there is a growing need for petroleomics: the pursuit of more complete and detailed knowledge of the chemical compositions of crude oil. Crude oil has an extremely complex nature; hence, techniques with ultra-high resolving capabilities, such as Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS), are necessary. FT-ICR MS has been successfully applied to the study of heavy and unconventional crude oils such as bitumen and shale oil. However, the analysis of crude oil with FT-ICR MS is not trivial, and it has pushed analysis to the limits of instrumental and methodological capabilities. For example, high-resolution mass spectra of crude oils may contain over 100,000 peaks that require interpretation. To visualize large data sets more effectively, data processing methods such as Kendrick mass defect analysis and statistical analyses have been developed. The successful application of FT-ICR MS to the study of crude oil has been critically dependent on key developments in FT-ICR MS instrumentation and data processing methods. This review offers an introduction to the basic principles, FT-ICR MS instrumentation development, ionization techniques, and data interpretation methods for petroleomics and is intended for readers having no prior experience in this field of study. © 2014 Wiley Periodicals, Inc.
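
    Because Kendrick mass defect (KMD) analysis is mentioned as one of the key data-processing steps, a minimal sketch is given below (plain Python). It rescales IUPAC masses to the CH2-based Kendrick scale and reports the mass defect; members of a CH2 homologous series then share nearly the same value. The sign convention and the example m/z values are illustrative.

```python
CH2_NOMINAL, CH2_EXACT = 14.00000, 14.01565

def kendrick_mass_defect(mz):
    """KMD for a CH2 repeating unit (one common sign convention)."""
    kendrick_mass = mz * (CH2_NOMINAL / CH2_EXACT)   # rescale the IUPAC mass
    nominal = round(kendrick_mass)
    return nominal - kendrick_mass

peaks = [254.2246, 282.2559, 310.2872, 301.2162]     # illustrative m/z values
for mz in peaks:
    print(f"m/z {mz:9.4f}  KMD {kendrick_mass_defect(mz):+.4f}")
# Peaks belonging to the same CH2 homologous series cluster at the same KMD.
```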

  15. An Improved Clutter Suppression Method for Weather Radars Using Multiple Pulse Repetition Time Technique

    Directory of Open Access Journals (Sweden)

    Yingjie Yu

    2017-01-01

    This paper describes the implementation of an improved clutter suppression method for the multiple pulse repetition time (PRT) technique based on simulated radar data. The suppression method is constructed using maximum likelihood methodology in the time domain and is called the parametric time domain method (PTDM). The procedure relies on the assumption that the precipitation and clutter signal spectra follow a Gaussian functional form. The multiple interleaved pulse repetition frequencies (PRFs) used in this work are set to four values (952, 833, 667, and 513 Hz). Based on radar simulation, it is shown that the new method can provide accurate retrieval of Doppler velocity even in the case of strong clutter contamination. The obtained velocity is nearly unbiased over the whole Nyquist velocity interval. Also, the performance of the method is illustrated on simulated radar data for a plan position indicator (PPI) scan. Compared with staggered 2-PRT transmission schemes with PTDM, the proposed method presents better estimation accuracy under certain clutter situations.

  16. A methodological framework applied to the choice of the best method in replacement of nuclear systems

    International Nuclear Information System (INIS)

    Vianna Filho, Alfredo Marques

    2009-01-01

    The economic equipment replacement problem is a central question in Nuclear Engineering. On the one hand, new equipment is more attractive given its better performance, better reliability, lower maintenance cost, etc. New equipment, however, requires a higher initial investment. On the other hand, old equipment presents the opposite picture, with lower performance, lower reliability and especially higher maintenance costs, but in contrast lower financial and insurance costs. The weighting of all these costs can be done with deterministic and probabilistic methods applied to the study of equipment replacement. Two distinct types of problems are examined: substitution imposed by wear and substitution imposed by failures. To solve the problem of nuclear system substitution imposed by wear, deterministic methods are discussed. To solve the problem of nuclear system substitution imposed by failures, probabilistic methods are discussed. The aim of this paper is to present a methodological framework for choosing the most useful method for the problem of nuclear system substitution. (author)
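
    A common deterministic criterion for wear-driven replacement is to pick the service life that minimizes the equivalent annual cost (EAC), balancing capital recovery against rising maintenance. The sketch below (Python, assuming NumPy) illustrates that calculation; the purchase price, maintenance growth, salvage curve and discount rate are illustrative assumptions, not figures from the paper.

```python
import numpy as np

price = 100_000.0                          # purchase price of the new equipment
rate = 0.08                                # discount rate
maintenance = lambda t: 5_000.0 * 1.3**t   # annual maintenance cost in year t (growing)
salvage = lambda t: price * 0.6**t         # resale value after t years of service

def equivalent_annual_cost(keep_years):
    """EAC of buying now and keeping the equipment for keep_years years."""
    t = np.arange(keep_years)
    pv_maint = np.sum(maintenance(t) / (1 + rate) ** (t + 1))
    pv_total = price + pv_maint - salvage(keep_years) / (1 + rate) ** keep_years
    annuity = rate / (1 - (1 + rate) ** -keep_years)   # capital recovery factor
    return pv_total * annuity

eac = {n: equivalent_annual_cost(n) for n in range(1, 11)}
best = min(eac, key=eac.get)
print(f"optimal replacement age: {best} years (EAC {eac[best]:,.0f})")
```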

  17. Damage detection methodology under variable load conditions based on strain field pattern recognition using FBGs, nonlinear principal component analysis, and clustering techniques

    Science.gov (United States)

    Sierra-Pérez, Julián; Torres-Arredondo, M.-A.; Alvarez-Montoya, Joham

    2018-01-01

    Structural health monitoring consists of using sensors integrated within structures together with algorithms to perform load monitoring, damage detection, damage location, damage size and severity assessment, and prognosis. One possibility is to use strain sensors to infer structural integrity by comparing patterns in the strain field between the pristine and damaged conditions. In previous works, the authors have demonstrated that it is possible to detect small defects based on strain field pattern recognition by using robust machine learning techniques. They have focused on methodologies based on principal component analysis (PCA) and on the development of several unfolding and standardization techniques, which allow dealing with multiple load conditions. However, before a real implementation of this approach in engineering structures, changes in the strain field due to conditions other than damage occurrence need to be isolated. Since load conditions may vary in most engineering structures and promote significant changes in the strain field, it is necessary to implement novel techniques for uncoupling such changes from those produced by damage occurrence. A damage detection methodology based on optimal baseline selection (OBS) by means of clustering techniques is presented. The methodology includes the use of hierarchical nonlinear PCA as a nonlinear modeling technique in conjunction with the Q and nonlinear T2 damage indices. The methodology is experimentally validated using strain measurements obtained by 32 fiber Bragg grating sensors bonded to an aluminum beam under dynamic bending loads and simultaneously submitted to variations in its pitch angle. The results demonstrated the capability of the methodology for clustering data according to 13 different load conditions (pitch angles), performing the OBS and detecting six different damages induced in a cumulative way. The proposed methodology showed a true positive rate of 100% and a false positive rate of 1.28%.
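
    The sketch below (Python, assuming NumPy) shows how the two indices named above are typically computed from a PCA baseline model: Q is the squared residual outside the retained subspace and T2 is Hotelling's statistic on the retained scores. Plain linear PCA and synthetic strain data are used here for brevity; the paper itself uses hierarchical nonlinear PCA with FBG measurements.

```python
import numpy as np

rng = np.random.default_rng(7)
baseline = rng.normal(size=(500, 32))              # pristine strain snapshots (32 sensors)
mean, std = baseline.mean(0), baseline.std(0)
Z = (baseline - mean) / std                        # standardized baseline data

# PCA model of the baseline via SVD
U, S, Vt = np.linalg.svd(Z, full_matrices=False)
k = 5                                              # retained principal components
P = Vt[:k].T                                       # loading matrix (32 x k)
var = (S[:k] ** 2) / (len(Z) - 1)                  # variance of each retained score

def damage_indices(x):
    """Q (squared prediction error) and Hotelling's T2 for one new sample."""
    z = (x - mean) / std
    t = z @ P                                      # scores in the principal subspace
    residual = z - t @ P.T
    q_index = residual @ residual
    t2_index = np.sum(t**2 / var)
    return q_index, t2_index

healthy = rng.normal(size=32)
damaged = healthy.copy(); damaged[10:14] += 3.0    # local strain anomaly
for label, x in [("healthy", healthy), ("damaged", damaged)]:
    q, t2 = damage_indices(x)
    print(f"{label:8s}  Q = {q:7.2f}   T2 = {t2:7.2f}")
```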

  18. The methods for generating tomographic images using transmission, emission and nuclear magnetic resonance techniques. II. Fourier method and iterative methods

    International Nuclear Information System (INIS)

    Ursu, I.; Demco, D.E.; Gligor, T.D.; Pop, G.; Dollinger, R.

    1987-01-01

    In a wide variety of applications it is necessary to infer the structure of a multidimensional object from a set of its projections. Computed tomography is at present widely used in the medical field, but its industrial applications may ultimately far exceed the medical ones. Two techniques for reconstructing objects from their projections are presented: Fourier methods and iterative techniques. The paper also contains a brief comparative study of the reconstruction algorithms. (authors)
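
    As a toy illustration of the iterative family (an assumption on my part, not the paper's algorithm), the Kaczmarz/ART update below sweeps over the projection equations A x = p and corrects the current estimate ray by ray.

```python
# Algebraic reconstruction technique (Kaczmarz sweeps) on a tiny consistent system.
import numpy as np

def art(A, p, iterations=50, relaxation=0.5):
    x = np.zeros(A.shape[1])
    for _ in range(iterations):
        for a_i, p_i in zip(A, p):           # one correction per "ray"
            x += relaxation * (p_i - a_i @ x) / (a_i @ a_i) * a_i
    return x

A = np.array([[1.0, 1.0, 0.0],               # toy projection matrix
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
true_object = np.array([1.0, 2.0, 3.0])
print(art(A, A @ true_object))               # should approach [1, 2, 3]
```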

  19. Weighted graph based ordering techniques for preconditioned conjugate gradient methods

    Science.gov (United States)

    Clift, Simon S.; Tang, Wei-Pai

    1994-01-01

    We describe the basis of a matrix ordering heuristic for improving the incomplete factorization used in preconditioned conjugate gradient techniques applied to anisotropic PDEs. Several new matrix ordering techniques, derived from well-known algorithms in combinatorial graph theory, which attempt to implement this heuristic, are described. These ordering techniques are tested against a number of matrices arising from linear anisotropic PDEs, and compared with other matrix ordering techniques. A variation of RCM is shown to generally improve the quality of incomplete factorization preconditioners.
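
    A minimal sketch of the general idea using SciPy: reorder a toy anisotropic matrix with reverse Cuthill-McKee, build an incomplete factorization preconditioner on the reordered matrix, and solve with conjugate gradients. The paper's specific weighted-graph orderings are not reproduced here.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee
from scipy.sparse.linalg import spilu, LinearOperator, cg

n = 100
A = sp.diags([4.0, -1.0, -1.0, -0.2, -0.2], [0, -1, 1, -10, 10],
             shape=(n, n), format="csr")     # toy SPD matrix with anisotropic couplings
perm = reverse_cuthill_mckee(A, symmetric_mode=True)
A_rcm = A[perm][:, perm]                     # symmetric permutation of rows and columns

ilu = spilu(A_rcm.tocsc(), drop_tol=1e-3)    # incomplete factorization as preconditioner
M = LinearOperator(A_rcm.shape, ilu.solve)
x, info = cg(A_rcm, np.ones(n), M=M, atol=1e-10)
print("converged" if info == 0 else f"cg stopped with info={info}")
```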

  20. Methodological issues about techniques for the spiking of standard OECD soil with nanoparticles: evidence of different behaviours

    International Nuclear Information System (INIS)

    Miglietta, Maria Lucia; Rametta, Gabriella; Manzo, Sonia; Salluzzo, Antonio; Rimauro, Juri; Francia, Girolamo Di

    2015-01-01

    The aim of this study is to investigate to what extent the results of standard nanoparticle (NP) toxicity testing methodologies are affected by the different exposure procedures on soil organisms. In this view, differences in the physicochemical properties of ZnO NPs (<100 nm), bulk ZnO (<200 nm) and ionic zinc (ZnCl2), and their ecotoxicological potential toward Lepidium sativum, were investigated with respect to three different spiking methods. Results show that the spiking procedures give a homogeneous distribution of the testing nanomaterial in soil, but the physicochemical and ecotoxicological properties of the tested species differ according to the spiking procedure. Dry spiking produced the highest ZnO solubility, whereas spiking through dispersions of ZnO in water and in aqueous soil extracts produced the lowest. At the same time, the ecotoxic effects showed different trends with regard to the spiking route. The need for a definition of agreed methods concerning NP spiking procedures is, therefore, urgent.

  1. Methodological issues about techniques for the spiking of standard OECD soil with nanoparticles: evidence of different behaviours

    Energy Technology Data Exchange (ETDEWEB)

    Miglietta, Maria Lucia, E-mail: mara.miglietta@enea.it; Rametta, Gabriella; Manzo, Sonia; Salluzzo, Antonio; Rimauro, Juri; Francia, Girolamo Di [ENEA, Portici Technical Unit, C.R. Portici (Italy)

    2015-07-15

    The aim of this study is to investigate to what extent the results of standard nanoparticle (NP) toxicity testing methodologies are affected by the different exposure procedures on soil organisms. In this view, differences in the physicochemical properties of ZnO NPs (<100 nm), bulk ZnO (<200 nm) and ionic zinc (ZnCl2), and their ecotoxicological potential toward Lepidium sativum, were investigated with respect to three different spiking methods. Results show that the spiking procedures give a homogeneous distribution of the testing nanomaterial in soil, but the physicochemical and ecotoxicological properties of the tested species differ according to the spiking procedure. Dry spiking produced the highest ZnO solubility, whereas spiking through dispersions of ZnO in water and in aqueous soil extracts produced the lowest. At the same time, the ecotoxic effects showed different trends with regard to the spiking route. The need for a definition of agreed methods concerning NP spiking procedures is, therefore, urgent.

  2. Diffusion tensor trace mapping in normal adult brain using single-shot EPI technique: A methodological study of the aging brain

    International Nuclear Information System (INIS)

    Chen, Z.G.; Hindmarsh, T.; Li, T.Q.

    2001-01-01

    Purpose: To quantify age-related changes of the average diffusion coefficient value in normal adult brain using orientation-independent diffusion tensor trace mapping and to address the methodological influences on diffusion quantification. Material and Methods: Fifty-four normal subjects (aged 20-79 years) were studied on a 1.5-T whole-body MR unit using a diffusion-weighted single-shot echo-planar imaging technique. Orientation-independent diffusion tensor trace maps were constructed for each subject from diffusion-weighted MR measurements in four different directions with a tetrahedral gradient combination pattern. The global average (including cerebrospinal fluid) and the tissue average of diffusion coefficients in adult brains were determined by analyzing the diffusion coefficient distribution histogram for the entire brain. Methodological influences on the measured diffusion coefficient were also investigated by comparing the results obtained using different experimental settings. Results: Both global and tissue averages of the diffusion coefficient are significantly correlated with age (p<0.03). The global average of the diffusion coefficient increases 3% per decade after the age of 40, whereas the increase in the tissue average of the diffusion coefficient is about 1% per decade. Experimental settings for self-diffusion measurements, such as data acquisition methods and number of b-values, can slightly influence the statistical distribution histogram of the diffusion tensor trace and its average value. Conclusion: The increase of the average diffusion coefficient in the adult brain with aging is consistent with structural changes in the brain that have been associated with aging. The study also demonstrates that it is desirable to use the same experimental parameters for diffusion coefficient quantification when comparing different subjects and groups of interest.
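
    The orientation-independent average diffusion coefficient follows from the tensor trace: with four tetrahedral encoding directions, the mean of the per-direction apparent diffusion coefficients equals Tr(D)/3. The snippet below merely checks that identity on a synthetic tensor; the b-value, signals and tensor values are hypothetical.

```python
import numpy as np

dirs = np.array([[1, 1, 1], [1, -1, -1], [-1, 1, -1], [-1, -1, 1]], float) / np.sqrt(3)

def adc(signal_b, signal_0, b):
    """Apparent diffusion coefficient from S_b = S_0 * exp(-b * ADC)."""
    return -np.log(signal_b / signal_0) / b

D = np.diag([0.9e-3, 0.7e-3, 0.5e-3])        # synthetic diffusion tensor [mm^2/s]
b, S0 = 1000.0, 100.0
adcs = [adc(S0 * np.exp(-b * n @ D @ n), S0, b) for n in dirs]
print("mean ADC:", np.mean(adcs), " trace/3:", np.trace(D) / 3)
```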

  3. Building block method: a bottom-up modular synthesis methodology for distributed compliant mechanisms

    Directory of Open Access Journals (Sweden)

    G. Krishnan

    2012-03-01

    Full Text Available Existing approaches to synthesizing topologies of compliant mechanisms are based on rigid-link kinematic designs or completely automated optimization techniques. These designs yield mechanisms that match the kinematic specifications as a whole, but seldom yield user insight on how each constituent member contributes towards the overall mechanism performance. This paper reviews recent developments in building block based design of compliant mechanisms. A key aspect of such a methodology is formulating a representation of compliance at (i) a single unique point of interest in terms of geometric quantities such as ellipses and vectors, and (ii) relative compliance between distinct input(s) and output(s) in terms of load flow. This geometric representation provides a direct mapping between the mechanism geometry and its behavior, and is used to characterize simple deformable members that form a library of building blocks. The design space spanned by the building block library guides the decomposition of a given problem specification into tractable sub-problems that can each be solved from an entry in the library. The effectiveness of this geometric representation aids user insight in design, and enables discovery of trends and guidelines to obtain practical conceptual designs.

   4. Theoretical, methodological and methodical bases of structural policy of territorial subjects of the Russian Federation

    Directory of Open Access Journals (Sweden)

    Valentina Sergeevna Antonyuk

    2013-03-01

    Full Text Available In this article, the content of various points of view on the category of 'structural policy' is examined. The authors' own view is the following: structural policy is understood as a subsystem of the state's social and economic policy, called upon to manage the development of economic sectors together with private business, to distribute financial resources between sectors, and to control the use of those resources for the goals relevant to a given historical stage, using administrative, regulatory and financial instruments. The methodological basis of structural policy is defined; its functions, target system, subjects and objects are revealed; and the principles and the classification of structural policy instruments are specified. In the authors' view, the target reference point of structural policy should be regional sectoral shifts that promote progressive changes in the sectoral structure of a region towards the formation of the fifth and sixth technological paradigms and that increase the diversification of production by stimulating innovative change. Monospecialized regions are the most sensitive to tactical and technological fluctuations and the most vulnerable economically. In this connection, a technique for carrying out structural policy in monospecialized subjects of the Russian Federation, taking into account shifts in the branches of their industrial specialization, is proposed.

  5. Optimization of MR fluid Yield stress using Taguchi Method and Response Surface Methodology Techniques

    Science.gov (United States)

    Mangal, S. K.; Sharma, Vivek

    2018-02-01

    Magnetorheological (MR) fluids belong to a class of smart materials whose rheological characteristics, such as yield stress and viscosity, change in the presence of an applied magnetic field. In this paper, optimization of the MR fluid constituents is carried out with the on-state yield stress as the response parameter. For this, 18 samples of MR fluid are prepared using an L18 orthogonal array. These samples are experimentally tested on a developed and fabricated electromagnet setup. It has been found that the yield stress of the MR fluid mainly depends on the volume fraction of the iron particles and the type of carrier fluid used in it. The optimal combination of the input parameters is found to be mineral oil as carrier fluid at 67% by volume, 300-mesh iron powder at 32% by volume, oleic acid at 0.5% by volume and tetra-methyl-ammonium hydroxide at 0.7% by volume. This optimal combination of input parameters gives a numerically predicted on-state yield stress of 48.197 kPa. An experimental confirmation test on the optimized MR fluid sample was then carried out, and the measured response matches the numerically obtained value quite well (less than 1% error).
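
    For orientation only, the snippet below computes the Taguchi "larger-the-better" signal-to-noise ratio that is conventionally used to rank runs when the response, here the on-state yield stress, is to be maximised; the replicate values are hypothetical and not taken from the paper.

```python
import numpy as np

def sn_larger_the_better(replicates):
    """Taguchi larger-the-better S/N ratio: -10*log10(mean(1/y^2))."""
    y = np.asarray(replicates, float)
    return -10.0 * np.log10(np.mean(1.0 / y ** 2))

# hypothetical yield-stress replicates [kPa] for three runs of an L18 array
runs = {"run 1": [31.2, 30.5, 32.0], "run 2": [47.8, 48.4, 47.9], "run 3": [22.1, 21.6, 22.7]}
for name, reps in runs.items():
    print(name, "S/N =", round(sn_larger_the_better(reps), 2), "dB")
```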

  6. VALU, AVX and GPU acceleration techniques for parallel FDTD methods

    CERN Document Server

    Yu, Wenhua

    2013-01-01

    This book introduces a general hardware acceleration technique that can significantly speed up FDTD simulations and their applications to engineering problems without requiring any additional hardware devices. This acceleration of complex problems can save both time and money and, once learned, these new techniques can be used repeatedly.

  7. Unstructured characteristic method embedded with variational nodal method using domain decomposition techniques

    Energy Technology Data Exchange (ETDEWEB)

    Girardi, E.; Ruggieri, J.M. [CEA Cadarache (DER/SPRC/LEPH), 13 - Saint-Paul-lez-Durance (France). Dept. d' Etudes des Reacteurs; Santandrea, S. [CEA Saclay, Dept. Modelisation de Systemes et Structures DM2S/SERMA/LENR, 91 - Gif sur Yvette (France)

    2005-07-01

    This paper describes a recently-developed extension of our 'Multi-methods, multi-domains' (MM-MD) method for the solution of the multigroup transport equation. Based on a domain decomposition technique, our approach allows us to treat the one-group equation by cooperatively employing several numerical methods together. In this work, we describe the coupling between the Method of Characteristics (integro-differential equation, unstructured meshes) and the Variational Nodal Method (even-parity equation, cartesian meshes). Then, the coupling method is applied to the benchmark model of the Phebus experimental facility (CEA Cadarache). Our domain decomposition method gives us the capability to employ a very fine mesh, with an appropriate numerical method (MOC), in describing a particular fuel bundle, while using a much larger mesh size in the rest of the core, in conjunction with a coarse-mesh method (VNM). This application shows the benefits of our MM-MD approach in terms of accuracy and computing time: the domain decomposition method allows us to reduce the CPU time while preserving a good accuracy of the neutronic indicators: reactivity, core-to-bundle power coupling coefficient and flux error. (authors)

  8. Unstructured characteristic method embedded with variational nodal method using domain decomposition techniques

    International Nuclear Information System (INIS)

    Girardi, E.; Ruggieri, J.M.

    2005-01-01

    This paper describes a recently-developed extension of our 'Multi-methods, multi-domains' (MM-MD) method for the solution of the multigroup transport equation. Based on a domain decomposition technique, our approach allows us to treat the one-group equation by cooperatively employing several numerical methods together. In this work, we describe the coupling between the Method of Characteristics (integro-differential equation, unstructured meshes) and the Variational Nodal Method (even-parity equation, cartesian meshes). Then, the coupling method is applied to the benchmark model of the Phebus experimental facility (CEA Cadarache). Our domain decomposition method gives us the capability to employ a very fine mesh, with an appropriate numerical method (MOC), in describing a particular fuel bundle, while using a much larger mesh size in the rest of the core, in conjunction with a coarse-mesh method (VNM). This application shows the benefits of our MM-MD approach in terms of accuracy and computing time: the domain decomposition method allows us to reduce the CPU time while preserving a good accuracy of the neutronic indicators: reactivity, core-to-bundle power coupling coefficient and flux error. (authors)

  9. Biogeosystem Technique as a method to correct the climate

    Science.gov (United States)

    Kalinitchenko, Valery; Batukaev, Abdulmalik; Batukaev, Magomed; Minkina, Tatiana

    2017-04-01

    can be produced; the less energy is consumed for climate correction, the better. The proposed algorithm was never discussed before because most of its ingredients were unenforceable. Now the possibility to execute the algorithm exists in the framework of our new scientific-technical branch - Biogeosystem Technique (BGT*). BGT* is a transcendental approach (not imitating natural processes) to soil processing and to the regulation of energy, matter and water fluxes and of the biological productivity of the biosphere: intra-soil machining to provide a new, highly productive dispersed soil system; intra-soil pulse continuous-discrete watering of plants to reduce the transpiration rate and the water consumption of plants by 5-20 times; intra-soil environmentally safe return of matter during intra-soil milling processing and (or) intra-soil pulse continuous-discrete watering of plants with nutrition. The following become possible: waste management; reducing the flow of nutrients to water systems; transformation of carbon and other organic and mineral substances in the soil into plant nutrition elements; less degradation of biological matter to greenhouse gases; increased biological sequestration of carbon dioxide in the photosynthesis of terrestrial systems; oxidation of methane and hydrogen sulfide by fresh, photosynthetically ionized, biologically active oxygen; and expansion of the active terrestrial site of the biosphere. A high biological product output of the biosphere will be gained. BGT* robotic systems are of low cost, energy and material consumption. By BGT* methods the uncertainties of climate and biosphere will be reduced. Key words: Biogeosystem Technique, method to correct, climate

  10. Auditing organizational communication: evaluating the methodological strengths and weaknesses of the critical incident technique, network analysis, and the communication satisfaction questionnaire

    NARCIS (Netherlands)

    Koning, K.H.

    2016-01-01

    This dissertation focuses on the methodology of communication audits. In the context of three Dutch high schools, we evaluated several audit instruments. The first study in this dissertation focuses on the question whether the rationale of the critical incident technique (CIT) still applies when it

  11. A Comparison of Various Software Development Methodologies: Feasibility and Methods of Integration

    Directory of Open Access Journals (Sweden)

    Samir Abou El-Seoud

    2016-12-01

    Full Text Available System development methodologies which have been used in academic and commercial environments during the last two decades have advantages and disadvantages. Researchers have tried to identify the objectives, scope, etc. of the methodologies by following different approaches. Each approach has its limitations, specific interests, coverage, etc. In this paper, we tried to perform a comparative study of those methodologies which are popular and commonly used in the banking and commercial environment. In our study we tried to determine the objectives, scope, tools and other features of the methodologies. We also tried to determine how, and to what extent, the methodologies incorporate facilities such as project management, cost-benefit analysis, documentation, etc. One of the most important aspects of our study was how to integrate the methodologies and develop a global methodology which covers the complete span of the software development life cycle. A prototype system which integrates the selected methodologies has been developed. The developed system helps analysts and designers choose suitable tools or obtain guidelines on what to do in a particular situation. The prototype system has been tested during the development of software for an ATM (Automated Teller Machine) by selecting and applying the SASD methodology during software development. This resulted in the development of a high-quality and well-documented software system.

  12. Photographic and drafting techniques simplify method of producing engineering drawings

    Science.gov (United States)

    Provisor, H.

    1968-01-01

    A combination of photographic and drafting techniques has been developed to simplify the preparation of three-dimensional and dimetric engineering drawings. Conventional photographs can be converted to line drawings by making copy negatives on high-contrast film.

  13. Techniques and methods of characterization of admixtures for the concrete

    Directory of Open Access Journals (Sweden)

    Palacios, M.

    2003-03-01

    Full Text Available Admixtures are defined as those products that are incorporated at the moment of mixing the concrete, in a quantity not greater than 5% by mass with respect to the cement content of the concrete, with the object of modifying the properties of the mixture in the fresh and/or hardened state. The behaviour of admixtures depends on their chemical and ionic composition, the organic functional groups present, the structure of the polymer and the molecular weight distribution of the different polymers. In the present work, the techniques and methods for the physico-chemical, chemical, ionic and structural characterization of these admixtures, as well as of the polymers that constitute them, are described. Many techniques have been employed, such as: ion chromatography, ultraviolet-visible spectroscopy (UV-VIS), Fourier transform infrared spectroscopy (FTIR), Fourier transform Raman spectroscopy (FT-Raman), nuclear magnetic resonance spectroscopy (1H-NMR and 13C-NMR), and gel permeation chromatography (GPC). Two commercial admixtures have been selected to carry out this characterization: a superplasticizer based on polycarboxylates, and a shrinkage reducer based on polypropylene glycol.

    RESUMEN: Admixtures are defined as those products that are incorporated at the moment of mixing the concrete, in a quantity not greater than 5% by mass with respect to the cement content of the concrete, with the object of modifying the properties of the mixture in the fresh and/or hardened state. The behaviour of admixtures depends on their chemical and ionic composition, on the organic functional groups present, on the structure of the polymer and on the molecular weight distribution of the different polymers that constitute them. The present work describes different techniques and methods of physico-chemical, chemical, ionic and structural characterization, as well as of the polymers that

  14. Development of a methodology for low-energy X-ray absorption correction in biological samples using radiation scattering techniques

    International Nuclear Information System (INIS)

    Pereira, Marcelo O.; Anjos, Marcelino J.; Lopes, Ricardo T.

    2009-01-01

    Non-destructive X-ray techniques, such as tomography, radiography and X-ray fluorescence, are sensitive to the attenuation coefficient and have a large field of applications in the medical as well as the industrial area. In the case of X-ray fluorescence analysis, knowledge of the photon X-ray attenuation coefficients provides important information for obtaining the elemental concentration. On the other hand, the mass attenuation coefficient values are determined by transmission methods. So, the use of X-ray scattering can be considered as an alternative to transmission methods. This work proposes a new method to obtain the X-ray absorption curve through the superposition of the Rayleigh and Compton scattering peaks of the tungsten Lα and Lβ lines (the L lines of an X-ray tube with a W anode). The absorption curve was obtained using standard samples with effective atomic numbers in the range from 6 to 16. The method was applied to certified samples of bovine liver (NIST 1577B), milk powder and V-10. The experimental measurements were obtained using the portable EDXRF system of the Nuclear Instrumentation Laboratory (LIN-COPPE/UFRJ) with a tungsten (W) anode. (author)

  15. [A comparative study of blood culture conventional method vs. a modified lysis/centrifugation technique for the diagnosis of fungemias].

    Science.gov (United States)

    Santiago, Axel Rodolfo; Hernández, Betsy; Rodríguez, Marina; Romero, Hilda

    2004-12-01

    The purpose of this work was to compare the efficacy of the conventional blood culture method vs. a modified lysis/centrifugation technique. Out of 450 blood specimens received in one year, 100 were chosen for this comparative study: 60 from patients with AIDS, 15 from leukemic patients, ten from febrile neutropenic patients, five from patients with respiratory infections, five from diabetics and five from septicemic patients. The specimens were processed simultaneously according to the above-mentioned methodologies, with daily inspections searching for fungal growth in order to obtain the final identification of the causative agent. The number of isolates recovered (40) was the same using both methods; these included 18 Candida albicans (45%), ten Candida spp. (25%), ten Histoplasma capsulatum (25%) and two Cryptococcus neoformans (5%). When fungal growth time was compared between the two methods, growth was more rapid with the modified lysis/centrifugation technique than with the conventional method, and statistical analysis revealed a significant difference. The modified lysis/centrifugation technique thus showed itself to be more efficacious than the conventional one, and therefore the implementation of this methodology is highly recommended for the isolation of fungi from blood.

  16. Tools and methods for teaching magnetic resonance concepts and techniques

    DEFF Research Database (Denmark)

    Hanson, Lars G.

    2012-01-01

    Teaching of MRI methodology can be challenging for teachers as well as students. To support student learning, two graphical simulators for exploration of basic magnetic resonance principles are here introduced. The first implements a simple compass needle analogy for day one of NMR and MRI education. After a few minutes of use, any user with minimal experience of magnetism will be able to explain the basic magnetic resonance principle. A second piece of software, the Bloch Simulator, aims much further, as it can be used to demonstrate and explore a wide range of phenomena including

  17. Dating method by electron spin resonance at the 'Museum National d'Histoire Naturelle'. Twenty years of methodological researches and geochronological applications

    International Nuclear Information System (INIS)

    Bahain, J.J.

    2007-12-01

    The Electron Spin Resonance (ESR) dating method has evolved considerably since its first uses in France in the early 1980s. The samples classically used until the mid-1990s, carbonates and bones, were progressively abandoned and replaced by teeth and bleached quartz. Analytical progress and methodological developments related to the study of these two materials have considerably increased, on the one hand, the precision and accuracy of the obtained results, allowing comparison with those derived from other geochronological techniques, and, on the other hand, the chronological range of application of the ESR method. Even though the ESR method is still undergoing many methodological developments, it has an undeniable geochronological potential, and it is one of the rare methods permitting the direct dating of Lower and Middle Pleistocene layers in non-volcanic areas. Its application is therefore of considerable importance for the study of the first human settlements of Eurasia, and the possibility of dating various types of material taken from the same archaeological level, jointly by ESR and other geochronological techniques, offers in addition, through intercalibration of the results, a tool to evaluate the reliability of the obtained ages. Some of the most significant results obtained at the Department of Prehistory of the National Museum of Natural History are presented in this report and illustrate both the potential and the current limits of the method. (author)

  18. GENESIS OF METHODOLOGY OF MANAGEMENT BY DEVELOPMENT OF ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Z.N. Varlamova

    2007-06-01

    Full Text Available In this article, the genesis of the methodology of managing organizational development, understood as the set of methodological approaches and methods used, is investigated. The results of a comparative analysis of the methodological approaches to the management of organizational development are presented. The traditional methodological approaches are complemented by strategic experiment and case-study methodology. Approaches to the formation of a new methodology and technique for researching the sources of an organization's competitive advantages are considered.

  19. Procedure Redesign Methods: E3-Control: a redesign methodology for control procedures

    NARCIS (Netherlands)

    Liu, J.; Hofman, W.J.; Tan, Y.H.

    2011-01-01

    This chapter highlights the core research methodology, e3-control, that is applied throughout the ITAIDE project for the purpose of control procedure redesign. We present the key concept of the e3-control methodology and its technical guidelines. Based on the output of this chapter, domain experts

  20. The Sterile Insect Technique as a method of pest control

    International Nuclear Information System (INIS)

    Argiles Herrero, R.

    2011-01-01

    The Valencia Community is carrying out one of the most ambitious projects in the field of plant protection at the European level: the fight against the fruit fly, one of the most damaging pests of citrus and fruit, by means of the Sterile Insect Technique. This technique consists of breeding in the laboratory, and releasing into the fields, huge quantities of insects of the pest species that have previously been sterilized. The sterile insects seek out wild individuals of the same species to mate with them, and the result is a clutch of non-viable eggs, causing a decrease in pest populations. After three years of application of the technique over an area of 150,000 hectares, the pest populations have been reduced by 90%. Other benefits have been the reduced use of insecticides and the improved quality of the exported fruit. (Author)

  1. Methods and techniques for decontamination design and construction of facilities

    International Nuclear Information System (INIS)

    Augustin, X.; Cohen, S.

    1986-01-01

    TECHNICATOME and STMI have jointly solved a wide range of problems specific to decontamination, from the design studies up to operation. TECHNICATOME has brought its expertise in the design and construction of nuclear facilities concerned in particular with decontamination and radwaste management. STMI is an experienced operator with expertise in designing tools and developing advanced techniques in the same fields. The expertise accumulated by both companies in this field over many years has resulted in the development of techniques and tools adapted to most decontamination problems, including specific cases. [fr]

  2. Identifying plant cell-surface receptors: combining 'classical' techniques with novel methods.

    Science.gov (United States)

    Uebler, Susanne; Dresselhaus, Thomas

    2014-04-01

    Cell-cell communication during development and reproduction in plants depends largely on a few phytohormones and many diverse classes of polymorphic secreted peptides. The peptide ligands are bound at the cell surface of target cells by their membranous interaction partners representing, in most cases, either receptor-like kinases or ion channels. Although knowledge of both the extracellular ligand and its corresponding receptor(s) is necessary to describe the downstream signalling pathway(s), to date only a few ligand-receptor pairs have been identified. Several methods, such as affinity purification and yeast two-hybrid screens, have been used very successfully to elucidate interactions between soluble proteins, but most of these methods cannot be applied to membranous proteins. Experimental obstacles such as low concentration and poor solubility of membrane receptors, as well as instable transient interactions, often hamper the use of these 'classical' approaches. However, over the last few years, a lot of progress has been made to overcome these problems by combining classical techniques with new methodologies. In the present article, we review the most promising recent methods in identifying cell-surface receptor interactions, with an emphasis on success stories outside the field of plant research.

  3. A consistent modelling methodology for secondary settling tanks: a reliable numerical method.

    Science.gov (United States)

    Bürger, Raimund; Diehl, Stefan; Farås, Sebastian; Nopens, Ingmar; Torfs, Elena

    2013-01-01

    The consistent modelling methodology for secondary settling tanks (SSTs) leads to a partial differential equation (PDE) of nonlinear convection-diffusion type as a one-dimensional model for the solids concentration as a function of depth and time. This PDE includes a flux that depends discontinuously on spatial position modelling hindered settling and bulk flows, a singular source term describing the feed mechanism, a degenerating term accounting for sediment compressibility, and a dispersion term for turbulence. In addition, the solution itself is discontinuous. A consistent, reliable and robust numerical method that properly handles these difficulties is presented. Many constitutive relations for hindered settling, compression and dispersion can be used within the model, allowing the user to switch on and off effects of interest depending on the modelling goal as well as investigate the suitability of certain constitutive expressions. Simulations show the effect of the dispersion term on effluent suspended solids and total sludge mass in the SST. The focus is on correct implementation whereas calibration and validation are not pursued.
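
    For intuition only, the sketch below integrates a much-simplified one-dimensional batch-settling version of such a model, with a Vesilind hindered-settling flux and a degenerate compression term, using an explicit finite-volume step and the classical layer-model interface flux. The parameters are hypothetical and the scheme is deliberately cruder than the consistent method presented in the paper.

```python
# dC/dt + d f(C)/dz = d/dz( d(C) dC/dz ), z measured downwards, closed top and bottom.
import numpy as np

N, depth = 100, 4.0                     # number of cells, settler depth [m]
dz = depth / N
v0, rh = 1.8e-3, 0.4                    # hypothetical Vesilind parameters [m/s, m3/kg]

def f(C):                               # hindered-settling flux f(C) = v0*exp(-rh*C)*C
    return v0 * np.exp(-rh * C) * C

def d(C):                               # toy compression coefficient, active above 6 kg/m3
    return 1.0e-5 * (C > 6.0)

C = np.full(N, 3.5)                     # homogeneous initial concentration [kg/m3]
dt = 0.5                                # explicit step [s]; respects CFL for these parameters
for _ in range(7200):                   # simulate one hour of batch settling
    Fc = np.zeros(N + 1)                # convective flux at interfaces, zero at the walls
    Fc[1:-1] = np.minimum(f(C[:-1]), f(C[1:]))
    Fd = np.zeros(N + 1)                # compressive (diffusive) flux at interior interfaces
    Fd[1:-1] = 0.5 * (d(C[:-1]) + d(C[1:])) * (C[1:] - C[:-1]) / dz
    C = C - dt / dz * (Fc[1:] - Fc[:-1]) + dt / dz * (Fd[1:] - Fd[:-1])

print("concentration in the bottom cell:", round(float(C[-1]), 2), "kg/m3")
```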

  4. MODELS AND METHODS OF SAFETY-ORIENTED PROJECT MANAGEMENT OF DEVELOPMENT OF COMPLEX SYSTEMS: METHODOLOGICAL APPROACH

    Directory of Open Access Journals (Sweden)

    Олег Богданович ЗАЧКО

    2016-03-01

    Full Text Available Methods and models of safety-oriented project management of the development of complex systems are proposed, resulting from the convergence of existing approaches in project management, in contrast to the mechanism of value-oriented management. A cognitive model of safety-oriented project management of the development of complex systems is developed, which provides a synergistic effect, namely moving the system from its original (pre-project) condition to one that is optimal from the viewpoint of life safety (the post-project state). An approach to assessing project complexity is proposed, which consists in taking into account the seasonal component of the time characteristics of the life cycles of complex organizational and technical systems with occupancy. This makes it possible to take the seasonal component into account in simulation models of the life cycle of product operation in complex organizational and technical systems, modelling the critical points of operation of systems with occupancy, which forms a new methodology for the safety-oriented management of projects, programs and portfolios of projects with formalization of the elements of complexity.

  5. Neutron diffraction technique as a method for material studies

    International Nuclear Information System (INIS)

    Belhorma, B.; Labrim, H.; Gandou, Z.

    2010-01-01

    Morocco's first nuclear research reactor has been constructed at CNESTEN. The reactor divergence has been tested, and the nominal power of 2 MW was successfully achieved. The reactor has four beam ports, two of which are intended for neutron scattering. This technique allows studying the crystallographic and magnetic structures of materials using the thermal neutrons produced in the reactor. The powder diffractometer has been designed; component reception and installation procedures are in progress. The second experiment consists of small-angle neutron scattering, which allows the study of soft matter and polymers in the range of 1-50 nm. The third technique, which complements the two previous ones, is four-circle neutron spectrometry, designed mainly to study the structural properties of monocrystalline materials and texture. This technique is complementary to the X-ray diffraction already available at CNESTEN. Some applications of this technique are: to determine the crystallographic and magnetic structure of polycrystalline materials; to study texture in metals and alloys; and to perform holography measurements.

  6. Microemulsion extrusion technique : a new method to produce lipid nanoparticles

    NARCIS (Netherlands)

    de Jesus, Marcelo Bispo; Radaic, Allan; Zuhorn, Inge S.; de Paula, Eneida

    2013-01-01

    Solid lipid nanoparticles (SLN) and nano-structured lipid carriers (NLC) have been intensively investigated for different applications, including their use as drug and gene delivery systems. Different techniques have been employed to produce lipid nanoparticles, of which high pressure homogenization

  7. Formation factor logging in-situ by electrical methods. Background and methodology

    International Nuclear Information System (INIS)

    Loefgren, Martin; Neretnieks, Ivars

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, the surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor could also be measured by electrical methods that are faster and less expensive. Previously this has only been done quantitatively in the laboratory on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity could be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This could be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has a high enough vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the
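
    Under the conditions stated in the abstract (saline groundwater, so that surface conduction can be neglected), the electrical estimate of the formation factor reduces to a resistivity ratio, the same dimensionless factor obtained from through-diffusion as De/Dw. The numbers below are hypothetical downhole readings, given only to show the arithmetic.

```python
def formation_factor(rho_rock_ohm_m, rho_water_ohm_m):
    """Electrical formation factor F = rho_water / rho_rock (surface conduction neglected)."""
    return rho_water_ohm_m / rho_rock_ohm_m

# hypothetical values: rock resistivity from a dual laterolog reading,
# pore-water resistivity inferred from groundwater salinity
print(formation_factor(rho_rock_ohm_m=5.0e3, rho_water_ohm_m=0.7))   # ~1.4e-4
```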

  8. Formation factor logging in-situ by electrical methods. Background and methodology

    Energy Technology Data Exchange (ETDEWEB)

    Loefgren, Martin; Neretnieks, Ivars [Royal Inst. of Tech., Stockholm (Sweden). Dept. of Chemical Engineering and Technology

    2002-10-01

    Matrix diffusion has been identified as one of the most important mechanisms governing the retardation of radionuclides escaping from a deep geological repository for nuclear waste. Radionuclides dissolved in groundwater flowing in water-bearing fractures will diffuse into water-filled micropores in the rock. Important parameters governing the matrix diffusion are the formation factor, the surface diffusion and sorption. This report focuses on the formation factor in undisturbed intrusive igneous rock and the possibility of measuring this parameter in-situ. The background to and the methodology of formation factor logging in-situ by electrical methods are given. The formation factor is here defined as a parameter depending only on the geometry of the porous system and not on the diffusing species. Traditionally the formation factor has been measured by through-diffusion experiments on core samples, which are costly and time consuming. It has been shown that the formation factor could also be measured by electrical methods that are faster and less expensive. Previously this has only been done quantitatively in the laboratory on a centimetre or decimetre scale. When measuring the formation factor in-situ in regions with saline groundwater, only the rock resistivity and the pore water resistivity are needed. The rock resistivity could be obtained by a variety of geophysical downhole tools. Water-bearing fractures disturb the measurements, and data possibly affected by free water have to be sorted out. This could be done without losing too much data if the vertical resolution of the tool is high enough. It was found that the rock resistivity tool presently used by SKB is neither quantitative nor has a high enough vertical resolution. Therefore the slimhole Dual-Laterolog from Antares was tested, with good results. This tool has a high vertical resolution and gives quantitative rock resistivities that need no correction. At present there is no method of directly obtaining the

   9. DECISIONS, METHODS AND TECHNIQUES RELATED TO DECISION SUPPORT SYSTEMS (DSS)

    Directory of Open Access Journals (Sweden)

    Boghean Florin

    2015-07-01

    Full Text Available Generalised uncertainty, a phenomenon that today's managers face as part of their professional experience, makes it impossible to anticipate how the business environment will evolve or what the consequences of the decisions they plan to implement will be. Any decision-making process within the company entails the simultaneous presence of a number of economic, technical, juridical, human and managerial variables. The development and approval of a decision is the result of decision-making activities carried out by the decision maker and sometimes by a decision support team and/or a decision support system (DSS). These aspects, related to specific applications of decision support systems in risk management, are approached in this research paper. Decisions in general, and management decisions in particular, are associated with numerous risks, due to their complexity and increasing contextual orientation. In each business entity there are concerns with the implementation of risk management in order to improve the likelihood of meeting objectives and the trust of the parties involved, increase operational safety and security as well as the protection of the environment, minimise losses, improve organisational resilience in order to diminish the negative impact on the organisation, and provide a solid foundation for decision making. Since any business entity is considered to be a wealth generator, the analysis of its performance should not be restricted to financial efficiency alone, but should also encompass its economic efficiency. The research developed in this paper entails different dimensions: conceptual, methodological, and empirical testing. The conducted research also has a methodological side, since the activities carried out have resulted in the presentation of a simulation model that is useful in decision-making processes on the capital market. The research conducted in the present paper

  10. Alar setback technique: a controlled method of nasal tip deprojection.

    Science.gov (United States)

    Foda, H M

    2001-11-01

    To describe an alar cartilage-modifying technique aimed at decreasing nasal tip projection in cases with overdeveloped alar cartilages and to compare it with other deprojection techniques used to correct such deformity. Selected case series. University and private practice settings in Alexandria, Egypt. Twenty patients presenting for rhinoplasty who had overprojected nasal tips primarily due to overdeveloped alar cartilages. All cases were primary cases except for one patient, who had undergone 2 previous rhinoplasties. An external rhinoplasty approach was used to set back the alar cartilages by shortening their medial and lateral crura. The choice of performing a high or low setback depended on the preexisting lobule-to-columella ratio. Following the setback, the alar cartilages were reconstructed in a fashion that increased the strength and stability of the tip complex. Subjective evaluation included clinical examination, analysis of preoperative and postoperative photographs, and patient satisfaction. Objective evaluation of nasal tip projection, using the Goode ratio and the nasofacial angle, was performed preoperatively and repeated at least 6 months postoperatively. A low setback was performed in 16 cases (80%) and a high setback in 4 (20%). The mean follow-up period was 18 months (range, 6-36 months). The technique effectively deprojected the nasal tip as evidenced by the considerable postoperative decrease in values of the Goode ratio and the nasofacial angle. No complications were encountered and no revision surgical procedures were required. The alar setback technique has many advantages; it results in precise predictable amounts of deprojection, controls the degree of tip rotation, preserves the natural contour of the nasal tip, respects the tip support mechanisms, increases the strength and stability of nasal tip complex, preserves or restores the normal lobule-to-columella proportion, and does not lead to alar flaring. However, the technique requires

  11. Evaluation of a Delphi technique based expert judgement method for LCA valuation - DELPHI II

    Energy Technology Data Exchange (ETDEWEB)

    Virtanen, Y.; Torkkeli, S. [VTT Chemical Technology, Espoo (Finland). Environmental Technology; Wilson, B. [Landbank Environmental Research and Consulting, London (United Kingdom)

    1999-07-01

    Because of the complexity and trade-offs between different points of the life cycles of the analysed systems, a method which measures the environmental damage caused by each intervention is needed in order to make a choice between the products. However, there is no commonly agreed methodology for this particular purpose. In most of the methods the valuation is implicitly or explicitly based on economic criteria. For various reasons, however, economically obtained criteria do not necessarily reflect ecological arguments correctly. Thus, there is a need for new, ecologically based valuation methods. One such approach is the expert judgement method, based on the Delphi technique, which rejects the economic basis in favour of the judgements of a group of environmental experts. However, it is not self evident that the expert judgement based environmental rating of interventions will be essentially more correct and certain than other methods. In this study the method was evaluated at different points of the procedure in order to obtain a picture of the quality of the indexes produced. The evaluation was based on an actual Delphi study made in 1995-1996 in Finland, Sweden and Norway. The main questions addressed were the significance of the results and the operational quality of the Delphi procedure. The results obtained by applying the expert method indexes were also compared with the results obtained with other valuation methods for the background life cycle inventory of the case study. Additional material included feedback data from panellists of the case study, collected with a questionnaire. The questionnaire data was analysed to identify major dimensions in the criteria for evaluating interventions and correlation of the final indexes of the Delphi I study with these dimensions. The rest of the questionnaire material was used to document panellists' opinions and experiences of the Delphi process, familiarity with the environmental impacts of various

  12. Evaluation of a Delphi technique based expert judgement method for LCA valuation - DELPHI II

    International Nuclear Information System (INIS)

    Virtanen, Y.; Torkkeli, S.

    1999-01-01

    Because of the complexity and trade-offs between different points of the life cycles of the analysed systems, a method which measures the environmental damage caused by each intervention is needed in order to make a choice between the products. However, there is no commonly agreed methodology for this particular purpose. In most of the methods the valuation is implicitly or explicitly based on economic criteria. For various reasons, however, economically obtained criteria do not necessarily reflect ecological arguments correctly. Thus, there is a need for new, ecologically based valuation methods. One such approach is the expert judgement method, based on the Delphi technique, which rejects the economic basis in favour of the judgements of a group of environmental experts. However, it is not self evident that the expert judgement based environmental rating of interventions will be essentially more correct and certain than other methods. In this study the method was evaluated at different points of the procedure in order to obtain a picture of the quality of the indexes produced. The evaluation was based on an actual Delphi study made in 1995-1996 in Finland, Sweden and Norway. The main questions addressed were the significance of the results and the operational quality of the Delphi procedure. The results obtained by applying the expert method indexes were also compared with the results obtained with other valuation methods for the background life cycle inventory of the case study. Additional material included feedback data from panellists of the case study, collected with a questionnaire. The questionnaire data was analysed to identify major dimensions in the criteria for evaluating interventions and correlation of the final indexes of the Delphi I study with these dimensions. The rest of the questionnaire material was used to document panellists' opinions and experiences of the Delphi process, familiarity with the environmental impacts of various interventions

  13. Indirect Observation in Everyday Contexts: Concepts and Methodological Guidelines within a Mixed Methods Framework

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2018-01-01

    Full Text Available Indirect observation is a recent concept in systematic observation. It largely involves analyzing textual material generated either indirectly from transcriptions of audio recordings of verbal behavior in natural settings (e.g., conversation, group discussions) or directly from narratives (e.g., letters of complaint, tweets, forum posts). It may also feature seemingly unobtrusive objects that can provide relevant insights into daily routines. All these materials constitute an extremely rich source of information for studying everyday life, and they are continuously growing with the burgeoning of new technologies for data recording, dissemination, and storage. Narratives are an excellent vehicle for studying everyday life, and quantitization is proposed as a means of integrating qualitative and quantitative elements. However, this analysis requires a structured system that enables researchers to analyze varying forms and sources of information objectively. In this paper, we present a methodological framework detailing the steps and decisions required to quantitatively analyze a set of data that was originally qualitative. We provide guidelines on study dimensions, text segmentation criteria, ad hoc observation instruments, data quality controls, and coding and preparation of text for quantitative analysis. The quality control stage is essential to ensure that the code matrices generated from the qualitative data are reliable. We provide examples of how an indirect observation study can produce data for quantitative analysis and also describe the different software tools available for the various stages of the process. The proposed method is framed within a specific mixed methods approach that involves collecting qualitative data and subsequently transforming these into matrices of codes (not frequencies) for quantitative analysis to detect underlying structures and behavioral patterns. The data collection and quality control procedures fully meet

  14. Three-dimensional display techniques: description and critique of methods

    International Nuclear Information System (INIS)

    Budinger, T.F.

    1982-01-01

    The recent advances in noninvasive medical imaging of the 3-dimensional spatial distribution of radionuclides, X-ray attenuation coefficients, and nuclear magnetic resonance parameters necessitate the development of a general method for displaying these data. The objective of this paper is to give a systematic description and comparison of known methods for displaying three-dimensional data. The discussion of display methods is divided into two major categories: 1) computer-graphics methods, which use a two-dimensional display screen; and 2) optical methods (such as holography, stereopsis and vari-focal systems).

  15. Determination of plutonium isotopic abundances by gamma-ray spectrometry. Interim report on the status of methods and techniques developed by the Lawrence Livermore Laboratory

    International Nuclear Information System (INIS)

    Gunnink, R.

    1980-03-01

    This report presents an overview of methods and techniques developed by the Lawrence Livermore Laboratory for determining plutonium isotopic abundances from gamma-ray spectra that have been measured with germanium detectors. The methodology of fitting the spectral features includes discussions of algorithms for gamma-ray and x-ray peak shape fitting and generation of response spectra profiles characteristic of specific isotopes. Applications of the techniques developed at government, commercial, and Japanese reprocessing plants are described. Current development of the methodology for the nondestructive analysis of samples containing nondescript solid materials is also presented
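
    The elementary step behind such spectral fitting is estimating the net area of a photopeak. The sketch below fits a single Gaussian on a linear background to synthetic counts; it is only an illustration and not the Lawrence Livermore fitting codes themselves.

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(x, area, centroid, fwhm, b0, b1):
    """Gaussian photopeak (area-parameterized) on a linear background."""
    sigma = fwhm / 2.3548
    gauss = area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((x - centroid) / sigma) ** 2)
    return gauss + b0 + b1 * x

channels = np.arange(380, 420, dtype=float)
rng = np.random.default_rng(1)
counts = peak(channels, 5000, 400.0, 3.0, 50, 0.1) + rng.normal(0, 8, channels.size)

popt, pcov = curve_fit(peak, channels, counts, p0=[4000, 399, 2.5, 40, 0.0])
print("fitted net peak area:", round(popt[0], 1), "+/-", round(float(np.sqrt(pcov[0, 0])), 1))
```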

  16. [Estimating child mortality using the previous child technique, with data from health centers and household surveys: methodological aspects].

    Science.gov (United States)

    Aguirre, A; Hill, A G

    1988-01-01

    2 trials of the previous child or preceding birth technique in Bamako, Mali, and Lima, Peru, gave very promising results for measurement of infant and early child mortality using data on survivorship of the 2 most recent births. In the Peruvian study, another technique was tested in which each woman was asked about her last 3 births. The preceding birth technique described by Brass and Macrae has rapidly been adopted as a simple means of estimating recent trends in early childhood mortality. The questions formulated and the analysis of results are direct when the mothers are visited at the time of birth or soon after. Several technical aspects of the method believed to introduce unforeseen biases have now been studied and found to be relatively unimportant. But the problems arising when the data come from a nonrepresentative fraction of the total fertile-aged population have not been resolved. The analysis based on data from 5 maternity centers including 1 hospital in Bamako, Mali, indicated some practical problems and the information obtained showed the kinds of subtle biases that can result from the effects of selection. The study in Lima tested 2 abbreviated methods for obtaining recent early childhood mortality estimates in countries with deficient vital registration. The basic idea was that a few simple questions added to household surveys on immunization or diarrheal disease control for example could produce improved child mortality estimates. The mortality estimates in Peru were based on 2 distinct sources of information in the questionnaire. All women were asked their total number of live born children and the number still alive at the time of the interview. The proportion of deaths was converted into a measure of child survival using a life table. Then each woman was asked for a brief history of the 3 most recent live births. Dates of birth and death were noted in month and year of occurrence. The interviews took only slightly longer than the basic survey

  17. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent present naturally in groundwater due to certain minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may originate from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at low concentrations can be determined in water by using high-tech instruments such as the atomic absorption spectrometer (hydride generation). Because arsenic concentrations as low as 1 ppb cannot be determined easily with a simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to an arsenic concentration of 1 ppb.
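
    A simple way to picture the quantification step of such a spectrophotometric procedure is a linear calibration built from absorbance readings of arsenic standards; the readings below are hypothetical and serve only to show the arithmetic.

```python
import numpy as np

std_conc = np.array([0.0, 1.0, 5.0, 10.0, 25.0, 50.0])                 # standards [ppb]
absorbance = np.array([0.002, 0.010, 0.043, 0.087, 0.210, 0.415])      # hypothetical readings

slope, intercept = np.polyfit(std_conc, absorbance, 1)                 # Beer-Lambert: A ~ slope*c + intercept
unknown_absorbance = 0.060
print("estimated As concentration:", round((unknown_absorbance - intercept) / slope, 1), "ppb")
```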

  18. Skin sparing mastectomy: Technique and suggested methods of reconstruction

    Directory of Open Access Journals (Sweden)

    Ahmed M. Farahat

    2014-09-01

    Conclusions: Skin sparing mastectomy through a circum-areolar incision has proven to be a safe and feasible option for the management of breast cancer in Egyptian women, offering them adequate oncologic control and optimum cosmetic outcome through preservation of the skin envelope of the breast whenever indicated. Our patients can benefit from safe surgery and a good cosmetic outcome by applying different reconstructive techniques.

  19. Uncertainty quantification in reactor physics using adjoint/perturbation techniques and adaptive spectral methods

    NARCIS (Netherlands)

    Gilli, L.

    2013-01-01

    This thesis presents the development and the implementation of an uncertainty propagation algorithm based on the concept of spectral expansion. The first part of the thesis is dedicated to the study of uncertainty propagation methodologies and to the analysis of spectral techniques. The concepts

  20. Designs, Techniques, and Reporting Strategies in Geography Education: A Review of Research Methods

    Science.gov (United States)

    Zadrozny, Joann; McClure, Caroline; Lee, Jinhee; Jo, Injeong

    2016-01-01

    A wide variety of research is being completed and published in geography education. The purpose of this article is to provide a general overview of the different types of methodologies, research designs, and techniques used by geography education researchers. Analyzing three geography education journals, we found 191 research articles published…

  1. Measuring the impact of methodological research: a framework and methods to identify evidence of impact.

    Science.gov (United States)

    Brueton, Valerie C; Vale, Claire L; Choodari-Oskooei, Babak; Jinks, Rachel; Tierney, Jayne F

    2014-11-27

    Providing evidence of impact highlights the benefits of medical research to society. Such evidence is increasingly requested by research funders and commonly relies on citation analysis. However, other indicators may be more informative. Although frameworks to demonstrate the impact of clinical research have been reported, no complementary framework exists for methodological research. Therefore, we assessed the impact of methodological research projects conducted or completed between 2009 and 2012 at the UK Medical Research Council Clinical Trials Unit Hub for Trials Methodology Research, with a view to developing an appropriate framework. Various approaches to the collection of data on research impact were employed. Citation rates were obtained using Web of Science (http://www.webofknowledge.com/) and analyzed descriptively. Semistructured interviews were conducted to obtain information on the rates of different types of research output that indicated impact for each project. Results were then pooled across all projects. Finally, email queries pertaining to methodology projects were collected retrospectively and their content analyzed. Simple citation analysis established the citation rates per year since publication for 74 methodological publications; however, further detailed analysis revealed more about the potential influence of these citations. Interviews that spanned 20 individual research projects demonstrated a variety of types of impact not otherwise collated, for example, applications and further developments of the research; release of software and provision of guidance materials to facilitate uptake; formation of new collaborations and broad dissemination. Finally, 194 email queries relating to 6 methodological projects were received from 170 individuals across 23 countries. They provided further evidence that the methodologies were impacting on research and research practice, both nationally and internationally. We have used the information

  2. Uncovering the Transnational Networks, Organisational Techniques and State-Corporate Ties Behind Grand Corruption: Building an Investigative Methodology

    Directory of Open Access Journals (Sweden)

    Kristian Lasslett

    2017-11-01

    Full Text Available While grand corruption is a major global governance challenge, researchers notably lack a systematic methodology for conducting qualitative research into its complex forms. To address this lacuna, the following article sets out and applies the corruption investigative framework (CIF, a methodology designed to generate a systematic, transferable approach for grand corruption research. Its utility will be demonstrated employing a case study that centres on an Australian-led megaproject being built in Papua New Guinea’s capital city, Port Moresby. Unlike conventional analyses of corruption in Papua New Guinea, which emphasise its local characteristics and patrimonial qualities, application of CIF uncovered new empirical layers that centre on transnational state-corporate power, the ambiguity of civil society, and the structural inequalities that marginalise resistance movements. The important theoretical consequences of the findings and underpinning methodology are explored.

  3. Comparison of Nested-PCR technique and culture method in ...

    African Journals Online (AJOL)

    USER

    2010-04-05

    Apr 5, 2010 ... Full Length Research Paper. Comparison of ... The aim of the present study was to evaluate the diagnostic value of nested PCR in genitourinary ... method. Based on obtained results, the positivity rate of urine samples in this study was 5.0% by using culture and PCR methods and 2.5% for acid fast staining.

  4. Drifter technique: a new method to obtain metaphases in Hep-2 cell line cultures

    Directory of Open Access Journals (Sweden)

    Eleonidas Moura Lima

    2005-07-01

    Full Text Available The Hep-2 cell line is derived from laryngeal carcinoma cells and is often utilized as a model in carcinogenesis and mutagenesis tests. To evaluate the proliferative potential of this line, we developed a cytogenetic methodology (drifter technique) to obtain metaphases from cells that lose cellular adhesion when they undergo mitosis in culture. By this procedure, 2000 cells were counted, resulting in a mitotic index (MI) of 22.2%. Although this MI was not statistically different from the one obtained using either a classical cytogenetic method or a cell synchronization technique, the drifter technique has the advantage of not requiring the use of some reagents for obtaining metaphases and also of diminishing the consumption of maintenance reagents for this cell line.

  5. Onto-clust--a methodology for combining clustering analysis and ontological methods for identifying groups of comorbidities for developmental disorders.

    Science.gov (United States)

    Peleg, Mor; Asbeh, Nuaman; Kuflik, Tsvi; Schertz, Mitchell

    2009-02-01

    Children with developmental disorders usually exhibit multiple developmental problems (comorbidities). Hence, such diagnosis needs to focus on groups of developmental disorders. Our objective is to systematically identify developmental disorder groups and represent them in an ontology. We developed a methodology that combines two methods: (1) a literature-based ontology that we created, which represents developmental disorders and potential developmental disorder groups, and (2) clustering for detecting comorbid developmental disorders in patient data. The ontology is used to interpret and improve clustering results, and the clustering results are used to validate the ontology and suggest directions for its development. We evaluated our methodology by applying it to data of 1175 patients from a child development clinic. We demonstrated that the ontology improves clustering results, bringing them closer to an expert-generated gold standard. We have shown that our methodology successfully combines an ontology with a clustering method to support systematic identification and representation of developmental disorder groups.
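
    The clustering half of such a pipeline can be illustrated with a small sketch: hierarchical clustering of binary comorbidity profiles under a Jaccard distance. The disorder labels and patient vectors are hypothetical, and the ontology-based interpretation step described in the abstract is not reproduced here.

```python
# Minimal sketch: agglomerative clustering of binary comorbidity profiles.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

disorders = ["motor", "language", "attention", "learning", "social"]
# Rows = patients, columns = presence/absence of each developmental problem (invented).
patients = np.array([
    [1, 1, 0, 0, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 1, 1, 0],
    [0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1],
])

distance = pdist(patients.astype(bool), metric="jaccard")   # dissimilarity between profiles
tree = linkage(distance, method="average")                   # agglomerative clustering
labels = fcluster(tree, t=2, criterion="maxclust")           # cut the tree into 2 groups
print(dict(zip(range(len(patients)), labels)))
```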

  6. Accurate ocean bottom seismometer positioning method inspired by multilateration technique

    Science.gov (United States)

    Benazzouz, Omar; Pinheiro, Luis M.; Matias, Luis M. A.; Afilhado, Alexandra; Herold, Daniel; Haines, Seth S.

    2018-01-01

    The positioning of ocean bottom seismometers (OBS) is a key step in the processing flow of OBS data, especially in the case of self popup types of OBS instruments. The use of first arrivals from airgun shots, rather than relying on the acoustic transponders mounted in the OBS, is becoming a trend and generally leads to more accurate positioning due to the statistics from a large number of shots. In this paper, a linearization of the OBS positioning problem via the multilateration technique is discussed. The discussed linear solution solves jointly for the average water layer velocity and the OBS position using only shot locations and first arrival times as input data.
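
    The linearisation can be illustrated with a short sketch. With shots at the sea surface and an OBS at (x, y, z), the first-arrival time t_i of shot i through an average water velocity v satisfies v^2 t_i^2 = (x_i - x)^2 + (y_i - y)^2 + z^2; expanding and collecting terms gives a system that is linear in (x, y, x^2 + y^2 + z^2, v^2), so position and velocity can be solved jointly by least squares. The shot geometry and noise level below are synthetic, not taken from the paper.

```python
# Hedged sketch of a multilateration-style linear solve for OBS position and
# average water velocity from shot coordinates and first-arrival times.
import numpy as np

rng = np.random.default_rng(0)
true_obs, true_v = np.array([120.0, -80.0, 2500.0]), 1500.0      # m, m/s (synthetic)

shots = rng.uniform(-3000, 3000, size=(200, 2))                  # shot x, y at the surface
dist = np.sqrt(((shots - true_obs[:2]) ** 2).sum(axis=1) + true_obs[2] ** 2)
times = dist / true_v + rng.normal(0.0, 1e-3, size=len(shots))   # first arrivals (s)

# Linear system A @ [x, y, u, s] = b with u = x^2 + y^2 + z^2 and s = v^2.
A = np.column_stack([2 * shots[:, 0], 2 * shots[:, 1],
                     -np.ones(len(shots)), times ** 2])
b = (shots ** 2).sum(axis=1)
x, y, u, s = np.linalg.lstsq(A, b, rcond=None)[0]

z = np.sqrt(u - x ** 2 - y ** 2)
print(f"OBS position ~ ({x:.1f}, {y:.1f}, {z:.1f}) m, water velocity ~ {np.sqrt(s):.1f} m/s")
```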

  7. Statistical Techniques Utilized in Analyzing PISA and TIMSS Data in Science Education from 1996 to 2013: A Methodological Review

    Science.gov (United States)

    Liou, Pey-Yan; Hung, Yi-Chen

    2015-01-01

    We conducted a methodological review of articles using the Programme for International Student Assessment (PISA) or Trends in International Mathematics and Science Study (TIMSS) data published by the SSCI-indexed science education journals, such as the "International Journal of Science and Mathematics Education," the "International…

  8. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two year research project entitled Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants. The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  9. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Okrent, D.

    1989-03-01

    This final report summarizes the accomplishments of a two year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants". The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  10. Joint application of AI techniques, PRA and disturbance analysis methodology to problems in the maintenance and design of nuclear power plants

    International Nuclear Information System (INIS)

    Okrent, D.

    1989-01-01

    This final report summarizes the accomplishments of a two year research project entitled "Joint Application of Artificial Intelligence Techniques, Probabilistic Risk Analysis, and Disturbance Analysis Methodology to Problems in the Maintenance and Design of Nuclear Power Plants". The objective of this project is to develop and apply appropriate combinations of techniques from artificial intelligence (AI), reliability and risk analysis and disturbance analysis to well-defined programmatic problems of nuclear power plants. Reactor operations issues were added to those of design and maintenance as the project progressed.

  11. Evacuation proctography - examination technique and method of evaluation

    International Nuclear Information System (INIS)

    Braunschweig, R.; Schott, U.; Starlinger, M.

    1993-01-01

    Evacuation proctography is the most important imaging technique to supplement findings of physical examination, manometry, and endoscopy in patients presenting with pathologies in anorectal morphology and function. Indications for evacuation proctography include obstructed defecation or incomplete evacuation, imaging of ileal pouches following excision of the rectum, and suspected anorectal fistulae. Evacuation proctography with thick barium sulfate is performed under fluoroscopy. Documentation of the study can either be done by single-shot X-rays, video recording, or imaging with a 100-mm spot-film camera. Evacuation proctography shows morphologic changes such as spastic pelvic floor, rectocele, enterocele, intussusception and anal prolapse. Measurements can be performed to obtain the anorectal angle, location and mobility of the pelvic floor, and size as well as importance of a rectocele. Qualitative and quantitative data can only be interpreted along with clinical and manometric data. (orig.)

  12. Methods for magnetic resonance analysis using magic angle technique

    Science.gov (United States)

    Hu, Jian Zhi [Richland, WA; Wind, Robert A [Kennewick, WA; Minard, Kevin R [Kennewick, WA; Majors, Paul D [Kennewick, WA

    2011-11-22

    Methods of performing a magnetic resonance analysis of a biological object are disclosed that include placing the object in a main magnetic field (that has a static field direction) and in a radio frequency field; rotating the object at a frequency of less than about 100 Hz around an axis positioned at an angle of about 54°44' relative to the main magnetic static field direction; pulsing the radio frequency to provide a sequence that includes a phase-corrected magic angle turning pulse segment; and collecting data generated by the pulsed radio frequency. In particular embodiments the method includes pulsing the radio frequency to provide at least two of a spatially selective read pulse, a spatially selective phase pulse, and a spatially selective storage pulse. Further disclosed methods provide pulse sequences that provide extended imaging capabilities, such as chemical shift imaging or multiple-voxel data acquisition.

  13. Nuclear pulse signal processing techniques based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Qi Zhong; Meng Xiangting; Fu Yanyan; Li Dongcang

    2012-01-01

    This article presents a method for the measurement and analysis of nuclear pulse signals. An FPGA controls a high-speed ADC that measures the nuclear radiation signal and drives the USB interface in Slave FIFO mode for high-speed transmission. LabVIEW is used for online data processing and display, and a blind deconvolution method is applied to remove pile-up from the acquired signal and to restore the nuclear pulse signal. Real-time measurements at this transmission speed demonstrate the advantages of the method. (authors)

  14. Nuclear pulse signal processing technique based on blind deconvolution method

    International Nuclear Information System (INIS)

    Hong Pengfei; Yang Lei; Fu Tingyan; Qi Zhong; Li Dongcang; Ren Zhongguo

    2012-01-01

    In this paper, we present a method for measurement and analysis of nuclear pulse signal, with which pile-up signal is removed, the signal baseline is restored, and the original signal is obtained. The data acquisition system includes FPGA, ADC and USB. The FPGA controls the high-speed ADC to sample the signal of nuclear radiation, and the USB makes the ADC work on the Slave FIFO mode to implement high-speed transmission status. Using the LabVIEW, it accomplishes online data processing of the blind deconvolution algorithm and data display. The simulation and experimental results demonstrate advantages of the method. (authors)
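
    A simplified sketch of the underlying idea is given below: if the detector response were known, piled-up pulses could be separated by regularised frequency-domain deconvolution. The published method is blind, i.e. it also estimates the response from the data; here an exponential response is assumed and all signals are synthetic, so this is only an illustration of the pile-up removal step, not the authors' algorithm.

```python
# Simplified (non-blind) regularised deconvolution sketch for pulse pile-up removal.
import numpy as np

fs, n = 1.0e6, 4096                       # sampling rate (Hz) and record length
t = np.arange(n) / fs

h = np.exp(-t * 2.0e4)                    # assumed exponential detector response
h /= h.sum()

impulses = np.zeros(n)                    # true pulse amplitudes and arrival samples
for idx, amp in [(500, 1.0), (560, 0.7), (1800, 1.3)]:   # 500 and 560 pile up
    impulses[idx] = amp

measured = np.convolve(impulses, h)[:n] + np.random.default_rng(1).normal(0, 1e-3, n)

H = np.fft.rfft(h, n)
reg = 1e-2 * np.max(np.abs(H)) ** 2       # Tikhonov-style regularisation term
recovered = np.fft.irfft(np.fft.rfft(measured) * np.conj(H) / (np.abs(H) ** 2 + reg), n)

print("samples with largest recovered amplitude:", np.argsort(recovered)[-3:])
```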

  15. Molecular techniques: An overview of methods for the detection of ...

    African Journals Online (AJOL)

    Several DNA molecular markers are now available for use in surveillance and investigation of food-borne outbreaks that were previously difficult to detect. The results from several sources of literature indicate substantially different degrees of sensitivities between conventional detection methods and molecular-based ...

  16. Nuclear fuel cycle optimization - methods and modelling techniques

    International Nuclear Information System (INIS)

    Silvennoinen, P.

    1982-01-01

    This book is aimed at presenting methods applicable in the analysis of fuel cycle logistics and optimization as well as in evaluating the economics of different reactor strategies. After a succinct introduction to the phases of a fuel cycle, uranium cost trends are assessed in a global perspective and subsequent chapters deal with the fuel cycle problems faced by a power utility. A fundamental material flow model is introduced first in the context of light water reactor fuel cycles. Besides the minimum cost criterion, the text also deals with other objectives providing for a treatment of cost uncertainties and of the risk of proliferation of nuclear weapons. Methods to assess mixed reactor strategies, comprising also other reactor types than the light water reactor, are confined to cost minimization. In the final Chapter, the integration of nuclear capacity within a generating system is examined. (author)

  17. Abstract analysis method facilitates filtering low-methodological quality and high-bias risk systematic reviews on psoriasis interventions.

    Science.gov (United States)

    Gómez-García, Francisco; Ruano, Juan; Aguilar-Luque, Macarena; Alcalde-Mellado, Patricia; Gay-Mimbrera, Jesús; Hernández-Romero, José Luis; Sanz-Cabanillas, Juan Luis; Maestre-López, Beatriz; González-Padilla, Marcelino; Carmona-Fernández, Pedro J; García-Nieto, Antonio Vélez; Isla-Tejera, Beatriz

    2017-12-29

    Article summaries' information and structure may influence researchers/clinicians' decisions to conduct deeper full-text analyses. Specifically, abstracts of systematic reviews (SRs) and meta-analyses (MA) should provide structured summaries for quick assessment. This study explored a method for determining the methodological quality and bias risk of full-text reviews using abstract information alone. Systematic literature searches for SRs and/or MA about psoriasis were undertaken on MEDLINE, EMBASE, and Cochrane database. For each review, quality, abstract-reporting completeness, full-text methodological quality, and bias risk were evaluated using Preferred Reporting Items for Systematic Reviews and Meta-analyses for abstracts (PRISMA-A), Assessing the Methodological Quality of Systematic Reviews (AMSTAR), and ROBIS tools, respectively. Article-, author-, and journal-derived metadata were systematically extracted from eligible studies using a piloted template, and explanatory variables concerning abstract-reporting quality were assessed using univariate and multivariate-regression models. Two classification models concerning SRs' methodological quality and bias risk were developed based on per-item and total PRISMA-A scores and decision-tree algorithms. This work was supported, in part, by project ICI1400136 (JR). No funding was received from any pharmaceutical company. This study analysed 139 SRs on psoriasis interventions. On average, they featured 56.7% of PRISMA-A items. The mean total PRISMA-A score was significantly higher for high-methodological-quality SRs than for moderate- and low-methodological-quality reviews. SRs with low-bias risk showed higher total PRISMA-A values than reviews with high-bias risk. In the final model, only 'authors per review > 6' (OR: 1.098; 95%CI: 1.012-1.194), 'academic source of funding' (OR: 3.630; 95%CI: 1.788-7.542), and 'PRISMA-endorsed journal' (OR: 4.370; 95%CI: 1.785-10.98) predicted PRISMA-A variability. Reviews with a
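
    A hedged sketch of the classification step follows: predicting a quality label from abstract-level features such as the total PRISMA-A score, author count, funding source and journal endorsement, using a decision tree. The synthetic features, labels and thresholds are invented; only the general modelling approach mirrors the study.

```python
# Hedged sketch: decision-tree classification of review quality from abstract features.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 139
X = np.column_stack([
    rng.integers(0, 13, n),          # toy total PRISMA-A score (items reported)
    rng.integers(1, 15, n),          # number of authors
    rng.integers(0, 2, n),           # academic funding (0/1)
    rng.integers(0, 2, n),           # PRISMA-endorsed journal (0/1)
])
y = (X[:, 0] >= 7).astype(int)       # toy label standing in for "high methodological quality"

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["prisma_a", "authors",
                                       "academic_funding", "prisma_journal"]))
```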

  18. Bicycle Frame Prediction Techniques with Fuzzy Logic Method

    Directory of Open Access Journals (Sweden)

    Rafiuddin Syam

    2015-03-01

    Full Text Available In general, an appropriately sized bike frame gives comfort to the rider while biking. This study aims to build a simulation system that predicts bike frame sizes using fuzzy logic. The testing method used is simulation testing. In this study, fuzzy logic is simulated using the Matlab language to test its performance. The Mamdani fuzzy logic uses 3 input variables and 1 output variable. Triangular membership functions are used for the inputs and the output. The controller is designed as a Mamdani type with max-min composition, and defuzzification uses the center of gravity method. The results showed that height, inseam and crank size generate an appropriate frame size for the rider, associated with comfort. Height has a range between 142 cm and 201 cm. Inseam has a range between 64 cm and 97 cm. Crank size has a range between 175 mm and 180 mm. The simulation results have a range of frame sizes between 13 inches and 22 inches. Using fuzzy logic, the bicycle frame size suitable for the rider can be predicted.
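
    A minimal Mamdani-style sketch in plain Python is shown below (the study itself used Matlab): triangular memberships, min implication, max aggregation and centre-of-gravity defuzzification. The membership breakpoints and the small rule base are illustrative, not the paper's actual fuzzy system, and crank size is omitted for brevity.

```python
# Minimal Mamdani fuzzy inference sketch for frame-size prediction (illustrative).
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

frame = np.linspace(13, 22, 181)                      # output universe (inches)
frame_small = tri(frame, 12, 14, 17)
frame_med   = tri(frame, 15, 17.5, 20)
frame_large = tri(frame, 18, 21, 23)

def predict_frame(height_cm, inseam_cm):
    # Fuzzify the two inputs with illustrative breakpoints.
    h_short, h_tall = tri(height_cm, 140, 150, 175), tri(height_cm, 165, 190, 205)
    i_short, i_long = tri(inseam_cm, 62, 70, 82), tri(inseam_cm, 76, 90, 100)
    # Rules: short rider -> small frame, tall rider -> large frame, mixed -> medium.
    agg = np.maximum.reduce([
        np.minimum(min(h_short, i_short), frame_small),
        np.minimum(max(min(h_short, i_long), min(h_tall, i_short)), frame_med),
        np.minimum(min(h_tall, i_long), frame_large),
    ])
    return (frame * agg).sum() / agg.sum()            # centre-of-gravity defuzzification

print(f"suggested frame size: {predict_frame(170, 78):.1f} inches")
```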

  19. Bicycle Frame Prediction Techniques with Fuzzy Logic Method

    Directory of Open Access Journals (Sweden)

    Rafiuddin Syam

    2017-03-01

    Full Text Available In general, an appropriately sized bike frame gives comfort to the rider while biking. This study aims to build a simulation system that predicts bike frame sizes using fuzzy logic. The testing method used is simulation testing. In this study, fuzzy logic is simulated using the Matlab language to test its performance. The Mamdani fuzzy logic uses 3 input variables and 1 output variable. Triangular membership functions are used for the inputs and the output. The controller is designed as a Mamdani type with max-min composition, and defuzzification uses the center of gravity method. The results showed that height, inseam and crank size generate an appropriate frame size for the rider, associated with comfort. Height has a range between 142 cm and 201 cm. Inseam has a range between 64 cm and 97 cm. Crank size has a range between 175 mm and 180 mm. The simulation results have a range of frame sizes between 13 inches and 22 inches. Using fuzzy logic, the bicycle frame size suitable for the rider can be predicted.

  20. Techniques and methodologies to identify potential generated industries of NORM in Angola Republic and evaluate its impacts

    International Nuclear Information System (INIS)

    Diogo, José Manuel Sucumula

    2017-01-01

    Numerous steps have been taken worldwide to identify and quantify the radiological risks associated with the mining of ores containing Naturally Occurring Radioactive Material (NORM), which often results in unnecessary exposures of individuals and high environmental damage, with devastating consequences for the health of workers and damage to the economies of many countries due to a lack of regulations or inadequate regulations. For these and other reasons, the objective of this work was to identify potential NORM-generating industries in the Republic of Angola and to estimate their radiological environmental impacts. To achieve this objective, we studied the theoretical aspects and identified the main industrial activities internationally recognized as NORM generators. The Brazilian experience on the regulatory side was considered, including the evaluation criteria used to classify NORM-generating industries, the mining methods and their radiological environmental impacts, as well as the main techniques applied to evaluate radionuclide concentrations in a specific environmental matrix and/or a NORM sample. The study approach allowed the elaboration of a NORM map for the main provinces of Angola, establishing the evaluation criteria for implementing a Radiation Protection Plan in the extractive industry, establishing measures to control ionizing radiation in mining, and identifying and quantifying the radionuclides present in oil lees samples. However, in order to assess the radiological environmental impact of the NORM industry adequately, it is not enough to identify these industries; it is important to know the origin, quantify the radioactive material released as liquid and gaseous effluents, identify the main routes of exposure and examine how this material spreads into the environment until it reaches man. (author)

  1. Methodological Foundations for the Empirical Evaluation of Non-Experimental Methods in Field Settings

    Science.gov (United States)

    Wong, Vivian C.; Steiner, Peter M.

    2015-01-01

    Across the disciplines of economics, political science, public policy, and now, education, the randomized controlled trial (RCT) is the preferred methodology for establishing causal inference about program impacts. But randomized experiments are not always feasible because of ethical, political, and/or practical considerations, so non-experimental…

  2. Toward a Methodology of Death: Deleuze's "Event" as Method for Critical Ethnography

    Science.gov (United States)

    Rodriguez, Sophia

    2016-01-01

    This article examines how qualitative researchers, specifically ethnographers, might utilize complex philosophical concepts in order to disrupt the normative truth-telling practices embedded in social science research. Drawing on my own research experiences, I move toward a methodology of death (for researcher/researched alike) grounded in…

  3. Methodologie, Eclectisme...et Bricolage Pedagogique (Methodology, Selective Teaching Methods... and Pedagogical Odds and Ends).

    Science.gov (United States)

    De Salins, Genevieve-Dominique

    1996-01-01

    Focuses on the contents of foreign language texts and the types of foreigners who learn French. The article notes the tone of didactic discussions in these manuals and questions the usage of two terms appearing there--"methodology" and "selection." (Author/CK)

  4. Problem-solving and developing quality management methods and techniques on the example of automotive industry

    OpenAIRE

    Jacek Łuczak; Radoslaw Wolniak

    2015-01-01

    The knowledge about methods and techniques of quality management together with their effective use can be definitely regarded as an indication of high organisational culture. Using such methods and techniques in an effective way can be attributed to certain level of maturity, as far as the quality management system in an organisation is concerned. There is in the paper an analysis of problem-solving methods and techniques of quality management in the automotive sector in Poland. The survey wa...

  5. Study and synthesis of orthophosphates by electrolytic method: new methodology in the generation of nanostructured materials

    International Nuclear Information System (INIS)

    Montalbert-Smith Echeverria, Ricardo

    2009-01-01

    An electrochemical synthesis of orthophosphate compounds (PO₄³⁻) of divalent cations was carried out to establish a standardized synthetic route for the production of nano-sized particles. The hypothesis was that the use of common ligands produces a supersaturated system of ions and that the application of an electric current to the system acts as a generator of electromotive force and of nanometric crystals of a specific phase. Using this method, the following carbonated apatites were synthesized: nanometre-sized and/or calcium-deficient hydroxyapatite, nanometre-sized strontium carbonated apatite, barium carbonated apatite, apatite doped with magnesium and lanthanum cations, and apatite doped with silicate anions. A study was carried out to find any dependence of particle size on the initial pH and the current density. A Pawley crystallographic analysis was used to determine lattice parameters and crystallite size from the diffraction patterns. The phases produced were also confirmed with complementary techniques such as FT-IR, thermogravimetric analysis (TGA) and elemental analysis (EDS). Kinetic studies following the oxidation of EDTA and the uptake of calcium were conducted to determine the precise point of the electrolytic apatite synthesis reaction. (author)

  6. Cooperative project on methods and techniques for assessment of ageing and safety of nuclear objects

    International Nuclear Information System (INIS)

    Bundara, B.; Udovc, M.; Cvelbar, R.; Vojvodic Tuma, J.; Celin, R.; Cizelj, L.; Simonovski, I.; Pirs, B.; Zabric, I.

    2007-01-01

    Nuclear power plants are so far the most demanding electric power plants in terms of the extent and complexity of knowledge needed for design, construction, installation, safe operation and proper maintenance. For safe operation of an NPP it is important to have reliable inspection procedures and methods to detect the relevant defects in the components. It is also important to have effective techniques and an efficient methodology that enable precise estimation of material degradation and reliable prediction of the remaining period of safe service of structures and components. During the operation of an NPP, its materials, structures and components are exposed to various impacts that result in changes in the material. The changes usually manifest as deviations from the original state (generally considered as defects) and can be observed at the level of the microstructure and/or at the structural level. Defects are a consequence of ageing, and ageing is a consequence of mechanical, thermal, chemical, radiation-induced and other processes. The complexity of an NPP and continuous operation at a high level of safety demand extensive cooperation of researchers and engineers with different scientific and educational backgrounds. The paper discusses the importance of sufficient support for NPP-related research projects and the need for cooperation between institutes. As an example, a cooperative project is presented that binds research groups with different scientific backgrounds into a complementary team working on a multidisciplinary project focused on the assessment of ageing and safety of nuclear objects. (author)

  7. A Testable Design Method for Memories by Boundary Scan Technique

    Directory of Open Access Journals (Sweden)

    Qiao Guo-Hui

    2016-01-01

    Full Text Available This paper presents a design for testing the embedded flash in a target System-on-a-Chip (SoC). The Flash TAP (Test Access Port) complies with IEEE Std 1149.1, and it can select different scan chains and other control registers for other tests. As a trade-off between test time and circuit area, an IST (In-System Test) circuit is designed into the SoC. Experimental results on the embedded memory show that the proposed method requires only a small test time through the use of the IST.

  8. Reliability Evaluation Methodologies of Fault Tolerant Techniques of Digital I and C Systems in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Bo Gyung; Kang, Hyun Gook; Seong, Poong Hyun; Lee, Seung Jun

    2011-01-01

    Since the reactor protection system was replaced from analog to digital, the digital reactor protection system has 4 redundant channels and each channel has several modules. Various fault-tolerant techniques are necessary to improve availability and reliability because complex components are used in the DPPS. To use the digital system, it is necessary to improve the reliability and availability of the system through fault-tolerant techniques. Several studies have addressed the effects of fault-tolerant techniques. However, the effects of fault-tolerant techniques have not yet been properly considered in most fault tree models. The various fault-tolerant techniques used in digital systems in NPPs should be reflected in fault tree analysis to obtain lower system unavailability and a more reliable PSA. When fault-tolerant techniques are modeled in a fault tree, the modules covered by each fault-tolerant technique, the fault coverage, the detection period and the fault recovery should be considered. Further work will concentrate on various aspects of fault tree modeling. We will identify other important factors and develop a new theory to construct the fault tree model.

  9. Experimental investigation of the predictive capabilities of data driven modeling techniques in hydrology - Part 1: Concepts and methodology

    Directory of Open Access Journals (Sweden)

    A. Elshorbagy

    2010-10-01

    Full Text Available A comprehensive data driven modeling experiment is presented in a two-part paper. In this first part, an extensive data-driven modeling experiment is proposed. The most important concerns regarding the way data driven modeling (DDM techniques and data were handled, compared, and evaluated, and the basis on which findings and conclusions were drawn are discussed. A concise review of key articles that presented comparisons among various DDM techniques is presented. Six DDM techniques, namely, neural networks, genetic programming, evolutionary polynomial regression, support vector machines, M5 model trees, and K-nearest neighbors are proposed and explained. Multiple linear regression and naïve models are also suggested as baseline for comparison with the various techniques. Five datasets from Canada and Europe representing evapotranspiration, upper and lower layer soil moisture content, and rainfall-runoff process are described and proposed, in the second paper, for the modeling experiment. Twelve different realizations (groups from each dataset are created by a procedure involving random sampling. Each group contains three subsets; training, cross-validation, and testing. Each modeling technique is proposed to be applied to each of the 12 groups of each dataset. This way, both prediction accuracy and uncertainty of the modeling techniques can be evaluated. The description of the datasets, the implementation of the modeling techniques, results and analysis, and the findings of the modeling experiment are deferred to the second part of this paper.
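
    The evaluation protocol described above can be sketched as follows: draw several random realisations of a dataset, split each into training, cross-validation and testing subsets, and compare a data-driven model against a multiple linear regression and a naive baseline over the realisations. The dataset, model choices and split sizes below are illustrative assumptions, not the paper's data.

```python
# Hedged sketch of a multi-realisation model-comparison protocol.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neighbors import KNeighborsRegressor
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
X = rng.uniform(size=(600, 4))                                        # e.g. forcings
y = 2 * X[:, 0] + np.sin(6 * X[:, 1]) + 0.1 * rng.normal(size=600)    # e.g. runoff

scores = {"knn": [], "mlr": [], "naive": []}
for realisation in range(12):                      # 12 random groups, as in the paper
    idx = rng.permutation(len(y))
    # Cross-validation subset would drive model selection; it is left unused here.
    train, cval, test = np.split(idx, [360, 480])
    knn = KNeighborsRegressor(n_neighbors=5).fit(X[train], y[train])
    mlr = LinearRegression().fit(X[train], y[train])
    scores["knn"].append(mean_squared_error(y[test], knn.predict(X[test])))
    scores["mlr"].append(mean_squared_error(y[test], mlr.predict(X[test])))
    scores["naive"].append(mean_squared_error(y[test], np.full(len(test), y[train].mean())))

for name, mse in scores.items():   # spread over realisations reflects prediction uncertainty
    print(f"{name}: mean MSE {np.mean(mse):.4f} +/- {np.std(mse):.4f}")
```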

  10. Optimisation of warpage on plastic injection moulding part using response surface methodology (RSM) and genetic algorithm method (GA)

    Science.gov (United States)

    Miza, A. T. N. A.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    In this study, Computer Aided Engineering was used for injection moulding simulation. The Design of Experiments (DOE) method was applied according to a Latin square orthogonal array. The relationship between the injection moulding parameters and warpage was identified based on the experimental data used. Response Surface Methodology (RSM) was used to validate the model accuracy. Then, the RSM and GA methods were combined to determine the optimum injection moulding process parameters. The optimisation of injection moulding is thereby largely improved, and the results show increased accuracy and reliability. The proposed method combining RSM and the GA method also contributes to minimising the warpage that occurs.
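
    A hedged sketch of the RSM-plus-GA combination follows: fit a quadratic response surface to DOE results relating process parameters to warpage, then minimise the fitted surface with a simple genetic algorithm. The toy DOE data, scaled parameter ranges and GA settings are invented for illustration.

```python
# Hedged sketch: quadratic response surface (RSM) fitted to DOE data, then
# minimised by a small genetic algorithm. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Toy DOE table: columns = melt temperature, packing pressure, cooling time (scaled 0-1).
X = rng.uniform(size=(25, 3))
warpage = (0.4 + (X[:, 0] - 0.3) ** 2 + 0.5 * (X[:, 1] - 0.7) ** 2
           + 0.2 * X[:, 2] + 0.01 * rng.normal(size=25))

def quad_features(P):
    """Full quadratic model terms: 1, x_i, x_i*x_j, x_i^2."""
    cols = [np.ones(len(P))] + [P[:, i] for i in range(3)]
    cols += [P[:, i] * P[:, j] for i in range(3) for j in range(i, 3)]
    return np.column_stack(cols)

beta = np.linalg.lstsq(quad_features(X), warpage, rcond=None)[0]   # RSM fit
predict = lambda P: quad_features(P) @ beta

# Simple GA: truncation selection, blend crossover, Gaussian mutation.
pop = rng.uniform(size=(60, 3))
for generation in range(80):
    fitness = predict(pop)
    parents = pop[np.argsort(fitness)[:20]]                        # keep the best 20
    mates = parents[rng.integers(0, 20, size=(60, 2))]
    pop = np.clip(mates.mean(axis=1) + 0.05 * rng.normal(size=(60, 3)), 0, 1)

best = pop[np.argmin(predict(pop))]
print("parameter setting minimising predicted warpage:", np.round(best, 3))
```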

  11. Development of techniques and methods for evaluation of quality of scanned image in mammography

    International Nuclear Information System (INIS)

    Carmo Santana, P. do; Nogueira, M.S.

    2008-01-01

    Cancer is the second cause of death in the Brazilian female population and breast cancer is the most frequent neoplasm amongst women. Mammography is an essential tool for diagnosis and early detection of this disease. In order to be effective, the mammography must be of good quality. The Brazilian College of Radiology (CBR), the National Agency for Health Surveillance (ANVISA) and international bodies recommend standards of practice for mammography. Due to the risk of ionizing radiation, techniques that minimize dose and optimize image quality are essential to ensure that all women are submitted to mammography procedures of high quality for the detection of breast cancer. In this research, components of digital image processing were analyzed, and methods and techniques of analysis were developed aimed at the detection of structures for medical diagnosis, decreasing variations due to subjectivity. The free software ImageJ was used to evaluate the information contained in the scanned images. Scanned calibration images of a simulated breast were used to calibrate ImageJ; in this way, it was able to correctly convert the gray-scale values into optical density values, presenting the standard deviation for each measurement made. Applying Student's t-test, it was noted that the values obtained with the digital system for contrast and spatial resolution are consistent with the results obtained subjectively, since there was no significant difference (p <0.05) for all comparisons evaluated. This methodology is therefore recommended for routine evaluations of mammography services. (author)
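
    The calibration step can be illustrated with a small sketch: fit a curve mapping scanner gray levels to optical density from step-wedge measurements, then convert image regions through it. The calibration points and region of interest are invented; in the study this step was carried out with ImageJ.

```python
# Hedged sketch: gray-level to optical-density calibration and conversion.
import numpy as np

gray_levels     = np.array([250, 210, 165, 120, 80, 45, 20])   # measured on a step wedge
optical_density = np.array([0.2, 0.6, 1.0, 1.4, 1.8, 2.2, 2.6])

coeff = np.polyfit(gray_levels, optical_density, 3)             # smooth calibration curve

def gray_to_od(image):
    """Convert an array of gray values into optical density via the fitted curve."""
    return np.polyval(coeff, np.asarray(image, dtype=float))

roi = np.array([[180, 175], [172, 169]])                        # hypothetical region of interest
od = gray_to_od(roi)
print(f"mean OD {od.mean():.2f}, standard deviation {od.std():.3f}")
```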

  12. Nuclear-fuel-cycle optimization: methods and modelling techniques

    International Nuclear Information System (INIS)

    Silvennoinen, P.

    1982-01-01

    This book present methods applicable to analyzing fuel-cycle logistics and optimization as well as in evaluating the economics of different reactor strategies. After an introduction to the phases of a fuel cycle, uranium cost trends are assessed in a global perspective. Subsequent chapters deal with the fuel-cycle problems faced by a power utility. The fuel-cycle models cover the entire cycle from the supply of uranium to the disposition of spent fuel. The chapter headings are: Nuclear Fuel Cycle, Uranium Supply and Demand, Basic Model of the LWR (light water reactor) Fuel Cycle, Resolution of Uncertainties, Assessment of Proliferation Risks, Multigoal Optimization, Generalized Fuel-Cycle Models, Reactor Strategy Calculations, and Interface with Energy Strategies. 47 references, 34 figures, 25 tables

  13. Intelligent Search Method Based ACO Techniques for a Multistage Decision Problem EDP/LFP

    Directory of Open Access Journals (Sweden)

    Mostefa RAHLI

    2006-07-01

    Full Text Available The implementation of a numerical library for optimization-based calculation in the area of electrical supply networks is at the centre of current research orientations; our project is thus centred on the development of the platform NMSS. It is a software environment that saves considerable effort in load calculations, curve smoothing, loss calculations and economic planning of the generated powers [23]. Operational research [17] on the one hand and industrial practice on the other prove that the means and processes of simulation have reached a very appreciable level of reliability and mathematical confidence [4, 5, 14]. It is from this expert observation that many processes place confidence in the results of simulation. The handicap of this approach or methodology is that it bases its judgments and manipulations on simplified assumptions and constraints whose influence was deliberately neglected, to be added to the cost to be spent [14]. By juxtaposing simulation methods with artificial intelligence techniques, a set of numerical methods acquires an optimal reliability whose assurance leaves no room for doubt. The software environment NMSS [23] can serve as a rallying point for simulation techniques and electric network calculation via a graphic interface, and the same software integrates an AI capability via an expert system module. Our problem is a multistage case in which the stages are completely dependent and cannot be performed separately. For a multistage problem [21, 22], the results obtained from a credible (large size) problem calculation raise the following question: could the choice of a set of numerical methods make the calculation of a complete problem, using more than two treatment levels, have the smallest possible total error? It is well known, according to algorithmic practice, that each treatment can be characterized by a function called mathematical complexity. This complexity is in fact a cost (a weight overloading

  14. 34 CFR 429.1 - What is the Bilingual Vocational Materials, Methods, and Techniques Program?

    Science.gov (United States)

    2010-07-01

    ... techniques for bilingual vocational training for individuals with limited English proficiency. (Authority..., and Techniques Program? 429.1 Section 429.1 Education Regulations of the Offices of the Department of... MATERIALS, METHODS, AND TECHNIQUES PROGRAM General § 429.1 What is the Bilingual Vocational Materials...

  15. Methodological and Methodical Principles of the Empirical Study of Spiritual Development of a Personality

    Directory of Open Access Journals (Sweden)

    Olga Klymyshyn

    2017-06-01

    Full Text Available The article reveals the essence of the methodological principles of the spiritual development of a personality. It takes into consideration the results of a theoretical analysis of the psychological content of spirituality from the standpoint of a systemic and structural approach to the study of personality, the age patterns of mental development of the personality, the sacramental nature of the human person, and the mechanisms of human spiritual development. An interpretation of spirituality and of the spiritual development of a personality is given. The initial principles of the organization of empirical research on the spiritual development of a personality (ontogenetic, sociocultural, self-determination, systemic) are presented. Parameters for the estimation of a personality's spiritual development are described: a general index of the development of spiritual potential; indexes of the development of the ethical, aesthetical, cognitive and existential components of spirituality; and an index of the religiousness of a personality. The methodological support of the psychological diagnostic research is defined.

  16. Application of the HGPT methodology of reactor operation problems with a nodal mixed method

    International Nuclear Information System (INIS)

    Baudron, A.M.; Bruna, G.B.; Gandini, A.; Lautard, J.J.; Monti, S.; Pizzigati, G.

    1998-01-01

    The heuristically based generalized perturbation theory (HGPT), to first and higher order, applied to the neutron field of a reactor system, is discussed in relation to quasistatic problems. This methodology is of particular interest in reactor operation. In this application it may allow an on-line appraisal of the main physical responses of the reactor system when subject to alterations relevant to normal system exploitation, e.g. control rod movement, and/or soluble boron concentration changes to be introduced, for instance, for compensating power level variations following electrical network demands. In this paper, after describing the main features of the theory, its implementation into the diffusion, 3D mixed dual nodal code MINOS of the SAPHYR system is presented. The results from a small scale investigation performed on a simplified PWR system corroborate the validity of the methodology proposed

  17. Surface Signature Characterization at SPE through Ground-Proximal Methods: Methodology Change and Technical Justification

    Energy Technology Data Exchange (ETDEWEB)

    Schultz-Fellenz, Emily S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-09-09

    A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation and careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.

  18. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    Science.gov (United States)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

    An integrated geophysical investigation was performed at S dam located at Dadu basin in China to assess the condition of the dam curtain. The key methodology of the integrated technique used was flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were complemented with resistivity logging to identify the internal erosion which had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection test, and rock quality designation.

  19. Methods and techniques of detecting petroleum-polluted water. Metody i tekhnika obnaruzheniya neftyanykh zagroyaznenii vod

    Energy Technology Data Exchange (ETDEWEB)

    Bogorodskii, V V; Kropotkin, M A; Sheveleva, T Yu

    1975-01-01

    The booklet presents physical principles and techniques of contact and remote sensing of oil pollution. Different methods and their practical possibilities are discussed. The possibility of application of remote CO/sub 2/-laser radar technique for the detection of oil pollution is considered. The booklet may be useful for specialists in oceanology, oceanic physics, meteorology, and in remote physical methods for environmental studies.

  20. Isotopic dilution methods to determine the gross transformation rates of nitrogen, phosphorus, and sulfur in soil: a review of the theory, methodologies, and limitations

    International Nuclear Information System (INIS)

    Di, H. J.; Cameron, K. C.; McLaren, R. G.

    2000-01-01

    The rates at which nutrients are released to, and removed from, the mineral nutrient pool are important in regulating the nutrient supply to plants. These nutrient transformation rates need to be taken into account when developing nutrient management strategies for economical and sustainable production. A method that is gaining popularity for determining the gross transformation rates of nutrients in the soil is the isotopic dilution technique. The technique involves labelling a soil mineral nutrient pool, e.g. NH₄⁺, NO₃⁻, PO₄³⁻, or SO₄²⁻, and monitoring the changes with time of the size of the labelled nutrient pool and the excess tracer abundance (atom %, if stable isotope tracer is used) or specific activity (if radioisotope is used) in the nutrient pool. Because of the complexity of the concepts and procedures involved, the method has sometimes been used incorrectly, and results misinterpreted. This paper discusses the isotopic dilution technique, including the theoretical background, the methodologies to determine the gross flux rates of nitrogen, phosphorus, and sulfur, and the limitations of the technique. The assumptions, conceptual models, experimental procedures, and compounding factors are discussed. Possible effects on the results by factors such as the uniformity of tracer distribution in the soil, changes in soil moisture content, substrate concentration, and aeration status, and duration of the experiment are also discussed. The influx and out-flux transformation rates derived from this technique are often contributed by several processes simultaneously, and thus cannot always be attributed to a particular nutrient transformation process. Despite the various constraints or possible compounding factors, the technique is a valuable tool that can provide important quantitative information on nutrient dynamics in the soil-plant system. Copyright (2000) CSIRO Publishing
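
    One common form of the pool-dilution calculation (after Kirkham and Bartholomew) is reproduced below for orientation; the notation is introduced here for illustration and is not quoted from the review.

```latex
% Pool-dilution (isotopic dilution) rate equations in one common form.
% M_0, M_t : size of the labelled mineral pool (e.g. NH4+) at times 0 and t
% H_0, H_t : amount of tracer (e.g. excess 15N) in that pool at times 0 and t
\[
  m = \frac{M_0 - M_t}{t}\,
      \frac{\ln\!\left(\dfrac{H_0 M_t}{H_t M_0}\right)}{\ln\!\left(\dfrac{M_0}{M_t}\right)},
  \qquad
  c = \frac{M_0 - M_t}{t}\,
      \frac{\ln\!\left(\dfrac{H_0}{H_t}\right)}{\ln\!\left(\dfrac{M_0}{M_t}\right)}
\]
```

    Here m is the gross influx (e.g. gross mineralisation) rate and c the gross consumption rate of the pool, under the assumptions that both rates remain constant over the measurement interval and that the tracer is consumed in proportion to its abundance in the pool.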

  1. Basin Visual Estimation Technique (BVET) and Representative Reach Approaches to Wadeable Stream Surveys: Methodological Limitations and Future Directions

    Science.gov (United States)

    Lance R. Williams; Melvin L. Warren; Susan B. Adams; Joseph L. Arvai; Christopher M. Taylor

    2004-01-01

    Basin Visual Estimation Techniques (BVET) are used to estimate abundance for fish populations in small streams. With BVET, independent samples are drawn from natural habitat units in the stream rather than sampling "representative reaches." This sampling protocol provides an alternative to traditional reach-level surveys, which are criticized for their lack...

  2. Nuclear techniques in the study of pollutant transport in the environment. Interaction of solutes with geological media (methodological aspects)

    International Nuclear Information System (INIS)

    1993-07-01

    This volume includes a summary of the 5-year co-ordinated research programme to use nuclear techniques for the study of the transport of pollutants (both radioactive and non-radioactive) in the environment as well as twelve individual reports of the different activities performed under the programme. These have been indexed separately. Refs, figs and tabs

  3. Gamma-ray spectrometry combined with acceptable knowledge (GSAK). A technique for characterization of certain remote-handled transuranic (RH-TRU) wastes. Part 1. Methodology and techniques

    International Nuclear Information System (INIS)

    Hartwell, J.K.; McIlwain, M.E.

    2005-01-01

    Gamma-ray spectrometry combined with acceptable knowledge (GSAK) is a technique for the characterization of certain remote-handled transuranic (RH-TRU) wastes. GSAK uses gamma-ray spectrometry to quantify a portion of the fission product inventory of RH-TRU wastes. These fission product results are then coupled with calculated inventories derived from acceptable process knowledge to characterize the radionuclide content of the assayed wastes. GSAK has been evaluated and tested through several test exercises. The GSAK approach is described here, while test results are presented in Part II. (author)

  4. Mixed methods research in tobacco control with youth and young adults: A methodological review of current strategies.

    Directory of Open Access Journals (Sweden)

    Craig S Fryer

    Full Text Available Tobacco use among young people is a complex and serious global dilemma that demands innovative and diverse research approaches. The purpose of this methodological review was to examine the current use of mixed methods research in tobacco control with youth and young adult populations and to develop practical recommendations for tobacco control researchers interested in this methodology. Using PubMed, we searched five peer-reviewed journals that publish tobacco control empirical literature for the use of mixed methods research to study young populations, age 12-25 years. Our team analyzed the features of each article in terms of tobacco control topic, population, youth engagement strategies, and several essential elements of mixed methods research. We identified 23 mixed methods studies published by authors from five different countries reported between 2004 and 2015. These 23 articles examined various topics that included tobacco use behavior, tobacco marketing and branding, and cessation among youth and young adults. The most common mixed methods approach was variations of the concurrent design in which the qualitative and quantitative strands were administered at the same time and given equal priority. This review documented several innovative applications of mixed methods research as well as challenges in the reporting of the complex research designs. The use of mixed methods research in tobacco control has great potential for advancing the understanding of complex behavioral and sociocultural issues for all groups, especially youth and young adults.

  5. Mixed methods research in tobacco control with youth and young adults: A methodological review of current strategies.

    Science.gov (United States)

    Fryer, Craig S; Seaman, Elizabeth L; Clark, Rachael S; Plano Clark, Vicki L

    2017-01-01

    Tobacco use among young people is a complex and serious global dilemma that demands innovative and diverse research approaches. The purpose of this methodological review was to examine the current use of mixed methods research in tobacco control with youth and young adult populations and to develop practical recommendations for tobacco control researchers interested in this methodology. Using PubMed, we searched five peer-reviewed journals that publish tobacco control empirical literature for the use of mixed methods research to study young populations, age 12-25 years. Our team analyzed the features of each article in terms of tobacco control topic, population, youth engagement strategies, and several essential elements of mixed methods research. We identified 23 mixed methods studies published by authors from five different countries reported between 2004 and 2015. These 23 articles examined various topics that included tobacco use behavior, tobacco marketing and branding, and cessation among youth and young adults. The most common mixed methods approach was variations of the concurrent design in which the qualitative and quantitative strands were administered at the same time and given equal priority. This review documented several innovative applications of mixed methods research as well as challenges in the reporting of the complex research designs. The use of mixed methods research in tobacco control has great potential for advancing the understanding of complex behavioral and sociocultural issues for all groups, especially youth and young adults.

  6. Evaluation of the effectiveness of the three-dimensional residual stresses method based on the eigenstrain methodology via x-ray measurements

    International Nuclear Information System (INIS)

    Ogawa, Masaru; Ishii, Takehiro; Furusako, Seiji

    2015-01-01

    In order to prevent fractures caused by fatigue or stress corrosion cracking in welded structures, it is important to predict crack propagation for cracks observed during in-service inspections. However, it is difficult to evaluate three-dimensional welding residual stresses non-destructively. Today, it is possible to measure residual stresses only on the surface by X-ray diffraction. Neutron diffraction makes it possible to measure welding residual stresses non-destructively even in the thickness direction, but it is only available in special irradiation facilities. Therefore, it is impossible to use neutron diffraction as an on-site measurement technique. As a non-destructive method for evaluating three-dimensional welding residual stresses based on the eigenstrain methodology, the bead flush method has been proposed. In this method, three-dimensional welding residual stresses are calculated by an elastic FEM (Finite Element Method) analysis from eigenstrain distributions, which are estimated by an inverse analysis from the strains released, as measured by strain gauges, upon removal of the weld reinforcement. Here, the removal of the excess metal contributes to the inhibition of crack initiation, so the bead flush method is essentially a non-destructive technique. However, the estimation accuracy of this method becomes relatively poor when processing strains are introduced on the machined surface. The first author has developed the bead flush method further to be free from the influence of the processing strains. In that variant, eigenstrains are estimated not from released strains but from residual strains measured on the surface by X-ray diffraction. In this study, welding residual stresses on the bottom surface of an actual welded plate are estimated from elastic strains measured on the top surface using this method. To evaluate estimation accuracy, the estimated residual stresses on the bottom surface are compared with residual stresses measured by X-ray diffraction. Here, eigenstrain distributions not only in the welding

  7. Testing methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs.

  8. Testing methodologies

    International Nuclear Information System (INIS)

    Bender, M.A.

    1990-01-01

    Several methodologies are available for screening human populations for exposure to ionizing radiation. Of these, aberration frequency determined in peripheral blood lymphocytes is the best developed. Individual exposures to large doses can easily be quantitated, and population exposures to occupational levels can be detected. However, determination of exposures to the very low doses anticipated from a low-level radioactive waste disposal site is more problematical. Aberrations occur spontaneously, without known cause. Exposure to radiation induces no new or novel types, but only increases their frequency. The limitations of chromosomal aberration dosimetry for detecting low level radiation exposures lie mainly in the statistical ''signal to noise'' problem, the distribution of aberrations among cells and among individuals, and the possible induction of aberrations by other environmental, occupational or medical exposures. However, certain features of the human peripheral lymphocyte-chromosomal aberration system make it useful in screening for certain types of exposures. Future technical developments may make chromosomal aberration dosimetry more useful for low-level radiation exposures. Other methods, measuring gene mutations or even minute changes on the DNA level, while presently less well developed techniques, may eventually become even more practical and sensitive assays for human radiation exposure. 15 refs
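
    Aberration yields in biological dosimetry are conventionally described by a linear-quadratic dose-response, Y = c + aD + bD2, fitted to calibration data and then inverted to estimate an unknown exposure from an observed yield; this model is standard in the field but is not stated in the record itself. A hedged Python sketch with invented calibration data:

      import numpy as np
      from scipy.optimize import curve_fit

      def yield_model(dose, c, a, b):
          # Linear-quadratic dose-response for aberration yield per cell
          return c + a * dose + b * dose ** 2

      doses = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 3.0])                 # Gy, hypothetical calibration
      yields = np.array([0.001, 0.007, 0.021, 0.071, 0.261, 0.571])     # aberrations per cell

      params, _ = curve_fit(yield_model, doses, yields, p0=[0.001, 0.02, 0.05])
      c, a, b = params

      observed_yield = 0.15                                             # yield in a screened individual
      # positive root of b*D**2 + a*D + (c - observed_yield) = 0
      dose_est = (-a + (a ** 2 - 4 * b * (c - observed_yield)) ** 0.5) / (2 * b)
      print(f"estimated dose: {dose_est:.2f} Gy")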

  9. Proper Methodology and Methods of Collecting and Analyzing Slavery Data: An Examination of the Global Slavery Index

    Directory of Open Access Journals (Sweden)

    Andrew Guth

    2014-11-01

    Full Text Available The Global Slavery Index aims to, among other objectives, recognize the forms, size, and scope of slavery worldwide as well as the strengths and weaknesses of individual countries. An analysis of the Index's methods exposes significant and critical weaknesses and raises questions about its replicability and validity. The Index may prove more valuable in the future if proper methods are implemented, but the longer improper methods are used, the more damage is done to the public policy debate on slavery by advancing data and policy that are not based on sound methodology. To implement proper methods, a committee of sophisticated methodologists needs to develop measurement tools and constantly analyze and refine these methods over the years as data are collected.

  10. For a reasoned development of experimental methods in information and communication sciences Some epistemological findings of methodological pluralism

    Directory of Open Access Journals (Sweden)

    Didier COURBET

    2013-07-01

    Full Text Available If multidisciplinarity is necessary, first, for studying the widest possible set of communication phenomena (organizational, group, interpersonal, media and computer-mediated communication...) and, secondly, for grasping the complexity of the different moments of the same phenomenon of communication (production, content, reception, circulation...), methodological pluralism is also important. However, French research in communication sciences leaves aside a number of phenomena and moments of communication that could be better understood thanks to the experimental method. We will underline that the epistemological issues related to a rational use of the experimental method in communication sciences are not negligible: it allows the study of objects that cannot be investigated with other methods and offers the opportunity to build knowledge by the refutation of hypotheses and theoretical propositions. We will clarify some epistemological misunderstandings concerning this method. First, it is actually a method for studying complex systems and communication processes. Secondly, its use is not incompatible with constructivism.

  11. A Method to Extract the Intrinsic Mechanical Properties of Soft Metallic Thin Films Based on Nanoindentation Continuous Stiffness Measurement Technique

    International Nuclear Information System (INIS)

    Zhou, X Y; Jiang, Z D; Wang, H R; Zhu, Q

    2006-01-01

    In order to accurately determine the intrinsic hardness of a soft metallic thin film on a hard substrate using nanoindentation, a methodology that is insensitive to several important effects affecting the Oliver-Pharr method is described. First, the original analysis data, such as the load, P, and contact stiffness, S, as a function of the indentation depth, h, are acquired by means of the continuous stiffness measurement (CSM) technique. With CSM, the complicating effects of the indentation creep behaviour of metallic materials as well as of thermal drift on the measured results are effectively avoided. Then, the hardness of the film alone is calculated via a material characteristic parameter, P/S², which is independent of the contact area, A, based on the constant modulus assumption. In this way, the influences of the substrate contribution and of material pile-up behaviour need not be accounted for. Guided by the above ideas, a 504 nm Au film on a glass substrate was chosen as a case study. The results show that the hardness of the Au thin film is 1.6±1 GPa, which agrees well with the literature, whereas the composite hardness measured by the Oliver-Pharr method lies between 2 and 3 GPa and is clearly overestimated. This implies that the present methodology is a more accurate and simpler way of extracting the true hardness of soft metallic thin films
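
    For an ideal Berkovich contact the parameter P/S² relates directly to hardness through H = (4/π)·Er²·(P/S²), so the contact area never has to be evaluated once the reduced modulus Er is assumed constant. A minimal Python sketch with invented CSM data (values are illustrative, not the paper's):

      import numpy as np

      # Area-independent hardness estimate from the parameter P/S^2 (constant-modulus
      # assumption): for an ideal Berkovich contact, H = (4/pi) * Er**2 * P / S**2.
      Er = 80.0e9                                    # assumed reduced modulus, Pa (illustrative)

      # Hypothetical CSM data at several depths: load P (N) and contact stiffness S (N/m)
      P = np.array([0.39e-3, 0.88e-3, 1.57e-3, 2.45e-3])
      S = np.array([4.5e4, 6.7e4, 8.9e4, 1.12e5])

      H = (4.0 / np.pi) * Er ** 2 * P / S ** 2       # hardness at each depth, Pa
      print(H / 1e9)                                  # ~1.6 GPa at every depth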

  12. Beam optimization: improving methodology

    International Nuclear Information System (INIS)

    Quinteiro, Guillermo F.

    2004-01-01

    Different optimization techniques commonly used in biology and food technology allow a systematic and complete analysis of response functions. In spite of the great interest of medical and nuclear physics in the problem of optimizing mixed beams, little attention has been given to sophisticated mathematical tools. Indeed, many such techniques are perfectly suited to the typical problem of beam optimization. This article is intended as a guide to the use of two methods, namely Response Surface Methodology and the Simplex method, which are expected to speed up the optimization process and, at the same time, give more insight into the relationships among the dependent variables controlling the response
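
    As an illustration of the second of these tools, the sketch below minimises a dummy two-parameter "response" (for instance, the relative weights of two beam components) with the Nelder-Mead simplex algorithm; the response function and its optimum are invented for the example.

      from scipy.optimize import minimize

      def response(x):
          # Dummy quadratic response surface with an optimum at (0.3, 0.7)
          w1, w2 = x
          return (w1 - 0.3) ** 2 + 2.0 * (w2 - 0.7) ** 2 + 0.5 * (w1 - 0.3) * (w2 - 0.7)

      result = minimize(response, x0=[0.5, 0.5], method="Nelder-Mead")
      print(result.x, result.fun)                    # converges near (0.3, 0.7)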

  13. Communication methods and production techniques in fixed prosthesis fabrication: a UK based survey. Part 2: Production techniques

    Science.gov (United States)

    Berry, J.; Nesbit, M.; Saberi, S.; Petridis, H.

    2014-01-01

    Aim The aim of this study was to identify the communication methods and production techniques used by dentists and dental technicians for the fabrication of fixed prostheses within the UK from the dental technicians' perspective. This second paper reports on the production techniques utilised. Materials and methods Seven hundred and eighty-two online questionnaires were distributed to the Dental Laboratories Association membership and included a broad range of topics, such as demographics, impression disinfection and suitability, and various production techniques. Settings were managed in order to ensure anonymity of respondents. Statistical analysis was undertaken to test the influence of various demographic variables such as the source of information, the location, and the size of the dental laboratory. Results The number of completed responses totalled 248 (32% response rate). Ninety percent of the respondents were based in England and the majority of dental laboratories were categorised as small sized (working with up to 25 dentists). Concerns were raised regarding inadequate disinfection protocols between dentists and dental laboratories and the poor quality of master impressions. Full arch plastic trays were the most popular impression tray used by dentists in the fabrication of crowns (61%) and bridgework (68%). The majority (89%) of jaw registration records were considered inaccurate. Forty-four percent of dental laboratories preferred using semi-adjustable articulators. Axial and occlusal under-preparation of abutment teeth was reported as an issue in about 25% of cases. Base metal alloy was the most (52%) commonly used alloy material. Metal-ceramic crowns were the most popular choice for anterior (69%) and posterior (70%) cases. The various factors considered did not have any statistically significant effect on the answers provided. The only notable exception was the fact that more methods of communicating the size and shape of crowns were utilised for

  14. Determination of proline in honey: comparison between official methods, optimization and validation of the analytical methodology.

    Science.gov (United States)

    Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe

    2014-05-01

    The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L(-1), limit of detection 20 mg L(-1), limit of quantification 61 mg L(-1). The method was applied to 43 unifloral honey samples from the Marche region, Italy.

  15. Developing "Personality" Taxonomies: Metatheoretical and Methodological Rationales Underlying Selection Approaches, Methods of Data Generation and Reduction Principles.

    Science.gov (United States)

    Uher, Jana

    2015-12-01

    Taxonomic "personality" models are widely used in research and applied fields. This article applies the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) to scrutinise the three methodological steps that are required for developing comprehensive "personality" taxonomies: 1) the approaches used to select the phenomena and events to be studied, 2) the methods used to generate data about the selected phenomena and events and 3) the reduction principles used to extract the "most important" individual-specific variations for constructing "personality" taxonomies. Analyses of some currently popular taxonomies reveal frequent mismatches between the researchers' explicit and implicit metatheories about "personality" and the abilities of previous methodologies to capture the particular kinds of phenomena toward which they are targeted. Serious deficiencies that preclude scientific quantifications are identified in standardised questionnaires, psychology's established standard method of investigation. These mismatches and deficiencies derive from the lack of an explicit formulation and critical reflection on the philosophical and metatheoretical assumptions being made by scientists and from the established practice of radically matching the methodological tools to researchers' preconceived ideas and to pre-existing statistical theories rather than to the particular phenomena and individuals under study. These findings raise serious doubts about the ability of previous taxonomies to appropriately and comprehensively reflect the phenomena towards which they are targeted and the structures of individual-specificity occurring in them. The article elaborates and illustrates with empirical examples methodological principles that allow researchers to appropriately meet the metatheoretical requirements and that are suitable for comprehensively exploring individuals' "personality".

  16. Evaluation and assessment of nuclear power plant seismic methodology

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-03-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology.

  17. Evaluation and assessment of nuclear power plant seismic methodology

    International Nuclear Information System (INIS)

    Bernreuter, D.; Tokarz, F.; Wight, L.; Smith, P.; Wells, J.; Barlow, R.

    1977-01-01

    The major emphasis of this study is to develop a methodology that can be used to assess the current methods used for assuring the seismic safety of nuclear power plants. The proposed methodology makes use of system-analysis techniques and Monte Carlo schemes. Also, in this study, we evaluate previous assessments of the current seismic-design methodology

  18. Problems of method of technology assessment. A methodological analysis; Methodenprobleme des Technology Assessment; Eine methodologische Analyse

    Energy Technology Data Exchange (ETDEWEB)

    Zimmermann, V

    1993-03-01

    The study undertakes to analyse the theoretical and methodological structure of Technology Assessment (TA). It is based on a survey of TA studies, which provided an important basis for theoretically sound statements on methodological aspects of TA. It was established that the main basic theoretical problems of TA lie in the field of dealing with complexity. This is also apparent in the constitution of problems, the most elementary and central approach of TA. Scientifically founded constitution of problems and the corresponding construction of models call for interdisciplinary scientific work. Interdisciplinarity in the TA research process is achieved at the level of virtual networks, these networks being composed of individuals suited to teamwork. The emerging network structures have an objective-organizational and an ideational basis. The objective-organizational basis is mainly the result of team composition and the external affiliations of the team members. The ideational basis of the virtual network is represented by the team members' mode of thinking, which is individually located at a multidisciplinary level. The theoretical 'skeleton' of the TA knowledge system, which is represented by linkage structures based on process knowledge, can be generated and processed in connection with knowledge on types of problems, areas of analysis and procedures for dealing with complexity. Within this process, disciplinary knowledge is a necessary but not a sufficient condition. Metatheoretical and metadisciplinary knowledge and the correspondingly processed complexity of models are the basis for the necessary methodological awareness that allows TA to become designable as a research procedure. (orig./HP)

  19. A comparison of partial order technique with three methods of multi-criteria analysis for ranking of chemical substances.

    Science.gov (United States)

    Lerche, Dorte; Brüggemann, Rainer; Sørensen, Peter; Carlsen, Lars; Nielsen, Ole John

    2002-01-01

    An alternative to the often cumbersome and time-consuming risk assessments of chemical substances could be more reliable and advanced priority setting methods. An elaboration of the simple scoring methods is provided by the Hasse Diagram Technique (HDT) and/or Multi-Criteria Analysis (MCA). The present study provides an in-depth evaluation of HDT relative to three MCA techniques. The new and main methodological step in the comparison is the use of probability concepts based on mathematical tools such as linear extensions of partially ordered sets and Monte Carlo simulations. A data set consisting of 12 High Production Volume Chemicals (HPVCs) is used for illustration. A guiding principle of this investigation is that the need for external input (often subjective weightings of criteria) should be minimized and that transparency should be maximized in any multicriteria prioritisation. The study illustrates that the Hasse Diagram Technique (HDT) needs the least external input, is the most transparent and is the least subjective. However, HDT has some weaknesses if there are criteria which exclude each other; then weighting is needed. Multi-Criteria Analysis (i.e. the Utility Function approach, PROMETHEE and concordance analysis) can deal with such mutual exclusions because their formalisms for quantifying preferences allow participation, e.g. weighting of criteria. Consequently, MCA includes more subjectivity and loses transparency. The recommendation arising from this study is to run HDT as the first step in decision making and then, possibly, to run one of the MCA algorithms as a second step.
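
    The partial order underlying HDT needs no weighting: substance a is ranked above b only if a scores at least as high on every criterion and strictly higher on at least one; otherwise the pair is incomparable. A small Python sketch with invented scores:

      # Hypothetical scores of five substances on three criteria (higher = more hazardous)
      scores = {
          "A": (3, 5, 2),
          "B": (2, 4, 1),
          "C": (4, 1, 3),
          "D": (2, 2, 1),
          "E": (5, 5, 4),
      }

      def dominates(x, y):
          # x dominates y if it is >= on every criterion and > on at least one
          return all(a >= b for a, b in zip(x, y)) and any(a > b for a, b in zip(x, y))

      order = [(a, b) for a in scores for b in scores
               if a != b and dominates(scores[a], scores[b])]
      incomparable = [(a, b) for a in scores for b in scores if a < b
                      and not dominates(scores[a], scores[b])
                      and not dominates(scores[b], scores[a])]
      print("order relations:", order)
      print("incomparable pairs:", incomparable)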

  20. EXPLANATORY METHODS OF MARKETING DATA ANALYSIS – THEORETICAL AND METHODOLOGICAL CONSIDERATIONS

    Directory of Open Access Journals (Sweden)

    Rozalia GABOR

    2010-01-01

    Full Text Available Explanatory methods of data analysis - also called supervised learning methods by some authors - enable researchers to identify and analyse configurations of relations between two or several variables, most of them with high accuracy, since statistical significance can be tested by calculating the confidence level associated with validating the relation concerned across the entire population and not only the surveyed sample. The paper presents some of these methods, namely variance analysis, covariance analysis, segmentation and discriminant analysis, indicating, for every method, its area of applicability in marketing research.

  1. Measuring coral calcification under ocean acidification: methodological considerations for the 45Ca-uptake and total alkalinity anomaly technique

    Directory of Open Access Journals (Sweden)

    Stephanie Cohen

    2017-09-01

    Full Text Available As the oceans become less alkaline due to rising CO2 levels, deleterious consequences are expected for calcifying corals. Predicting how coral calcification will be affected by on-going ocean acidification (OA) requires an accurate assessment of CaCO3 deposition and an understanding of the relative importance that decreasing calcification and/or increasing dissolution play for the overall calcification budget of individual corals. Here, we assessed the compatibility of the 45Ca-uptake and total alkalinity (TA) anomaly techniques as measures of gross and net calcification (GC and NC, respectively) to determine coral calcification at pHT 8.1 and 7.5. Considering the differing buffering capacity of seawater at both pH values, we were also interested in how strongly coral calcification alters the seawater carbonate chemistry under prolonged incubation in sealed chambers, potentially interfering with physiological functioning. Our data indicate that NC estimates by TA are erroneously ∼5% and ∼21% higher than GC estimates from 45Ca for ambient and reduced pH, respectively. Considering also previous data, we show that the consistent discrepancy between both techniques across studies is not constant, but largely depends on the absolute value of CaCO3 deposition. Deriving rates of coral dissolution from the difference between NC and GC was not possible and we advocate a more direct approach for the future by simultaneously measuring skeletal calcium influx and efflux. Substantial changes in carbonate system parameters for incubation times beyond two hours in our experiment demonstrate the necessity to test and optimize experimental incubation setups when measuring coral calcification in closed systems, especially under OA conditions.
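
    In the TA anomaly technique, precipitating 1 mol of CaCO3 removes 2 mol of total alkalinity, so net calcification follows directly from the measured TA change over an incubation. A Python sketch with invented numbers (chamber volume, time, surface area and TA change are illustrative only):

      # Net calcification from the total alkalinity (TA) anomaly: NC = -dTA/2, scaled by
      # the seawater mass in the chamber and normalised to incubation time and coral area.
      delta_TA = -80e-6          # TA change over the incubation, mol per kg seawater
      volume = 0.5e-3            # chamber volume, m^3
      rho_sw = 1025.0            # seawater density, kg m^-3
      dt_h = 2.0                 # incubation time, h
      area_cm2 = 20.0            # coral surface area, cm^2

      nc = (-delta_TA / 2.0) * volume * rho_sw / (dt_h * area_cm2) * 1e6
      print(f"net calcification: {nc:.2f} umol CaCO3 cm^-2 h^-1")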

  2. A frequency domain radar interferometric imaging (FII) technique based on high-resolution methods

    Science.gov (United States)

    Luce, H.; Yamamoto, M.; Fukao, S.; Helal, D.; Crochet, M.

    2001-01-01

    In the present work, we propose a frequency-domain interferometric imaging (FII) technique for a better knowledge of the vertical distribution of the atmospheric scatterers detected by MST radars. This is an extension of the dual frequency-domain interferometry (FDI) technique to multiple frequencies. Its objective is to reduce the ambiguity, inherent in the FDI technique, that results from the use of only two adjacent frequencies. Different methods, commonly used in antenna array processing, are first described within the context of their application to the FII technique. These methods are Fourier-based imaging, Capon's method, and the singular value decomposition method used with the MUSIC algorithm. Some preliminary simulations and tests performed on data collected with the middle and upper atmosphere (MU) radar (Shigaraki, Japan) are also presented. This work is a first step in the development of the FII technique, which seems very promising.
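
    Of the estimators mentioned, Capon's method computes the brightness at a candidate range r as B(r) = 1 / (a(r)^H R^-1 a(r)), where R is the covariance matrix of the complex signals at the n carrier frequencies and a(r) the corresponding steering vector. A hedged Python sketch with a synthetic single-layer covariance (frequencies and ranges are invented for the example):

      import numpy as np

      c = 3.0e8
      freqs = 46.5e6 + np.arange(5) * 0.1e6        # five closely spaced carriers, Hz
      k = 2 * np.pi * freqs / c                     # wavenumbers

      # Synthetic covariance: one scattering layer at 5200 m plus diagonal noise
      r_true = 5200.0
      a_true = np.exp(-2j * k * r_true)
      R = np.outer(a_true, a_true.conj()) + 0.1 * np.eye(len(k))
      R_inv = np.linalg.inv(R)

      def capon_brightness(r):
          a = np.exp(-2j * k * r)                   # steering vector for range r
          return 1.0 / np.real(a.conj() @ R_inv @ a)

      ranges = np.linspace(5000.0, 5400.0, 401)
      brightness = np.array([capon_brightness(r) for r in ranges])
      print(ranges[np.argmax(brightness)])          # peaks near 5200 m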

  3. Searching for Rigour in the Reporting of Mixed Methods Population Health Research: A Methodological Review

    Science.gov (United States)

    Brown, K. M.; Elliott, S. J.; Leatherdale, S. T.; Robertson-Wilson, J.

    2015-01-01

    The environments in which population health interventions occur shape both their implementation and outcomes. Hence, when evaluating these interventions, we must explore both intervention content and context. Mixed methods (integrating quantitative and qualitative methods) provide this opportunity. However, although criteria exist for establishing…

  4. Application of Response Surface Methodology for Optimization of Paracetamol Particles Formation by RESS Method

    International Nuclear Information System (INIS)

    Sabet, J.K.; Ghotbi, C.; Dorkoosh, F.

    2012-01-01

    Ultrafine particles of paracetamol were produced by Rapid Expansion of Supercritical Solution (RESS). The experiments were conducted to investigate the effects of extraction temperature (313-353 K), extraction pressure (10-18 MPa), pre-expansion temperature (363-403 K), and post-expansion temperature (273-323 K) on the size and morphology of the paracetamol particles. The characterization of the particles was performed by Scanning Electron Microscopy (SEM), Transmission Electron Microscopy (TEM), and Liquid Chromatography/Mass Spectrometry (LC-MS) analysis. The average particle size of the original paracetamol was 20.8 μm, while the average particle size of paracetamol after nanonization via the RESS process was 0.46 μm, depending on the experimental conditions used. Moreover, the morphology of the processed particles changed to spherical and regular, while the virgin particles of paracetamol were needle-shaped and irregular. Response surface methodology (RSM) was used to optimize the process parameters. An extraction temperature of 347 K, extraction pressure of 12 MPa, pre-expansion temperature of 403 K, and post-expansion temperature of 322 K were found to be the optimum conditions for achieving the minimum average particle size of paracetamol.
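
    The response-surface step amounts to fitting a second-order polynomial to the measured responses and searching the fitted surface for the optimum factor settings. A Python sketch for two of the four factors, with invented coded data rather than the paper's measurements:

      import numpy as np

      # Coded levels of two factors (e.g. extraction temperature x1, pressure x2) and
      # hypothetical measured particle sizes (um) at each run
      x1 = np.array([-1, -1, 1, 1, 0, 0, 0, -1, 1], dtype=float)
      x2 = np.array([-1, 1, -1, 1, 0, -1, 1, 0, 0], dtype=float)
      size = np.array([0.95, 0.80, 0.70, 0.62, 0.55, 0.66, 0.58, 0.72, 0.52])

      # Full quadratic model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
      X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
      beta, *_ = np.linalg.lstsq(X, size, rcond=None)

      # Grid search of the fitted surface for the minimum predicted particle size
      g1, g2 = np.meshgrid(np.linspace(-1, 1, 101), np.linspace(-1, 1, 101))
      pred = (beta[0] + beta[1] * g1 + beta[2] * g2 + beta[3] * g1 * g2
              + beta[4] * g1 ** 2 + beta[5] * g2 ** 2)
      i, j = np.unravel_index(np.argmin(pred), pred.shape)
      print(f"optimum (coded): x1={g1[i, j]:.2f}, x2={g2[i, j]:.2f}, size={pred[i, j]:.2f} um")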

  5. Safety at Work : Research Methodology

    NARCIS (Netherlands)

    Beurden, van K. (Karin); Boer, de J. (Johannes); Brinks, G. (Ger); Goering-Zaburnenko, T. (Tatiana); Houten, van Y. (Ynze); Teeuw, W. (Wouter)

    2012-01-01

    In this document, we provide the methodological background for the Safety at Work project. This document combines several project deliverables as defined in the overall project plan: validation techniques and methods (D5.1.1), performance indicators for safety at work (D5.1.2), personal protection

  6. Overlay control methodology comparison: field-by-field and high-order methods

    Science.gov (United States)

    Huang, Chun-Yen; Chiu, Chui-Fu; Wu, Wen-Bin; Shih, Chiang-Lin; Huang, Chin-Chou Kevin; Huang, Healthy; Choi, DongSub; Pierson, Bill; Robinson, John C.

    2012-03-01

    Overlay control in advanced integrated circuit (IC) manufacturing is becoming one of the leading lithographic challenges in the 3x and 2x nm process nodes. Production overlay control can no longer meet the stringent emerging requirements based on linear composite wafer and field models with sampling of 10 to 20 fields and 4 to 5 sites per field, which was the industry standard for many years. Methods that have emerged include overlay metrology in many or all fields, including the high-order field model method called high order control (HOC), and field-by-field control (FxFc) methods, also called correction per exposure. The HOC and FxFc methods were initially introduced as relatively infrequent scanner qualification activities meant to supplement linear production schemes. More recently, however, it has become clear that production control also requires intense sampling and similar high-order and FxFc methods. The added control benefits of high-order and FxFc overlay methods need to be balanced against the increased metrology requirements, however, without putting material at risk. Of critical importance is the proper control of edge fields, which requires intensive sampling in order to minimize signatures. In this study we compare various methods of overlay control, including the performance levels that can be achieved.
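
    The linear composite-wafer model that HOC and FxFc extend can be written, per measurement site, as dx = Tx + M·x - R·y and dy = Ty + M·y + R·x, and its correctables recovered by least squares from the measured overlay errors; the model form and all numbers below are a generic illustration, not taken from the study.

      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(-100, 100, 40)                # site x coordinates, mm
      y = rng.uniform(-100, 100, 40)                # site y coordinates, mm

      # Hypothetical truth: Tx = 2 nm, Ty = -1 nm, scale = 0.05 ppm, rotation = 0.10 urad
      dx = 2.0 + 0.05 * x - 0.10 * y + rng.normal(scale=0.5, size=x.size)
      dy = -1.0 + 0.05 * y + 0.10 * x + rng.normal(scale=0.5, size=x.size)

      # Stack the x and y equations and solve for [Tx, Ty, scale, rotation] by least squares
      A_top = np.column_stack([np.ones_like(x), np.zeros_like(x), x, -y])
      A_bot = np.column_stack([np.zeros_like(x), np.ones_like(x), y, x])
      A = np.vstack([A_top, A_bot])
      b = np.concatenate([dx, dy])
      params, *_ = np.linalg.lstsq(A, b, rcond=None)
      print(params)                                 # ~[2.0, -1.0, 0.05, 0.10]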

  7. Methods of dichotic listening as a research methodology for hemispheric interaction.

    Directory of Open Access Journals (Sweden)

    Kovyazina M.S.

    2014-07-01

    Full Text Available Experimental data were obtained from a dichotic listening test administered to patients with unilateral brain lesions and corpus callosum pathology (agenesis, cysts, degenerative changes, etc.). Analysis of the efficiency indices shows that interhemispheric interaction in the audioverbal sphere depends to a greater extent on the state of the right hemisphere. The dichotic listening technique is not an informative means of studying hemispheric interaction, since it does not allow a clear distinction between hemispheric symptoms and symptoms of pathology of the corpus callosum. Disturbances of interhemispheric relations caused by disorders of the corpus callosum and of the cerebral hemispheres thus appear to be associated above all with right-hemisphere activity.

  8. A gradient-based method for segmenting FDG-PET images: methodology and validation

    International Nuclear Information System (INIS)

    Geets, Xavier; Lee, John A.; Gregoire, Vincent; Bol, Anne; Lonneux, Max

    2007-01-01

    A new gradient-based method for segmenting FDG-PET images is described and validated. The proposed method relies on the watershed transform and hierarchical cluster analysis. To allow a better estimation of the gradient intensity, iteratively reconstructed images were first denoised and deblurred with an edge-preserving filter and a constrained iterative deconvolution algorithm. Validation was first performed on computer-generated 3D phantoms containing spheres, then on a real cylindrical Lucite phantom containing spheres of different volumes ranging from 2.1 to 92.9 ml. Moreover, laryngeal tumours from seven patients were segmented on PET images acquired before laryngectomy by the gradient-based method and the thresholding method based on the source-to-background ratio developed by Daisne (Radiother Oncol 2003;69:247-50). For the spheres, the calculated volumes and radii were compared with the known values; for laryngeal tumours, the volumes were compared with the macroscopic specimens. Volume mismatches were also analysed. On computer-generated phantoms, the deconvolution algorithm decreased the mis-estimate of volumes and radii. For the Lucite phantom, the gradient-based method led to a slight underestimation of sphere volumes (by 10-20%), corresponding to negligible radius differences (0.5-1.1 mm); for laryngeal tumours, the segmented volumes by the gradient-based method agreed with those delineated on the macroscopic specimens, whereas the threshold-based method overestimated the true volume by 68% (p = 0.014). Lastly, macroscopic laryngeal specimens were totally encompassed by neither the threshold-based nor the gradient-based volumes. The gradient-based segmentation method applied on denoised and deblurred images proved to be more accurate than the source-to-background ratio method. (orig.)
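
    The core of the approach, a watershed applied to a gradient image, can be illustrated in a few lines; the sketch below is a simplified 2D stand-in (synthetic phantom, no deconvolution or hierarchical clustering, marker thresholds invented) rather than the authors' full pipeline, and assumes scikit-image and SciPy are available.

      import numpy as np
      from scipy import ndimage as ndi
      from skimage import filters, segmentation

      # Synthetic PET-like slice: a bright disc blurred by the scanner PSF plus noise
      yy, xx = np.mgrid[0:128, 0:128]
      phantom = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 15 ** 2, 4.0, 1.0)
      rng = np.random.default_rng(0)
      image = ndi.gaussian_filter(phantom, sigma=3) + rng.normal(0, 0.05, phantom.shape)

      gradient = filters.sobel(image)               # gradient magnitude image

      # Markers: confident background (1) and confident lesion core (2); thresholds invented
      markers = np.zeros_like(image, dtype=int)
      markers[image < 1.2] = 1
      markers[image > 3.0] = 2

      labels = segmentation.watershed(gradient, markers)   # flood from markers along the gradient
      print("segmented area:", int(np.sum(labels == 2)),
            "pixels; true disc:", int(np.sum(phantom > 2)), "pixels")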

  9. Prospective, Double-Blind Evaluation of Umbilicoplasty Techniques Using Conventional and Crowdsourcing Methods

    NARCIS (Netherlands)

    Veldhuisen, C.L. Van; Kamali, P.; Wu, W.; Becherer, B.E.; Sinno, H.H.; Ashraf, A.A.; Ibrahim, A.M.S.; Tobias, A.; Lee, B.T.; Lin, S.J.

    2017-01-01

    BACKGROUND: Umbilical reconstruction is an important component of deep inferior epigastric perforator (DIEP) flap breast reconstruction. This study evaluated the aesthetics of three different umbilical reconstruction techniques during DIEP flap breast reconstruction. METHODS: From January to April

  10. The close objects buffer : a sharp shadow detection technique for radiosity methods

    NARCIS (Netherlands)

    Telea, A.C.; Overveld, van C.W.A.M.

    1997-01-01

    Detecting sharp illumination variations such as shadow boundaries is an important problem for radiosity methods. Such illumination variations are captured using a nonuniform mesh that refines the areas exhibiting high illumination gradients. Nonuniform meshing techniques like discontinuity meshing

  11. The Close Objects Buffer : A Sharp Shadow Detection Technique for Radiosity Methods

    NARCIS (Netherlands)

    Telea, A.C.; Overveld, C.W.A.M. van

    1998-01-01

    Detecting sharp illumination variations such as shadow boundaries is an important problem for radiosity methods. Such illumination variations are captured using a nonuniform mesh that refines the areas exhibiting high illumination gradients. Nonuniform meshing techniques like discontinuity meshing

  12. A Combined Methodology for Landslide Risk Mitigation in Basilicata Region by Using LIDAR Technique and Rockfall Simulation

    Directory of Open Access Journals (Sweden)

    G. Colangelo

    2011-01-01

    Full Text Available Rockfalls represent a significant geohazard along the SS18 road of the Basilicata Region, Italy. The management of these rockfall hazards and the mitigation of the risk require innovative approaches and technologies. This paper discusses a hazard assessment strategy and risk mitigation for rockfalls in a section of the SS18, along the coast of Maratea, using the LIDAR technique and spatial modelling. Historical rockfall records were used to calibrate the physical characteristics of the rockfall processes. The results of the simulations were used to define the intervention actions and the engineering strategy for the mitigation of the phenomena. Within two months, 260 linear meters of high-energy rockfall barriers for impact energies up to 3000 kJ were installed. After that, according to the road authority, the SS18 road was opened in a safe condition. The results provide valuable support for choosing the most appropriate technical solution for slope strengthening and an example of good practice in the cooperation between innovative technologies and field emergency management.

  13. Optimization of a methodology for assessing bioaccumulation factors of metals in periphyton applying the X-ray fluorescence technique

    International Nuclear Information System (INIS)

    Merced Ch, D.

    2015-01-01

    The Lerma River is one of the most polluted rivers in Mexico; it carries a high pollutant load and shows low biodiversity, and the aquatic plants and zooperiphyton species that develop there are adapted to the environmental conditions created by wastewater discharges. In this work, bioaccumulation factors (BAF) of the metals Cr, Mn, Fe, Ni, Cu, Zn and Pb in Hydrocotyle ranunculoides and the associated zooperiphyton of the upper course of the Lerma River were evaluated by applying the Total Reflection X-ray Fluorescence technique. The BAF were higher with respect to the soluble fraction than to the total fraction, because metals in the soluble phase are in solution and therefore more available for uptake by aquatic organisms; with respect to sediment, the BAF were ≤ 1.5, indicating that these organisms have little affinity for incorporating metals from the sediment. Considering the sum of the BAF of all metals in each organism, the leech accumulated the most metals (42468), followed by the worm (27958), the arthropod (10757) and finally the snail (8421). Overall, for the organisms in this study, the BAF follow the order Fe > Zn > Cu > Cr > Ni > Mn > Pb. (Author)

  14. Close But Not Too Close: Friendship as Method(ology) in Ethnographic Research Encounters

    OpenAIRE

    Owton, H.; Allen-Collinson, J.

    2013-01-01

    ‘Friendship as method’ is a relatively under-explored – and often unacknowledged - method of qualitative inquiry in the research literature, particularly within the field of sports and exercise studies. In this article, we consider the use of friendship as method in general, and situate this in relation to a specific qualitative research project in sport, which examined the lived experience of asthma amongst sports participants. The study involved researching individuals with whom the princip...

  15. A Systematic Literature Review on relationship between agile methods and Open Source Software Development methodology

    OpenAIRE

    Gandomani, Taghi Javdani; Zulzalil, Hazura; Ghani, Abdul Azim Abdul; Sultan, Abu Bakar Md

    2013-01-01

    Agile software development methods (ASD) and open source software development methods (OSSD) are two different approaches which were introduced in the last decade, and both have their fervent advocates. Yet the relation and interface between ASD and OSSD appear to be a fertile area, and few rigorous studies have addressed this matter. The major goal of this study was to assess the relation and integration of ASD and OSSD. Analysis of the collected data shows that ASD and OSSD are able t...

  16. Organizational capabilities assessment: a dynamic methodology, methods and a tool for supporting organizational diagnosis

    OpenAIRE

    Rauffet , Philippe; Da Cunha , Catherine ,; Bernard , Alain

    2010-01-01

    Many methods, like CMMI, ISO norms or 5-step roadmapping, are implemented in organizations in order to develop collective competencies, also called organizational capabilities, around organizational needs. They aim at providing new means to control the resources of the organization and at enabling an organizational diagnosis, that is, the evaluation of the strengths and weaknesses of the organization. Nevertheless, these methods are generally based on knowledge-based models (they are composed...

  17. Criteria for quantitative and qualitative data integration: mixed-methods research methodology.

    Science.gov (United States)

    Lee, Seonah; Smith, Carrol A M

    2012-05-01

    Many studies have emphasized the need and importance of a mixed-methods approach for evaluation of clinical information systems. However, those studies had no criteria to guide integration of multiple data sets. Integrating different data sets serves to actualize the paradigm that a mixed-methods approach argues; thus, we require criteria that provide the right direction to integrate quantitative and qualitative data. The first author used a set of criteria organized from a literature search for integration of multiple data sets from mixed-methods research. The purpose of this article was to reorganize the identified criteria. Through critical appraisal of the reasons for designing mixed-methods research, three criteria resulted: validation, complementarity, and discrepancy. In applying the criteria to empirical data of a previous mixed methods study, integration of quantitative and qualitative data was achieved in a systematic manner. It helped us obtain a better organized understanding of the results. The criteria of this article offer the potential to produce insightful analyses of mixed-methods evaluations of health information systems.

  18. The Art of Hardware Architecture Design Methods and Techniques for Digital Circuits

    CERN Document Server

    Arora, Mohit

    2012-01-01

    This book highlights the complex issues, tasks and skills that must be mastered by an IP designer in order to design an optimized and robust digital circuit to solve a problem. The techniques and methodologies described can serve as a bridge between the specifications that are known to the designer and the RTL code that is the final outcome, significantly reducing the time it takes to convert initial ideas and concepts into right-first-time silicon. Coverage focuses on real problems rather than theoretical concepts, with an emphasis on design techniques across various aspects of chip design. Describes techniques to help IP designers get it right the first time, creating designs optimized in terms of power, area and performance; Focuses on practical aspects of chip design and minimizes theory; Covers chip design in a consistent way, starting with basics and gradually developing advanced concepts, such as electromagnetic compatibility (EMC) design techniques and low-power design techniques such as dynamic voltage...

  19. An accurate method for determining residual stresses with magnetic non-destructive techniques in welded ferromagnetic steels

    International Nuclear Information System (INIS)

    Vourna, P

    2016-01-01

    The scope of the present research work was to investigate the proper selection criteria for developing a suitable methodology for the accurate determination of residual stresses in welded parts. Magnetic non-destructive testing was carried out using two magnetic non-destructive techniques: measurement of the magnetic Barkhausen noise and evaluation of the magnetic hysteresis loop parameters. The spatial distribution of residual stresses in welded metal parts was determined by both non-destructive magnetic methods and by two diffraction methods. Conducting the magnetic measurements required an initial calibration of the ferromagnetic steels. Based on the examined volume of the sample, all methods used were divided into two large categories: the first was related to the determination of surface residual stress, whereas the second was related to bulk residual stress determination. The first category included the magnetic Barkhausen noise and the X-ray diffraction measurements, while the second included the magnetic permeability and the neutron diffraction data. The residual stresses determined by the magnetic techniques were in good agreement with those from diffraction. (paper)

  20. Study on the Filament Yarns Spreading Techniques and Assessment Methods of the Electronic Fiberglass Fabric

    Science.gov (United States)

    Wang, Xi; Chen, Shouhui; Zheng, Tianyong; Ning, Xiangchun; Dai, Yifei

    2018-03-01

    The filament yarn spreading techniques for electronic fiberglass fabric have been developed over the past few years in order to meet the requirements of the developing electronics industry. Copper clad laminate (CCL) requires that the warp and weft yarns of the fabric be spread apart and formed flat. The filament yarn spreading techniques improve the penetration performance of the resin, as well as the peeling strength of CCL and the drilling performance of printed circuit boards (PCB). This paper presents the filament yarn spreading techniques for electronic fiberglass fabric from several aspects, such as methods and functions, together with the methods for assessing their effects.

  1. The Itinerary Method: A Methodological Contribution from Social Sciences to Consumer Research in Management

    Directory of Open Access Journals (Sweden)

    Dominique Desjeux

    2014-05-01

    Full Text Available Consumer choice has been a focus of interest in the study of consumer behavior for over 50 years. Over time, however, the focus has widened to include not only the moment of purchase itself but also, gradually, a reflection on the consumer decision process concerning the selection, consumption and disposal of products and services. More recently, researchers trained in areas like anthropology and sociology have contributed perspectives that view the process of choice as a social and cultural phenomenon. This paper presents the Itinerary Method - a research approach originally applied in anthropological studies investigating consumption. The method can contribute to consumer research in management inasmuch as it allows investigation of the consumption process - selection, consumption and disposal - within a systemic perspective that can expand consumer research's comprehension of choice, since it stresses culture as a central element. The method is described, along with its assumptions, operational steps and concrete examples of research on consumption.

  2. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods.

    Science.gov (United States)

    Jha, Abhinav K; Caffo, Brian; Frey, Eric C

    2016-04-07

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest

  3. A no-gold-standard technique for objective assessment of quantitative nuclear-medicine imaging methods

    International Nuclear Information System (INIS)

    Jha, Abhinav K; Frey, Eric C; Caffo, Brian

    2016-01-01

    The objective optimization and evaluation of nuclear-medicine quantitative imaging methods using patient data is highly desirable but often hindered by the lack of a gold standard. Previously, a regression-without-truth (RWT) approach has been proposed for evaluating quantitative imaging methods in the absence of a gold standard, but this approach implicitly assumes that bounds on the distribution of true values are known. Several quantitative imaging methods in nuclear-medicine imaging measure parameters where these bounds are not known, such as the activity concentration in an organ or the volume of a tumor. We extended upon the RWT approach to develop a no-gold-standard (NGS) technique for objectively evaluating such quantitative nuclear-medicine imaging methods with patient data in the absence of any ground truth. Using the parameters estimated with the NGS technique, a figure of merit, the noise-to-slope ratio (NSR), can be computed, which can rank the methods on the basis of precision. An issue with NGS evaluation techniques is the requirement of a large number of patient studies. To reduce this requirement, the proposed method explored the use of multiple quantitative measurements from the same patient, such as the activity concentration values from different organs in the same patient. The proposed technique was evaluated using rigorous numerical experiments and using data from realistic simulation studies. The numerical experiments demonstrated that the NSR was estimated accurately using the proposed NGS technique when the bounds on the distribution of true values were not precisely known, thus serving as a very reliable metric for ranking the methods on the basis of precision. In the realistic simulation study, the NGS technique was used to rank reconstruction methods for quantitative single-photon emission computed tomography (SPECT) based on their performance on the task of estimating the mean activity concentration within a known volume of interest
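
    The figure of merit used for ranking can be illustrated compactly: if method m's estimates relate to the true values as estimate = a_m·truth + b_m + noise with standard deviation sigma_m (these are the parameters the NGS procedure estimates without ground truth), then NSR_m = sigma_m / a_m, and smaller means more precise. A Python sketch with invented parameters for three generic reconstruction methods:

      # Each method m is modelled as: estimate = a_m * truth + b_m + noise(0, sigma_m);
      # methods are ranked by the noise-to-slope ratio NSR_m = sigma_m / a_m.
      methods = {                   # hypothetical (slope, intercept, noise sd) triples
          "method A": (0.95, 0.02, 0.08),
          "method B": (1.02, -0.01, 0.05),
          "method C": (0.90, 0.05, 0.12),
      }

      nsr = {name: sigma / slope for name, (slope, intercept, sigma) in methods.items()}
      for name in sorted(nsr, key=nsr.get):
          print(f"{name}: NSR = {nsr[name]:.3f}")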

  4. A fast hybrid methodology based on machine learning, quantum methods, and experimental measurements for evaluating material properties

    Science.gov (United States)

    Kong, Chang Sun; Haverty, Michael; Simka, Harsono; Shankar, Sadasivan; Rajan, Krishna

    2017-09-01

    We present a hybrid approach based on both machine learning and targeted ab-initio calculations to determine adhesion energies between dissimilar materials. The goals of this approach are to complement experimental and/or all ab-initio computational efforts, to identify promising materials rapidly and identify in a quantitative manner the relative contributions of the different material attributes affecting adhesion. Applications of the methodology to predict bulk modulus, yield strength, adhesion and wetting properties of copper (Cu) with other materials including metals, nitrides and oxides is discussed in this paper. In the machine learning component of this methodology, the parameters that were chosen can be roughly divided into four types: atomic and crystalline parameters (which are related to specific elements such as electronegativities, electron densities in Wigner-Seitz cells); bulk material properties (e.g. melting point), mechanical properties (e.g. modulus) and those representing atomic characteristics in ab-initio formalisms (e.g. pseudopotentials). The atomic parameters are defined over one dataset to determine property correlation with published experimental data. We then develop a semi-empirical model across multiple datasets to predict adhesion in material interfaces outside the original datasets. Since adhesion is between two materials, we appropriately use parameters which indicate differences between the elements that comprise the materials. These semi-empirical predictions agree reasonably with the trend in chemical work of adhesion predicted using ab-initio techniques and are used for fast materials screening. For the screened candidates, the ab-initio modeling component provides fundamental understanding of the chemical interactions at the interface, and explains the wetting thermodynamics of thin Cu layers on various substrates. Comparison against ultra-high vacuum (UHV) experiments for well-characterized Cu/Ta and Cu/α-Al2O3 interfaces is
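
    A minimal stand-in for the screening component might fit a semi-empirical linear model on descriptor differences and use it to rank unseen Cu/X interfaces before any ab-initio runs; all descriptor values, interface labels and adhesion energies below are invented for illustration.

      import numpy as np

      # Columns: |d electronegativity|, |d melting point| (K/1000), |d bulk modulus| (GPa/100)
      train_X = np.array([
          [0.35, 1.10, 0.45],      # Cu/interface 1 (hypothetical descriptors)
          [0.60, 0.30, 0.70],      # Cu/interface 2
          [0.20, 0.55, 0.25],      # Cu/interface 3
          [0.80, 0.90, 1.10],      # Cu/interface 4
      ])
      train_w_ad = np.array([2.4, 1.1, 1.9, 0.7])    # hypothetical work of adhesion, J/m^2

      # Fit a linear model with intercept by least squares
      A = np.column_stack([np.ones(len(train_X)), train_X])
      coef, *_ = np.linalg.lstsq(A, train_w_ad, rcond=None)

      # Rank unseen candidate interfaces by predicted work of adhesion
      candidates = {"Cu/X1": [0.30, 0.80, 0.40], "Cu/X2": [0.70, 0.40, 0.90]}
      for name, feats in candidates.items():
          pred = coef[0] + np.dot(coef[1:], feats)
          print(f"{name}: predicted W_ad ~ {pred:.2f} J/m^2")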

  5. An Investigation of Science Teachers’ Teaching Methods and Techniques: Amasya Case

    Directory of Open Access Journals (Sweden)

    Orhan KARAMUSTAFAOĞLU

    2014-10-01

    Full Text Available The purpose of this study is to determine the methods and techniques science teachers mostly employ in their classrooms. To collect data, the researchers administered a survey to 60 science teachers and randomly selected 6 of them for observation in real classroom situations. Furthermore, the researchers invited 154 students taught by the 6 selected teachers for focus group interviews. After analyzing the collected data, the researchers found that teachers in this study (1) were more likely to use the narrative method, (2) supported their teaching with question-and-answer, demonstration, case study, and problem-solving methods and techniques, and (3) rarely employed student-centered discussion, laboratory practice, role playing and project-based learning methods in their classrooms. Consequently, there exist some differences between theory and practice regarding the teaching methods and techniques of the teachers in this study.

  6. Measuring subjective meaning structures by the laddering method: Theoretical considerations and methodological problems

    DEFF Research Database (Denmark)

    Grunert, Klaus G.; Grunert, Suzanne C.

    1995-01-01

    Starting from a general model of measuring cognitive structures for predicting consumer behaviour, we discuss laddering as a possible method to obtain estimates of consumption-relevant cognitive structures which will have predictive validity. Four criteria for valid measurement are derived and ap...

  7. A comparison of usability methods for testing interactive health technologies: Methodological aspects and empirical evidence

    NARCIS (Netherlands)

    Jaspers, Monique W. M.

    2009-01-01

    OBJECTIVE: Usability evaluation is now widely recognized as critical to the success of interactive health care applications. However, the broad range of usability inspection and testing methods available may make it difficult to decide on a usability assessment plan. To guide novices in the

  8. Towards an empirical method of usability testing of system parts : a methodological study

    NARCIS (Netherlands)

    Brinkman, W.P.; Haakma, R.; Bouwhuis, D.G.

    2007-01-01

    Current usability evaluation methods are essentially holistic in nature. However, engineers that apply a component-based software engineering approach might also be interested in understanding the usability of individual parts of an interactive system. This paper examines the efficiency dimension of

  9. New insights into the methodological issues of the indicator amino acid oxidation method in preterm neonates

    NARCIS (Netherlands)

    de Groof, F.; Huang, L.S.; Twisk, J.W.R.; Voortman, G.J.; Joemai, W.; Hau, C.H.; Schierbeek, H.; Chen, C.; Huang, Y.; van Goudoever, J.B.

    2013-01-01

    Background: We determined the effect of adaptation to the study diet on oxidation of the indicator amino acid and the required tracer washout time in preterms. Methods: Subjects received a study diet for 6 d that entailed a 50% reduction in leucine. Tracer studies using enterally infused [ 13

  10. Methodological Appendix of Research Methods Employed in the Mexican American Education Study.

    Science.gov (United States)

    Commission on Civil Rights, Washington, DC.

    The U.S. Commission on Civil Rights released Mexican American Education Study findings in a series of documents: (1) "The Ethnic Isolation of Mexican Americans in the Public Schools of the Southwest" (ED 052 849), "The Unfinished Education" (ED 056 821), and "The Excluded Student" (ED 062 069). The research methods employed in the study are…

  11. Paradigms or Toolkits? Philosophical and Methodological Positions as Heuristics for Mixed Methods Research

    Science.gov (United States)

    Maxwell, Joseph A.

    2011-01-01

    In this article, the author challenges the validity and usefulness of the concept of "paradigm," as this term has been used in the social sciences generally, and specifically in the debates over research methods. He emphasizes that in criticizing what he sees as the misuse of the paradigm concept, he is not arguing for dismissing or ignoring…

  12. Who's Afraid of the Big Bad Methods? Methodological Games and Role Play

    Science.gov (United States)

    Kollars, Nina; Rosen, Amanda M.

    2017-01-01

    In terms of gamification within political science, some fields-particularly international relations and American politics--have received more attention than others. One of the most underserved parts of the discipline is research methods; a course that, coincidentally, is frequently cited as one that instructors hate to teach and students hate to…

  13. Teens, Food Choice, and Health: How Can a Multi-Method Research Methodology Enhance the Study of Teen Food Choice and Health Messaging?

    OpenAIRE

    Wiseman, Kelleen

    2011-01-01

    This research report compares alternative approaches to analyzing the complex factors that influence teenagers' food choices. Specifically, a multi-method approach, which involves the integration of qualitative and quantitative research methodologies, data and analysis, is compared to a single methodological approach, which involves the use of either a quantitative or a qualitative methodology.

  14. From Theory-Inspired to Theory-Based Interventions: A Protocol for Developing and Testing a Methodology for Linking Behaviour Change Techniques to Theoretical Mechanisms of Action.

    Science.gov (United States)

    Michie, Susan; Carey, Rachel N; Johnston, Marie; Rothman, Alexander J; de Bruin, Marijn; Kelly, Michael P; Connell, Lauren E

    2018-05-18

    Understanding links between behaviour change techniques (BCTs) and mechanisms of action (the processes through which they affect behaviour) helps inform the systematic development of behaviour change interventions. This research aims to develop and test a methodology for linking BCTs to their mechanisms of action. Study 1 (published explicit links): Hypothesised links between 93 BCTs (from the 93-item BCT taxonomy, BCTTv1) and mechanisms of action will be identified from published interventions and their frequency, explicitness and precision documented. Study 2 (expert-agreed explicit links): Behaviour change experts will identify links between 61 BCTs and 26 mechanisms of action in a formal consensus study. Study 3 (integrated matrix of explicit links): Agreement between studies 1 and 2 will be evaluated and a new group of experts will discuss discrepancies. An integrated matrix of BCT-mechanism of action links, annotated to indicate strength of evidence, will be generated. Study 4 (published implicit links): To determine whether groups of co-occurring BCTs can be linked to theories, we will identify groups of BCTs that are used together from the study 1 literature. A consensus exercise will be used to rate strength of links between groups of BCT and theories. A formal methodology for linking BCTs to their hypothesised mechanisms of action can contribute to the development and evaluation of behaviour change interventions. This research is a step towards developing a behaviour change 'ontology', specifying relations between BCTs, mechanisms of action, modes of delivery, populations, settings and types of behaviour.

  15. Methodology for the application of the probabilistic techniques of security analysis (APS) to the units of cobalto therapy in Cuba; Metodologia para la aplicacion de las tecnicas de analisis probabilista de seguridad (APS) a las unidades de cobaltoterapia en Cuba

    Energy Technology Data Exchange (ETDEWEB)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Troncoso Fleitas, M.; Lozano Lima, B.; De la Fuente Puch, A.; Perez Reyes, Y.; Dumenico Gonzalez, C. [Centro Nacional de Seguridad Nuclear (CNSN), La Habana (Cuba)

    2001-07-01

    The application of PSA techniques in nuclear power plants during the last two decades, and the positive results obtained for safety-related decision making as a complement to deterministic methods, have increased their use in other nuclear applications. At present a wide set of documents from international institutions can be found summarizing the investigations carried out in this field and promoting their use in radioactive facilities. Although still without a mandatory character, the new radiological safety regulations also promote the complete or partial application of PSA techniques in the safety assessment of radiological practices. The IAEA, through various programmes in which Cuba participates, is also taking a number of actions to encourage the nuclear community to apply probabilistic risk methods in safety evaluations and decision making. However, since a complete PSA study has not yet been carried out for any radioactive installation, certain methodological aspects need to be improved and adapted for the application of these techniques. This work presents the main elements for the use of PSA in the evaluation of the safety of cobalt-therapy units in Cuba. It also presents, as part of the results of the first stage of the study, the guidelines being applied in a Research Contract with the Agency by the authors themselves, who belong to the CNSN, together with other specialists from the Cuban Ministry of Public Health. (author)

  16. Abrasion Resistance of Nano Silica Modified Roller Compacted Rubbercrete: Cantabro Loss Method and Response Surface Methodology Approach

    Science.gov (United States)

    Adamu, Musa; Mohammed, Bashar S.; Shafiq, Nasir

    2018-04-01

    Roller compacted concrete (RCC) used for pavement is subjected to skidding and rubbing by the wheels of moving vehicles, which causes the pavement surface to wear and abrade. Abrasion resistance is therefore one of the most important properties of concern for RCC pavement. In this study, response surface methodology was used to design, evaluate and analyze the effect of partial replacement of fine aggregate with crumb rubber, and the addition of nano silica, on the abrasion resistance of roller compacted rubbercrete (RCR). RCR is the terminology used for RCC pavement in which crumb rubber partially replaces the fine aggregate. The Box-Behnken design method was used to develop the mixture combinations using 10%, 20%, and 30% crumb rubber with 0%, 1%, and 2% nano silica. The Cantabro loss method was used to measure the abrasion resistance. The results showed that the abrasion resistance of RCR decreases with increasing crumb rubber content, and increases with the addition of nano silica. The analysis of variance shows that the model developed using response surface methodology (RSM) has a very good degree of correlation, and can be used to predict the abrasion resistance of RCR with a percentage error of 5.44%. The combination of 10.76% crumb rubber and 1.59% nano silica yielded the best mix in terms of abrasion resistance of RCR.
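
    A minimal sketch of the RSM step described above: fitting a full quadratic response surface to abrasion-loss data, assuming hypothetical response values for the stated factor levels (the study's raw data are not reproduced here).

```python
# Fit a second-order response surface, the kind of model RSM produces.
# Factor levels follow the abstract (crumb rubber 10/20/30 %, nano silica 0/1/2 %);
# the Cantabro-loss responses are hypothetical placeholders, not the study's data.
import numpy as np

cr = np.array([10, 10, 30, 30, 20, 20, 20, 10, 30])     # crumb rubber, %
ns = np.array([0,  2,  0,  2,  1,  1,  1,  1,  1])       # nano silica, %
loss = np.array([18, 12, 30, 22, 20, 19, 21, 15, 26])    # Cantabro loss, % (hypothetical)

# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(cr), cr, ns, cr * ns, cr**2, ns**2])
coef, *_ = np.linalg.lstsq(X, loss, rcond=None)

def predict(crumb, silica):
    x = np.array([1, crumb, silica, crumb * silica, crumb**2, silica**2])
    return float(x @ coef)

print(predict(10.76, 1.59))   # response at the mix the abstract reports as optimal
```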

  17. A METHODOLOGY FOR IMPROVING PRODUCTIVITY OF THE EXISTING SHIPBUILDING PROCESS USING MODERN PRODUCTION CONCEPTs AND THE AHP METHOD

    Directory of Open Access Journals (Sweden)

    Venesa Stanić

    2017-01-01

    In recent years, shipyards have been facing difficulties in controlling operational costs. To maintain continual operation of all of the facilities, a shipyard must analyze ways of utilizing present production systems for assembling interim vessel products as well as other types of industrial constructions. In the past, new machines continuously improved shipbuilding processes, including software and organizational restructuring, but management continued to search for a modern technological concept that would provide higher productivity, greater profit and an overall reduction in costs. In the article the authors suggest implementing Design for Production, Design for Maintainability and Group Technology principles using the Analytical Hierarchy Process (AHP), a multi-criteria decision-making method, as an efficient tool for maintaining international competitiveness in the modern shipbuilding industry. This novel methodology is implemented through four phases. In the first phase, an analysis of the present situation of a real shipyard is carried out by establishing the closest relations among production lines. The second phase presents an analysis of the constraints that must be evaluated when developing the design solution. The third phase involves generating a typical number of selected alternatives of the Design for Production, Design for Maintainability and Group Technology principles. In the fourth phase, the optimal design solution is selected using the Analytical Hierarchy Process (AHP) method. The solution incorporating this modern methodology will improve productivity and profit and lead to decreasing operational costs.
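
    For orientation, a small sketch of the core AHP calculation the paper relies on: deriving priority weights from a pairwise comparison matrix and checking the consistency ratio. The judgement matrix below is hypothetical, not the shipyard study's data.

```python
# Minimal AHP sketch: weights from the principal eigenvector of a pairwise
# comparison matrix, plus the consistency ratio check.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])   # hypothetical pairwise comparisons of three criteria

n = A.shape[0]
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                      # priority vector

lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)
CR = CI / 0.58                    # RI = 0.58 is the standard random index for n = 3
print("weights:", w.round(3), "CR:", round(CR, 3))   # CR < 0.1 is usually acceptable
```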

  18. THE PROPOSED METHODOLOGIES FOR THE SIX SIGMA METHOD AND TQM STRATEGY AS WELL AS THEIR APPLICATION IN PRACTICE IN MACEDONIA

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2014-05-01

    This paper presents the proposed methodologies for the Six Sigma method and the TQM strategy as well as their application in practice in Macedonia. Although the philosophy of total quality management (TQM) is deeply embedded in many industries and business areas of European and other countries, it is insufficiently known and present in our country and other developing countries. The same applies to the Six Sigma approach of reducing the dispersion of a process, which is present in only a small fraction of Macedonian companies. The results of the implementation have shown that the application of the Six Sigma approach does not refer to the number of defects per million opportunities but to the systematic and systemic lowering of the dispersion of the process. The operation and effect of the implementation of the Six Sigma method engages experts that receive a salary depending on the success of the Six Sigma program. On the other hand, the results of the application of the TQM methodology within the Macedonian companies will depend on the commitment of all employees and their motivation.
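
    For reference, a minimal sketch of the conventional defects-per-million-opportunities arithmetic that the abstract contrasts with its dispersion-based reading of Six Sigma; the defect and opportunity counts are hypothetical.

```python
# Conventional Six Sigma arithmetic: DPMO and the corresponding sigma level
# (with the customary 1.5-sigma shift). All counts are hypothetical.
from statistics import NormalDist

defects = 35
units = 1_000
opportunities_per_unit = 5

dpmo = defects / (units * opportunities_per_unit) * 1_000_000
sigma_level = NormalDist().inv_cdf(1 - dpmo / 1_000_000) + 1.5
print(f"DPMO = {dpmo:.0f}, sigma level ≈ {sigma_level:.2f}")
```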

  19. Investigation of the existing methodology of value estimation and methods of discount rate estimation

    OpenAIRE

    Plikus, Iryna

    2017-01-01

    The subject of research is the current practice of determining the fair value of assets and liabilities at present (discounted) cost. One of the most problematic areas is the determination of the discount rate, which falls within the scope of professional accountant judgement. The methods of formalization, hypothetical assumption, system approach and scientific abstraction in substantiating the formation of accounting policy with respect to the choice of the discount rate are used i...

  20. Strategies and methodologies to develop techniques for computer-assisted analysis of gas phase formation during altitude decompression

    Science.gov (United States)

    Powell, Michael R.; Hall, W. A.

    1993-01-01

    It would be of operational significance if one possessed a device that would indicate the presence of gas phase formation in the body during hypobaric decompression. Automated analysis of Doppler gas bubble signals has been attempted for 2 decades but with generally unfavorable results, except with surgically implanted transducers. Recently, efforts have intensified with the introduction of low-cost computer programs. Current NASA work is directed towards the development of a computer-assisted method specifically targeted to EVA, and we are most interested in Spencer Grade 4. We note that Spencer Doppler Grades 1 to 3 show increases in the FFT sonogram and spectrogram in the amplitude domain, and the frequency domain is sometimes increased over that created by the normal blood flow envelope. The amplitude perturbations are of very short duration, in both systole and diastole and at random temporal positions. Grade 4 is characteristic in the amplitude domain but with modest increases in the FFT sonogram and spectral frequency power from 2K to 4K over all of the cardiac cycle. Heart valve motion appears to display characteristic signals: (1) the demodulated Doppler signal amplitude is considerably above the Doppler-shifted blood flow signal (even Grade 4); and (2) demodulated Doppler frequency shifts are considerably greater (often several kHz) than the upper edge of the blood flow envelope. Knowledge of these facts will aid in the construction of a real-time, computer-assisted discriminator to eliminate cardiac motion artifacts. There could also exist perturbations in the following: (1) modifications of the pattern of blood flow in accordance with Poiseuille's Law, (2) flow changes with a change in the Reynolds number, (3) an increase in the pulsatility index, and/or (4) diminished diastolic flow or 'runoff.' Doppler ultrasound devices have been constructed with a three-transducer array and a pulsed frequency generator.

  1. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Directory of Open Access Journals (Sweden)

    Vatutin Eduard

    2017-12-01

    The article deals with the problem of analyzing the effectiveness of heuristic methods with limited depth-first search techniques for obtaining decisions in the test problem of finding the shortest path in a graph. The article briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree used to solve the problem. The methodology for comparing experimental data to estimate the quality of solutions, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs using the BOINC platform, is considered. The article also describes the obtained experimental results, which make it possible to identify the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the power of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and significantly inferior in quality of solutions to the ant colony optimization method and its modification with combinatorial returns.

  2. Comparison of Decisions Quality of Heuristic Methods with Limited Depth-First Search Techniques in the Graph Shortest Path Problem

    Science.gov (United States)

    Vatutin, Eduard

    2017-12-01

    The article deals with the problem of analyzing the effectiveness of heuristic methods with limited depth-first search techniques for obtaining decisions in the test problem of finding the shortest path in a graph. The article briefly describes the group of methods based on limiting the number of branches of the combinatorial search tree and limiting the depth of the analyzed subtree used to solve the problem. The methodology for comparing experimental data to estimate the quality of solutions, based on computational experiments with samples of graphs with pseudo-random structure and selected numbers of vertices and arcs using the BOINC platform, is considered. The article also describes the obtained experimental results, which make it possible to identify the areas of preferable usage of the selected subset of heuristic methods depending on the size of the problem and the power of the constraints. It is shown that the considered pair of methods is ineffective for the selected problem and significantly inferior in quality of solutions to the ant colony optimization method and its modification with combinatorial returns.
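
    A generic sketch of the idea evaluated in the two records above: a depth-first search for the shortest path whose combinatorial tree is truncated by a depth limit and a branch limit. It illustrates the technique, not the authors' exact algorithm.

```python
# Depth- and branch-limited depth-first search for the shortest path in a weighted graph.
def limited_dfs_shortest_path(graph, start, goal, depth_limit=6, branch_limit=3):
    best = {"cost": float("inf"), "path": None}

    def dfs(node, path, cost, depth):
        if cost >= best["cost"] or depth > depth_limit:
            return
        if node == goal:
            best["cost"], best["path"] = cost, list(path)
            return
        # Expand only the `branch_limit` cheapest outgoing edges.
        for nxt, w in sorted(graph.get(node, []), key=lambda e: e[1])[:branch_limit]:
            if nxt not in path:                      # avoid cycles
                path.append(nxt)
                dfs(nxt, path, cost + w, depth + 1)
                path.pop()

    dfs(start, [start], 0, 0)
    return best["path"], best["cost"]

# Example on a small hypothetical graph: adjacency list of (neighbour, weight) pairs.
g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2), ("d", 5)], "c": [("d", 1)], "d": []}
print(limited_dfs_shortest_path(g, "a", "d"))        # (['a', 'b', 'c', 'd'], 4)
```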

  3. Risk methodology for geologic disposal of radioactive waste: The distributed velocity method of solving the convective-dispersion equation

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, James E; Longsine, Dennis E [Sandia National Laboratories, Albuquerque, New Mexico (United States); Reeves, Mark [INTERA Environmental Consultants, Inc. Houston, TX (United States)

    1980-06-01

    A new method is proposed for treating convective-dispersive transport. The motivation for developing this technique arises from the demands of performing a risk assessment for a nuclear waste repository. These demands include computational efficiency over a relatively large range of Peclet numbers and the ability to handle chains of decaying radionuclides with rather extreme contrasts in both solution velocities and half lives. To the extent it has been tested to date, the Distributed Velocity Method (DVM) appears to satisfy these demands. Included in this paper are the mathematical theory, numerical implementation, an error analysis employing statistical sampling and regression analysis techniques, and comparisons of DVM with other methods for convective-dispersive transport. (author)

  4. A Combined Fuzzy-AHP and Fuzzy-GRA Methodology for Hydrogen Energy Storage Method Selection in Turkey

    Directory of Open Access Journals (Sweden)

    Aytac Yildiz

    2013-06-01

    In this paper, we aim to select the most appropriate Hydrogen Energy Storage (HES) method for Turkey from among the alternatives of tank, metal hydride and chemical storage, which are determined based on expert opinions and literature review. Thus, we propose a Buckley extension based fuzzy Analytical Hierarchical Process (Fuzzy-AHP) and linear normalization based fuzzy Grey Relational Analysis (Fuzzy-GRA) combined Multi Criteria Decision Making (MCDM) methodology. This combined approach can be applied to a complex decision process, which often makes sense with subjective data or vague information, and is used to solve the HES selection problem with different defuzzification methods. The proposed approach is unique both in the HES literature and the MCDM literature.
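
    A minimal sketch of the (crisp) grey relational analysis ranking step that the proposed Fuzzy-GRA builds on; the alternatives, criterion scores and weights are hypothetical placeholders.

```python
# Crisp grey relational analysis (GRA): normalize, measure deviation from the ideal
# sequence, and rank alternatives by their weighted grey relational grade.
import numpy as np

scores = np.array([[7.0, 5.0, 8.0],     # tank storage
                   [6.0, 8.0, 6.0],     # metal hydride
                   [8.0, 6.0, 5.0]])    # chemical storage (rows: alternatives, cols: criteria)
weights = np.array([0.5, 0.3, 0.2])     # hypothetical criterion weights (e.g. from AHP)

# Larger-is-better linear normalization to [0, 1].
norm = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))

dev = 1.0 - norm                         # deviation from the ideal (all-ones) sequence
zeta = 0.5                               # distinguishing coefficient
coef = (dev.min() + zeta * dev.max()) / (dev + zeta * dev.max())

grade = coef @ weights                   # grey relational grade per alternative
print("ranking (best first):", np.argsort(-grade))
```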

  5. Elimination Method of Multi-Criteria Decision Analysis (MCDA: A Simple Methodological Approach for Assessing Agricultural Sustainability

    Directory of Open Access Journals (Sweden)

    Byomkesh Talukder

    2017-02-01

    In the present world context, there is a need to assess the sustainability of agricultural systems. Various methods have been proposed to assess agricultural sustainability. Like in many other fields, Multi-Criteria Decision Analysis (MCDA) has recently been used as a methodological approach for the assessment of agricultural sustainability. In this paper, an attempt is made to apply Elimination, an MCDA method, to an agricultural sustainability assessment, and to investigate its benefits and drawbacks. This article starts by explaining the importance of agricultural sustainability. Common MCDA types are discussed, with a description of the state-of-the-art method for incorporating multiple criteria and reference values for agricultural sustainability assessment. Then, a generic description of the Elimination Method is provided, and its modeling approach is applied to a case study in coastal Bangladesh. An assessment of the results is provided, and the issues that need consideration before applying Elimination to agricultural sustainability are examined. Whilst having some limitations, the case study shows that it is applicable for agricultural sustainability assessments and for ranking the sustainability of agricultural systems. The assessment is quick compared to other assessment methods and is shown to be helpful for agricultural sustainability assessment. It is a relatively simple and straightforward analytical tool that could be widely and easily applied. However, it is suggested that appropriate care must be taken to ensure the successful use of the Elimination Method during the assessment process.

  6. Tank Operations Contract Construction Management Methodology. Utilizing The Agency Method Of Construction Management

    International Nuclear Information System (INIS)

    Lesko, K.F.; Berriochoa, M.V.

    2010-01-01

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high hazard work in a safe and productive manner. Prior to the WRPS contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) Contracts typically emphasize small business awards. As an integral part of Nuclear Project Management at Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method (John E Schaufelberger, Len Holm, 'Management of Construction Projects, A Constructor's Perspective', University of Washington, Prentice Hall 2002). This method was implemented in the first quarter of Fiscal Year 2009 (FY2009), where Construction Management is performed by substantially home office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven Construction Managers and Field Leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project. Construction execution contracts are subcontracted

  7. Methodologies for the practical determination and use of method detection limits

    International Nuclear Information System (INIS)

    Rucker, T.L.

    1995-01-01

    Method detection limits have often been misunderstood and misused. The basic definitions developed by Lloyd Currie and others have been combined with assumptions that are inappropriate for many types of radiochemical analyses. A practical way of determining detection limits based on Currie's basic definition is presented that removes the reliance on assumptions and that accounts for the total measurement uncertainty. Examples of proper and improper use of detection limits are also presented, including detection limits reported by commercial software for gamma spectroscopy and neutron activation analyses. (author) 6 refs.; 2 figs
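
    For orientation, a sketch of the textbook Currie-style decision and detection limits for a paired-blank counting measurement, the default formulas whose underlying assumptions the paper questions; the blank count is hypothetical.

```python
# Currie's classic decision/detection limits for a counting measurement with a
# paired blank, at alpha = beta = 0.05. The blank count is a hypothetical value.
from math import sqrt

blank_counts = 400.0                      # observed blank (background) counts

L_C = 2.33 * sqrt(blank_counts)           # critical level: decide "detected" above this
L_D = 2.71 + 4.65 * sqrt(blank_counts)    # a priori detection limit for the method

print(f"critical level  L_C ≈ {L_C:.1f} counts")
print(f"detection limit L_D ≈ {L_D:.1f} counts")
```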

  8. TANK OPERATIONS CONTRACT CONSTRUCTION MANAGEMENT METHODOLOGY UTILIZING THE AGENCY METHOD OF CONSTRUCTION MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    LESKO KF; BERRIOCHOA MV

    2010-02-26

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high hazard work in a safe and productive manner. Prior to the WRPS contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) Contracts typically emphasize small business awards. As an integral part of Nuclear Project Management at Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method (John E Schaufelberger, Len Holm, "Management of Construction Projects, A Constructor's Perspective", University of Washington, Prentice Hall 2002). This method was implemented in the first quarter of Fiscal Year 2009 (FY2009), where Construction Management is performed by substantially home office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven Construction Managers and Field Leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project. Construction execution contracts are

  9. A localized in vivo detection method for lactate using zero quantum coherence techniques

    NARCIS (Netherlands)

    van Dijk, J. E.; Bosman, D. K.; Chamuleau, R. A.; Bovee, W. M.

    1991-01-01

    A method is described to selectively measure lactate in vivo using proton zero quantum coherence techniques. The signal from lipids is eliminated. A surface coil and additionally slice selective localization are used. The resulting spectra demonstrate the good performance of the method.

  10. An Enzymatic Clinical Chemistry Laboratory Experiment Incorporating an Introduction to Mathematical Method Comparison Techniques

    Science.gov (United States)

    Duxbury, Mark

    2004-01-01

    An enzymatic laboratory experiment based on the analysis of serum is described that is suitable for students of clinical chemistry. The experiment incorporates an introduction to mathematical method-comparison techniques in which three different clinical glucose analysis methods are compared using linear regression and Bland-Altman difference…
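
    A minimal sketch of the Bland-Altman agreement calculation referred to above, using hypothetical paired glucose measurements rather than the experiment's data.

```python
# Bland-Altman agreement between two measurement methods: bias and 95 % limits of agreement.
import numpy as np

method_a = np.array([5.1, 6.4, 7.8, 4.9, 9.2, 6.0])   # mmol/L (hypothetical)
method_b = np.array([5.3, 6.1, 8.1, 5.0, 9.0, 6.4])

diff = method_a - method_b
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:.3f}, limits of agreement = [{loa_low:.3f}, {loa_high:.3f}]")
```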

  11. Iterative Method of Regularization with Application of Advanced Technique for Detection of Contours

    International Nuclear Information System (INIS)

    Niedziela, T.; Stankiewicz, A.

    2000-01-01

    This paper proposes a novel iterative method of regularization with application of an advanced technique for detection of contours. To eliminate noise, the properties of convolution of functions are utilized. The method can be implemented in a simple cellular neural network, which creates the possibility of extraction of contours by automatic image recognition equipment. (author)

  12. Research Methods and Techniques in Spanish Library and Information Science Journals (2012-2014)

    Science.gov (United States)

    Ferran-Ferrer, Núria; Guallar, Javier; Abadal, Ernest; Server, Adan

    2017-01-01

    Introduction. This study examines the research methods and techniques used in Spanish journals of library and information science, the topics addressed by papers in these journals and their authorship affiliation. Method. The researchers selected 580 papers published in the top seven Spanish LIS journals indexed in Web of Science and Scopus and…

  13. Do COPD patients taught pursed lips breathing (PLB) for dyspnoea management continue to use the technique long-term? A mixed methodological study.

    Science.gov (United States)

    Roberts, S E; Schreuder, F M; Watson, T; Stern, M

    2017-12-01

    To investigate whether COPD patients taught pursed lips breathing (PLB) for dyspnoea management continue to use the technique long-term and, if so, their experience of this. A mixed methodological approach using semi-structured telephone interviews, a focus group and observation of current PLB technique was used. Qualitative analysis was based on grounded theory. Participants were recruited from two inner-city London (UK) boroughs. A purposive sample of 13 patients with COPD taught PLB 6 to 24 months previously. 11 participants took part in the telephone interviews; focus group participation and observed PLB were 5/11 and 6/11 respectively. A thematic analysis of the interviews and focus group; observation of PLB technique. Nine reported on-going use of PLB, with 8 reporting definite benefit. The observed technique showed the ongoing ability of PLB to reduce RR and increase SpO2. Four distinct themes emerged from the data: use of PLB when short of breath due to physical activity (8/9), increased confidence and reduced panic (4/9), use as an exercise (3/9), use at night (3/9). Those that had discontinued PLB had done so because it did not help (2) or because they had forgotten or were too busy to continue. This study found that 9 of 13 patients taught PLB continued with long-term use, with 8 of 13 reporting definite benefit from PLB. The role of PLB in increasing patients' confidence in their ability to manage their breathlessness, and its use at night, were novel findings. Copyright © 2016 Chartered Society of Physiotherapy. Published by Elsevier Ltd. All rights reserved.

  14. A Preconditioning Technique for First-Order Primal-Dual Splitting Method in Convex Optimization

    Directory of Open Access Journals (Sweden)

    Meng Wen

    2017-01-01

    We introduce a preconditioning technique for the first-order primal-dual splitting method. The primal-dual splitting method offers a very general framework for solving a large class of optimization problems arising in image processing. The key idea of the preconditioning technique is that the constant iterative parameters are updated self-adaptively in the iteration process. We also give a simple and easy way to choose the diagonal preconditioners while the convergence of the iterative algorithm is maintained. The efficiency of the proposed method is demonstrated on an image denoising problem. Numerical results show that the preconditioned iterative algorithm performs better than the original one.
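
    A rough sketch of a diagonally preconditioned first-order primal-dual (PDHG) iteration in the spirit of the paper, applied to a small generic problem min_x ||Kx - b||_1 + (lam/2)||x||^2 rather than the paper's image denoising setup; the row-sum/column-sum step-size rule shown is one common diagonal preconditioning choice, not necessarily the authors' exact update.

```python
# Diagonally preconditioned Chambolle-Pock / PDHG for min_x ||Kx - b||_1 + (lam/2)||x||^2.
# Problem data are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
m, n, lam = 30, 20, 0.1
K = rng.standard_normal((m, n))
b = rng.standard_normal(m)

absK = np.abs(K)
sigma = 1.0 / absK.sum(axis=1)          # dual step size per row
tau = 1.0 / absK.sum(axis=0)            # primal step size per column

x = np.zeros(n); x_bar = x.copy(); y = np.zeros(m)
for _ in range(500):
    # Dual update: prox of sigma*F*, with F(z) = ||z - b||_1, is a shifted box projection.
    y = np.clip(y + sigma * (K @ x_bar) - sigma * b, -1.0, 1.0)
    # Primal update: prox of tau*G, with G(x) = (lam/2)||x||^2.
    x_new = (x - tau * (K.T @ y)) / (1.0 + tau * lam)
    x_bar = 2.0 * x_new - x             # over-relaxation with theta = 1
    x = x_new

print("objective:", np.abs(K @ x - b).sum() + 0.5 * lam * (x @ x))
```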

  15. Krylov subspace method with communication avoiding technique for linear system obtained from electromagnetic analysis

    International Nuclear Information System (INIS)

    Ikuno, Soichiro; Chen, Gong; Yamamoto, Susumu; Itoh, Taku; Abe, Kuniyoshi; Nakamura, Hiroaki

    2016-01-01

    The Krylov subspace method and the variable preconditioned Krylov subspace method with a communication avoiding technique for a linear system obtained from electromagnetic analysis are numerically investigated. In the k-skip Krylov method, the inner product calculations are expanded by the Krylov basis, and the inner product calculations are transformed into scalar operations. The k-skip CG method is applied as the inner-loop solver of the variable preconditioned Krylov subspace methods, and the converged solution of the electromagnetic problem is obtained using the method. (author)
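
    As a baseline for the communication-avoiding variants discussed above, a plain conjugate gradient sketch for a symmetric positive definite system; the test matrix is a random placeholder, and no k-skip restructuring is attempted here.

```python
# Plain conjugate gradient for an SPD system Ax = b (reference, non-communication-avoiding).
import numpy as np

def cg(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

n = 50
M = np.random.default_rng(1).standard_normal((n, n))
A = M @ M.T + n * np.eye(n)          # symmetric positive definite test matrix
b = np.ones(n)
x = cg(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```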

  16. Application of response surface methodology for determination of methyl red in water samples by spectrophotometry method.

    Science.gov (United States)

    Khodadoust, Saeid; Ghaedi, Mehrorang

    2014-12-10

    In this study a rapid and effective method (dispersive liquid-liquid microextraction (DLLME)) was developed for extraction of methyl red (MR) prior to its determination by UV-Vis spectrophotometry. Influence variables on DLLME such as volume of chloroform (as extractant solvent) and methanol (as dispersive solvent), pH and ionic strength and extraction time were investigated. Then significant variables were optimized by using a Box-Behnken design (BBD) and desirability function (DF). The optimized conditions (100 μL of chloroform, 1.3 mL of ethanol, pH 4 and 4% (w/v) NaCl) resulted in a linear calibration graph in the range of 0.015-10.0 mg mL(-1) of MR in initial solution with R(2) = 0.995 (n = 5). The limits of detection (LOD) and limit of quantification (LOQ) were 0.005 and 0.015 mg mL(-1), respectively. Finally, the DLLME method was applied for determination of MR in different water samples with relative standard deviation (RSD) less than 5% (n = 5). Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Application of response surface methodology for determination of methyl red in water samples by spectrophotometry method

    Science.gov (United States)

    Khodadoust, Saeid; Ghaedi, Mehrorang

    2014-12-01

    In this study a rapid and effective method (dispersive liquid-liquid microextraction (DLLME) was developed for extraction of methyl red (MR) prior to its determination by UV-Vis spectrophotometry. Influence variables on DLLME such as volume of chloroform (as extractant solvent) and methanol (as dispersive solvent), pH and ionic strength and extraction time were investigated. Then significant variables were optimized by using a Box-Behnken design (BBD) and desirability function (DF). The optimized conditions (100 μL of chloroform, 1.3 mL of ethanol, pH 4 and 4% (w/v) NaCl) resulted in a linear calibration graph in the range of 0.015-10.0 mg mL-1 of MR in initial solution with R2 = 0.995 (n = 5). The limits of detection (LOD) and limit of quantification (LOQ) were 0.005 and 0.015 mg mL-1, respectively. Finally, the DLLME method was applied for determination of MR in different water samples with relative standard deviation (RSD) less than 5% (n = 5).
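
    A small sketch of one common way (the 3.3s/slope and 10s/slope rule) to estimate LOD and LOQ from a calibration line; the calibration points are hypothetical, and the paper's own LOD/LOQ values were obtained from its experimental data.

```python
# LOD/LOQ estimate from a calibration line using the residual standard deviation.
import numpy as np

conc = np.array([0.05, 0.1, 0.5, 1.0, 2.0, 5.0])              # mg/mL (hypothetical)
absorb = np.array([0.012, 0.024, 0.118, 0.231, 0.465, 1.152])  # absorbance (hypothetical)

slope, intercept = np.polyfit(conc, absorb, 1)
residuals = absorb - (slope * conc + intercept)
s = residuals.std(ddof=2)                                       # residual standard deviation

lod = 3.3 * s / slope
loq = 10.0 * s / slope
print(f"slope = {slope:.4f}, LOD ≈ {lod:.4f} mg/mL, LOQ ≈ {loq:.4f} mg/mL")
```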

  18. A fresh recipe for designers: HCI approach to explore the nexus between design techniques and formal methods in software development

    Directory of Open Access Journals (Sweden)

    Julian Galindo Losada

    2016-11-01

    Emerging companies involved in the design and implementation of innovative products demand multidisciplinary teams to be competitive in the market. This need mainly exposes designers to extend their knowledge not only to the User Interface elements of the design process but also to software methodologies, to cover the lack of resources and expertise in start-ups. It raises the question of how designers can line up HCI techniques with best practices in software development while preserving usability and ease-of-use principles. To explore this gap, this paper proposes an approach which combines existing technology and methods by studying the nexus between HCI prototyping and software engineering. The approach is applied in a case study on the design of a virtual shop, harmonizing the use of storyboards and the spiral. A comprehensive analysis is performed by using a Technology Acceptance Model (TAM) with regard to two variables: usability and ease of use. The present finding underlines the positive integration of HCI techniques and formal methods without compromising user satisfaction, with a potential benefit for small companies in a formation stage.

  19. Method and apparatus for in-situ drying investigation and optimization of slurry drying methodology

    Science.gov (United States)

    Armstrong, Beth L.; Daniel, Claus; Howe, Jane Y.; Kiggans, Jr, James O.; Sabau, Adrian S.; Wood, III, David L.; Kalnaus, Sergiy

    2016-05-10

    A method of drying casted slurries that includes calculating drying conditions from an experimental model for a cast slurry and forming a cast film. An infrared heating probe is positioned on one side of the casted slurry and a thermal probe is positioned on an opposing side of the casted slurry. The infrared heating probe may control the temperature of the casted slurry during drying. The casted slurry may be observed with an optical microscope, while applying the drying conditions from the experimental model. Observing the casted slurry includes detecting the incidence of micro-structural changes in the casted slurry during drying to determine if the drying conditions from the experimental model are optimal.

  20. Non-perturbative methodologies for low-dimensional strongly-correlated systems: From non-Abelian bosonization to truncated spectrum methods.

    Science.gov (United States)

    James, Andrew J A; Konik, Robert M; Lecheminant, Philippe; Robinson, Neil J; Tsvelik, Alexei M

    2018-02-26

    We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb-Liniger model, 1  +  1D quantum chromodynamics, as well as Landau-Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.

  1. Non-perturbative methodologies for low-dimensional strongly-correlated systems: From non-Abelian bosonization to truncated spectrum methods

    Science.gov (United States)

    James, Andrew J. A.; Konik, Robert M.; Lecheminant, Philippe; Robinson, Neil J.; Tsvelik, Alexei M.

    2018-04-01

    We review two important non-perturbative approaches for extracting the physics of low-dimensional strongly correlated quantum systems. Firstly, we start by providing a comprehensive review of non-Abelian bosonization. This includes an introduction to the basic elements of conformal field theory as applied to systems with a current algebra, and we orient the reader by presenting a number of applications of non-Abelian bosonization to models with large symmetries. We then tie this technique into recent advances in the ability of cold atomic systems to realize complex symmetries. Secondly, we discuss truncated spectrum methods for the numerical study of systems in one and two dimensions. For one-dimensional systems we provide the reader with considerable insight into the methodology by reviewing canonical applications of the technique to the Ising model (and its variants) and the sine-Gordon model. Following this we review recent work on the development of renormalization groups, both numerical and analytical, that alleviate the effects of truncating the spectrum. Using these technologies, we consider a number of applications to one-dimensional systems: properties of carbon nanotubes, quenches in the Lieb–Liniger model, 1  +  1D quantum chromodynamics, as well as Landau–Ginzburg theories. In the final part we move our attention to consider truncated spectrum methods applied to two-dimensional systems. This involves combining truncated spectrum methods with matrix product state algorithms. We describe applications of this method to two-dimensional systems of free fermions and the quantum Ising model, including their non-equilibrium dynamics.
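
    As a concrete reference point for the quantum Ising model mentioned in this review, a brute-force exact diagonalization of a small transverse-field Ising chain; this is a plain full-spectrum calculation, not a truncated spectrum method.

```python
# Exact diagonalization of a small open transverse-field Ising chain
# H = -J * sum_i sz_i sz_{i+1} - h * sum_i sx_i.
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)
I2 = np.eye(2)

def site_op(op, i, n):
    """Operator `op` acting on site i of an n-site chain (Kronecker product)."""
    mats = [I2] * n
    mats[i] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def tfim_hamiltonian(n, J=1.0, h=1.0):
    H = np.zeros((2**n, 2**n))
    for i in range(n - 1):                       # open boundary conditions
        H -= J * site_op(sz, i, n) @ site_op(sz, i + 1, n)
    for i in range(n):
        H -= h * site_op(sx, i, n)
    return H

E = np.linalg.eigvalsh(tfim_hamiltonian(8, J=1.0, h=1.0))
print("ground-state energy per site:", E[0] / 8)
```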

  2. Forecasting method for global radiation time series without training phase: Comparison with other well-known prediction methodologies

    International Nuclear Information System (INIS)

    Voyant, Cyril; Motte, Fabrice; Fouilloy, Alexis; Notton, Gilles; Paoli, Christophe; Nivet, Marie-Laure

    2017-01-01

    Integration of unpredictable renewable energy sources into electrical networks intensifies the complexity of grid management due to their intermittent and unforeseeable nature. Because of the strong increase in solar power generation, the prediction of solar yields becomes more and more important. Electrical operators need an estimation of the future production. For nowcasting and short term forecasting, the usual techniques based on machine learning need large historical data sets of good quality during the training phase of the predictors. However, such data are not always available and require advanced maintenance of meteorological stations, making these methods inapplicable for poorly instrumented or isolated sites. In this work, we propose intuitive methodologies based on the use of the Kalman filter (also known as linear quadratic estimation), able to predict a global radiation time series without the need for historical data. The accuracy of these methods is compared to other classical data driven methods, for different horizons of prediction and time steps. The proposed approach shows interesting capabilities allowing to improve the prediction quasi-systematically. For one to 10 h horizons, Kalman model performances are competitive in comparison to more sophisticated models such as ANN, which require both consistent historical data sets and computational resources. - Highlights: • Solar radiation forecasting with time series formalism. • Trainless approach compared to machine learning methods. • Very simple method dedicated to solar irradiation forecasting with high accuracy.
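
    A minimal sketch of the kind of trainless, Kalman-filter-based one-step-ahead forecast described above, using a scalar local-level (random-walk) state model and a synthetic series in place of measured global radiation.

```python
# One-step-ahead forecasting with a scalar Kalman filter (local-level model).
# The "measurements" are synthetic placeholders; q and r are hand-picked noise variances.
import numpy as np

rng = np.random.default_rng(2)
truth = np.cumsum(rng.normal(0, 1.0, 200)) + 50.0
y = truth + rng.normal(0, 2.0, 200)        # noisy observations

q, r = 1.0, 4.0                             # process / measurement noise variances
x_hat, p = y[0], 1.0                        # state estimate and its variance
forecasts = []
for obs in y[1:]:
    # Predict (random-walk state): the forecast for the next step is the current estimate.
    p += q
    forecasts.append(x_hat)
    # Update with the new observation.
    k = p / (p + r)
    x_hat += k * (obs - x_hat)
    p *= (1 - k)

rmse = np.sqrt(np.mean((np.array(forecasts) - truth[1:]) ** 2))
print("one-step-ahead RMSE:", round(rmse, 3))
```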

  3. Intracavitary after loading techniques, advantages and disadvantages with high and low dose-rate methods

    International Nuclear Information System (INIS)

    Walstam, Rune

    1980-01-01

    Even though it was suggested as early as 1903, it was only when suitable sealed gamma sources became available that afterloading methods could be developed for interstitial as well as intracavitary work. The manual afterloading technique can be used only for low dose rate irradiation, while the remote controlled afterloading technique can be used for both low and high dose-rate irradiation. Afterloading units used at the Karolinska Institute, Stockholm, are described, and experience of their use is narrated briefly. (M.G.B.)

  4. A company’s market value: the methodology of its valuation and methods for its maximization

    Directory of Open Access Journals (Sweden)

    Olexandr Kravchenko

    2007-03-01

    This article investigates the creation and monitoring of the fundamental value of a company, the methods of its valuation, and capital market responses to changes in the fundamental value. The author uses the basic theory of discounted cash flows as his main theoretical model. This theory states that the investment value equals the net present value of the future cash flows that are created as a result of this investment. Other theories referred to in the article are derived from the aforementioned model. The article contains an empirical analysis of the correlation dependence between the fundamental value and the market capitalization. Figures obtained from international companies over a 5-year period that showed the highest indices of fundamental value increase were used as output data. The article argues that the total business return has the highest correlation index with respect to a company's market value. The reasons affecting the results of the empirical research have been analyzed. The author gives some recommendations on the appreciation of a company's market value.
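
    A minimal sketch of the discounted-cash-flow identity the article builds on, with hypothetical cash flows and discount rate.

```python
# Net present value of a stream of expected future cash flows.
def npv(rate, cash_flows):
    """NPV of cash flows occurring at the end of years 1..n."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

expected_cash_flows = [120.0, 135.0, 150.0, 160.0, 170.0]   # currency units per year (hypothetical)
discount_rate = 0.10                                         # required rate of return (hypothetical)

print("fundamental value ≈", round(npv(discount_rate, expected_cash_flows), 2))
```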

  5. Authenticity study of Phyllanthus species by NMR and FT-IR techniques coupled with chemometric methods

    International Nuclear Information System (INIS)

    Santos, Maiara S.; Pereira-Filho, Edenir R.; Ferreira, Antonio G.; Boffo, Elisangela F.; Figueira, Glyn M.

    2012-01-01

    The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as 'quebra-pedras' in Brazil, are used in folk medicine for treating urinary infections and renal calculus. This paper reports an authenticity study, based on herbal drugs from Phyllanthus species, involving commercial and authentic samples using spectroscopic techniques: FT-IR, ¹H HR-MAS NMR and ¹H NMR in solution, combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, have great potential in the investigation of complex matrices. Furthermore, several metabolites were identified by the NMR techniques. (author)

  6. Authenticity study of Phyllanthus species by NMR and FT-IR Techniques coupled with chemometric methods

    Directory of Open Access Journals (Sweden)

    Maiara S. Santos

    2012-01-01

    The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as "quebra-pedras" in Brazil, are used in folk medicine for treating urinary infections and renal calculus. This paper reports an authenticity study, based on herbal drugs from Phyllanthus species, involving commercial and authentic samples using spectroscopic techniques: FT-IR, ¹H HR-MAS NMR and ¹H NMR in solution, combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, have great potential in the investigation of complex matrices. Furthermore, several metabolites were identified by the NMR techniques.

  7. Authenticity study of Phyllanthus species by NMR and FT-IR techniques coupled with chemometric methods

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maiara S.; Pereira-Filho, Edenir R.; Ferreira, Antonio G. [Universidade Federal de Sao Carlos (UFSCAR), SP (Brazil). Dept. de Quimica; Boffo, Elisangela F. [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Inst. de Quimica; Figueira, Glyn M., E-mail: maiarassantos@yahoo.com.br [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Centro Pluridisciplinar de Pesquisas Quimicas, Biologicas e Agricolas

    2012-07-01

    The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as 'quebra-pedras' in Brazil, are used in folk medicine for treating urinary infections and renal calculus. This paper reports an authenticity study, based on herbal drugs from Phyllanthus species, involving commercial and authentic samples using spectroscopic techniques: FT-IR, ¹H HR-MAS NMR and ¹H NMR in solution, combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, have great potential in the investigation of complex matrices. Furthermore, several metabolites were identified by the NMR techniques. (author)

  8. Authenticity study of Phyllanthus species by NMR and FT-IR techniques coupled with chemometric methods

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Maiara S.; Pereira-Filho, Edenir R.; Ferreira, Antonio G. [Universidade Federal de Sao Carlos (UFSCAR), SP (Brazil). Dept. de Quimica; Boffo, Elisangela F. [Universidade Federal da Bahia (UFBA), Salvador, BA (Brazil). Inst. de Quimica; Figueira, Glyn M., E-mail: maiarassantos@yahoo.com.br [Universidade Estadual de Campinas (UNICAMP), Campinas, SP (Brazil). Centro Pluridisciplinar de Pesquisas Quimicas, Biologicas e Agricolas

    2012-07-01

    The importance of medicinal plants and their use in industrial applications is increasing worldwide, especially in Brazil. Phyllanthus species, popularly known as 'quebra-pedras' in Brazil, are used in folk medicine for treating urinary infections and renal calculus. This paper reports an authenticity study, based on herbal drugs from Phyllanthus species, involving commercial and authentic samples using spectroscopic techniques: FT-IR, ¹H HR-MAS NMR and ¹H NMR in solution, combined with chemometric analysis. The spectroscopic techniques evaluated, coupled with chemometric methods, have great potential in the investigation of complex matrices. Furthermore, several metabolites were identified by the NMR techniques. (author)

  9. Influence of molecularbiological techniques upon nuclearmedical methods; Einfluss molekularbiologischer Verfahren auf nuklearmedizinische Methoden

    Energy Technology Data Exchange (ETDEWEB)

    Haberkorn, U. [Abt. fuer Nuklearmedizin, Klinik fuer Radiologie, Univ. Heidelberg (Germany)

    2005-03-01

    Basic research delivers information concerning new molecular structures with potential use as target structures for diagnosis and therapy; these then need to be selected and evaluated by the methods of physiology, biochemistry and pharmacology, in part using nuclear medical techniques for the estimation of gene function and regulation. Pharmacogenomics will identify new surrogate markers as potential new radiotracers for the follow-up of therapies. New therapeutic approaches will need biodistribution studies in the preclinical stage and techniques for the evaluation of efficiency. Finally, biotechnological techniques such as phage display may be suitable for developing new biomolecules for isotope-related diagnostics and therapy. (orig.)

  10. Active methodologies in Financial Management classes: an alternative to the traditional teaching method for awakening intrinsic motivation and developing autonomy

    Directory of Open Access Journals (Sweden)

    Guilherme Muniz Pereira Chaves Urias

    2017-01-01

    This article presents a pedagogical experience in Financial Management classes. The objective of this study was to investigate whether an educational activity based on active methodologies, applied in the Financial Management classes of an undergraduate course in Business Administration, can offer formative spaces that enhance the development of students' intrinsic motivation to the point of being relevant to the development of their autonomy, and can thus be characterized as a viable way of putting the Freirean pedagogy into practice. In order to do so, the adopted teaching strategy aimed at creating opportunities for interpreting problems that simulated real situations. A questionnaire was applied and Bardin's content analysis was used to verify the students' impressions about the activity itself and its respective contribution to their professional and personal training. The analysis points to the fact that active methodologies are viable alternatives to the traditional method of teaching regarding the awakening of interest, motivation and the development of learning. It also points to their consonance with the Freirean pedagogy.

  11. A Methods and procedures to apply probabilistic safety Assessment (PSA) techniques to the cobalt-therapy process. Cuban experience

    International Nuclear Information System (INIS)

    Vilaragut Llanes, J.J.; Ferro Fernandez, R.; Lozano Lima, B; De la Fuente Puch, A.; Dumenigo Gonzalez, C.; Troncoso Fleitas, M.; Perez Reyes, Y.

    2003-01-01

    This paper presents the results of the Probabilistic Safety Analysis (PSA) of the Cobalt Therapy Process, which was performed as part of the International Atomic Energy Agency's Coordinated Research Project (CRP) to Investigate Appropriate Methods and Procedures to Apply Probabilistic Safety Assessment (PSA) Techniques to Large Radiation Sources. The primary methodological tools used in the analysis were Failure Modes and Effects Analysis (FMEA), Event Trees and Fault Trees. These tools were used to evaluate occupational, public and medical exposures during cobalt therapy treatment. The emphasis of the study was on the radiological protection of patients. During the course of the PSA, several findings were analysed concerning the cobalt treatment process. In relation to the undesired event probabilities, the lowest exposure probabilities correspond to public exposures during the treatment process (Z21), around 10⁻¹⁰ per year, while worker exposures (Z11) are around 10⁻⁴ per year. Regarding the patient, the Z33 (undesired dose to normal tissue) and Z34 (non-irradiated portion of the target volume) probabilities prevail. Patient accidental exposures are also classified in terms of the extent to which the error is likely to affect individual treatments, individual patients, or all the patients treated on a specific unit. Sensitivity analyses were carried out to determine the influence of certain tasks or critical stages on the results. As a conclusion the study establishes that PSA techniques may effectively and reasonably determine the risk associated with the cobalt-therapy treatment process, though there are some weaknesses in its methodological application for this kind of study requiring further research. These weaknesses are due to the fact that traditional PSA has mainly been applied to complex hardware systems designed to operate with a high automation level, whilst the cobalt therapy treatment is a relatively simple hardware system with a
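
    A generic illustration of the gate arithmetic behind figures like those quoted above: combining basic-event probabilities through AND/OR gates under an independence assumption. The events and values are hypothetical, not the cobalt-therapy model.

```python
# Fault-tree style combination of basic-event probabilities (independence assumed).
def p_and(*ps):                      # all events must occur
    out = 1.0
    for p in ps:
        out *= p
    return out

def p_or(*ps):                       # at least one event occurs
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

p_interlock_fails = 1e-3             # per demand (hypothetical)
p_operator_error = 5e-3
p_timer_fault = 2e-4

# Undesired exposure requires an initiating fault AND failure of the interlock barrier.
p_initiator = p_or(p_operator_error, p_timer_fault)
p_undesired_exposure = p_and(p_initiator, p_interlock_fails)
print(f"P(undesired exposure) ≈ {p_undesired_exposure:.1e} per demand")
```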

  12. Comparison of dye dilution method to radionuclide techniques for cardiac output determination in dogs

    International Nuclear Information System (INIS)

    Eng, S.S.; Robayo, J.R.; Porter, W.; Smith, R.E.

    1980-01-01

    A study was undertaken to identify the most accurate 99mTc-labeled radiopharmaceutical and to determine the accuracy of a noninvasive radionuclide technique for cardiac output determinations. Phase I employed sodium pertechnetate, stannous pyrophosphate with sodium pertechnetate, 99mTc red blood cells, and 99mTc human serum albumin as radionuclide tracers. Cardiac output was determined by the dye dilution method and then by the invasive radionuclide technique. A paired t test and regression analysis indicated that 99mTc human serum albumin was the most accurate radiopharmaceutical for cardiac output determinations, and the results compared favorably to those obtained by the dye dilution method. In Phase II, 99mTc human serum albumin was used as the radionuclide tracer for cardiac output determinations with the noninvasive technique. The results compared favorably to those obtained by the dye dilution method.
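
    For context, a sketch of the Stewart-Hamilton calculation underlying the dye dilution reference method: cardiac output as injected indicator divided by the area under the downstream concentration-time curve. The curve is synthetic and recirculation is ignored.

```python
# Stewart-Hamilton indicator-dilution calculation (first pass only, recirculation ignored).
import numpy as np

dose_mg = 5.0                                        # injected indicator (hypothetical)
t = np.linspace(0, 30, 31)                           # seconds
conc = 10.0 * (t / 6.0) * np.exp(-t / 6.0)           # mg/L, synthetic first-pass curve

area = np.trapz(conc, t)                             # mg*s/L
cardiac_output_l_per_min = dose_mg / area * 60.0
print(f"cardiac output ≈ {cardiac_output_l_per_min:.1f} L/min")
```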

  13. Double sliding-window technique: a new method to calculate the neuronal response onset latency.

    Science.gov (United States)

    Berényi, Antal; Benedek, György; Nagy, Attila

    2007-10-31

    Neuronal response onset latency provides important data on the information processing within the central nervous system. In order to enhance the quality of onset latency estimation, we have developed a 'double sliding-window' technique, which combines the advantages of mathematical methods with the reliability of standard statistical processes. This method is based on repetitive series of statistical probes between two virtual time windows. The layout of the significance curve reveals the starting points of changes in neuronal activity in the form of break-points between linear segments. A second-order difference function is applied to determine the position of maximum slope change, which corresponds to the onset of the response. In comparison with Poisson spike-train analysis, the cumulative sum technique and the method of Falzett et al., this 'double sliding-window' technique seems to be a more accurate automated procedure for calculating the response onset latency of a broad range of neuronal response characteristics.
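
    A rough sketch of the double sliding-window idea on simulated spike trains: two adjacent windows slide across the trials, a statistical probe compares firing between them at each position, and the onset is read off where the significance curve changes slope most. The paired t statistic used here stands in for whichever probe the authors applied, and the simulated data are placeholders.

```python
# Double sliding-window onset estimate on simulated trial-aligned spike trains.
import numpy as np

rng = np.random.default_rng(3)
n_trials, t_max, dt = 40, 1.0, 0.001
time = np.arange(0, t_max, dt)
onset_true = 0.30                                    # response begins 300 ms after stimulus
rate = np.where(time < onset_true, 5.0, 25.0)        # spikes/s, step increase at onset
spikes = rng.random((n_trials, time.size)) < rate * dt   # Bernoulli spike trains

win = int(0.05 / dt)                                 # 50 ms windows
centers = np.arange(win, time.size - win)
sig = np.zeros(centers.size)
for i, c in enumerate(centers):
    pre = spikes[:, c - win:c].sum(axis=1)           # counts per trial, window before c
    post = spikes[:, c:c + win].sum(axis=1)          # counts per trial, window after c
    d = post - pre
    sig[i] = abs(d.mean()) / (d.std(ddof=1) / np.sqrt(n_trials) + 1e-12)

# Onset: position of maximum slope change of the smoothed significance curve.
smooth = np.convolve(sig, np.ones(21) / 21, mode="same")
onset_idx = np.argmax(np.abs(np.diff(smooth, 2))) + 1
print("estimated onset ≈", round(time[centers[onset_idx]], 3), "s")
```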

  14. Methodological challenges for the large N study of local participatory experiences. Combining methods and databases

    Directory of Open Access Journals (Sweden)

    Galais, Carolina

    2012-12-01

    In this article we analyse the effects of different data collection strategies in the study of local participatory experiences in a region of Spain (Andalusia). We examine the divergences and similarities between the data collected using different methods, as well as the implications for the reliability of the data. We have collected participatory experiences through two parallel processes: a survey of municipalities and web content mining. The survey of municipalities used two complementary strategies: an online questionnaire and a CATI follow-up for those municipalities that had not answered our first online contact attempt. Both processes (survey and data mining) were applied to the same sample of municipalities, but provided significantly different images of the characteristics of Andalusia’s participatory landscape. The goal of this work is to discuss the different types of biases introduced by each data collection procedure and their implications for substantive analyses.

    In this article we analyse the effects of different strategies for data collection in the study of Andalusian participatory experiences. To do so, we examine the differences and similarities between the data collected using different methods, as well as the implications for the reliability of the data. We used two parallel procedures: first, a survey of municipalities, and second, data mining on the Internet. The survey was carried out using two different modes of administration, an online questionnaire and a follow-up telephone questionnaire for the municipalities that did not respond to the first contact attempt by e-mail. Both the survey and the data mining were applied to the same sample of municipalities, although they yielded significant differences regarding the characteristics of the participatory landscape in Andalusia. The goal of this work is to discuss the different types

  15. Problem-solving and developing quality management methods and techniques on the example of automotive industry

    Directory of Open Access Journals (Sweden)

    Jacek Łuczak

    2015-12-01

    The knowledge of methods and techniques of quality management, together with their effective use, can definitely be regarded as an indication of high organisational culture. Using such methods and techniques in an effective way can be attributed to a certain level of maturity, as far as the quality management system in an organisation is concerned. The paper presents an analysis of problem-solving methods and techniques of quality management in the automotive sector in Poland. The survey was given to the general population, which in the case of this study consisted of companies operating in Poland that had quality management systems certified against ISO/TS 16949. The results of the conducted survey and the conclusions of the author can show actual and potential OEM suppliers (both 1st and 2nd tier) in which direction their strategies for the development and improvement of quality management systems should go in order to be effective. When the universal character of the methods and techniques used in the surveyed population of companies is taken into consideration, it can be assumed that the results of the survey are also universal for all organisations realising the TQM strategy. The results of the research confirmed that methods which are also the basis for creating key system documents are the most relevant ones, i.e. flowcharts and FMEA, as well as process monitoring tools (SPC) and problem-solving methods, above all 8D.

  16. Proposal for evaluation methodology on impact resistant performance and construction method of tornado missile protection net structure

    International Nuclear Information System (INIS)

    Namba, Kosuke; Shirai, Koji

    2014-01-01

    In nuclear power plants, the necessity of tornado missile protection structures is becoming a key technical issue. Utilization of a net structure seems to be one of the realistic countermeasures from the point of view of mitigating wind and seismic loads. However, the methodology for the selection of suitable net materials, the energy absorption design method and the construction method are not sufficiently established. In this report, three materials (high-strength metal mesh, super strong polyethylene fiber net and steel grating) were selected as candidate materials, and material screening tests, energy absorption tests by free drop of a heavy weight, and impact tests with a small-diameter missile were carried out. As a result, high-strength metal mesh was selected as a suitable material for a tornado missile protection net structure. Moreover, the construction method to obtain good energy absorption performance of the material and a practical design method to estimate the energy absorption of the high-strength metal mesh under tornado missile impact load were proposed. (author)

  17. Validation of methodologies for the analysis of lead and methyl-ether in gasoline, using the techniques of atomic emission with plasma source coupled inductively and micellar liquid chromatography

    International Nuclear Information System (INIS)

    Redondo Escalante, M.

    1995-01-01

    This study established and optimized the experimental variables for lead quantification by the ICP-AES technique in aqueous media. A comparative study of several methods proposed in the literature for the extraction of lead from gasoline into aqueous media was made. It was determined that it is not possible to carry out this procedure using the hydrolysis reaction of tetraethyl lead. The optimum conditions were established for lead quantification in gasoline, using methyl isobutyl ketone and also ethanol as solvents. The conditions of the proposed methodologies were optimized, and the analytical performance variables were defined. It was demonstrated that it is possible to prepare lead standard solutions in organic media starting from inorganic salts of this metal. The techniques of gas chromatography and high-performance liquid chromatography for the analysis of methyl tert-butyl ether (MTBE) were compared. It was demonstrated that it is possible to quantify MTBE by the HPLC technique, and it was found that the 'micellar' liquid chromatography. (author) [es

  18. Novel technique of making thin target foil of high density material via rolling method

    Science.gov (United States)

    Gupta, C. K.; Rohilla, Aman; Singh, R. P.; Singh, Gurjot; Chamoli, S. K.

    2018-05-01

    The conventional rolling method fails to yield good-quality thin foils of thickness less than 2 mg/cm² for high-density materials with Z ≥ 70 (e.g. gold, lead). A special and improved technique has been developed to obtain such low-thickness, good-quality gold foils by the rolling method. Using this technique, thin gold foils with thicknesses in the range of 0.850-2.5 mg/cm² were obtained in the present work. By making use of alcohol during rolling, foils of thickness 1 mg/cm² can be obtained in a shorter time and with less effort.

  19. Teaching research methods in nursing using Aronson's Jigsaw Technique. A cross-sectional survey of student satisfaction.

    Science.gov (United States)

    Leyva-Moral, Juan M; Riu Camps, Marta

    2016-05-01

    To adapt nursing studies to the European Higher Education Area, new teaching methods have been included that assign maximum importance to student-centered learning and collaborative work. The Jigsaw Technique is based on collaborative learning and everyone in the group must play their part, because each student's mark depends on the other students. Home group members are given the responsibility to become experts in a specific area of knowledge. Experts meet together to reach an agreement and improve skills. Finally, experts return to their home groups to share all their findings. The aim of this study was to evaluate nursing student satisfaction with the Jigsaw Technique used in the context of a compulsory course in research methods for nursing. A cross-sectional study was conducted using a self-administered anonymous questionnaire given to students who completed the Research Methods course during the 2012-13 and 2013-14 academic years. The questionnaire was developed taking into account the learning objectives, competencies and skills that should be acquired by students, as described in the course syllabus. The responses were compared by age group (younger or older than 22 years). A total of 89.6% of nursing students under 22 years believed that this methodology helped them to develop teamwork, while this figure was 79.6% in older students. Nursing students also believed it helped them to work independently, with differences according to age: 79.7% and 58%, respectively (p=0.010). Students disagreed with the statement "The Jigsaw Technique involves little workload", with percentages of 88.5% in the group under 22 years and 80% in older students. Most believed that this method should not be employed in upcoming courses, although there were differences by age, with 44.3% of the younger group against it versus 62% of the older group (p=0.037). The method was not highly valued by students, mainly by those older than 22 years, who concluded that they did not learn

  20. A new methodology based on the two-region model and microscopic noise analysis techniques for absolute measurements of βeff, Λ and βeff/Λ of the IPEN/MB-01 reactor

    International Nuclear Information System (INIS)

    Kuramoto, Renato Yoichi Ribeiro

    2007-01-01

    A new method for absolute measurement of the effective delayed neutron fraction, βeff, based on microscopic noise experiments and the Two-Region Model was developed at the IPEN/MB-01 Research Reactor facility. In contrast with other techniques like the Modified Bennett Method, Nelson-Number Method and ²⁵²Cf-Source Method, the main advantage of this new methodology is that it obtains the effective delayed neutron parameters in a purely experimental way, eliminating all parameters that are difficult to measure or calculate. Rossi-α and Feynman-α experiments for validation of this method were performed at the IPEN/MB-01 facility, and adopting the present approach, βeff was measured with a 0.67% uncertainty. In addition, the prompt neutron generation time, Λ, and other parameters were also obtained in an absolute experimental way. In general, the final results agree well with values from frequency analysis experiments. The theory-experiment comparison reveals that JENDL-3.3 shows a deviation for βeff lower than 1%, which meets the desired accuracy for the theoretical determination of this parameter. This work supports the reduction of the ²³⁵U thermal yield as proposed by Okajima and Sakurai. (author)

  1. Standardization of Laser Methods and Techniques for Vibration Measurements and Calibrations

    International Nuclear Information System (INIS)

    Martens, Hans-Juergen von

    2010-01-01

    The realization and dissemination of the SI units of motion quantities (vibration and shock) have been based on laser interferometer methods specified in international documentary standards. New and refined laser methods and techniques developed by national metrology institutes and by leading manufacturers in the past two decades have been swiftly specified as standard methods for inclusion in the ISO 16063 series of international documentary standards. A survey of ISO Standards for the calibration of vibration and shock transducers demonstrates the extended ranges and improved accuracy (measurement uncertainty) of laser methods and techniques for vibration and shock measurements and calibrations. The first standard for the calibration of laser vibrometers by laser interferometry, or by a reference accelerometer calibrated by laser interferometry (ISO 16063-41), is at the Draft International Standard (DIS) stage and may be issued by the end of 2010. The standard methods with refined techniques have proved to achieve wider measurement ranges and smaller measurement uncertainties than those specified in the ISO Standards. The applicability of different standardized interferometer methods to vibrations at high frequencies was recently demonstrated up to 347 kHz (acceleration amplitudes up to 350 km/s²). The relative deviations between the amplitude measurement results of the different interferometer methods, applied simultaneously, were less than 1% in all cases.

  2. An efficient preconditioning technique using Krylov subspace methods for 3D characteristics solvers

    International Nuclear Information System (INIS)

    Dahmani, M.; Le Tellier, R.; Roy, R.; Hebert, A.

    2005-01-01

    The Generalized Minimal RESidual (GMRES) method, using a Krylov subspace projection, is adapted and implemented to accelerate a 3D iterative transport solver based on the characteristics method. Another acceleration technique, called the self-collision rebalancing technique (SCR), can also be used to accelerate the solution or as a left preconditioner for GMRES. The GMRES method is usually used to solve a linear algebraic system (Ax = b). It uses K(r⁽⁰⁾, A) as the projection subspace and AK(r⁽⁰⁾, A) for the orthogonalization of the residual. This paper compares the performance of these two combined methods on various problems. To implement the GMRES iterative method, the characteristics equations are derived in linear algebra formalism by using the equivalence between the method of characteristics and the method of collision probability, ending up with a linear algebraic system involving fluxes and currents. Numerical results show good performance of the GMRES technique, especially for cases presenting large material heterogeneity with a scattering ratio close to 1. Similarly, the SCR preconditioning slightly increases the GMRES efficiency.
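
    To make the linear-algebra step concrete, the sketch below solves a generic sparse system Ax = b with SciPy's GMRES and a simple diagonal (Jacobi) left preconditioner standing in for the role SCR plays in the paper; the matrix, its size and the preconditioner are illustrative assumptions, not the authors' transport operator.

```python
# Hedged sketch (not the authors' solver): GMRES on a generic sparse system A x = b,
# with a diagonal (Jacobi) preconditioner playing the role a left preconditioner such
# as SCR would play. The matrix below is an illustrative stand-in for the MOC operator.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000
main = 4.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csr")
b = np.ones(n)

# Jacobi preconditioner M ~ A^{-1}, applied by GMRES at each iteration.
M = spla.LinearOperator((n, n), matvec=lambda v: v / A.diagonal())

x, info = spla.gmres(A, b, M=M, restart=30)
print("converged" if info == 0 else f"not converged (info={info})",
      "| residual norm:", np.linalg.norm(b - A @ x))
```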

  3. A methodology for semiautomatic taxonomy of concepts extraction from nuclear scientific documents using text mining techniques; Metodologia para extracao semiautomatica de uma taxonomia de conceitos a partir da producao cientifica da area nuclear utilizando tecnicas de mineracao de textos

    Energy Technology Data Exchange (ETDEWEB)

    Braga, Fabiane dos Reis

    2013-07-01

    This thesis presents a text mining method for the semi-automatic extraction of a taxonomy of concepts from a textual corpus composed of scientific papers related to the nuclear area. Text classification is a natural human practice and a crucial task for working with large repositories. The document clustering technique provides a logical and understandable framework that facilitates organization, browsing and searching. Most clustering algorithms use the bag-of-words model to represent the content of a document. This model generates a high dimensionality of the data, ignores the fact that different words can have the same meaning, and does not consider the relationships between them, assuming that words are independent of each other. The methodology presented combines a model for document representation by concepts with a hierarchical document clustering method using the frequency of co-occurrence of concepts, and a technique for labeling clusters with their most representative concepts, with the objective of producing a taxonomy of concepts which may reflect the structure of the knowledge domain. It is hoped that this work will contribute to the conceptual mapping of the scientific production of the nuclear area and thus support the management of research activities in this area. (author)

  4. Leak detection of complex pipelines based on the filter diagonalization method: robust technique for eigenvalue assessment

    International Nuclear Information System (INIS)

    Lay-Ekuakille, Aimé; Pariset, Carlo; Trotta, Amerigo

    2010-01-01

    The FDM (filter diagonalization method), an interesting technique used in nuclear magnetic resonance data processing to tackle FFT (fast Fourier transform) limitations, can be applied by considering pipelines, especially complex configurations, as a vascular apparatus with arteries, veins, capillaries, etc. Thrombosis, which might occur in humans, can be considered analogous to a leakage in the complex pipeline that is the human vascular apparatus. The choice of eigenvalues in FDM or in spectra-based techniques is a key issue in recovering the solution of the main equation (for FDM) or the frequency-domain transformation (for FFT) in order to determine the accuracy in detecting leaks in pipelines. This paper deals with the possibility of improving the leak detection accuracy of the FDM technique thanks to a robust algorithm that assesses the eigenvalue problem, making it less experimental and more analytical by using Tikhonov-based regularization techniques. The paper starts from the results of previous experimental procedures carried out by the authors.
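
    As a reminder of what Tikhonov-based regularization does in such an inverse problem, the sketch below solves a small ill-conditioned least-squares system for several damping values; the matrix and noise level are illustrative assumptions and do not reproduce the paper's FDM formulation.

```python
# Hedged sketch of Tikhonov regularization (not the paper's FDM formulation):
# for an ill-conditioned model A x ~ b, the damped solution is
#   x_lam = argmin ||A x - b||^2 + lam^2 ||x||^2 = (A^T A + lam^2 I)^{-1} A^T b.
import numpy as np

def tikhonov_solve(A, b, lam):
    """Solve the Tikhonov-regularized normal equations for damping parameter lam."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam ** 2 * np.eye(n), A.T @ b)

# Illustrative ill-conditioned problem (Hilbert-like matrix) with a little noise.
n = 8
A = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
x_true = np.ones(n)
b = A @ x_true + 1e-6 * np.random.default_rng(0).standard_normal(n)

for lam in (1e-6, 1e-3, 1e-1):
    x = tikhonov_solve(A, b, lam)
    print(f"lam={lam:g}  reconstruction error={np.linalg.norm(x - x_true):.3e}")
```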

  5. Iterative methods used in overlap astrometric reduction techniques do not always converge

    Science.gov (United States)

    Rapaport, M.; Ducourant, C.; Colin, J.; Le Campion, J. F.

    1993-04-01

    In this paper we prove that the classical Gauss-Seidel type iterative methods used for the solution of the reduced normal equations occurring in overlap reduction methods of astrometry do not always converge, and we exhibit examples of divergence. We then analyze an alternative algorithm proposed by Wang (1985). We prove the consistency of this algorithm and verify that it can be convergent while the Gauss-Seidel method is divergent. We conjecture the convergence of Wang's method for the solution of astrometric problems using overlap techniques.
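
    The kind of failure described above is easy to reproduce on a toy system: Gauss-Seidel converges when the coefficient matrix is diagonally dominant but can diverge when it is not. The matrices below are illustrative assumptions, not the astrometric normal equations of the paper.

```python
# Hedged illustration (not the astrometric normal equations): Gauss-Seidel sweeps
#   x_i <- (b_i - sum_{j != i} a_ij x_j) / a_ii
# converge for a diagonally dominant matrix and diverge for a non-dominant one.
import numpy as np

def gauss_seidel(A, b, iters=50):
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        for i in range(len(b)):
            s = A[i] @ x - A[i, i] * x[i]        # contribution of the other unknowns
            x[i] = (b[i] - s) / A[i, i]
    return x

b = np.array([1.0, 2.0])
A_good = np.array([[4.0, 1.0], [1.0, 3.0]])      # diagonally dominant -> converges
A_bad = np.array([[1.0, 3.0], [4.0, 1.0]])       # not dominant -> iterates blow up

for name, A in (("dominant", A_good), ("non-dominant", A_bad)):
    x = gauss_seidel(A, b)
    print(f"{name}: residual norm = {np.linalg.norm(b - A @ x):.3e}")
```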

  6. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    Science.gov (United States)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using the moving array lenslet technique (MALT) in the pickup stage is proposed, which can decrease the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within a pitch to get N sets of elemental images. A computational integral imaging reconstruction method for MALT is used to obtain the slice images of the 3D scene, and the sum modulus difference (SMD) blur metric is computed on these slice images to extract the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
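
    A sum-modulus-difference style focus measure can be written in a few lines: it sums absolute differences between neighbouring pixels, and the reconstruction depth whose slice maximizes the score is taken as the object depth. The exact metric definition used in the paper may differ, and the slice data below are synthetic stand-ins.

```python
# Hedged sketch of a sum-modulus-difference (SMD) style blur/focus metric; the exact
# definition used in the paper may differ. The depth whose reconstructed slice gives
# the highest score is taken as the estimated depth. Slice data here are synthetic.
import numpy as np

def smd(image):
    """Sum of |I(x,y)-I(x-1,y)| + |I(x,y)-I(x,y-1)| over the whole image."""
    return float(np.abs(np.diff(image, axis=0)).sum()
                 + np.abs(np.diff(image, axis=1)).sum())

def estimate_depth(slices, depths):
    """Pick the depth whose slice image has the highest SMD score."""
    scores = [smd(s) for s in slices]
    return depths[int(np.argmax(scores))], scores

# Synthetic stand-ins for reconstructed slice images (sharper slice = larger contrast).
rng = np.random.default_rng(1)
slices = [contrast * rng.random((64, 64)) for contrast in (0.2, 1.0, 0.5)]
depth, scores = estimate_depth(slices, depths=[10.0, 20.0, 30.0])
print("estimated depth:", depth, "| SMD scores:", [round(s) for s in scores])
```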

  7. A systematic methodology for creep master curve construction using the stepped isostress method (SSM): a numerical assessment

    Science.gov (United States)

    Miranda Guedes, Rui

    2018-02-01

    Long-term creep of viscoelastic materials is experimentally inferred through accelerating techniques based on the time-temperature superposition principle (TTSP) or on the time-stress superposition principle (TSSP). According to these principles, a given property measured for short times at a higher temperature or higher stress level remains the same as that obtained for longer times at a lower temperature or lower stress level, except that the curves are shifted parallel to the horizontal axis, matching a master curve. These procedures enable the construction of creep master curves with short-term experimental tests. The Stepped Isostress Method (SSM) is an evolution of the classical TSSP method. Higher reduction of the required number of test specimens to obtain the master curve is achieved by the SSM technique, since only one specimen is necessary. The classical approach, using creep tests, demands at least one specimen per each stress level to produce a set of creep curves upon which TSSP is applied to obtain the master curve. This work proposes an analytical method to process the SSM raw data. The method is validated using numerical simulations to reproduce the SSM tests based on two different viscoelastic models. One model represents the viscoelastic behavior of a graphite/epoxy laminate and the other represents an adhesive based on epoxy resin.

  8. Double-lock technique: a simple method to secure abdominal wall closure

    International Nuclear Information System (INIS)

    Jategaonkar, P.A.; Yadav, S.P.

    2013-01-01

    Secure closure of a laparotomy incision remains an important aspect of any abdominal operation, with the aim of avoiding postoperative morbidity and hastening the patient's recovery. Depending on the operator's preference and experience, it may be done by the continuous or the interrupted method, using either a non-absorbable or a delayed-absorbable suture. We describe a simple, secure and quick technique of abdominal wall closure which involves a continuous suture inter-locked doubly after every third bite. This simple and easy-to-use mass closure technique can be easily mastered by any member of the surgical team and does not need any assistant. It amalgamates the advantages of both the continuous and the interrupted methods of closure. To our knowledge, such a technique has not been reported in the literature. (author)

  9. The quality of reporting methods and results of cost-effectiveness analyses in Spain: a methodological systematic review.

    Science.gov (United States)

    Catalá-López, Ferrán; Ridao, Manuel; Alonso-Arroyo, Adolfo; García-Altés, Anna; Cameron, Chris; González-Bermejo, Diana; Aleixandre-Benavent, Rafael; Bernal-Delgado, Enrique; Peiró, Salvador; Tabarés-Seisdedos, Rafael; Hutton, Brian

    2016-01-07

    Cost-effectiveness analysis has been recognized as an important tool to determine the efficiency of healthcare interventions and services. There is a need to evaluate the reporting of methods and results of cost-effectiveness analyses and to establish their validity. We describe and examine reporting characteristics of the methods and results of cost-effectiveness analyses conducted in Spain over more than two decades. A methodological systematic review was conducted with the information obtained through an updated literature review in PubMed and complementary databases (e.g. Scopus, ISI Web of Science, the National Health Service Economic Evaluation Database (NHS EED) and Health Technology Assessment (HTA) databases from the Centre for Reviews and Dissemination (CRD), Índice Médico Español (IME), and Índice Bibliográfico Español en Ciencias de la Salud (IBECS)). We identified cost-effectiveness analyses conducted in Spain that used quality-adjusted life years (QALYs) as outcome measures (period 1989-December 2014). Two reviewers independently extracted the data from each paper. The data were analysed descriptively. In total, 223 studies were included. Very few studies (10; 4.5%) reported working from a protocol. Most studies (200; 89.7%) were simulation models and included a median of 1000 patients. Only 105 (47.1%) studies presented an adequate description of the characteristics of the target population. Most study interventions were categorized as therapeutic (189; 84.8%) and nearly half (111; 49.8%) considered an active alternative as the comparator. Effectiveness data were derived from a single study in 87 (39.0%) reports, and only a few (40; 17.9%) used evidence synthesis-based estimates. Few studies (42; 18.8%) reported a full description of the methods for QALY calculation. The majority of the studies (147; 65.9%) reported that the study intervention produced "more costs and more QALYs" than the comparator. Most studies (200; 89.7%) reported favourable

  10. The comparison of MCNP perturbation technique with MCNP difference method in critical calculation

    International Nuclear Information System (INIS)

    Liu Bin; Lv Xuefeng; Zhao Wei; Wang Kai; Tu Jing; Ouyang Xiaoping

    2010-01-01

    For a nuclear fission system, we calculated Δkeff arising from changes in the system material composition by two different approaches, the MCNP perturbation technique and the MCNP difference method. For every material composition change, we made four different runs, each run with a different number of cycles or with each cycle generating a different number of neutrons, and then compared the two Δkeff values obtained by the two approaches. When a material composition change in any particular cell of the nuclear fission system is small compared to the material compositions in the whole system, in other words, when the change can be treated as a small perturbation, the Δkeff results obtained from the MCNP perturbation technique are much quicker, more efficient and more reliable than the results from the MCNP difference method. When a material composition change in any particular cell is significant compared to the material compositions in the whole system, both the MCNP perturbation technique and the MCNP difference method can give satisfactory results. However, for runs with the same number of cycles and the same number of neutrons per cycle, the results obtained from the MCNP perturbation technique are systematically lower than the results obtained from the MCNP difference method. To further confirm our calculation results from MCNP4C, we ran the exact same MCNP4C input file in MCNP5; the calculation results from MCNP5 were the same as those from MCNP4C. Caution is needed when using the MCNP perturbation technique to calculate Δkeff when the material composition change is large compared to the material compositions in the whole nuclear fission system, even if the material composition change of any particular cell of the fission system still meets the criteria of the MCNP perturbation technique.

  11. Status of the Usage of Active Learning and Teaching Method and Techniques by Social Studies Teachers

    Science.gov (United States)

    Akman, Özkan

    2016-01-01

    The purpose of this study was to determine the active learning and teaching methods and techniques which are employed by the social studies teachers working in state schools of Turkey. This usage status was assessed using different variables. This was a case study, wherein the research was limited to 241 social studies teachers. These teachers…

  12. Technique for Increasing the Selectivity of the Method of Laser Fragmentation/Laser-Induced Fluorescence

    Science.gov (United States)

    Bobrovnikov, S. M.; Gorlov, E. V.; Zharkov, V. I.

    2018-05-01

    A technique for increasing the selectivity of the method of detecting high-energy materials (HEMs) based on laser fragmentation of HEM molecules with subsequent laser excitation of fluorescence of the characteristic NO fragments from the first vibrational level of the ground state is suggested.

  13. Combined smoothing method and its use in combining Earth orientation parameters measured by space techniques

    Czech Academy of Sciences Publication Activity Database

    Vondrák, Jan; Čepek, A.

    2000-01-01

    Roč. 147, č. 2 (2000), s. 347-359 ISSN 0365-0138 R&D Projects: GA ČR GA205/98/1104 Institutional research plan: CEZ:AV0Z1003909 Keywords: numerical methods * miscellaneous techniques * reference systems Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics Impact factor: 1.745, year: 2000

  14. Workers' Education Methods and Techniques for Rural Workers and Their Organisations: Summary of Views Expressed

    Science.gov (United States)

    Labour Education, 1975

    1975-01-01

    Several issues concerning rural workers' organizations and workers' education are discussed: motivation for self-organization, workers' education needs of rural workers, workers' education methods and techniques, training institutions and training personnel, financial resources, and the role of the International Labor Organization workers'…

  15. Novel Technique for Safe Primary Trocar Insertion in Laparoscopy: Chou's Method

    Directory of Open Access Journals (Sweden)

    Pan-Hsin Chou

    2005-06-01

    Conclusion: The results with this novel method incorporating the unique concept of directly holding the fascia suggest it to be relatively safe, simple, and economic. The risk of major vascular injury was decreased to nil by this technique and the chance of visceral injury was also minimal.

  16. A Learning Method for Neural Networks Based on a Pseudoinverse Technique

    Directory of Open Access Journals (Sweden)

    Chinmoy Pal

    1996-01-01

    Full Text Available A theoretical formulation of a fast learning method based on a pseudoinverse technique is presented. The efficiency and robustness of the method are verified with the help of an Exclusive OR problem and a dynamic system identification of a linear single degree of freedom mass–spring problem. It is observed that, compared with the conventional backpropagation method, the proposed method has a better convergence rate and a higher degree of learning accuracy with a lower equivalent learning coefficient. It is also found that unlike the steepest descent method, the learning capability of which is dependent on the value of the learning coefficient ν, the proposed pseudoinverse based backpropagation algorithm is comparatively robust with respect to its equivalent variable learning coefficient. A combination of the pseudoinverse method and the steepest descent method is proposed for a faster, more accurate learning capability.
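
    The closed-form flavour of such a learning step can be illustrated on the Exclusive OR problem mentioned in the abstract: fix random hidden-layer weights and compute the output weights with the Moore-Penrose pseudoinverse instead of iterating gradient descent. This is a hedged sketch of the general idea, not the authors' exact pseudoinverse-backpropagation algorithm.

```python
# Hedged sketch of a pseudoinverse-based learning step (not the authors' exact
# algorithm): random input-to-hidden weights are kept fixed and the output weights
# are obtained in closed form with the Moore-Penrose pseudoinverse, on the XOR task.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)            # XOR targets

rng = np.random.default_rng(42)
n_hidden = 8
W = rng.standard_normal((2, n_hidden))                     # fixed random hidden weights
b = rng.standard_normal(n_hidden)

H = np.tanh(X @ W + b)                                     # hidden-layer activations
beta = np.linalg.pinv(H) @ y                               # output weights, one shot

print("predictions:", (H @ beta).ravel().round(2), "| targets:", y.ravel())
```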

  17. Development of a gap measurement technique for in-vessel corium retention using the ultrasonic pulse echo method

    International Nuclear Information System (INIS)

    Koo, Kil Mo; Kim, Jong Hwan; Kang, Kyung Ho; Kim, Sang Baik; Sim, Cheul Muu

    1999-03-01

    A gap between the molten material and the lower vessel is formed in the LAVA experiment, a phase 1 study of the Sonata-IV program. In this technical report, quantitative results of the gap measurement using an off-line ultrasonic pulse echo method are presented. The report aims at the development of an appropriate ultrasonic test method by analyzing problems arising from external environmental causes and from the internal characteristics of the measurement. The signal analysis methods for improving the S/N ratio are divided into the time-variant synthesized signal analysis method and the time-invariant synthesized signal analysis method. In this report, the possibility of applying these two methods to the gap signal and the noise is considered. In the test, the signals transmitted and reflected through a solid-liquid-solid specimen were analyzed to understand the behavior of the reflected signal in a multi-layered structure, by filling the gap between the melt and the lower head vessel with water. The quantitative gap measurement using the off-line ultrasonic pulse echo method was possible for only a small part of the scanned region, but better results are expected with further use of DSP and imaging techniques. Some of the measured signals are presented as a 2-dimensional spherical mapping using distance and amplitude. Other signals, for which quantitative measurement was difficult, have been saved for a new signal processing method. (author). 11 refs., 4 tabs., 54 figs

  18. [Application of three heat pulse technique-based methods to determine the stem sap flow].

    Science.gov (United States)

    Wang, Sheng; Fan, Jun

    2015-08-01

    It is of critical importance to acquire tree transpiration characteristics through sap flow methodology to understand tree water physiology, forest ecology and ecosystem water exchange. Tri-probe heat pulse (TPHP) sensors, which are widely utilized in measuring soil thermal parameters and soil evaporation, were applied to measure Salix matsudana sap flow density (Vs) via the heat-ratio method (HRM), the T-Max method (T-Max) and the single heat pulse probe (SHPP) method, and a comparative analysis was conducted against results measured with additional Granier thermal dissipation probes (TDP). The results showed that it took about five weeks to reach a stable measurement stage after TPHP installation; Vs measured with the three methods in the early stage after installation was 135%-220% higher than Vs in the stable measurement stage. Vs estimated via the HRM, T-Max and SHPP methods was significantly linearly correlated with Vs estimated via the TDP method, with R² of 0.93, 0.73 and 0.91, respectively, and R² for Vs measured by SHPP and HRM reached 0.94. HRM had relatively higher precision in measuring low and reverse sap flow rates. The SHPP method seemed very promising for measuring sap flow owing to its configuration simplicity and high measuring accuracy, although it could not distinguish the direction of flow. The T-Max method had a relatively higher error in sap flow measurement and could not measure sap flow below 5 cm³·cm⁻²·h⁻¹; thus this method cannot be used alone, although it can measure the thermal diffusivity needed for calculating sap flow when other methods are applied. It is recommended to choose a proper method, or a combination of several methods, to measure stem sap flow based on the specific research purpose.
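
    For reference, the heat-ratio method computes the heat pulse velocity from the logarithm of the ratio of temperature rises measured at equal distances downstream and upstream of the heater. The formula and parameter values below follow the standard HRM literature (e.g. Burgess et al. 2001) rather than this paper, and the numbers are purely illustrative.

```python
# Hedged sketch of the heat-ratio method (HRM) velocity calculation, as given in the
# standard HRM literature (e.g. Burgess et al. 2001), not taken from this paper.
import numpy as np

def hrm_heat_pulse_velocity(v1, v2, k=2.5e-3, x=0.6):
    """Heat pulse velocity in cm/h.

    v1, v2 : temperature rises downstream / upstream of the heater (same units)
    k      : assumed thermal diffusivity of fresh sapwood, cm^2/s
    x      : distance between heater and temperature probes, cm
    """
    return (k / x) * np.log(v1 / v2) * 3600.0

# Example: the downstream probe warms slightly more than the upstream one.
print(f"heat pulse velocity ~ {hrm_heat_pulse_velocity(0.80, 0.62):.2f} cm/h")
```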

  19. Nano-Evaluris: an inhalation and explosion risk evaluation method for nanoparticle use. Part I: description of the methodology

    Science.gov (United States)

    Bouillard, Jacques X.; Vignes, Alexis

    2014-02-01

    In this paper, an inhalation health and explosion safety risk assessment methodology for nanopowders is described. Since toxicological threshold limit values are still unknown for nanosized substances, detailed risk assessments of specific plants cannot yet be carried out. A simple approach based on occupational hazard/exposure bands expressed in mass concentrations is proposed for nanopowders. This approach is consolidated with an iso-surface toxicological scaling method which, although incomplete, has the merit of providing concentration threshold levels for which new metrological instruments should be developed for proper air monitoring in order to ensure safety. Whenever the processing or use of nanomaterials introduces a risk to the worker, a specific nano pictogram is proposed to inform the worker. Examples of risk assessment of process equipment (i.e., containment valves) processing various nanomaterials are provided. Explosion risks related to very reactive nanomaterials such as aluminum nanopowders can be assessed using this new analysis methodology adapted to nanopowders. It is nevertheless found that, to formalize and extend this approach, it is absolutely necessary to develop new relevant standard apparatuses and to qualify individual and collective safety barriers with respect to health and explosion risks. In spite of these uncertainties, it appears, as shown in the second paper (Part II), that health and explosion risks, evaluated for given MWCNTs and aluminum nanoparticles, remain manageable in continuous fabrication mode, considering current individual and collective safety barriers that can be put in place. The authors would, however, underline that particular attention must be paid to non-continuous modes of operation, such as process equipment cleaning steps, which are often under-analyzed and too often forgotten critical steps that need vigilance in order to minimize potential toxic and explosion risks.

  20. Degradation of ticarcillin by the subcritical water oxidation method: Application of response surface methodology and artificial neural network modeling.

    Science.gov (United States)

    Yabalak, Erdal

    2018-05-18

    This study was performed to investigate the mineralization of ticarcillin in an artificially prepared aqueous solution representing ticarcillin-contaminated waters, which constitute a serious problem for human health. Removal of 81.99% of total organic carbon, 79.65% of chemical oxygen demand, and 94.35% of ticarcillin was achieved by using the eco-friendly, time-saving, powerful and easy-to-apply subcritical water oxidation method in the presence of a safe-to-use oxidizing agent, hydrogen peroxide. Central composite design, which belongs to response surface methodology, was applied to design the degradation experiments, to optimize the method, and to evaluate the effects of the system variables, namely temperature, hydrogen peroxide concentration, and treatment time, on the responses. In addition, theoretical equations were proposed for each removal process. ANOVA tests were utilized to evaluate the reliability of the fitted models. F values of 245.79, 88.74, and 48.22 were found for total organic carbon removal, chemical oxygen demand removal, and ticarcillin removal, respectively. Moreover, artificial neural network modeling was applied to estimate the response in each case, and its prediction and optimization performance was statistically examined and compared to the performance of the central composite design.
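
    The response-surface step amounts to fitting a full quadratic model to the coded central-composite-design runs by least squares. The sketch below does exactly that for three factors matching the abstract (temperature, hydrogen peroxide concentration, treatment time); the design coding and the synthetic removal values are illustrative assumptions, not the study's data.

```python
# Hedged sketch of a response-surface fit on a central composite design (CCD):
# y = b0 + sum b_i x_i + sum b_ii x_i^2 + sum b_ij x_i x_j, fitted by least squares.
# Factors follow the abstract; the responses below are synthetic placeholders.
import itertools
import numpy as np

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared and two-factor interaction terms."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(X.shape[1])]
    cols += [X[:, i] ** 2 for i in range(X.shape[1])]
    cols += [X[:, i] * X[:, j]
             for i, j in itertools.combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# Coded CCD runs for 3 factors: factorial cube, axial points (alpha = 1.682), centre.
cube = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
axial = np.vstack([1.682 * np.eye(3), -1.682 * np.eye(3)])
X = np.vstack([cube, axial, np.zeros((4, 3))])

rng = np.random.default_rng(7)
y = 80 + 5 * X[:, 0] + 3 * X[:, 1] + 2 * X[:, 2] - 4 * X[:, 0] ** 2 \
    + rng.normal(0, 0.5, len(X))                 # synthetic % removal values

coeffs, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
print("fitted quadratic coefficients:", coeffs.round(2))
```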

  1. Field Application of Cable Tension Estimation Technique Using the h-SI Method

    Directory of Open Access Journals (Sweden)

    Myung-Hyun Noh

    2015-01-01

    Full Text Available This paper investigates the field applicability of a new system identification technique for estimating the tensile force of cables in long-span bridges. The newly proposed h-SI method, which combines a sensitivity updating algorithm with an advanced hybrid microgenetic algorithm, not only avoids the trap of local minima at the initial searching stage but also finds the optimal solution with better numerical efficiency than existing methods. First, this paper reviews the tension estimation procedure through a theoretical formulation. Secondly, the validity of the proposed technique is numerically examined using a set of dynamic data obtained from benchmark numerical samples considering the effects of sag extensibility and bending stiffness of a sag-cable system. Finally, the feasibility of the proposed method is investigated through actual field data extracted from the cable-stayed Seohae Bridge. The test results show that existing methods require precise initial data in advance, whereas the proposed method is not affected by such initial information. In particular, the proposed method improves the accuracy and the convergence rate toward the final values. Consequently, the proposed method can be more effective than existing methods in characterizing the tensile force variation of cable structures.

  2. System Response Analysis and Model Order Reduction, Using Conventional Method, Bond Graph Technique and Genetic Programming

    Directory of Open Access Journals (Sweden)

    Lubna Moin

    2009-04-01

    Full Text Available This research paper explores and compares different modeling and analysis techniques and then examines the model order reduction approach and its significance. Traditional modeling and simulation techniques for dynamic systems are generally adequate for single-domain systems only, whereas the Bond Graph technique provides new strategies for reliable solutions of multi-domain systems. Bond Graphs are also used for analyzing linear and nonlinear dynamic production systems, artificial intelligence, image processing, robotics and industrial automation. This paper describes a unique technique of generating a Genetic design from the tree-structured transfer function obtained from a Bond Graph. This research work combines Bond Graphs for model representation with Genetic Programming for exploring different ideas in the design space. The tree-structured transfer function results from replacing typical Bond Graph elements with their impedance equivalents, specifying impedance laws for the Bond Graph multiports. The tree-structured form thus obtained from the Bond Graph is used to generate the Genetic tree. Application studies will identify key issues and their importance for advancing this approach towards becoming an effective and efficient design tool for synthesizing designs for electrical systems. In the first phase, the system is modeled using the Bond Graph technique. Its system response and transfer function obtained with the conventional and the Bond Graph method are analyzed, and then an approach towards model order reduction is pursued. The suggested algorithm and other known modern model order reduction techniques are applied, with different approaches, to an 11th-order high-pass filter [1]. The model order reduction technique developed in this paper has the least reduction errors and, secondly, the final model retains structural information. The system response and the stability analysis of the system transfer function obtained by the conventional and by the Bond Graph method are compared and

  3. The bridge technique for pectus bar fixation: a method to make the bar un-rotatable.

    Science.gov (United States)

    Park, Hyung Joo; Kim, Kyung Soo; Moon, Young Kyu; Lee, Sungsoo

    2015-08-01

    Pectus bar rotation is a major challenge in pectus repair. However, to date, no satisfactory technique to completely eliminate bar displacement has been introduced. Here, we propose a bar fixation technique using a bridge that makes the bar unmovable. The purpose of this study was to determine the efficacy of this bridge technique. A total of 80 patients underwent pectus bar repair of pectus excavatum with the bridge technique from July 2013 to July 2014. The technique involved connecting 2 parallel bars using plate-screws at the ends of the bars. To determine bar position change, the angles between the sternum and the pectus bars were measured on postoperative day 5 (POD5) and at 4 months (POM4) and compared. The mean patient age was 17.5 years (range, 6-38 years). The mean differences between POD5 and POM4 were 0.23° (P=.602) and 0.35° (P=.338) for the upper and lower bars, respectively. Bar position was virtually unchanged during the follow-up, and there was no bar dislocation or reoperation. A "bridge technique" designed to connect 2 parallel bars using plates and screws was demonstrated as a method to avoid pectus bar displacement. This approach was easy to implement without using sutures or invasive devices. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. On the methanol permeability through pristine Nafion® and Nafion/PVA membranes measured by different techniques. A comparison of methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Molla, S.; Compan, V. [Departmento de Termodinamica Aplicada, Escuela de Ingenieria Tecnica Industrial (ETSII), Universidad Politecnica de Valencia, 46022 Valencia (Spain); Instituto Tecnologico de la Energia (ITE), Av. Juan de la Cierva 24, 46980 Paterna, Valencia (Spain); Luis Lafuente, S. [Departmento de Quimica Organica, Universidad Jaume I, 12072 Castellon (Spain); Prats, J. [Departmento de Termodinamica Aplicada, Escuela de Ingenieria Tecnica Industrial (ETSII), Universidad Politecnica de Valencia, 46022 Valencia (Spain)

    2011-12-15

    Methanol crossover through polymer electrolyte membranes is a critical issue and causes an important reduction of performance in direct methanol fuel cells (DMFCs). Measuring the evolution of CO₂ gas at the cathode is a common method to determine the methanol crossover under real operating conditions, although an easier and simpler method is preferable for the screening of membranes during their development stage. In this sense, this work has focused on the ex situ characterization of the methanol permeability of novel nanofiber-reinforced composite Nafion/PVA membranes for DMFC application by means of three different experimental procedures: (a) a potentiometric method, (b) the gas chromatography technique, and (c) density measurement. It was found that all these methods gave comparable results, and it was observed that the incorporation of the PVA nanofiber phase within the Nafion® matrix causes a remarkable reduction of the methanol permeability. The optimal choice of the most suitable technique depends on the accuracy expected for the methanol concentration, the availability of the required instrumentation, and the complexity of the procedure. (Copyright © 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  5. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Validation of a residue method to determine pesticide residues in cucumber by using nuclear techniques

    International Nuclear Information System (INIS)

    Baysoyu, D.; Tiryaki, O.; Secer, E.; Aydin, G.

    2009-01-01

    In this study, a multi-residue method using ethyl acetate for extraction and gel permeation chromatography for clean-up was validated to determine chlorpyrifos, malathion and dichlorvos in cucumber by gas chromatography. For this purpose, homogenized cucumber samples were fortified with pesticides at 0.02, 0.2, 0.8 and 1 mg/kg levels. The efficiency and repeatability of the method in the extraction and cleanup steps were determined using ¹⁴C-carbaryl by the radioisotope tracer technique. ¹⁴C-carbaryl recoveries after the extraction and cleanup steps were between 92.63-111.73% with a repeatability of 4.85% (CV) and 74.83-102.22% with a repeatability of 7.19% (CV), respectively. The homogeneity of analytical samples and the stability of pesticides during homogenization were determined using the radiotracer technique and chromatographic methods, respectively.

  7. Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes

    Directory of Open Access Journals (Sweden)

    Jose M. Bernal-de-Lázaro

    2016-05-01

    Full Text Available This article summarizes the main contributions of the PhD thesis titled "Application of learning techniques based on kernel methods for fault diagnosis in industrial processes". The thesis focuses on the analysis and design of fault diagnosis systems (DDF) based on historical data. Specifically, it provides: (1) new criteria for tuning the kernel methods used to select features with a high discriminative capacity for fault diagnosis tasks; (2) a proposed approach to process monitoring using multivariate statistical techniques that incorporates reinforced information concerning the dynamics of Hotelling's T² and SPE statistics, whose combination with kernel methods improves the detection of small-magnitude faults; and (3) a robustness index to compare the performance of diagnosis classifiers, taking into account their insensitivity to possible noise and disturbances in the historical data.
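
    To make the monitoring statistics concrete, the sketch below builds a plain PCA model on normal-operation data and evaluates Hotelling's T² and the SPE (Q) statistic for new samples; the kernel variants and the dynamic "reinforced" information developed in the thesis are not reproduced, and the data are synthetic.

```python
# Hedged sketch of Hotelling's T^2 and SPE monitoring from a plain PCA model of
# normal-operation data (the thesis' kernel and dynamic extensions are not shown).
import numpy as np

rng = np.random.default_rng(0)
X_train = rng.standard_normal((500, 6))              # synthetic historical data
mu, sigma = X_train.mean(axis=0), X_train.std(axis=0)
Z = (X_train - mu) / sigma

U, S, Vt = np.linalg.svd(Z, full_matrices=False)     # PCA via SVD
k = 3
P = Vt[:k].T                                          # loadings (variables x k)
lam = (S[:k] ** 2) / (len(Z) - 1)                     # variances of retained scores

def t2_spe(x):
    """Hotelling's T^2 and squared prediction error (SPE) for one raw sample."""
    z = (x - mu) / sigma
    t = z @ P                                         # scores in the PCA subspace
    residual = z - t @ P.T                            # part not explained by the model
    return float(np.sum(t ** 2 / lam)), float(residual @ residual)

print("normal sample    (T2, SPE):", t2_spe(rng.standard_normal(6)))
print("deviating sample (T2, SPE):", t2_spe(np.array([6.0, 0, 0, 0, 0, 0])))
```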

  8. Technique of Critical Current Density Measurement of Bulk Superconductor with Linear Extrapolation Method

    International Nuclear Information System (INIS)

    Adi, Wisnu Ari; Sukirman, Engkir; Winatapura, Didin S.

    2000-01-01

    A technique for measuring the critical current density (Jc) of HTc bulk ceramic superconductors has been developed using linear extrapolation with the four-point probe method. The measurement of the critical current density of HTc bulk ceramic superconductors usually causes damage at the contact resistance. In order to decrease this damage factor, we introduce an extrapolation method. The extrapolated data show that the critical current densities Jc for YBCO (123) and BSCCO (2212) at 77 K are 10.85(6) A·cm⁻² and 14.46(6) A·cm⁻², respectively. This technique is easier and simpler, and the applied current is low, so it will not damage the contact resistance of the sample. We expect that the method can give a better solution for bulk superconductor applications. Key words: superconductor, critical temperature, and critical current density
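
    The abstract does not spell out the extrapolation procedure, so the sketch below shows one common reading of a linear-extrapolation estimate: fit a straight line to the resistive part of the four-point-probe V-I curve, extrapolate it back to zero voltage to get Ic, and divide by the cross-section for Jc. The data and the sample area are synthetic assumptions.

```python
# Hedged sketch of a linear-extrapolation estimate of the critical current (one common
# reading of the approach; the paper's exact procedure is not given in the abstract).
import numpy as np

# Synthetic four-point-probe V-I points above the onset of dissipation (A, microvolts).
I = np.array([1.2, 1.4, 1.6, 1.8, 2.0])
V = np.array([0.5, 2.6, 4.4, 6.5, 8.4])

slope, intercept = np.polyfit(I, V, 1)     # V ~ slope * I + intercept
Ic = -intercept / slope                    # current where the fitted line crosses V = 0
area_cm2 = 0.12                            # assumed sample cross-section, cm^2

print(f"Ic ~ {Ic:.2f} A   Jc ~ {Ic / area_cm2:.1f} A/cm^2")
```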

  9. Variance-to-mean method generalized by linear difference filter technique

    International Nuclear Information System (INIS)

    Hashimoto, Kengo; Ohsaki, Hiroshi; Horiguchi, Tetsuo; Yamane, Yoshihiro; Shiroya, Seiji

    1998-01-01

    The conventional variance-to-mean method (Feynman-α method) seriously suffers from divergence of the variance under transient conditions such as a reactor power drift. Strictly speaking, then, the use of the Feynman-α method is restricted to a steady state. To apply the method to more practical uses, it is desirable to overcome this kind of difficulty. For this purpose, we propose the use of a higher-order difference filter technique to reduce the effect of the reactor power drift, and derive several new formulae taking the filtering into account. The capability of the proposed formulae was demonstrated through experiments in the Kyoto University Critical Assembly. The experimental results indicate that the divergence of the variance can be effectively suppressed by the filtering technique, and that a higher-order filter becomes necessary with increasing variation rate in power
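
    The effect the authors describe can be reproduced on synthetic count data: a slow drift in the mean count rate inflates the raw variance-to-mean ratio, while differencing adjacent gate counts largely removes it. The first-order filter below is only the simplest member of the family treated in the paper; the higher-order formulae are not reproduced, and the data are synthetic.

```python
# Hedged sketch of the variance-to-mean (Feynman) statistic and of a first-order
# difference filter suppressing a slow power drift; the paper's higher-order filtered
# formulae are not reproduced here. Count data are synthetic Poisson samples.
import numpy as np

def feynman_y(counts):
    """Feynman Y: variance-to-mean ratio of the gate counts minus one."""
    return counts.var(ddof=1) / counts.mean() - 1.0

rng = np.random.default_rng(3)
n_gates = 20000
drift = np.linspace(100.0, 140.0, n_gates)       # slowly drifting mean count rate
counts = rng.poisson(drift)

y_raw = feynman_y(counts)                        # inflated by the drift
diffs = np.diff(counts)                          # first-order difference filter
y_filtered = diffs.var(ddof=1) / (2.0 * counts.mean()) - 1.0

print(f"Y without filtering:           {y_raw:.3f}")
print(f"Y with 1st-order differencing: {y_filtered:.3f}  (about 0 for pure Poisson)")
```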

  10. Development and application of preventive maintenance technique for pipes using laser cladding method

    International Nuclear Information System (INIS)

    Hatakenaka, Hiroaki; Yamadera, Masao; Shiraiwa, Takanori.

    1995-01-01

    A laser cladding method which produces a highly corrosion-resistant coating (cladding) on the surface of the material was developed for the purpose of preventing stress corrosion cracking (SCC) in austenitic stainless steel (Type 304). In this method, metallic powder paste is applied to the inner surface of pipes, and then a YAG laser beam is irradiated onto the paste, which melts and forms a clad with excellent corrosion resistance. Recently, the laser cladding method was successfully applied in practice to an actual nuclear power plant in Japan. This report describes the laser cladding technique, the equipment, and the actual work in the field. (author)

  11. Identification techniques for phenomenological models of hysteresis based on the conjugate gradient method

    International Nuclear Information System (INIS)

    Andrei, Petru; Oniciuc, Liviu; Stancu, Alexandru; Stoleriu, Laurentiu

    2007-01-01

    An identification technique for the parameters of phenomenological models of hysteresis is presented. The basic idea of our technique is to set up a system of equations for the parameters of the model as a function of known quantities on the major or minor hysteresis loops (e.g. coercive force, susceptibilities at various points, remanence), or other magnetization curves. This system of equations can be either over or underspecified and is solved by using the conjugate gradient method. Numerical results related to the identification of parameters in the Energetic, Jiles-Atherton, and Preisach models are presented
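
    The identification loop itself is generic: collect the known loop quantities, define the model-minus-measurement residuals, and minimize their sum of squares with a nonlinear conjugate gradient routine. In the sketch below a toy two-parameter model stands in for the Energetic, Jiles-Atherton or Preisach models, and all target values are illustrative assumptions.

```python
# Hedged sketch of the identification idea with a toy two-parameter model standing in
# for the hysteresis models of the paper; residuals between "measured" loop quantities
# and model predictions are minimized with a nonlinear conjugate gradient method.
import numpy as np
from scipy.optimize import minimize

# Illustrative "measured" quantities: coercive force, remanence, a susceptibility.
measured = np.array([5.0, 0.8, 0.2])

def model_predictions(params):
    """Toy closed-form model mapping two parameters to (Hc, Mr, chi)."""
    a, b = params
    return np.array([a, np.tanh(b), b / (1.0 + a)])

def objective(params):
    r = model_predictions(params) - measured
    return float(r @ r)

result = minimize(objective, x0=np.array([2.0, 0.5]), method="CG")
print("identified parameters:", result.x.round(3), "| residual:", round(result.fun, 6))
```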

  12. Profile of research methodology and statistics training of ...

    African Journals Online (AJOL)

    The aim of this study was to determine the profile of research methodology and ... Method: Respondents for this descriptive study were persons responsible for the ..... universities: all study designs, all sampling techniques, incidence and.

  13. Evaluation of user input methods for manipulating a tablet personal computer in sterile techniques.

    Science.gov (United States)

    Yamada, Akira; Komatsu, Daisuke; Suzuki, Takeshi; Kurozumi, Masahiro; Fujinaga, Yasunari; Ueda, Kazuhiko; Kadoya, Masumi

    2017-02-01

    To determine a quick and accurate user input method for manipulating tablet personal computers (PCs) in sterile techniques. We evaluated three different manipulation methods, (1) Computer mouse and sterile system drape, (2) Fingers and sterile system drape, and (3) Digitizer stylus and sterile ultrasound probe cover with a pinhole, in terms of the central processing unit (CPU) performance, manipulation performance, and contactlessness. A significant decrease in CPU score ([Formula: see text]) and an increase in CPU temperature ([Formula: see text]) were observed when a system drape was used. The respective mean times taken to select a target image from an image series (ST) and the mean times for measuring points on an image (MT) were [Formula: see text] and [Formula: see text] s for the computer mouse method, [Formula: see text] and [Formula: see text] s for the finger method, and [Formula: see text] and [Formula: see text] s for the digitizer stylus method, respectively. The ST for the finger method was significantly longer than for the digitizer stylus method ([Formula: see text]). The MT for the computer mouse method was significantly longer than for the digitizer stylus method ([Formula: see text]). The mean success rate for measuring points on an image was significantly lower for the finger method when the diameter of the target was equal to or smaller than 8 mm than for the other methods. No significant difference in the adenosine triphosphate amount at the surface of the tablet PC was observed before, during, or after manipulation via the digitizer stylus method while wearing starch-powdered sterile gloves ([Formula: see text]). Quick and accurate manipulation of tablet PCs in sterile techniques without CPU load is feasible using a digitizer stylus and sterile ultrasound probe cover with a pinhole.

  14. MS-Based Analytical Techniques: Advances in Spray-Based Methods and EI-LC-MS Applications

    Science.gov (United States)

    Medina, Isabel; Cappiello, Achille; Careri, Maria

    2018-01-01

    Mass spectrometry is the most powerful technique for the detection and identification of organic compounds. It can provide molecular weight information and a wealth of structural details that give a unique fingerprint for each analyte. Due to these characteristics, mass spectrometry-based analytical methods are attracting increasing interest in the scientific community, especially in the food safety, environmental, and forensic investigation areas, where the simultaneous detection of targeted and nontargeted compounds represents a key factor. In addition, safety risks can be identified at an early stage through online and real-time analytical methodologies. In this context, several efforts have been made to achieve analytical instrumentation able to perform real-time analysis in the native environment of samples and to generate highly informative spectra. This review article provides a survey of some instrumental innovations and their applications, with particular attention to spray-based MS methods and food analysis issues. The survey will attempt to cover the state of the art from 2012 up to 2017.

  15. An acceleration technique for 2D MOC based on Krylov subspace and domain decomposition methods

    International Nuclear Information System (INIS)

    Zhang Hongbo; Wu Hongchun; Cao Liangzhi

    2011-01-01

    Highlights: → We convert MOC into a linear system solved by GMRES as an acceleration method. → We use a domain decomposition method to overcome the inefficiency on large matrices. → Parallel technology is applied and a matched ray tracing system is developed. → Results show good efficiency even in large-scale and strong-scattering problems. → The emphasis is that the technique is geometry-flexible. - Abstract: The method of characteristics (MOC) has great geometrical flexibility but poor computational efficiency in neutron transport calculations. The generalized minimal residual (GMRES) method, a type of Krylov subspace method, is utilized to accelerate a 2D generalized geometry characteristics solver, AutoMOC. In this technique, a form of linear algebraic equation system for angular flux moments and boundary fluxes is derived to replace the conventional characteristics sweep (i.e. inner iteration) scheme, and then the GMRES method is implemented as an efficient linear system solver. This acceleration method is proved to be reliable in theory and simple to implement. Furthermore, as it introduces no restriction in geometry treatment, it is suitable for accelerating an arbitrary geometry MOC solver. However, it is observed that the speedup decreases when the matrix becomes larger. The spatial domain decomposition method and multiprocessing parallel technology are then employed to overcome the problem. The calculation domain is partitioned into several sub-domains. For each of them, a smaller matrix is established and solved by GMRES, and the adjacent sub-domains are coupled by 'inner-edges', where the trajectory mismatches are considered adequately. Moreover, a matched ray tracing system is developed on the basis of AutoCAD, which allows a user to define the sub-domains on demand conveniently. Numerical results demonstrate that the acceleration techniques are efficient without loss of accuracy, even in the case of large-scale and strong scattering

  16. Development of an incipient rotor crack detection method by acoustic emission techniques

    International Nuclear Information System (INIS)

    Le Reverend, D.; Massouri, M.H.

    1988-01-01

    The objective of the program presented is to develop a method for the detection and monitoring of crack growth in machine rotors by application of acoustic emission techniques. This program is performed by the R&D Division of Electricite de France, jointly with INSA de Lyon. The first task of the program concerns the characterization of acoustic emission during a progressive tensile test performed on an NCT specimen. The second task deals with the experimentation of acoustic emission techniques for the monitoring of a specimen during cyclic bending tests. The last task of the program concerns the evaluation of acoustic emission techniques for monitoring the integrity of a small rotor during rotating fatigue tests [fr

  17. Development of Uncertainty Quantification Method for MIR-PIV Measurement using BOS Technique

    International Nuclear Information System (INIS)

    Seong, Jee Hyun; Song, Min Seop; Kim, Eung Soo

    2014-01-01

    Matching Index of Refraction (MIR) is frequently used for obtaining high-quality PIV measurement data. Even small distortions caused by an unmatched refractive index of the test section can result in uncertainty problems. In this context, it is desirable to construct a new concept for checking MIR errors and the resulting uncertainty of the PIV measurement. This paper proposes such an experimental concept together with the related results. This study developed an MIR uncertainty quantification method for PIV measurement using the SBOS technique. From the reference data of the BOS, a reliable SBOS experimental procedure was constructed. Then, by combining the SBOS technique with the MIR-PIV technique, the velocity vector and refraction displacement vector fields were measured simultaneously. MIR errors are calculated through a mathematical equation into which the PIV and SBOS data are inserted. These errors are also verified by another BOS experiment. Finally, by applying the calculated MIR-PIV uncertainty, a corrected velocity vector field can be obtained regardless of MIR errors

  18. Modified emission-transmission method for determining trace elements in solid samples using the XRF techniques

    International Nuclear Information System (INIS)

    Poblete, V.; Alvarez, M.; Hermosilla, M.

    2000-01-01

    This is a study of the analysis of trace elements in medium-thick solid samples by the modified emission-transmission method, using the energy-dispersive X-ray fluorescence (EDXRF) technique. Absorption and enhancement effects are the main disadvantages of the EDXRF technique for the quantitative analysis of major and trace elements in solid samples. The implementation of this method and its application to a variety of samples was carried out using an infinitely thick multi-element target, from which the absorption correction factors for all the analytes in the sample are calculated. The discontinuities in the mass absorption coefficients as a function of energy for each element, for medium-thick and homogeneous samples, are analyzed and corrected. The different theoretical and experimental variables are thoroughly analyzed and verified using real samples, including certified material with known concentrations. The simplicity of the calculation method and the results obtained show the method's good precision, with possibilities for the non-destructive routine analysis of different solid samples using the EDXRF technique (author)

  19. Method-independent, Computationally Frugal Convergence Testing for Sensitivity Analysis Techniques

    Science.gov (United States)

    Mai, J.; Tolson, B.

    2017-12-01

    The increasing complexity and runtime of environmental models lead to the current situation in which the calibration of all model parameters, or the estimation of all of their uncertainties, is often computationally infeasible. Hence, techniques to determine the sensitivity of model parameters are used to identify the most important parameters. All subsequent model calibrations or uncertainty estimation procedures then focus only on these subsets of parameters and are hence less computationally demanding. While examining the convergence of calibration and uncertainty methods is state of the art, the convergence of the sensitivity methods is usually not checked. If anything, bootstrapping of the sensitivity results is used to determine the reliability of the estimated indices. Bootstrapping, however, might itself become computationally expensive in the case of large model outputs and a high number of bootstraps. We therefore present a Model Variable Augmentation (MVA) approach to check the convergence of sensitivity indices without performing any additional model run. This technique is method- and model-independent. It can be applied either during the sensitivity analysis (SA) or afterwards. The latter case enables the checking of already processed sensitivity indices. To demonstrate the independence of the convergence test from the SA method, we applied it to two widely used global SA methods: the screening method known as the Morris method or Elementary Effects (Morris 1991) and the variance-based Sobol' method (Sobol' 1993). The new convergence testing method is first scrutinized using 12 analytical benchmark functions (Cuntz & Mai et al. 2015) where the true indices of the aforementioned methods are known. This proof of principle shows that the method reliably determines the uncertainty of the SA results when different budgets are used for the SA. The results show that the new frugal method is able to test the convergence, and therefore the reliability of SA results, in an
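
    For orientation, the sketch below implements the Morris elementary-effects screening named in the abstract on a simple test function, summarising each factor by the mean absolute effect (mu*) and its standard deviation; the MVA convergence check itself is not reproduced, and the trajectory scheme is a simplified assumption.

```python
# Hedged sketch of Morris elementary-effects screening (simplified trajectories);
# the Model Variable Augmentation convergence check of the abstract is not shown.
import numpy as np

def model(x):
    """Simple test function: x1 and x2 matter, x3 is nearly inert."""
    return 5 * x[0] + 2 * x[1] ** 2 + 0.01 * x[2]

def morris_elementary_effects(f, n_params, n_trajectories=50, delta=0.25, seed=0):
    rng = np.random.default_rng(seed)
    effects = [[] for _ in range(n_params)]
    for _ in range(n_trajectories):
        x = rng.uniform(0, 1 - delta, n_params)      # base point in the unit cube
        y = f(x)
        for i in rng.permutation(n_params):          # perturb one factor at a time
            x_new = x.copy()
            x_new[i] += delta
            y_new = f(x_new)
            effects[i].append((y_new - y) / delta)
            x, y = x_new, y_new
    mu_star = np.array([np.mean(np.abs(e)) for e in effects])
    sigma = np.array([np.std(e, ddof=1) for e in effects])
    return mu_star, sigma

mu_star, sigma = morris_elementary_effects(model, n_params=3)
print("mu*:", mu_star.round(3), "| sigma:", sigma.round(3))
```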

  20. Sampling methods and non-destructive examination techniques for large radioactive waste packages

    International Nuclear Information System (INIS)

    Green, T.H.; Smith, D.L.; Burgoyne, K.E.; Maxwell, D.J.; Norris, G.H.; Billington, D.M.; Pipe, R.G.; Smith, J.E.; Inman, C.M.

    1992-01-01

    Progress is reported on work undertaken to evaluate quality checking methods for radioactive wastes. A sampling rig was designed, fabricated and used to develop techniques for the destructive sampling of cemented simulant waste using remotely operated equipment. An engineered system for the containment of cooling water was designed and manufactured and successfully demonstrated with the drum and coring equipment mounted in both vertical and horizontal orientations. The preferred in-cell orientation was found to be with the drum and coring machinery mounted in a horizontal position. Small powdered samples can be taken from cemented homogeneous waste cores using a hollow drill/vacuum section technique with the preferred subsampling technique being to discard the outer 10 mm layer to obtain a representative sample of the cement core. Cement blends can be dissolved using fusion techniques and the resulting solutions are stable to gelling for periods in excess of one year. Although hydrochloric acid and nitric acid are promising solvents for dissolution of cement blends, the resultant solutions tend to form silicic acid gels. An estimate of the beta-emitter content of cemented waste packages can be obtained by a combination of non-destructive and destructive techniques. The errors will probably be in excess of +/-60 % at the 95 % confidence level. Real-time X-ray video-imaging techniques have been used to analyse drums of uncompressed, hand-compressed, in-drum compacted and high-force compacted (i.e. supercompacted) simulant waste. The results have confirmed the applicability of this technique for NDT of low-level waste. 8 refs., 12 figs., 3 tabs

  1. Determination of the optimal method for the field-in-field technique in breast tangential radiotherapy

    International Nuclear Information System (INIS)

    Tanaka, Hidekazu; Hayashi, Shinya; Hoshi, Hiroaki

    2014-01-01

    Several studies have reported the usefulness of the field-in-field (FIF) technique in breast radiotherapy. However, the methods for the FIF technique used in these studies vary. These methods were classified into three categories. We simulated a radiotherapy plan with each method and analyzed the outcomes. In the first method, a pair of subfields was added to each main field: the single pair of subfields method (SSM). In the second method, three pairs of subfields were added to each main field: the multiple pairs of subfields method (MSM). In the third method, subfields were alternately added: the alternate subfields method (ASM). A total of 51 patients were enrolled in this study. The maximum dose to the planning target volume (PTV) (Dmax) and the volumes of the PTV receiving 100% of the prescription dose (V100%) were calculated. The thickness of the breast between the chest wall and skin surface was measured, and patients were divided into two groups according to the median. In the overall series, the average V100% with ASM (60.3%) was significantly higher than with SSM (52.6%) and MSM (48.7%). In the thin breast group as well, the average V100% with ASM (57.3%) and SSM (54.2%) was significantly higher than that with MSM (43.3%). In the thick breast group, the average V100% with ASM (63.4%) was significantly higher than that with SSM (51.0%) and MSM (54.4%). ASM resulted in better dose distribution, regardless of the breast size. Moreover, planning for ASM required a relatively short time. ASM was considered the most preferred method. (author)

  2. Technical errors in complete mouth radiographic survey according to radiographic techniques and film holding methods

    International Nuclear Information System (INIS)

    Choi, Karp Sik; Byun, Chong Soo; Choi, Soon Chul

    1986-01-01

    The purpose of this study was to investigate the numbers and causes of retakes in 300 complete mouth radiographic surveys made by 75 senior dental students. According to radiographic techniques and film holding methods, they were divided into 4 groups: Group I: bisecting-angle technique with the patient's fingers; Group II: bisecting-angle technique with the Rinn Snap-A-Ray device; Group III: bisecting-angle technique with the Rinn XCP instrument (short cone); Group IV: bisecting-angle technique with the Rinn XCP instrument (long cone). The most frequent cause of retakes, the tooth area most frequently retaken, and the average number of retakes per complete mouth survey were evaluated. The results were as follows: Group I: incorrect film placement (47.8%), upper canine region, and 0.89; Group II: incorrect film placement (44.0%), upper canine region, and 1.12; Group III: incorrect film placement (79.2%), upper canine region, and 2.05; Group IV: incorrect film placement (67.7%), upper canine region, and 1.69.

  3. A new method for reducing DNL in nuclear ADCs using an interpolation technique

    International Nuclear Information System (INIS)

    Vaidya, P.P.; Gopalakrishnan, K.R.; Pethe, V.A.; Anjaneyulu, T.

    1986-01-01

    The paper describes a new method for reducing the DNL associated with nuclear ADCs. The method, named the ''interpolation technique'', is utilized to derive the quantisation steps corresponding to the last n bits of the digital code by dividing the quantisation steps due to the more significant bits of the DAC, using a chain of resistors. Using comparators, these quantisation steps are compared with the analog voltage to be digitized, which is applied as a voltage shift at both ends of this chain. The output states of the comparators define the n bit code. The errors due to offset voltages and bias currents of the comparators are statistically neutralized by changing the polarity of the quantisation steps as well as the polarity of the analog voltage (corresponding to the last n bits) for alternate A/D conversions. The effect of averaging on the channel profile can be minimized. A 12 bit ADC was constructed using this technique which gives a DNL of less than ±1% over most of the channels for a conversion time of nearly 4.5 μs. Gatti's sliding scale technique can be implemented for further reduction of the DNL. The interpolation technique has promising potential for improving the resolution of existing 12 bit ADCs to 16 bit, without degrading the percentage DNL significantly. (orig.)
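
    For readers unfamiliar with how a DNL figure such as the quoted ±1% is obtained, the sketch below shows a standard code-density (histogram) estimate of DNL from recorded output codes; it assumes a uniformly distributed full-scale input and a synthetic ideal 12-bit converter, and does not model the interpolation circuit itself.

    ```python
    # Minimal sketch of a code-density (histogram) DNL estimate for an ADC, assuming a
    # uniformly distributed full-scale input; the synthetic converter below is ideal.
    import numpy as np

    def dnl_from_codes(codes, n_bits=12):
        hist = np.bincount(codes, minlength=2 ** n_bits).astype(float)
        core = hist[1:-1]                 # drop the two end bins (over/under-range hits)
        return core / core.mean() - 1.0   # deviation of each step width from the ideal LSB

    rng = np.random.default_rng(1)
    samples = rng.uniform(0.0, 1.0, 2_000_000)
    codes = np.clip((samples * 4096).astype(int), 0, 4095)
    dnl = dnl_from_codes(codes)
    print(f"worst-case |DNL| = {np.abs(dnl).max() * 100:.2f} % of one LSB")
    ```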

  4. Self-cleaning Foliar Surfaces Characterization using RIMAPS Technique and Variogram Method

    International Nuclear Information System (INIS)

    Rosi, Pablo E.

    2002-01-01

    Over the last ten years many important studies on the characterization of self-cleaning foliar surfaces have been carried out and have focused new interest on this kind of surface. These studies were possible due to the development of a novel preparation technique for this biological material that allows the delicate structures of a foliar surface to be observed under the scanning electron microscope (S.E.M.). This technique consists of replacing the natural water of the specimen by glycerol. Digital S.E.M. images of both self-cleaning and non-self-cleaning foliar surfaces were obtained and analyzed using the RIMAPS technique and the Variogram method. Our results revealed the existence of a common and exclusive geometrical pattern that is found in species which present self-cleaning foliar surfaces. This pattern combines at least nine different directions. The results from the Variogram method showed that the stomata play a key role in determining foliar surface roughness. In addition, spectra from the RIMAPS technique constitute a fingerprint of a foliar surface, so they can be used to find evolutionary relationships among species. Further studies will provide more detailed information to fully elucidate the self-cleaning pattern, so it might be possible to reproduce it on an artificial surface and make it self-cleaning

  5. A New Image Encryption Technique Combining Hill Cipher Method, Morse Code and Least Significant Bit Algorithm

    Science.gov (United States)

    Nofriansyah, Dicky; Defit, Sarjon; Nurcahyo, Gunadi W.; Ganefri, G.; Ridwan, R.; Saleh Ahmar, Ansari; Rahim, Robbi

    2018-01-01

    Cybercrime is one of the most serious threats. One of the efforts made to reduce cybercrime is to find new techniques for securing data, such as combinations of cryptography, steganography and watermarking. Cryptography and steganography are growing data security sciences, and their combination is one way to improve data integrity. New techniques are created by combining several algorithms, one of which is the incorporation of the Hill cipher method and Morse code. Morse code is one of the communication codes used in the Scouting field; it consists of dots and dashes. This is a new concept that combines modern and classical approaches to maintaining data integrity. The result of the combination of these three methods is expected to generate new algorithms to improve the security of the data, especially images.
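
    As a rough illustration of two of the three building blocks named above, the sketch below applies a 2×2 Hill cipher and then embeds the ciphertext in the least significant bits of an image array. The key matrix, message and cover image are illustrative assumptions, and the Morse-code stage of the proposed scheme is omitted.

    ```python
    # Sketch of a 2x2 Hill cipher followed by LSB embedding into an image array.
    # Key, message and cover image are illustrative; the Morse-code stage is omitted.
    import numpy as np

    KEY = np.array([[3, 3], [2, 5]])   # invertible mod 26 (det = 9, gcd(9, 26) = 1)

    def hill_encrypt(text, key=KEY):
        nums = [ord(c) - ord('A') for c in text.upper() if c.isalpha()]
        if len(nums) % 2:                      # pad to an even length
            nums.append(ord('X') - ord('A'))
        pairs = np.array(nums).reshape(-1, 2).T
        cipher = (key @ pairs) % 26
        return ''.join(chr(int(v) + ord('A')) for v in cipher.T.ravel())

    def lsb_embed(cover, text):
        bits = np.unpackbits(np.frombuffer(text.encode('ascii'), dtype=np.uint8))
        flat = cover.ravel().copy()
        if bits.size > flat.size:
            raise ValueError("cover image too small for the message")
        flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite the LSB plane
        return flat.reshape(cover.shape)

    cover = np.random.default_rng(2).integers(0, 256, size=(64, 64), dtype=np.uint8)
    stego = lsb_embed(cover, hill_encrypt("SECRETMESSAGE"))
    print("pixels changed:", int(np.count_nonzero(cover != stego)))
    ```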

  6. ICF implosion hotspot ion temperature diagnostic techniques based on neutron time-of-flight method

    International Nuclear Information System (INIS)

    Tang Qi; Song Zifeng; Chen Jiabin; Zhan Xiayu

    2013-01-01

    Ion temperature of the implosion hotspot is a very important parameter for inertial confinement fusion. It reflects the energy level of the hotspot, and it is very sensitive to implosion symmetry and implosion speed. ICF implosion hotspot ion temperature diagnostic techniques based on the neutron time-of-flight method are described. A neutron TOF spectrometer was developed using an ultrafast plastic scintillator as the neutron detector. The time response of the spectrometer has an FWHM of 1.1 ns and a rise time of 0.5 ns. A TOF spectrum resolving method based on deconvolution and low-pass filtering is described. Implosion hotspot ion temperatures under low-neutron-yield, low-ion-temperature conditions at the Shenguang-III facility were obtained using these diagnostic techniques. (authors)

  7. Application of classic engineering techniques (value engineering and observational method) at the Weldon Spring Quarry

    International Nuclear Information System (INIS)

    Ferguson, R.D.; Valett, G.L.

    1991-01-01

    In July of 1987 the Weldon Spring quarry was listed on the Environmental Protection Agency National Priority List as the highest priority Federal facility site. The Weldon Spring Site Remedial Action Project applied the principles and techniques of Value Engineering (VE) and the Observational Method to remedial planning efforts at the quarry. VE sessions resulted in modifications of the scenarios developed during the Remedial Investigation/Feasibility Study (RI/FS) process in preparation for conceptual design activities for the removal of waste from the quarry. The Observational Method, a technique developed to manage uncertainties, was used to guide both environmental and engineering planning to ensure that the waste removal activities will be carried out in a safe and environmentally responsible manner

  8. A Novel Technique for Steganography Method Based on Improved Genetic Algorithm Optimization in Spatial Domain

    Directory of Open Access Journals (Sweden)

    M. Soleimanpour-moghadam

    2013-06-01

    Full Text Available This paper devotes itself to the study of secret message delivery using a cover image and introduces a novel steganographic technique based on a genetic algorithm to find a near-optimum structure for the pair-wise least-significant-bit (LSB) matching scheme. A survey of the related literature shows that the LSB matching method developed by Mielikainen employs a binary function to reduce the number of changes of LSB values. This method verifiably reduces the probability of detection and also improves the visual quality of stego images. Our proposal therefore draws on Mielikainen's technique to present an enhanced dual-state scoring model, structured upon a genetic algorithm, which assesses the performance of different orders for LSB matching and searches for a near-optimum solution among all the permutation orders. Experimental results confirm the superiority of the new approach compared to Mielikainen's pair-wise LSB matching scheme.
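
    The sketch below illustrates the pair-wise LSB matching rule attributed to Mielikainen that the paper builds on: two message bits are carried by a pixel pair with at most one ±1 change, using a binary function f. Boundary pixel values (0 and 255) and the genetic-algorithm ordering search itself are not handled here.

    ```python
    # Sketch of pair-wise LSB matching: two bits per pixel pair, at most one +-1 change.
    # Boundary values (0, 255) and the GA ordering search of the paper are not handled.
    def f(a, b):
        return ((a // 2) + b) & 1          # binary function used for the second bit

    def embed_pair(x1, x2, m1, m2):
        """Return a modified pixel pair carrying the message bits (m1, m2)."""
        if (x1 & 1) == m1:
            if f(x1, x2) != m2:
                x2 += 1                    # +1 or -1 both flip f(x1, x2)
        else:
            x1 = x1 - 1 if f(x1 - 1, x2) == m2 else x1 + 1
        return x1, x2

    def extract_pair(x1, x2):
        return x1 & 1, f(x1, x2)

    # round-trip check on an illustrative pixel pair
    assert extract_pair(*embed_pair(120, 37, 1, 0)) == (1, 0)
    ```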

  9. Detection of Cavities by Inverse Heat Conduction Boundary Element Method Using Minimal Energy Technique

    International Nuclear Information System (INIS)

    Choi, C. Y.

    1997-01-01

    A geometrical inverse heat conduction problem is solved for the infrared scanning cavity detection by the boundary element method using minimal energy technique. By minimizing the kinetic energy of temperature field, boundary element equations are converted to the quadratic programming problem. A hypothetical inner boundary is defined such that the actual cavity is located interior to the domain. Temperatures at hypothetical inner boundary are determined to meet the constraints of measurement error of surface temperature obtained by infrared scanning, and then boundary element analysis is performed for the position of an unknown boundary (cavity). Cavity detection algorithm is provided, and the effects of minimal energy technique on the inverse solution method are investigated by means of numerical analysis

  10. Data-driven remaining useful life prognosis techniques stochastic models, methods and applications

    CERN Document Server

    Si, Xiao-Sheng; Hu, Chang-Hua

    2017-01-01

    This book introduces data-driven remaining useful life prognosis techniques, and shows how to utilize the condition monitoring data to predict the remaining useful life of stochastic degrading systems and to schedule maintenance and logistics plans. It is also the first book that describes the basic data-driven remaining useful life prognosis theory systematically and in detail. The emphasis of the book is on the stochastic models, methods and applications employed in remaining useful life prognosis. It includes a wealth of degradation monitoring experiment data, practical prognosis methods for remaining useful life in various cases, and a series of applications incorporated into prognostic information in decision-making, such as maintenance-related decisions and ordering spare parts. It also highlights the latest advances in data-driven remaining useful life prognosis techniques, especially in the contexts of adaptive prognosis for linear stochastic degrading systems, nonlinear degradation modeling based pro...

  11. A Conservative Method for Treating Severely Displaced Pediatric Mandibular Fractures: An Effective Alternative Technique

    OpenAIRE

    Sahand Samieirad; Saeedeh khajehahmadi; Elahe Tohidi; Meysam Pakravan

    2016-01-01

    Pediatric mandibular fractures have been successfully managed in various ways. The use of a lingual splint is an option. This article presents a 4-year-old boy who was treated by an alternative conservative method with a combination of an arch bar plus a lingual splint, circum-mandibular wiring and IMF for the reduction, stabilization and fixation of a severely displaced bilateral mandibular body fracture. This technique is a reliable, noninvasive procedure; it also limits the discomfort and...

  12. Nuclear method for determination of nitrogen depth distributions in single seeds. [14N tracer technique]

    Energy Technology Data Exchange (ETDEWEB)

    Sundqvist, B; Gonczi, L; Koersner, I; Bergman, R; Lindh, U

    1974-01-01

    (d,p) reactions in 14N were used for probing single kernels of seed for nitrogen content and nitrogen depth distributions. Comparison with the Kjeldahl method was made on individual peas and beans. The results were found to be strongly correlated. The technique to obtain depth distributions of nitrogen was also used on high- and low-lysine varieties of barley, for which large differences in nitrogen distributions were found.

  13. Epigenetic Tracking, a Method to Generate Arbitrary Shapes By Using Evolutionary-Developmental Techniques

    OpenAIRE

    Fontana, Alessandro

    2008-01-01

    This paper describes an Artificial Embryology method (called ``Epigenetic Tracking'') to generate predefined, arbitrarily shaped 2-dimensional arrays of cells by means of evolutionary techniques. It is based on a model of development, whose key features are: i) the distinction between ``normal'' and ``driver'' cells, the latter being able to receive guidance from the genome, ii) the implementation of the proliferation/apoptosis events in such a way that many cells are created/deleted at once, ...

  14. Method and apparatus for optimizing operation of a power generating plant using artificial intelligence techniques

    Science.gov (United States)

    Wroblewski, David [Mentor, OH; Katrompas, Alexander M [Concord, OH; Parikh, Neel J [Richmond Heights, OH

    2009-09-01

    A method and apparatus for optimizing the operation of a power generating plant using artificial intelligence techniques. One or more decisions D are determined for at least one consecutive time increment, where at least one of the decisions D is associated with a discrete variable for the operation of a power plant device in the power generating plant. In an illustrated embodiment, the power plant device is a soot cleaning device associated with a boiler.

  15. PSA methodology

    Energy Technology Data Exchange (ETDEWEB)

    Magne, L

    1997-12-31

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs.

  16. PSA methodology

    International Nuclear Information System (INIS)

    Magne, L.

    1996-01-01

    The purpose of this text is first to ask a certain number of questions on the methods related to PSAs. Notably we will explore the positioning of the French methodological approach - as applied in the EPS 1300 and EPS 900 PSAs - compared to other approaches (Part One). This reflection leads to more general reflection: what contents, for what PSA? This is why, in Part Two, we will try to offer a framework for definition of the criteria a PSA should satisfy to meet the clearly identified needs. Finally, Part Three will quickly summarize the questions approached in the first two parts, as an introduction to the debate. 15 refs

  17. Simple analytical technique for liquid scintillation counting of environmental carbon-14 using gel suspension method

    International Nuclear Information System (INIS)

    Okai, Tomio; Wakabayashi, Genichiro; Nagao, Kenjiro; Matoba, Masaru; Ohura, Hirotaka; Momoshima, Noriyuki; Kawamura, Hidehisa

    2000-01-01

    A simple analytical technique for liquid scintillation counting of environmental 14C was developed. A commercially available gelling agent, N-lauroyl-L-glutamic-α,γ-dibutylamide, was used for the gel formation of the samples (gel suspension method) and for the subsequent liquid scintillation counting of 14C in the form of CaCO3. Our procedure for sample preparation is much simpler than that of the conventional methods and requires no special equipment. Self-absorption, stability and reproducibility of gel suspension samples were investigated in order to evaluate the characteristics of the gel suspension method for 14C activity measurement. The self-absorption factor is about 70% and decreases slightly as the CaCO3 weight increases. This is considered to be mainly due to the absorption of β-rays and scintillation light by the CaCO3 sample itself. No change of the counting rate for the gel suspension sample was observed for more than 2 years after the sample preparation. Four samples were used for checking the reproducibility of the sample preparation method. The same values were obtained for the counting rate of 14C activity within the counting error. No change of the counting rate was observed for the 're-gelated' sample. These results show that the gel suspension method is appropriate for 14C activity measurement by the liquid scintillation counting method and useful for long-term preservation of the sample for repeated measurement. The above analytical technique was applied to actual environmental samples in Fukuoka prefecture, Japan. The results obtained were comparable with those of other researchers and appear to be reasonable. Therefore, the newly developed technique is useful for the routine monitoring of environmental 14C. (author)

  18. Sustainable cotton production and water economy through different planting methods and mulching techniques

    International Nuclear Information System (INIS)

    Nasrullah, H.M.; Khan, M.B.; Ahmad, R.; Ahmad, S.; Hanif, M.; Nazeer, W

    2011-01-01

    Planting methods and mulching techniques are important factors which affect crop growth, development and yield by conserving soil and plant moisture. A multifactorial experiment was conducted to study the water economy involving different planting methods and mulching techniques in cotton (Gossypium hirsutum L.) for two consecutive years (2004 and 2005) at the Agronomic Research Station, Khanewal. Two moisture stress tolerant cotton varieties (CIM-473 and CIM-499) were planted using four different planting methods, i.e. 70 cm spaced single row planting, 105 cm spaced double row strip planting, 70 cm spaced ridge planting and 140 cm spaced furrow beds (or bed and furrows), along with four mulching practices, i.e. cultural, straw, sheet and chemical, for their individual and interactive effects on various parameters including water use efficiency. Positive interactive effects of the furrow bed planting method (140 cm spaced) with plastic sheet/film mulching were observed for all the parameters, i.e., highest seed cotton yield (3009 and 3332 kg ha⁻¹), maximum water saving (up to 25.62% and 26.53%), highest water use efficiency of up to 5.04 and 4.79 [μmol (CO2)/mmol (H2O)], and highest net income (Rs. 27224.2 and 50927.7 ha⁻¹) with a cost-benefit ratio of 1.64 and 2.20, followed by a net income of Rs. 27382.2 and 47244.5 ha⁻¹ with 1.64 and 2.10 cost-benefit ratios in the case of plastic mulch, and 2814 and 3007 kg ha⁻¹ in the ridge planting method, during 2004 and 2005, respectively. It is concluded that cotton can be grown using the bed and furrow planting method with the plastic sheet/film mulching technique for sustainable cotton production and better water economy. (author)

  19. Novel method based on Fricke gel dosimeters for dose verification in IMRT techniques

    International Nuclear Information System (INIS)

    Aon, E.; Brunetto, M.; Sansogne, R.; Castellano, G.; Valente, M.

    2008-01-01

    Modern radiotherapy is becoming increasingly complex. Conformal and intensity modulated (IMRT) techniques are nowadays available for achieving better tumour control. However, accurate methods for 3D dose verification for these modern irradiation techniques have not been adequately established yet. Fricke gel dosimeters consist, essentially, in a ferrous sulphate (Fricke) solution fixed to a gel matrix, which enables spatial resolution. A suitable radiochromic marker (xylenol orange) is added to the solution in order to produce radiochromic changes within the visible spectrum range, due to the chemical internal conversion (oxidation) of ferrous ions to ferric ions. In addition, xylenol orange has proved to slow down the internal diffusion effect of ferric ions. These dosimeters suitably shaped in form of thin layers and optically analyzed by means of visible light transmission imaging have recently been proposed as a method for 3D absorbed dose distribution determinations in radiotherapy, and tested in several IMRT applications employing a homogeneous plane (visible light) illuminator and a CCD camera with a monochromatic filter for sample analysis by means of transmittance images. In this work, the performance of an alternative read-out method is characterized, consisting on visible light images, acquired before and after irradiation by means of a commercially available flatbed-like scanner. Registered images are suitably converted to matrices and analyzed by means of dedicated 'in-house' software. The integral developed method allows performing 1D (profiles), 2D (surfaces) and 3D (volumes) dose mapping. In addition, quantitative comparisons have been performed by means of the Gamma composite criteria. Dose distribution comparisons between Fricke gel dosimeters and traditional standard dosimetric techniques for IMRT irradiations show an overall good agreement, supporting the suitability of the method. The agreement, quantified by the gamma index (that seldom

  20. A new method used in laparoscopic hysterectomy for uterine manipulation: uterine rein technique.

    Science.gov (United States)

    Boztosun, Abdullah; Atılgan, Remzi; Pala, Şehmus; Olgan, Şafak

    2018-03-22

    The aim of this study is to define a new method of manipulating the uterus during laparoscopic hysterectomy. A total laparoscopic hysterectomy (TLH) with the newly defined technique was performed in 29 patients between July 2016 and July 2017. In this new technique, the uterus was bound around the uterine corpus and fundus like a bridle with polyester tape, to allow abdominal manipulation. The technique was successfully performed at the first attempt in 93.1% of cases. It was repeated in two cases (6.9%) since the polyester tape slipped off the uterus at the first attempt. The mean application time was 11.2 min. A vaginal manipulator was not required in any of the cases. There were no intraoperative complications. In conclusion, this method has the advantages of not requiring any vaginal manipulator, reducing the number of people required during the operation, permitting near maximum manipulation of the uterus in all three dimensions, and giving the control of these manipulations directly to the surgeon. On the other hand, the technique also has some inadequacies which should be discussed and improved on in the future. Impact statement What is already known on this subject? In a laparoscopic hysterectomy, manipulation of the uterus is essential for anatomical dissection of the regions and completion of the operation without complications. An ideal uterine manipulator is inexpensive, convenient, fast and suitable for injecting solutions, removes the need for an assistant and, most importantly, offers the most suitable range of motion. In this study, we describe a new and different technique (rein technique) allowing the abdominal manipulation of the uterus in a laparoscopic hysterectomy and discuss the advantages and disadvantages of this method. What do the results of this study add? The procedure was easily accomplished in most patients. We did not need to use an extra uterine manipulator in any of the cases. What are the implications of these

  1. A Method for Dynamically Selecting the Best Frequency Hopping Technique in Industrial Wireless Sensor Network Applications.

    Science.gov (United States)

    Fernández de Gorostiza, Erlantz; Berzosa, Jorge; Mabe, Jon; Cortiñas, Roberto

    2018-02-23

    Industrial wireless applications often share the communication channel with other wireless technologies and communication protocols. This coexistence produces interferences and transmission errors which require appropriate mechanisms to manage retransmissions. Nevertheless, these mechanisms increase the network latency and overhead due to the retransmissions. Thus, the loss of data packets and the measures to handle them produce an undesirable drop in the QoS and hinder the overall robustness and energy efficiency of the network. Interference avoidance mechanisms, such as frequency hopping techniques, reduce the need for retransmissions due to interferences but they are often tailored to specific scenarios and are not easily adapted to other use cases. On the other hand, the total absence of interference avoidance mechanisms introduces a security risk because the communication channel may be intentionally attacked and interfered with to hinder or totally block it. In this paper we propose a method for supporting the design of communication solutions under dynamic channel interference conditions and we implement dynamic management policies for frequency hopping technique and channel selection at runtime. The method considers several standard frequency hopping techniques and quality metrics, and the quality and status of the available frequency channels to propose the best combined solution to minimize the side effects of interferences. A simulation tool has been developed and used in this work to validate the method.
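
    A much simplified illustration of the kind of runtime channel management described above is sketched below: the available channels are ranked by a quality metric (here an assumed mix of packet error rate and RSSI) and a hop sequence is built from the best ones. The weights, the scoring and the sequence generator are assumptions made for illustration, not the authors' policy engine.

    ```python
    # Simplified sketch: rank channels by an assumed quality metric and build a hop set.
    import random

    def rank_channels(stats, w_per=0.7, w_rssi=0.3):
        # stats: {channel: {"per": packet error rate in [0, 1], "rssi": dBm}}
        def score(s):
            rssi_norm = (s["rssi"] + 100) / 70.0      # crude mapping of [-100, -30] dBm to [0, 1]
            return w_per * (1.0 - s["per"]) + w_rssi * rssi_norm
        return sorted(stats, key=lambda ch: score(stats[ch]), reverse=True)

    def build_hop_sequence(stats, n_channels=4, length=16, seed=42):
        hop_set = rank_channels(stats)[:n_channels]   # keep only the best channels
        rng = random.Random(seed)                     # shared seed keeps nodes in step
        return [rng.choice(hop_set) for _ in range(length)]

    stats = {11: {"per": 0.02, "rssi": -55}, 15: {"per": 0.30, "rssi": -70},
             20: {"per": 0.05, "rssi": -60}, 25: {"per": 0.50, "rssi": -80},
             26: {"per": 0.01, "rssi": -52}}
    print(build_hop_sequence(stats))
    ```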

  2. Validation and application of the methodology for analysis of radon concentration in the air through the technique of solid state nuclear track detectors (SSNTD)

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Caroline de [Pontificia Universidade Catolica de Minas Gerais (PUC-Pocos), Pocos de Caldas, MG (Brazil); Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Lab. de Pocos de Caldas; Silva, Nivaldo Carlos da, E-mail: ncsilva@cnen.gov.b [Comissao Nacional de Energia Nuclear (LAPOC/CNEN), Pocos de Caldas, MG (Brazil). Lab. de Pocos de Caldas

    2011-07-01

    Radon is a radioactive noble gas that occurs naturally in soil and can enter residential buildings. The decay products of radon are radioactive metals which, when inhaled, can be retained in the respiratory system, leading to an internal dose of radiation. The monitoring of radon levels in residences and workplaces is extremely important, since high concentrations of this gas can cause serious public health problems. This study analyzed the concentration of radon in the air in 94 work environments at the Laboratory of Pocos de Caldas - LAPOC/CNEN, including laboratories, administrative rooms, workshop, warehouse and guardhouse. The method employed in the monitoring was the technique of solid state nuclear track detectors, known as SSNTD. For calibration and validation of this method, controlled experiments were conducted in the laboratory with specific instrumentation. The monitoring results indicated that most environments present radon concentrations above 100 Bq m⁻³, which is the reference level recommended by the World Health Organization. (author)

  3. Validation and application of the methodology for analysis of radon concentration in the air through the technique of solid state nuclear track detectors (SSNTD)

    International Nuclear Information System (INIS)

    Carvalho, Caroline de; Comissao Nacional de Energia Nuclear; Silva, Nivaldo Carlos da

    2011-01-01

    Radon is a radioactive noble gas that occurs naturally in soil and can enter residential buildings. The decay products of radon are radioactive metals which, when inhaled, can be retained in the respiratory system, leading to an internal dose of radiation. The monitoring of radon levels in residences and workplaces is extremely important, since high concentrations of this gas can cause serious public health problems. This study analyzed the concentration of radon in the air in 94 work environments at the Laboratory of Pocos de Caldas - LAPOC/CNEN, including laboratories, administrative rooms, workshop, warehouse and guardhouse. The method employed in the monitoring was the technique of solid state nuclear track detectors, known as SSNTD. For calibration and validation of this method, controlled experiments were conducted in the laboratory with specific instrumentation. The monitoring results indicated that most environments present radon concentrations above 100 Bq m⁻³, which is the reference level recommended by the World Health Organization. (author)

  4. A study on in-situ measuring method and modeling technique of an unsaturated zone

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Hisashi [Hazama Corp., Tsukuba, Ibaraki (Japan). Technical Research Inst.; Amemiya, Kiyoshi; Nishida, Kaoru; Lin, Weiren; Lei, Xinglin

    1997-03-01

    It is generally considered that an unsaturated zone is generated in the vicinity of a drift after excavation. In such a zone, the invasion of air containing oxygen may change the geochemical environment (redox conditions) of the rock mass. However, no measurement technique for quantitative understanding of this unsaturated zone is currently available. This study was therefore started several years ago to develop such a measuring method. This year, fundamental information was obtained through analysis, laboratory experiments using homogeneous rock samples and the field measurements described below: (1) experiments on the mechanism of undersaturation in rock; (2) experiments on methods for measuring the extent of the unsaturated zone. (author)

  5. METHODS AND TECHNIQUES OF TEACHING CHILDREN OF PRE-SCHOOL AGE THE SPORTS DANCE

    Directory of Open Access Journals (Sweden)

    VARNACOVA ELEONORA

    2016-12-01

    Full Text Available The article presents some conclusions about the influence of sports dance on the multilateral development of a preschool child's personality. The author presents the methods and techniques used in the didactic activity of teaching and learning sports dance in groups of children of pre-school age. At the same time, she outlines how she organizes her classes systematically and the fact that the teacher is free to choose when to deliver theoretical courses or hold practical classes, taking into consideration the children's individual abilities.

  6. Effective classroom teaching methods: a critical incident technique from millennial nursing students' perspective.

    Science.gov (United States)

    Robb, Meigan

    2014-01-11

    Engaging nursing students in the classroom environment positively influences their ability to learn and apply course content to clinical practice. Students are motivated to engage in learning if their learning preferences are being met. The methods nurse educators have used with previous students in the classroom may not address the educational needs of Millennials. This manuscript presents the findings of a pilot study that used the Critical Incident Technique. The purpose of this study was to gain insight into the teaching methods that help the Millennial generation of nursing students feel engaged in the learning process. Students' perceptions of effective instructional approaches are presented in three themes. Implications for nurse educators are discussed.

  7. SEU mitigation technique by Dynamic Reconfiguration method in FPGA based DSP application

    International Nuclear Information System (INIS)

    Dey, Madhusudan; Singh, Abhishek; Roy, Amitava

    2012-01-01

    Field Programmable Gate Arrays (FPGAs), SRAM-based configurable devices meant for the implementation of any digital circuit, are susceptible to malfunction in harsh radiation environments. Radiation causes corruption of the configuration memory of the FPGA, and the digital circuits start malfunctioning. There is a need to restore the system as early as possible. This paper discusses one such technique, named the dynamic partial reconfiguration (DPR) method. This paper also touches upon signal processing by the DPR method. The framework, consisting of ADC, DAC and ICAP controllers designed using dedicated state machines, was used to study the best achievable downtime and to verify the performance of digital filters for signal processing

  8. An efficient search method for finding the critical slip surface using the compositional Monte Carlo technique

    International Nuclear Information System (INIS)

    Goshtasbi, K.; Ahmadi, M; Naeimi, Y.

    2008-01-01

    Locating the critical slip surface and the associated minimum factor of safety are two complementary parts of a slope stability analysis. A large number of computer programs exist to solve slope stability problems. Most of these programs, however, have used inefficient and unreliable search procedures to locate the global minimum factor of safety. This paper presents an efficient and reliable method to determine the global minimum factor of safety, coupled with a modified version of the Monte Carlo technique. Examples are presented to illustrate the reliability of the proposed method
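
    A bare-bones sketch of the Monte Carlo search idea is given below: circular trial slip surfaces are drawn at random and the smallest factor of safety found is kept. The factor_of_safety() function is only a placeholder for a limit-equilibrium evaluation (e.g. a method-of-slices calculation), and the parameter ranges are illustrative assumptions.

    ```python
    # Sketch of a Monte Carlo search over circular slip surfaces; the factor-of-safety
    # evaluation below is a placeholder for a real limit-equilibrium calculation.
    import random

    def factor_of_safety(xc, yc, radius):
        # placeholder surrogate: in practice this would run a method-of-slices
        # calculation for the circle (xc, yc, radius) against slope geometry and soil data
        return 1.5 + 0.002 * ((xc - 12.0) ** 2 + (yc - 18.0) ** 2) + 0.01 * abs(radius - 15.0)

    def monte_carlo_search(n_trials=20000, seed=7):
        rng = random.Random(seed)
        best = (float("inf"), None)
        for _ in range(n_trials):
            xc = rng.uniform(0.0, 30.0)       # circle centre, m
            yc = rng.uniform(10.0, 40.0)
            r = rng.uniform(5.0, 30.0)        # circle radius, m
            fos = factor_of_safety(xc, yc, r)
            if fos < best[0]:
                best = (fos, (xc, yc, r))
        return best

    fos_min, surface = monte_carlo_search()
    print(f"minimum FoS ~ {fos_min:.3f} for circle {surface}")
    ```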

  9. A new method for measurement of femoral anteversion using 3D imaging technique

    International Nuclear Information System (INIS)

    Kim, S.I.; Lee, Y.H.; Park, S.-B.; Lee, K.-M.

    1996-01-01

    Conventional methods that use cross-sectional computed tomography (CT) images to estimate femoral anteversion have several problems because of the complex 3-dimensional structure of the femur. These are the ambiguity of defining the femoral neck axis and condylar line, and the dependence on patient positioning. In particular, the femoral neck axis, which is known to be a major source of error, is hard to determine from a single or multiple 2-dimensional transverse CT images. In this study, we present a new method that we have devised for the measurement of femoral anteversion by utilizing a 3-dimensional imaging technique. Poster 176. (author)

  10. Search method optimization technique for thermal design of high power RFQ structure

    International Nuclear Information System (INIS)

    Sharma, N.K.; Joshi, S.C.

    2009-01-01

    RRCAT has taken up the development of a 3 MeV RFQ structure for the low energy part of a 100 MeV H⁻ ion injector linac. The RFQ is a precision-machined resonating structure designed for a high rf duty factor. RFQ structural stability during high rf power operation is an important design issue. The thermal analysis of the RFQ has been performed using the ANSYS finite element analysis software, and optimization of various parameters is attempted using the Search Method optimization technique. It is an effective optimization technique for systems governed by a large number of independent variables. The method involves examining a number of combinations of values of the independent variables and drawing conclusions from the magnitude of the objective function at these combinations. In these methods there is a continuous improvement in the objective function throughout the course of the search, and hence these methods are very efficient. The method has been employed in the optimization of various parameters (called independent variables) of the RFQ, like cooling water flow rate, cooling water inlet temperatures, cavity thickness etc., involved in the RFQ thermal design. The temperature rise within the RFQ structure is the objective function during the thermal design. Using the ANSYS Parametric Design Language (APDL), various iterative programs were written and the analyses performed to minimize the objective function. The dependency of the objective function on the various independent variables is established and the optimum values of the parameters are evaluated. The results of the analysis are presented in the paper. (author)
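
    The search idea described above can be pictured with the toy sketch below, which examines combinations of the independent variables and keeps the one minimising the objective (temperature rise). The temperature_rise() surrogate stands in for an ANSYS/APDL run and, like the parameter grids, is an assumption made purely for illustration.

    ```python
    # Toy sketch of a search over combinations of independent variables; the surrogate
    # objective below stands in for a finite-element thermal run.
    import itertools

    def temperature_rise(flow_lpm, inlet_c, wall_mm):
        # placeholder surrogate: cooler water and higher flow reduce the rise,
        # a thicker cavity wall increases the conduction path
        return 25.0 / flow_lpm + 0.4 * (inlet_c - 20.0) + 0.15 * wall_mm

    flows = [10, 20, 30, 40]          # L/min
    inlets = [20, 25, 30]             # deg C
    walls = [6, 8, 10]                # mm

    best = min(itertools.product(flows, inlets, walls),
               key=lambda p: temperature_rise(*p))
    print("best combination (flow, inlet T, wall):", best,
          "-> rise =", round(temperature_rise(*best), 2), "deg C")
    ```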

  11. Evaluation of a Delphi technique based expert judgement method for LCA valuation - DELPHI II

    International Nuclear Information System (INIS)

    Hakala, S.; Wilson, B.

    1999-01-01

    Transparency and certainty are essential qualities for an acceptable and trusted valuation method. Based on the evaluation of the expert judgement method developed in the Delphi I study both of these criteria may be only partially accomplished by such a method. As for the technical procedure the method is well documented and transparency is good. Argumentation of the judgements, however, should be increased. The quality of the valuation indexes is explicitly available, but their certainty is very low for most interventions. The opinions of the experts differ much from each other. How much this depends on different values and how much on differences in knowledge etc. is impossible to assess. Also, how much the technique used and the statistical handling of the expert answers may have impacted the eventual scores of different interventions is difficult to assess. However, application of the expert judgement by means of the Delphi-technique to LCA valuation is a new idea, and, consequently, the method is still very much under development, far from maturity. This should be taken into account when considering the results out of the evaluation of the case study, which was the third of the kind in Europe

  12. 40Ar-39Ar method for age estimation: principles, technique and application in orogenic regions

    International Nuclear Information System (INIS)

    Dalmejer, R.

    1984-01-01

    A variant of the K-Ar method for age estimation, the recently developed 40Ar/39Ar method, is described. This method does not require direct analysis of potassium; its content is calculated as a function of 39Ar, which is formed from 39K under neutron activation. Errors resulting from the interactions of potassium and calcium nuclei with neutrons are considered. Attention is paid to the stepwise heating technique used in the 40Ar-39Ar method and to obtaining the age spectrum. The applicability of the isochron diagram is discussed for the case where excess argon is present in a sample. Examples of the application of the 40Ar-39Ar method for dating events in orogenic regions are presented

  13. New Technique Of Determination Of Biogenic Fraction In Liquid Fuels By The 14C Method

    International Nuclear Information System (INIS)

    Krajcar Bronic, I.; Baresic, J.; Horvatincic, N.; Kristof, R.; Kozar Logar, J.

    2015-01-01

    According to the EU Directive 2009/28/EC, all (liquid) fuels have to contain at least 10 percent of bio-fuel, i.e., a blend of biogenic origin, by 2020. The 14C method is the most reliable method for determination of the biogenic fraction in fuels, and various measurement techniques can be applied. A technique of direct measurement of the 14C content in liquid fuel is simple and fast but has a main disadvantage: different liquid colours cause different quenching and changes in the measurement efficiency. Here we describe a new technique that uses liquids of different colours to construct modern and background calibration curves, MCC and BCC, respectively, by measuring count rates and SQP values of various modern and fossil liquids. Several types of fossil fuel, pure benzine and benzene (used as 14C-free background for 14C dating) were used for the BCC, and various brands of domestic oil (vegetable, sunflower, olive, pumpkin), bioethanol and benzene prepared from modern samples were used for the MCC construction. The procedure for an unknown sample consists of: 1) measurement of the count rate and the SQP value, 2) calculation of the background and modern count rates corresponding to the measured SQP value based on the BCC and MCC curves, respectively, and 3) the ratio of the net count rate of the unknown sample and the modern net count rate at the same SQP, which represents the fraction of the biogenic component in the liquid. All samples should be measured under the same conditions. In our case these are: UltimaGold F scintillator, a sample:scintillator ratio of 10 mL:10 mL, low-potassium glass vials of 20 mL volume, spectra recorded by an LSC Quantulus and evaluated in the window 124 - 570. The lowest detectable biogenic fraction is 0.5%. The technique depends neither on the fossil matrix nor on the biogenic additive type. The results are in good agreement with those obtained by a different evaluation technique. (author).
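
    The three-step evaluation described above can be sketched as follows: interpolate the background (BCC) and modern (MCC) calibration curves at the measured SQP value and take the ratio of net count rates. The calibration points and the measured values in the sketch are made-up numbers for illustration only.

    ```python
    # Sketch of the three-step evaluation: interpolate BCC and MCC at the measured SQP
    # and form the net-count-rate ratio. All numbers below are made up for illustration.
    import numpy as np

    # count rate versus SQP(E) quench parameter, from fossil and modern reference liquids
    sqp_bcc = np.array([700.0, 750.0, 800.0, 850.0])
    cps_bcc = np.array([1.8, 1.6, 1.5, 1.4])
    sqp_mcc = np.array([700.0, 750.0, 800.0, 850.0])
    cps_mcc = np.array([9.5, 11.0, 12.4, 13.1])

    def biogenic_fraction(count_rate, sqp):
        background = np.interp(sqp, sqp_bcc, cps_bcc)   # BCC value at this SQP
        modern = np.interp(sqp, sqp_mcc, cps_mcc)       # MCC value at this SQP
        return (count_rate - background) / (modern - background)

    print(f"biogenic fraction ~ {biogenic_fraction(count_rate=5.2, sqp=780.0) * 100:.1f} %")
    ```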

  14. Single-molecule techniques in biophysics: a review of the progress in methods and applications

    Science.gov (United States)

    Miller, Helen; Zhou, Zhaokun; Shepherd, Jack; Wollman, Adam J. M.; Leake, Mark C.

    2018-02-01

    Single-molecule biophysics has transformed our understanding of biology, but also of the physics of life. More exotic than simple soft matter, biomatter lives far from thermal equilibrium, covering multiple length scales from the nanoscale of single molecules up to several orders of magnitude higher in cells, tissues and organisms. Biomolecules are often characterized by underlying instability: multiple metastable free energy states exist, separated by levels of just a few multiples of the thermal energy scale kBT, where kB is the Boltzmann constant and T the absolute temperature, implying complex inter-conversion kinetics in the relatively hot, wet environment of active biological matter. A key benefit of single-molecule biophysics techniques is their ability to probe heterogeneity of free energy states across a molecular population, too challenging in general for conventional ensemble average approaches. Parallel developments in experimental and computational techniques have catalysed the birth of multiplexed, correlative techniques to tackle previously intractable biological questions. Experimentally, progress has been driven by improvements in the sensitivity and speed of detectors, and the stability and efficiency of light sources, probes and microfluidics. We discuss the motivation and requirements for these recent experiments, including the underpinning mathematics. These methods are broadly divided into tools which detect molecules and those which manipulate them. For the former we discuss the progress of super-resolution microscopy, transformative for addressing many longstanding questions in the life sciences, and for the latter we include progress in ‘force spectroscopy’ techniques that mechanically perturb molecules. We also consider in silico progress of single-molecule computational physics, and how simulation and experimentation may be drawn together to give a more complete understanding. Increasingly, combinatorial techniques are now used, including

  15. 3-D thermal weight function method and multiple virtual crack extension technique for thermal shock problems

    International Nuclear Information System (INIS)

    Lu Yanlin; Zhou Xiao; Qu Jiadi; Dou Yikang; He Yinbiao

    2005-01-01

    An efficient scheme, 3-D thermal weight function (TWF) method, and a novel numerical technique, multiple virtual crack extension (MVCE) technique, were developed for determination of histories of transient stress intensity factor (SIF) distributions along 3-D crack fronts of a body subjected to thermal shock. The TWF is a universal function, which is dependent only on the crack configuration and body geometry. TWF is independent of time during thermal shock, so the whole history of transient SIF distributions along crack fronts can be directly calculated through integration of the products of TWF and transient temperatures and temperature gradients. The repeated determinations of the distributions of stresses (or displacements) fields for individual time instants are thus avoided in the TWF method. An expression of the basic equation for the 3-D universal weight function method for Mode I in an isotropic elastic body is derived. This equation can also be derived from Bueckner-Rice's 3-D WF formulations in the framework of transformation strain. It can be understood from this equation that the so-called thermal WF is in fact coincident with the mechanical WF except for some constants of elasticity. The details and formulations of the MVCE technique are given for elliptical cracks. The MVCE technique possesses several advantages. The specially selected linearly independent VCE modes can directly be used as shape functions for the interpolation of unknown SIFs. As a result, the coefficient matrix of the final system of equations in the MVCE method is a triple-diagonal matrix and the values of the coefficients on the main diagonal are large. The system of equations has good numerical properties. The number of linearly independent VCE modes that can be introduced in a problem is unlimited. Complex situations in which the SIFs vary dramatically along crack fronts can be numerically well simulated by the MVCE technique. An integrated system of programs for solving the

  16. THE METHOD OF APPLICATION OF A COLLECTIVE SEARCH ACTIVITY AS A TOOL DEVELOPING METHODOLOGICAL THINKING OF A TEACHER

    Directory of Open Access Journals (Sweden)

    Ibragimova Luiza Vahaevna

    2013-02-01

    Full Text Available To put any pedagogical theory into practice it is necessary to transform theoretical concepts into teaching methods. The development of all abilities, including thinking, occurs only in activity, which is specially organized by creating the required pedagogical conditions; in this case these are (a) the application of enhanced mental activity in teacher training courses and vocational training, (b) the establishment of a "virtual university" for teachers at an institute of professional training, and (c) the organization of interdisciplinary interaction of teachers, based on the conditions of nonlinear didactics (training teachers of different subjects). The presented method is implemented over two years and consists of three phases: motivational and educational, intellectual and developmental, and innovative and reflective. At the motivational and educational stage, the possibilities of collective search activity are actualized during the course of training, group goals are set and the methods of achieving them are chosen using the first pedagogical condition. At the intellectual and developmental stage, skills for the collective search for effective teaching decisions are developed during in-course training using the first and second pedagogical conditions. The innovative step is the promotion of teachers to self-determination of techniques and tools that improve the quality of the educational process, providing assistance to each other in the development of teaching manuals, which is achieved with the help of all three pedagogical conditions.

  17. THE METHOD OF APPLICATION OF A COLLECTIVE SEARCH ACTIVITY AS A TOOL DEVELOPING METHODOLOGICAL THINKING OF A TEACHER

    Directory of Open Access Journals (Sweden)

    Луиза Вахаевна Ибрагимова

    2013-04-01

    Full Text Available To put any pedagogical theory into practice it is necessary to transform theoretical concepts into teaching methods. The development of all abilities, including thinking, occurs only in activity, which is specially organized by creating the required pedagogical conditions; in this case these are (a) the application of enhanced mental activity in teacher training courses and vocational training, (b) the establishment of a "virtual university" for teachers at an institute of professional training, and (c) the organization of interdisciplinary interaction of teachers, based on the conditions of nonlinear didactics (training teachers of different subjects). The presented method is implemented over two years and consists of three phases: motivational and educational, intellectual and developmental, and innovative and reflective. At the motivational and educational stage, the possibilities of collective search activity are actualized during the course of training, group goals are set and the methods of achieving them are chosen using the first pedagogical condition. At the intellectual and developmental stage, skills for the collective search for effective teaching decisions are developed during in-course training using the first and second pedagogical conditions. The innovative step is the promotion of teachers to self-determination of techniques and tools that improve the quality of the educational process, providing assistance to each other in the development of teaching manuals, which is achieved with the help of all three pedagogical conditions. DOI: http://dx.doi.org/10.12731/2218-7405-2013-2-17

  18. A novel method for detecting and counting overlapping tracks in SSNTD by image processing techniques

    International Nuclear Information System (INIS)

    Ab Azar, N.; Babakhani, A.; Broumandnia, A.; Sepanloo, K.

    2016-01-01

    Overlapping object detection and counting is a challenge in image processing. A new method for detecting and counting overlapping circles is presented in this paper. This method is based on pattern recognition and feature extraction using "neighborhood values" in an object image by implementation of image processing techniques. The junction points are detected by assigning a value to each pixel in an image. As is shown, the neighborhood values for junction points are larger than the values for other points. This distinction of neighborhood values is the main feature which can be utilized to identify the junction points and to count the overlapping tracks. This method can be used for recognizing and counting charged particle tracks, blood cells and also cancer cells. The method is called "Track Counting based on Neighborhood Values" and is symbolized by "TCNV". - Highlights: • A new method is introduced to recognize nuclear tracks by image processing. • The method is used to identify neighborhood pixels at junction points in overlapping tracks. • Enhanced method of counting overlapping tracks. • The new counting system has linear behavior in counting tracks with densities of less than 300,000 tracks per cm². • In the new method, overlapping tracks can be recognized even up to 10× tracks and more.
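
    A small sketch in the spirit of the described neighbourhood-value feature is shown below: the track image is binarised, the number of foreground neighbours of every pixel is computed by convolution, and pixels with unusually large neighbourhood values are flagged as candidate junction (overlap) points. The synthetic two-ring image and the threshold are assumptions, not the TCNV implementation itself.

    ```python
    # Sketch of neighbourhood-value junction detection on a synthetic two-ring track image.
    import numpy as np
    from scipy import ndimage

    def junction_candidates(binary, threshold=7):
        kernel = np.ones((3, 3), dtype=int)
        kernel[1, 1] = 0                                   # 8-connected neighbourhood, centre excluded
        neigh = ndimage.convolve(binary.astype(int), kernel, mode="constant")
        return binary & (neigh >= threshold)               # foreground pixels with many foreground neighbours

    # synthetic example: the outlines of two overlapping circular tracks
    yy, xx = np.mgrid[0:64, 0:64]
    d1 = np.hypot(xx - 26, yy - 32)
    d2 = np.hypot(xx - 38, yy - 32)
    image = (np.abs(d1 - 12) <= 1.2) | (np.abs(d2 - 12) <= 1.2)

    junctions = junction_candidates(image)
    print("connected blobs:", ndimage.label(image)[1],
          "| junction-candidate pixels:", int(junctions.sum()))
    ```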

  19. A characteristics of the small crack evaluation technique by triangle method with phased array UT

    International Nuclear Information System (INIS)

    Cho, Yong Sang

    2005-01-01

    Ultrasonic testing is a kind of nondestructive test that detects a crack or discontinuity in a material or on its surface by sending ultrasound into it. Conventional ultrasonic testing has difficulties in detecting cracks or inspecting material, especially in the case of complex-shaped power plant components such as turbine blade roots. A phased array UT system and its application methods for complex-shaped power plant components are a good alternative which overcomes the weaknesses of conventional UT. This study was aimed at developing a new method for finding cracks in materials or structures, and especially for determining the crack length without moving the transducer. In particular, an ultrasonic phased array with an electronic scanning technique was used to evaluate both the sizing and the detectability of cracks as their depth and length change. The response of the ultrasonic phased array was analyzed to obtain a method of determining crack length without moving the transducer and to establish the detectability of minimal crack length and depth in the material. The results showed that the newly developed method for determining crack length is practical and accurate, and they verify its effectiveness compared with a conventional crack length determination method

  20. Real-Time Spaceborne Synthetic Aperture Radar Float-Point Imaging System Using Optimized Mapping Methodology and a Multi-Node Parallel Accelerating Technique

    Science.gov (United States)

    Li, Bingyi; Chen, Liang; Yu, Wenyue; Xie, Yizhuang; Bian, Mingming; Zhang, Qingjun; Pang, Long

    2018-01-01

    With the development of satellite load technology and very large-scale integrated (VLSI) circuit technology, on-board real-time synthetic aperture radar (SAR) imaging systems have facilitated rapid response to disasters. A key goal of the on-board SAR imaging system design is to achieve high real-time processing performance under severe size, weight, and power consumption constraints. This paper presents a multi-node prototype system for real-time SAR imaging processing. We decompose the commonly used chirp scaling (CS) SAR imaging algorithm into two parts according to the computing features. The linearization and logic-memory optimum allocation methods are adopted to realize the nonlinear part in a reconfigurable structure, and the two-part bandwidth balance method is used to realize the linear part. Thus, float-point SAR imaging processing can be integrated into a single Field Programmable Gate Array (FPGA) chip instead of relying on distributed technologies. A single-processing node requires 10.6 s and consumes 17 W to focus on 25-km swath width, 5-m resolution stripmap SAR raw data with a granularity of 16,384 × 16,384. The design methodology of the multi-FPGA parallel accelerating system under the real-time principle is introduced. As a proof of concept, a prototype with four processing nodes and one master node is implemented using a Xilinx xc6vlx315t FPGA. The weight and volume of one single machine are 10 kg and 32 cm × 24 cm × 20 cm, respectively, and the power consumption is under 100 W. The real-time performance of the proposed design is demonstrated on Chinese Gaofen-3 stripmap continuous imaging. PMID:29495637

  1. A Fast Multi-layer Subnetwork Connection Method for Time Series InSAR Technique

    Directory of Open Access Journals (Sweden)

    WU Hong'an

    2016-10-01

    Full Text Available Nowadays, the time series interferometric synthetic aperture radar (InSAR) technique has been widely used in ground deformation monitoring, especially in urban areas where many stable point targets can be detected. However, in the standard time series InSAR technique, affected by the atmospheric correlation distance and the threshold on linear-model coherence, the Delaunay triangulation connecting point targets can easily be separated into many discontinuous subnetworks. Thus it is difficult to retrieve ground deformation in non-urban areas. In order to monitor ground deformation over large areas efficiently, a novel multi-layer subnetwork connection (MLSC) method is proposed for connecting all subnetworks. The advantage of the method is that it can quickly reduce the number of subnetworks with valid edges layer by layer. This method is compared with an existing complex network connecting method. The experimental results demonstrate that the data processing time of the proposed method is only 32.56% of that of the latter.
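
As a rough illustration of the kind of network repair involved (not the authors' MLSC algorithm), the sketch below builds a Delaunay network of point targets, removes arcs longer than an assumed correlation distance so the network breaks into subnetworks, and then reconnects every remaining subnetwork to the largest one through its shortest bridging arc. The point coordinates, the distance threshold, and the single-pass reconnection are all illustrative assumptions.

```python
# Illustrative subnetwork-connection sketch; all values are made up.
import numpy as np
import networkx as nx
from scipy.spatial import Delaunay, distance

def build_network(points, max_arc=1.0):
    tri = Delaunay(points)
    g = nx.Graph()
    g.add_nodes_from(range(len(points)))
    for simplex in tri.simplices:
        for i in range(3):
            a, b = simplex[i], simplex[(i + 1) % 3]
            d = np.linalg.norm(points[a] - points[b])
            if d <= max_arc:                 # keep only short, reliable arcs
                g.add_edge(a, b, weight=d)
    return g

def connect_subnetworks(g, points):
    comps = sorted((list(c) for c in nx.connected_components(g)), key=len, reverse=True)
    main = comps[0]
    for sub in comps[1:]:
        dmat = distance.cdist(points[main], points[sub])
        i, j = np.unravel_index(dmat.argmin(), dmat.shape)
        g.add_edge(main[i], sub[j], weight=dmat[i, j])   # shortest bridging arc
        main.extend(sub)
    return g

rng = np.random.default_rng(0)
pts = rng.random((200, 2)) * 10.0
net = connect_subnetworks(build_network(pts, max_arc=0.8), pts)
print(nx.number_connected_components(net))   # 1 after connection
```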

  2. A framework for laboratory pre-work based on the concepts, tools and techniques questioning method

    International Nuclear Information System (INIS)

    Huntula, J; Sharma, M D; Johnston, I; Chitaree, R

    2011-01-01

    Learning in the laboratory is different from learning in other contexts because students have to engage with various aspects of the practice of science. They have to use many skills and bodies of knowledge in parallel: not only to understand the concepts of physics but also to use the tools and analyse the data. The question arises of how best to guide students' learning in the laboratory. This study is about creating and using questions with a specifically designed framework to aid learning in the laboratory. The concepts, tools and techniques questioning (CTTQ) method was initially designed and used at Mahidol University, Thailand, and was subsequently extended to laboratory pre-work at the University of Sydney. The CTTQ method was implemented in Sydney with 190 first-year students. Three pre-work exercises on a series of electrical experiments were created based on the CTTQ method. The pre-work was completed individually and submitted before the experiment started. Analysis of the submitted pre-work, surveys and interviews was used to evaluate the pre-work questions in this study. The results indicated that the CTTQ method was successful and that the flow in the experiments was better than in the previous year. At the same time, students had difficulty with the last experiment in the sequence and with techniques.

  3. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    International Nuclear Information System (INIS)

    Lungaroni, M.; Peluso, E.; Gelfusa, M.; Malizia, A.; Talebzadeh, S.; Gaudio, P.; Murari, A.; Vega, J.

    2016-01-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Initial data processing tasks, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives on the interpretation of the data.
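
As a hedged illustration of SVR used as a non-parametric filter, the sketch below fits an RBF-kernel SVR to a noisy synthetic signal and compares the residual error before and after filtering; the test signal and all hyper-parameters are arbitrary assumptions, not the settings used in the study.

```python
# SVR as a simple non-parametric filter on a synthetic noisy signal.
import numpy as np
from sklearn.svm import SVR

x = np.linspace(0, 10, 200)[:, None]
y_clean = np.sin(x).ravel()
rng = np.random.default_rng(1)
y_noisy = y_clean + 0.2 * rng.standard_normal(len(x))

svr = SVR(kernel="rbf", C=10.0, epsilon=0.05, gamma=0.5)
y_filtered = svr.fit(x, y_noisy).predict(x)

print("raw RMSE     :", np.sqrt(np.mean((y_noisy - y_clean) ** 2)))
print("filtered RMSE:", np.sqrt(np.mean((y_filtered - y_clean) ** 2)))
```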

  4. A new registration method with voxel-matching technique for temporal subtraction images

    Science.gov (United States)

    Itai, Yoshinori; Kim, Hyoungseop; Ishikawa, Seiji; Katsuragawa, Shigehiko; Doi, Kunio

    2008-03-01

    A temporal subtraction image, which is obtained by subtraction of a previous image from a current one, can be used for enhancing interval changes on medical images by removing most of normal structures. One of the important problems in temporal subtraction is that subtraction images commonly include artifacts created by slight differences in the size, shape, and/or location of anatomical structures. In this paper, we developed a new registration method with voxel-matching technique for substantially removing the subtraction artifacts on the temporal subtraction image obtained from multiple-detector computed tomography (MDCT). With this technique, the voxel value in a warped (or non-warped) previous image is replaced by a voxel value within a kernel, such as a small cube centered at a given location, which would be closest (identical or nearly equal) to the voxel value in the corresponding location in the current image. Our new method was examined on 16 clinical cases with MDCT images. Preliminary results indicated that interval changes on the subtraction images were enhanced considerably, with a substantial reduction of misregistration artifacts. The temporal subtraction images obtained by use of the voxel-matching technique would be very useful for radiologists in the detection of interval changes on MDCT images.
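
The voxel-matching step can be pictured with the following toy sketch (assumed parameters, not the authors' implementation): each voxel of the previous volume is replaced by the value within a small search kernel that is closest to the value at the same location in the current volume, so that slight misregistrations cancel in the subtraction while genuine interval changes survive.

```python
# Toy voxel-matching subtraction; kernel radius and volumes are invented.
import numpy as np

def voxel_matching_subtraction(current, previous, radius=1):
    matched = np.empty_like(previous, dtype=float)
    nz, ny, nx = current.shape
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                kernel = previous[max(z - radius, 0):z + radius + 1,
                                  max(y - radius, 0):y + radius + 1,
                                  max(x - radius, 0):x + radius + 1]
                # previous-image value closest to the current-image value
                idx = np.abs(kernel - current[z, y, x]).argmin()
                matched[z, y, x] = kernel.ravel()[idx]
    return current.astype(float) - matched

prev = np.zeros((8, 8, 8))
curr = np.zeros((8, 8, 8))
prev[2, :, :] = 100.0      # anatomical structure in the previous scan
curr[3, :, :] = 100.0      # same structure shifted one voxel (misregistration)
curr[6, 6, 6] = 50.0       # a genuine interval change
diff = voxel_matching_subtraction(curr, prev)
print(np.count_nonzero(np.abs(diff) > 10))   # 1: only the true change survives
```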

  5. Fundamentals of functional imaging II: emerging MR techniques and new methods of analysis.

    Science.gov (United States)

    Luna, A; Martín Noguerol, T; Mata, L Alcalá

    2018-05-01

    Current multiparameter MRI protocols integrate structural, physiological, and metabolic information about cancer. Emerging techniques such as arterial spin-labeling (ASL), blood oxygen level dependent (BOLD), MR elastography, chemical exchange saturation transfer (CEST), and hyperpolarization provide new information and will likely be integrated into daily clinical practice in the near future. Furthermore, there is great interest in the study of tumor heterogeneity as a prognostic factor and in relation to resistance to treatment, and this interest is leading to the application of new methods of analysis of multiparametric protocols. In parallel, new oncologic biomarkers that integrate the information from MR with clinical, laboratory, genetic, and histologic findings are being developed, thanks to the application of big data and artificial intelligence. This review analyzes different emerging MR techniques that are able to evaluate the physiological, metabolic, and mechanical characteristics of cancer, as well as the main clinical applications of these techniques. In addition, it summarizes the most novel methods of analysis of functional radiologic information in oncology. Copyright © 2018 SERAM. Published by Elsevier España, S.L.U. All rights reserved.

  6. GO methodology. Volume 1. Overview manual

    International Nuclear Information System (INIS)

    1983-06-01

    The GO methodology is a success-oriented probabilistic system performance analysis technique. The methodology can be used to quantify system reliability and availability, identify and rank critical components and the contributors to system failure, construct event trees, and perform statistical uncertainty analysis. Additional capabilities of the method currently under development will enhance its use in evaluating the effects of external events and common cause failures on system performance. This Overview Manual provides a description of the GO Methodology, how it can be used, and benefits of using it in the analysis of complex systems

  7. Continuous culture apparatus and methodology

    International Nuclear Information System (INIS)

    Conway, H.L.

    1975-01-01

    At present, we are investigating the sorption of potentially toxic trace elements by phytoplankton under controlled laboratory conditions. Continuous culture techniques were used to study the mechanism of the sorption of the trace elements by unialgal diatom populations and the factors influencing this sorption. Continuous culture methodology has been used extensively to study bacterial kinetics. It is an excellent technique for obtaining a known physiological state of phytoplankton populations. An automated method for the synthesis of continuous culture medium for use in these experiments is described

  8. UNCOMMON SENSORY METHODOLOGIES

    Directory of Open Access Journals (Sweden)

    Vladimír Vietoris

    2015-02-01

    Full Text Available Sensory science is a young but rapidly developing field of the food industry. Currently, great emphasis is placed on the development of rapid data-collection techniques, the difference between consumers and trained panels is becoming blurred, and the role of sensory methodologists is to prepare evaluation procedures by which a lay panel (consumers) can achieve results identical to those of a trained panel. There are several conventional methods of sensory evaluation of food (ISO standards), but more and more sensory laboratories are developing methodologies that are less strict in the selection of evaluators, whose mechanisms are easily understandable and whose results are easily interpretable. This paper deals with the mapping of marginal methods used in sensory evaluation of food (new types of profiles, CATA, TDS, napping).

  9. Improving lung cancer prognosis assessment by incorporating synthetic minority oversampling technique and score fusion method

    International Nuclear Information System (INIS)

    Yan, Shiju; Qian, Wei; Guan, Yubao; Zheng, Bin

    2016-01-01

    Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction performance for stage I NSCLC patients by integrating oversampling, feature selection, and score fusion techniques and to develop an optimal prediction model. Methods: A dataset involving 94 early stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained DFS and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built based on QI features and CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst based feature selection method to optimize the classifiers and also tested fusion methods to combine QI and CB based prediction results. Results: Using a leave-one-case-out cross-validation (K-fold cross-validation) method, the computed areas under a receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061, when using the QI and CB based classifiers, respectively. By fusion of the scores generated by the two classifiers, AUC significantly increased to 0.859 ± 0.052 (p < 0.05) with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion techniques. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN based classifiers to yield improved prediction accuracy.
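
As a hedged, self-contained illustration of that processing chain (SMOTE oversampling, feature selection, two feature-block classifiers, and score fusion), the sketch below uses synthetic data and RBF-kernel support vector classifiers in place of the paper's RBFN models; it assumes the third-party imbalanced-learn package, and every parameter value is an arbitrary demonstration choice.

```python
# SMOTE + feature selection + two classifiers + score fusion on toy data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC
from imblearn.over_sampling import SMOTE

# imbalanced toy cohort: 35 "QI features" and 9 "CB markers" as two blocks
X, y = make_classification(n_samples=94, n_features=44, n_informative=10,
                           weights=[0.79, 0.21], random_state=0)
X_qi, X_cb = X[:, :35], X[:, 35:]

def block_scores(X_block, y, k=5):
    X_res, y_res = SMOTE(random_state=0).fit_resample(X_block, y)   # oversample minority class
    selector = SelectKBest(f_classif, k=k).fit(X_res, y_res)        # feature selection after SMOTE
    clf = SVC(kernel="rbf", probability=True, random_state=0)
    return cross_val_predict(clf, selector.transform(X_block), y,
                             cv=5, method="predict_proba")[:, 1]

score_qi = block_scores(X_qi, y)
score_cb = block_scores(X_cb, y, k=4)
fused = 0.5 * (score_qi + score_cb)          # simple average score fusion
print("fused recurrence scores, min/max:", fused.min(), fused.max())
```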

  10. Improving lung cancer prognosis assessment by incorporating synthetic minority oversampling technique and score fusion method

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Shiju [School of Medical Instrument and Food Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China and School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States); Qian, Wei [Department of Electrical and Computer Engineering, University of Texas, El Paso, Texas 79968 and Sino-Dutch Biomedical and Information Engineering School, Northeastern University, Shenyang 110819 (China); Guan, Yubao [Department of Radiology, Guangzhou Medical University, Guangzhou 510182 (China); Zheng, Bin, E-mail: Bin.Zheng-1@ou.edu [School of Electrical and Computer Engineering, University of Oklahoma, Norman, Oklahoma 73019 (United States)

    2016-06-15

    Purpose: This study aims to investigate the potential to improve lung cancer recurrence risk prediction performance for stage I NSCLC patients by integrating oversampling, feature selection, and score fusion techniques and to develop an optimal prediction model. Methods: A dataset involving 94 early stage lung cancer patients was retrospectively assembled, which includes CT images, nine clinical and biological (CB) markers, and outcome of 3-yr disease-free survival (DFS) after surgery. Among the 94 patients, 74 remained DFS and 20 had cancer recurrence. Applying a computer-aided detection scheme, tumors were segmented from the CT images and 35 quantitative image (QI) features were initially computed. Two normalized Gaussian radial basis function network (RBFN) based classifiers were built based on QI features and CB markers separately. To improve prediction performance, the authors applied a synthetic minority oversampling technique (SMOTE) and a BestFirst based feature selection method to optimize the classifiers and also tested fusion methods to combine QI and CB based prediction results. Results: Using a leave-one-case-out cross-validation (K-fold cross-validation) method, the computed areas under a receiver operating characteristic curve (AUCs) were 0.716 ± 0.071 and 0.642 ± 0.061, when using the QI and CB based classifiers, respectively. By fusion of the scores generated by the two classifiers, AUC significantly increased to 0.859 ± 0.052 (p < 0.05) with an overall prediction accuracy of 89.4%. Conclusions: This study demonstrated the feasibility of improving prediction performance by integrating SMOTE, feature selection, and score fusion techniques. Combining QI features and CB markers and performing SMOTE prior to feature selection in classifier training enabled the RBFN based classifiers to yield improved prediction accuracy.

  11. Optimizing and modeling of effective parameters on the structural and magnetic properties of Fe₃O₄ nanoparticles synthesized by coprecipitation technique using response surface methodology

    Energy Technology Data Exchange (ETDEWEB)

    Ghazanfari, Mohammad Reza [Department of Materials Science and Engineering, Ferdowsi University of Mashhad, 9177948974 Mashhad (Iran, Islamic Republic of); Kashefi, Mehrdad, E-mail: m-kashefi@um.ac.ir [Department of Materials Science and Engineering, Ferdowsi University of Mashhad, 9177948974 Mashhad (Iran, Islamic Republic of); Jaafari, Mahmoud Reza [Biotechnology Research Center, Nanotechnology Research Center, School of Pharmacy, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2016-07-01

    In the present work, Fe₃O₄ magnetic nanoparticles were successfully synthesized by the coprecipitation method. In order to study the effects of the influential factors on the structural and magnetic properties of the particles, the experimental runs were designed using response surface methodology (RSM) based on a central composite design (CCD), with the reaction temperature, Fe²⁺/Fe³⁺ cation ratio, and reaction pH defined as the factors affecting two responses: the degree of crystallinity and the saturation magnetization (Mₛ). The structural, magnetic, and microstructural properties of the particles were investigated by X-ray diffraction (XRD), vibrating sample magnetometer (VSM), dynamic light scattering (DLS) and transmission electron microscopy (TEM) analyses. As a result, predictive quadratic models were fitted to both responses, with R² values above 0.97 for both models. The highest values of both responses (crystallinity degree: 88.07% and Mₛ: 65.801 emu/g) are obtained when the reaction temperature, cation ratio, and pH are equal to 90 °C, 0.60, and 10.5, respectively. Finally, the TEM results show particles with a size of about 10 nm and a narrow size distribution. - Highlights: • Fe₃O₄ nanoparticles were successfully synthesized by the coprecipitation method. • By the RSM technique, a predictive model was presented for the crystallinity degree. • By the RSM technique, a predictive model was presented for Mₛ. • Temperature, pH and their interactions had the greatest effect on Mₛ. • Temperature, cation ratio and their interactions had the greatest effect on the crystallinity degree.
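
The quadratic response-surface fit at the heart of such an RSM/CCD analysis can be sketched as follows; the factor settings and responses below are synthetic numbers generated from an arbitrary quadratic surface, not the paper's measurements, and the model is a plain second-order polynomial regression.

```python
# Second-order response-surface fit on synthetic factor/response data.
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
# factor columns: temperature (deg C), Fe2+/Fe3+ ratio, pH
X = np.column_stack([rng.uniform(60, 90, 20),
                     rng.uniform(0.3, 0.7, 20),
                     rng.uniform(9.0, 12.0, 20)])
# synthetic response drawn from an invented quadratic surface plus noise
y = (60.0 - 0.01 * (X[:, 0] - 90) ** 2 - 80.0 * (X[:, 1] - 0.6) ** 2
     - 2.0 * (X[:, 2] - 10.5) ** 2 + rng.normal(0, 0.5, 20))

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)
print("R^2 on the toy data:", model.score(X, y))
print("predicted response at (90 degC, 0.60, pH 10.5):",
      model.predict([[90.0, 0.60, 10.5]])[0])
```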

  12. A calibration method for fringe reflection technique based on the analytical phase-slope description

    Science.gov (United States)

    Wu, Yuxiang; Yue, Huimin; Pan, Zhipeng; Liu, Yong

    2018-05-01

    The fringe reflection technique (FRT) has become one of the most popular methods for measuring the shape of specular surfaces in recent years. Existing FRT system calibration methods usually contain two parts, camera calibration and geometric calibration. In geometric calibration, calibrating the position of the liquid crystal display (LCD) screen is one of the most difficult steps among all the calibration procedures, and its accuracy is affected by factors such as imaging aberration, plane-mirror flatness, and LCD screen pixel size accuracy. In this paper, based on the derivation of an analytical phase-slope description of the FRT, we present a novel calibration method that does not require calibrating the position of the LCD screen. Moreover, the system can be arbitrarily arranged, and the imaging system can be either telecentric or non-telecentric. In our experiment measuring a 5000 mm radius spherical mirror, the proposed calibration method achieves a measurement error 2.5 times smaller than the geometric calibration method. In the wafer surface measuring experiment, the measurement result with the proposed calibration method is closer to the interferometer result than that of the geometric calibration method.

  13. A high-resolution neutron spectra unfolding method using the Genetic Algorithm technique

    CERN Document Server

    Mukherjee, B

    2002-01-01

    The Bonner sphere spectrometers (BSS) are commonly used to determine the neutron spectra within various nuclear facilities. Sophisticated mathematical tools are used to unfold the neutron energy distribution from the output data of the BSS. This paper highlights a novel high-resolution neutron spectrum unfolding method using the Genetic Algorithm (GA) technique. The GA imitates the biological evolution process prevailing in nature to solve complex optimisation problems. The GA method was utilised to evaluate the neutron energy distribution, average energy, fluence and equivalent dose rates at important work places of a DIDO class research reactor and a high-energy superconducting heavy ion cyclotron. The spectrometer was calibrated with a ²⁴¹Am/Be (α,n) neutron standard source. The results of the GA method agreed satisfactorily with the results obtained by using the well-known BUNKI neutron spectra unfolding code.
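
A toy sketch of GA-based unfolding follows: candidate spectra evolve so that their folded detector readings approach the measured ones. The response matrix, the "true" spectrum, and all GA settings (population size, mutation strength, number of generations) are invented for the demonstration and are not the author's values.

```python
# Toy genetic-algorithm spectrum unfolding against a synthetic response matrix.
import numpy as np

rng = np.random.default_rng(4)
n_spheres, n_bins = 8, 12
R = rng.uniform(0.1, 1.0, (n_spheres, n_bins))   # toy sphere response matrix
true_phi = rng.uniform(0.0, 1.0, n_bins)         # "unknown" spectrum
M = R @ true_phi                                  # simulated BSS readings

def fitness(pop):
    return -np.sum((pop @ R.T - M) ** 2, axis=1)  # negative chi-square

pop = rng.uniform(0.0, 1.0, (200, n_bins))        # initial population of spectra
for _ in range(300):
    f = fitness(pop)
    elite = pop[np.argmax(f)].copy()
    # tournament selection of parents
    idx = rng.integers(0, len(pop), (len(pop), 2))
    parents = pop[np.where(f[idx[:, 0]] > f[idx[:, 1]], idx[:, 0], idx[:, 1])]
    # arithmetic crossover between shuffled parent pairs
    mates = parents[rng.permutation(len(parents))]
    alpha = rng.random((len(parents), 1))
    children = alpha * parents + (1.0 - alpha) * mates
    # Gaussian mutation, clipped to keep the fluxes non-negative
    children += rng.normal(0.0, 0.02, children.shape)
    pop = np.clip(children, 0.0, None)
    pop[0] = elite                                # simple elitism

best = pop[np.argmax(fitness(pop))]
print("worst folded-reading residual:", np.abs(R @ best - M).max())
```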

  14. Application to Determination of Scholarship Worthiness Using Simple Multi Attribute Rating Technique and Merkle Hellman Method

    Directory of Open Access Journals (Sweden)

    Dicky Nofriansyah

    2017-10-01

    Full Text Available This research focused on explaining how the simple multi-attribute rating technique (SMART) can be applied in a desktop-based decision support system to solve a multi-criteria selection problem, in particular scholarship selection. The Merkle-Hellman method is used to secure the results of the choices made by the SMART process. Determining PPA and BBP-PPA scholarship recipients at STMIK Triguna Dharma is a problem because reaching a decision takes a long time. By adopting the SMART method, the application can make decisions quickly and precisely. The expected result of this research is that the application can help overcome the problems concerning the determination of PPA and BBP-PPA scholarship recipients and assist Student Affairs at STMIK Triguna Dharma in making decisions quickly and accurately
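
A minimal sketch of the SMART scoring step is given below: criterion scores are normalised to utilities, weighted, and summed to rank candidates. The criteria, weights, and applicant scores are illustrative placeholders, not the application's actual data, and the Merkle-Hellman securing step is omitted.

```python
# SMART weighted-sum ranking on placeholder applicant data.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])               # criterion weights, summing to 1
# rows = applicants, columns = criteria, already oriented so higher is better
scores = np.array([[3.8, 0.7, 0.6],
                   [3.4, 0.9, 0.8],
                   [3.9, 0.4, 0.5]])

# utility: min-max normalisation per criterion
utilities = (scores - scores.min(axis=0)) / (scores.max(axis=0) - scores.min(axis=0))
smart_values = utilities @ weights
for applicant, value in sorted(enumerate(smart_values, 1), key=lambda p: -p[1]):
    print(f"applicant {applicant}: SMART value {value:.3f}")
```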

  15. Applied methods and techniques for mechatronic systems modelling, identification and control

    CERN Document Server

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

    Applied Methods and Techniques for Mechatronic Systems brings together the relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper, background - motivation - quantitative development (equations) - case studies/illustration/tutorial (curve, table, etc.) is also helpful. It is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

  16. Application of variance reduction techniques of Monte-Carlo method to deep penetration shielding problems

    International Nuclear Information System (INIS)

    Rawat, K.K.; Subbaiah, K.V.

    1996-01-01

    The general purpose Monte Carlo code MCNP is widely employed for solving deep penetration problems by applying variance reduction techniques. These techniques depend on the nature and type of the problem being solved. The application of the geometry splitting and implicit capture methods is examined for deep penetration problems of neutron, gamma and coupled neutron-gamma transport in thick shielding materials. The typical problems chosen are: i) a point isotropic monoenergetic gamma ray source of 1 MeV energy in a nearly infinite water medium, ii) a ²⁵²Cf spontaneous fission source at the centre of 140 cm thick water and concrete shields, and iii) 14 MeV fast neutrons incident on the axis of a 100 cm thick concrete disk. (author). 7 refs., 5 figs
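
To make one of these techniques concrete, the toy sketch below applies implicit capture (survival weighting) in a crude slab-transmission Monte Carlo: rather than killing a particle on absorption, its weight is reduced by the scattering probability at each collision, so every history can contribute to the deep-penetration tally. The cross-sections, slab thickness, forward-only transport model, and weight cutoff are arbitrary assumptions, not an MCNP calculation.

```python
# Implicit capture (survival weighting) in a toy forward-only slab problem.
import numpy as np

rng = np.random.default_rng(5)
sigma_t, sigma_s, thickness = 1.0, 0.3, 10.0      # macroscopic XS and slab thickness (mfp units)
n_histories, weight_cutoff = 10_000, 1e-6

transmitted_weight = 0.0
for _ in range(n_histories):
    x, weight = 0.0, 1.0
    while weight > weight_cutoff:
        x += -np.log(rng.random()) / sigma_t       # distance to the next collision
        if x >= thickness:                         # leaked through the slab
            transmitted_weight += weight
            break
        weight *= sigma_s / sigma_t                # implicit capture: survive with reduced weight
        # (forward-only toy model: the particle keeps its direction)

print("transmission estimate:", transmitted_weight / n_histories)
```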

  17. A new method of maintaining airway during nasotracheal intubation--the hand mask technique.

    Science.gov (United States)

    Wu, R S; Wong, D S; Chung, P C; Tan, P P

    1993-09-01

    The efficacy of a new method (the hand mask technique) for airway maintenance during nasotracheal intubation was evaluated in our randomized crossover study. Sixty ASA physical status class I-II patients under 50 years of age, undergoing surgery of the extremities, gave informed consent and were randomly chosen for the study. A pulse oximeter, capnometer, EKG, blood pressure monitor and a peripheral nerve stimulator were attached to the patients before induction for continuous monitoring. An arterial cannula was inserted for intermittent blood gas sampling. After baseline room air blood gas data had been obtained from the spontaneously breathing patients, a flow rate of 6 L/min of pure oxygen was applied through a loosely fitted face mask and a semi-closed anesthesia breathing circuit for a period of 5 minutes. An arterial blood sample was drawn, and the patients were then put under general anesthesia with full muscle relaxation. Patients were then randomly assigned to two groups according to the ventilation technique used. Group A patients (n = 30) were manually ventilated first through a face mask for ten minutes and then with the hand mask technique for another ten minutes. Blood gas data were sampled and heart rate, blood pressure, peak inspiratory airway pressure and end tidal CO2 were recorded immediately after each ventilation technique. For patients in Group B (n = 30), the sequence of the two ventilation techniques was reversed. The results showed significant increases in PaO2 after artificial ventilation in both groups (no significant difference in results between the two groups) and a lower incidence of nasal bleeding in Group A.(ABSTRACT TRUNCATED AT 250 WORDS)

  18. Dissociation coefficients of protein adsorption to nanoparticles as quantitative metrics for description of the protein corona: A comparison of experimental techniques and methodological relevance

    KAUST Repository

    Hühn, Jonas; Fedeli, Chiara; Zhang, Qian; Masood, Atif; del Pino, Pablo; Khashab, Niveen M.; Papini, Emanuele; Parak, Wolfgang J.

    2015-01-01

    Protein adsorption to nanoparticles is described as a chemical reaction in which proteins attach to binding sites on the nanoparticle surface. This process can be described with a dissociation coefficient, which tells how many proteins are adsorbed per nanoparticle as a function of the protein concentration. Different techniques to experimentally determine dissociation coefficients of protein adsorption to nanoparticles are reviewed. Results of more than 130 experiments in which dissociation coefficients have been determined are compared. The data show that different methods, nanoparticle systems, and proteins can lead to significantly different dissociation coefficients. However, we observed a clear tendency towards smaller dissociation coefficients as the zeta potential of the nanoparticles changes from less negative towards more positive values. The zeta potential is thus a key parameter influencing protein adsorption to the surface of nanoparticles. Our analysis highlights the importance of the characterization of the parameters governing protein-nanoparticle interaction for quantitative evaluation and objective literature comparison.
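
One hedged way to picture how such a dissociation coefficient is obtained is a fit of a simple binding isotherm N = N_max·c/(K_D + c) to adsorption data; the sketch below does this on synthetic data generated from assumed N_max and K_D values, and is not tied to any of the reviewed experiments.

```python
# Fit of a simple binding isotherm to synthetic adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def isotherm(c, n_max, k_d):
    return n_max * c / (k_d + c)

rng = np.random.default_rng(6)
conc = np.logspace(-8, -5, 12)                     # protein concentration, M
n_true = isotherm(conc, n_max=60.0, k_d=5e-7)      # proteins adsorbed per nanoparticle
n_meas = n_true * (1 + 0.05 * rng.standard_normal(conc.size))

popt, pcov = curve_fit(isotherm, conc, n_meas, p0=[50.0, 1e-6])
print(f"fitted N_max = {popt[0]:.1f} proteins, K_D = {popt[1]:.2e} M")
```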

  19. Dissociation coefficients of protein adsorption to nanoparticles as quantitative metrics for description of the protein corona: A comparison of experimental techniques and methodological relevance

    KAUST Repository

    Hühn, Jonas

    2015-12-31

    Protein adsorption to nanoparticles is described as a chemical reaction in which proteins attach to binding sites on the nanoparticle surface. This process can be described with a dissociation coefficient, which tells how many proteins are adsorbed per nanoparticle as a function of the protein concentration. Different techniques to experimentally determine dissociation coefficients of protein adsorption to nanoparticles are reviewed. Results of more than 130 experiments in which dissociation coefficients have been determined are compared. The data show that different methods, nanoparticle systems, and proteins can lead to significantly different dissociation coefficients. However, we observed a clear tendency towards smaller dissociation coefficients as the zeta potential of the nanoparticles changes from less negative towards more positive values. The zeta potential is thus a key parameter influencing protein adsorption to the surface of nanoparticles. Our analysis highlights the importance of the characterization of the parameters governing protein-nanoparticle interaction for quantitative evaluation and objective literature comparison.

  20. Comparisons of Particle Tracking Techniques and Galerkin Finite Element Methods in Flow Simulations on Watershed Scales

    Science.gov (United States)

    Shih, D.; Yeh, G.

    2009-12-01

    This paper applies two numerical approximations, the particle tracking technique and the Galerkin finite element method, to solve the diffusive wave equation in both one-dimensional and two-dimensional flow simulations. The finite element method is one of the most commonly used approaches in numerical modelling. It can obtain accurate solutions, but calculation times may be rather extensive. The particle tracking technique, using either single-velocity or average-velocity tracks to efficiently perform advective transport, can use larger time-step sizes than the finite element method and thus significantly save computational time. Comparisons of the alternative approximations are examined in this poster. We adapt the model WASH123D for this work. WASH123D, an integrated multimedia, multi-process, physics-based computational model suitable for various spatial-temporal scales, was first developed by Yeh et al. in 1998. The model has evolved in design capability and flexibility, and has been used for model calibrations and validations over the course of many years. In order to deliver a local hydrological model for Taiwan, the Taiwan Typhoon and Flood Research Institute (TTFRI) is working with Prof. Yeh to develop the next version of WASH123D. The work of our preliminary cooperation is also sketched in this poster.
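
For context, a very small Galerkin finite-element building block of the kind compared above can be sketched as follows: linear elements on a uniform 1-D mesh, a constant diffusion coefficient, backward-Euler time stepping, and fixed boundary values. All numbers are illustrative and this is not WASH123D code.

```python
# 1-D linear Galerkin FEM with backward-Euler stepping for a diffusion-type equation.
import numpy as np

n_nodes, length, diff_coeff, dt = 21, 1.0, 0.01, 0.05
dx = length / (n_nodes - 1)

# assemble consistent mass (M) and stiffness (K) matrices for linear elements
M = np.zeros((n_nodes, n_nodes))
K = np.zeros((n_nodes, n_nodes))
for e in range(n_nodes - 1):
    idx = [e, e + 1]
    M[np.ix_(idx, idx)] += dx / 6.0 * np.array([[2.0, 1.0], [1.0, 2.0]])
    K[np.ix_(idx, idx)] += diff_coeff / dx * np.array([[1.0, -1.0], [-1.0, 1.0]])

h = np.zeros(n_nodes)
h[0] = 1.0                                  # upstream boundary value

A = M + dt * K                              # backward Euler: (M + dt K) h_new = M h_old
for node in (0, n_nodes - 1):               # enforce Dirichlet boundaries
    A[node, :] = 0.0
    A[node, node] = 1.0

for _ in range(200):
    b = M @ h
    b[0], b[-1] = 1.0, 0.0
    h = np.linalg.solve(A, b)

print("mid-domain value after 200 steps:", h[n_nodes // 2])
```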