WorldWideScience

Sample records for factor analysis techniques

  1. Attitude Exploration Using Factor Analysis Technique

    Directory of Open Access Journals (Sweden)

    Monika Raghuvanshi

    2016-12-01

    Full Text Available Attitude is a psychological variable that contains positive or negative evaluation about people or an environment. The growing generation possesses learning skills, so if a positive attitude is inculcated at the right age, it may become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30 Likert-type scale statements was prepared in order to measure attitude towards the environment and matters related to conservation. The primary data were collected through a structured questionnaire, using a cluster sampling technique, and analyzed using the IBM SPSS 23 statistical tool. Factor analysis is used to reduce the 30 variables to a smaller number of more identifiable groups of variables. Results show that students “need more regulation and voluntary participation to protect the environment”, “need conservation of water and electricity”, “are concerned for undue wastage of water”, “need visible actions to protect the environment”, “need strengthening of the public transport system”, “are a little bit ignorant about the consequences of global warming”, “want prevention of water pollution by industries”, “need changing of personal habits to protect the environment”, and “don’t have firsthand experience of global warming”. Analysis revealed that the nine factors obtained could explain about 58.5% of the variance in the attitude of secondary school students towards the environment in the city of Bikaner, India. The remaining 39.6% of the variance is attributed to other elements not explained by this analysis. A global campaign for improvement in attitude about environmental issues and its utility in daily lives may boost positive youth attitudes, with a potential worldwide impact. A cross-disciplinary approach may be developed by teaching the subject alongside other related disciplines such as science, economics, and social studies.
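
    The following is a minimal, illustrative sketch (not the authors' IBM SPSS 23 workflow) of how the number of retained factors and the cumulative variance explained can be derived from the correlation matrix of Likert-type items; the `responses` array is a hypothetical stand-in for the 30-item survey data.

```python
import numpy as np

# Hypothetical stand-in for the 30-item Likert survey (rows = students, cols = items).
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(400, 30)).astype(float)

# Correlation matrix of the items.
R = np.corrcoef(responses, rowvar=False)

# Eigen-decomposition; principal-component extraction uses these eigenvalues.
eigvals = np.linalg.eigvalsh(R)[::-1]        # sorted, largest first

# Kaiser criterion: retain factors with eigenvalue > 1.
n_factors = int(np.sum(eigvals > 1.0))

# Cumulative proportion of variance explained by the retained factors.
explained = eigvals[:n_factors].sum() / eigvals.sum()
print(f"factors retained: {n_factors}, variance explained: {explained:.1%}")
```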

  2. Evaluating Exploratory Factor Analysis: Which Initial-Extraction Techniques Provide the Best Factor Fidelity?

    Science.gov (United States)

    Buley, Jerry L.

    1995-01-01

    States that attacks by communication scholars have cast doubt on the validity of exploratory factor analysis (EFA). Tests EFA's ability to produce results that replicate known dimensions in a data set. Concludes that EFA should be viewed with cautious optimism and be evaluated according to the findings of this and similar studies. (PA)

  3. The development of human factors technologies -The development of human behaviour analysis techniques-

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Heui; Park, Keun Ok; Chun, Se Woo; Suh, Sang Moon; Park, Jae Chang [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    In order to contribute to human error reduction through studies on human-machine interaction in nuclear power plants, this project has the objectives of developing SACOM (Simulation Analyzer with a Cognitive Operator Model) and techniques for human error analysis and application. This year, we studied the following. For SACOM: (1) Site investigation of operator tasks, (2) Development of the operator task micro structure and revision of the micro structure, (3) Development of knowledge representation software and a SACOM prototype, (4) Development of performance assessment methodologies in task simulation and analysis of the effects of performance shaping factors. For human error analysis and application techniques: (1) Classification of error shaping factors (ESFs) and development of software for ESF evaluation, (2) Analysis of human error occurrences and revision of the analysis procedure, (3) Experiment for human error data collection using a compact nuclear simulator, (4) Development of a prototype database system of the analyzed information on trip cases. 55 figs, 23 tabs, 33 refs. (Author).

  4. A Performance Study of Data Mining Techniques: Multiple Linear Regression vs. Factor Analysis

    CERN Document Server

    Taneja, Abhishek

    2011-01-01

    The growing volume of data creates an interesting challenge and a need for data analysis tools that discover regularities in these data. Data mining has emerged as a discipline that contributes tools for data analysis, discovery of hidden knowledge, and autonomous decision making in many application domains. The purpose of this study is to compare the performance of two data mining techniques, namely factor analysis and multiple linear regression, for different sample sizes on three unique sets of data. The performance of the two data mining techniques is compared on the following parameters: mean square error (MSE), R-square, adjusted R-square, condition number, root mean square error (RMSE), number of variables included in the prediction model, modified coefficient of efficiency, F-value, and test of normality. These parameters have been computed using various data mining tools such as SPSS, XLstat, Stata, and MS-Excel. It is seen that for all the given data sets, factor analysis outperforms multiple linear re...
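
    A brief sketch, on assumed synthetic data, of how the comparison metrics named above (MSE, RMSE, R-square, adjusted R-square) can be computed for a multiple linear regression model and for a regression on a few latent components; this is not the authors' SPSS/XLstat/Stata workflow, and the variables `X` and `y` are invented for illustration.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.decomposition import PCA

# Hypothetical data set: 200 observations, 8 predictors, one response.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.5, size=200)

def report(y_true, y_pred, n_params):
    """MSE, RMSE, R-square and adjusted R-square for a fitted model."""
    n = len(y_true)
    mse = np.mean((y_true - y_pred) ** 2)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    adj_r2 = 1 - (1 - r2) * (n - 1) / (n - n_params - 1)
    return mse, np.sqrt(mse), r2, adj_r2

# Multiple linear regression on all predictors.
mlr = LinearRegression().fit(X, y)
print("MLR   :", report(y, mlr.predict(X), X.shape[1]))

# Latent-component model: principal-component scores used here as a simple
# proxy for factor scores, regressed against the response.
scores = PCA(n_components=3).fit_transform(X)
fa = LinearRegression().fit(scores, y)
print("PCA+LR:", report(y, fa.predict(scores), scores.shape[1]))
```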

  5. Human Error Assessment and Reduction Technique (HEART) and Human Factor Analysis and Classification System (HFACS)

    Science.gov (United States)

    Alexander, Tiffaney Miller

    2017-01-01

    Research results have shown that more than half of aviation, aerospace and aeronautics mishaps and incidents are attributed to human error. As part of Quality within space exploration ground processing operations, the underlying contributors and causes of human error must be identified and classified in order to manage human error. This presentation will provide a framework and methodology using the Human Error Assessment and Reduction Technique (HEART) and the Human Factor Analysis and Classification System (HFACS) as analysis tools to identify contributing factors, their impact on human error events, and to predict the Human Error Probabilities (HEPs) of future occurrences. This research methodology was applied (retrospectively) to six (6) NASA ground processing operations scenarios and thirty (30) years of launch vehicle related mishap data. This modifiable framework can be used and followed by other space and similarly complex operations.

  6. Classification of ECG signals using LDA with factor analysis method as feature reduction technique.

    Science.gov (United States)

    Kaur, Manpreet; Arora, A S

    2012-11-01

    The analysis of the ECG signal, especially the QRS complex as the most characteristic wave in the ECG, is a widely accepted approach to study and classify cardiac dysfunctions. In this paper, wavelet coefficients calculated for the QRS complex are first taken as features. Next, factor analysis procedures without rotation and with orthogonal rotation (varimax, equimax and quartimax) are used for feature reduction. The procedure uses the 'Principal Component Method' to estimate component loadings. Further, classification has been done with an LDA classifier. The MIT-BIH arrhythmia database is used and five types of beats (normal, PVC, paced, LBBB and RBBB) are considered for analysis. Accuracy, sensitivity and positive predictivity are the performance parameters used for comparing the feature reduction techniques. Results demonstrate that the equimax rotation method yields a maximum average accuracy of 99.056% for unknown data sets among the methods used.
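
    A hedged sketch of the pipeline described: factor-analysis feature reduction followed by an LDA classifier. The synthetic `X` stands in for wavelet coefficients of QRS complexes; note that scikit-learn's FactorAnalysis offers varimax and quartimax rotations but not equimax, so this is only an approximation of the paper's setup.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for wavelet coefficients of QRS complexes (5 beat classes).
rng = np.random.default_rng(2)
X = rng.normal(size=(500, 32))          # 32 wavelet coefficients per beat
y = rng.integers(0, 5, size=500)        # normal, PVC, paced, LBBB, RBBB

# Factor analysis (varimax rotation) as feature reduction, then an LDA classifier.
clf = make_pipeline(
    FactorAnalysis(n_components=8, rotation="varimax", random_state=0),
    LinearDiscriminantAnalysis(),
)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```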

  7. Factor analysis

    CERN Document Server

    Gorsuch, Richard L

    2013-01-01

    Comprehensive and comprehensible, this classic covers the basic and advanced topics essential for using factor analysis as a scientific tool in psychology, education, sociology, and related areas. Emphasizing the usefulness of the techniques, it presents sufficient mathematical background for understanding and sufficient discussion of applications for effective use. This includes not only theory but also the empirical evaluations of the importance of mathematical distinctions for applied scientific analysis.

  8. Statistical factor analysis technique for characterizing basalt through interpreting nuclear and electrical well logging data (case study from Southern Syria).

    Science.gov (United States)

    Asfahani, Jamal

    2014-02-01

    A factor analysis technique is proposed in this research for interpreting the combination of nuclear well logging (natural gamma ray, density and neutron-porosity) and electrical well logging (long and short normal), in order to characterize the large extended basaltic areas in southern Syria. Kodana well logging data are used for testing and applying the proposed technique. The four resulting score logs make it possible to establish the lithological score cross-section of the studied well. The established cross-section clearly shows the distribution and identification of four kinds of basalt: hard massive basalt, hard basalt, pyroclastic basalt, and the alteration product of basalt, clay. The factor analysis technique is successfully applied to the Kodana well logging data in southern Syria, and can be used efficiently when several wells and large volumes of well logging data with a high number of variables need to be interpreted.

  9. Microscopic image analysis techniques for the morphological characterization of pharmaceutical particles: influence of the software, and the factor algorithms used in the shape factor estimation.

    Science.gov (United States)

    Almeida-Prieto, Sergio; Blanco-Méndez, José; Otero-Espinar, Francisco J

    2007-11-01

    The present report highlights the difficulties of particle shape characterizations of multiparticulate systems obtained using different image analysis techniques. The report describes and discusses a number of shape factors that are widely used in pharmaceutical research. Using photographs of 16 pellets of different shapes, obtained by extrusion-spheronization, we investigated how shape factor estimates vary depending on method of calculation, and among different software packages. The results obtained indicate that the algorithms used (both for estimation of basic dimensions such as perimeter and maximum diameter, and for estimation of shape factors on the basis of these basic dimensions) have marked influences on the shape factor values obtained. These findings suggest that care is required when comparing results obtained using different image analysis programs.

  10. Appraisal and Analysis on Diversified Web Service Selection Techniques based on QoS Factors

    Directory of Open Access Journals (Sweden)

    N.Balaji

    Full Text Available Numerous monumental changes have been made in existing web service selection to provide quality of service. Quality of service is a major bottleneck in recent development. Various QoS-based web service selection techniques exist, but these techniques lack in functional and non-functional attributes. This paper addresses the following tasks: it segregates various QoS-based web service selection techniques with their respective merits and demerits, and it provides an extensive comparative study of different QoS-aware service selection techniques with respect to user requirements and multiple QoS properties and preferences. The paper also evaluates the performance of the discussed techniques based on the strength of various QoS-aware web service selection functionalities, using a set of evaluation metrics.

  11. Ideal, nonideal, and no-marker variables: The confirmatory factor analysis (CFA) marker technique works when it matters.

    Science.gov (United States)

    Williams, Larry J; O'Boyle, Ernest H

    2015-09-01

    A persistent concern in the management and applied psychology literature is the effect of common method variance on observed relations among variables. Recent work (i.e., Richardson, Simmering, & Sturman, 2009) evaluated 3 analytical approaches to controlling for common method variance, including the confirmatory factor analysis (CFA) marker technique. Their findings indicated significant problems with this technique, especially with nonideal marker variables (those with theoretical relations with substantive variables). Based on their simulation results, Richardson et al. concluded that not correcting for method variance provides more accurate estimates than using the CFA marker technique. We reexamined the effects of using marker variables in a simulation study and found that the degree of error in estimates of a substantive factor correlation was relatively small in most cases, and much smaller than the error associated with making no correction. Further, in instances in which the error was large, the correlations between the marker and substantive scales were higher than those found in organizational research with marker variables. We conclude that in most practical settings, the CFA marker technique yields parameter estimates close to their true values, and that the criticisms made by Richardson et al. are overstated. (c) 2015 APA, all rights reserved.

  12. FACTORS & ELEMENTAL ANALYSIS OF SIX THINKING HATS TECHNIQUE USING ABCD FRAMEWORK

    OpenAIRE

    Dr. P. S. Aithal; V. T. Shailashree; Dr. P. M. Suresh Kumar

    2017-01-01

    De Bono's Six Thinking Hats technique suggests different types of thinking corresponding to six thinking roles for the analyst, associated with hats of six different colors. The technique correlates different thinking styles used in a systematic problem solving procedure with different coloured hats. Alternately, by conceptualizing each type of hat, the person focuses on the style of thinking associated with each colour so that the problem can be analysed from different angles and frame of re...

  13. Gamma self-shielding correction factors calculation for aqueous bulk sample analysis by PGNAA technique

    Energy Technology Data Exchange (ETDEWEB)

    Nasrabadi, M.N. [Department of Nuclear Engineering, Faculty of Modern Sciences and Technologies, University of Isfahan, Isfahan 81746-73441 (Iran, Islamic Republic of)], E-mail: mnnasrabadi@ast.ui.ac.ir; Mohammadi, A. [Department of Physics, Payame Noor University (PNU), Kohandej, Isfahan (Iran, Islamic Republic of); Jalali, M. [Isfahan Nuclear Science and Technology Research Institute (NSTRT), Reactor and Accelerators Research and Development School, Atomic Energy Organization of Iran (Iran, Islamic Republic of)

    2009-07-15

    In this paper bulk sample prompt gamma neutron activation analysis (BSPGNAA) was applied to aqueous sample analysis using a relative method. For elemental analysis of an unknown bulk sample, gamma self-shielding coefficient was required. Gamma self-shielding coefficient of unknown samples was estimated by an experimental method and also by MCNP code calculation. The proposed methodology can be used for the determination of the elemental concentration of unknown aqueous samples by BSPGNAA where knowledge of the gamma self-shielding within the sample volume is required.

  14. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    National Research Council Canada - National Science Library

    Caescu Stefan Claudiu; Popescu Andrei; Ploesteanu Mara Gabriela

    2011-01-01

    .... Objectives of the Research The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization...

  15. Combined statistical analysis of vasodilation and flow curves in brachial ultrasonography: technique and its connection to cardiovascular risk factors

    Science.gov (United States)

    Boisrobert, Loic; Laclaustra, Martin; Bossa, Matias; Frangi, Andres G.; Frangi, Alejandro F.

    2005-04-01

    Clinical studies report that impaired endothelial function is associated with Cardio-Vascular Diseases (CVD) and their risk factors. One commonly used means of assessing endothelial function is Flow-Mediated Dilation (FMD). Classically, FMD is quantified using local indexes, e.g. maximum peak dilation. Although such parameters have been successfully linked to CVD risk factors and other clinical variables, this description does not consider all the information contained in the complete vasodilation curve. Moreover, the relation between the flow impulse and the vessel's vasodilation response to this stimulus, although not clearly understood, seems to be important and is not taken into account in the majority of studies. In this paper we propose a novel global parameterization for the vasodilation and flow curves of an FMD test. This parameterization uses Principal Component Analysis (PCA) to describe, independently and jointly, the variability of the flow and FMD curves. These curves are obtained using computerized techniques (based on edge detection and image registration, respectively) to analyze the ultrasound image sequences. The global description obtained through PCA yields a detailed characterization of the morphology of such curves, allowing the extraction of intuitive quantitative information about the vasodilation process and its interplay with flow changes. This parameterization is consistent with traditional measurements and, in a database of 177 subjects, seems to correlate more strongly (and with more clinical parameters) than classical measures with CVD risk factors and clinical parameters such as LDL- and HDL-cholesterol.
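
    A minimal sketch of the global PCA parameterization idea: each (here synthetic) dilation curve is reduced to a few principal-component scores that describe its morphology. The curve model and the number of components are assumptions for illustration only, not the authors' processing chain.

```python
import numpy as np
from sklearn.decomposition import PCA

# Synthetic stand-in for flow-mediated dilation curves: one row per subject,
# each column a diameter measurement over the time course of the FMD test.
rng = np.random.default_rng(3)
t = np.linspace(0, 1, 120)
curves = np.array([
    1 + a * np.exp(-((t - 0.4) ** 2) / 0.02) + rng.normal(scale=0.01, size=t.size)
    for a in rng.uniform(0.02, 0.12, size=177)
])

# Global parameterization: each curve is summarized by a few PCA scores
# describing its main modes of variation around the mean curve.
pca = PCA(n_components=3)
scores = pca.fit_transform(curves)        # per-subject morphological descriptors
print(pca.explained_variance_ratio_)      # how much variability each mode captures
# 'scores' could then be correlated with CVD risk factors such as LDL/HDL cholesterol.
```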

  16. Communication Analysis modelling techniques

    CERN Document Server

    España, Sergio; Pastor, Óscar; Ruiz, Marcela

    2012-01-01

    This report describes and illustrates several modelling techniques proposed by Communication Analysis; namely Communicative Event Diagram, Message Structures and Event Specification Templates. The Communicative Event Diagram is a business process modelling technique that adopts a communicational perspective by focusing on communicative interactions when describing the organizational work practice, instead of focusing on physical activities; at this abstraction level, we refer to business activities as communicative events. Message Structures is a technique based on structured text that allows specifying the messages associated to communicative events. Event Specification Templates are a means to organise the requirements concerning a communicative event. This report can be useful to analysts and business process modellers in general, since, according to our industrial experience, it is possible to apply many Communication Analysis concepts, guidelines and criteria to other business process modelling notation...

  17. Are factor analytical techniques used appropriately in the validation of health status questionnaires? A systematic review on the quality of factor analysis of the SF-36

    NARCIS (Netherlands)

    Vet, de H.C.W.; Adèr, H.J.; Terwee, C.B.; Pouwer, F.

    2005-01-01

    Factor analysis is widely used to evaluate whether questionnaire items can be grouped into clusters representing different dimensions of the construct under study. This review focuses on the appropriate use of factor analysis. The Medical Outcomes Study Short Form-36 (SF-36) is used as an example. A

  18. Ranking factors involved in product design using a hybrid model of Quality Function Deployment, Data Envelopment Analysis and TOPSIS technique

    Directory of Open Access Journals (Sweden)

    Davood Feiz

    2014-08-01

    Full Text Available Quality function deployment (QFD) is an extremely important quality management tool, useful in product design and development. Traditionally, QFD rates the design requirements (DRs) with respect to customer requirements (CRs), and aggregates the ratings to get relative importance scores for the DRs. An increasing number of studies emphasize the need to incorporate additional factors, such as cost and environmental impact, when calculating the relative importance of DRs. However, there are different methodologies for deriving the relative importance of DRs when several additional factors are considered. TOPSIS (technique for order preference by similarity to ideal solution) is suggested for the purpose of this research. This research proposes a new TOPSIS-based approach for considering the rating of DRs with respect to CRs and several additional factors simultaneously. The proposed method is illustrated using a step-by-step procedure, and was applied to the Sanam Electronic Company in Iran.
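
    A compact numpy sketch of the TOPSIS ranking step referred to above, with an invented decision matrix, weights and criterion directions; it is not the paper's specific QFD/DEA formulation.

```python
import numpy as np

# Hypothetical decision matrix: rows = design requirements (DRs),
# columns = criteria (e.g. customer rating, cost, environmental impact).
D = np.array([[7.0, 3.0, 5.0],
              [9.0, 6.0, 4.0],
              [4.0, 2.0, 8.0],
              [6.0, 5.0, 6.0]])
weights = np.array([0.5, 0.3, 0.2])          # relative importance of the criteria
benefit = np.array([True, False, True])      # the cost criterion is to be minimized

# 1) Vector-normalize and weight the decision matrix.
V = weights * D / np.linalg.norm(D, axis=0)

# 2) Ideal and anti-ideal solutions per criterion.
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))

# 3) Closeness coefficient: distance to anti-ideal relative to total distance.
d_plus  = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti,  axis=1)
closeness = d_minus / (d_plus + d_minus)

print("DR ranking (best first):", np.argsort(-closeness))
```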

  19. INTERNAL ENVIRONMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Caescu Stefan Claudiu

    2011-12-01

    Full Text Available Theme: The situation analysis, as a separate component of strategic planning, involves collecting and analysing relevant types of information on the components of the marketing environment and their evolution on the one hand, and on the organization's resources and capabilities on the other. Objectives of the Research: The main purpose of the study of the analysis techniques of the internal environment is to provide insight on those aspects that are of strategic importance to the organization. Literature Review: The marketing environment consists of two distinct components, the internal environment that is made up of specific variables within the organization and the external environment that is made up of variables external to the organization. Although analysing the external environment is essential for corporate success, it is not enough unless it is backed by a detailed analysis of the internal environment of the organization. The internal environment includes all elements that are endogenous to the organization, which are influenced to a great extent and totally controlled by it. The study of the internal environment must answer all resource related questions, solve all resource management issues and represents the first step in drawing up the marketing strategy. Research Methodology: The present paper accomplished a documentary study of the main techniques used for the analysis of the internal environment. Results: The literature emphasizes that the differences in performance from one organization to another depend primarily not on the differences between the fields of activity, but especially on the differences between the resources and capabilities and the ways these are capitalized on. The main methods of analysing the internal environment addressed in this paper are: the analysis of organizational resources, the performance analysis, the value chain analysis and the functional analysis. Implications: Basically such

  20. Decision Analysis Technique

    Directory of Open Access Journals (Sweden)

    Hammad Dabo Baba

    2014-01-01

    Full Text Available One of the most significant steps in building structure maintenance decisions is the physical inspection of the facility to be maintained. The physical inspection involves a cursory assessment of the structure and ratings of the identified defects based on expert evaluation. The objective of this paper is to present a novel approach to prioritizing the criticality of physical defects in a residential building system using a multi-criteria decision analysis approach. A residential building constructed in 1985 was considered in this study. Four criteria were considered in the inspection: Physical Condition of the building system (PC), Effect on Asset (EA), Effect on Occupants (EO) and Maintenance Cost (MC). The building was divided into nine systems regarded as alternatives. Expert Choice software was used to compare the importance of the criteria against the main objective, whereas a structured proforma was used to quantify the defects observed in all building systems against each criterion. The defect severity score of each building system was identified and later multiplied by the weight of the criteria, and the final hierarchy was derived. The final ranking indicates that the electrical system was considered the most critical system, with a risk value of 0.134, while the ceiling system scored the lowest risk value of 0.066. The technique is often used in prioritizing mechanical equipment for maintenance planning. However, the results of this study indicate that the technique could also be used in prioritizing building systems for maintenance planning.
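
    A small sketch of the weighted scoring described above (defect severity multiplied by criterion weight and summed into a risk value); the weights, system names and severity scores are invented, not those elicited in the study.

```python
import numpy as np

# Hypothetical criteria weights (PC, EA, EO, MC) from pairwise comparison.
weights = np.array([0.40, 0.25, 0.20, 0.15])

# Hypothetical defect-severity scores of each building system against each criterion.
systems = ["electrical", "ceiling", "roof", "plumbing"]
severity = np.array([[0.9, 0.8, 0.7, 0.6],
                     [0.3, 0.2, 0.4, 0.3],
                     [0.6, 0.7, 0.5, 0.8],
                     [0.5, 0.6, 0.6, 0.4]])

# Risk value of each system = severity scores weighted by criterion importance.
risk = severity @ weights
for name, r in sorted(zip(systems, risk), key=lambda x: -x[1]):
    print(f"{name:10s} risk = {r:.3f}")
```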

  1. Microencapsulation techniques, factors influencing encapsulation efficiency.

    Science.gov (United States)

    Jyothi, N Venkata Naga; Prasanna, P Muthu; Sakarkar, Suhas Narayan; Prabha, K Surya; Ramaiah, P Seetha; Srawan, G Y

    2010-05-01

    Microencapsulation is one of the quality preservation techniques of sensitive substances and a method for production of materials with new valuable properties. Microencapsulation is a process of enclosing micron-sized particles in a polymeric shell. There are different techniques available for the encapsulation of drug entities. The encapsulation efficiency of the microparticle or microsphere or microcapsule depends upon different factors like concentration of the polymer, solubility of polymer in solvent, rate of solvent removal, solubility of organic solvent in water, etc. The present article provides a literature review of different microencapsulation techniques and different factors influencing the encapsulation efficiency of the microencapsulation technique.

  2. Foundations of factor analysis

    CERN Document Server

    Mulaik, Stanley A

    2009-01-01

    Introduction: Factor Analysis and Structural Theories; Brief History of Factor Analysis as a Linear Model; Example of Factor Analysis. Mathematical Foundations for Factor Analysis: Introduction; Scalar Algebra; Vectors; Matrix Algebra; Determinants; Treatment of Variables as Vectors; Maxima and Minima of Functions. Composite Variables and Linear Transformations: Introduction; Composite Variables; Unweighted Composite Variables; Differentially Weighted Composites; Matrix Equations; Multi...

  3. Surface analysis the principal techniques

    CERN Document Server

    Vickerman, John C

    2009-01-01

    This completely updated and revised second edition of Surface Analysis: The Principal Techniques, deals with the characterisation and understanding of the outer layers of substrates, how they react, look and function which are all of interest to surface scientists. Within this comprehensive text, experts in each analysis area introduce the theory and practice of the principal techniques that have shown themselves to be effective in both basic research and in applied surface analysis. Examples of analysis are provided to facilitate the understanding of this topic and to show readers how they c

  4. Influence of Topographic and Hydrographic Factors on the Spatial Distribution of Leptospirosis Disease in São Paulo County, Brazil: An Approach Using Geospatial Techniques and GIS Analysis

    Science.gov (United States)

    Ferreira, M. C.; Ferreira, M. F. M.

    2016-06-01

    Leptospirosis is a zoonosis caused by bacteria of the genus Leptospira. Rodents, especially Rattus norvegicus, are the most frequent hosts of this microorganism in cities. Human transmission occurs through contact with the urine, blood or tissues of rodents, or through contact with water or mud contaminated by rodent urine. Spatial patterns of concentration of leptospirosis are related to multiple environmental and socioeconomic factors, such as housing near flooding areas, domestic garbage disposal sites and a high density of people living in slums located near river channels. We used geospatial techniques and a geographical information system (GIS) to analyse the spatial relationship between the distribution of leptospirosis cases and distance from rivers, river density in the census sector, and terrain slope, in São Paulo County, Brazil. To test this methodology we used a sample of 183 geocoded leptospirosis cases confirmed in 2007, ASTER GDEM2 data, and hydrography and census sector shapefiles. Our results showed that GIS and geospatial analysis techniques improved the mapping of the disease and made it possible to identify the spatial pattern of association between the location of cases and the spatial distribution of the environmental variables analyzed. This study also showed that leptospirosis cases may be more related to census sectors located in higher river density areas and to households situated at shorter distances from rivers. On the other hand, it was not possible to assert that terrain slope contributes significantly to the location of leptospirosis cases.

  5. Quantitative Techniques in Volumetric Analysis

    Science.gov (United States)

    Zimmerman, John; Jacobsen, Jerrold J.

    1996-12-01

    Quantitative Techniques in Volumetric Analysis is a visual library of techniques used in making volumetric measurements. This 40-minute VHS videotape is designed as a resource for introducing students to proper volumetric methods and procedures. The entire tape, or relevant segments of the tape, can also be used to review procedures used in subsequent experiments that rely on the traditional art of quantitative analysis laboratory practice. The techniques included are: quantitative transfer of a solid with a weighing spoon; quantitative transfer of a solid with a finger-held weighing bottle; quantitative transfer of a solid with a paper-strap-held bottle; quantitative transfer of a solid with a spatula; examples of common quantitative weighing errors; quantitative transfer of a solid from dish to beaker to volumetric flask; quantitative transfer of a solid from dish to volumetric flask; volumetric transfer pipet; a complete acid-base titration; and hand technique variations. The conventional view of contemporary quantitative chemical measurement tends to focus on instrumental systems, computers, and robotics. In this view, the analyst is relegated to placing standards and samples on a tray. A robotic arm delivers a sample to the analysis center, while a computer controls the analysis conditions and records the results. In spite of this, it is rare to find an analysis process that does not rely on some aspect of more traditional quantitative analysis techniques, such as careful dilution to the mark of a volumetric flask. (Figure 2: Transfer of a solid with a spatula.) Clearly, errors in a classical step will affect the quality of the final analysis. Because of this, it is still important for students to master the key elements of the traditional art of quantitative chemical analysis laboratory practice. Some aspects of chemical analysis, like careful rinsing to insure quantitative transfer, are often an automated part of an instrumental process that must be understood by the

  6. Risk factor analysis of pulmonary hemorrhage complicating CT-guided lung biopsy in coaxial and non-coaxial core biopsy techniques in 650 patients

    Energy Technology Data Exchange (ETDEWEB)

    Nour-Eldin, Nour-Eldin A., E-mail: nour410@hotmail.com [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany); Diagnostic and Interventional Radiology Department, Cairo University Hospital, Cairo (Egypt); Alsubhi, Mohammed [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany); Naguib, Nagy N. [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany); Diagnostic and Interventional Radiology Department, Alexandria University Hospital, Alexandria (Egypt); Lehnert, Thomas; Emam, Ahmed; Beeres, Martin; Bodelle, Boris; Koitka, Karen; Vogl, Thomas J.; Jacobi, Volkmar [Institute for Diagnostic and Interventional Radiology, Johan Wolfgang Goethe – University Hospital, Theodor-Stern-Kai 7, 60590 Frankfurt am Main (Germany)

    2014-10-15

    Purpose: To evaluate the risk factors involved in the development of pulmonary hemorrhage complicating CT-guided biopsy of pulmonary lesions with coaxial and non-coaxial techniques. Materials and methods: This retrospective study included CT-guided percutaneous lung biopsies in 650 consecutive patients (407 males, 243 females; mean age 54.6 years, SD: 5.2) from November 2008 to June 2013. Patients were classified according to lung biopsy technique into a coaxial group (318 lesions) and a non-coaxial group (332 lesions). Exclusion criteria for biopsy were: lesions <5 mm in diameter, uncorrectable coagulopathy, positive-pressure ventilation, severe respiratory compromise, pulmonary arterial hypertension, or refusal of the procedure. Risk factors for pulmonary hemorrhage complicating lung biopsy were classified into: (a) patient-related risk factors, (b) lesion-related risk factors and (c) technical risk factors. Radiological assessments were performed by two radiologists in consensus. The Mann–Whitney U test and Fisher's exact test were used for statistical analysis; p values <0.05 were considered statistically significant. Results: The incidence of pulmonary hemorrhage was 19.6% (65/332) in the non-coaxial group and 22.3% (71/318) in the coaxial group. The difference in incidence between the groups was statistically insignificant (p = 0.27). Hemoptysis developed in 5.4% (18/332) and 6.3% (20/318) of the non-coaxial and coaxial groups respectively. Traversing pulmonary vessels in the needle biopsy track was a significant risk factor for the development of pulmonary hemorrhage (incidence: 55.4% (36/65), p = 0.0003, in the non-coaxial group and 57.7% (41/71), p = 0.0013, in the coaxial group). Other significant risk factors included: lesions of less than 2 cm (p values of 0.01 and 0.02 in the non-coaxial and coaxial groups respectively), basal and middle zonal lesions in comparison to upper zonal lung lesions (p = 0.002 and 0.03 in the non-coaxial and coaxial groups respectively), and increased lesion

  7. Digital Fourier analysis advanced techniques

    CERN Document Server

    Kido, Ken'iti

    2015-01-01

    This textbook is a thorough, accessible introduction to advanced digital Fourier analysis for advanced undergraduate and graduate students. Assuming knowledge of the Fast Fourier Transform, this book covers advanced topics including the Hilbert transform, cepstrum analysis, and the two-dimensional Fourier transform. Saturated with clear, coherent illustrations, "Digital Fourier Analysis - Advanced Techniques" includes practice problems and thorough Appendices. As a central feature, the book includes interactive applets (available online) that mirror the illustrations. These user-friendly applets animate concepts interactively, allowing the user to experiment with the underlying mathematics. The applet source code in Visual Basic is provided online, enabling advanced students to tweak and change the programs for more sophisticated results. A complete, intuitive guide, "Digital Fourier Analysis - Advanced Techniques" is an essential reference for students in science and engineering.

  8. Triangulation of Data Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Lauri, M

    2011-10-01

    Full Text Available In psychology, as in other disciplines, the concepts of validity and reliability are considered essential to give an accurate interpretation of results. While in quantitative research the idea is well established, in qualitative research validity and reliability take on a different dimension. Researchers like Miles and Huberman (1994) and Silverman (2000, 2001) have shown how these issues are addressed in qualitative research. In this paper I propose that the same corpus of data, in this case the transcripts of focus group discussions, can be analysed using more than one data analysis technique. I refer to this idea as ‘triangulation of data analysis techniques’ and argue that such triangulation increases the reliability of the results. If the results obtained through a particular data analysis technique, for example thematic analysis, are congruent with the results obtained by analysing the same transcripts using a different technique, for example correspondence analysis, it is reasonable to argue that the analysis and interpretation of the data is valid.

  9. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  10. First course in factor analysis

    CERN Document Server

    Comrey, Andrew L

    2013-01-01

    The goal of this book is to foster a basic understanding of factor analytic techniques so that readers can use them in their own research and critically evaluate their use by other researchers. Both the underlying theory and correct application are emphasized. The theory is presented through the mathematical basis of the most common factor analytic models and several methods used in factor analysis. On the application side, considerable attention is given to the extraction problem, the rotation problem, and the interpretation of factor analytic results. Hence, readers are given a background of

  11. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to check the stress analysis technique based on 3D models, also making a comparison with the traditional technique which utilizes a model built directly in the stress analysis program. This comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity which allows a meaningful evaluation of both approaches. Three updated databases are envisaged: the database containing the idealized model obtained using ANSYS and working directly on documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database (produced at this stage) obtained with Pro/ENGINEER; and the one obtained by using ANSYS with the second database. Each of the three databases will then be used according to arising necessities. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research regarding the use of virtual reality with interactive analysis performed by the finite element method is made to show the state-of-the-art achieved in this field.

  12. Model building techniques for analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Walther, Howard P.; McDaniel, Karen Lynn; Keener, Donald; Cordova, Theresa Elena; Henry, Ronald C.; Brooks, Sean; Martin, Wilbur D.

    2009-09-01

    The practice of mechanical engineering for product development has evolved into a complex activity that requires a team of specialists for success. Sandia National Laboratories (SNL) has product engineers, mechanical designers, design engineers, manufacturing engineers, mechanical analysts and experimentalists, qualification engineers, and others that contribute through product realization teams to develop new mechanical hardware. The goal of SNL's Design Group is to change product development by enabling design teams to collaborate within a virtual model-based environment whereby analysis is used to guide design decisions. Computer-aided design (CAD) models using PTC's Pro/ENGINEER software tools are heavily relied upon in the product definition stage of parts and assemblies at SNL. The three-dimensional CAD solid model acts as the design solid model that is filled with all of the detailed design definition needed to manufacture the parts. Analysis is an important part of the product development process. The CAD design solid model (DSM) is the foundation for the creation of the analysis solid model (ASM). Creating an ASM from the DSM is currently a time-consuming effort; the turnaround time for results of a design needs to be decreased to have an impact on the overall product development. This effort can be decreased immensely through simple Pro/ENGINEER modeling techniques that come down to the way features are created in a part model. This document contains recommended modeling techniques that increase the efficiency of the creation of the ASM from the DSM.

  13. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  15. An easy guide to factor analysis

    CERN Document Server

    Kline, Paul

    2014-01-01

    Factor analysis is a statistical technique widely used in psychology and the social sciences. With the advent of powerful computers, factor analysis and other multivariate methods are now available to many more people. An Easy Guide to Factor Analysis presents and explains factor analysis as clearly and simply as possible. The author, Paul Kline, carefully defines all statistical terms and demonstrates step-by-step how to work out a simple example of principal components analysis and rotation. He further explains other methods of factor analysis, including confirmatory and path analysis, a

  16. Bayesian Exploratory Factor Analysis

    DEFF Research Database (Denmark)

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates...

  17. Prefractionation techniques in proteome analysis.

    Science.gov (United States)

    Righetti, Pier Giorgio; Castagna, Annalisa; Herbert, Ben; Reymond, Frederic; Rossier, Joël S

    2003-08-01

    The present review deals with a number of prefractionation protocols in preparation for two-dimensional map analysis, both in the field of chromatography and in the field of electrophoresis. In the first case, Fountoulaki's group has reported just about any chromatographic procedure useful as a prefractionation step, including affinity, ion-exchange, and reversed-phase resins. As a result of the various enrichment steps, several hundred new species, previously undetected in unfractionated samples, could be revealed for the first time. Electrophoretic prefractionation protocols include all those electrokinetic methodologies which are performed in free solution, essentially all relying on isoelectric focusing steps. The devices reviewed here include multichamber apparatus, such as the multicompartment electrolyzer with Immobiline membranes, Off-Gel electrophoresis in a multicup device, and the Rotofor, an instrument also based on a multichamber system but exploiting the conventional technique of carrier-ampholyte focusing. Other instruments of interest are the Octopus, a continuous-flow device for isoelectric focusing in an upward-flowing liquid curtain, and the Gradiflow, where different pI cuts are obtained by a multistep passage through two compartments buffered at different pH values. It is felt that this panoply of methods could offer a strong step forward in "mining below the tip of the iceberg" for detecting the "unseen proteome".

  18. An Exploratory Study of Critical Factors Affecting the Efficiency of Sorting Techniques (Shell, Heap and Treap)

    CERN Document Server

    Folorunso, Olusegun; Salako, Oluwatimilehin

    2012-01-01

    The efficiency of sorting techniques has a significant impact on the overall efficiency of a program. The efficiency of Shell, Heap and Treap sorting techniques in terms of both running time and memory usage was studied, experiments conducted and results subjected to factor analysis by SPSS. The study revealed the main factor affecting these sorting techniques was time taken to sort.
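
    Below is a small sketch of the kind of timing experiment described (treap omitted): shell sort implemented directly and heap sort built on the standard-library heapq module, timed on the same random data. The input size and repetition count are arbitrary choices, not the study's experimental design.

```python
import heapq
import random
import timeit

def shell_sort(a):
    """In-place shell sort using the simple n/2 gap sequence."""
    n, gap = len(a), len(a) // 2
    while gap > 0:
        for i in range(gap, n):
            tmp, j = a[i], i
            while j >= gap and a[j - gap] > tmp:
                a[j] = a[j - gap]
                j -= gap
            a[j] = tmp
        gap //= 2
    return a

def heap_sort(a):
    """Heap sort built on heapq: heapify, then pop the minimum repeatedly."""
    heapq.heapify(a)
    return [heapq.heappop(a) for _ in range(len(a))]

data = [random.random() for _ in range(20_000)]
for name, fn in [("shell", shell_sort), ("heap", heap_sort)]:
    t = timeit.timeit(lambda: fn(data.copy()), number=5)
    print(f"{name:5s} sort: {t:.3f} s for 5 runs")
```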

  19. The development of human behavior analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Lee, Yong Hee; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon; Oh, In Suk; Lee, Hyun Chul; Park, Jae Chang

    1997-07-01

    In this project, which studies man-machine interaction in Korean nuclear power plants, we developed SACOM (Simulation Analyzer with a Cognitive Operator Model), a tool for the assessment of task performance in control rooms using software simulation, and also developed human error analysis and application techniques. SACOM was developed to assess operators' physical workload, workload in information navigation at VDU workstations, and cognitive workload in procedural tasks. We developed a trip analysis system, including a procedure based on man-machine interaction analysis and a classification system. We analyzed a total of 277 trips that occurred from 1978 to 1994 to produce trip summary information, and for 79 cases induced by human errors we time-lined the man-machine interactions. The INSTEC, a database system for our analysis results, was developed. The MARSTEC, a multimedia authoring and representation system for trip information, was also developed, and techniques for human error detection in human factors experiments were established. (author). 121 refs., 38 tabs., 52 figs.

  20. The development of human factors technologies -The development of human factors experimental evaluation techniques-

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Bong Sik; Oh, In Suk; Cha, Kyung Hoh; Lee, Hyun Chul [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    In this year, we studied the following: (1) Development of operator mental workload evaluation techniques, (2) Development of a prototype for preliminary human factors experiments, (3) Suitability testing of information display on a large scale display panel, (4) Development of guidelines for VDU-based control room design, (5) Development of an integrated test facility (ITF), (6) Establishment of an eye tracking system. We obtained the following results: (1) Mental workload evaluation techniques for MMI evaluation, (2) PROTOPEX (PROTOtype for preliminary human factors experiments) for preliminary human factors experiments, (3) Usage methods of the APTEA (Analysis-Prototyping-Training-Experiment-Analysis) experiment design, (4) Design guidelines for human factors verification, (5) Detailed design requirements and a development plan for the ITF, (6) An eye movement measurement system. 38 figs, 20 tabs, 54 refs. (Author).

  1. Biomechanical Analysis of Contemporary Throwing Technique Theory

    Directory of Open Access Journals (Sweden)

    Chen Jian

    2015-01-01

    Full Text Available Based on the movement process of throwing and in order to further improve the throwing technique of our country, this paper first illustrates the main factors which affect the shot distance via the combination of movement equations and geometrical analysis. It then gives the equation of the acting force that throwing athletes have to bear during the throwing movement, and derives the speed relationship between the joints during throwing and batting based on a kinetic analysis of the throwing athletes' arms while throwing. The paper obtains the momentum relationship of the athletes' joints by means of rotational inertia analysis, and then establishes a constrained particle dynamics equation from the Lagrange equation. The obtained result shows that the momentum of throwing depends on the momentum of the athletes' wrist joints while batting.

  2. Why does minimally invasive coracoclavicular ligament reconstruction using a flip button repair technique fail? An analysis of risk factors and complications.

    Science.gov (United States)

    Schliemann, Benedikt; Roßlenbroich, Steffen B; Schneider, Kristian N; Theisen, Christina; Petersen, Wolf; Raschke, Michael J; Weimann, André

    2015-05-01

    The aim of the present study was to evaluate the risk factors for failure of coracoclavicular ligament reconstruction using a flip button repair technique and to analyse complications related to this procedure. Seventy-one patients (3 female, 68 male) underwent surgical treatment using a flip button repair technique for an acute acromioclavicular joint dislocation. The following factors and their impact on clinical and radiographic outcome were assessed: age at trauma, interval between trauma and surgery, degree of displacement (according to Rockwood's classification), coracoid button position, button migration and post-operative appearance of ossifications. Sixty-three patients were available for follow-up. The overall Constant score was 95.2 points (range 61-100 points) compared to 97 points (range 73-100 points) for the contralateral side (p = 0.05). Nine patients (14.3 %) needed surgical revision. Inappropriate positioning of the coracoid bone tunnel with subsequent button dislocation was the most frequently observed mode of failure (6 cases, 9.5 %). Button migration into the clavicle was associated with loss of reduction (p = 0.02). The patient's age at the time of trauma had a significant impact on the clinical outcome, with younger patients achieving better results (p = 0.02). The interval between trauma and surgery did not significantly affect the outcome (n.s.). Good to excellent clinical results can be achieved with the presented surgical technique. The age of the patient at trauma had a significant influence on the functional outcome. Furthermore, placement of the coracoid button centrally under the coracoid base is crucial to prevent failure. Level of evidence: IV.

  3. Linking human factors to corporate strategy with cognitive mapping techniques.

    Science.gov (United States)

    Village, Judy; Greig, Michael; Salustri, Filippo A; Neumann, W Patrick

    2012-01-01

    For human factors (HF) to avoid being considered of "side-car" status, it needs to be positioned within the organization in such a way that it affects business strategies and their implementation. Tools are needed to support this effort. This paper explores the feasibility of applying a technique from operational research called cognitive mapping to link HF to corporate strategy. Using a single case study, a cognitive map is drawn to reveal the complex relationships between human factors and achieving an organization's strategic goals. Analysis of the map for central concepts and reinforcing loops enhances understanding that can lead to discrete initiatives to facilitate the integration of HF. It is recommended that this technique be used with senior managers to understand the organization's strategic goals and enhance understanding of the potential for HF to contribute to those goals.

  4. Factor Analysis and AIC.

    Science.gov (United States)

    Akaike, Hirotugu

    1987-01-01

    The Akaike Information Criterion (AIC) was introduced to extend the method of maximum likelihood to the multimodel situation. Use of the AIC in factor analysis is interesting when it is viewed as the choice of a Bayesian model; thus, wider applications of AIC are possible. (Author/GDC)
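
    For reference, the standard definition of AIC assumed here (the usual formulation; the notation is not taken from the paper):

```latex
% Akaike Information Criterion for a fitted model with maximized likelihood
% \hat{L} and k free parameters:
\[
  \mathrm{AIC} = -2\,\ln \hat{L} + 2k .
\]
% In factor analysis, candidate models with different numbers of common factors
% are fitted by maximum likelihood and the model with the smallest AIC is retained.
```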

  5. Assessing the impact of various ensilage factors on the fermentation of grass silage using conventional culture and bacterial community analysis techniques.

    Science.gov (United States)

    McEniry, J; O'Kiely, P; Clipson, N J W; Forristal, P D; Doyle, E M

    2010-05-01

    Grass silage is an important ruminant feedstuff on farms during winter. The ensilage of grass involves a natural lactic acid bacterial fermentation under anaerobic conditions, and numerous factors can influence the outcome of preservation. The aim of this study was to investigate the effect of dry matter concentration, ensiling system, compaction and air infiltration on silage bacterial community composition. The impact of these factors was examined using conventional methods of microbial analysis and culture-independent Terminal Restriction Fragment Length Polymorphism (T-RFLP). Silage fermentation was restricted in herbage with a high dry matter concentration, and this was reflected in a shift in the bacterial population present. In contrast, ensiling system had little effect on bacterial community composition. Air infiltration, in the absence of compaction, altered silage bacterial community composition and silage pH. Dry matter concentration and the absence of compaction were the main factors affecting silage microbial community composition, and this was reflected in both the conventional culture-based and T-RFLP data. T-RFLP proved a useful tool to study the factors affecting ensilage. Apart from monitoring the presence or absence of members of the population, shifts in the relative presence of members could be monitored.

  6. The development of human factors experimental evaluation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Bong Shick; Oh, In Suk; Cha, Kyung Ho; Lee, Hyun Chul; Park, Geun Ok; Cheon, Se Woo; Suh, Sang Moon

    1997-07-01

    New human factors issues, such as the evaluation of information navigation, the consideration of operator characteristics, and operator performance assessment, related to HMI design based on VDUs are being raised. Thus, in order to solve these human factors issues, this project aims to establish experimental technologies including techniques for experimental design, experimental measurement, data collection and analysis, and to develop the ITF (Integrated Test Facility) suitable for experiments on HMI design evaluation. For the establishment of the experimental data analysis and evaluation methodologies, we developed the following: (1) a paradigm for human factors experimentation including experimental designs, procedures, and data analysis, (2) methods for the assessment of operator mental workload, (3) DAEXESS (data analysis and experiment evaluation supporting system). We also established experiment execution technologies through preliminary experiments, such as the suitability evaluation of information display on an LSDP, the evaluation of a computerized operation procedure, and an experiment on an advanced alarm system (ADIOS). Finally, we developed the ITF, including a human machine simulator, a telemetry system, an eye tracking system, an audio/video data measurement system, and a three-dimensional micro behaviour analysis system. (author). 81 refs., 68 tabs., 73 figs.

  7. Evaluation via multivariate techniques of scale factor variability in the Rietveld method applied to quantitative phase analysis with X-ray powder diffraction

    Directory of Open Access Journals (Sweden)

    Terezinha Ferreira de Oliveira

    2006-12-01

    Full Text Available The present work uses multivariate statistical analysis to establish the main sources of error in Quantitative Phase Analysis (QPA) using the Rietveld method. The quantitative determination of crystalline phases using X-ray powder diffraction is a complex measurement process whose results are influenced by several factors. Ternary mixtures of Al2O3, MgO and NiO were prepared under controlled conditions and the diffraction patterns were obtained using the Bragg-Brentano geometric arrangement. It was possible to establish four sources of critical variation: the experimental absorption and the scale factor of NiO, which is the phase with the greatest linear absorption coefficient of the ternary mixture; the instrumental characteristics, represented by mechanical errors of the goniometer and sample displacement; the other two phases (Al2O3 and MgO); and the temperature and relative humidity of the air in the laboratory. These error sources strongly impair QPA with the Rietveld method, so it becomes necessary to control them during the measurement procedure.

  8. 缺失資料在因素分析上的處理方法之研究 Missing Data Techniques for Factor Analysis

    Directory of Open Access Journals (Sweden)

    王鴻龍 Hong-Long Wan

    2012-03-01

    Full Text Available Factor analysis is frequently employed to analyze scales and questionnaires. However, when the proportion of missing data is high or the missing mechanism is not completely at random, the number of common factors extracted and the factor loadings can be biased. This study used the Taiwan Education Panel Survey (TEPS), treating its complete cases as baseline data, and constructed data sets with one to five times the original missing proportion, following the original missing structure, to assess the sensitivity of factor analysis to missing-data imputation. Four missing-data treatments were compared: the available case method (AC), the complete case method (CC), step-wise logistic regression single imputation (LR), and Monte Carlo Markov Chain (MCMC) single imputation. The results show that the higher the missing proportion, the greater the discrepancy between the covariance matrix of the constructed data set and that of the baseline. At higher missing proportions, the AC method extracts more common factors than the baseline data. For factor loadings, the error of the AC method is the most severe; the CC method is close to the two imputation methods, but its error relative to the baseline grows as the missing proportion increases. The researchers suggest that when the missing proportion reaches roughly 20%-30% or more, performing factor analysis after logistic regression imputation or MCMC imputation yields smaller errors.
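
    As an illustrative sketch only (the study itself used TEPS data and compared the AC, CC, logistic-regression and MCMC treatments), the code below contrasts complete-case deletion with a simple iterative imputation before factor analysis on synthetic data with values deleted completely at random; the data generator and the choice of scikit-learn's IterativeImputer are assumptions, not the authors' procedure.

```python
# Sketch: complete-case deletion vs. single imputation before factor analysis (synthetic data).
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
n, d, k = 400, 8, 2
X_full = rng.normal(size=(n, k)) @ rng.normal(size=(k, d)) + rng.normal(scale=0.4, size=(n, d))

X_miss = X_full.copy()
X_miss[rng.random(X_miss.shape) < 0.25] = np.nan      # roughly 25% missing completely at random

cc = X_miss[~np.isnan(X_miss).any(axis=1)]            # complete-case: drop rows with any gap
imputed = IterativeImputer(random_state=0).fit_transform(X_miss)   # single imputation

for name, data in [("baseline", X_full), ("complete-case", cc), ("imputed", imputed)]:
    fa = FactorAnalysis(n_components=k).fit(data)
    print(f"{name:13s} rows={data.shape[0]:3d} first-factor loadings:", np.round(fa.components_[0], 2))
```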

  9. Process Mapping: Tools, Techniques, & Critical Success Factors.

    Science.gov (United States)

    Kalman, Howard K.

    2002-01-01

    Explains process mapping as an analytical tool and a process intervention that performance technologists can use to improve human performance by reducing error variance. Highlights include benefits of process mapping; and critical success factors, including organizational readiness, time commitment by participants, and the availability of a…

  10. Favourable uranium-phosphate exploration trends guided by the application of statistical factor analysis technique on the aerial gamma spectrometric data in Syrian desert (Area-1), Syria

    Science.gov (United States)

    Asfahani, J.; Al-Hent, R.; Aissa, M.

    2016-02-01

    A scored lithological map comprising 10 radiometric units is established by applying a factor analysis approach to the aerial spectrometric data of Area-1, Syrian desert, which include Ur, eU, eTh, K%, eU/eTh, eU/K%, and eTh/K%. A model of four rotated factors F1, F2, F3, and F4 is adopted to represent the 234,829 measured data points in Area-1, explaining 86% of the total data variance. A geological scored pseudo-section derived from the lithological scored map is established and analyzed in order to show the possible stratigraphic and structural traps for uranium occurrences associated with phosphate deposits in the studied Area-1. The identified traps presented in this paper need detailed investigation and must be followed up and checked by ground validation and subsurface well logging, in order to locate the anomalous uranium occurrences and characterize them with greater confidence as a function of depth.
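
    The four-factor varimax model above was fitted to the airborne gamma-ray variables; as a rough, hedged sketch of that kind of workflow (with synthetic stand-in channels rather than the Area-1 measurements), the variables can be standardized and a rotated factor model fitted with a recent scikit-learn:

```python
# Sketch: four varimax-rotated factors on gamma-spectrometric variables (synthetic stand-ins).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(2)
n = 2000
eU, eTh, K = rng.gamma(2.0, 1.0, n), rng.gamma(3.0, 1.0, n), rng.gamma(2.5, 0.5, n)
Ur = 0.7 * eU + rng.normal(scale=0.2, size=n)              # crude stand-in for total count
channels = np.column_stack([Ur, eU, eTh, K, eU / eTh, eU / K, eTh / K])

Z = StandardScaler().fit_transform(channels)
fa = FactorAnalysis(n_components=4, rotation="varimax").fit(Z)   # rotation needs scikit-learn >= 0.24

loadings = fa.components_.T                                 # variables x factors
share = np.sum(loadings ** 2) / Z.shape[1]                  # rough share of standardized variance
print(np.round(loadings, 2), "approximate variance explained:", round(share, 2))
```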

  11. Favourable uranium–phosphate exploration trends guided by the application of statistical factor analysis technique on the aerial gamma spectrometric data in Syrian desert (Area-1), Syria

    Indian Academy of Sciences (India)

    J Asfahani; R Al-Hent; M Aissa

    2016-02-01

    A scored lithological map comprising 10 radiometric units is established by applying a factor analysis approach to the aerial spectrometric data of Area-1, Syrian desert, which include Ur, eU, eTh, K%, eU/eTh, eU/K%, and eTh/K%. A model of four rotated factors F1, F2, F3, and F4 is adopted to represent the 234,829 measured data points in Area-1, explaining 86% of the total data variance. A geological scored pseudo-section derived from the lithological scored map is established and analyzed in order to show the possible stratigraphic and structural traps for uranium occurrences associated with phosphate deposits in the studied Area-1. The identified traps presented in this paper need detailed investigation and must be followed up and checked by ground validation and subsurface well logging, in order to locate the anomalous uranium occurrences and characterize them with greater confidence as a function of depth.

  12. Internet-induced marketing techniques: Critical factors of viral marketing

    OpenAIRE

    Woerndl, M; Papagiannidis, S; Bourlakis, M. A.; Li, F.

    2008-01-01

    The rapid diffusion of the Internet and the emergence of various social constructs facilitated by Internet technologies are changing the drivers that define how marketing techniques are developed and refined. This paper identifies critical factors for viral marketing, an Internet-based ‘word-of-mouth’ marketing technique. Based on existing knowledge, five types of viral marketing factors that may critically influence the success of viral marketing campaigns are identified. These factors are t...

  13. Innovative Techniques Simplify Vibration Analysis

    Science.gov (United States)

    2010-01-01

    In the early years of development, Marshall Space Flight Center engineers encountered challenges related to components in the space shuttle main engine. To assess the problems, they evaluated the effects of vibration and oscillation. To enhance the method of vibration signal analysis, Marshall awarded Small Business Innovation Research (SBIR) contracts to AI Signal Research, Inc. (ASRI), in Huntsville, Alabama. ASRI developed a software package called PC-SIGNAL that NASA now employs on a daily basis, and in 2009, the PKP-Module won Marshall's Software of the Year award. The technology is also used in many industries: aircraft and helicopter, rocket engine manufacturing, transportation, and nuclear power.

  14. Comparing Techniques for Certified Static Analysis

    Science.gov (United States)

    Cachera, David; Pichardie, David

    2009-01-01

    A certified static analysis is an analysis whose semantic validity has been formally proved correct with a proof assistant. The recent increasing interest in using proof assistants for mechanizing programming language metatheory has given rise to several approaches for certification of static analysis. We propose a panorama of these techniques and compare their respective strengths and weaknesses.

  15. Techniques and Applications of Urban Data Analysis

    KAUST Repository

    AlHalawani, Sawsan N.

    2016-05-26

    Digitization and characterization of urban spaces are essential components as we move to an ever-growing ’always connected’ world. Accurate analysis of such digital urban spaces has become more important as we continue to get spatial and social context-aware feedback and recommendations in our daily activities. Modeling and reconstruction of urban environments have thus gained unprecedented importance in the last few years. Such analysis typically spans multiple disciplines, such as computer graphics and computer vision, as well as architecture, geoscience, and remote sensing. Reconstructing an urban environment usually requires an entire pipeline consisting of different tasks. In such a pipeline, data analysis plays a strong role in acquiring meaningful insights from the raw data. This dissertation primarily focuses on the analysis of various forms of urban data and proposes a set of techniques to extract useful information, which is then used for different applications. The first part of this dissertation presents a semi-automatic framework to analyze facade images to recover individual windows along with their functional configurations such as open or (partially) closed states. The main advantage of recovering both the repetition patterns of windows and their individual deformation parameters is to produce a factored facade representation. Such a factored representation enables a range of applications including interactive facade images, improved multi-view stereo reconstruction, facade-level change detection, and novel image editing possibilities. The second part of this dissertation demonstrates the effect of a layout configuration on its performance. As a specific application scenario, I investigate the interior layout of warehouses wherein the goal is to assign items to their storage locations while reducing flow congestion and enhancing the speed of order picking processes. The third part of the dissertation proposes a method to classify cities

  16. TV content analysis techniques and applications

    CERN Document Server

    Kompatsiaris, Yiannis

    2012-01-01

    The rapid advancement of digital multimedia technologies has not only revolutionized the production and distribution of audiovisual content, but also created the need to efficiently analyze TV programs to enable applications for content managers and consumers. Leaving no stone unturned, TV Content Analysis: Techniques and Applications provides a detailed exploration of TV program analysis techniques. Leading researchers and academics from around the world supply scientifically sound treatment of recent developments across the related subject areas--including systems, architectures, algorithms,

  17. Eventraciones post-trasplante renal: análisis de factores de riesgo y técnica quirúrgica / Post kidney transplantation incisional hernia: risk factor analysis and surgical repair techniques

    Directory of Open Access Journals (Sweden)

    Santa Maria Victoria

    2015-11-01

    Full Text Available Several factors increase the risk of incisional hernia after kidney transplantation, and different surgical techniques exist for solving this problem. A retrospective study was performed analyzing the risk factors for developing an incisional hernia, and the hernia repairs carried out, in the kidney transplants performed between 2006-2013. The incidence of hernias was 12.7%. All elements studied were statistically independent of the appearance of hernias, probably because they exert their influence in combination rather than each separately. Consistent with the literature, patients in whom no mesh or an absorbable mesh was used had a recurrence rate of 100%. The risk factors that influence the development of post-transplant hernias require further study because of the contradictions that arise from the literature. Post-transplant renal ventral hernia repair is safe and effective provided it is carried out with non-resorbable mesh. The risk of post-surgical infections does not appear to be affected by the use of mesh when the necessary precautions are taken, nor does mesh use change the prognosis.

  18. Classification Techniques for Multivariate Data Analysis.

    Science.gov (United States)

    1980-03-28

    analysis among biologists, botanists, and ecologists, while some social scientists may prefer "typology". Other frequently encountered terms are pattern... the determinantal equation |B - λW| = 0 (42). The solutions λi are the eigenvalues of the matrix W⁻¹B, as in discriminant analysis. There are t non... Statistical Package for the Social Sciences (SPSS) (14) subprogram FACTOR was used for the principal components analysis. It is designed both for the factor

  19. INVERSE FILTERING TECHNIQUES IN SPEECH ANALYSIS

    African Journals Online (AJOL)

    Dr Obe

    most convenient example, have been devised for obtaining waveforms related ... computer to speech analysis led to important elaborations of ... techniques of the fast Fourier transform (FFT) and analysis by ... the first three formants F1, F2, F3 to be made. Using the ... introduced and demonstrated to be a powerful tool for the ...

  20. Applications of electrochemical techniques in mineral analysis.

    Science.gov (United States)

    Niu, Yusheng; Sun, Fengyue; Xu, Yuanhong; Cong, Zhichao; Wang, Erkang

    2014-09-01

    This review, covering reports published in the recent decade from 2004 to 2013, shows how electrochemical (EC) techniques such as voltammetry, electrochemical impedance spectroscopy, potentiometry, coulometry, etc., have made significant contributions to the analysis of minerals such as clay, sulfide, oxide, and oxysalt. The discussion is organized both by the type of EC technique used and by the kind of mineral analyzed. Furthermore, minerals as electrode modification materials for EC analysis have also been summarized. Accordingly, research vacancies and future development trends in these areas are discussed.

  1. Factor analysis of multivariate data

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, A.A.; Mahadevan, R.

    A brief introduction to factor analysis is presented. A FORTRAN program, which can perform the Q-mode and R-mode factor analysis and the singular value decomposition of a given data matrix, is presented in Appendix B. This computer program uses...
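
    The FORTRAN program itself sits in the paper's Appendix B and is not reproduced in the record; as a hedged modern sketch of the same idea, a single SVD of the column-centred data matrix yields R-mode loadings from the right singular vectors and Q-mode scores from the left singular vectors (the scaling conventions below are one common choice, not necessarily the program's):

```python
# Sketch: R-mode and Q-mode views of a data matrix from one SVD (assumed conventions).
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=(50, 6))                 # 50 samples (rows) x 6 variables (columns)
Xc = X - X.mean(axis=0)                      # centre the columns, as in R-mode analysis

U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

k = 2                                        # number of retained factors
r_mode_loadings = Vt[:k].T * s[:k]           # variable loadings (R-mode)
q_mode_scores = U[:, :k] * s[:k]             # sample scores (Q-mode view of the same decomposition)
print(r_mode_loadings.shape, q_mode_scores.shape)
```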

  2. Factor Analysis of Intern Effectiveness

    Science.gov (United States)

    Womack, Sid T.; Hannah, Shellie Louise; Bell, Columbus David

    2012-01-01

    Four factors in teaching intern effectiveness, as measured by a Praxis III-similar instrument, were found among observational data of teaching interns during the 2010 spring semester. Those factors were lesson planning, teacher/student reflection, fairness & safe environment, and professionalism/efficacy. This factor analysis was as much of a…

  3. Factor analysis and missing data

    NARCIS (Netherlands)

    Kamakura, WA; Wedel, M

    2000-01-01

    The authors study the estimation of factor models and the imputation of missing data and propose an approach that provides direct estimates of factor weights without the replacement of missing data with imputed values. First, the approach is useful in applications of factor analysis in the presence

  4. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements

    Energy Technology Data Exchange (ETDEWEB)

    Pal, Sandip, E-mail: sup252@PSU.EDU

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture and trace gases between the earth's surface and the lower part of the troposphere. The turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation, and the dispersion of health-hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using the high-resolution lidar-derived profiles of q variance, third-order moment, and skewness and analyzing concurrent profiles of vertical velocity, potential temperature, horizontal wind and time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land, so that our understanding of the processes governing CBL q turbulence can be improved. Finally, principal component analyses will be applied to the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing the CBL turbulence features. - Highlights: • Lidar-based study of CBL turbulence features • Water vapor and aerosol turbulence profiles • Processes governing boundary layer turbulence profiles using lidars.

  5. On the factors governing water vapor turbulence mixing in the convective boundary layer over land: Concept and data analysis technique using ground-based lidar measurements.

    Science.gov (United States)

    Pal, Sandip

    2016-06-01

    The convective boundary layer (CBL) turbulence is the key process for exchanging heat, momentum, moisture and trace gases between the earth's surface and the lower part of the troposphere. The turbulence parameterization of the CBL is a challenging but important component in numerical models. In particular, correct estimation of CBL turbulence features, parameterization, and the determination of the contribution of eddy diffusivity are important for simulating convection initiation, and the dispersion of health-hazardous air pollutants and greenhouse gases. In general, measurements of higher-order moments of water vapor mixing ratio (q) variability yield unique estimates of turbulence in the CBL. Using the high-resolution lidar-derived profiles of q variance, third-order moment, and skewness and analyzing concurrent profiles of vertical velocity, potential temperature, horizontal wind and time series of near-surface measurements of surface flux and meteorological parameters, a conceptual framework based on a bottom-up approach is proposed here for the first time for a robust characterization of the turbulent structure of the CBL over land, so that our understanding of the processes governing CBL q turbulence can be improved. Finally, principal component analyses will be applied to the lidar-derived long-term data sets of q turbulence statistics to identify the meteorological factors and the dominant physical mechanisms governing the CBL turbulence features.

  6. Alternative Analysis Techniques for Needs and Needs Documentation Techniques,

    Science.gov (United States)

    1980-06-20

    Have you previously participated in a brainwriting session? a. Yes b. No 9. Have you previously participated in the Nominal Group Technique process... brainstorming technique for future sessions. (Strongly disagree to Strongly agree) 8. It was easy to present my views using the brainwriting technique... (Strongly disagree to Strongly agree) 9. I was satisfied with the brainwriting technique. (Strongly disagree to Strongly agree) 10. I recommend using

  7. Analysis of Gopher Tortoise Population Estimation Techniques

    Science.gov (United States)

    2005-10-01

    terrestrial reptile that was once found throughout the southeastern United States from North Carolina into Texas. However, due to numerous factors... et al. 2000, Waddle 2000). Solar energy is used for thermoregulation and egg incubation. Also, tortoises are grazers (Garner and Landers 1981... Evaluation and review of field techniques used to study and manage gopher tortoises.” Pages 205-215 in Management of amphibians, reptiles, and small mammals

  8. Identification of noise in linear data sets by factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roscoe, B.A.; Hopke, Ph.K. (Illinois Univ., Urbana (USA))

    1982-01-01

    A technique which has the ability to identify bad data points, after the data has been generated, is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it in order to confidently isolate errors.
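
    In the spirit of the record above, though not its original implementation, one hedged way to scan a large table for bad points is to fit a low-rank factor or principal component model and flag rows whose reconstruction residuals are unusually large; the data, the rank, and the threshold below are all illustrative assumptions.

```python
# Sketch: flagging suspect data points by reconstruction error from a low-rank model.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 2)) @ rng.normal(size=(2, 12)) + rng.normal(scale=0.1, size=(300, 12))
X[7, 3] += 15.0                               # inject one gross error for illustration

pca = PCA(n_components=2).fit(X)
residual = X - pca.inverse_transform(pca.transform(X))
score = np.abs(residual).max(axis=1)          # worst per-row deviation from the two-factor model

threshold = score.mean() + 4 * score.std()
print("suspect rows:", np.where(score > threshold)[0])
```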

  9. PHOTOGRAMMETRIC TECHNIQUES FOR ROAD SURFACE ANALYSIS

    Directory of Open Access Journals (Sweden)

    V. A. Knyaz

    2016-06-01

    Full Text Available The quality and condition of a road surface is of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and the monitoring of existing roads are therefore widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis can vary greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring the pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  10. Photogrammetric Techniques for Road Surface Analysis

    Science.gov (United States)

    Knyaz, V. A.; Chibunichev, A. G.

    2016-06-01

    The quality and condition of a road surface is of great importance for the convenience and safety of driving. Investigations of the behaviour of road materials under laboratory conditions and the monitoring of existing roads are therefore widely carried out for controlling geometric parameters and detecting defects in the road surface. Photogrammetry, as an accurate non-contact measuring method, provides powerful means for solving different tasks in road surface reconstruction and analysis. The range of dimensions concerned in road surface analysis can vary greatly, from tenths of a millimetre to hundreds of metres and more, so a set of techniques is needed to meet all requirements of road parameter estimation. Two photogrammetric techniques for road surface analysis are presented: one for accurate measurement of road pavement and one for road surface reconstruction based on imagery obtained from an unmanned aerial vehicle. The first technique uses a photogrammetric system based on structured light for fast and accurate 3D surface reconstruction, and it allows analysing the characteristics of road texture and monitoring the pavement behaviour. The second technique provides a dense 3D road model suitable for estimating road macro parameters.

  11. A Beginner’s Guide to Factor Analysis: Focusing on Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    An Gie Yong

    2013-10-01

    Full Text Available The following paper discusses exploratory factor analysis and gives an overview of the statistical technique and how it is used in various research designs and applications. A basic outline of how the technique works and its criteria, including its main assumptions are discussed as well as when it should be used. Mathematical theories are explored to enlighten students on how exploratory factor analysis works, an example of how to run an exploratory factor analysis on SPSS is given, and finally a section on how to write up the results is provided. This will allow readers to develop a better understanding of when to employ factor analysis and how to interpret the tables and graphs in the output.
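
    The guide walks through running an EFA in SPSS; as a non-authoritative companion sketch (with simulated item responses rather than real questionnaire data), the usual preliminary steps, namely the item correlation matrix, its eigenvalues and Kaiser's eigenvalue-greater-than-one rule, can be computed directly:

```python
# Sketch: preliminary EFA steps on simulated items (correlations, eigenvalues, Kaiser rule).
import numpy as np

rng = np.random.default_rng(5)
n_resp, n_items, n_factors = 600, 12, 3
scores = rng.normal(size=(n_resp, n_factors))
pattern = rng.uniform(0.4, 0.9, size=(n_items, n_factors)) * (rng.random((n_items, n_factors)) < 0.4)
items = scores @ pattern.T + rng.normal(scale=1.0, size=(n_resp, n_items))

R = np.corrcoef(items, rowvar=False)          # item correlation matrix
eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]

retained = int((eigvals > 1.0).sum())         # Kaiser criterion: keep factors with eigenvalue > 1
print("eigenvalues:", np.round(eigvals, 2))
print("factors retained by Kaiser's rule:", retained)
```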

  12. Parallelization of events generation for data analysis techniques

    CERN Document Server

    Lazzaro, A

    2010-01-01

    With the startup of the LHC experiments at CERN, the involved community is now focusing on the analysis of the collected data. The complexity of the data analyses will be a key factor for finding eventual new phenomena. For such a reason many data analysis tools have been developed in the last several years, which implement several data analysis techniques. Goal of these techniques is the possibility of discriminating events of interest and measuring parameters on a given input sample of events, which are themselves defined by several variables. Also particularly important is the possibility of repeating the determination of the parameters by applying the procedure on several simulated samples, which are generated using Monte Carlo techniques and the knowledge of the probability density functions of the input variables. This procedure achieves a better estimation of the results. Depending on the number of variables, complexity of their probability density functions, number of events, and number of sample to g...
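
    The abstract describes repeating a fit on many Monte Carlo pseudo-samples; a generic, hedged sketch of parallelizing that embarrassingly parallel loop with Python's multiprocessing (not the authors' actual framework or physics model) is:

```python
# Sketch: generating and fitting toy Monte Carlo pseudo-samples in parallel (generic illustration).
import numpy as np
from multiprocessing import Pool

def one_toy(seed, n_events=10_000, true_mean=1.5):
    """Generate one pseudo-sample from an exponential PDF and estimate its parameter."""
    rng = np.random.default_rng(seed)
    sample = rng.exponential(scale=true_mean, size=n_events)
    return sample.mean()                      # maximum-likelihood estimate for an exponential

if __name__ == "__main__":
    n_toys = 200
    with Pool() as pool:
        estimates = pool.map(one_toy, range(n_toys))
    print("mean of estimates:", np.mean(estimates), "spread over toys:", np.std(estimates))
```

    The spread of the estimates over the toys approximates the statistical uncertainty of the fitted parameter, which is the point of repeating the determination on many simulated samples.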

  13. TECHNIQUES AND FACTORS CONTRIBUTING TO DEVELOPING CRITICAL THINKING SKILLS

    Directory of Open Access Journals (Sweden)

    Irina Vladimirovna Glukhova

    2015-01-01

    Full Text Available The paper deals with the development and introduction, in the educational process of higher education institutions, of an innovative technology for developing the critical thinking skills of future specialists. The research is aimed at revealing the factors that promote the formation of students' critical thinking in higher education, and at the search for strategies and techniques that actualize the creative abilities of students and help to form an active, independent personality. The author argues that it is necessary to set up a creative educational environment and to establish a positive dialogue between teacher and trainee in order to educate such a person and to develop the abilities of objective reflection, interpretation of phenomena, formulation of adequate conclusions, and well-founded evaluation. Methods. The methods involve the analysis of the philosophical, psychological-pedagogical and methodical literature and of scientific periodical publications; generalisation of Russian and foreign experience; classification and arrangement of the considered issues; and observation. Results. Current approaches to the rendering of critical thinking and the problem of its formation in the scientific literature are considered; the concept of the «creative educational environment» is specified; and ways of increasing the efficiency of the educational process are shown. Scientific novelty. A complex of procedures and conditions promoting the effective development of critical thinking skills is theoretically substantiated on the basis of the analysis of various information sources. Practical significance. The research outcomes and the recommended methods of critical thinking skills formation can be useful for professors and lecturers of higher education institutions to optimize subject matter selection, techniques and methods of education under the conditions of a dynamically updated educational process.

  14. Attitude Exploration Using Factor Analysis Technique

    OpenAIRE

    2016-01-01

    Attitude is a psychological variable that contains positive or negative evaluation about people or an environment. The growing generation possesses learning skills, so if positive attitude is inculcated at the right age, it might therefore become habitual. Students in the age group 14-20 years from the city of Bikaner, India, are the target population for this study. An inventory of 30Likert-type scale statements was prepared in order to measure attitude towards the environment and matters re...

  15. A Comparative Analysis of Biomarker Selection Techniques

    Directory of Open Access Journals (Sweden)

    Nicoletta Dessì

    2013-01-01

    Full Text Available Feature selection has become the essential step in biomarker discovery from high-dimensional genomics data. It is recognized that different feature selection techniques may result in different sets of biomarkers, that is, different groups of genes highly correlated to a given pathological condition, but few direct comparisons exist which quantify these differences in a systematic way. In this paper, we propose a general methodology for comparing the outcomes of different selection techniques in the context of biomarker discovery. The comparison is carried out along two dimensions: (i) measuring the similarity/dissimilarity of selected gene sets; (ii) evaluating the implications of these differences in terms of both predictive performance and stability of selected gene sets. As a case study, we considered three benchmarks deriving from DNA microarray experiments and conducted a comparative analysis among eight selection methods, representatives of different classes of feature selection techniques. Our results show that the proposed approach can provide useful insight about the pattern of agreement of biomarker discovery techniques.
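
    A minimal, hedged sketch of the paper's first comparison dimension, the similarity of the gene sets chosen by two selection techniques, can use the Jaccard index over the selected indices; the synthetic data and the two univariate selectors below are illustrative choices, not the microarray benchmarks or the eight methods used in the paper.

```python
# Sketch: Jaccard similarity of feature sets chosen by two selection techniques (synthetic data).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, mutual_info_classif

X, y = make_classification(n_samples=200, n_features=500, n_informative=20, random_state=0)

def top_k(score_func, k=30):
    selector = SelectKBest(score_func=score_func, k=k).fit(X, y)
    return set(np.flatnonzero(selector.get_support()))

set_anova = top_k(f_classif)
set_mi = top_k(mutual_info_classif)

jaccard = len(set_anova & set_mi) / len(set_anova | set_mi)
print("Jaccard similarity of the two 30-feature signatures:", round(jaccard, 2))
```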

  16. UPLC: a preeminent technique in pharmaceutical analysis.

    Science.gov (United States)

    Kumar, Ashok; Saini, Gautam; Nair, Anroop; Sharma, Rishbha

    2012-01-01

    The pharmaceutical companies today are driven to create novel and more efficient tools to discover, develop, deliver and monitor drugs. In this context, the development of rapid chromatographic methods is crucial for analytical laboratories. In the past decade, substantial technological advances have been made in enhancing particle chemistry performance, improving detector design and optimizing the system, data processors and various controls of chromatographic techniques. When all were blended together, the result was the outstanding performance of ultra-high performance liquid chromatography (UPLC), which builds on the principles of the HPLC technique. UPLC shows a dramatic enhancement in speed, resolution and sensitivity of analysis by using particles smaller than 2 μm, while the system is operated at higher pressure and the mobile phase runs at greater linear velocities as compared to HPLC. This technique is considered a new focal point in the field of liquid chromatography. This review focuses on the basic principle and instrumentation of UPLC and its advantages over HPLC; furthermore, this article emphasizes various pharmaceutical applications of this technique.

  17. Flash Infrared Thermography Contrast Data Analysis Technique

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    This paper provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography inspection infrared video data. The analysis calculates thermal measurement features from the contrast evolution. In addition, simulation of the contrast evolution is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The measurement features and the contrast simulation are used to evaluate flash thermography data in order to characterize delamination-like anomalies. The thermal measurement features relate to the anomaly characteristics. The contrast evolution simulation is matched to the measured contrast evolution over an anomaly to provide an assessment of the anomaly depth and width which correspond to the depth and diameter of the equivalent flat-bottom hole (EFBH) similar to that used as input to the simulation. A similar analysis, in terms of diameter and depth of an equivalent uniform gap (EUG) providing a best match with the measured contrast evolution, is also provided. An edge detection technique called the half-max is used to measure width and length of the anomaly. Results of the half-max width and the EFBH/EUG diameter are compared to evaluate the anomaly. The information provided here is geared towards explaining the IR Contrast technique. Results from a limited amount of validation data on reinforced carbon-carbon (RCC) hardware are included in this paper.

  18. Comparative Analysis of Hand Gesture Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Arpana K. Patel

    2015-03-01

    Full Text Available During the past few years, human hand gestures for interaction with computing devices have continued to be an active area of research. In this paper, a survey of hand gesture recognition is provided. Hand gesture recognition comprises three stages: pre-processing, feature extraction or matching, and classification or recognition. Each stage involves different methods and techniques. This paper gives a short description of the different methods used for hand gesture recognition in existing systems, with a comparative analysis of all methods together with their benefits and drawbacks.

  19. COSIMA data analysis using multivariate techniques

    Directory of Open Access Journals (Sweden)

    J. Silén

    2014-08-01

    Full Text Available We describe how to use multivariate analysis of complex TOF-SIMS spectra, introducing the method of random projections. The technique allows us to do full clustering and classification of the measured mass spectra. In this paper we use the tool for classification purposes. The presentation describes calibration experiments of 19 minerals on Ag and Au substrates using positive mode ion spectra. The discrimination between individual minerals gives a cross-validation Cohen κ for classification of typically about 80%. We intend to use the method as a fast tool to deduce a qualitative similarity of measurements.
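
    As a simplified, hedged sketch of the random-projection idea described above (not the COSIMA pipeline; the "spectra" are synthetic), high-dimensional mass spectra can be projected to a much lower dimension and then classified, for example with a nearest-centroid rule under cross-validation:

```python
# Sketch: random projection of high-dimensional spectra followed by simple classification.
import numpy as np
from sklearn.random_projection import GaussianRandomProjection
from sklearn.neighbors import NearestCentroid
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
n_classes, per_class, n_channels = 5, 40, 3000
templates = rng.random((n_classes, n_channels))
X = np.vstack([t + rng.normal(scale=0.3, size=(per_class, n_channels)) for t in templates])
y = np.repeat(np.arange(n_classes), per_class)

X_low = GaussianRandomProjection(n_components=50, random_state=0).fit_transform(X)
scores = cross_val_score(NearestCentroid(), X_low, y, cv=5)
print("cross-validated accuracy after projection:", scores.mean().round(2))
```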

  20. Internet-induced marketing techniques: Critical factors of viral marketing

    Directory of Open Access Journals (Sweden)

    Woerndl, M.

    2008-01-01

    Full Text Available The rapid diffusion of the Internet and the emergence of various social constructs facilitated by Internet technologies are changing the drivers that define how marketing techniques are developed and refined. This paper identifies critical factors for viral marketing, an Internet-based ‘word-of-mouth’ marketing technique. Based on existing knowledge, five types of viral marketing factors that may critically influence the success of viral marketing campaigns are identified. These factors are the overall structure of the campaign, the characteristics of the product or service, the content of the message, the characteristics of the diffusion and, the peer-to-peer information conduit. The paper discusses three examples of viral marketing campaigns and identifies the specific factors in each case that influence its success. The paper concludes with a viral marketing typology differentiating between viral marketing communications, unintended viral marketing and commercial viral marketing. This is still a rapidly evolving area and further research is clearly needed to monitor new developments and make sense of the radical changes these developments bring to the market.

  1. Data analysis techniques for gravitational wave observations

    Indian Academy of Sciences (India)

    S V Dhurandhar

    2004-10-01

    Astrophysical sources of gravitational waves fall broadly into three categories: (i) transient and bursts, (ii) periodic or continuous wave and (iii) stochastic. Each type of source requires a different type of data analysis strategy. In this talk various data analysis strategies will be reviewed. Optimal filtering is used for extracting binary inspirals; Fourier transforms over Doppler shifted time intervals are computed for long duration periodic sources; optimally weighted cross-correlations for stochastic background. Some recent schemes which efficiently search for inspirals will be described. The performance of some of these techniques on real data obtained will be discussed. Finally, some results on cancellation of systematic noises in laser interferometric space antenna (LISA) will be presented and future directions indicated.
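
    As a rough illustration of the optimal (matched) filtering mentioned for binary inspirals, the toy sketch below correlates a made-up template against noisy data to recover the injection time; it is not a LIGO or LISA analysis, and the template, sampling rate and injection amplitude are all assumptions.

```python
# Sketch: time-domain matched filtering of a known template buried in noise (toy example).
import numpy as np

rng = np.random.default_rng(7)
fs, duration = 1024, 4.0
t = np.arange(0, duration, 1 / fs)

template = np.sin(2 * np.pi * 60 * t[:256]) * np.hanning(256)    # short windowed tone as a stand-in
data = rng.normal(scale=1.0, size=t.size)
true_start = 2000
data[true_start:true_start + template.size] += template          # inject the signal

# Slide the template across the data (cross-correlation acts as the matched filter here).
snr_series = np.correlate(data, template, mode="valid")
print("injected at sample", true_start, "- peak found at sample", int(np.argmax(snr_series)))
```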

  2. Application of Electromigration Techniques in Environmental Analysis

    Science.gov (United States)

    Bald, Edward; Kubalczyk, Paweł; Studzińska, Sylwia; Dziubakiewicz, Ewelina; Buszewski, Bogusław

    Inherently trace-level concentration of pollutants in the environment, together with the complexity of sample matrices, places a strong demand on the detection capabilities of electromigration methods. Significant progress is continually being made, widening the applicability of these techniques, mostly capillary zone electrophoresis, micellar electrokinetic chromatography, and capillary electrochromatography, to the analysis of real-world environmental samples, including the concentration sensitivity and robustness of the developed analytical procedures. This chapter covers the recent major developments in the domain of capillary electrophoresis analysis of environmental samples for pesticides, polycyclic aromatic hydrocarbons, phenols, amines, carboxylic acids, explosives, pharmaceuticals, and ionic liquids. Emphasis is placed on pre-capillary and on-capillary chromatography and electrophoresis-based concentration of analytes and detection improvement.

  3. A numerical comparison of sensitivity analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hamby, D.M.

    1993-12-31

    Engineering and scientific phenomena are often studied with the aid of mathematical models designed to simulate complex physical processes. In the nuclear industry, modeling the movement and consequence of radioactive pollutants is extremely important for environmental protection and facility control. One of the steps in model development is the determination of the parameters most influential on model results. A "sensitivity analysis" of these parameters is not only critical to model validation but also serves to guide future research. A previous manuscript (Hamby) detailed many of the available methods for conducting sensitivity analyses. The current paper is a comparative assessment of several methods for estimating relative parameter sensitivity. Method practicality is based on calculational ease and usefulness of the results. It is the intent of this report to demonstrate calculational rigor and to compare parameter sensitivity rankings resulting from various sensitivity analysis techniques. An atmospheric tritium dosimetry model (Hamby) is used here as an example, but the techniques described can be applied to many different modeling problems. Other investigators (Rose; Dalrymple and Broyd) present comparisons of sensitivity analyses methodologies, but none as comprehensive as the current work.
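
    As a small, hedged illustration of comparing sensitivity measures (on a toy model, not the tritium dosimetry model discussed in the record), one can rank parameters both by rank correlation against Monte Carlo samples and by a one-at-a-time perturbation, and check whether the two rankings agree; SciPy is assumed to be available.

```python
# Sketch: two simple sensitivity measures on the toy model y = 3*x1 + 2*x2**2 + 0.1*x3.
import numpy as np
from scipy.stats import spearmanr

def model(x1, x2, x3):
    return 3.0 * x1 + 2.0 * x2 ** 2 + 0.1 * x3

rng = np.random.default_rng(8)
n = 5000
samples = {name: rng.uniform(0.5, 1.5, n) for name in ("x1", "x2", "x3")}
y = model(samples["x1"], samples["x2"], samples["x3"])

# (1) Sampling-based measure: rank correlation between each input and the output.
rank_sens = {name: abs(spearmanr(vals, y)[0]) for name, vals in samples.items()}

# (2) One-at-a-time measure: relative change in y for a +10% bump of each input at nominal values.
nominal = dict(x1=1.0, x2=1.0, x3=1.0)
y0 = model(**nominal)
oat_sens = {name: abs(model(**dict(nominal, **{name: 1.1})) - y0) / y0 for name in nominal}

print("rank-correlation ranking:", sorted(rank_sens, key=rank_sens.get, reverse=True))
print("one-at-a-time ranking:   ", sorted(oat_sens, key=oat_sens.get, reverse=True))
```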

  4. A technique for human error analysis (ATHEANA)

    Energy Technology Data Exchange (ETDEWEB)

    Cooper, S.E.; Ramey-Smith, A.M.; Wreathall, J.; Parry, G.W. [and others

    1996-05-01

    Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

  5. An SPSS R-Menu for Ordinal Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mario Basto

    2012-01-01

    Full Text Available Exploratory factor analysis is a widely used statistical technique in the social sciences. It attempts to identify underlying factors that explain the pattern of correlations within a set of observed variables. A statistical software package is needed to perform the calculations. However, there are some limitations with popular statistical software packages, like SPSS. The R programming language is a free software package for statistical and graphical computing. It offers many packages written by contributors from all over the world and programming resources that allow it to overcome the dialog limitations of SPSS. This paper offers an SPSS dialog written in the R programming language with the help of some packages, so that researchers with little or no knowledge in programming, or those who are accustomed to making their calculations based on statistical dialogs, have more options when applying factor analysis to their data and hence can adopt a better approach when dealing with ordinal, Likert-type data.

  6. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    Science.gov (United States)

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  7. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    Science.gov (United States)

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  8. Transforming Rubrics Using Factor Analysis

    Science.gov (United States)

    Baryla, Ed; Shelley, Gary; Trainor, William

    2012-01-01

    Student learning and program effectiveness is often assessed using rubrics. While much time and effort may go into their creation, it is equally important to assess how effective and efficient the rubrics actually are in terms of measuring competencies over a number of criteria. This study demonstrates the use of common factor analysis to identify…

  9. Full Information Item Factor Analysis of the FCI

    Science.gov (United States)

    Hagedorn, Eric

    2010-02-01

    Traditional factor analytical methods, principal factors or principal components analysis, are inappropriate techniques for analyzing dichotomously scored responses to standardized tests or concept inventories because they lead to artifactual factors often referred to as "difficulty factors." Full information item factor analysis (Bock, Gibbons and Muraki, 1988), based on Thurstone's multiple factor model and calculated using marginal maximum likelihood estimation, is an appropriate technique for such analyses. Force Concept Inventory (Hestenes, Wells and Swackhamer, 1992) data from 1582 university students completing an introductory physics course were analyzed using the full information item factor analysis software TESTFACT v. 4. Analyzing the statistical significance of successive factors added to the model, using chi-squared statistics, led to a six-factor model interpretable in terms of the conceptual dimensions of the FCI.

  10. Cost analysis and estimating tools and techniques

    CERN Document Server

    Nussbaum, Daniel

    1990-01-01

    Changes in production processes reflect the technological advances permeating our products and services. U.S. industry is modernizing and automating. In parallel, direct labor is fading as the primary cost driver while engineering and technology related cost elements loom ever larger. Traditional, labor-based approaches to estimating costs are losing their relevance. Old methods require augmentation with new estimating tools and techniques that capture the emerging environment. This volume represents one of many responses to this challenge by the cost analysis profession. The Institute of Cost Analysis (ICA) is dedicated to improving the effectiveness of cost and price analysis and enhancing the professional competence of its members. We encourage and promote exchange of research findings and applications between the academic community and cost professionals in industry and government. The 1990 National Meeting in Los Angeles, jointly sponsored by ICA and the National Estimating Society (NES),...

  11. Quantitative analysis of Li by PIGE technique

    Science.gov (United States)

    Fonseca, M.; Mateus, R.; Santos, C.; Cruz, J.; Silva, H.; Luis, H.; Martins, L.; Jesus, A. P.

    2017-09-01

    In this work, the cross section of the reaction 7Li(p,pγ)7Li (γ - 478 keV) in the proton energy range 2.0-4.2 MeV was measured. The measurements were carried out at the 3 MV Tandem Accelerator at the CTN/IST Laboratory in Lisbon. To validate the obtained results, calculated gamma-ray yields were compared, at several proton energy values, with experimental yields for thick samples made of inorganic compounds containing lithium. In order to quantify the light elements present in the samples, we used a standard-free method for PIGE in thick samples, based on the Emitted Radiation Yield Analysis (ERYA) code, which integrates the nuclear reaction excitation function along the depth of the sample. We also demonstrated the capacity of the technique for the analysis of Li ores, such as spodumene, lithium muscovite and holmquistite, and of Li alloys for plasma-facing materials, showing that this is a reliable and accurate method for PIGE analysis of Li in thick samples.
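
    The ERYA code itself is not shown in the record; the heavily hedged sketch below only illustrates the underlying idea of a standard-free thick-target calculation, integrating an excitation function over the stopping power as a function of energy loss in the sample (all cross-section, stopping-power and composition numbers are placeholders, not evaluated nuclear data).

```python
# Sketch: thick-target gamma yield as an integral of cross section over stopping power.
# All numerical values are placeholders for illustration only.
import numpy as np

E = np.linspace(2.0, 4.2, 200)                       # proton energy grid (MeV)
sigma = 1e-27 * (E - 1.9) ** 2                       # assumed smooth excitation function (cm^2)
stopping = 2e-20 / E                                 # assumed S(E) in MeV cm^2 per atom

atom_fraction = 0.25                                 # assumed Li atomic fraction in the sample

# Y(E0) is proportional to f * integral from 0 to E0 of sigma(E) / S(E) dE for a thick target.
integrand = sigma / stopping
yield_vs_E0 = atom_fraction * np.array(
    [np.trapz(integrand[: i + 1], E[: i + 1]) for i in range(len(E))]
)
print("relative thick-target yield at E0 = 4.2 MeV:", yield_vs_E0[-1])
```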

  12. Analytical techniques in pharmaceutical analysis: A review

    Directory of Open Access Journals (Sweden)

    Masoom Raza Siddiqui

    2017-02-01

    Full Text Available The development of pharmaceuticals brought a revolution in human health. These pharmaceuticals would serve their intent only if they are free from impurities and are administered in an appropriate amount. To make drugs serve their purpose, various chemical and instrumental methods involved in the estimation of drugs have been developed at regular intervals. Pharmaceuticals may develop impurities at various stages of their development, transportation and storage, which makes them risky to administer; thus the impurities must be detected and quantitated. For this, analytical instrumentation and methods play an important role. This review highlights the role of analytical instrumentation and analytical methods in assessing the quality of drugs. The review covers a variety of analytical techniques such as titrimetric, chromatographic, spectroscopic, electrophoretic and electrochemical, and their corresponding methods that have been applied in the analysis of pharmaceuticals.

  13. Techniques for Analysis of Plant Phenolic Compounds

    Directory of Open Access Journals (Sweden)

    Thomas H. Roberts

    2013-02-01

    Full Text Available Phenolic compounds are well-known phytochemicals found in all plants. They consist of simple phenols, benzoic and cinnamic acid, coumarins, tannins, lignins, lignans and flavonoids. Substantial developments in research focused on the extraction, identification and quantification of phenolic compounds as medicinal and/or dietary molecules have occurred over the last 25 years. Organic solvent extraction is the main method used to extract phenolics. Chemical procedures are used to detect the presence of total phenolics, while spectrophotometric and chromatographic techniques are utilized to identify and quantify individual phenolic compounds. This review addresses the application of different methodologies utilized in the analysis of phenolic compounds in plant-based products, including recent technical developments in the quantification of phenolics.

  14. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Full Text Available Academic and business researchers have long debated the most appropriate data analysis techniques that can be employed in conducting empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests that there is a growing trend among researchers to rely on higher order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA and correlation. The marked shift in the orientation of researchers towards using sophisticated analytical techniques can largely be attributed to the competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, it is expected that the paper will serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.

  15. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring, and improving the processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System, or HIS, developed at the University of Informatics Sciences. The event logs obtained are used for the analysis of hospital processes with process mining techniques. The proposed solution intends to achieve the generation of high-quality event logs in the system. The performed analyses allowed for redefining functions in the system and proposing a proper flow of information. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.
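
    As a hedged, minimal sketch of what process-mining discovery starts from (this is not the HIS component described above, and the activities are made up), one can count the directly-follows relation between activities per case in an event log; discovery algorithms such as the alpha or heuristic miner build on exactly this kind of tally.

```python
# Sketch: building a directly-follows tally from a toy hospital event log.
from collections import Counter, defaultdict

# Each event is (case id, activity, timestamp); the cases and activities are invented.
event_log = [
    ("case1", "admission", 1), ("case1", "triage", 2), ("case1", "treatment", 3), ("case1", "discharge", 4),
    ("case2", "admission", 1), ("case2", "treatment", 2), ("case2", "discharge", 3),
    ("case3", "admission", 1), ("case3", "triage", 2), ("case3", "discharge", 3),
]

traces = defaultdict(list)
for case, activity, ts in sorted(event_log, key=lambda e: (e[0], e[2])):
    traces[case].append(activity)

directly_follows = Counter()
for activities in traces.values():
    for a, b in zip(activities, activities[1:]):
        directly_follows[(a, b)] += 1

for (a, b), count in directly_follows.most_common():
    print(f"{a} -> {b}: {count}")
```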

  16. Sky-View Factor as a Relief Visualization Technique

    Directory of Open Access Journals (Sweden)

    Žiga Kokalj

    2011-02-01

    Full Text Available Remote sensing has become the most important data source for digital elevation model (DEM) generation. DEM analyses can be applied in various fields and many of them require appropriate DEM visualization support. Analytical hill-shading is the most frequently used relief visualization technique. Although widely accepted, this method has two major drawbacks: identifying details in deep shades and inability to properly represent linear features lying parallel to the light beam. Several authors have tried to overcome these limitations by changing the position of the light source or by filtering. This paper proposes a new relief visualization technique based on diffuse, rather than direct, illumination. It utilizes the sky-view factor, a parameter corresponding to the portion of visible sky limited by relief. The sky-view factor can be used as a general relief visualization technique to show relief characteristics. In particular, we show that this visualization is a very useful tool in archaeology as it improves the recognition of small scale features from high resolution DEMs.
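
    A hedged numerical sketch of the sky-view factor idea (not the authors' implementation, and with a toy DEM): for every cell, scan a number of azimuth directions out to a maximum search radius, record the highest horizon angle found in each direction, and approximate the visible portion of the sky from those angles.

```python
# Sketch: sky-view factor from a DEM by horizon scanning in n azimuth directions (toy example).
import numpy as np

def sky_view_factor(dem, cell_size=1.0, n_dirs=16, max_radius=10):
    rows, cols = dem.shape
    svf = np.ones(dem.shape)
    azimuths = np.linspace(0, 2 * np.pi, n_dirs, endpoint=False)
    for r in range(rows):
        for c in range(cols):
            horizon = np.zeros(n_dirs)
            for k, a in enumerate(azimuths):
                for step in range(1, max_radius + 1):
                    rr, cc = int(round(r + step * np.sin(a))), int(round(c + step * np.cos(a)))
                    if 0 <= rr < rows and 0 <= cc < cols:
                        elev = np.arctan2(dem[rr, cc] - dem[r, c], step * cell_size)
                        horizon[k] = max(horizon[k], elev)
            svf[r, c] = 1.0 - np.mean(np.sin(horizon))   # common sin-of-horizon approximation
    return svf

dem = np.zeros((30, 30))
dem[12:18, 12:18] += 5.0                                 # a small mound on a flat plane
print(np.round(sky_view_factor(dem), 2)[10:20, 10:20])   # SVF drops next to the mound
```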

  17. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    2015-01-01

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entreprneurial phenomena....

  18. Handbook of Qualitative Research Techniques and Analysis in Entrepreneurship

    DEFF Research Database (Denmark)

    One of the most challenging tasks in the research design process is choosing the most appropriate data collection and analysis techniques. This Handbook provides a detailed introduction to five qualitative data collection and analysis techniques pertinent to exploring entreprneurial phenomena....

  19. Instruments measuring perceived racism/racial discrimination: review and critique of factor analytic techniques.

    Science.gov (United States)

    Atkins, Rahshida

    2014-01-01

    Several compendiums of instruments that measure perceived racism and/or discrimination are present in the literature. Other works have reviewed the psychometric properties of these instruments in terms of validity and reliability and have indicated if the instrument was factor analyzed. However, little attention has been given to the quality of the factor analysis performed. The aim of this study was to evaluate the exploratory factor analyses done on instruments measuring perceived racism/racial discrimination using guidelines from experts in psychometric theory. The techniques used for factor analysis were reviewed and critiqued and the adequacy of reporting was evaluated. Internet search engines and four electronic abstract databases were used to identify 16 relevant instruments that met the inclusion/exclusion criteria. Principal component analysis was the most frequent method of extraction (81%). Sample sizes were adequate for factor analysis in 81 percent of studies. The majority of studies reported appropriate criteria for the acceptance of un-rotated factors (81%) and justified the rotation method (75%). Exactly 94 percent of studies reported partially acceptable criteria for the acceptance of rotated factors. The majority of articles (69%) reported adequate coefficient alphas for the resultant subscales. In 81 percent of the studies, the conceptualized dimensions were supported by factor analysis.

  20. 78 FR 37690 - Federal Acquisition Regulation; Price Analysis Techniques

    Science.gov (United States)

    2013-06-21

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify and give a precise reference in the use of a price analysis technique in order to establish a fair... reference used in FAR 15.404-1(b)(2)(i). FAR 15.404-1(b)(2) addresses various price analysis techniques and...

  1. 77 FR 40552 - Federal Acquisition Regulation; Price Analysis Techniques

    Science.gov (United States)

    2012-07-10

    ... Federal Acquisition Regulation; Price Analysis Techniques AGENCY: Department of Defense (DoD), General... clarify the use of a price analysis technique in order to establish a fair and reasonable price. DATES....404-1(b)(2) addresses various price analysis techniques and procedures the Government may use to...

  2. Numerical modeling techniques for flood analysis

    Science.gov (United States)

    Anees, Mohd Talha; Abdullah, K.; Nawawi, M. N. M.; Ab Rahman, Nik Norulaini Nik; Piah, Abd. Rahni Mt.; Zakaria, Nor Azazi; Syakir, M. I.; Mohd. Omar, A. K.

    2016-12-01

    Topographic and climatic changes are the main causes of abrupt flooding in tropical areas, and there is a need to identify the exact causes and effects of these changes. Numerical modeling techniques play a vital role in such studies because they use hydrological parameters that are strongly linked with topographic changes. In this review, some of the widely used models utilizing hydrological and river modeling parameters, and the estimation of those parameters in data-sparse regions, are discussed. Shortcomings of 1D and 2D numerical models and the possible improvements over these models through 3D modeling are also discussed. It is found that the HEC-RAS and FLO 2D models are best in terms of economical and accurate flood analysis for river and floodplain modeling, respectively. Limitations of FLO 2D in floodplain modeling, mainly floodplain elevation differences and vertical roughness in grids, were found which can be improved through a 3D model. Therefore, a 3D model was found to be more suitable than 1D and 2D models in terms of vertical accuracy in grid cells. It was also found that 3D models for open channel flows have been developed recently, but not for floodplains. Hence, it is suggested that a 3D model for floodplains should be developed, considering all the hydrological and high-resolution topographic parameter models discussed in this review, to enhance the findings on the causes and effects of flooding.

  3. Function Analysis and Decomposition using Function Analysis Systems Technique

    Energy Technology Data Exchange (ETDEWEB)

    Wixson, James Robert

    1999-06-01

    The "Father of Value Analysis", Lawrence D. Miles, was a design engineer for General Electric in Schenectady, New York. Miles developed the concept of function analysis to address difficulties in satisfying the requirements to fill shortages of high demand manufactured parts and electrical components during World War II. His concept of function analysis was further developed in the 1960s by Charles W. Bytheway, a design engineer at Sperry Univac in Salt Lake City, Utah. Charles Bytheway extended Mile's function analysis concepts and introduced the methodology called Function Analysis Systems Technique (FAST) to the Society of American Value Engineers (SAVE) at their International Convention in 1965 (Bytheway 1965). FAST uses intuitive logic to decompose a high level, or objective function into secondary and lower level functions that are displayed in a logic diagram called a FAST model. Other techniques can then be applied to allocate functions to components, individuals, processes, or other entities that accomplish the functions. FAST is best applied in a team setting and proves to be an effective methodology for functional decomposition, allocation, and alternative development.

  5. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    Based on work by Pearson in 1901, Hotelling in 1933 introduced principal component analysis (PCA). PCA is often used for general feature generation and linear orthogonalization or compression by dimensionality reduction of correlated multivariate data; see Jolliffe for a comprehensive description of PCA and related techniques. An interesting dilemma in reduction of dimensionality of data is the desire to obtain simplicity for better understanding, visualization and interpretation of the data on the one hand, and the desire to retain sufficient detail for adequate representation on the other hand. A kernel version of PCA handles nonlinearities by implicitly transforming data into a high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...
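    For readers unfamiliar with the kernel trick mentioned in this abstract, the following sketch contrasts ordinary PCA with kernel PCA on a toy nonlinear data set using scikit-learn. It is a minimal illustration of kernelized dimensionality reduction, not the kernel MAF method of the paper; the RBF kernel and gamma value are arbitrary assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
# Toy nonlinear structure: a noisy circle, which linear PCA cannot "unfold"
theta = rng.uniform(0, 2 * np.pi, 300)
X = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((300, 2))

linear_scores = PCA(n_components=2).fit_transform(X)
kernel_scores = KernelPCA(n_components=2, kernel="rbf", gamma=2.0).fit_transform(X)

# The implicit high-dimensional mapping lets the kernel version concentrate
# variance differently than the purely linear analysis does.
print("variance along linear PCs:", linear_scores.var(axis=0).round(3))
print("variance along kernel PCs:", kernel_scores.var(axis=0).round(3))
```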

  6. An Evaluation of the Critical Factors Affecting the Efficiency of Some Sorting Techniques

    Directory of Open Access Journals (Sweden)

    Olabiyisi S.O.

    2013-02-01

    Full Text Available Sorting allows information or data to be put into a meaningful order. As efficiency is a major concern of computing, data are sorted in order to gain efficiency in retrieving or searching tasks. The factors affecting the efficiency of the Shell, Heap, Bubble, Quick and Merge sorting techniques, in terms of running time, memory usage and the number of exchanges, were investigated. An experiment was conducted for the decision variables generated from the algorithms implemented in Java, and factor analysis by principal components of the obtained experimental data was carried out in order to estimate the contribution of each factor to the success of the sorting algorithms. Further statistical analysis was carried out to generate the eigenvalues of the extracted factors, and hence a system of linear equations, which was used to estimate the assessment of each factor of the sorting techniques, was proposed. The study revealed that the main factor affecting these sorting techniques was the time taken to sort. It contributed 97.842%, 97.693%, 89.351%, 98.336% and 90.480% for Bubble sort, Heap sort, Merge sort, Quick sort and Shell sort respectively. The number of swaps came second, contributing 1.587% for Bubble sort, 2.305% for Heap sort, 10.63% for Merge sort, 1.643% for Quick sort and 9.514% for Shell sort. The memory used was the least of the factors, contributing a negligible percentage for the five sorting techniques: 0.571% for Bubble sort, 0.002% for Heap sort, 0.011% for Merge sort, 0.021% for Quick sort and 0.006% for Shell sort.
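    A minimal sketch of the kind of experiment this abstract describes: instrument a sort, collect running time, swap count and a rough memory figure for several input sizes, and inspect how much of the total variance each principal component captures. The instrumented algorithm, input sizes and memory estimate are assumptions for illustration; they are not the study's Java implementation.

```python
import time
import random
import numpy as np

def bubble_sort(values):
    """Return the number of swaps needed to sort a copy of `values`."""
    a = list(values)
    swaps = 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swaps += 1
    return swaps

rows = []
for n in (200, 400, 800, 1600):
    data = [random.random() for _ in range(n)]
    t0 = time.perf_counter()
    swaps = bubble_sort(data)
    rows.append([time.perf_counter() - t0, swaps, n * 8])   # time (s), swaps, rough memory (bytes)

X = np.array(rows)
eigvals = np.linalg.eigvalsh(np.cov(X.T))[::-1]              # principal components of the covariance
print("variance explained per component (%):", (100 * eigvals / eigvals.sum()).round(2))
```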

  7. Application of Multivariable Statistical Techniques in Plant-wide WWTP Control Strategies Analysis

    DEFF Research Database (Denmark)

    Flores Alsina, Xavier; Comas, J.; Rodríguez-Roda, I.

    2007-01-01

    The main objective of this paper is to present the application of selected multivariable statistical techniques in plant-wide wastewater treatment plant (WWTP) control strategies analysis. In this study, cluster analysis (CA), principal component analysis/factor analysis (PCA/FA) and discriminant...

  8. Artificial Intelligence Techniques for Automatic Screening of Amblyogenic Factors

    Science.gov (United States)

    Van Eenwyk, Jonathan; Agah, Arvin; Giangiacomo, Joseph; Cibis, Gerhard

    2008-01-01

    Purpose To develop a low-cost automated video system to effectively screen children aged 6 months to 6 years for amblyogenic factors. Methods In 1994 one of the authors (G.C.) described video vision development assessment, a digitizable analog video-based system combining Brückner pupil red reflex imaging and eccentric photorefraction to screen young children for amblyogenic factors. The images were analyzed manually with this system. We automated the capture of digital video frames and pupil images and applied computer vision and artificial intelligence to analyze and interpret results. The artificial intelligence systems were evaluated by a tenfold testing method. Results The best system was the decision tree learning approach, which had an accuracy of 77%, compared to the “gold standard” specialist examination with a “refer/do not refer” decision. Criteria for referral were strabismus, including microtropia, and refractive errors and anisometropia considered to be amblyogenic. Eighty-two percent of strabismic individuals were correctly identified. High refractive errors were also correctly identified and referred 90% of the time, as well as significant anisometropia. The program was less correct in identifying more moderate refractive errors, below +5 and less than −7. Conclusions Although we are pursuing a variety of avenues to improve the accuracy of the automated analysis, the program in its present form provides acceptable cost benefits for detecting ambylogenic factors in children aged 6 months to 6 years. PMID:19277222

  9. Real analysis modern techniques and their applications

    CERN Document Server

    Folland, Gerald B

    1999-01-01

    An in-depth look at real analysis and its applications-now expanded and revised.This new edition of the widely used analysis book continues to cover real analysis in greater detail and at a more advanced level than most books on the subject. Encompassing several subjects that underlie much of modern analysis, the book focuses on measure and integration theory, point set topology, and the basics of functional analysis. It illustrates the use of the general theories and introduces readers to other branches of analysis such as Fourier analysis, distribution theory, and probability theory.This edi

  10. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and then the watershed technique is used. A DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about the likelihood of region segmentation for the next step (MRF), which gives an image that carries all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  11. Trends and Techniques in Visual Gaze Analysis

    CERN Document Server

    Stellmach, Sophie; Dachselt, Raimund; Lindley, Craig A

    2010-01-01

    Visualizing gaze data is an effective way for the quick interpretation of eye tracking results. This paper presents a study investigating the benefits and limitations of visual gaze analysis among eye tracking professionals and researchers. The results were used to create a tool for visual gaze analysis within a Master's project.

  12. Effect Factors of Liquid Scintillation Analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Over the past decades, the liquid scintillation analysis (LSA) technique has remained one of the most popular experimental tools used for the quantitative analysis of radionuclides, especially low-energy β

  13. 48 CFR 215.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Proposal analysis techniques. 215.404-1 Section 215.404-1 Federal Acquisition Regulations System DEFENSE ACQUISITION... Contract Pricing 215.404-1 Proposal analysis techniques. (1) Follow the procedures at PGI 215.404-1 for...

  14. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often involved in making informed and correct business decisions. Traditional analysis methods based on statistics fail to help processing unstructured texts and the society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  15. The Network Protocol Analysis Technique in Snort

    Science.gov (United States)

    Wu, Qing-Xiu

    Network protocol analysis is the technical means by which a network sniffer captures packets for further analysis and understanding. Network sniffing intercepts packets and reassembles the binary format of the original message content in order to obtain the information it contains. Based on the TCP/IP protocol stack specification, the data packets are then restored according to the protocol format and content of each protocol layer, recovering the actual data transferred as well as the application tier.

  16. Human Factors Analysis in Software Engineering

    Institute of Scientific and Technical Information of China (English)

    Xu Ren-zuo; Ma Ruo-feng; Liu Li-na; Xiong Zhong-wei

    2004-01-01

    General human factors analysis analyzes human functions, effects and influence in a system. In a narrower sense, it analyzes human influence upon the reliability of a system; it includes traditional human reliability analysis, human error analysis, man-machine interface analysis, human character analysis, and others. Whether a software development project in software engineering is successful is completely determined by human factors. In this paper, we discuss what human factors encompass and illustrate the importance of human factors analysis for software engineering by listing some instances. Finally, we probe preliminarily into the mentality that a practitioner in software engineering should possess.

  17. Uncertainty Analysis Technique for OMEGA Dante Measurements

    Energy Technology Data Exchange (ETDEWEB)

    May, M J; Widmann, K; Sorce, C; Park, H; Schneider, M

    2010-05-07

    The Dante is an 18-channel X-ray filtered diode array which records the spectrally and temporally resolved radiation flux from various targets (e.g. hohlraums) at X-ray energies between 50 eV and 10 keV. It is a main diagnostic installed on the OMEGA laser facility at the Laboratory for Laser Energetics, University of Rochester. The absolute flux is determined from the photometric calibration of the X-ray diodes, filters and mirrors and an unfold algorithm. Understanding the errors on this absolute measurement is critical for understanding hohlraum energetics. We present a new method for quantifying the uncertainties on the determined flux using a Monte-Carlo parameter variation technique. This technique combines the uncertainties in both the unfold algorithm and the error from the absolute calibration of each channel into a one-sigma Gaussian error function. One thousand test voltage sets are created using these error functions and processed by the unfold algorithm to produce individual spectra and fluxes. Statistical methods are applied to the resultant set of fluxes to estimate error bars on the measurements.
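    The Monte-Carlo parameter variation described above can be sketched generically in a few lines: perturb each measured channel with its one-sigma error, push every perturbed set through the unfold, and take the spread of the outputs as the error bar. The 18 channel voltages, the 5% errors and the stand-in unfold function below are illustrative assumptions, not the Dante calibration or unfold algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

def unfold(voltages):
    """Hypothetical stand-in for the spectral unfold: a weighted sum of channels."""
    weights = np.linspace(1.0, 2.0, voltages.size)
    return np.sum(weights * voltages)

measured = rng.uniform(0.5, 1.5, 18)      # 18 channel voltages (arbitrary units)
sigma = 0.05 * measured                   # assumed 5% one-sigma error per channel

# Create many perturbed voltage sets, unfold each, and use the spread as the uncertainty
fluxes = np.array([unfold(rng.normal(measured, sigma)) for _ in range(1000)])
print(f"flux = {fluxes.mean():.3f} +/- {fluxes.std():.3f} (arbitrary units)")
```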

  18. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...

  19. 10th Australian conference on nuclear techniques of analysis. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-06-01

    These proceedings contain abstracts and extended abstracts of 80 lectures and posters presented at the 10th Australian conference on nuclear techniques of analysis, hosted by the Australian National University in Canberra, Australia from 24-26 November 1997. The conference was divided into sessions on the following topics: ion beam analysis and its applications; surface science; novel nuclear techniques of analysis; characterization of thin films; electronic and optoelectronic materials formed by ion implantation; nanometre science and technology; plasma science and technology. A special session was dedicated to new nuclear techniques of analysis, future trends and developments. Separate abstracts were prepared for the individual presentations included in this volume.

  20. A factor analysis to detect factors influencing building national brand

    Directory of Open Access Journals (Sweden)

    Naser Azad

    Full Text Available Developing a national brand is one of the most important issues in brand development. In this study, we present factor analysis to detect the most important factors in building a national brand. The proposed study uses factor analysis to extract the most influential factors, and the sample has been chosen from two major auto makers in Iran, Iran Khodro and Saipa. The questionnaire was designed on a Likert scale and distributed among 235 experts. Cronbach's alpha is calculated as 0.84, which is well above the minimum desirable limit of 0.70. The implementation of factor analysis provides six factors, including “cultural image of customers”, “exciting characteristics”, “competitive pricing strategies”, “perception image” and “previous perceptions”.

  1. OPERATIONAL MODAL ANALYSIS SCHEMES USING CORRELATION TECHNIQUE

    Institute of Scientific and Technical Information of China (English)

    Zheng Min; Shen Fan; Chen Huaihai

    2005-01-01

    For some large-scale engineering structures in operating conditions, modal parameter estimation must be based on response-only data. This problem has received a considerable amount of attention in the past few years. It is well known that the cross-correlation function between the measured responses is a sum of complex exponential functions of the same form as the impulse response function of the original system. This paper therefore presents a time-domain operational modal identification global scheme and a frequency-domain scheme from output-only data by coupling the cross-correlation function with conventional modal parameter estimation. The outlined techniques are applied to an airplane model to estimate modal parameters from response-only data.

  2. The German Passive: Analysis and Teaching Technique.

    Science.gov (United States)

    Griffen, T. D.

    1981-01-01

    Proposes an analysis of German passive based upon internal structure rather than translation conventions from Latin and Greek. Claims that this approach leads to a description of the perfect participle as an adjectival complement, which eliminates the classification of a passive voice for German and simplifies the learning task. (MES)

  3. Comparison of Hydrogen Sulfide Analysis Techniques

    Science.gov (United States)

    Bethea, Robert M.

    1973-01-01

    A summary and critique of common methods of hydrogen sulfide analysis is presented. Procedures described are: reflectance from silver plates and lead acetate-coated tiles, lead acetate and mercuric chloride paper tapes, sodium nitroprusside and methylene blue wet chemical methods, infrared spectrophotometry, and gas chromatography. (BL)

  4. Measurement Bias Detection through Factor Analysis

    Science.gov (United States)

    Barendse, M. T.; Oort, F. J.; Werner, C. S.; Ligtvoet, R.; Schermelleh-Engel, K.

    2012-01-01

    Measurement bias is defined as a violation of measurement invariance, which can be investigated through multigroup factor analysis (MGFA), by testing across-group differences in intercepts (uniform bias) and factor loadings (nonuniform bias). Restricted factor analysis (RFA) can also be used to detect measurement bias. To also enable nonuniform…

  5. An analysis technique for microstrip antennas

    Science.gov (United States)

    Agrawal, P. K.; Bailey, M. C.

    1977-01-01

    The paper presents a combined numerical and empirical approach to the analysis of microstrip antennas over a wide range of frequencies. The method involves representing the antenna by a fine wire grid immersed in a dielectric medium and then using Richmond's reaction formulation (1974) to evaluate the piecewise sinusoidal currents on the grid segments. The calculated results are then modified to account for the finite dielectric discontinuity. The method is applied to round and square microstrip antennas.

  6. Ranking factors affecting the productivity of human resources using MADM techniques

    Directory of Open Access Journals (Sweden)

    G. A. Shekari

    2012-12-01

    Full Text Available Productivity, the improvement and efficient use of various resources such as labor, capital, materials, energy and information, is the purpose of all economic and industrial organizations and service enterprises. The human factor is the main strategic resource and the realization axis of productivity for each type of organization; therefore the factors affecting productivity depend on suitable conditions for labor. This study was performed to identify and prioritize the factors affecting the productivity of human resources in the Khorasan Razavi Gas Company. The research is applied in its objective, and descriptive-survey in its data collection methods and conclusions. The sample size, obtained using Cochran's formula, is 120. Using the Delphi method, we identified the factors affecting the productivity of human resources in the Khorasan Razavi Gas Company, and these factors were then prioritized using MADM techniques. Team Expert Choice2000 software was used for the analysis. The results show that the factors affecting the productivity of human resources in the Khorasan Razavi Gas Company, in order of importance, are: health aspects, leadership style, motivational factors, organizational commitment, work experience, general and applied education, demographic characteristics, physical environment within the organization, external environment and competitive spirit.

  7. A comparison of wavelet analysis techniques in digital holograms

    Science.gov (United States)

    Molony, Karen M.; Maycock, Jonathan; McDonald, John B.; Hennelly, Bryan M.; Naughton, Thomas J.

    2008-04-01

    This study explores the effectiveness of wavelet analysis techniques on digital holograms of real-world 3D objects. Stationary and discrete wavelet transform techniques have been applied for noise reduction and compared. Noise is a common problem in image analysis, and successful reduction of noise without degradation of content is difficult to achieve. These wavelet transform denoising techniques are contrasted with traditional noise reduction techniques: mean filtering, median filtering and Fourier filtering. The different approaches are compared in terms of speckle reduction, edge preservation and resolution preservation.
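    A small sketch of the kind of comparison the abstract performs, using PyWavelets for discrete wavelet denoising and a median filter as one of the traditional baselines. The test image, the noise model (additive rather than true speckle), the wavelet family and the threshold are assumptions chosen for brevity, not the study's hologram data.

```python
import numpy as np
import pywt
from scipy.ndimage import median_filter

rng = np.random.default_rng(0)
clean = np.outer(np.hanning(128), np.hanning(128))        # stand-in for a reconstructed object
noisy = clean + 0.1 * rng.standard_normal(clean.shape)    # additive noise for simplicity

# Discrete wavelet denoising: soft-threshold the detail coefficients, then reconstruct
coeffs = pywt.wavedec2(noisy, "db4", level=3)
thr = 0.1 * np.sqrt(2 * np.log(noisy.size))
shrunk = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in level)
                        for level in coeffs[1:]]
wavelet_out = pywt.waverec2(shrunk, "db4")[:128, :128]

median_out = median_filter(noisy, size=3)

mse = lambda img: float(np.mean((img - clean) ** 2))
print(f"MSE  noisy={mse(noisy):.5f}  wavelet={mse(wavelet_out):.5f}  median={mse(median_out):.5f}")
```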

  8. Soft computing techniques in voltage security analysis

    CERN Document Server

    Chakraborty, Kabir

    2015-01-01

    This book focuses on soft computing techniques for enhancing voltage security in electrical power networks. Artificial neural networks (ANNs) have been chosen as a soft computing tool, since such networks are eminently suitable for the study of voltage security. The different architectures of the ANNs used in this book are selected on the basis of intelligent criteria rather than by a “brute force” method of trial and error. The fundamental aim of this book is to present a comprehensive treatise on power system security and the simulation of power system security. The core concepts are substantiated by suitable illustrations and computer methods. The book describes analytical aspects of operation and characteristics of power systems from the viewpoint of voltage security. The text is self-contained and thorough. It is intended for senior undergraduate students and postgraduate students in electrical engineering. Practicing engineers, Electrical Control Center (ECC) operators and researchers will also...

  9. BATMAN: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BATMAN), a novel image-segmentation technique based on Bayesian statistics that characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). We illustrate its operation and performance with a set of test cases including both synthetic and real Integral-Field Spectroscopic data. The output segmentations adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in regions with low signal-to-noise ratio. However, the algorithm may be sensitive to small-scale random fluctuations, and its performance in presence of spatial gradients is limited. Due to these effects, errors may be underestimated by as much as a factor of two. Our analysis reveals that the algorithm prioritizes conservation of all the statistically-significant information over noise reduction, and that the precise choice of the input data has a crucial impact on the results. Hence, the philosophy of BATMAN is not to be used as a `black box' to improve the signal-to-noise ratio, but as a new approach to characterize spatially-resolved data prior to its analysis. The source code is publicly available at http://astro.ft.uam.es/SELGIFS/BaTMAn.

  10. Correspondence factor analysis of steroid libraries.

    Science.gov (United States)

    Ojasoo, T; Raynaud, J P; Doré, J C

    1995-06-01

    The receptor binding of a library of 187 steroids to five steroid hormone receptors (estrogen, progestin, androgen, mineralocorticoid, and glucocorticoid) has been analyzed by correspondence factor analysis (CFA) in order to illustrate how the method could be used to derive structure-activity relationships from much larger libraries. CFA is a cartographic multivariate technique that provides objective distribution maps of the data after reduction and filtering of redundant information and noise. The key to the analysis of very complex data tables is the formation of barycenters (steroids with one or more common structural fragments) that can be introduced into CFA analyses and used as mathematical models. This is possible in CFA because the method uses χ² metrics and is based on the distributional equivalence of the rows and columns of the transformed data matrix. We have thus demonstrated, in purely objective statistical terms, the general conclusions on the specificity of various functional and other groups derived from prior analyses by expert intuition and reasoning. A finer analysis was made of a series of A-ring phenols showing the high degree of glucocorticoid receptor and progesterone receptor binding that can be generated by certain C-11 substitutions despite the presence of the phenolic A-ring characteristic of estrogen receptor-specific binding.

  11. The Human Factors of Graphic Interaction: Tasks and Techniques

    Science.gov (United States)

    1980-12-01

    …representation an interaction technique diagram. Our diagrams are not as detailed as the Labanotation [HUTC70], but unlike that notation they represent more…

  12. Biomechanical analysis of cross-country skiing techniques.

    Science.gov (United States)

    Smith, G A

    1992-09-01

    The development of new techniques for cross-country skiing based on skating movements has stimulated biomechanical research aimed at understanding the various movement patterns, the forces driving the motions, and the mechanical factors affecting performance. Research methods have evolved from two-dimensional kinematic descriptions of classic ski techniques to three-dimensional analyses involving measurement of the forces and energy relations of skating. While numerous skiing projects have been completed, most have focused on either the diagonal stride or the V1 skating technique on uphill terrain. Current understanding of skiing mechanics is not sufficiently complete to adequately assess and optimize an individual skier's technique.

  13. Exploratory matrix factorization for PET image analysis.

    Science.gov (United States)

    Kodewitz, A; Keck, I R; Tomé, A M; Lang, E W

    2010-01-01

    Features are extracted from PET images employing exploratory matrix factorization techniques such as nonnegative matrix factorization (NMF). Appropriate features are fed into classifiers such as a support vector machine or a random forest tree classifier. Automatic feature extraction and classification are achieved with a high classification rate which is robust and reliable and can help in an early diagnosis of Alzheimer's disease.
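    A compact sketch of the pipeline the abstract outlines, using scikit-learn: factorize a non-negative data matrix with NMF and feed the resulting low-dimensional features to a random forest. The synthetic "images", the number of components and the classifier settings are assumptions; the sketch does not reproduce the authors' PET preprocessing.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Stand-in data: 100 "images" flattened to 400 non-negative voxels, two classes
X = rng.random((100, 400))
y = rng.integers(0, 2, 100)
X[y == 1, :50] += 0.5                      # give class 1 a distinctive region

# Exploratory matrix factorization X ~ W H; the rows of W act as the extracted features
W = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0).fit_transform(X)
scores = cross_val_score(RandomForestClassifier(random_state=0), W, y, cv=5)
print("5-fold CV accuracy on NMF features:", scores.mean().round(2))
```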

  14. The Replacement Technique of the Subsea Pipeline Sacrificial Anode and the Analysis of the Corrosion Factor

    Institute of Scientific and Technical Information of China (English)

    肖治国; 张敬安; 郑辉; 李成钢

    2012-01-01

      Subsea pipelines are the lifeline of the offshore oil and gas transportation system, and anticorrosion is critical for them. Sacrificial anode protection is one of the most effective anticorrosion technologies against subsea pipeline electrochemical corrosion, and the anodes should be replaced when they reach their design life. The anode replacement technique for subsea pipelines and the change in anode corrosion with different corrosion factors in the sea mud are discussed in this paper, offering a reference for the replacement and design of subsea pipeline sacrificial anode systems.

  15. New techniques for emulsion analysis in a hybrid experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kodama, K. (Aichi University of Education, Kariya 448 (Japan)); Ushida, N. (Aichi University of Education, Kariya 448 (Japan)); Mokhtarani, A. (University of California (Davis), Davis, CA 95616 (United States)); Paolone, V.S. (University of California (Davis), Davis, CA 95616 (United States)); Volk, J.T. (University of California (Davis), Davis, CA 95616 (United States)); Wilcox, J.O. (University of California (Davis), Davis, CA 95616 (United States)); Yager, P.M. (University of California (Davis), Davis, CA 95616 (United States)); Edelstein, R.M. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Freyberger, A.P. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Gibaut, D.B. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Lipton, R.J. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Nichols, W.R. (Carnegie-Mellon University, Pittsburgh, PA 15213 (United States)); Potter, D.M. (Carnegie-Mellon Univers

    1994-08-01

    A new method, called graphic scanning, was developed by the Nagoya University Group for emulsion analysis in a hybrid experiment. This method enhances both speed and reliability of emulsion analysis. Details of the application of this technique to the analysis of Fermilab experiment E653 are described. ((orig.))

  16. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by application systems in an organization is vital for decision making. For this reason, the quality of the data provided by a Data Warehouse (DW) is very important if the organization is to produce the best solutions and move forward. DW systems are complex systems that have to deliver highly-aggregated, high quality data from heterogeneous sources to decision makers, and they involve a great deal of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems; DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques compare target values against current values obtained from the systems. A prototype was developed in PHP to support the Base Analysis Technique, and a sample schema from an Oracle database was used to study the difference between applying the framework and not applying it. The prototype was demonstrated to selected organizations to identify whether it helps to reduce DQ problems, and questionnaires were given to respondents. Results: The results show users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The framework should be implemented in a real situation to obtain more accurate results.

  17. 48 CFR 815.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 5 2010-10-01 2010-10-01 false Proposal analysis techniques. 815.404-1 Section 815.404-1 Federal Acquisition Regulations System DEPARTMENT OF VETERANS AFFAIRS... techniques. (a) Contracting officers are responsible for the technical and administrative sufficiency of the...

  18. Comparative Analysis of Vehicle Make and Model Recognition Techniques

    Directory of Open Access Journals (Sweden)

    Faiza Ayub Syed

    2014-03-01

    Full Text Available Vehicle Make and Model Recognition (VMMR) has emerged as a significant element of vision-based systems because of its application in access control systems, traffic control and monitoring systems, security systems and surveillance systems, etc. So far a number of techniques have been developed for vehicle recognition, each following a different methodology and classification approach. The evaluation results highlight the recognition technique with the highest accuracy level. In this paper we describe the working of various vehicle make and model recognition techniques and compare these techniques on the basis of methodology, principles, classification approach, classifier and level of recognition. After comparing these factors we concluded that Locally Normalized Harris Corner Strengths (LHNS) performs best as compared to other techniques. LHNS uses Bayes and K-NN classification approaches for vehicle classification. It extracts information from the frontal view of vehicles for vehicle make and model recognition.

  19. On discriminant analysis techniques and correlation structures in high dimensions

    DEFF Research Database (Denmark)

    Clemmensen, Line Katrine Harder

    This paper compares several recently proposed techniques for performing discriminant analysis in high dimensions, and illustrates that the various sparse methods differ in prediction abilities depending on their underlying assumptions about the correlation structures in the data. The techniques...... generally focus on two things: Obtaining sparsity (variable selection) and regularizing the estimate of the within-class covariance matrix. For high-dimensional data, this gives rise to increased interpretability and generalization ability over standard linear discriminant analysis. Here, we group...

  20. Identification of noise in linear data sets by factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roscoe, B.A.; Hopke, P.K.

    1982-01-01

    With the use of atomic and nuclear methods to analyze samples for a multitude of elements, very large data sets have been generated. Due to the ease of obtaining these results with computerized systems, the elemental data acquired are not always as thoroughly checked as they should be, leading to some, if not many, bad data points. It is advantageous to have some feeling for the trouble spots in a data set before it is used for further studies. A technique which has the ability to identify bad data points, after the data has been generated, is classical factor analysis. The ability of classical factor analysis to identify two different types of data errors makes it ideally suited for scanning large data sets. Since the results yielded by factor analysis indicate correlations between parameters, one must know something about the nature of the data set and the analytical techniques used to obtain it to confidently isolate errors.
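    The screening idea can be illustrated with a toy version: build a low-rank (few-factor) model of a centred concentration table and flag samples whose residual from that model is unusually large. The synthetic data, the number of retained factors and the three-sigma rule are assumptions for illustration, not the authors' procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic "elemental concentrations": 50 samples x 8 elements driven by 3 factors
scores = rng.standard_normal((50, 3))
loadings = rng.standard_normal((3, 8))
data = scores @ loadings + 0.1 * rng.standard_normal((50, 8))
data[7, 2] += 15.0                         # plant one grossly mistyped value

# Fit a 3-factor (rank-3) model via SVD and inspect per-sample residuals
centred = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centred, full_matrices=False)
reconstruction = (u[:, :3] * s[:3]) @ vt[:3]
residual = np.linalg.norm(centred - reconstruction, axis=1)
print("suspect samples:", np.flatnonzero(residual > residual.mean() + 3 * residual.std()))
```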

  1. Key-space analysis of double random phase encryption technique

    Science.gov (United States)

    Monaghan, David S.; Gopinathan, Unnikrishnan; Naughton, Thomas J.; Sheridan, John T.

    2007-09-01

    We perform a numerical analysis on the double random phase encryption/decryption technique. The key-space of an encryption technique is the set of possible keys that can be used to encode data using that technique. In the case of a strong encryption scheme, many keys must be tried in any brute-force attack on that technique. Traditionally, designers of optical image encryption systems demonstrate only how a small number of arbitrary keys cannot decrypt a chosen encrypted image in their system. However, this type of demonstration does not discuss the properties of the key-space nor refute the feasibility of an efficient brute-force attack. To clarify these issues we present a key-space analysis of the technique. For a range of problem instances we plot the distribution of decryption errors in the key-space indicating the lack of feasibility of a simple brute-force attack.
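    To make the discussion concrete, here is a numpy sketch of double random phase encoding and of the kind of brute-force check described above: decrypting with the correct Fourier-plane key recovers the image, while a randomly guessed key does not. The test image, mask generation and error metric are simplifying assumptions, not the authors' optical setup.

```python
import numpy as np

rng = np.random.default_rng(0)
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0                                     # simple test image

# Double random phase encoding: random phase masks in the input and Fourier planes
phase1 = np.exp(2j * np.pi * rng.random(img.shape))
phase2 = np.exp(2j * np.pi * rng.random(img.shape))
encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

def decrypt(cipher, fourier_key):
    """Undo the Fourier-plane mask; the input-plane mask drops out on taking the modulus."""
    return np.abs(np.fft.ifft2(np.fft.fft2(cipher) * np.conj(fourier_key)))

good = decrypt(encrypted, phase2)                           # correct key
bad = decrypt(encrypted, np.exp(2j * np.pi * rng.random(img.shape)))   # guessed key
err = lambda a: np.mean((a - img) ** 2)
print(f"decryption error with true key = {err(good):.2e}, with wrong key = {err(bad):.2e}")
```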

  2. Statistical inference of Minimum Rank Factor Analysis

    NARCIS (Netherlands)

    Shapiro, A; Ten Berge, JMF

    2002-01-01

    For any given number of factors, Minimum Rank Factor Analysis yields optimal communalities for an observed covariance matrix in the sense that the unexplained common variance with that number of factors is minimized, subject to the constraint that both the diagonal matrix of unique variances and the

  4. Comparison of visibility measurement techniques for forklift truck design factors.

    Science.gov (United States)

    Choi, Chin-Bong; Park, Peom; Kim, Young-Ho; Susan Hallbeck, M; Jung, Myung-Chul

    2009-03-01

    This study applied the light bulb shadow test, a manikin vision assessment test, and an individual test to a forklift truck to identify forklift truck design factors influencing visibility. The light bulb shadow test followed the standard of ISO/DIS 13564-1 for traveling and maneuvering tests with four test paths (Test Nos. 1, 3, 4, and 6). Digital human and forklift truck models were developed for the manikin vision assessment test with CATIA V5R13 human modeling solutions. Six participants performed the individual tests. Both employed similar parameters to the light bulb shadow test. The individual test had better visibility with fewer numbers and a greater distribution of the shadowed grids than the other two tests due to eye movement and anthropometric differences. The design factors of load backrest extension, lift chain, hose, dashboard, and steering wheel should be the first factors considered to improve visibility, especially when a forklift truck mainly performs a forward traveling task in an open area.

  5. PROGNOSTIC FACTORS ANALYSIS FOR STAGE I RECTAL CANCER

    Institute of Scientific and Technical Information of China (English)

    武爱文; 顾晋; 薛钟麒; 王怡; 徐光炜

    2001-01-01

    To explore the death-related factors of stage I rectal cancer patients. Methods: 89 cases of stage I rectal cancer patients between 1985 and 2000 were retrospectively studied for prognostic factors. Factors including age, gender, tumor size, circumferential occupation, gross type, pathological type, depth of tumor invasion, surgical procedure, adjuvant chemotherapy and postoperative complication were chosen for Cox multivariate analysis (forward procedure) using SPSS software (version 10.0). Results: Multivariate analysis demonstrated that muscular invasion was an independent negative prognostic factor for stage I rectal cancer patients (P=0.003). Conclusion: Muscular invasion is a negative prognostic factor for stage I rectal cancer patients.

  6. Acoustical Characteristics of Mastication Sounds: Application of Speech Analysis Techniques

    Science.gov (United States)

    Brochetti, Denise

    Food scientists have used acoustical methods to study characteristics of mastication sounds in relation to food texture. However, a model for analysis of the sounds has not been identified, and reliability of the methods has not been reported. Therefore, speech analysis techniques were applied to mastication sounds, and variation in measures of the sounds was examined. To meet these objectives, two experiments were conducted. In the first experiment, a digital sound spectrograph generated waveforms and wideband spectrograms of sounds by 3 adult subjects (1 male, 2 females) for initial chews of food samples differing in hardness and fracturability. Acoustical characteristics were described and compared. For all sounds, formants appeared in the spectrograms, and energy occurred across a 0 to 8000-Hz range of frequencies. Bursts characterized waveforms for peanut, almond, raw carrot, ginger snap, and hard candy. Duration and amplitude of the sounds varied with the subjects. In the second experiment, the spectrograph was used to measure the duration, amplitude, and formants of sounds for the initial 2 chews of cylindrical food samples (raw carrot, teething toast) differing in diameter (1.27, 1.90, 2.54 cm). Six adult subjects (3 males, 3 females) having normal occlusions and temporomandibular joints chewed the samples between the molar teeth and with the mouth open. Ten repetitions per subject were examined for each food sample. Analysis of estimates of variation indicated an inconsistent intrasubject variation in the acoustical measures. Food type and sample diameter also affected the estimates, indicating the variable nature of mastication. Generally, intrasubject variation was greater than intersubject variation. Analysis of ranks of the data indicated that the effect of sample diameter on the acoustical measures was inconsistent and depended on the subject and type of food. If inferences are to be made concerning food texture from acoustical measures of mastication
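    A minimal sketch of the kind of speech-analysis measurement applied here, using scipy: compute a wideband-style spectrogram of a synthetic burst and read off rough duration, amplitude and bandwidth figures. The synthetic "chew" signal, sampling rate and thresholds are assumptions for illustration and do not reproduce the study's spectrograph settings.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 16000                                   # sampling rate (Hz), assumed; Nyquist = 8000 Hz
t = np.arange(0, 0.5, 1 / fs)
rng = np.random.default_rng(0)
# Stand-in "chew": a short broadband burst (crisp foods produce burst-like waveforms)
burst = rng.standard_normal(t.size) * np.exp(-((t - 0.1) / 0.02) ** 2)

f, times, Sxx = spectrogram(burst, fs=fs, nperseg=256, noverlap=192)
power_db = 10 * np.log10(Sxx + 1e-12)

# Simple duration/amplitude measures analogous to those read off a sound spectrograph
envelope = np.abs(burst)
above = envelope > 0.1 * envelope.max()
print(f"burst duration ~ {above.sum() / fs * 1000:.1f} ms, peak amplitude = {envelope.max():.2f}")
print(f"energy extends up to ~ {f[power_db.max(axis=1) > power_db.max() - 30].max():.0f} Hz")
```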

  7. A comparison between active and passive techniques for measurements of radon emanation factors

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Coto, I. [Dept. Fisica Aplicada, University of Huelva, Huelva (Spain)], E-mail: Israel.lopez@dfa.uhu.es; Mas, J.L. [Dept. de Fisica Aplicada I, E.U.P., University of Seville, Seville (Spain); San Miguel, E.G.; Bolivar, J.P. [Dept. Fisica Aplicada, University of Huelva, Huelva (Spain); Sengupta, D. [Department of Geology and Geophysics, I.I.T. Kharagpur, West Bengal (India)

    2009-05-15

    Some radon-related parameters have been determined through two different techniques (passive and active) in soil and phosphogypsum samples. Emanation factors determined through these techniques show good agreement for soil samples, while large discrepancies appear for phosphogypsum samples. In this paper, these discrepancies are analyzed and explained when non-controlled radon leakages in the passive technique are taken into account.

  8. Sensitivity analysis and related analysis : A survey of statistical techniques

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    1995-01-01

    This paper reviews the state of the art in five related types of analysis, namely (i) sensitivity or what-if analysis, (ii) uncertainty or risk analysis, (iii) screening, (iv) validation, and (v) optimization. The main question is: when should which type of analysis be applied; which statistical

  9. Regional environmental analysis and management: New techniques for current problems

    Science.gov (United States)

    Honea, R. B.; Paludan, C. T. N.

    1974-01-01

    Advances in data acquisition and processing procedures for regional environmental analysis are discussed. Automated and semi-automated techniques employing Earth Resources Technology Satellite data and conventional data sources are presented. Experiences are summarized. The ERTS computer compatible tapes provide a very complete and flexible record of earth resources data and represent a viable medium to enhance regional environmental analysis research.

  10. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    OpenAIRE

    Rodica IVORSCHI

    2012-01-01

    SWOT analysis is the most important management technique for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can be beneficial to the organization, helping it adapt its strengths to opportunities, minimize risks and eliminate weaknesses.

  11. Meta-analysis in a nutshell: Techniques and general findings

    DEFF Research Database (Denmark)

    Paldam, Martin

    2015-01-01

    The purpose of this article is to introduce the technique and main findings of meta-analysis to the reader, who is unfamiliar with the field and has the usual objections. A meta-analysis is a quantitative survey of a literature reporting estimates of the same parameter. The funnel showing...

  12. SWOT ANALYSIS-MANAGEMENT TECHNIQUES TO STREAMLINE PUBLIC BUSINESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Rodica IVORSCHI

    2012-06-01

    Full Text Available SWOT analysis is the most important management technique for understanding the strategic position of an organization. The objective of SWOT analysis is to recommend strategies that ensure the best alignment between the internal and external environment; choosing the right strategy can be beneficial to the organization, helping it adapt its strengths to opportunities, minimize risks and eliminate weaknesses.

  13. Kernel Factor Analysis Algorithm with Varimax

    Institute of Scientific and Technical Information of China (English)

    Xia Guoen; Jin Weidong; Zhang Gexiang

    2006-01-01

    Kernel factor analysis (KFA) with varimax was proposed by using a Mercer kernel function, which can map the data in the original space to a high-dimensional feature space, and it was compared with kernel principal component analysis (KPCA). The results show that the best error rate in handwritten digit recognition by kernel factor analysis with varimax (4.2%) was superior to that of KPCA (4.4%). KFA with varimax could thus perform handwritten digit recognition more accurately.
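    As a point of reference for the linear baseline, the sketch below runs ordinary factor analysis with a varimax rotation on the scikit-learn digits data and scores a simple classifier on the rotated factor scores. The kernelized KFA of the paper is not available in scikit-learn, so this is only the non-kernel analogue; the number of factors and the classifier are assumptions.

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_digits(return_X_y=True)

# Linear factor analysis with a varimax rotation of the loadings
fa = FactorAnalysis(n_components=20, rotation="varimax", random_state=0)
factor_scores = fa.fit_transform(X)

# Score a plain classifier on the rotated factor scores
acc = cross_val_score(LogisticRegression(max_iter=2000), factor_scores, y, cv=3)
print("digit-recognition accuracy on rotated factor scores:", acc.mean().round(3))
```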

  14. Kinematics analysis technique fouettes 720° classic ballet.

    Directory of Open Access Journals (Sweden)

    Li Bo

    2011-07-01

    Full Text Available Athletic practice has shown that the more complex the element, the more difficult the technique of the exercise. The fouetté at 720° is one of the most difficult types of fouetté, and its execution depends on a high level of technique during the performer's rotation. Performing this element requires not only good physical condition but also correct technique from the dancer. On the basis of the corresponding kinematic theory, this study gives a qualitative analysis and quantitative assessment of the fouetté at 720° as performed by the best Chinese dancers. The method of stereoscopic imaging and theoretical analysis was used.

  15. MEASURING THE LEANNESS OF SUPPLIERS USING PRINCIPAL COMPONENT ANALYSIS TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Y. Zare Mehrjerdi

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: A technique that helps management to reduce costs and improve quality is ‘lean supply chain management’, which focuses on the elimination of all wastes in every stage of the supply chain and is derived from ‘agile production’. This research aims to assess and rank the suppliers in an auto industry, based upon the concept of ‘production leanness’. The focus of this research is on the suppliers of a company called Touse-Omron Naein. We have examined the literature about leanness, and classified its criteria into ten dimensions and 76 factors. A questionnaire was used to collect the data, and the suppliers were ranked using the principal component analysis (PCA technique.

  16. A Comparative Analysis of Techniques for PAPR Reduction of OFDM Signals

    Directory of Open Access Journals (Sweden)

    M. Janjić

    2014-06-01

    Full Text Available In this paper the problem of high Peak-to-Average Power Ratio (PAPR) in Orthogonal Frequency-Division Multiplexing (OFDM) signals is studied. Besides describing three techniques for PAPR reduction, SeLective Mapping (SLM), Partial Transmit Sequence (PTS) and Interleaving, a detailed analysis of the performances of these techniques for various values of relevant parameters (number of phase sequences, number of interleavers, number of phase factors, number of subblocks, depending on the applied technique) is carried out. Simulation of these techniques is run in Matlab software. Results are presented in the form of Complementary Cumulative Distribution Function (CCDF) curves for the PAPR of 30000 randomly generated OFDM symbols. Simulations are performed for OFDM signals with 32 and 256 subcarriers, oversampled by a factor of 4. A detailed comparison of these techniques is made based on the Matlab simulation results.
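    A compact sketch of the SLM idea evaluated in the paper: generate one OFDM symbol, multiply the data by several random phase sequences, and keep the candidate with the lowest PAPR. The QPSK mapping, 64 subcarriers, 4x oversampling by zero padding and 8 candidate sequences are assumptions for illustration; the paper's Matlab simulation uses 32 and 256 subcarriers and CCDF statistics over 30000 symbols.

```python
import numpy as np

rng = np.random.default_rng(0)
N, U, OS = 64, 8, 4                          # subcarriers, SLM candidates, oversampling factor

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def ofdm_time_signal(symbols):
    spectrum = np.zeros(N * OS, dtype=complex)   # zero padding gives the oversampled IFFT
    spectrum[:N] = symbols
    return np.fft.ifft(spectrum)

qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)
data = qpsk[rng.integers(0, 4, N)]

plain = papr_db(ofdm_time_signal(data))
# SLM: try U random phase sequences and transmit the lowest-PAPR candidate
candidates = [papr_db(ofdm_time_signal(data * np.exp(2j * np.pi * rng.random(N))))
              for _ in range(U)]
print(f"PAPR without SLM: {plain:.2f} dB, with SLM (best of {U}): {min(candidates):.2f} dB")
```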

  17. Multiple factor analysis by example using R

    CERN Document Server

    Pagès, Jérôme

    2014-01-01

    Multiple factor analysis (MFA) enables users to analyze tables of individuals and variables in which the variables are structured into quantitative, qualitative, or mixed groups. Written by the co-developer of this methodology, Multiple Factor Analysis by Example Using R brings together the theoretical and methodological aspects of MFA. It also includes examples of applications and details of how to implement MFA using an R package (FactoMineR).The first two chapters cover the basic factorial analysis methods of principal component analysis (PCA) and multiple correspondence analysis (MCA). The

  18. Internet-Induced Marketing Techniques: Critical Factors in Viral Marketing Campaigns.

    OpenAIRE

    Woerdl, M.; Papagiannidis, Savvas; Bourlakis, Michael A.; Li, Feng

    2008-01-01

    The rapid diffusion of the Internet and the emergence of various social constructs facilitated by Internet technologies are changing the drivers that define how marketing techniques are developed and refined. This paper identifies critical factors for viral marketing, an Internet-based ‘word-of-mouth’ marketing technique. Based on existing knowledge, five types of viral marketing factors that may critically influence the success of viral marketing campaigns are identified. These factors are t...

  19. Design, data analysis and sampling techniques for clinical research

    OpenAIRE

    Karthik Suresh; Sanjeev V Thomas; Geetha Suresh

    2011-01-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains...

  20. Error Analysis for the Airborne Direct Georeferincing Technique

    Science.gov (United States)

    Elsharkawy, Ahmed S.; Habib, Ayman F.

    2016-10-01

    Direct georeferencing has been shown to be an important alternative to standard indirect image orientation using classical or GPS-supported aerial triangulation. Since direct georeferencing without ground control relies on an extrapolation process only, particular focus has to be laid on the overall system calibration procedure. The accuracy performance of integrated GPS/inertial systems for direct georeferencing in airborne photogrammetric environments has been tested extensively in recent years. In this approach, the limiting factor is a correct overall system calibration including the GPS/inertial component as well as the imaging sensor itself; remaining errors in the system calibration will significantly decrease the quality of object point determination. This research paper presents an error analysis for the airborne direct georeferencing technique, in which integrated GPS/IMU positioning and navigation systems are used in conjunction with aerial cameras for airborne mapping, and compares it with GPS/INS-supported aerial triangulation (AT) by introducing a certain amount of error into the exterior orientation and boresight parameters and studying the effect of these errors on the final ground coordinates. The data set is a block of 32 images distributed over six flight lines; the interior orientation parameters (IOP) are known through a careful camera calibration procedure, and 37 ground control points are known through a terrestrial surveying procedure. The exact location of the camera station at the time of exposure, the exterior orientation parameters (EOP), is known through the GPS/INS integration process. The preliminary results show that, firstly, DG and GPS-supported AT have similar accuracy and, compared with the conventional aerial photography method, the two technologies reduce the dependence on ground control (used only for quality control purposes). Secondly, in DG, correcting the overall system calibration including the GPS/inertial component as well as the imaging sensor itself

  1. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques with special emphasis and a brief on other techniques developed world over for mitigating earthquake forces on the structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  2. Analysis On Classification Techniques In Mammographic Mass Data Set

    OpenAIRE

    K.K.Kavitha; Dr.A.Kangaiammal

    2015-01-01

    Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining which group each data instance is associated with, and they can handle a wide variety of data, so that large amounts of data can be involved in processing. This paper deals with an analysis of various data mining classification techniques such a...

  3. Application of pattern recognition techniques to crime analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.

    1976-08-15

    The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted. 47 figures (RWR)

  4. Nuclear analysis techniques as a component of thermoluminescence dating

    Energy Technology Data Exchange (ETDEWEB)

    Prescott, J.R.; Hutton, J.T.; Habermehl, M.A. [Adelaide Univ., SA (Australia); Van Moort, J. [Tasmania Univ., Sandy Bay, TAS (Australia)

    1996-12-31

    In luminescence dating, an age is found by first measuring the dose accumulated since the event being dated, then dividing by the annual dose rate. Analyses of minor and trace elements performed by nuclear techniques have long formed an essential component of dating. Results from some Australian sites are reported to illustrate the application of nuclear techniques of analysis in this context. In particular, a variety of methods for finding dose rates are compared, an example is given of a site where radioactive disequilibrium is significant, and a brief summary is given of a problem which was not resolved by nuclear techniques. 5 refs., 2 tabs.

  5. Applications of Electromigration Techniques in Food Analysis

    Science.gov (United States)

    Wieczorek, Piotr; Ligor, Magdalena; Buszewski, Bogusław

    Electromigration techniques, including capillary electrophoresis (CE), are widely used for the separation and identification of compounds present in food products. These techniques may also be considered alternative and complementary to commonly used analytical techniques such as high-performance liquid chromatography (HPLC) or gas chromatography (GC). Applications of CE concerning the determination of high-molecular compounds, like polyphenols (including flavonoids), pigments, vitamins, and food additives (preservatives, antioxidants, sweeteners, artificial pigments) are presented. Methods developed for the determination of proteins and peptides composed of amino acids, which are basic components of food products, are also covered. Other substances such as carbohydrates, nucleic acids, biogenic amines, natural toxins, and other contaminants, including pesticides and antibiotics, are discussed as well. The possibility of applying CE in food control laboratories, where analyses of the composition of food and food products are conducted, is of great importance. The CE technique may be used during the control of technological processes in the food industry and for the identification of numerous compounds present in food. Due to its numerous advantages, the CE technique is successfully used in routine food analysis.

  6. Virtual Mold Technique in Thermal Stress Analysis during Casting Process

    Institute of Scientific and Technical Information of China (English)

    Si-Young Kwak; Jae-Wook Baek; Jeong-Ho Nam; Jeong-Kil Choi

    2008-01-01

    It is important to analyse the casting product and the mold at the same time, considering the thermal contraction of the casting and the thermal expansion of the mold. An analysis considering contact between the casting and the mold enables precise prediction of the stress distribution and of defects such as hot tearing. However, it is difficult to generate an FEM mesh for the interface between the casting and the mold, and the mesh for the mold domain consumes considerable computational time and memory because of the large number of elements. Consequently, we proposed the virtual mold technique, which uses only the mesh of the casting part for thermal stress analysis in the casting process. A spring bar element in the virtual mold technique is used to account for the contact between the casting and the mold. In general, the volume of the mold is much larger than that of the casting part, so the proposed technique decreases the number of elements and greatly reduces computational memory and time. In this study, the proposed technique was verified by comparison with the traditional contact technique on a specimen, and it gave satisfactory results.

  7. Image analysis techniques for the study of turbulent flows

    Directory of Open Access Journals (Sweden)

    Ferrari Simone

    2017-01-01

    In this paper, a brief review is given of the Digital Image Analysis techniques employed in fluid mechanics for the study of turbulent flows, with particular focus on the techniques developed by the research teams the author worked in, which can be considered relatively “low cost” techniques. Compared with traditional techniques employing physical point probes, Digital Image Analysis techniques have the advantage of being non-intrusive and quasi-continuous in space, as every pixel on the camera sensor works as a single probe; consequently, they make it possible to obtain two-dimensional or three-dimensional fields of the measured quantity in less time. Traditionally, the disadvantages have been related to the acquisition frequency, but modern high-speed cameras are typically able to acquire at frequencies from the order of 1 kHz to the order of 1 MHz. Digital Image Analysis techniques can be employed to measure concentration, temperature, position, displacement, velocity, acceleration and pressure fields with similar equipment and setups, and can consequently be considered a flexible and powerful tool for measurements on turbulent flows.

  8. A survey on reliability and safety analysis techniques of robot systems in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Eom, H.S.; Kim, J.H.; Lee, J.C.; Choi, Y.R.; Moon, S.S

    2000-12-01

    Reliability and safety analysis techniques were surveyed for the purpose of overall quality improvement of the reactor inspection system under development in our current project. The contents of this report are: 1. Survey of reliability and safety analysis techniques - the reviewed techniques are generally accepted in many industries, including the nuclear industry, and we selected a few that are suitable for our robot system: fault tree analysis, failure mode and effect analysis, reliability block diagram, Markov model, the combinational method, and the simulation method. 2. Survey of the characteristics of robot systems which distinguish them from other systems and which are important to the analysis. 3. Survey of the nuclear environmental factors which affect the reliability and safety analysis of robot systems. 4. Collection of case studies of robot reliability and safety analysis performed in foreign countries. The results of this survey will be applied to the improvement of the reliability and safety of our robot system and will also be used for the formal qualification and certification of our reactor inspection system.
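
    A minimal Python sketch of one of the techniques listed above, the reliability block diagram, computing system reliability for series and redundant (parallel) arrangements; the component reliabilities and the example robot subsystem are invented for illustration.

        import numpy as np

        def series(reliabilities):
            # a series arrangement fails if any component fails
            return float(np.prod(reliabilities))

        def parallel(reliabilities):
            # a redundant (parallel) arrangement fails only if all components fail
            return 1.0 - float(np.prod(1.0 - np.asarray(reliabilities)))

        # hypothetical robot subsystem: two redundant cameras in series with a manipulator and a controller
        cameras = parallel([0.95, 0.95])
        system = series([cameras, 0.99, 0.98])
        print(f"camera pair: {cameras:.4f}, overall system: {system:.4f}")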

  9. Review and classification of variability analysis techniques with clinical applications.

    Science.gov (United States)

    Bravi, Andrea; Longtin, André; Seely, Andrew J E

    2011-10-10

    Analysis of patterns of variation of time-series, termed variability analysis, represents a rapidly evolving discipline with increasing applications in different fields of science. In medicine and in particular critical care, efforts have focussed on evaluating the clinical utility of variability. However, the growth and complexity of techniques applicable to this field have made interpretation and understanding of variability more challenging. Our objective is to provide an updated review of variability analysis techniques suitable for clinical applications. We review more than 70 variability techniques, providing for each technique a brief description of the underlying theory and assumptions, together with a summary of clinical applications. We propose a revised classification for the domains of variability techniques, which include statistical, geometric, energetic, informational, and invariant. We discuss the process of calculation, often necessitating a mathematical transform of the time-series. Our aims are to summarize a broad literature, promote a shared vocabulary that would improve the exchange of ideas, and the analyses of the results between different studies. We conclude with challenges for the evolving science of variability analysis.

  10. Determining Dimensionality of Exercise Readiness Using Exploratory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Kelley Strohacker, Rebecca A. Zakrajsek

    2016-06-01

    Assessment of “exercise readiness” is a central component of the flexible non-linear periodization (FNLP) method of organizing training workloads, but the underlying factor structure of this construct has not been empirically determined. The purpose of this study was to assess the construct dimensionality of exercise readiness using exploratory factor analysis, the results of which serve as initial steps in developing a brief measure of exercise readiness. Participants consisted of students recruited from undergraduate Kinesiology courses at a racially diverse, southern University. Independent, anonymous online survey data were collected across three stages: 1) generation of the item pool (n = 290), 2) assessment of face validity and refinement of the item pool (n = 168), and 3) exploratory factor analysis (n = 684). A principal axis factor analysis was conducted with 41 items using oblique rotation (promax). Four statistically significant factors, as determined through parallel analysis, explained 61.5% of the variance in exercise readiness. Factor 1 contained items that represented vitality (e.g., lively, revived). Factor 2 items related to physical fatigue (e.g., tired, drained). Factors 3 and 4 were descriptive of discomfort (e.g., pain, sick) and health (i.e., healthy, fit), respectively. This inductive approach indicates that exercise readiness comprises four dimensions: vitality, physical fatigue, discomfort, and health. This finding supports readiness assessment techniques currently recommended for practitioners according to the FNLP model. These results serve as a theoretical foundation upon which to further develop and refine a brief survey instrument to measure exercise readiness.
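
    A minimal Python sketch of the parallel-analysis step mentioned above: factors are retained only while the observed eigenvalues exceed the average eigenvalues obtained from random data of the same size. The simulated item responses (built from four latent factors) are illustrative and are not the study's data.

        import numpy as np

        rng = np.random.default_rng(1)
        n_respondents, n_items, n_latent = 684, 41, 4

        # simulated survey responses driven by four latent factors plus noise
        F = rng.normal(size=(n_respondents, n_latent))
        loadings = rng.normal(scale=0.8, size=(n_latent, n_items))
        X = F @ loadings + rng.normal(size=(n_respondents, n_items))

        def eigenvalues(data):
            return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

        observed = eigenvalues(X)

        # eigenvalues expected from pure noise: average over random data of the same shape
        random_eigs = np.mean([eigenvalues(rng.normal(size=X.shape)) for _ in range(100)], axis=0)

        # retain factors until an observed eigenvalue no longer exceeds its random counterpart
        n_factors = int(np.argmax(observed <= random_eigs))
        print("factors retained by parallel analysis:", n_factors)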

  11. Automated thermal mapping techniques using chromatic image analysis

    Science.gov (United States)

    Buck, Gregory M.

    1989-01-01

    Thermal imaging techniques are introduced using a chromatic image analysis system and temperature sensitive coatings. These techniques are used for thermal mapping and surface heat transfer measurements on aerothermodynamic test models in hypersonic wind tunnels. Measurements are made on complex vehicle configurations in a timely manner and at minimal expense. The image analysis system uses separate wavelength filtered images to analyze surface spectral intensity data. The system was initially developed for quantitative surface temperature mapping using two-color thermographic phosphors but was found useful in interpreting phase change paint and liquid crystal data as well.

  12. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.

    The thesis describes and develops the theoretical foundations of the Random Decrement technique, while giving several examples of modal analysis of large building constructions (bridges). The connection between modal parameters and Random Decrement functions is described theoretically...... is expanded to include both a vector formulation that increases speed considerably, and a new method for the prediction of the variance of the estimated Random Decrement functions. The thesis closes with a number of examples of modal analysis of bridges exposed to natural (ambient) load....
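
    A minimal Python sketch of the basic Random Decrement idea under simple assumptions: segments of an ambient response signal are collected each time it up-crosses a trigger level and averaged, so the random part tends to cancel and an estimate of the free decay remains. The synthetic signal, trigger level and segment length are arbitrary illustrative choices.

        import numpy as np

        rng = np.random.default_rng(2)
        fs = 100.0                                   # sampling frequency [Hz]
        t = np.arange(0, 600, 1 / fs)
        # synthetic ambient response: lightly damped 1 Hz mode driven by broadband noise
        kernel = np.exp(-0.05 * 2 * np.pi * 1.0 * t[:2000]) * np.sin(2 * np.pi * 1.0 * t[:2000])
        x = np.convolve(rng.normal(size=t.size), kernel, mode="same")

        def random_decrement(signal, trigger, length):
            # collect segments starting at every up-crossing of the trigger level and average them
            starts = np.where((signal[:-1] < trigger) & (signal[1:] >= trigger))[0]
            starts = starts[starts + length < signal.size]
            segments = np.stack([signal[s:s + length] for s in starts])
            return segments.mean(axis=0), len(starts)

        rd, n_triggers = random_decrement(x, trigger=x.std(), length=int(5 * fs))
        print(f"averaged {n_triggers} segments; RD function length {rd.size} samples")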

  13. Data analysis techniques for nuclear and particle physicists

    CERN Document Server

    Pruneau, Claude

    2017-01-01

    This is an advanced data analysis textbook for scientists specializing in the areas of particle physics, nuclear physics, and related subfields. As a practical guide for robust, comprehensive data analysis, it focuses on realistic techniques to explain instrumental effects. The topics are relevant for engineers, scientists, and astroscientists working in the fields of geophysics, chemistry, and the physical sciences. The book serves as a reference for more senior scientists while being eminently accessible to advanced undergraduate and graduate students.

  14. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  15. Kernel parameter dependence in spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2010-01-01

    feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply a kernel version of maximum autocorrelation factor (MAF) [7, 8] analysis to irregularly sampled stream sediment geochemistry data from South Greenland and illustrate the dependence...

  16. Multistructure Statistical Model Applied To Factor Analysis

    Science.gov (United States)

    Bentler, Peter M.

    1976-01-01

    A general statistical model for the multivariate analysis of mean and covariance structures is described. Matrix calculus is used to develop the statistical aspects of one new special case in detail. This special case separates the confounding of principal components and factor analysis. (DEP)

  17. Regression Commonality Analysis: A Technique for Quantitative Theory Building

    Science.gov (United States)

    Nimon, Kim; Reio, Thomas G., Jr.

    2011-01-01

    When it comes to multiple linear regression analysis (MLR), it is common for social and behavioral science researchers to rely predominately on beta weights when evaluating how predictors contribute to a regression model. Presenting an underutilized statistical technique, this article describes how organizational researchers can use commonality…
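
    A minimal Python sketch of commonality analysis for two predictors, partitioning the full-model R2 into unique and common components by comparing full and reduced regressions; the synthetic variables are illustrative only.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(3)
        n = 500
        x1 = rng.normal(size=n)
        x2 = 0.6 * x1 + 0.8 * rng.normal(size=n)      # correlated predictors
        y = 0.5 * x1 + 0.3 * x2 + rng.normal(size=n)

        def r2(*predictors):
            X = np.column_stack(predictors)
            return LinearRegression().fit(X, y).score(X, y)

        r2_full, r2_x1, r2_x2 = r2(x1, x2), r2(x1), r2(x2)
        unique_x1 = r2_full - r2_x2                 # variance explained only by x1
        unique_x2 = r2_full - r2_x1                 # variance explained only by x2
        common = r2_full - unique_x1 - unique_x2    # variance shared by x1 and x2
        print(f"R2={r2_full:.3f}  unique x1={unique_x1:.3f}  unique x2={unique_x2:.3f}  common={common:.3f}")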

  18. Optimization Techniques for Analysis of Biological and Social Networks

    Science.gov (United States)

    2012-03-28

    systematic fashion under a unifying theoretical and algorithmic framework. Optimization, Complex Networks, Social Network Analysis, Computational... analyzing a new metaheuristic technique, variable objective search. 3. Experimentation and application: implement the proposed algorithms, test and fine... exact solutions are presented. In [3], we introduce the variable objective search framework for combinatorial optimization. The method utilizes

  19. Evaluation of Meteorite Amino Acid Analysis Data Using Multivariate Techniques

    Science.gov (United States)

    McDonald, G.; Storrie-Lombardi, M.; Nealson, K.

    1999-01-01

    The amino acid distributions in the Murchison carbonaceous chondrite, Mars meteorite ALH84001, and ice from the Allan Hills region of Antarctica are shown, using a multivariate technique known as Principal Component Analysis (PCA), to be statistically distinct from the average amino acid composition of 101 terrestrial protein superfamilies.

  20. Analysis of effect factors-based stochastic network planning model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Looking at all the indeterminate factors as a whole and regarding activity durations as independent random variables, the traditional stochastic network planning models ignore the inevitable relationships and dependence among activity durations when more than one activity may be affected by the same indeterminate factors. On the basis of an analysis of the indeterminate factors affecting durations, the effect factors-based stochastic network planning (EFBSNP) model is proposed, which emphasizes the effects on the project period not only of logical and organizational relationships, but also of the dependence relationships among activity durations due to indeterminate factors. By virtue of indeterminate factor analysis, the model extracts and quantitatively describes the indeterminate effect factors, and then takes their effect on the schedule into account by using the Monte Carlo simulation technique. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in the VisualStudio.NET language to simplify the model-based calculation. Finally, a case study is included to demonstrate the applicability of the proposed model, and a comparison is made showing some advantages over existing models.
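
    A minimal Python Monte Carlo sketch of the dependence idea behind the EFBSNP model: two parallel activities respond to the same indeterminate factor, so their durations are correlated, and the simulated project duration differs from what an independent-duration model predicts. The network, distributions and factor loadings are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(4)
        n_sim = 100_000

        # common indeterminate factor (e.g. weather) affecting both activities
        weather = rng.normal(size=n_sim)

        # two parallel activities followed by a final activity (durations in days)
        a = 10 + 1.5 * weather + rng.normal(scale=0.5, size=n_sim)
        b = 12 + 2.0 * weather + rng.normal(scale=0.5, size=n_sim)
        c = 5 + rng.normal(scale=0.5, size=n_sim)

        project = np.maximum(a, b) + c     # longest parallel path plus the final activity

        # same marginal distributions but ignoring the shared factor (independent durations)
        a_i = 10 + rng.normal(scale=np.sqrt(1.5**2 + 0.5**2), size=n_sim)
        b_i = 12 + rng.normal(scale=np.sqrt(2.0**2 + 0.5**2), size=n_sim)
        project_i = np.maximum(a_i, b_i) + c

        print(f"with shared factor: mean={project.mean():.2f}, 95th pct={np.percentile(project, 95):.2f}")
        print(f"independent model:  mean={project_i.mean():.2f}, 95th pct={np.percentile(project_i, 95):.2f}")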

  1. What Child Analysis Can Teach Us about Psychoanalytic Technique.

    Science.gov (United States)

    Ablon, Steven Luria

    2014-01-01

    Child analysis has much to teach us about analytic technique. Children have an innate, developmentally driven sense of analytic process. Children in analysis underscore the importance of an understanding and belief in the therapeutic action of play, the provisional aspects of play, and that not all play will be understood. Each analysis requires learning a new play signature that is constantly reorganized. Child analysis emphasizes the emergence and integration of dissociated states, the negotiation of self-other relationships, the importance of co-creation, and the child's awareness of the analyst's sensibility. Child analysis highlights the robust nature of transference and how working through and repairing is related to the initiation of coordinated patterns of high predictability in the context of deep attachments. I will illustrate these and other ideas in the description of the analysis of a nine-year-old boy.

  2. Developing techniques for cause-responsibility analysis of occupational accidents.

    Science.gov (United States)

    Jabbari, Mousa; Ghorbani, Roghayeh

    2016-11-01

    The aim of this study was to specify the causes of occupational accidents and to determine social responsibility and the role of the groups involved in work-related accidents. This study develops an occupational accidents causes tree, an occupational accidents responsibility tree, and an occupational accidents component-responsibility analysis worksheet; based on these methods, it develops cause-responsibility analysis (CRA) techniques and, to test them, analyzes 100 fatal/disabling occupational accidents in the construction setting that were randomly selected from all the work-related accidents in Tehran, Iran, over a 5-year period (2010-2014). The main result of this study is two techniques for CRA: occupational accidents tree analysis (OATA) and occupational accidents components analysis (OACA), used in parallel for the determination of responsible groups and responsibility rates. From the results, we find that the management group of construction projects bears 74.65% of the responsibility for work-related accidents. The developed techniques are useful for occupational accident investigation/analysis, especially for the determination of a detailed list of tasks, responsibilities, and their rates, and are therefore useful for preventing work-related accidents by focusing on the responsible groups' duties.

  3. Multivariate Analysis Techniques for Optimal Vision System Design

    DEFF Research Database (Denmark)

    Sharifzadeh, Sara

    The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described. The vision instruments for food analysis as well as the datasets of the food items used in this thesis are described. The methodological strategies are outlined, including sparse regression and pre-processing based on feature selection and extraction methods, supervised versus unsupervised analysis, and linear versus non-linear approaches. One supervised feature selection algorithm ... (SSPCA) and DCT based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied on datasets of different food items: meat, dairies, fruits

  4. Design, data analysis and sampling techniques for clinical research.

    Science.gov (United States)

    Suresh, Karthik; Thomas, Sanjeev V; Suresh, Geetha

    2011-10-01

    Statistical analysis is an essential technique that enables a medical research practitioner to draw meaningful inference from their data analysis. Improper application of study design and data analysis may render insufficient and improper results and conclusion. Converting a medical problem into a statistical hypothesis with appropriate methodological and logical design and then back-translating the statistical results into relevant medical knowledge is a real challenge. This article explains various sampling methods that can be appropriately used in medical research with different scenarios and challenges.

  5. THE ‘HYBRID’ TECHNIQUE FOR RISK ANALYSIS OF SOME DISEASES

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Based on the data obtained from a survey recently made in Shanghai, this paper presents the hybrid technique for the risk analysis and evaluation of some diseases. After determining the main risk factors of these diseases by analysis of variance, the authors introduce a new concept, the ‘Illness Fuzzy Set’, and use fuzzy comprehensive evaluation to evaluate the risk of residents suffering from a disease. An optimization technique is used to determine the weights wi in the fuzzy comprehensive evaluation, and a new method, ‘Improved Information Distribution’, is also introduced for the treatment of the small-sample problem. It is shown that the results obtained by using the hybrid technique are better than those obtained by using a single fuzzy technique or a single statistical method.

  6. THE 'HYBRID' TECHNIQUE FOR RISK ANALYSIS OF SOME DISEASES

    Institute of Scientific and Technical Information of China (English)

    SHANGHANJI; LUYUCHU; XUXUEMEI; CHENQIAN

    2001-01-01

    Based on the data obtained from a survey recently made in Shanghai, this paper presents the hybrid technique for the risk analysis and evaluation of some diseases. After determining the main risk factors of these diseases by analysis of variance, the authors introduce a new concept, the 'Illness Fuzzy Set', and use fuzzy comprehensive evaluation to evaluate the risk of residents suffering from a disease. An optimization technique is used to determine the weights wi in the fuzzy comprehensive evaluation, and a new method, 'Improved Information Distribution', is also introduced for the treatment of the small-sample problem. It is shown that the results obtained by using the hybrid technique are better than those obtained by using a single fuzzy technique or a single statistical method.
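
    A minimal Python sketch of fuzzy comprehensive evaluation as described above: a membership matrix grades each risk factor against the risk levels and a weight vector aggregates the rows into an overall evaluation. The factors, weights and membership grades are invented, not taken from the Shanghai survey.

        import numpy as np

        # rows: risk factors (e.g. diet, blood pressure, smoking); columns: risk levels (low, medium, high)
        R = np.array([[0.6, 0.3, 0.1],
                      [0.2, 0.5, 0.3],
                      [0.1, 0.4, 0.5]])

        w = np.array([0.5, 0.3, 0.2])       # factor weights (e.g. from an optimization step); sum to 1

        b = w @ R                           # weighted-average aggregation operator
        b /= b.sum()                        # normalize the evaluation vector
        levels = ["low", "medium", "high"]
        print(dict(zip(levels, np.round(b, 3))), "->", levels[int(np.argmax(b))])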

  7. Evaluation of Damping Using Frequency Domain Operational Modal Analysis Techniques

    DEFF Research Database (Denmark)

    Bajric, Anela; Georgakis, Christos T.; Brincker, Rune

    2015-01-01

    Operational Modal Analysis (OMA) techniques provide in most cases reasonably accurate estimates of structural frequencies and mode shapes. In contrast though, they are known to often produce uncertain structural damping estimates, which is mainly due to inherent random and/or bias errors...... domain techniques, the Frequency Domain Decomposition (FDD) and the Frequency Domain Polyreference (FDPR). The response of a two degree-of-freedom (2DOF) system is numerically established with specified modal parameters subjected to white noise loading. The system identification is evaluated with well...

  8. Error analysis in correlation computation of single particle reconstruction technique

    Institute of Scientific and Technical Information of China (English)

    胡悦; 隋森芳

    1999-01-01

    The single particle reconstruction technique has become particularly important in the structural analysis of biomacromolecules. The problem of reconstructing a picture from identical samples corrupted by colored noise is studied, and the alignment error in the correlation computation of the single particle reconstruction technique is analyzed systematically. The concept of systematic error is introduced, and the explicit form of the systematic error is given under the weak-noise approximation. The influence of the systematic error on the reconstructed picture is also discussed, and an analytical formula for correcting the distortion in the picture reconstruction is obtained.

  9. Analysis On Classification Techniques In Mammographic Mass Data Set

    Directory of Open Access Journals (Sweden)

    Mrs. K. K. Kavitha

    2015-07-01

    Data mining, the extraction of hidden information from large databases, is used to predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. Data-mining classification techniques deal with determining which group each data instance is associated with, and they can handle a wide variety of data, so that large amounts of data can be involved in processing. This paper presents an analysis of various data mining classification techniques, such as Decision Tree Induction, Naïve Bayes, and k-Nearest Neighbour (KNN) classifiers, on the mammographic mass dataset.
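
    A minimal scikit-learn sketch of the kind of comparison described above, fitting Decision Tree, Naïve Bayes and k-NN classifiers and comparing cross-validated accuracy; the bundled breast-cancer data set is used here as a stand-in for the mammographic mass data.

        from sklearn.datasets import load_breast_cancer
        from sklearn.model_selection import cross_val_score
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.naive_bayes import GaussianNB
        from sklearn.neighbors import KNeighborsClassifier

        X, y = load_breast_cancer(return_X_y=True)   # stand-in for the mammographic mass data

        classifiers = {
            "decision tree": DecisionTreeClassifier(random_state=0),
            "naive Bayes": GaussianNB(),
            "k-NN (k=5)": KNeighborsClassifier(n_neighbors=5),
        }
        for name, clf in classifiers.items():
            scores = cross_val_score(clf, X, y, cv=5)
            print(f"{name:15s} mean accuracy = {scores.mean():.3f}")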

  10. Identification of noise in linear data sets by factor analysis

    Energy Technology Data Exchange (ETDEWEB)

    Roscoe, B.A.; Hopke, P.K.

    1981-01-01

    The approach to classical factor analysis described in this paper, i.e., doing the analysis for varying numbers of factors without prior assumptions about the number of factors, prevents one from getting erroneous results due to inherent computer code assumptions. Identification of a factor containing most of the variance of one variable, with little variance of other variables, pinpoints a possible difficulty in the data if the singularity has no obvious physical significance. Examination of the factor scores will determine whether the problem is isolated to a few samples or spread over all the samples. Having this information, one may then go back to the raw data and take the appropriate corrective action. Classical factor analysis has the ability to identify several types of errors in data after they have been generated, so it is ideally suited for scanning large data sets. The ease of the identification technique makes it a beneficial tool to use before the reduction and analysis of large data sets and should, in the long run, save time and effort.

  11. Determining Dimensionality of Exercise Readiness Using Exploratory Factor Analysis.

    Science.gov (United States)

    Strohacker, Kelley; Zakrajsek, Rebecca A

    2016-06-01

    Assessment of "exercise readiness" is a central component to the flexible non-linear periodization (FNLP) method of organizing training workloads, but the underlying factor structure of this construct has not been empirically determined. The purpose of this study was to assess construct dimensionality of exercise readiness using exploratory factor analysis. The result of which serve as initial steps of developing a brief measure of exercise readiness. Participants consisted of students recruited from undergraduate Kinesiology courses at a racially diverse, southern University. Independent, anonymous online survey data were collected across three stages: 1) generation of item pool (n = 290), 2) assessment of face validity and refinement of item pool (n = 168), and 3) exploratory factor analysis (n = 684). A principal axis factor analysis was conducted with 41 items using oblique rotation (promax). Four statistically significant factors, as determined through parallel analysis, explained 61.5% of the variance in exercise readiness. Factor 1 contained items that represented vitality (e.g., lively, revived). Factor 2 items related to physical fatigue (e.g. tired, drained). Factors 3 and 4 were descriptive of, discomfort (e.g. pain, sick) and health (i.e. healthy, fit), respectively. This inductive approach indicates that exercise readiness is comprised of four dimensions: vitality, physical fatigue, discomfort, and health. This finding supports readiness assessment techniques currently recommended for practitioners according to the FNLP model. These results serve as a theoretical foundation upon which to further develop and refine a brief survey instrument to measure exercise readiness. Key pointsAssessment of exercise readiness is a key component in implementing an exercise program based on flexible nonlinear periodization, but the dimensionality of this concept has not been empirically determined.Based on a series of surveys and a robust exploratory factor analysis

  12. Post-tonsillectomy hemorrhage: assessment of risk factors with special attention to introduction of coblation technique.

    Science.gov (United States)

    Heidemann, Christian H; Wallén, Mia; Aakesson, Marie; Skov, Peter; Kjeldsen, Anette D; Godballe, Christian

    2009-07-01

    Post-tonsillectomy hemorrhage (PTH) is a relatively common and potentially life-threatening complication. The objective of this study was to examine the rate of PTH and identify risk factors. A retrospective cohort study was carried out including all tonsillectomies (430 patients) performed at Odense University Hospital (OUH) or Svendborg Hospital (SH), Denmark. PTH occurred in 52 patients (12.1%). Of the 180 patients treated with the coblation technique, 41 (22.7%) had PTH. There were no fatal bleeding episodes. Multiple regression analysis resulted in three significant covariates: "coblation as surgical technique" [relative risk (RR) = 5.3], "peritonsillar abscess as indication for surgery" (RR = 0.3) and "age equal to or above 15 years at the time of surgery" (RR = 5.4). It is concluded that patient age, PTA as indication for surgery, and the use of coblation significantly affect the occurrence of PTH when coblation procedures are performed by non-experienced surgeons. We advise that the implementation of coblation tonsillectomy be thoroughly planned, with sufficient training of surgeons and continuous surveillance of results. If PTH rates comparable to "cold dissection" tonsillectomy cannot be reached, intervention (further training or discontinuation of coblation tonsillectomy) has to be undertaken.

  13. Golden glazes analysis by PIGE and PIXE techniques

    Science.gov (United States)

    Fonseca, M.; Luís, H.; Franco, N.; Reis, M. A.; Chaves, P. C.; Taborda, A.; Cruz, J.; Galaviz, D.; Fernandes, N.; Vieira, P.; Ribeiro, J. P.; Jesus, A. P.

    2011-12-01

    We present the analysis performed on the chemical composition of two golden glazes available in the market using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible of the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  14. Development of fault diagnostic technique using reactor noise analysis

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Ho; Kim, J. S.; Oh, I. S.; Ryu, J. S.; Joo, Y. S.; Choi, S.; Yoon, D. B

    1999-04-01

    The ultimate goal of this project is to establish analysis techniques to diagnose the integrity of reactor internals using reactor noise. Reactor noise analysis techniques for PWR and CANDU nuclear power plants (NPPs) were established, by which the dynamic characteristics of reactor internals and SPND instrumentation could be identified, and a noise database corresponding to each plant (both Korean and foreign) was constructed and compared. The changes in the dynamic characteristics of the Ulchin 1 and 2 reactor internals were also simulated under presumed fault conditions. Additionally, a portable reactor noise analysis system was developed so that real-time noise analysis could be performed directly at the plant site. The reactor noise analysis techniques developed and the database obtained from the fault simulations can be used to establish a knowledge-based expert system to diagnose abnormal NPP conditions, and the portable reactor noise analysis system may be utilized as a substitute for the plant IVMS (Internal Vibration Monitoring System). (author)

  15. Probabilistic Analysis Techniques Applied to Complex Spacecraft Power System Modeling

    Science.gov (United States)

    Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2005-01-01

    Electric power system performance predictions are critical to spacecraft, such as the International Space Station (ISS), to ensure that sufficient power is available to support all the spacecraft's power needs. In the case of the ISS power system, analyses to date have been deterministic, meaning that each analysis produces a single-valued result for power capability because of the complexity and large size of the model. As a result, the deterministic ISS analyses did not account for the sensitivity of the power capability to uncertainties in model input variables. Over the last 10 years, the NASA Glenn Research Center has developed advanced, computationally fast, probabilistic analysis techniques and successfully applied them to large (thousands of nodes) complex structural analysis models. These same techniques were recently applied to large, complex ISS power system models. This new application enables probabilistic power analyses that account for input uncertainties and produce results that include variations caused by these uncertainties. Specifically, N&R Engineering, under contract to NASA, integrated these advanced probabilistic techniques with Glenn's internationally recognized ISS power system model, System Power Analysis for Capability Evaluation (SPACE).

  16. Nuclear techniques of analysis in diamond synthesis and annealing

    Energy Technology Data Exchange (ETDEWEB)

    Jamieson, D. N.; Prawer, S.; Gonon, P.; Walker, R.; Dooley, S.; Bettiol, A.; Pearce, J. [Melbourne Univ., Parkville, VIC (Australia). School of Physics

    1996-12-31

    Nuclear techniques of analysis have played an important role in the study of synthetic and laser annealed diamond. These measurements have mainly used ion beam analysis with a focused MeV ion beam in a nuclear microprobe system. A variety of techniques have been employed. One of the most important is nuclear elastic scattering, sometimes called non-Rutherford scattering, which has been used to accurately characterise diamond films for thickness and composition. This is possible by the use of a database of measured scattering cross sections. Recently, this work has been extended and nuclear elastic scattering cross sections for both natural boron isotopes have been measured. For radiation damaged diamond, a focused laser annealing scheme has been developed which produces near complete regrowth of MeV phosphorus implanted diamonds. In the laser annealed regions, proton induced x-ray emission has been used to show that 50 % of the P atoms occupy lattice sites. This opens the way to produce n-type diamond for microelectronic device applications. All these analytical applications utilize a focused MeV microbeam which is ideally suited for diamond analysis. This presentation reviews these applications, as well as the technology of nuclear techniques of analysis for diamond with a focused beam. 9 refs., 6 figs.

  17. Noble Gas Measurement and Analysis Technique for Monitoring Reprocessing Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Charlton, William S [Univ. of California, Berkeley, CA (United States)

    1999-09-01

    An environmental monitoring technique using analysis of stable noble gas isotopic ratios on-stack at a reprocessing facility was developed. This technique integrates existing technologies to strengthen safeguards at reprocessing facilities. The isotopic ratios are measured using a mass spectrometry system and are compared to a database of calculated isotopic ratios using a Bayesian data analysis method to determine specific fuel parameters (e.g., burnup, fuel type, fuel age, etc.). These inferred parameters can be used by investigators to verify operator declarations. A user-friendly software application (named NOVA) was developed for the application of this technique. NOVA included a Visual Basic user interface coupling a Bayesian data analysis procedure to a reactor physics database (calculated using the Monteburns 3.01 code system). The integrated system (mass spectrometry, reactor modeling, and data analysis) was validated using on-stack measurements during the reprocessing of target fuel from a U.S. production reactor and gas samples from the processing of EBR-II fast breeder reactor driver fuel. These measurements led to an inferred burnup that matched the declared burnup with sufficient accuracy and consistency for most safeguards applications. The NOVA code was also tested using numerous light water reactor measurements from the literature. NOVA was capable of accurately determining spent fuel type, burnup, and fuel age for these experimental results. Work should continue to demonstrate the robustness of this system for production, power, and research reactor fuels.
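
    A minimal Python sketch of the Bayesian comparison step described above: a measured isotopic ratio with an assumed Gaussian uncertainty is evaluated against a pre-computed table of ratio versus burnup, giving a posterior over burnup. The table, measurement and uncertainties are invented and are not taken from the NOVA database.

        import numpy as np

        # hypothetical pre-computed reactor-physics table: burnup (GWd/tU) -> predicted isotope ratio
        burnup_grid = np.linspace(1, 50, 200)
        predicted_ratio = 0.02 + 0.004 * burnup_grid          # invented monotone relationship

        measured_ratio, sigma = 0.10, 0.005                   # stack measurement and its uncertainty

        # Gaussian likelihood of the measurement for each candidate burnup, flat prior
        log_like = -0.5 * ((measured_ratio - predicted_ratio) / sigma) ** 2
        posterior = np.exp(log_like - log_like.max())
        posterior /= np.trapz(posterior, burnup_grid)

        mean = np.trapz(burnup_grid * posterior, burnup_grid)
        print(f"inferred burnup ~ {mean:.1f} GWd/tU")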

  18. Analysis of Economic Factors Affecting Stock Market

    OpenAIRE

    Xie, Linyin

    2010-01-01

    This dissertation concentrates on an analysis of the economic factors affecting the Chinese stock market, examining the relationship between the stock market index and economic factors. Six economic variables are examined: industrial production, money supply 1, money supply 2, exchange rate, long-term government bond yield and total real estate value. The stock market comprises fixed-interest stocks and equity shares; in this dissertation, the stock market is restricted to the equity market. The stock price in thi...

  19. Model order reduction techniques with applications in finite element analysis

    CERN Document Server

    Qu, Zu-Qing

    2004-01-01

    Despite the continued rapid advance in computing speed and memory the increase in the complexity of models used by engineers persists in outpacing them. Even where there is access to the latest hardware, simulations are often extremely computationally intensive and time-consuming when full-blown models are under consideration. The need to reduce the computational cost involved when dealing with high-order/many-degree-of-freedom models can be offset by adroit computation. In this light, model-reduction methods have become a major goal of simulation and modeling research. Model reduction can also ameliorate problems in the correlation of widely used finite-element analyses and test analysis models produced by excessive system complexity. Model Order Reduction Techniques explains and compares such methods focusing mainly on recent work in dynamic condensation techniques: - Compares the effectiveness of static, exact, dynamic, SEREP and iterative-dynamic condensation techniques in producing valid reduced-order mo...
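
    A minimal Python sketch of static (Guyan) condensation, the simplest member of the condensation family compared in the book: the stiffness matrix is partitioned into master and slave degrees of freedom and the slaves are eliminated via K_red = K_mm - K_ms K_ss^{-1} K_sm. The small spring-chain matrix is invented for illustration.

        import numpy as np

        # stiffness matrix of a 4-DOF chain of unit springs, fixed at one end
        K = np.array([[ 2., -1.,  0.,  0.],
                      [-1.,  2., -1.,  0.],
                      [ 0., -1.,  2., -1.],
                      [ 0.,  0., -1.,  1.]])

        masters = [0, 3]                       # DOFs kept in the reduced model
        slaves = [1, 2]                        # DOFs condensed out

        Kmm = K[np.ix_(masters, masters)]
        Kms = K[np.ix_(masters, slaves)]
        Ksm = K[np.ix_(slaves, masters)]
        Kss = K[np.ix_(slaves, slaves)]

        K_red = Kmm - Kms @ np.linalg.solve(Kss, Ksm)   # Guyan-reduced stiffness
        print(K_red)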

  20. Large areas elemental mapping by ion beam analysis techniques

    Science.gov (United States)

    Silva, T. F.; Rodrigues, C. L.; Curado, J. F.; Allegro, P.; Moro, M. V.; Campos, P. H. O. V.; Santos, S. B.; Kajiya, E. A. M.; Rizzutto, M. A.; Added, N.; Tabacniks, M. H.

    2015-07-01

    The external beam line of the Laboratory for Material Analysis with Ion Beams (LAMFI) is a versatile setup for multi-technique analysis. X-ray detectors for Particle Induced X-ray Emission (PIXE) measurements, a gamma-ray detector for Particle Induced Gamma-ray Emission (PIGE), and a particle detector for scattering analysis, such as Rutherford Backscattering Spectrometry (RBS), were already installed. In this work, we present some results using a large (60-cm range) XYZ computer-controlled sample positioning system, completely developed and built in our laboratory. The XYZ stage was installed at the external beam line, and its high spatial resolution (better than 5 μm over the full range) enables positioning the sample with high accuracy and high reproducibility. The combination of a sub-millimeter beam with the large-range XYZ robotic stage is being used to produce elemental maps of large areas in samples like paintings, ceramics, stones, fossils, and all sorts of other samples. Due to its particular characteristics, this is a unique device for multi-technique analysis of large areas. With the continuous development of the external beam line at LAMFI, coupled to the robotic XYZ stage, it is becoming a robust and reliable option for the regular analysis of trace elements (Z > 5), competing with traditional in-vacuum ion-beam analysis and offering the advantage of automatic rastering.

  1. Phasor analysis of binary diffraction gratings with different fill factors

    Energy Technology Data Exchange (ETDEWEB)

    Martínez, Antonio [Departamento de Ciencia de Materiales, Óptica y Tecnología Electrónica, Universidad Miguel Hernández, 03202 Elche (Spain); Sánchez-López, Ma del Mar [Instituto de Bioingeniería y Departamento de Física y Arquitectura de Computadores, Universidad Miguel Hernández, 03202 Elche (Spain); Moreno, Ignacio [Departamento de Ciencia de Materiales, Óptica y Tecnología Electrónica, Universidad Miguel Hernández, 03202 Elche (Spain)

    2007-09-11

    In this work, we present a simple analysis of binary diffraction gratings with different slit widths relative to the grating period. The analysis is based on a simple phasor technique directly derived from the Huygens principle. By introducing a slit phasor and a grating phasor, the intensity of the diffracted orders and the grating's resolving power can be easily obtained without applying the usual Fourier transform operations required for these calculations. The proposed phasor technique is mathematically equivalent to the Fourier transform calculation of the diffraction order amplitude, and it can be useful to explain binary diffraction gratings in a simple manner in introductory physics courses. This theoretical analysis is illustrated with experimental results using a liquid crystal device to display diffraction gratings with different fill factors.
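
    A minimal Python sketch of the phasor picture for a binary amplitude grating: the open fraction of one period is sampled into many elementary Huygens phasors, their sum gives the amplitude of each diffraction order, and the result is checked against the familiar sinc-squared dependence on the fill factor. The fill factor and sampling density are arbitrary illustrative choices.

        import numpy as np

        fill_factor = 0.25          # slit width / grating period
        n_samples = 10_000          # elementary phasors across one period
        x = np.linspace(0, 1, n_samples, endpoint=False)
        aperture = (x < fill_factor).astype(float)      # binary transmission over one period

        for m in range(0, 4):       # diffraction order
            phasors = aperture * np.exp(-2j * np.pi * m * x)
            amplitude = phasors.sum() / n_samples        # normalized phasor sum
            intensity = abs(amplitude) ** 2
            sinc_model = (fill_factor * np.sinc(m * fill_factor)) ** 2   # analytic check
            print(f"order {m}: phasor sum {intensity:.5f}  sinc model {sinc_model:.5f}")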

  2. Efficient techniques for genotype-phenotype correlational analysis.

    Science.gov (United States)

    Saha, Subrata; Rajasekaran, Sanguthevar; Bi, Jinbo; Pathak, Sudipta

    2013-04-04

    Single Nucleotide Polymorphisms (SNPs) are sequence variations found in individuals at specific points in the genomic sequence. As SNPs are highly conserved throughout evolution and within a population, the map of SNPs serves as an excellent genotypic marker. Conventional SNP analysis mechanisms suffer from large run times, inefficient memory usage, and frequent overestimation. In this paper, we propose efficient, scalable, and reliable algorithms to select a small subset of SNPs from a large set of SNPs which can together be employed to perform phenotypic classification. Our algorithms exploit the techniques of gene selection and random projections to identify a meaningful subset of SNPs. To the best of our knowledge, these techniques have not been employed before in the context of genotype-phenotype correlations. Random projections are used to project the input data into a lower dimensional space (closely preserving distances). Gene selection is then applied on the projected data to identify a subset of the most relevant SNPs. We have compared the performance of our algorithms with one of the currently known best algorithms, Multifactor Dimensionality Reduction (MDR), and with the Principal Component Analysis (PCA) technique. Experimental results demonstrate that our algorithms are superior in terms of accuracy as well as run time. In our proposed techniques, random projection is used to map data from a high dimensional space to a lower dimensional space, and thus overcomes the curse of dimensionality; from this space of reduced dimension, we select the best subset of attributes. It is a unique mechanism in the domain of SNP analysis and, to the best of our knowledge, has not been employed before. As revealed by our experimental results, our proposed techniques offer the potential of high accuracy while keeping run times low.
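
    A minimal scikit-learn sketch of the two-stage idea described above (not the authors' implementation): a Gaussian random projection reduces dimensionality while approximately preserving distances, and a univariate selection step then keeps the projected features most associated with the phenotype. The synthetic genotype matrix and all parameter values are illustrative.

        import numpy as np
        from sklearn.random_projection import GaussianRandomProjection
        from sklearn.feature_selection import SelectKBest, f_classif

        rng = np.random.default_rng(5)
        n_samples, n_snps = 200, 5000
        X = rng.integers(0, 3, size=(n_samples, n_snps)).astype(float)   # 0/1/2 genotype coding
        # phenotype tied to two of the SNPs plus noise
        y = (X[:, 10] + X[:, 42] + rng.normal(size=n_samples) > 2).astype(int)

        # stage 1: random projection to a lower-dimensional space
        proj = GaussianRandomProjection(n_components=300, random_state=0)
        X_low = proj.fit_transform(X)

        # stage 2: keep the projected features most associated with the phenotype
        selector = SelectKBest(f_classif, k=20).fit(X_low, y)
        print("selected projected features:", np.flatnonzero(selector.get_support())[:10], "...")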

  3. The analysis of gastric function using computational techniques

    CERN Document Server

    Young, P

    2002-01-01

    The work presented in this thesis was carried out at the Magnetic Resonance Centre, Department of Physics and Astronomy, University of Nottingham, between October 1996 and June 2000. This thesis describes the application of computerised techniques to the analysis of gastric function, in relation to Magnetic Resonance Imaging data. The implementation of a computer program enabling the measurement of motility in the lower stomach is described in Chapter 6. This method allowed the dimensional reduction of multi-slice image data sets into a 'Motility Plot', from which the motility parameters - the frequency, velocity and depth of contractions - could be measured. The technique was found to be simple, accurate and involved substantial time savings, when compared to manual analysis. The program was subsequently used in the measurement of motility in three separate studies, described in Chapter 7. In Study 1, four different meal types of varying viscosity and nutrient value were consumed by 12 volunteers. The aim of...

  4. Technique Triangulation for Validation in Directed Content Analysis

    Directory of Open Access Journals (Sweden)

    Áine M. Humble PhD

    2009-09-01

    Division of labor in wedding planning varies for first-time marriages, with three types of couples—traditional, transitional, and egalitarian—identified, but nothing is known about wedding planning for remarrying individuals. Using semistructured interviews, the author interviewed 14 couples in which at least one person had remarried and used directed content analysis to investigate the extent to which the aforementioned typology could be transferred to this different context. In this paper she describes how a triangulation of analytic techniques provided validation for couple classifications and also helped with moving beyond “blind spots” in data analysis. Analytic approaches were the constant comparative technique, rank order comparison, and visual representation of coding, using MAXQDA 2007's tool called TextPortraits.

  5. Characterization of PTFE Using Advanced Thermal Analysis Techniques

    Science.gov (United States)

    Blumm, J.; Lindemann, A.; Meyer, M.; Strasser, C.

    2010-10-01

    Polytetrafluoroethylene (PTFE) is a synthetic fluoropolymer used in numerous industrial applications. It is often referred to by its trademark name, Teflon. Thermal characterization of a PTFE material was carried out using various thermal analysis and thermophysical properties test techniques. The transformation energetics and specific heat were measured employing differential scanning calorimetry. The thermal expansion and the density changes were determined employing pushrod dilatometry. The viscoelastic properties (storage and loss modulus) were analyzed using dynamic mechanical analysis. The thermal diffusivity was measured using the laser flash technique. Combining thermal diffusivity data with specific heat and density allows calculation of the thermal conductivity of the polymer. Measurements were carried out from - 125 °C up to 150 °C. Additionally, measurements of the mechanical properties were carried out down to - 170 °C. The specific heat tests were conducted into the fully molten regions up to 370 °C.
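
    A small worked example of the relation used above to obtain thermal conductivity from the laser-flash and supporting measurements, k = alpha * rho * c_p; the numbers are round illustrative values, not the measured PTFE results.

        # thermal conductivity from diffusivity, density and specific heat: k = alpha * rho * cp
        alpha = 0.11e-6      # thermal diffusivity [m^2/s] (illustrative)
        rho = 2.16e3         # density [kg/m^3] (illustrative)
        cp = 1.0e3           # specific heat [J/(kg*K)] (illustrative)

        k = alpha * rho * cp
        print(f"thermal conductivity k = {k:.3f} W/(m*K)")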

  6. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
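
    A minimal scikit-learn sketch of the dependency-tree idea: a regression tree is trained to explain a KPI from lower-level process and QoS metrics, and the fitted tree shows which factors the KPI depends on. The metric names and synthetic data are illustrative, not part of the described framework.

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor, export_text

        rng = np.random.default_rng(6)
        n = 1000
        service_latency = rng.exponential(200, n)        # ms, QoS metric
        approval_steps = rng.integers(1, 5, n)           # process metric
        queue_length = rng.poisson(3, n)                 # process metric

        # hypothetical KPI: order fulfillment time driven mainly by latency and approvals
        kpi = 0.01 * service_latency + 2.0 * approval_steps + rng.normal(0, 0.5, n)

        X = np.column_stack([service_latency, approval_steps, queue_length])
        names = ["service_latency", "approval_steps", "queue_length"]

        tree = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, kpi)
        print(dict(zip(names, np.round(tree.feature_importances_, 2))))   # influential factors
        print(export_text(tree, feature_names=names))                     # readable dependency tree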

  7. Mathematical analysis techniques for modeling the space network activities

    Science.gov (United States)

    Foster, Lisa M.

    1992-01-01

    The objective of the present work was to explore and identify mathematical analysis techniques, and in particular, the use of linear programming. This topic was then applied to the Tracking and Data Relay Satellite System (TDRSS) in order to understand the space network better. Finally, a small scale version of the system was modeled, variables were identified, data was gathered, and comparisons were made between actual and theoretical data.

  8. Calcium Hardness Analysis of Water Samples Using EDXRF Technique

    Directory of Open Access Journals (Sweden)

    Kanan Deep

    2014-08-01

    Calcium hardness of water samples has been determined using a method based upon the Energy Dispersive X-ray Fluorescence (EDXRF) technique for elemental analysis. The minimum detection limit for Ca has been found to be in the range 0.1-100 ppm. The experimental approach and analytical method for calcium studies seem satisfactory for the purpose and can be utilized for similar investigations.

  9. Technique of Hadamard transform microscope fluorescence image analysis

    Institute of Scientific and Technical Information of China (English)

    梅二文; 顾文芳; 曾晓斌; 陈观铨; 曾云鹗

    1995-01-01

    The Hadamard transform spatial multiplexed imaging technique is combined with a fluorescence microscope, and an instrument for Hadamard transform microscope fluorescence image analysis is developed. Images acquired by this instrument can provide a great deal of useful information simultaneously, including the three-dimensional Hadamard transform microscope cell fluorescence image, the fluorescence intensity and fluorescence distribution of a cell, the background signal intensity, and the signal/noise ratio.
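
    A minimal Python sketch of Hadamard multiplexed measurement under simple assumptions: the scene is measured through rows of a Hadamard matrix (many elements contribute to each measurement) and recovered by the inverse transform; the scene vector and noise level are invented.

        import numpy as np
        from scipy.linalg import hadamard

        n = 16
        scene = np.zeros(n)
        scene[[3, 7, 8]] = [1.0, 0.5, 2.0]        # hypothetical fluorescence intensities

        H = hadamard(n).astype(float)             # rows define the multiplexing masks (+1/-1)
        measurements = H @ scene                  # each measurement sums many elements at once
        measurements += np.random.default_rng(7).normal(scale=0.01, size=n)   # detector noise

        decoded = H.T @ measurements / n          # inverse Hadamard transform (H H^T = n I)
        print(np.round(decoded, 2))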

  10. Failure Analysis Seminar: Techniques and Teams. Seminar Notes. Volume I.

    Science.gov (United States)

    1981-01-01

    Failure Analysis Strategy - Augustine E. Magistro. Introduction: A primary task of management and systems... by Augustine Magistro, Picatinny Arsenal, and Lawrence R. Seggel, U.S. Army Missile Command. The report is available from the National Technical... to emphasize techniques - identification and improvement of your leadership styles. Biographic sketches: A.E. "Gus" Magistro - Systems Evaluation

  11. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

    Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 μm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator; one week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques have proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  12. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis

    OpenAIRE

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    Introduction: An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a comm...

  13. Analysis of diagnostic calorimeter data by the transfer function technique

    Science.gov (United States)

    Delogu, R. S.; Poggi, C.; Pimazzoni, A.; Rossi, G.; Serianni, G.

    2016-02-01

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.

  14. Analysis of diagnostic calorimeter data by the transfer function technique

    Energy Technology Data Exchange (ETDEWEB)

    Delogu, R. S., E-mail: rita.delogu@igi.cnr.it; Pimazzoni, A.; Serianni, G. [Consorzio RFX, Corso Stati Uniti, 35127 Padova (Italy); Poggi, C.; Rossi, G. [Università degli Studi di Padova, Via 8 Febbraio 1848, 35122 Padova (Italy)

    2016-02-15

    This paper describes the analysis procedure applied to the thermal measurements on the rear side of a carbon fibre composite calorimeter with the purpose of reconstructing the energy flux due to an ion beam colliding on the front side. The method is based on the transfer function technique and allows a fast analysis by means of the fast Fourier transform algorithm. Its efficacy has been tested both on simulated and measured temperature profiles: in all cases, the energy flux features are well reproduced and beamlets are well resolved. Limits and restrictions of the method are also discussed, providing strategies to handle issues related to signal noise and digital processing.
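
    As a rough, self-contained illustration of the frequency-domain workflow described above (not the authors' implementation), the following Python sketch recovers a front-side flux history from a rear-side temperature trace by regularised division of FFTs. The first-order low-pass response, the time constant tau, the regularisation term and the synthetic "beamlet" pulse are all invented assumptions.

        import numpy as np

        def reconstruct_flux(rear_temperature, dt, tau=0.05, eps=1e-3):
            """Recover a front-side flux history from a rear-side temperature trace.

            A first-order low-pass response H(f) = 1 / (1 + i*2*pi*f*tau) stands in for
            the (unknown) calorimeter transfer function; eps regularises the division.
            """
            n = rear_temperature.size
            freqs = np.fft.rfftfreq(n, d=dt)
            H = 1.0 / (1.0 + 2j * np.pi * freqs * tau)                # assumed transfer function
            T_hat = np.fft.rfft(rear_temperature)
            flux_hat = T_hat * np.conj(H) / (np.abs(H) ** 2 + eps)    # regularised 1/H division
            return np.fft.irfft(flux_hat, n=n)

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            dt = 1e-3
            t = np.arange(0.0, 2.0, dt)
            true_flux = np.exp(-0.5 * ((t - 0.5) / 0.05) ** 2)        # one synthetic "beamlet" pulse
            impulse = np.exp(-t / 0.05) / 0.05 * dt                   # matching time-domain response
            rear = np.convolve(true_flux, impulse)[: t.size] + 0.01 * rng.standard_normal(t.size)
            est = reconstruct_flux(rear, dt, tau=0.05)
            print("pulse peak, true vs reconstructed [s]:", t[true_flux.argmax()], t[est.argmax()])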

  15. What Is Rotating in Exploratory Factor Analysis?

    Science.gov (United States)

    Osborne, Jason W.

    2015-01-01

    Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing…

  16. Stepwise Variable Selection in Factor Analysis.

    Science.gov (United States)

    Kano, Yutaka; Harada, Akira

    2000-01-01

    Takes several goodness-of-fit statistics as measures of variable selection and develops backward elimination and forward selection procedures in exploratory factor analysis. A newly developed variable selection program, SEFA, can print several fit measures for a current model and models obtained by removing an internal variable or adding an…

  17. Multilevel exploratory factor analysis of discrete data

    NARCIS (Netherlands)

    Barendse, M.T.; Oort, F.J.; Jak, S.; Timmerman, M.E.

    2013-01-01

    Exploratory factor analysis (EFA) can be used to determine the dimensionality of a set of items. When data come from clustered subjects, such as pupils within schools or children within families, the hierarchical structure of the data should be taken into account. Standard multilevel EFA is only sui

  18. Categorical and nonparametric data analysis choosing the best statistical technique

    CERN Document Server

    Nussbaum, E Michael

    2014-01-01

    Featuring in-depth coverage of categorical and nonparametric statistics, this book provides a conceptual framework for choosing the most appropriate type of test in various research scenarios. Class tested at the University of Nevada, the book's clear explanations of the underlying assumptions, computer simulations, and Exploring the Concept boxes help reduce reader anxiety. Problems inspired by actual studies provide meaningful illustrations of the techniques. The underlying assumptions of each test and the factors that impact validity and statistical power are reviewed so readers can explain

  19. Are factor analytical techniques used appropriately in the validation of health status questionnaires?

    DEFF Research Database (Denmark)

    de Vet, Henrica C W; Adér, Herman J; Terwee, Caroline B

    2005-01-01

    Factor analysis is widely used to evaluate whether questionnaire items can be grouped into clusters representing different dimensions of the construct under study. This review focuses on the appropriate use of factor analysis. The Medical Outcomes Study Short Form-36 (SF-36) is used as an example....... Articles were systematically searched and assessed according to a number of criteria for appropriate use and reporting. Twenty-eight studies were identified: exploratory factor analysis was performed in 22 studies, confirmatory factor analysis was performed in five studies and in one study both were...... of methods is crucial for correct interpretation of the results and verification of the conclusions. Our list of criteria may be useful for journal editors, reviewers and researchers who have to assess publications in which factor analysis is applied....

  20. ANALYSIS OF THE FACTORS AFFECTING THE AVERAGE

    Directory of Open Access Journals (Sweden)

    Carmen BOGHEAN

    2013-12-01

    Full Text Available Productivity in agriculture most relevantly and concisely expresses the economic efficiency of using the factors of production. Labour productivity is affected by a considerable number of variables (including the relationship system and interdependence between factors), which differ in each economic sector and influence it, giving rise to a series of technical, economic and organizational idiosyncrasies. The purpose of this paper is to analyse the underlying factors of the average work productivity in agriculture, forestry and fishing. The analysis takes into account data on the economically active population and the gross added value in agriculture, forestry and fishing in Romania during 2008-2011. The decomposition of the average work productivity by the factors affecting it is conducted by means of the u-substitution method.

  1. The potential of electroanalytical techniques in pharmaceutical analysis.

    Science.gov (United States)

    Kauffmann, J M; Pékli-Novák, M; Nagy, A

    1996-03-01

    With the considerable progress observed in analytical instrumentation, it was of interest to survey recent trends in the field of electroanalysis of drugs. Potentiometric, voltammetric and amperometric techniques were scrutinized both in terms of historical evolution and in terms of potentialities with respect to the analysis of drugs in various matrices. With regard to the former, it appeared that numerous original selective electrodes (for drugs and ions) have been studied and several ion-selective electrodes have been successfully commercialized. Improvements are still expected in this field in order to find more robust membrane matrices and to minimize surface fouling. Electrochemistry is well suited for trace metal analysis. A renewed interest in potentiometric stripping analysis is observed and is stimulated by the power of computers and microprocessors, which allow rapid signal recording and data handling. Polarography and its refinements (pulsed waveforms, automation, ...) are ideally applied for trace metal analysis and speciation. The technique is still useful in the analysis of drug formulations and in biological samples provided that the method is adequately validated (selectivity!). The same holds for solid electrodes, which are currently routinely applied as sensitive detectors after chromatographic separation. New instrumentation is soon expected as regards electrochemical detection in capillary electrophoresis. Actually, in order to increase the responses and improve the selectivity, solid electrodes are the subject of intense research dedicated to surface modifications. Perm-selectivity, chelation, catalysis, etc. may be considered as appropriate strategies. Microelectrodes and screen-printed (disposable) sensors are of considerable interest in cell culture, e.g. for single cell excretion analysis, and in field (decentralized) assays, respectively. Finally, several biosensors and electrochemical immunoassays have been successfully developed for the

  2. A comparison study on detection of key geochemical variables and factors through three different types of factor analysis

    Science.gov (United States)

    Hoseinzade, Zohre; Mokhtari, Ahmad Reza

    2017-10-01

    Large numbers of variables have been measured to explain different phenomena. Factor analysis has widely been used in order to reduce the dimension of datasets. Additionally, the technique has been employed to highlight underlying factors hidden in a complex system. As geochemical studies benefit from multivariate assays, application of this method is widespread in geochemistry. However, the conventional protocols for implementing factor analysis have some drawbacks in spite of their advantages. In the present study, a geochemical dataset including 804 soil samples collected from a mining area in central Iran in order to search for MVT type Pb-Zn deposits was considered to outline geochemical analysis through various factor analysis methods. Routine factor analysis, sequential factor analysis, and staged factor analysis were applied to the dataset, after opening the data with the alr (additive log-ratio) transformation, to extract the mineralization factor in the dataset. A comparison between these methods indicated that sequential factor analysis more clearly revealed MVT paragenesis elements in surface samples, with nearly 50% variation in F1. In addition, staged factor analysis gave acceptable results while being easy to apply. It detected mineralization-related elements and assigned them larger factor loadings, resulting in a clearer expression of the mineralization signal.
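
    A hedged sketch of the "opened" factor analysis step mentioned above: compositional data are alr-transformed and passed to a routine factor analysis. The element list, the synthetic concentrations and the use of scikit-learn's FactorAnalysis are illustrative assumptions; the sequential and staged variants discussed in the record additionally repeat the analysis on element subsets.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        elements = ["Pb", "Zn", "Fe", "Ca", "Mg"]            # last part is used as the alr divisor
        X = rng.lognormal(mean=1.0, sigma=0.5, size=(200, len(elements)))
        X = X / X.sum(axis=1, keepdims=True)                  # close the composition to 1

        # "open" the data: additive log-ratio of each part over the divisor part
        alr = np.log(X[:, :-1] / X[:, -1:])

        fa = FactorAnalysis(n_components=2, random_state=0)
        scores = fa.fit_transform(alr)                        # factor scores per sample
        loadings = fa.components_                             # shape (n_factors, n_variables)

        for k, row in enumerate(loadings, start=1):
            print(f"F{k} loadings:", dict(zip(elements[:-1], np.round(row, 2))))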

  3. Human factors multi-technique approach to teenage engagement in digital technologies health research

    OpenAIRE

    Lang, Alexandra R; Craven, Michael P; Atkinson, Sarah; Simons, Lucy; Cobb, Sue; Mazzola, Marco

    2016-01-01

    This chapter explores the use of multi-techniques for teenage HCI health research. Through four case studies we present information about adolescents as users of healthcare services and technologies, adolescent personal development and the human factors approaches through which teenagers have been involved in healthcare research projects. In each case study; comprising of the design or evaluation of a new digital technology for supporting health or well-being, the techniques used by researche...

  4. Applying Stylometric Analysis Techniques to Counter Anonymity in Cyberspace

    Directory of Open Access Journals (Sweden)

    Jianwen Sun

    2012-02-01

    Full Text Available Due to the ubiquitous nature of cyberspace and the abuse of anonymity in it, tracing criminal identities is difficult in cybercrime investigation. Writeprint identification offers a valuable tool to counter anonymity by applying stylometric analysis techniques to help identify individuals based on textual traces. In this study, a framework for online writeprint identification is proposed. Variable-length character n-grams are used to represent the author's writing style. The technique of IG-seeded GA-based feature selection for Ensemble (IGAE) is also developed to build an identification model based on individual author level features. Several specific components for dealing with the individual feature set are integrated to improve the performance. The proposed feature and technique are evaluated on a real world data set encompassing reviews posted by 50 Amazon customers. The experimental results show the effectiveness of the proposed framework, with accuracy over 94% for 20 authors and over 80% for 50. Compared with the baseline technique (Support Vector Machine), a higher performance is achieved by using IGAE, resulting in a 2% and 8% improvement over SVM for 20 and 50 authors respectively. Moreover, it has been shown that IGAE is more scalable in terms of the number of authors than author group level based methods.
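
    To make the feature side of such a framework concrete, the sketch below builds character n-gram features and trains the record's baseline classifier (a linear SVM); it is not the proposed IGAE ensemble, and the toy texts and author labels are invented for illustration.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        train_texts = [
            "I loved this blender, it arrived quickly and works great.",
            "Fast shipping, the blender is powerful and quiet, very happy.",
            "The novel was dull; the characters never came alive for me.",
            "Flat characters and a plodding plot, I could not finish the book.",
        ]
        train_authors = ["author_A", "author_A", "author_B", "author_B"]

        model = make_pipeline(
            # character n-grams of length 2-4 stand in for "variable length character n-gram"
            TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
            LinearSVC(),
        )
        model.fit(train_texts, train_authors)
        print(model.predict(["Quick delivery and the blender blends everything, love it."]))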

  5. Comparative analysis of affinity-based 5-hydroxymethylation enrichment techniques

    Science.gov (United States)

    Thomson, John P.; Hunter, Jennifer M.; Nestor, Colm E.; Dunican, Donncha S.; Terranova, Rémi; Moggs, Jonathan G.; Meehan, Richard R.

    2013-01-01

    The epigenetic modification of 5-hydroxymethylcytosine (5hmC) is receiving great attention due to its potential role in DNA methylation reprogramming and as a cell state identifier. Given this interest, it is important to identify reliable and cost-effective methods for the enrichment of 5hmC marked DNA for downstream analysis. We tested three commonly used affinity-based enrichment techniques: (i) antibody, (ii) chemical capture and (iii) protein affinity enrichment, and assessed their ability to accurately and reproducibly report 5hmC profiles in mouse tissues containing high (brain) and lower (liver) levels of 5hmC. The protein-affinity technique is a poor reporter of 5hmC profiles, delivering 5hmC patterns that are incompatible with other methods. Both antibody and chemical capture-based techniques generate highly similar genome-wide patterns for 5hmC, which are independently validated by standard quantitative PCR (qPCR) and glucosyl-sensitive restriction enzyme digestion (gRES-qPCR). Both antibody- and chemical-capture-generated profiles reproducibly link to unique chromatin modification profiles associated with 5hmC. However, there appears to be a slight bias of the antibody to bind to regions of DNA rich in simple repeats. Ultimately, the increased specificity observed with chemical capture-based approaches makes this an attractive method for the analysis of locus-specific or genome-wide patterns of 5hmC. PMID:24214958

  6. Node Augmentation Technique in Bayesian Network Evidence Analysis and Marshaling

    Energy Technology Data Exchange (ETDEWEB)

    Keselman, Dmitry [Los Alamos National Laboratory; Tompkins, George H [Los Alamos National Laboratory; Leishman, Deborah A [Los Alamos National Laboratory

    2010-01-01

    Given a Bayesian network, sensitivity analysis is an important activity. This paper begins by describing a network augmentation technique which can simplify the analysis. Next, we present two techniques which allow the user to determine the probability distribution of a hypothesis node under conditions of uncertain evidence; i.e. the state of an evidence node or nodes is described by a user-specified probability distribution. Finally, we conclude with a discussion of three criteria for ranking evidence nodes based on their influence on a hypothesis node. All of these techniques have been used in conjunction with a commercial software package. A Bayesian network based on a directed acyclic graph (DAG) G is a graphical representation of a system of random variables that satisfies the following Markov property: any node (random variable) is independent of its non-descendants given the state of all its parents (Neapolitan, 2004). For simplicity's sake, we consider only discrete variables with a finite number of states, though most of the conclusions may be generalized.

  7. Analysis of Interaction Factors Between Two Piles

    Institute of Scientific and Technical Information of China (English)

    CAO Ming; CHEN Long-zhu

    2008-01-01

    A rigorous analytical method is presented for calculating the interaction factor between two identical piles subjected to vertical loads. Following the technique proposed by Muki and Sternberg, the problem is decomposed into an extended soil mass and two fictitious piles characterized respectively by Young's modulus of the soil and that of the difference between the pile and soil. The unknown axial forces along the fictitious piles are determined by solving a Fredholm integral equation of the second kind, which imposes the compatibility condition that the axial strains of the fictitious piles are equal to those corresponding to the centroidal axes of the extended soil. The real pile forces and displacements can subsequently be calculated based on the determined fictitious pile forces, and finally, the desired pile interaction factors may be obtained. Results confirm the validity of the proposed approach and portray the influence of the governing parameters on the pile interaction.

  8. Requirements Analyses Integrating Goals and Problem Analysis Techniques

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    One of the difficulties that goal-oriented requirements analyses encounter is that the efficiency of the goal refinement depends on the analysts' subjective knowledge and experience. To improve the efficiency of the requirements elicitation process, engineers need approaches with more systemized analysis techniques. This paper integrates the goal-oriented requirements language i* with concepts from a structured problem analysis notation, problem frames (PF). The PF approach analyzes software design as a contextualized problem which has to respond to constraints imposed by the environment. The proposed approach is illustrated using the meeting scheduler exemplar. Results show that integration of the goal and the problem analysis enables simultaneous consideration of the designer's subjective intentions and the physical environmental constraints.

  9. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post processor was examined in detail and simulation techniques relative to conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single error source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As a part of this study, LEA results were verified as follows: (A) hand calculating the RSS data and the elements of the covariance matrix for comparison with the LEA processor computed data; (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
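
    A minimal sketch of the root-sum-square combination described above, with invented error sources and deviation values: single-error-source 3-sigma deviations are combined by RSS and, assuming independent sources, into a covariance matrix for the dispersed quantities.

        import numpy as np

        # rows = independent error sources, columns = dispersed quantities
        # (say, altitude [m] and velocity [m/s] at MECO); all values are invented
        deviations_3sigma = np.array([
            [120.0, 2.5],   # thrust uncertainty
            [ 80.0, 1.0],   # specific-impulse uncertainty
            [ 40.0, 0.4],   # atmospheric-density uncertainty
        ])

        rss_3sigma = np.sqrt((deviations_3sigma ** 2).sum(axis=0))
        print("3-sigma RSS dispersions:", rss_3sigma)

        # covariance of the dispersed quantities, assuming independent error sources
        one_sigma = deviations_3sigma / 3.0
        covariance = sum(np.outer(row, row) for row in one_sigma)
        print("covariance matrix:\n", covariance)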

  10. Arc-length technique for nonlinear finite element analysis

    Institute of Scientific and Technical Information of China (English)

    MEMON Bashir-Ahmed; SU Xiao-zu(苏小卒)

    2004-01-01

    Nonlinear solution of reinforced concrete structures, particularly the complete load-deflection response, requires tracing of the equilibrium path and proper treatment of the limit and bifurcation points. In this regard, ordinary solution techniques lead to instability near the limit points and also have problems in cases of snap-through and snap-back. Thus they fail to predict the complete load-displacement response. The arc-length method serves the purpose well in principle, has received wide acceptance in finite element analysis, and has been used extensively. However, modifications to the basic idea are vital to meet the particular needs of the analysis. This paper reviews some of the recent developments of the method in the last two decades, with particular emphasis on nonlinear finite element analysis of reinforced concrete structures.

  11. Error reduction technique using covariant approximation and application to nucleon form factor

    CERN Document Server

    Blum, Thomas; Shintani, Eigo

    2012-01-01

    We demonstrate a new class of variance reduction techniques for the hadron propagator and the nucleon isovector form factor on realistic lattices with $N_f=2+1$ domain-wall fermions. All-mode averaging (AMA) is a powerful tool for reducing the statistical noise effectively for a wider variety of observables than existing techniques such as low-mode averaging (LMA). We apply this technique to hadron two-point functions and three-point functions, and compare with LMA and the traditional source-shift method on the same ensembles. We observe that AMA is much more cost effective in reducing statistical error for these observables.

  12. COMPARISON ANALYSIS OF WEB USAGE MINING USING PATTERN RECOGNITION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Nanhay Singh

    2013-07-01

    Full Text Available Web usage mining is the application of data mining techniques to better serve the needs of web-based applications on the web site. In this paper, we analyze web usage mining by applying pattern recognition techniques to web log data. Pattern recognition is defined as the act of taking in raw data and making an action based on the 'category' of the pattern. Web usage mining is divided into three parts: preprocessing, pattern discovery and pattern analysis. Further, this paper presents experimental work in which web log data is used. We have taken the web log data from the "NASA" web server, which is analyzed with "Web Log Explorer". Web Log Explorer is a web usage mining tool which plays a vital role in carrying out this work.

  13. BaTMAn: Bayesian Technique for Multi-image Analysis

    CERN Document Server

    Casado, J; García-Benito, R; Guidi, G; Choudhury, O S; Bellocchi, E; Sánchez, S; Díaz, A I

    2016-01-01

    This paper describes the Bayesian Technique for Multi-image Analysis (BaTMAn), a novel image segmentation technique based on Bayesian statistics, whose main purpose is to characterize an astronomical dataset containing spatial information and perform a tessellation based on the measurements and errors provided as input. The algorithm will iteratively merge spatial elements as long as they are statistically consistent with carrying the same information (i.e. signal compatible with being identical within the errors). We illustrate its operation and performance with a set of test cases that comprises both synthetic and real Integral-Field Spectroscopic (IFS) data. Our results show that the segmentations obtained by BaTMAn adapt to the underlying structure of the data, regardless of the precise details of their morphology and the statistical properties of the noise. The quality of the recovered signal represents an improvement with respect to the input, especially in those regions where the signal is actually con...

  14. Application of thermal analysis techniques in activated carbon production

    Science.gov (United States)

    Donnals, G.L.; DeBarr, J.A.; Rostam-Abadi, M.; Lizzio, A.A.; Brady, T.A.

    1996-01-01

    Thermal analysis techniques have been used at the ISGS as an aid in the development and characterization of carbon adsorbents. Promising adsorbents from fly ash, tires, and Illinois coals have been produced for various applications. Process conditions determined in the preparation of gram quantities of carbons were used as guides in the preparation of larger samples. TG techniques developed to characterize the carbon adsorbents included the measurement of the kinetics of SO2 adsorption, the performance of rapid proximate analyses, and the determination of equilibrium methane adsorption capacities. Thermal regeneration of carbons was assessed by TG to predict the life cycle of carbon adsorbents in different applications. TPD was used to determine the nature of surface functional groups and their effect on a carbon's adsorption properties.

  15. Infrared Spectroscopy of Explosives Residues: Measurement Techniques and Spectral Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, Mark C.; Bernacki, Bruce E.

    2015-03-11

    Infrared laser spectroscopy of explosives is a promising technique for standoff and non-contact detection applications. However, the interpretation of spectra obtained in typical standoff measurement configurations presents numerous challenges. Understanding the variability in observed spectra from explosives residues and particles is crucial for design and implementation of detection algorithms with high detection confidence and low false alarm probability. We discuss a series of infrared spectroscopic techniques applied toward measuring and interpreting the reflectance spectra obtained from explosives particles and residues. These techniques utilize the high spectral radiance, broad tuning range, rapid wavelength tuning, high scan reproducibility, and low noise of an external cavity quantum cascade laser (ECQCL) system developed at Pacific Northwest National Laboratory. The ECQCL source permits measurements in configurations which would be either impractical or overly time-consuming with broadband, incoherent infrared sources, and enables a combination of rapid measurement speed and high detection sensitivity. The spectroscopic methods employed include standoff hyperspectral reflectance imaging, quantitative measurements of diffuse reflectance spectra, reflection-absorption infrared spectroscopy, microscopic imaging and spectroscopy, and nano-scale imaging and spectroscopy. Measurements of explosives particles and residues reveal important factors affecting observed reflectance spectra, including measurement geometry, substrate on which the explosives are deposited, and morphological effects such as particle shape, size, orientation, and crystal structure.

  16. An Empirical Analysis of Rough Set Categorical Clustering Techniques

    Science.gov (United States)

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more general rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy. PMID:28068344

  17. An Empirical Analysis of Rough Set Categorical Clustering Techniques.

    Science.gov (United States)

    Uddin, Jamal; Ghazali, Rozaida; Deris, Mustafa Mat

    2017-01-01

    Clustering a set of objects into homogeneous groups is a fundamental operation in data mining. Recently, much attention has been paid to categorical data clustering, where data objects are made up of non-numerical attributes. For categorical data clustering, rough set based approaches such as Maximum Dependency Attribute (MDA) and Maximum Significance Attribute (MSA) have outperformed their predecessor approaches like Bi-Clustering (BC), Total Roughness (TR) and Min-Min Roughness (MMR). This paper presents the limitations and issues of the MDA and MSA techniques on special types of data sets where both techniques fail to select, or face difficulty in selecting, their best clustering attribute. This analysis motivates the need for a better and more general rough set theory approach that can cope with the issues of MDA and MSA. Hence, an alternative technique named Maximum Indiscernible Attribute (MIA) for clustering categorical data using rough set indiscernibility relations is proposed. The novelty of the proposed approach is that, unlike other rough set theory techniques, it uses the domain knowledge of the data set. It is based on the concept of the indiscernibility relation combined with a number of clusters. To show the significance of the proposed approach, the effect of the number of clusters on rough accuracy, purity and entropy is described in the form of propositions. Moreover, ten different data sets from previously utilized research cases and the UCI repository are used for experiments. The results, produced in tabular and graphical forms, show that the proposed MIA technique provides better performance in selecting the clustering attribute in terms of purity, entropy, iterations, time, accuracy and rough accuracy.
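
    As a small illustration of the indiscernibility relation on which MIA is built (not the authors' algorithm), the sketch below partitions an invented categorical table into equivalence classes per attribute; selecting a clustering attribute would then compare such partitions against a target number of clusters.

        from collections import defaultdict

        objects = {
            "o1": {"colour": "red",   "shape": "round",  "size": "small"},
            "o2": {"colour": "red",   "shape": "square", "size": "small"},
            "o3": {"colour": "blue",  "shape": "round",  "size": "large"},
            "o4": {"colour": "blue",  "shape": "round",  "size": "small"},
            "o5": {"colour": "green", "shape": "square", "size": "large"},
        }

        def indiscernibility_classes(table, attribute):
            """Group objects that cannot be discerned by the given attribute."""
            classes = defaultdict(set)
            for obj, row in table.items():
                classes[row[attribute]].add(obj)
            return dict(classes)

        for attr in ["colour", "shape", "size"]:
            parts = indiscernibility_classes(objects, attr)
            print(f"{attr:6s} -> {len(parts)} classes: {parts}")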

  18. Investigation of spectral analysis techniques for randomly sampled velocimetry data

    Science.gov (United States)

    Sree, Dave

    1993-01-01

    It is well known that laser velocimetry (LV) generates individual-realization velocity data that are randomly or unevenly sampled in time. Spectral analysis of such data to obtain the turbulence spectra, and hence turbulence scales information, requires special techniques. The 'slotting' technique of Mayo et al., also described by Roberts and Ajmani, and the 'Direct Transform' method of Gaster and Roberts are well known in the LV community. The slotting technique is computationally faster than the direct transform method. There are practical limitations, however, as to how high in frequency an accurate estimate can be made for a given mean sampling rate. These high frequency estimates are important in obtaining the microscale information of the turbulence structure. It was found from previous studies that reliable spectral estimates can be made up to about the mean sampling frequency (mean data rate) or less. If the data were evenly sampled, the frequency range would be half the sampling frequency (i.e. up to the Nyquist frequency); otherwise, an aliasing problem would occur. The mean data rate and the sample size (total number of points) basically limit the frequency range. Also, there are large variabilities or errors associated with the high frequency estimates from randomly sampled signals. Roberts and Ajmani proposed certain pre-filtering techniques to reduce these variabilities, but at the cost of low frequency estimates. The prefiltering acts as a high-pass filter. Further, Shapiro and Silverman showed theoretically that, for Poisson sampled signals, it is possible to obtain alias-free spectral estimates far beyond the mean sampling frequency. But the question is, how far? During his tenure under the 1993 NASA-ASEE Summer Faculty Fellowship Program, the author found from his studies of spectral analysis techniques for randomly sampled signals that the spectral estimates can be enhanced or improved up to about 4-5 times the mean sampling frequency by using a suitable
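
    A rough Python sketch of the slotting technique referred to above, with an invented Poisson-sampled test signal: velocity products are sorted into lag bins ("slots") to estimate the autocorrelation, which is then cosine-transformed into a spectral estimate. The slot width, record length and data rate are illustrative choices, not values from the study.

        import numpy as np

        rng = np.random.default_rng(1)
        mean_rate = 500.0                                 # mean data rate [Hz]
        record_length = 20.0                              # [s]
        t = np.cumsum(rng.exponential(1.0 / mean_rate, int(mean_rate * record_length)))
        u = np.sin(2 * np.pi * 40.0 * t) + 0.3 * rng.standard_normal(t.size)   # 40 Hz tone + noise
        u -= u.mean()

        dtau, max_lag = 1e-3, 0.25                        # slot width and longest lag [s]
        n_slots = int(max_lag / dtau)
        num, cnt = np.zeros(n_slots), np.zeros(n_slots)

        for i in range(t.size):                           # simple O(N^2) reference implementation
            lags = t[i:] - t[i]
            keep = lags < max_lag
            k = (lags[keep] / dtau).astype(int)
            np.add.at(num, k, u[i] * u[i:][keep])
            np.add.at(cnt, k, 1)

        acf = np.where(cnt > 0, num / np.maximum(cnt, 1), 0.0)
        acf /= acf[0]                                     # normalise by the variance estimate

        taus = (np.arange(n_slots) + 0.5) * dtau
        freqs = np.arange(1, 200, dtype=float)            # evaluate the spectrum at 1..199 Hz
        spectrum = np.array([2 * dtau * np.sum(acf * np.cos(2 * np.pi * f * taus)) for f in freqs])
        print("spectral peak near [Hz]:", freqs[np.argmax(spectrum)])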

  19. Modular Sampling and Analysis Techniques for the Real-Time Analysis of Human Breath

    Energy Technology Data Exchange (ETDEWEB)

    Frank, M; Farquar, G; Adams, K; Bogan, M; Martin, A; Benner, H; Spadaccini, C; Steele, P; Davis, C; Loyola, B; Morgan, J; Sankaran, S

    2007-07-09

    At LLNL and UC Davis, we are developing several techniques for the real-time sampling and analysis of trace gases, aerosols and exhaled breath that could be useful for a modular, integrated system for breath analysis. Those techniques include single-particle bioaerosol mass spectrometry (BAMS) for the analysis of exhaled aerosol particles or droplets as well as breath samplers integrated with gas chromatography mass spectrometry (GC-MS) or MEMS-based differential mobility spectrometry (DMS). We describe these techniques and present recent data obtained from human breath or breath condensate, in particular, addressing the question of how environmental exposure influences the composition of breath.

  20. Burnout prediction using advance image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, a residence time of 200 milliseconds and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle-by-particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  1. Impedance Flow Cytometry: A Novel Technique in Pollen Analysis.

    Science.gov (United States)

    Heidmann, Iris; Schade-Kampmann, Grit; Lambalk, Joep; Ottiger, Marcel; Di Berardino, Marco

    2016-01-01

    An efficient and reliable method to estimate plant cell viability, especially of pollen, is important for plant breeding research and plant production processes. Pollen quality is determined by classical methods, like staining techniques or in vitro pollen germination, each having disadvantages with respect to reliability, analysis speed, and species dependency. Analysing single cells based on their dielectric properties by impedance flow cytometry (IFC) has developed into a common method for cellular characterisation in microbiology and medicine during the last decade. The aim of this study is to demonstrate the potential of IFC in plant cell analysis with the focus on pollen. Developing and mature pollen grains were analysed during their passage through a microfluidic chip to which radio frequencies of 0.5 to 12 MHz were applied. The acquired data provided information about the developmental stage, viability, and germination capacity. The biological relevance of the acquired IFC data was confirmed by classical staining methods, inactivation controls, as well as pollen germination assays. Different stages of developing pollen, dead, viable and germinating pollen populations could be detected and quantified by IFC. Pollen viability analysis by classical FDA staining showed a high correlation with IFC data. In parallel, pollen with active germination potential could be discriminated from the dead and the viable but non-germinating population. The presented data demonstrate that IFC is an efficient, label-free, reliable and non-destructive technique to analyse pollen quality in a species-independent manner.

  2. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M.A. Wasiolek

    2005-04-28

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standards. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop BDCFs, which are input parameters for the TSPA-LA model. The ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the ''Biosphere Model Report'' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. This analysis receives direct input from the outputs of the ''Biosphere Model Report'' (BSC 2004 [DIRS 169460]) and the five analyses that develop parameter values for the biosphere model (BSC 2005 [DIRS 172827]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the ''Biosphere Dose Conversion Factor Importance and Sensitivity Analysis

  3. Analysis of Cultural Heritage by Accelerator Techniques and Analytical Imaging

    Science.gov (United States)

    Ide-Ektessabi, Ari; Toque, Jay Arre; Murayama, Yusuke

    2011-12-01

    In this paper we present the results of experimental investigations using two very important accelerator techniques: (1) synchrotron radiation XRF and XAFS; and (2) accelerator mass spectrometry and multispectral analytical imaging for the investigation of cultural heritage. We also introduce a complementary approach to the investigation of artworks which is noninvasive and nondestructive and can be applied in situ. Four major projects will be discussed to illustrate the potential applications of these accelerator and analytical imaging techniques: (1) investigation of Mongolian textiles (Genghis Khan and Kublai Khan period) using XRF, AMS and electron microscopy; (2) XRF studies of pigments collected from Korean Buddhist paintings; (3) creating a database of elemental composition and spectral reflectance of more than 1000 Japanese pigments which have been used for traditional Japanese paintings; and (4) visible light-near infrared spectroscopy and multispectral imaging of degraded malachite and azurite. The XRF measurements of the Japanese and Korean pigments could be used to complement the results of pigment identification by analytical imaging through spectral reflectance reconstruction. On the other hand, analysis of the Mongolian textiles revealed that they were produced between the 12th and 13th centuries. Elemental analysis of the samples showed that they contained traces of gold, copper, iron and titanium. Based on the age and trace elements in the samples, it was concluded that the textiles were produced during the height of power of the Mongol empire, which makes them a valuable cultural heritage. Finally, the analysis of the degraded and discolored malachite and azurite demonstrates how multispectral analytical imaging could be used to complement the results of high energy-based techniques.

  4. Replication Analysis in Exploratory Factor Analysis: What It Is and Why It Makes Your Analysis Better

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2012-11-01

    Full Text Available Exploratory Factor Analysis (EFA) is a powerful and commonly-used tool for investigating the underlying variable structure of a psychometric instrument. However, there is much controversy in the social sciences with regard to the techniques used in EFA (Ford, MacCallum, & Tait, 1986; Henson & Roberts, 2006) and the reliability of the outcome. Simulations by Costello and Osborne (2005), for example, demonstrate how poorly some EFA analyses replicate, even with clear underlying factor structures and large samples. Thus, we argue that researchers should routinely examine the stability or volatility of their EFA solutions to gain more insight into the robustness of their solutions and insight into how to improve their instruments while still at the exploratory stage of development.
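
    One simple way to examine the stability of an EFA solution, sketched below with invented data and scikit-learn's FactorAnalysis, is to fit the model on two random halves of the sample and compare the loading matrices with Tucker's congruence coefficient. This is an illustrative replication check, not the specific procedure advocated in the article.

        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(4)
        n, n_items, n_factors = 600, 8, 2
        true_loadings = np.zeros((n_items, n_factors))
        true_loadings[:4, 0] = 0.7                        # items 1-4 load on factor 1
        true_loadings[4:, 1] = 0.7                        # items 5-8 load on factor 2
        X = rng.standard_normal((n, n_factors)) @ true_loadings.T \
            + 0.5 * rng.standard_normal((n, n_items))

        idx = rng.permutation(n)
        halves = [X[idx[: n // 2]], X[idx[n // 2:]]]
        loadings = [FactorAnalysis(n_components=n_factors, random_state=0).fit(h).components_.T
                    for h in halves]

        def congruence(a, b):
            """Tucker's congruence coefficient between two loading vectors."""
            return float(a @ b / np.sqrt((a @ a) * (b @ b)))

        # factor order and sign are arbitrary, so match each factor in the first half
        # with its best-matching (largest |congruence|) factor in the second half
        for f in range(n_factors):
            best = max(abs(congruence(loadings[0][:, f], loadings[1][:, g]))
                       for g in range(n_factors))
            print(f"factor {f + 1}: best |congruence| between halves = {best:.3f}")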

  5. Image analysis technique applied to lock-exchange gravity currents

    OpenAIRE

    Nogueira, Helena; Adduce, Claudia; Alves, Elsa; Franca, Rodrigues Pereira Da; Jorge, Mario

    2013-01-01

    An image analysis technique is used to estimate the two-dimensional instantaneous density field of unsteady gravity currents produced by full-depth lock-release of saline water. An experiment reproducing a gravity current was performed in a 3.0 m long, 0.20 m wide and 0.30 m deep Perspex flume with horizontal smooth bed and recorded with a 25 Hz CCD video camera under controlled light conditions. Using dye concentration as a tracer, a calibration procedure was established for each pixel in th...

  6. Dynamic Range Analysis of the Phase Generated Carrier Demodulation Technique

    Directory of Open Access Journals (Sweden)

    M. J. Plotnikov

    2014-01-01

    Full Text Available The dependence of the dynamic range of the phase generated carrier (PGC) technique on low-pass filter passbands is investigated using a simulation model. A nonlinear character of this dependence, which could lead to dynamic range limitations or measurement uncertainty, is presented for the first time. A detailed theoretical analysis is provided to verify the simulation results, and these results are consistent with the calculations performed. A method for calculating low-pass filter passbands according to the required upper limit of the dynamic range is proposed.

  7. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned... contents and the requirements for the project prior to its start are described together with the results obtained during the 3 year period of the project. The project was mainly carried out as a Ph.D project by the first author from September 1994 to August 1997 in cooperation with associate professor Rune

  8. Quality assurance and quantitative error analysis by tracer techniques

    Energy Technology Data Exchange (ETDEWEB)

    Schuetze, N.; Hermann, U.

    1983-12-01

    The locations, types and sources of casting defects have been tested by tracer techniques. Certain sites of moulds were labelled using ¹⁹⁹Au, ²⁴Na sodium carbonate solution, and technetium solution produced in the technetium generator on a ⁹⁹Mo/⁹⁹Tc elution column. Evaluations were made by means of activity measurements and autoradiography. The locations and causes of casting defects can be determined by error analysis. The surface defects of castings resulting from the moulding materials and from the blacking can be detected by technetium, the subsurface defects are located by gold.

  9. New technique for high-speed microjet breakup analysis

    Energy Technology Data Exchange (ETDEWEB)

    Vago, N. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland); Spiegel, A. [Department of Atomic Physics, Budapest University of Technology and Economics, Budafoki ut 8, 1111, Budapest (Hungary); Couty, P. [Institute of Imaging and Applied Optics, Swiss Federal Institute of Technology, Lausanne, BM, 1015, Lausanne (Switzerland); Wagner, F.R.; Richerzhagen, B. [Synova SA, Ch. Dent d' Oche, 1024 Ecublens (Switzerland)

    2003-10-01

    In this paper we introduce a new technique for visualizing the breakup of thin high-speed liquid jets. Focused light of a He-Ne laser is coupled into a water jet, which behaves as a cylindrical waveguide until the point where the amplitude of surface waves is large enough to scatter out the light from the jet. Observing the jet from a direction perpendicular to its axis, the light that appears indicates the location of breakup. Real-time examination and also statistical analysis of the jet disruption is possible with this method. A ray tracing method was developed to demonstrate the light scattering process. (orig.)

  10. Data Analysis Techniques for a Lunar Surface Navigation System Testbed

    Science.gov (United States)

    Chelmins, David; Sands, O. Scott; Swank, Aaron

    2011-01-01

    NASA is interested in finding new methods of surface navigation to allow astronauts to navigate on the lunar surface. In support of the Vision for Space Exploration, the NASA Glenn Research Center developed the Lunar Extra-Vehicular Activity Crewmember Location Determination System and performed testing at the Desert Research and Technology Studies event in 2009. A significant amount of sensor data was recorded during nine tests performed with six test subjects. This paper provides the procedure, formulas, and techniques for data analysis, as well as commentary on applications.

  11. Prompt Gamma Activation Analysis (PGAA): Technique of choice for nondestructive bulk analysis of returned comet samples

    Science.gov (United States)

    Lindstrom, David J.; Lindstrom, Richard M.

    1989-01-01

    Prompt gamma activation analysis (PGAA) is a well-developed analytical technique. The technique involves irradiation of samples in an external neutron beam from a nuclear reactor, with simultaneous counting of gamma rays produced in the sample by neutron capture. Capture of neutrons leads to excited nuclei which decay immediately with the emission of energetic gamma rays to the ground state. PGAA has several advantages over other techniques for the analysis of cometary materials: (1) It is nondestructive; (2) It can be used to determine abundances of a wide variety of elements, including most major and minor elements (Na, Mg, Al, Si, P, K, Ca, Ti, Cr, Mn, Fe, Co, Ni), volatiles (H, C, N, F, Cl, S), and some trace elements (those with high neutron capture cross sections, including B, Cd, Nd, Sm, and Gd); and (3) It is a true bulk analysis technique. Recent developments should improve the technique's sensitivity and accuracy considerably.

  12. A kernel version of spatial factor analysis

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg

    2009-01-01

    . Schölkopf et al. introduce kernel PCA. Shawe-Taylor and Cristianini is an excellent reference for kernel methods in general. Bishop and Press et al. describe kernel methods among many other subjects. Nielsen and Canty use kernel PCA to detect change in univariate airborne digital camera images. The kernel...... version of PCA handles nonlinearities by implicitly transforming data into high (even infinite) dimensional feature space via the kernel function and then performing a linear analysis in that space. In this paper we shall apply kernel versions of PCA, maximum autocorrelation factor (MAF) analysis...
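
    A short sketch of the kernel idea described above: data with a nonlinear (concentric-ring) structure are mapped implicitly via an RBF kernel and a linear PCA is performed in that feature space. The dataset, kernel choice and gamma value are illustrative assumptions, not the paper's settings.

        import numpy as np
        from sklearn.datasets import make_circles
        from sklearn.decomposition import PCA, KernelPCA

        X, y = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)

        linear_scores = PCA(n_components=2).fit_transform(X)
        kernel_scores = KernelPCA(n_components=2, kernel="rbf", gamma=10.0).fit_transform(X)

        def separation(scores):
            """How far apart the two rings sit along the first component."""
            return abs(scores[y == 0, 0].mean() - scores[y == 1, 0].mean())

        print("1st-component class separation, linear PCA:", round(separation(linear_scores), 3))
        print("1st-component class separation, kernel PCA:", round(separation(kernel_scores), 3))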

  13. What Is Rotating in Exploratory Factor Analysis?

    Directory of Open Access Journals (Sweden)

    Jason W. Osborne

    2015-01-01

    Full Text Available Exploratory factor analysis (EFA) is one of the most commonly reported quantitative methodologies in the social sciences, yet much of the detail regarding what happens during an EFA remains unclear. The goal of this brief technical note is to explore what "rotation" is, what exactly is rotating, and why we use rotation when performing EFAs. Some commentary about the relative utility and desirability of different rotation methods concludes the narrative.

  14. Macro elemental analysis of food samples by nuclear analytical technique

    Science.gov (United States)

    Syahfitri, W. Y. N.; Kurniawati, S.; Adventini, N.; Damastuti, E.; Lestiani, D. D.

    2017-06-01

    Energy-dispersive X-ray fluorescence (EDXRF) spectrometry is a non-destructive, rapid, multi-elemental, accurate, and environmentally friendly analysis method compared with other detection methods. Thus, EDXRF spectrometry is applicable for food inspection. The macro elements calcium and potassium constitute important nutrients required by the human body for optimal physiological functions. Therefore, the determination of Ca and K content in various foods needs to be done. The aim of this work is to demonstrate the applicability of EDXRF for food analysis. The analytical performance of non-destructive EDXRF was compared with other analytical techniques: neutron activation analysis and atomic absorption spectrometry. The comparison of methods served as a cross-check of the analysis results and as a way to overcome the limitations of the three methods. Analysis results showed that Ca concentrations in food obtained by EDXRF and AAS were not significantly different, with a p-value of 0.9687, whereas the p-value for K between EDXRF and NAA was 0.6575. The correlation between those results was also examined. The Pearson correlations for Ca and K were 0.9871 and 0.9558, respectively. Method validation using SRM NIST 1548a Typical Diet was also applied. The results showed good agreement between methods; therefore the EDXRF method can be used as an alternative method for the determination of Ca and K in food samples.
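
    The kind of method comparison reported above (a paired test p-value plus a Pearson correlation) can be sketched as follows; the Ca concentrations are invented placeholders, not the study's measurements.

        import numpy as np
        from scipy import stats

        ca_edxrf = np.array([812.0, 640.0, 955.0, 720.0, 503.0, 1104.0, 860.0, 690.0])   # mg/kg
        ca_aas   = np.array([801.0, 655.0, 940.0, 731.0, 498.0, 1120.0, 872.0, 684.0])   # mg/kg

        t_stat, p_value = stats.ttest_rel(ca_edxrf, ca_aas)      # paired comparison of the two methods
        r, _ = stats.pearsonr(ca_edxrf, ca_aas)

        print(f"paired t-test p-value: {p_value:.4f}  (p > 0.05 -> no significant difference)")
        print(f"Pearson correlation  : {r:.4f}")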

  15. Golden glazes analysis by PIGE and PIXE techniques

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, M., E-mail: mmfonseca@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Luis, H., E-mail: heliofluis@itn.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Franco, N., E-mail: nfranco@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Reis, M.A., E-mail: mareis@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Chaves, P.C., E-mail: cchaves@itn.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Taborda, A., E-mail: galaviz@cii.fc.ul.pt [Instituto Tecnologico Nuclear, Sacavem (Portugal); Cruz, J., E-mail: jdc@fct.unl.pt [Dept. Fisica, Faculdade de Ciencias e Tecnologia, Universidade Nova de Lisboa, Caparica (Portugal); Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Galaviz, D., E-mail: ataborda@itn.pt [Centro de Fisica Nuclear da Universidade de Lisboa, Lisboa (Portugal); Dept. Fisica, Faculdade de Ciencias, Universidade de Lisboa, Lisboa (Portugal); and others

    2011-12-15

    We present the analysis performed on the chemical composition of two golden glazes available on the market, using the PIGE and PIXE techniques at the ITN ion beam laboratory. The analysis of the light elements was performed using the Emitted Radiation Yield Analysis (ERYA) code, a standard-free method for PIGE analysis on thick samples. The results were compared to those obtained on an old glaze. Consistently high concentrations of lead and sodium were found in all analyzed golden glazes. The analysis of the samples pointed to Mo and Co as the specific elements responsible for the gold colour at the desired temperature, and allowed Portuguese ceramists to produce a golden glaze at 997 °C. Optical reflection spectra of the glazes are given, showing that the produced glaze has a spectrum similar to the old glaze. Also, in order to help the ceramists, the unknown compositions of four different types of frits (one of the components of glazes) were analysed.

  16. Carbon dioxide-water oxygen isotope fractionation factor using chlorine trifluoride and guanidine hydrochloride techniques

    Energy Technology Data Exchange (ETDEWEB)

    Dugan, J.P. Jr.; Borthwick, J.

    1986-12-01

    A new value for the CO₂-H₂O oxygen isotope fractionation factor of 1.04145 ± 0.00015 (2σ) has been determined. The data have been normalized to the V-SMOW/V-SLAP scale and were obtained by measuring isotopic compositions with the guanidine hydrochloride and chlorine trifluoride techniques.

  17. Real-time flight test analysis and display techniques for the X-29A aircraft

    Science.gov (United States)

    Hicks, John W.; Petersen, Kevin L.

    1989-01-01

    The X-29A advanced technology demonstrator flight envelope expansion program and the subsequent flight research phase gave impetus to the development of several innovative real-time analysis and display techniques. These new techniques produced significant improvements in flight test productivity, flight research capabilities, and flight safety. These techniques include real-time measurement and display of in-flight structural loads, dynamic structural mode frequency and damping, flight control system dynamic stability and control response, aeroperformance drag polars, and aircraft specific excess power. Several of these analysis techniques also provided for direct comparisons of flight-measured results with analytical predictions. The aeroperformance technique was made possible by the concurrent development of a new simplified in-flight net thrust computation method. To achieve these levels of on-line flight test analysis, integration of ground and airborne systems was required. The capability of NASA Ames Research Center, Dryden Flight Research Facility's Western Aeronautical Test Range was a key factor to enable implementation of these methods.

  18. METHODOLOGICAL STUDY OF OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Pravesh Kumar Singh

    2014-02-01

    Full Text Available Decision making, both at the individual and organizational level, is always accompanied by a search for others' opinions on the matter. The tremendous growth of opinion-rich resources such as reviews, forum discussions, blogs, micro-blogs and Twitter provides a rich anthology of sentiments. This user-generated content can serve as a benefit to the market if its semantic orientations are deliberated. Opinion mining and sentiment analysis are the formalizations for studying and construing opinions and sentiments. The digital ecosystem has itself paved the way for the use of the huge volume of opinionated data recorded. This paper is an attempt to review and evaluate the various techniques used for opinion and sentiment analysis.

  19. Recovering prehistoric woodworking skills using spatial analysis techniques

    Science.gov (United States)

    Kovács, K.; Hanke, K.

    2015-08-01

    Recovering ancient woodworking skills can be achieved by the simultaneous documentation and analysis of tangible evidence such as the geometry parameters of prehistoric hand tools or the fine morphological characteristics of well preserved wooden archaeological finds. During this study, altogether 10 different hand tool forms and over 60 hand tool impressions were investigated for a better understanding of Bronze Age woodworking efficiency. Two archaeological experiments were also designed in this methodology, and unknown prehistoric adzes could be reconstructed from the results of these studies and from the spatial analysis of the Bronze Age tool marks. Finally, the trimming efficiency of these objects was also inferred, and these woodworking skills could be quantified in the case of a Bronze Age wooden construction from Austria. The proposed GIS-based tool mark segmentation and comparison can offer an objective, user-independent technique for related intangible heritage interpretations in the future.

  20. Stalked protozoa identification by image analysis and multivariable statistical techniques.

    Science.gov (United States)

    Amaral, A L; Ginoris, Y P; Nicolau, A; Coelho, M A Z; Ferreira, E C

    2008-06-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups with several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determining geometrical, morphological and signature data and subsequently processing them by discriminant analysis and neural network techniques. Geometrical descriptors were found to provide the best identification ability, and the identification of the crucial Opercularia and Vorticella microstoma microorganisms provided some degree of confidence to establish their presence in wastewater treatment plants.
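
    As a rough illustration of the discriminant-analysis step described above, the sketch below applies linear discriminant analysis to a purely synthetic table of geometric descriptors; the feature names, class labels and data are stand-ins, not the authors' measurements or software.

        # Hedged sketch: discriminant analysis on synthetic "geometric descriptors"
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Hypothetical descriptors: area, perimeter, stalk length, eccentricity
        X = rng.normal(size=(150, 4))
        y = rng.integers(0, 3, size=150)      # three stand-in species labels
        X[:, 0] += 1.5 * y                    # give the classes some separation

        lda = LinearDiscriminantAnalysis()
        scores = cross_val_score(lda, X, y, cv=5)   # per-fold classification accuracy
        print("mean cross-validated accuracy:", scores.mean().round(2))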

  1. BaTMAn: Bayesian Technique for Multi-image Analysis

    Science.gov (United States)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.

  2. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous granite rockburst process. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst can be performed: images of the tracer particles, displacement and strain fields can be obtained, and the debris trajectory described. According to the observation of on-site tests, the dynamic rockburst is actually a gas-solid high speed flow process, which is caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high speed video and PIV images, the granite rockburst failure process is shown to be composed of six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.

  3. Sensitivity analysis techniques for models of human behavior.

    Energy Technology Data Exchange (ETDEWEB)

    Bier, Asmeret Brooke

    2010-09-01

    Human and social modeling has emerged as an important research area at Sandia National Laboratories due to its potential to improve national defense-related decision-making in the presence of uncertainty. To learn about which sensitivity analysis techniques are most suitable for models of human behavior, different promising methods were applied to an example model, tested, and compared. The example model simulates cognitive, behavioral, and social processes and interactions, and involves substantial nonlinearity, uncertainty, and variability. Results showed that some sensitivity analysis methods create similar results, and can thus be considered redundant. However, other methods, such as global methods that consider interactions between inputs, can generate insight not gained from traditional methods.
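
    The global, interaction-aware methods mentioned above are variance-based. The sketch below estimates a first-order (Sobol-style) sensitivity index for a toy model by binning one input and comparing the variance of the conditional means to the total output variance; the toy model and settings are illustrative assumptions, not the Sandia model.

        # Hedged sketch: first-order variance-based sensitivity index
        import numpy as np

        def first_order_index(x, y, bins=20):
            """Estimate S1 = Var_x[E(y|x)] / Var(y) by binning x into quantile bins."""
            edges = np.quantile(x, np.linspace(0, 1, bins + 1))
            idx = np.clip(np.digitize(x, edges[1:-1]), 0, bins - 1)
            cond_means = np.array([y[idx == b].mean() for b in range(bins)])
            weights = np.array([(idx == b).mean() for b in range(bins)])
            grand = np.sum(weights * cond_means)
            var_cond = np.sum(weights * (cond_means - grand) ** 2)
            return var_cond / y.var()

        rng = np.random.default_rng(1)
        x1, x2 = rng.uniform(-1, 1, (2, 50_000))
        y = x1 + 0.3 * x2 + 0.5 * x1 * x2          # toy nonlinear model with an interaction
        print(first_order_index(x1, y), first_order_index(x2, y))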

  4. Pressure transient analysis for long homogeneous reservoirs using TDS technique

    Energy Technology Data Exchange (ETDEWEB)

    Escobar, Freddy Humberto [Universidad Surcolombiana, Av. Pastrana - Cra. 1, Neiva, Huila (Colombia); Hernandez, Yuly Andrea [Hocol S.A., Cra. 7 No 114-43, Floor 16, Bogota (Colombia); Hernandez, Claudia Marcela [Weatherford, Cra. 7 No 81-90, Neiva, Huila (Colombia)

    2007-08-15

    A significant number of well pressure tests are conducted in long, narrow reservoirs with close and open extreme boundaries. It is desirable not only to appropriately identify these types of systems but also to develop an adequate and practical interpretation technique to determine their parameters and size, when possible. An accurate understanding of how the reservoir produces and the magnitude of producible reserves can lead to competent decisions and adequate reservoir management. So far, studies found for identification and determination of parameters for such systems are conducted by conventional techniques (semilog analysis) and semilog and log-log type-curve matching of pressure versus time. Type-curve matching is basically a trial-and-error procedure which may provide inaccurate results. Besides, a limitation in the number of type curves plays a negative role. In this paper, a detailed analysis of pressure derivative behavior for a vertical well in linear reservoirs with open and closed extreme boundaries is presented for the case of constant rate production. We studied independently each flow regime, especially the linear flow regime since it is the most characteristic 'fingerprint' of these systems. We found that when the well is located at one of the extremes of the reservoir, a single linear flow regime develops once radial flow and/or wellbore storage effects have ended. When the well is located at a given distance from both extreme boundaries, the pressure derivative permits the identification of two linear flows toward the well, and this has been called the 'dual-linear flow regime'. This is characterized by an increment of the intercept of the 1/2-slope line from π^0.5 to π with a consequent transition between these two straight lines. The identification of intersection points, lines, and characteristic slopes allows us to develop an interpretation technique without employing type-curve matching. This technique uses
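
    A minimal sketch of the log-log pressure-derivative diagnostic on which the TDS technique relies: for a synthetic linear-flow response the derivative t*dP/dt falls on a half-slope line. The data and constants are invented for illustration, not the paper's field examples.

        # Hedged sketch: half-slope derivative signature of linear flow
        import numpy as np

        t = np.logspace(-2, 2, 200)                 # time, hours
        dp = 12.0 * np.sqrt(t)                      # synthetic linear-flow pressure drop, psi
        deriv = t * np.gradient(dp, t)              # Bourdet-style derivative t*dP/dt

        # slope on log-log axes; a value near 0.5 indicates linear flow
        slope = np.polyfit(np.log10(t), np.log10(deriv), 1)[0]
        print(f"log-log derivative slope: {slope:.2f}")   # expected: about 0.5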

  5. Scale-model charge-transfer technique for measuring enhancement factors

    Science.gov (United States)

    Kositsky, J.; Nanevicz, J. E.

    1991-01-01

    Determination of aircraft electric field enhancement factors is crucial when using airborne field mill (ABFM) systems to accurately measure electric fields aloft. SRI used the scale-model charge-transfer technique to determine enhancement factors of several canonical shapes and a scale model Learjet 36A. The measured values for the canonical shapes agreed with known analytic solutions within about 6 percent. The laboratory-determined enhancement factors for the aircraft were compared with those derived from in-flight data gathered by a Learjet 36A outfitted with eight field mills. The values agreed to within experimental error (approx. 15 percent).

  6. Comparison of correlation analysis techniques for irregularly sampled time series

    Directory of Open Access Journals (Sweden)

    K. Rehfeld

    2011-06-01

    Full Text Available Geoscientific measurements often provide time series with irregular time sampling, requiring either data reconstruction (interpolation) or sophisticated methods to handle irregular sampling. We compare the linear interpolation technique and different approaches for analyzing the correlation functions and persistence of irregularly sampled time series, such as Lomb-Scargle Fourier transformation and kernel-based methods. In a thorough benchmark test we investigate the performance of these techniques.

    All methods have comparable root mean square errors (RMSEs) for low skewness of the inter-observation time distribution. For high skewness, very irregular data, interpolation bias and RMSE increase strongly. We find a 40 % lower RMSE for the lag-1 autocorrelation function (ACF) for the Gaussian kernel method vs. the linear interpolation scheme, in the analysis of highly irregular time series. For the cross correlation function (CCF) the RMSE is then lower by 60 %. The application of the Lomb-Scargle technique gave results comparable to the kernel methods for the univariate, but poorer results in the bivariate case. Especially the high-frequency components of the signal, where classical methods show a strong bias in ACF and CCF magnitude, are preserved when using the kernel methods.

    We illustrate the performances of interpolation vs. Gaussian kernel method by applying both to paleo-data from four locations, reflecting late Holocene Asian monsoon variability as derived from speleothem δ18O measurements. Cross correlation results are similar for both methods, which we attribute to the long time scales of the common variability. The persistence time (memory) is strongly overestimated when using the standard, interpolation-based, approach. Hence, the Gaussian kernel is a reliable and more robust estimator with significant advantages compared to other techniques and suitable for large scale application to paleo-data.
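
    The sketch below implements a Gaussian-kernel autocorrelation estimator for irregularly sampled data in the spirit of the kernel methods benchmarked above; the bandwidth, test signal and sampling are illustrative assumptions rather than the authors' exact settings.

        # Hedged sketch: kernel-weighted autocorrelation for irregular sampling
        import numpy as np

        def kernel_acf(t, x, lag, h):
            """Autocorrelation at a given lag: weight every sample pair by a Gaussian
            kernel of the mismatch between its time difference and the target lag."""
            x = (x - x.mean()) / x.std()
            dt = t[None, :] - t[:, None]               # all pairwise time differences
            w = np.exp(-0.5 * ((dt - lag) / h) ** 2)
            np.fill_diagonal(w, 0.0)                   # exclude zero-lag self pairs
            return np.sum(w * np.outer(x, x)) / np.sum(w)

        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 500, 400))          # irregular sampling times
        x = np.sin(2 * np.pi * t / 50) + 0.3 * rng.normal(size=t.size)
        print(kernel_acf(t, x, lag=25.0, h=2.0))       # strong negative value at half period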

  7. Accelerator human interface. 4. Object analysis by OMT technique

    Energy Technology Data Exchange (ETDEWEB)

    Abe, Isamu; Nakahara, Kazuo [National Lab. for High Energy Physics, Tsukuba, Ibaraki (Japan); Mutoh, Masakatsu; Shibasaki, Yoshinobu

    1995-07-01

    The analysis of objects of various classes in the accelerator domain was carried out with the OMT technique, and a summary is reported. By changing from the conventional procedural approach to an object-oriented one and reconsidering accelerator control, a large impact can be made on software development and accelerator control. The importance of establishing a fundamental object system, and its outline, are also described. The original objects of accelerators change little over time compared with the computing environment. Accordingly, by extracting standard objects as far as possible and making the objects able to deal with control systems and multi-platform environments that evolve over time, the problem of control systems becoming outdated can be clearly solved. The establishment of optimal objects shows the possibility of presenting an optimal control system, and object analysis brings many merits. OMT was adopted here. The accelerator control domain is divided into a device class and a generic task class, and the latter is divided into operation, diagnosis, operation support, simulation, database, indication system and maintenance classes. The abstraction of data and procedures, the inheritance between devices and the behavior of objects are described. (K.I.)

  8. Application of transport phenomena analysis technique to cerebrospinal fluid.

    Science.gov (United States)

    Lam, C H; Hansen, E A; Hall, W A; Hubel, A

    2013-12-01

    The study of hydrocephalus and the modeling of cerebrospinal fluid flow have proceeded in the past using mathematical analysis that was very capable of predicting phenomenologically but did not relate well to physiologic parameters. In this paper, the basis of fluid dynamics at the physiologic state is explained first using established equations of transport phenomena. Then, microscopic and molecular level techniques of modeling are described using porous media theory and chemical kinetic theory, and applied to cerebrospinal fluid (CSF) dynamics. Using techniques of transport analysis allows the field of cerebrospinal fluid dynamics to approach the level of sophistication of urine and blood transport. Concepts such as intracellular and intercellular pathways, compartmentalization, and tortuosity are associated with quantifiable parameters that are relevant to the anatomy and physiology of cerebrospinal fluid transport. The engineering field of transport phenomena is rich and steeped in architectural, aeronautical, nautical, and more recently biological history. This paper summarizes and reviews the approaches that have been taken in the field of engineering and applies them to CSF flow.

  9. Vortex metrology using Fourier analysis techniques: vortex networks correlation fringes.

    Science.gov (United States)

    Angel-Toro, Luciano; Sierra-Sosa, Daniel; Tebaldi, Myrian; Bolognini, Néstor

    2012-10-20

    In this work, we introduce an alternative method of analysis in vortex metrology based on the application of Fourier optics techniques. The first part of the procedure is conducted as is usual in vortex metrology for uniform in-plane displacement determination. On the basis of two recorded intensity speckle distributions, corresponding to two states of a coherently illuminated diffuser, we numerically generate an analytical signal from each recorded intensity pattern by using a version of the Riesz integral transform. Then, from each analytical signal, a two-dimensional pseudophase map is generated in which the vortices are located and characterized in terms of their topological charges and their cores' structural properties. The second part of the procedure allows obtaining Young's interference fringes when Fourier transforming the light passing through a diffracting mask with multiple apertures at the locations of the homologous vortices. In fact, we use the Fourier transform as a mathematical operation to compute the far-field diffraction intensity pattern corresponding to the multiaperture set. Each aperture from the set is associated with a rectangular hole that coincides both in shape and size with a pixel from the recorded images. We show that the fringe analysis can be conducted as in speckle photography in an extended range of displacement measurements. Effects related to speckle decorrelation are also considered. Our experimental results agree with those of speckle photography in the range in which both techniques are applicable.

  10. PVUSA instrumentation and data analysis techniques for photovoltaic systems

    Energy Technology Data Exchange (ETDEWEB)

    Newmiller, J.; Hutchinson, P.; Townsend, T.; Whitaker, C.

    1995-10-01

    The Photovoltaics for Utility Scale Applications (PVUSA) project tests two types of PV systems at the main test site in Davis, California: new module technologies fielded as 20-kW Emerging Module Technology (EMT) arrays and more mature technologies fielded as 70- to 500-kW turnkey Utility-Scale (US) systems. PVUSA members have also installed systems in their service areas. Designed appropriately, data acquisition systems (DASs) can be a convenient and reliable means of assessing system performance, value, and health. Improperly designed, they can be complicated, difficult to use and maintain, and provide data of questionable validity. This report documents PVUSA PV system instrumentation and data analysis techniques and lessons learned. The report is intended to assist utility engineers, PV system designers, and project managers in establishing an objective, then, through a logical series of topics, facilitate selection and design of a DAS to meet the objective. Report sections include Performance Reporting Objectives (including operational versus research DAS), Recommended Measurements, Measurement Techniques, Calibration Issues, and Data Processing and Analysis Techniques. Conclusions and recommendations based on the several years of operation and performance monitoring are offered. This report is one in a series of 1994--1995 PVUSA reports documenting PVUSA lessons learned at the demonstration sites in Davis and Kerman, California. Other topical reports address: five-year assessment of EMTs; validation of the Kerman 500-kW grid support PV plant benefits; construction and safety experience in installing and operating PV systems; balance-of-system design and costs; procurement, acceptance, and rating practices for PV power plants; experience with power conditioning units and power quality.

  11. Factor analysis identifies subgroups of constipation

    Institute of Scientific and Technical Information of China (English)

    Philip G Dinning; Mike Jones; Linda Hunt; Sergio E Fuentealba; Jamshid Kalanter; Denis W King; David Z Lubowski; Nicholas J Talley; Ian J Cook

    2011-01-01

    AIM: To determine whether distinct symptom groupings exist in a constipated population and whether such groupings might correlate with quantifiable pathophysiological measures of colonic dysfunction. METHODS: One hundred and ninety-one patients presenting to a gastroenterology clinic with constipation and 32 constipated patients responding to a newspaper advertisement completed a 53-item, wide-ranging self-report questionnaire. One hundred of these patients had colonic transit measured scintigraphically. Factor analysis determined whether constipation-related symptoms grouped into distinct aspects of symptomatology. Cluster analysis was used to determine whether individual patients naturally group into distinct subtypes. RESULTS: Cluster analysis yielded a four-cluster solution, with the presence or absence of pain and laxative unresponsiveness providing the main descriptors. Amongst all clusters there was a considerable proportion of patients with demonstrable delayed colon transit, positive irritable bowel syndrome criteria and regular stool frequency. The majority of patients with these characteristics also reported regular laxative use. CONCLUSION: Factor analysis identified four constipation subgroups, based on severity and laxative unresponsiveness, in a constipated population. However, clear stratification into clinically identifiable groups remains imprecise.

  12. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the Total System Performance Assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the groundwater exposure scenario, and the development of conversion factors for assessing compliance with the groundwater protection standard. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA-LA model. The 'Biosphere Model Report' (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the 'Biosphere Model Report' in Figure 1-1, contain detailed descriptions of the model input parameters, their development, and the relationship between the parameters and specific features, events, and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the groundwater exposure scenario. The objectives of this analysis are to develop BDCFs for the groundwater exposure scenario for the three climate states considered in the TSPA-LA, as well as conversion factors for evaluating compliance with the groundwater protection standard. The BDCFs will be used in performance assessment for calculating all-pathway annual doses for a given concentration of radionuclides in groundwater. The conversion factors will be used for calculating gross alpha particle
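
    To illustrate how the BDCFs are applied downstream, the sketch below sums concentration times BDCF over radionuclides to obtain an all-pathway annual dose; the nuclides, units and numerical values are placeholders, not values from this report.

        # Hedged sketch: annual dose = sum over radionuclides of concentration * BDCF
        # Hypothetical BDCFs in (mrem/yr) per (pCi/L); concentrations in pCi/L.
        bdcf = {"Tc-99": 1.0e-3, "I-129": 2.0e-1, "Np-237": 5.0e-1}
        conc = {"Tc-99": 10.0, "I-129": 0.05, "Np-237": 0.01}

        annual_dose = sum(conc[n] * bdcf[n] for n in bdcf)   # mrem/yr
        print(f"all-pathway annual dose: {annual_dose:.3f} mrem/yr")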

  13. Thermal Response Analysis of Phospholipid Bilayers Using Ellipsometric Techniques.

    Science.gov (United States)

    González-Henríquez, Carmen M; Villegas-Opazo, Vanessa A; Sagredo-Oyarce, Dallits H; Sarabia-Vallejos, Mauricio A; Terraza, Claudio A

    2017-08-18

    Biomimetic planar artificial membranes have been widely studied due to their multiple applications in several research fields. Their humectation and thermal response are crucial for reaching stability; these characteristics are related to the molecular organization inside the bilayer, which is affected by the aliphatic chain length, saturations, and molecule polarity, among others. Bilayer stability becomes a fundamental factor when technological devices, such as biosensors, are developed based on those systems. Thermal studies were performed for different types of phosphatidylcholine (PC) molecules: two pure PC bilayers and four binary PC mixtures. These analyses were carried out through the detection of slight changes in their optical and structural parameters via ellipsometry and Surface Plasmon Resonance (SPR) techniques. Phospholipid bilayers were prepared by the Langmuir-Blodgett technique and deposited over a hydrophilic silicon wafer. Their molecular inclination degree, mobility, and the stability of the different phases were detected and analyzed through bilayer thickness changes and their optical phase-amplitude response. Results show that certain binary lipid mixtures, with differences in their aliphatic chain lengths, present a coexistence of two thermal responses due to non-ideal mixing.

  14. Exploratory factor analysis in Rehabilitation Psychology: a content analysis.

    Science.gov (United States)

    Roberson, Richard B; Elliott, Timothy R; Chang, Jessica E; Hill, Jessica N

    2014-11-01

    Our objective was to examine the use and quality of exploratory factor analysis (EFA) in articles published in Rehabilitation Psychology. Trained raters examined 66 separate exploratory factor analyses in 47 articles published between 1999 and April 2014. The raters recorded the aim of the EFAs, the distributional statistics, sample size, factor retention method(s), extraction and rotation method(s), and whether the pattern coefficients, structure coefficients, and the matrix of association were reported. The primary use of the EFAs was scale development, but the most widely used extraction and rotation method was principal component analysis, with varimax rotation. When determining how many factors to retain, multiple methods (e.g., scree plot, parallel analysis) were used most often. Many articles did not report enough information to allow for the duplication of their results. EFA relies on authors' choices (e.g., factor retention rules, extraction and rotation methods), and few articles adhered to all of the best practices. The current findings are compared to other empirical investigations into the use of EFA in published research. Recommendations for improving EFA reporting practices in rehabilitation psychology research are provided.
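
    One of the factor-retention methods named above, Horn's parallel analysis, is easy to sketch: retain factors whose observed correlation-matrix eigenvalues exceed those obtained from random data of the same dimensions. The simulated data below are illustrative, not drawn from the reviewed articles.

        # Hedged sketch: parallel analysis for choosing the number of factors
        import numpy as np

        def parallel_analysis(X, n_sims=200, quantile=0.95, seed=0):
            rng = np.random.default_rng(seed)
            n, p = X.shape
            obs = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
            rand = np.empty((n_sims, p))
            for s in range(n_sims):
                R = np.corrcoef(rng.normal(size=(n, p)), rowvar=False)
                rand[s] = np.sort(np.linalg.eigvalsh(R))[::-1]
            thresh = np.quantile(rand, quantile, axis=0)
            return int(np.sum(obs > thresh))

        rng = np.random.default_rng(3)
        F = rng.normal(size=(300, 2))                         # two latent factors
        X = F @ rng.normal(size=(2, 8)) + rng.normal(size=(300, 8))
        print("factors to retain:", parallel_analysis(X))     # expected: 2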

  15. Improving Your Exploratory Factor Analysis for Ordinal Data: A Demonstration Using FACTOR

    Directory of Open Access Journals (Sweden)

    James Baglin

    2014-06-01

    Full Text Available Exploratory factor analysis (EFA) methods are used extensively in the field of assessment and evaluation. Due to EFA's widespread use, common methods and practices have come under close scrutiny. A substantial body of literature has been compiled highlighting problems with many of the methods and practices used in EFA, and, in response, many guidelines have been proposed with the aim to improve application. Unfortunately, implementing recommended EFA practices has been restricted by the range of options available in commercial statistical packages and, perhaps, due to an absence of clear, practical 'how-to' demonstrations. Consequently, this article describes the application of methods recommended to get the most out of your EFA. The article focuses on dealing with the common situation of analysing ordinal data as derived from Likert-type scales. These methods are demonstrated using the free, stand-alone, easy-to-use and powerful EFA package FACTOR (http://psico.fcep.urv.es/utilitats/factor/; Lorenzo-Seva & Ferrando, 2006). The demonstration applies the recommended techniques using an accompanying dataset, based on the Big 5 personality test. The outcomes obtained by the EFA using the recommended procedures through FACTOR are compared to the default techniques currently available in SPSS.

  16. Insight to Nanoparticle Size Analysis-Novel and Convenient Image Analysis Method Versus Conventional Techniques.

    Science.gov (United States)

    Vippola, Minnamari; Valkonen, Masi; Sarlin, Essi; Honkanen, Mari; Huttunen, Heikki

    2016-12-01

    The aim of this paper is to introduce a new image analysis program, "Nanoannotator", particularly developed for analyzing individual nanoparticles in transmission electron microscopy images. This paper describes the usefulness and efficiency of the program when analyzing nanoparticles, and at the same time compares it to more conventional nanoparticle analysis techniques. The techniques on which we concentrate here are transmission electron microscopy (TEM) linked with different image analysis methods, and X-ray diffraction techniques. The developed program proved to be a good supplement to the field of particle analysis techniques, since traditional image analysis programs suffer from the inability to separate individual particles from agglomerates in TEM images. The program is more efficient, and it offers more detailed morphological information on the particles than the manual technique. However, particle shapes that are very different from spherical proved to be problematic also for the novel program. When compared to X-ray techniques, the main advantage of the small-angle X-ray scattering (SAXS) method is the averaged data it provides from a very large number of particles. However, the SAXS method does not provide any data about the shape or appearance of the sample.
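
    For contrast with the program introduced above, the sketch below shows the conventional image-analysis route (threshold, label, measure) on a synthetic TEM-like image using scikit-image; it inherits the limitation noted above that touching particles in an agglomerate merge into one region.

        # Hedged sketch: conventional particle measurement on a synthetic image
        import numpy as np
        from skimage import filters, measure

        rng = np.random.default_rng(4)
        img = rng.normal(0.1, 0.05, (256, 256))            # noisy background
        yy, xx = np.mgrid[0:256, 0:256]
        for cy, cx, r in [(60, 60, 10), (150, 180, 14), (200, 80, 8)]:
            img[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] += 0.8   # synthetic "particles"

        mask = img > filters.threshold_otsu(img)           # global threshold
        labels = measure.label(mask)                       # connected-component labelling
        for region in measure.regionprops(labels):
            d_eq = 2 * np.sqrt(region.area / np.pi)        # circle-equivalent diameter
            print(f"area={region.area} px, equivalent diameter={d_eq:.1f} px")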

  17. Analysis of Ultra Linguistic Factors in Interpretation

    Institute of Scientific and Technical Information of China (English)

    姚嘉

    2015-01-01

    The quality of interpretation is a dynamic conception involving a good deal of variables, such as the participants, the situations, working conditions, cultures, etc. Therefore, in interpretation, static elements such as traditional grammar and certain linguistic rules cannot be counted as the only criteria for the quality of interpretation. That is, there are many other non-language elements, ultra-linguistic factors, that play an important role in interpretation. Ultra-linguistic factors go beyond the bounds of traditional grammar and parole, and reveal the facts in an indirect way. This paper gives a brief analysis of ultra-linguistic elements in interpretation in order to achieve better results in interpretation practice.

  18. Vibration impact acoustic emission technique for identification and analysis of defects in carbon steel tubes: Part A Statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Halim, Zakiah Abd [Universiti Teknikal Malaysia Melaka (Malaysia); Jamaludin, Nordin; Junaidi, Syarif [Faculty of Engineering and Built, Universiti Kebangsaan Malaysia, Bangi (Malaysia); Yahya, Syed Yusainee Syed [Universiti Teknologi MARA, Shah Alam (Malaysia)

    2015-04-15

    Current steel tube inspection techniques are invasive, and the interpretation and evaluation of inspection results are done manually by skilled personnel. This paper presents a statistical analysis of high frequency stress wave signals captured with a newly developed noninvasive, non-destructive tube inspection technique known as the vibration impact acoustic emission (VIAE) technique. Acoustic emission (AE) signals were introduced into ASTM A179 seamless steel tubes using an impact hammer, and the AE wave propagation was captured using an AE sensor. Specifically, a healthy steel tube as the reference and four steel tubes with a through-hole artificial defect at different locations were used in this study. The AE features extracted from the captured signals are rise time, peak amplitude, duration and count. The VIAE technique also analysed the AE signals using statistical features such as root mean square (r.m.s.), energy, and crest factor. It was evident that duration, count, r.m.s., energy and crest factor could be used to automatically identify the presence of a defect in carbon steel tubes using AE signals captured with the non-invasive VIAE technique.
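
    The statistical AE features named above are simple to compute from a captured waveform. The sketch below evaluates r.m.s., energy and crest factor for a synthetic decaying burst; the sampling rate and burst parameters are assumptions, not the ASTM A179 test data.

        # Hedged sketch: basic AE waveform features on a synthetic burst
        import numpy as np

        fs = 1_000_000                                    # assumed sampling rate, Hz
        t = np.arange(0, 0.002, 1 / fs)
        burst = np.exp(-t / 2e-4) * np.sin(2 * np.pi * 150e3 * t)   # decaying AE burst
        signal = burst + 0.02 * np.random.default_rng(5).normal(size=t.size)

        rms = np.sqrt(np.mean(signal ** 2))
        energy = np.sum(signal ** 2) / fs                 # approximate integral of squared signal
        crest_factor = np.max(np.abs(signal)) / rms
        print(f"r.m.s.={rms:.4f}  energy={energy:.3e}  crest factor={crest_factor:.1f}")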

  19. Factor Rotation and Standard Errors in Exploratory Factor Analysis

    Science.gov (United States)

    Zhang, Guangjian; Preacher, Kristopher J.

    2015-01-01

    In this article, we report a surprising phenomenon: Oblique CF-varimax and oblique CF-quartimax rotation produced similar point estimates for rotated factor loadings and factor correlations but different standard error estimates in an empirical example. Influences of factor rotation on asymptotic standard errors are investigated using a numerical…

  20. Techniques of DNA methylation analysis with nutritional applications.

    Science.gov (United States)

    Mansego, Maria L; Milagro, Fermín I; Campión, Javier; Martínez, J Alfredo

    2013-01-01

    Epigenetic mechanisms are likely to play an important role in the regulation of metabolism and body weight through gene-nutrient interactions. This review focuses on methods for analyzing one of the most important epigenetic mechanisms, DNA methylation, from single nucleotide to global measurement depending on the study goal and scope. In addition, this study highlights the major principles and methods for DNA methylation analysis with emphasis on nutritional applications. Recent developments concerning epigenetic technologies are showing promising results of DNA methylation levels at a single-base resolution and provide the ability to differentiate between 5-methylcytosine and other nucleotide modifications such as 5-hydroxymethylcytosine. A large number of methods can be used for the analysis of DNA methylation such as pyrosequencing™, primer extension or real-time PCR methods, and genome-wide DNA methylation profile from microarray or sequencing-based methods. Researchers should conduct a preliminary analysis focused on the type of validation and information provided by each technique in order to select the best method fitting for their nutritional research interests.

  1. Comparing dynamical systems concepts and techniques for biomechanical analysis

    Institute of Scientific and Technical Information of China (English)

    Richard E.A. van Emmerik; Scott W. Ducharme; Avelino C. Amado; Joseph Hamill

    2016-01-01

    Traditional biomechanical analyses of human movement are generally derived from linear mathematics. While these methods can be useful in many situations, they do not describe behaviors in human systems that are predominately nonlinear. For this reason, nonlinear analysis methods based on a dynamical systems approach have become more prevalent in recent literature. These analysis techniques have provided new insights into how systems (1) maintain pattern stability, (2) transition into new states, and (3) are governed by short- and long-term (fractal) correlational processes at different spatio-temporal scales. These different aspects of system dynamics are typically investigated using concepts related to variability, stability, complexity, and adaptability. The purpose of this paper is to compare and contrast these different concepts and demonstrate that, although related, these terms represent fundamentally different aspects of system dynamics. In particular, we argue that variability should not uniformly be equated with stability or complexity of movement. In addition, current dynamic stability measures based on nonlinear analysis methods (such as the finite maximal Lyapunov exponent) can reveal local instabilities in movement dynamics, but the degree to which these local instabilities relate to global postural and gait stability and the ability to resist external perturbations remains to be explored. Finally, systematic studies are needed to relate observed reductions in complexity with aging and disease to the adaptive capabilities of the movement system and how complexity changes as a function of different task constraints.
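
    As one concrete example of quantifying the long-term (fractal) correlations mentioned above, the sketch below implements basic detrended fluctuation analysis (DFA); the window sizes and the white-noise test signal are arbitrary illustrative choices.

        # Hedged sketch: detrended fluctuation analysis (first-order DFA)
        import numpy as np

        def dfa_alpha(x, scales):
            y = np.cumsum(x - x.mean())                   # integrated profile
            fluct = []
            for n in scales:
                n_win = len(y) // n
                segs = y[: n_win * n].reshape(n_win, n)
                t = np.arange(n)
                mse = []
                for seg in segs:
                    coef = np.polyfit(t, seg, 1)          # local linear trend
                    mse.append(np.mean((seg - np.polyval(coef, t)) ** 2))
                fluct.append(np.sqrt(np.mean(mse)))
            # scaling exponent alpha: slope of log F(n) vs log n
            return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

        rng = np.random.default_rng(6)
        white = rng.normal(size=4096)
        scales = np.array([16, 32, 64, 128, 256])
        print(f"alpha (white noise) ~ {dfa_alpha(white, scales):.2f}")   # expected near 0.5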

  2. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with special emphasis on and a brief description of other techniques developed the world over for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented. Also, future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of base isolation techniques to protect structures against damage from earthquake attacks has been considered one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of the earthquake attack: a flexible base largely decouples the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, the increase of additional viscous damping in the structure may reduce displacement and acceleration responses of the structure. This study also seeks to evaluate the effects of additional damping on the seismic response when compared with structures without additional damping for different ground motions.

  3. An analysis of matching cognitive-behavior therapy techniques to learning styles.

    Science.gov (United States)

    van Doorn, Karlijn; McManus, Freda; Yiend, Jenny

    2012-12-01

    To optimize the effectiveness of cognitive-behavior therapy (CBT) for each individual patient, it is important to discern whether different intervention techniques may be differentially effective. One factor influencing the differential effectiveness of CBT intervention techniques may be the patient's preferred learning style, and whether this is 'matched' to the intervention. The current study uses a retrospective analysis to examine whether the impact of two common CBT interventions (thought records and behavioral experiments) is greater when the intervention is matched or mismatched to the individual's learning style. Results from this study give some indication that greater belief change is achieved when the intervention technique is matched to participants' learning style than when intervention techniques are mismatched to learning style. Conclusions are limited by the retrospective nature of the analysis and the limited dose of the intervention in non-clinical participants. Results suggest that further investigation of the impact of matching the patient's learning style to CBT intervention techniques is warranted, using clinical samples with higher dose interventions. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Directory of Open Access Journals (Sweden)

    Mahmoud I. Al-Kadi

    2013-05-01

    Full Text Available Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs are one of the techniques which provides an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  5. Evolution of electroencephalogram signal analysis techniques during anesthesia.

    Science.gov (United States)

    Al-Kadi, Mahmoud I; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-05-17

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provides an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device.

  6. Evolution of Electroencephalogram Signal Analysis Techniques during Anesthesia

    Science.gov (United States)

    Al-Kadi, Mahmoud I.; Reaz, Mamun Bin Ibne; Ali, Mohd Alauddin Mohd

    2013-01-01

    Biosignal analysis is one of the most important topics that researchers have tried to develop during the last century to understand numerous human diseases. Electroencephalograms (EEGs) are one of the techniques which provides an electrical representation of biosignals that reflect changes in the activity of the human brain. Monitoring the levels of anesthesia is a very important subject, which has been proposed to avoid both patient awareness caused by inadequate dosage of anesthetic drugs and excessive use of anesthesia during surgery. This article reviews the bases of these techniques and their development within the last decades and provides a synopsis of the relevant methodologies and algorithms that are used to analyze EEG signals. In addition, it aims to present some of the physiological background of the EEG signal, developments in EEG signal processing, and the effective methods used to remove various types of noise. This review will hopefully increase efforts to develop methods that use EEG signals for determining and classifying the depth of anesthesia with a high data rate to produce a flexible and reliable detection device. PMID:23686141
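
    A minimal sketch of one common EEG processing step relevant to depth-of-anesthesia monitoring: estimating relative band powers from a Welch spectrum. The synthetic signal, sampling rate and band edges are assumptions, not taken from the methods reviewed here.

        # Hedged sketch: relative EEG band power from a Welch power spectrum
        import numpy as np
        from scipy.signal import welch

        fs = 256                                           # assumed sampling rate, Hz
        t = np.arange(0, 30, 1 / fs)
        rng = np.random.default_rng(7)
        eeg = (np.sin(2 * np.pi * 10 * t)                  # alpha-band component
               + 0.5 * np.sin(2 * np.pi * 2 * t)           # delta-band component
               + 0.5 * rng.normal(size=t.size))            # broadband noise

        f, psd = welch(eeg, fs=fs, nperseg=4 * fs)
        bands = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}
        total = np.sum(psd[(f >= 0.5) & (f < 30)])
        for name, (lo, hi) in bands.items():
            sel = (f >= lo) & (f < hi)
            print(f"{name} relative power: {np.sum(psd[sel]) / total:.2f}")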

  7. Model correction factor method for system analysis

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Johannesen, Johannes M.

    2000-01-01

    The Model Correction Factor Method is an intelligent response surface method based on simplified modeling. MCFM is aimed at reliability analysis in the case of a limit state defined by an elaborate model. Herein it is demonstrated that the method is applicable for elaborate limit state surfaces on which... several locally most central points exist without there being a simple geometric definition of the corresponding failure modes, such as is the case for collapse mechanisms in rigid plastic hinge models for frame structures. Taking as simplified idealized model a model of similarity with the elaborate model... surface than existing in the idealized model....

  8. Analysis techniques for background rejection at the Majorana Demonstrator

    Energy Technology Data Exchange (ETDEWEB)

    Cuestra, Clara [University of Washington; Rielage, Keith Robert [Los Alamos National Laboratory; Elliott, Steven Ray [Los Alamos National Laboratory; Xu, Wenqin [Los Alamos National Laboratory; Goett, John Jerome III [Los Alamos National Laboratory

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  9. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  10. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    Energy Technology Data Exchange (ETDEWEB)

    Cuesta, C.; Buuck, M.; Detwiler, J. A.; Gruszko, J.; Guinn, I. S.; Leon, J.; Robertson, R. G. H. [Center for Experimental Nuclear Physics and Astrophysics, and Department of Physics, University of Washington, Seattle, WA (United States); Abgrall, N.; Bradley, A. W.; Chan, Y-D.; Mertens, S.; Poon, A. W. P. [Nuclear Science Division, Lawrence Berkeley National Laboratory, Berkeley, CA (United States); Arnquist, I. J.; Hoppe, E. W.; Kouzes, R. T.; LaFerriere, B. D.; Orrell, J. L. [Pacific Northwest National Laboratory, Richland, WA (United States); Avignone, F. T. [Department of Physics and Astronomy, University of South Carolina, Columbia, SC (United States); Oak Ridge National Laboratory, Oak Ridge, TN (United States); Baldenegro-Barrera, C. X.; Bertrand, F. E. [Oak Ridge National Laboratory, Oak Ridge, TN (United States); and others

    2015-08-17

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  11. Radial Velocity Data Analysis with Compressed Sensing Techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2016-09-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.
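
    For reference, the classical Lomb-Scargle periodogram that the proposed tool is compared against can be sketched as below; the irregular epochs, injected period and noise level are simulated, not the published radial velocity data sets.

        # Hedged sketch: Lomb-Scargle periodogram of irregularly sampled velocities
        import numpy as np
        from scipy.signal import lombscargle

        rng = np.random.default_rng(8)
        t = np.sort(rng.uniform(0, 200, 120))            # irregular observation epochs, days
        period = 14.3
        rv = 5.0 * np.sin(2 * np.pi * t / period) + rng.normal(scale=1.0, size=t.size)

        freqs = np.linspace(0.005, 0.5, 5000)            # trial frequencies, cycles per day
        power = lombscargle(t, rv - rv.mean(), 2 * np.pi * freqs)   # angular frequencies
        print(f"best period ~ {1 / freqs[np.argmax(power)]:.1f} days")   # expected near 14.3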

  12. Analysis techniques for background rejection at the MAJORANA DEMONSTRATOR

    CERN Document Server

    Cuesta, C; Arnquist, I J; Avignone, F T; Baldenegro-Barrera, C X; Barabash, A S; Bertrand, F E; Bradley, A W; Brudanin, V; Busch, M; Buuck, M; Byram, D; Caldwell, A S; Chan, Y-D; Christofferson, C D; Detwiler, J A; Efremenko, Yu; Ejiri, H; Elliott, S R; Galindo-Uribarri, A; Gilliss, T; Giovanetti, G K; Goett, J; Green, M P; Gruszko, J; Guinn, I S; Guiseppe, V E; Henning, R; Hoppe, E W; Howard, S; Howe, M A; Jasinski, B R; Keeter, K J; Kidd, M F; Konovalov, S I; Kouzes, R T; LaFerriere, B D; Leon, J; MacMullin, J; Martin, R D; Meijer, S J; Mertens, S; Orrell, J L; O'Shaughnessy, C; Poon, A W P; Radford, D C; Rager, J; Rielage, K; Robertson, R G H; Romero-Romero, E; Shanks, B; Shirchenko, M; Snyder, N; Suriano, A M; Tedeschi, D; Trimble, J E; Varner, R L; Vasilyev, S; Vetter, K; Vorren, K; White, B R; Wilkerson, J F; Wiseman, C; Xu, W; Yakushev, E; Yu, C -H; Yumatov, V; Zhitnikov, I

    2015-01-01

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  13. ANALYSIS OF ANDROID VULNERABILITIES AND MODERN EXPLOITATION TECHNIQUES

    Directory of Open Access Journals (Sweden)

    Himanshu Shewale

    2014-03-01

    Full Text Available Android is an operating system based on the Linux kernel. It is the most widely used and popular operating system among smartphones and portable devices. Its programmable and open nature attracts attackers to take undue advantage. The Android platform allows developers to freely access and modify source code, but at the same time this increases the security concern. A user is likely to download and install malicious applications written by software hackers. This paper focuses on understanding and analyzing the vulnerabilities present in the Android platform. First we study the Android architecture and analyze the existing threats and security weaknesses. Then we identify various exploit mitigation techniques to mitigate known vulnerabilities. A detailed analysis will help us to identify the existing loopholes and will give strategic direction to making the Android operating system more secure.

  14. Radial Velocity Data Analysis with Compressed Sensing Techniques

    CERN Document Server

    Hara, Nathan C; Laskar, Jacques; Correia, Alexandre C M

    2016-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian processes framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  15. Radial velocity data analysis with compressed sensing techniques

    Science.gov (United States)

    Hara, Nathan C.; Boué, G.; Laskar, J.; Correia, A. C. M.

    2017-01-01

    We present a novel approach for analysing radial velocity data that combines two features: all the planets are searched at once and the algorithm is fast. This is achieved by utilizing compressed sensing techniques, which are modified to be compatible with the Gaussian process framework. The resulting tool can be used like a Lomb-Scargle periodogram and has the same aspect but with much fewer peaks due to aliasing. The method is applied to five systems with published radial velocity data sets: HD 69830, HD 10180, 55 Cnc, GJ 876 and a simulated very active star. The results are fully compatible with previous analysis, though obtained more straightforwardly. We further show that 55 Cnc e and f could have been respectively detected and suspected in early measurements from the Lick Observatory and Hobby-Eberly Telescope available in 2004, and that frequencies due to dynamical interactions in GJ 876 can be seen.

  16. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    Science.gov (United States)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    ACAT2011 This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the

  17. Adolescent baseball pitching technique: lower extremity biomechanical analysis.

    Science.gov (United States)

    Milewski, Matthew D; Õunpuu, Sylvia; Solomito, Matthew; Westwell, Melany; Nissen, Carl W

    2012-11-01

    Documentation of the lower extremity motion patterns of adolescent pitchers is an important part of understanding the pitching motion and the implication of lower extremity technique for upper extremity loads, injury and performance. The purpose of this study was to take the initial step in this process by documenting the biomechanics of the lower extremities during the pitching cycle in adolescent pitchers and comparing these findings with the published data for older pitchers. Three-dimensional motion analysis using a comprehensive lower extremity model was used to evaluate the fastball pitch technique in adolescent pitchers. Thirty-two pitchers with a mean age of 12.4 years (range 10.5-14.7 years) and at least 2 years of experience were included in this study. The pitchers showed a mean of 49 ± 12° of knee flexion of the lead leg at foot contact. They tended to maintain this position through ball release, and then extended their knee during the follow-through phase (ball release to maximal internal glenohumeral rotation). The lead leg hip rapidly progressed into adduction and flexion during the arm cocking phase, with a range of motion of 40 ± 10° adduction and 30 ± 13° flexion. The lead hip mean peak adduction velocity was 434 ± 83°/s and flexion velocity was 456 ± 156°/s. Simultaneously, the trailing leg hip rapidly extended, approaching a mean peak extension of -8 ± 5° at 39% of the pitch cycle, which is close to passive range of motion constraints. Peak hip abduction of the trailing leg at foot contact was -31 ± 12°, which also approached passive range of motion constraints. Differences and similarities were also noted between the adolescent and adult pitchers' lower extremity kinematics; however, a more comprehensive analysis using similar methods is needed for a complete comparison.
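
    Peak joint angular velocities such as those reported above come from numerically differentiating the angle-time series obtained from motion capture. The sketch below shows the idea on a simulated hip-adduction trace; the capture rate and angle profile are assumptions, not the study's data.

        # Hedged sketch: peak angular velocity from a joint-angle time series
        import numpy as np

        fs = 240                                          # assumed capture rate, frames/s
        t = np.arange(0, 0.5, 1 / fs)
        # simulated hip adduction angle (deg) rising ~40 deg during arm cocking
        angle = 40 / (1 + np.exp(-(t - 0.25) / 0.03))

        omega = np.gradient(angle, t)                     # angular velocity, deg/s
        print(f"peak adduction velocity ~ {omega.max():.0f} deg/s")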

  18. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...
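
    For orientation only, the sketch below runs a much simpler single-subject maximum-likelihood factor analysis (scikit-learn) on simulated fMRI-like data; it has neither the sparsity, the ARD pruning nor the group-level noise model of the psFA approach described above.

        # Hedged sketch: plain factor analysis as a simple single-subject baseline
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(9)
        n_time, n_voxels, n_comp = 200, 200, 3
        sources = rng.normal(size=(n_time, n_comp))            # latent time courses
        maps = rng.normal(size=(n_comp, n_voxels))             # spatial maps
        data = sources @ maps + 0.5 * rng.normal(size=(n_time, n_voxels))

        fa = FactorAnalysis(n_components=n_comp)
        time_courses = fa.fit_transform(data)                  # estimated latent factors
        spatial_maps = fa.components_                           # estimated maps (n_comp x n_voxels)
        print(time_courses.shape, spatial_maps.shape)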

  19. Disruptive Event Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2004-09-08

    This analysis report is one of the technical reports containing documentation of the Environmental Radiation Model for Yucca Mountain, Nevada (ERMYN), a biosphere model supporting the total system performance assessment (TSPA) for the license application (LA) for the Yucca Mountain repository. This analysis report describes the development of biosphere dose conversion factors (BDCFs) for the volcanic ash exposure scenario, and the development of dose factors for calculating inhalation dose during volcanic eruption. A graphical representation of the documentation hierarchy for the ERMYN is presented in Figure 1-1. This figure shows the interrelationships among the products (i.e., analysis and model reports) developed for biosphere modeling and provides an understanding of how this analysis report contributes to biosphere modeling. This report is one of two reports that develop biosphere BDCFs, which are input parameters for the TSPA model. The "Biosphere Model Report" (BSC 2004 [DIRS 169460]) describes in detail the ERMYN conceptual model and mathematical model. The input parameter reports, shown to the right of the Biosphere Model Report in Figure 1-1, contain detailed descriptions of the model input parameters, their development and the relationship between the parameters and specific features, events and processes (FEPs). This report describes biosphere model calculations and their output, the BDCFs, for the volcanic ash exposure scenario. This analysis receives direct input from the outputs of the "Biosphere Model Report" (BSC 2004 [DIRS 169460]) and from the five analyses that develop parameter values for the biosphere model (BSC 2004 [DIRS 169671]; BSC 2004 [DIRS 169672]; BSC 2004 [DIRS 169673]; BSC 2004 [DIRS 169458]; and BSC 2004 [DIRS 169459]). The results of this report are further analyzed in the "Biosphere Dose Conversion Factor Importance and Sensitivity Analysis". The objective of this

  20. Anatomy-based transmission factors for technique optimization in portable chest x-ray

    Science.gov (United States)

    Liptak, Christopher L.; Tovey, Deborah; Segars, William P.; Dong, Frank D.; Li, Xiang

    2015-03-01

    Portable x-ray examinations often account for a large percentage of all radiographic examinations. Currently, portable examinations do not employ automatic exposure control (AEC). To aid in the design of a size-specific technique chart, acrylic slabs of various thicknesses are often used to estimate x-ray transmission for patients of various body thicknesses. This approach, while simple, does not account for patient anatomy, tissue heterogeneity, and the attenuation properties of the human body. To better account for these factors, in this work, we determined x-ray transmission factors using computational patient models that are anatomically realistic. A Monte Carlo program was developed to model a portable x-ray system. Detailed modeling was done of the x-ray spectrum, detector positioning, collimation, and source-to-detector distance. Simulations were performed using 18 computational patient models from the extended cardiac-torso (XCAT) family (9 males, 9 females; age range: 2-58 years; weight range: 12-117 kg). The ratio of air kerma at the detector with and without a patient model was calculated as the transmission factor. Our study showed that the transmission factor decreased exponentially with increasing patient thickness. For the range of patient thicknesses examined (12-28 cm), the transmission factor ranged from approximately 21% to 1.9% when the air kerma used in the calculation represented an average over the entire imaging field of view. The transmission factor ranged from approximately 21% to 3.6% when the air kerma used in the calculation represented the average signals from two discrete AEC cells behind the lung fields. These exponential relationships may be used to optimize imaging techniques for patients of various body thicknesses to aid in the design of clinical technique charts.
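
    As a rough illustration of how such an exponential relationship could be used, the short sketch below fits T(t) = a·exp(mu·t) to a handful of made-up (thickness, transmission) pairs chosen only to resemble the reported range; the numbers and the fit are illustrative assumptions, not the study's data.

```python
import numpy as np

# Hypothetical (thickness [cm], transmission factor) pairs spanning roughly the
# reported range (~21% at 12 cm down to ~1.9% at 28 cm); not the paper's data.
thickness = np.array([12.0, 16.0, 20.0, 24.0, 28.0])
transmission = np.array([0.21, 0.115, 0.063, 0.035, 0.019])

# Fit T(t) = a * exp(mu * t) by linear least squares on log(T).
mu, log_a = np.polyfit(thickness, np.log(transmission), 1)
a = np.exp(log_a)
print(f"T(t) ~ {a:.3f} * exp({mu:.4f} * t)")

# Predict the transmission for a new patient thickness, e.g. 18 cm.
t_new = 18.0
print(f"Predicted transmission at {t_new} cm: {a * np.exp(mu * t_new):.3f}")
```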

  1. Analysis of Proposed Noise Detection & Removal Technique in Degraded Fingerprint Images

    Science.gov (United States)

    Hamid, Ainul Azura Abdul; Rahim, Mohd Shafry Mohd; Al-Mazyad, Abdulaziz S.; Saba, Tanzila

    2015-12-01

    The quality of fingerprint images is important to ensure good performance of fingerprint recognition, since the recognition process depends heavily on it. Fingerprint images obtained from the acquisition phase are either contaminated with noise or degraded due to poor quality machines. Several factors, such as scars and moisture on the scanner, introduce noise and affect the quality of the images during the scanning process. This paper presents an analysis and comparison of noise removal techniques reported in the literature for fingerprint images. We also implemented histogram equalization, median filter, Fourier transform, unsharp mask and grayscale enhancement techniques. The quality of the enhanced images is measured by peak signal to noise ratio (PSNR) calculation for analysis and comparison.
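
    For readers unfamiliar with the metric, the minimal sketch below shows one common way to compute PSNR between an original and an enhanced grayscale image with NumPy; the synthetic arrays are placeholders, not the fingerprint data used in the paper.

```python
import numpy as np

def psnr(original: np.ndarray, enhanced: np.ndarray, max_value: float = 255.0) -> float:
    """Peak signal-to-noise ratio between two same-sized grayscale images."""
    mse = np.mean((original.astype(np.float64) - enhanced.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10((max_value ** 2) / mse)

# Toy usage with synthetic data (real use would load fingerprint images instead).
rng = np.random.default_rng(0)
clean = rng.integers(0, 256, size=(256, 256))
noisy = np.clip(clean + rng.normal(0, 10, size=clean.shape), 0, 255)
print(f"PSNR of noisy vs clean: {psnr(clean, noisy):.2f} dB")
```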

  2. The influence of "C-factor" and light activation technique on polymerization contraction forces of resin composite

    Directory of Open Access Journals (Sweden)

    Sérgio Kiyoshi Ishikiriama

    2012-12-01

    Full Text Available OBJECTIVES: This study evaluated the influence of the cavity configuration factor ("C-Factor") and light activation technique on polymerization contraction forces of a Bis-GMA-based composite resin (Charisma, Heraeus Kulzer). MATERIAL AND METHODS: Three different pairs of steel moving bases were connected to a universal testing machine (EMIC DL 500): groups A and B - 2x2 mm (CF=0.33), groups C and D - 3x2 mm (CF=0.66), groups E and F - 6x2 mm (CF=1.5). After adjustment of the height between the pair of bases so that the resin had a volume of 12 mm³ in all groups, the material was inserted and polymerized by two different methods: pulse delay (100 mW/cm² for 5 s, 40 s interval, 600 mW/cm² for 20 s) and continuous pulse (600 mW/cm² for 20 s). Each configuration was light cured with both techniques. Tensions generated during polymerization were recorded for 120 s. The values were expressed as curves (Force (N) x Time (s)) and averages compared by statistical analysis (ANOVA and Tukey's test, p<0.05). RESULTS: For the 2x2 and 3x2 bases, with a reduced C-Factor, significant differences were found between the light curing methods. For the 6x2 base, with a high C-Factor, the light curing method did not influence the contraction forces of the composite resin. CONCLUSIONS: The pulse delay technique can produce less stress at the tooth/restoration interface of adhesive restorations only when a reduced C-Factor is present.
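
    The configuration factor is conventionally defined as the ratio of bonded to unbonded (free) surface area. The sketch below assumes the quoted values follow from treating the two base faces as bonded and the lateral surface of the resin column (whose height is fixed by the 12 mm³ volume) as unbonded; under that assumption it reproduces CF = 0.33 and 1.5 for the 2x2 and 6x2 bases and approximately the quoted 0.66 for the 3x2 base, so the geometric interpretation is an assumption rather than the study's stated method.

```python
def configuration_factor(length_mm: float, width_mm: float, height_mm: float) -> float:
    """C-factor: bonded area (two base faces) over unbonded lateral area."""
    bonded = 2 * length_mm * width_mm
    unbonded = 2 * (length_mm + width_mm) * height_mm
    return bonded / unbonded

volume_mm3 = 12.0  # resin volume fixed for all groups in the study
for length, width in [(2, 2), (3, 2), (6, 2)]:
    height = volume_mm3 / (length * width)
    print(f"{length}x{width} mm base: CF = {configuration_factor(length, width, height):.2f}")
```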

  3. FACTORS AFFECTING THE MECHANICAL PROPERTIES OF COMPACT BONE AND MINIATURE SPECIMEN TEST TECHNIQUES: A REVIEW

    Directory of Open Access Journals (Sweden)

    Vandana Chittibabu

    2016-12-01

    Full Text Available This paper reviews the mechanical properties of bone and miniature specimen test techniques. Factors such as moisture content, mineralization, age, species, anatomical location, gender and rate of deformation all affect the mechanical properties of bone, and a realistic understanding of bone behaviour requires a clear picture of their roles. A general survey of existing research work on this aspect is presented. The essential features of miniature specimen test techniques are described, along with the application of the small punch test method to evaluate the mechanical behavior of materials. The procedure for determining tensile and fracture properties such as yield strength, ultimate strength, ductility and fracture toughness using the small punch test technique is described. The empirical equations proposed by various investigators for the prediction of tensile and fracture properties are presented and discussed. In some cases, the predictions of material properties have been made essentially through finite element simulation. The finite element simulation of miniature specimen test techniques is also covered in this review. The use of an inverse finite element procedure for the prediction of the uniaxial tensile constitutive behaviour of materials is also presented.

  4. Efficient geometric rectification techniques for spectral analysis algorithm

    Science.gov (United States)

    Chang, C. Y.; Pang, S. S.; Curlander, J. C.

    1992-01-01

    The spectral analysis algorithm is a viable technique for processing synthetic aperture radar (SAR) data at near real-time throughput rates by trading off image resolution. One major challenge of the spectral analysis algorithm is that the output image, often referred to as the range-Doppler image, is represented along iso-range and iso-Doppler lines, a curved grid format. This phenomenon is known as the fan-shape effect. Therefore, resampling is required to convert the range-Doppler image into a rectangular grid format before the individual images can be overlaid together to form seamless multi-look strip imagery. An efficient algorithm for geometric rectification of the range-Doppler image is presented. The proposed algorithm, realized in two one-dimensional resampling steps, takes into consideration the fan-shape phenomenon of the range-Doppler image as well as the high squint angle and updates of the cross-track and along-track Doppler parameters. No ground reference points are required.
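
    The idea of performing the rectification as two separable one-dimensional resampling passes can be sketched as below. This toy version simply interpolates each row onto a uniform cross-track grid and then each column onto a uniform along-track grid; the coordinate arrays and their meaning are assumptions for illustration, and the real algorithm additionally handles the squint angle and Doppler-parameter updates.

```python
import numpy as np

def rectify_two_pass(img, row_coords, col_coords, x_out, y_out):
    """Toy two-pass rectification: 1-D interpolation along rows, then along columns.
    row_coords[i, :] gives the (monotonically increasing) cross-track position of each
    pixel in row i; col_coords[:, j] gives the along-track position of each pixel in
    column j of the intermediate grid produced by the first pass."""
    n_rows, _ = img.shape
    pass1 = np.empty((n_rows, x_out.size))
    for i in range(n_rows):                      # resample every range line
        pass1[i, :] = np.interp(x_out, row_coords[i, :], img[i, :])
    out = np.empty((y_out.size, x_out.size))
    for j in range(x_out.size):                  # then resample every output column
        out[:, j] = np.interp(y_out, col_coords[:, j], pass1[:, j])
    return out
```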

  5. Pattern recognition software and techniques for biological image analysis.

    Directory of Open Access Journals (Sweden)

    Lior Shamir

    Full Text Available The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  6. Pattern recognition software and techniques for biological image analysis.

    Science.gov (United States)

    Shamir, Lior; Delaney, John D; Orlov, Nikita; Eckley, D Mark; Goldberg, Ilya G

    2010-11-24

    The increasing prevalence of automated image acquisition systems is enabling new types of microscopy experiments that generate large image datasets. However, there is a perceived lack of robust image analysis systems required to process these diverse datasets. Most automated image analysis systems are tailored for specific types of microscopy, contrast methods, probes, and even cell types. This imposes significant constraints on experimental design, limiting their application to the narrow set of imaging methods for which they were designed. One of the approaches to address these limitations is pattern recognition, which was originally developed for remote sensing, and is increasingly being applied to the biology domain. This approach relies on training a computer to recognize patterns in images rather than developing algorithms or tuning parameters for specific image processing tasks. The generality of this approach promises to enable data mining in extensive image repositories, and provide objective and quantitative imaging assays for routine use. Here, we provide a brief overview of the technologies behind pattern recognition and its use in computer vision for biological and biomedical imaging. We list available software tools that can be used by biologists and suggest practical experimental considerations to make the best use of pattern recognition techniques for imaging assays.

  7. Dispersion-theoretical analysis of the nucleon electromagnetic form factors

    Energy Technology Data Exchange (ETDEWEB)

    Belushkin, M.

    2007-09-29

    The structure of the proton and the neutron is of fundamental importance for the study of the strong interaction dynamics over a wide range of momentum transfers. The nucleon form factors encode information on the internal structure of the nucleon as probed by the electromagnetic interaction, and, to a certain extent, reflect the charge and magnetisation distributions within the proton and the neutron. In this thesis we report on our investigation of the electromagnetic form factors of the proton and the neutron with dispersion relation techniques, including known experimental input on the ππ, K K̄ and the ρπ continua and perturbative QCD constraints. We include new experimental data on the pion form factor and the nucleon form factors in our simultaneous analysis of all four form factors in both the space- and the timelike regions for all momentum transfers, and perform Monte Carlo sampling in order to obtain theoretical uncertainty bands. Finally, we discuss the implications of our results on the pion cloud of the nucleon, the nucleon radii and the Okubo-Zweig-Iizuka rule, and present our results of a model-independent approach to estimating two-photon effects in elastic electron-proton scattering. (orig.)

  9. International publications about sustainability: a review of articles using the technique of qualitative content analysis

    Directory of Open Access Journals (Sweden)

    Cristiane Froehlich

    2014-04-01

    Full Text Available This study aims to identify articles related to sustainability in international publications and to analyze the categories that emerge from these studies through the technique of qualitative content analysis, in order to identify the main approaches, author contributions, theoretical gaps and suggestions for further studies. For the application of the technique, about 20 articles were selected from journals with a relevant impact factor that address sustainability, and the three basic steps for content analysis listed by Bardin (1977) were used: (a) pre-analysis; (b) material exploration; and (c) data treatment, inference and interpretation. The main results show that the major theories related to sustainability are the resources and capabilities theory, institutional theory, stakeholder theory, the theory of market orientation, and supply chain and competitive advantage theory, all of which seek to explain the factors that facilitate or hinder the practice of corporate sustainability. In addition, some suggestions for further research were identified and an analysis of the results is presented in this study.

  10. ENERGY EFFICIENCY ANALYSIS OF ERROR CORRECTION TECHNIQUES IN UNDERWATER WIRELESS SENSOR NETWORKS

    Directory of Open Access Journals (Sweden)

    M. NORDIN B. ZAKARIA

    2011-02-01

    Full Text Available Research in underwater acoustic networks has developed rapidly to support a large variety of applications such as mining equipment and environmental monitoring. As in terrestrial sensor networks, reliable data transport is demanded in underwater sensor networks. The energy efficiency of error correction techniques should be considered because of the severe energy constraints of underwater wireless sensor networks. Forward error correction (FEC) and automatic repeat request (ARQ) are the two main error correction techniques in underwater networks. In this paper, a mathematical energy efficiency analysis of FEC and ARQ techniques in the underwater environment has been done based on communication distance and packet size. The effects of wind speed and shipping factor are studied. A comparison between FEC and ARQ in terms of energy efficiency is performed; it is found that the energy efficiency of both techniques increases with increasing packet size at short distances, but decreases at longer distances. There is also a cut-off distance below which ARQ is more energy efficient than FEC, and beyond which FEC is more energy efficient than ARQ. This cut-off distance decreases with increasing wind speed. Wind speed has a great effect on energy efficiency, whereas the shipping factor has a negligible effect on energy efficiency for both techniques.

  11. Factorized molecular wave functions: Analysis of the nuclear factor

    Energy Technology Data Exchange (ETDEWEB)

    Lefebvre, R., E-mail: roland.lefebvre@u-psud.fr [Institut des Sciences Moléculaires d’ Orsay, Bâtiment 350, UMR8214, CNRS- Université. Paris-Sud, 91405 Orsay, France and Sorbonne Universités, UPMC Univ Paris 06, UFR925, F-75005 Paris (France)

    2015-06-07

    The exact factorization of molecular wave functions leads to nuclear factors which should be nodeless functions. We reconsider the case of vibrational perturbations in a diatomic species, a situation usually treated by combining Born-Oppenheimer products. It was shown [R. Lefebvre, J. Chem. Phys. 142, 074106 (2015)] that it is possible to derive, from the solutions of coupled equations, the form of the factorized function. By increasing artificially the interstate coupling in the usual approach, the adiabatic regime can be reached, whereby the wave function can be reduced to a single product. The nuclear factor of this product is determined by the lowest of the two potentials obtained by diagonalization of the potential matrix. By comparison with the nuclear wave function of the factorized scheme, it is shown that by a simple rectification, an agreement is obtained between the modified nodeless function and that of the adiabatic scheme.
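
    For context, the exact factorization referred to here writes the molecular wave function as a single product of a nuclear factor and a normalized, R-dependent electronic factor; a standard statement of it (shown below as a reminder, not a result of the paper) is:

```latex
\Psi(r,R) \;=\; \chi(R)\,\Phi_R(r),
\qquad \int \lvert \Phi_R(r)\rvert^{2}\,\mathrm{d}r = 1 \ \text{for every } R,
\qquad \lvert \chi(R)\rvert^{2} = \int \lvert \Psi(r,R)\rvert^{2}\,\mathrm{d}r ,
```

    so the nuclear factor carries the exact nuclear probability density, which is why it is expected to be nodeless.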

  12. Emerging techniques for soil analysis via mid-infrared spectroscopy

    Science.gov (United States)

    Linker, R.; Shaviv, A.

    2009-04-01

    Transmittance and diffuse reflectance (DRIFT) spectroscopy in the mid-IR range are well-established methods for soil analysis. Over the last five years, additional mid-IR techniques have been investigated, and in particular: 1. Attenuated total reflectance (ATR) Attenuated total reflectance is commonly used for analysis of liquids and powders for which simple transmittance measurements are not possible. The method relies on a crystal with a high refractive index, which is in contact with the sample and serves as a waveguide for the IR radiation. The radiation beam is directed in such a way that it hits the crystal/sample interface several times, each time penetrating a few microns into the sample. Since the penetration depth is limited to a few microns, very good contact between the sample and the crystal must be ensured, which can be achieved by working with samples close to water saturation. However, the strong absorbance of water in the mid-infrared range as well as the absorbance of some soil constituents (e.g., calcium carbonate) interfere with some of the absorbance bands of interest. This has led to the development of several post-processing methods for analysis of the spectra. The FTIR-ATR technique has been successfully applied to soil classification as well as to determination of nitrate concentration [1, 6-8, 10]. Furthermore, Shaviv et al. [12] demonstrated the possibility of using fiber optics as an ATR device for direct determination of nitrate concentration in soil extracts. Recently, Du et al. [5] showed that it is possible to differentiate between 14N and 15N in such spectra, which opens very promising opportunities for developing FTIR-ATR based methods for investigating nitrogen transformation in soils by tracing changes in N-isotopic species. 2. Photo-acoustic spectroscopy Photoacoustic spectroscopy (PAS) is based on absorption-induced heating of the sample, which produces pressure fluctuations in a surrounding gas. These fluctuations are

  13. Manure management and greenhouse gas mitigation techniques : a comparative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Langmead, C.

    2003-09-03

    Alberta is the second largest agricultural producer in Canada, ranking just behind Ontario. Approximately 62 per cent of the province's farm cash receipts are attributable to the livestock industry. Farmers today maintain large numbers of a single animal type. The drivers for more advanced manure management systems include: the trend towards confined feeding operations (CFO) is creating large, concentrated quantities of manure; public perception of CFO; implementation of provincial legislation regulating the expansion and construction of CFO; ratification of the Kyoto Protocol raised interest in the development of improved manure management systems capable of reducing greenhouse gas (GHG) emissions; and rising energy costs. The highest methane emissions factors are found with liquid manure management systems. They contribute more than 80 per cent of the total methane emissions from livestock manure in Alberta. The author identified and analyzed three manure management techniques to mitigate GHG emissions. They were: bio-digesters, gasification systems, and composting. Three recommendations were made to establish a strategy to support emissions offsets and maximize the reduction of methane emissions from the livestock industry. The implementation of bio-digesters, especially for the swine industry, was recommended. It was suggested that a gasification pilot project for poultry manure should be pursued by Climate Change Central. Public outreach programs promoting composting of cattle manure for beef feedlots and older style dairy barns should also be established. 19 refs., 11 tabs., 3 figs.

  14. Two-dimensional Imaging Velocity Interferometry: Technique and Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Erskine, D J; Smith, R F; Bolme, C; Celliers, P; Collins, G

    2011-03-23

    We describe the data analysis procedures for an emerging interferometric technique for measuring motion across a two-dimensional image at a moment in time, i.e. a snapshot 2d-VISAR. Velocity interferometers (VISAR) measuring target motion to high precision have been an important diagnostic in shockwave physics for many years. Until recently, this diagnostic has been limited to measuring motion at points or lines across a target. We introduce an emerging interferometric technique for measuring motion across a two-dimensional image, which could be called a snapshot 2d-VISAR. If a sufficiently fast movie camera technology existed, it could be placed behind a traditional VISAR optical system and record a 2d image vs time. But since that technology is not yet available, we use a CCD detector to record a single 2d image, with the pulsed nature of the illumination providing the time resolution. Consequently, since we are using pulsed illumination having a coherence length shorter than the VISAR interferometer delay (≈0.1 ns), we must use the white light velocimetry configuration to produce fringes with significant visibility. In this scheme, two interferometers (illuminating, detecting) having nearly identical delays are used in series, with one before the target and one after. This produces fringes with at most 50% visibility, but otherwise has the same fringe shift per target motion of a traditional VISAR. The 2d-VISAR observes a new world of information about shock behavior not readily accessible by traditional point or 1d-VISARs, simultaneously providing both a velocity map and an 'ordinary' snapshot photograph of the target. The 2d-VISAR has been used to observe nonuniformities in NIF related targets (polycrystalline diamond, Be), and in Si and Al.
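
    Since the fringe shift per unit target velocity is the same as for a conventional VISAR, turning the measured 2-D fringe-shift map into a velocity map is, to first order, a multiplication by the instrument's velocity-per-fringe constant. The sketch below illustrates only that proportionality, with a made-up constant and no dispersion or window corrections.

```python
import numpy as np

VPF_KM_PER_S = 1.25  # hypothetical velocity-per-fringe constant set by the delay etalon

def velocity_map(fringe_shift_map: np.ndarray) -> np.ndarray:
    """Convert a 2-D fringe-shift map (in fringes) into an apparent-velocity map."""
    return VPF_KM_PER_S * fringe_shift_map
```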

  15. Sample preparation and in situ hybridization techniques for automated molecular cytogenetic analysis of white blood cells

    Energy Technology Data Exchange (ETDEWEB)

    Rijke, F.M. van de; Vrolijk, H.; Sloos, W. [Leiden Univ. (Netherlands)] [and others

    1996-06-01

    With the advent of in situ hybridization techniques for the analysis of chromosome copy number or structure in interphase cells, the diagnostic and prognostic potential of cytogenetics has been augmented considerably. In theory, the strategies for detection of cytogenetically aberrant cells by in situ hybridization are simple and straightforward. In practice, however, they are fallible, because false classification of hybridization spot number or patterns occurs. When a decision has to be made on molecular cytogenetic normalcy or abnormalcy of a cell sample, the problem of false classification becomes particularly prominent if the fraction of aberrant cells is relatively small. In such mosaic situations, often > 200 cells have to be evaluated to reach a statistically sound figure. The manual enumeration of in situ hybridization spots in many cells in many patient samples is tedious. Assistance in the evaluation process by automation of microscope functions and image analysis techniques is, therefore, strongly indicated. Next to research and development of microscope hardware, camera technology, and image analysis, the optimization of the specimen for the (semi)automated microscopic analysis is essential, since factors such as cell density, thickness, and overlap have dramatic influences on the speed and complexity of the analysis process. Here we describe experiments that have led to a protocol for blood cell specimens that results in microscope preparations that are well suited for automated molecular cytogenetic analysis. 13 refs., 4 figs., 1 tab.

  16. ANALYSIS OF RISK FACTORS ECTOPIC PREGNANCY

    Directory of Open Access Journals (Sweden)

    Budi Santoso

    2017-04-01

    Full Text Available Introduction: Ectopic pregnancy is a pregnancy with extrauterine implantation. This situation is a gynecologic emergency that contributes to maternal mortality. Therefore, early recognition, based on identification of ectopic pregnancy risk factors, is needed. Methods: The design was descriptive observational. The samples were pregnant women who had ectopic pregnancy at the Maternity Room, Emergency Unit, Dr. Soetomo Hospital, Surabaya, from 1 July 2008 to 1 July 2010. The sampling technique was total sampling using medical records. Result: Patients with ectopic pregnancy numbered 99 individuals out of 2090 pregnant women who sought treatment in Dr. Soetomo Hospital. However, only 29 patients had traceable risk factors. Discussion: Most ectopic pregnancies were in the age group of 26-30 years, comprising 32 patients (32.32%), followed by 25 patients in the age group of 31-35 years (25.25%), 18 patients in the age group 21-25 years (18.18%), 17 patients in the age group 36-40 years (17.17%), 4 patients in the age group 41 years and older (4.04%), and the fewest in the age group of 16-20 years with 3 patients (3.03%). A total of 12 patients with ectopic pregnancy (41.38%) had a history of abortion, and 6 patients (20.69%) each were in the groups of ectopic pregnancy patients who used family planning and those with a history of surgery. There were 2 patients (6.90%) in the group of ectopic pregnancy patients who had both a history of surgery and a history of abortion. The incidence rate of ectopic pregnancy was 4.73%, mostly in the second gravidity (34.34%), whereas the nulliparous had the highest prevalence at 39.39%. Among acquired risk factors, history of operations was 10.34%, family planning use 20.69%, history of abortion 41.38%, history of abortion and operation 6.90%, and family planning use with history of abortion 20.69%.

  17. Factorial invariance in multilevel confirmatory factor analysis.

    Science.gov (United States)

    Ryu, Ehri

    2014-02-01

    This paper presents a procedure to test factorial invariance in multilevel confirmatory factor analysis. When the group membership is at level 2, multilevel factorial invariance can be tested by a simple extension of the standard procedure. However level-1 group membership raises problems which cannot be appropriately handled by the standard procedure, because the dependency between members of different level-1 groups is not appropriately taken into account. The procedure presented in this article provides a solution to this problem. This paper also shows Muthén's maximum likelihood (MUML) estimation for testing multilevel factorial invariance across level-1 groups as a viable alternative to maximum likelihood estimation. Testing multilevel factorial invariance across level-2 groups and testing multilevel factorial invariance across level-1 groups are illustrated using empirical examples. SAS macro and Mplus syntax are provided.

  18. Physiological Factors Analysis in Unpressurized Aircraft Cabins

    Science.gov (United States)

    Patrao, Luis; Zorro, Sara; Silva, Jorge

    2016-11-01

    Amateur and sports flight is an activity with growing numbers worldwide. However, the main cause of flight incidents and accidents is increasingly pilot error, for a number of reasons. Fatigue, sleep issues and hypoxia, among many others, are some that can be avoided, or, at least, mitigated. This article describes the analysis of psychological and physiological parameters during flight in unpressurized aircraft cabins. It relates cerebral oximetry and heart rate with altitude, as well as with flight phase. The study of those parameters might give clues on which variations represent a warning sign to the pilot, thus preventing incidents and accidents due to human factors. Results show that both cerebral oximetry and heart rate change along the flight and altitude in the alert pilot. The impaired pilot might not reveal these variations and, if this is detected, he can be warned in time.

  19. Fractographic ceramic failure analysis using the replica technique

    Science.gov (United States)

    Scherrer, Susanne S.; Quinn, Janet B.; Quinn, George D.; Anselm Wiskott, H. W.

    2007-01-01

    Objectives To demonstrate the effectiveness of in vivo replicas of fractured ceramic surfaces for descriptive fractography as applied to the analysis of clinical failures. Methods The fracture surface topography of partially failed veneering ceramic of a Procera Alumina molar and an In Ceram Zirconia premolar were examined utilizing gold-coated epoxy poured replicas viewed using scanning electron microscopy. The replicas were inspected for fractographic features such as hackle, wake hackle, twist hackle, compression curl and arrest lines for determination of the direction of crack propagation and location of the origin. Results For both veneering ceramics, replicas provided an excellent reproduction of the fractured surfaces. Fine details including all characteristic fracture features produced by the interaction of the advancing crack with the material's microstructure could be recognized. The observed features are indicators of the local direction of crack propagation and were used to trace the crack's progression back to its initial starting zone (the origin). Drawbacks of replicas such as artifacts (air bubbles) or imperfections resulting from inadequate epoxy pouring were noted but not critical for the overall analysis of the fractured surfaces. Significance The replica technique proved to be easy to use and allowed an excellent reproduction of failed ceramic surfaces. It should be applied before attempting to remove any failed part remaining in situ as the fracture surface may be damaged during this procedure. These two case studies are intended as an introduction for the clinical researcher in using qualitative (descriptive) fractography as a tool for understanding fracture processes in brittle restorative materials and, secondarily, to draw conclusions as to possible design inadequacies in failed restorations. PMID:17270267

  20. Learning From Hidden Traits: Joint Factor Analysis and Latent Clustering

    Science.gov (United States)

    Yang, Bo; Fu, Xiao; Sidiropoulos, Nicholas D.

    2017-01-01

    Dimensionality reduction techniques play an essential role in data analytics, signal processing and machine learning. Dimensionality reduction is usually performed in a preprocessing stage that is separate from subsequent data analysis, such as clustering or classification. Finding reduced-dimension representations that are well-suited for the intended task is more appealing. This paper proposes a joint factor analysis and latent clustering framework, which aims at learning cluster-aware low-dimensional representations of matrix and tensor data. The proposed approach leverages matrix and tensor factorization models that produce essentially unique latent representations of the data to unravel latent cluster structure -- which is otherwise obscured because of the freedom to apply an oblique transformation in latent space. At the same time, latent cluster structure is used as prior information to enhance the performance of factorization. Specific contributions include several custom-built problem formulations, corresponding algorithms, and discussion of associated convergence properties. Besides extensive simulations, real-world datasets such as Reuters document data and MNIST image data are also employed to showcase the effectiveness of the proposed approaches.
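
    The general flavour of such joint formulations can be sketched with a toy alternating scheme: a low-rank factorization is refit while the latent representations are pulled toward their current K-means centroids. This is not the authors' algorithm, only a minimal illustration of coupling factorization with latent clustering; the rank, penalty weight and iteration count are arbitrary.

```python
import numpy as np
from sklearn.cluster import KMeans

def joint_factorization_clustering(X, rank=5, n_clusters=3, n_iter=20, lam=0.1, seed=0):
    """Toy alternating scheme: low-rank factorization X ~ W H with a penalty pulling
    the latent rows of W toward their K-means cluster centroids."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.standard_normal((n, rank))
    for _ in range(n_iter):
        # Update H by least squares given W.
        H, *_ = np.linalg.lstsq(W, X, rcond=None)
        # Cluster the current latent representations.
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit(W)
        centers = km.cluster_centers_[km.labels_]
        # Update W by ridge-like least squares, shrunk toward the cluster centroids:
        # W = (X H^T + lam * C)(H H^T + lam * I)^{-1}.
        A = H @ H.T + lam * np.eye(rank)
        B = X @ H.T + lam * centers
        W = np.linalg.solve(A, B.T).T
    return W, H, km.labels_
```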

  1. Romanian medieval earring analysis by X-ray fluorescence technique

    Energy Technology Data Exchange (ETDEWEB)

    Therese, Laurent; Guillot, Philippe, E-mail: philippe.guillot@univ-jfc.fr [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Muja, Cristina [Laboratoire Diagnostics des Plasmas, CUFR J.F.C, Albi (France); Faculty of Biology, University of Bucharest (Romania); Vasile Parvan Institute of Archaeology, Bucharest, (Romania)

    2011-07-01

    Full text: Several instrumental techniques of elemental analysis are now used for the characterization of archaeological materials. The combination of archaeological and analytical information can provide significant knowledge on the constituting material origin, heritage authentication and restoration, provenance, migration, social interaction and exchange. Surface mapping techniques such as X-Ray Fluorescence have become a powerful tool for obtaining qualitative and semi-quantitative information about the chemical composition of cultural heritage materials, including metallic archaeological objects. In this study, the material comes from the Middle Age cemetery of Feldioara (Romania). The excavation of the site located between the evangelical church and the parsonage led to the discovery of several funeral artifacts in 18 graves among a total of 127 excavated. Even though the inventory was quite poor, some of the objects helped in establishing the chronology. Six anonymous Hungarian denarii (silver coins) were attributed to Geza II (1141-1161) and Stefan III (1162-1172), placing the cemetery in the second half of the XII century. This period was also confirmed by three loop shaped earrings with the end in 'S' form (one small and two large earrings). The small earring was found during the excavation in grave number 86, while the two others were discovered together in grave number 113. The anthropological study showed that the skeletons excavated from graves 86 and 113 belonged respectively to a child (1 individual, medium level preservation, 9 months +/- 3 months) and to an adult (1 individual). In this work, elemental maps were obtained by the X-ray fluorescence (XRF) technique with a Jobin Yvon Horiba XGT-5000 instrument offering detailed elemental images with a spatial resolution of 100 µm. The analysis revealed that the earrings were composed of copper, zinc and tin as major elements. Minor elements were also determined. The comparison between the two

  2. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    Science.gov (United States)

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…
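
    As a minimal, hedged illustration of the kind of analysis being discussed (not the authors' procedure), the sketch below runs an exploratory factor analysis with varimax rotation on a standardized item matrix using scikit-learn; the data are random placeholders, and a real study would also examine sample adequacy, factor retention criteria and the choice of rotation.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# Hypothetical item-response matrix: 300 respondents x 12 survey items.
rng = np.random.default_rng(42)
X = rng.normal(size=(300, 12))

Xz = StandardScaler().fit_transform(X)                   # standardize the items
fa = FactorAnalysis(n_components=3, rotation="varimax")  # extract 3 rotated factors
scores = fa.fit_transform(Xz)                            # factor scores per respondent
loadings = fa.components_.T                              # items x factors loading matrix
print(np.round(loadings, 2))
```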

  3. Stratified source-sampling techniques for Monte Carlo eigenvalue analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Mohamed, A.

    1998-07-10

    In 1995, at a conference on criticality safety, a special session was devoted to the Monte Carlo "Eigenvalue of the World" problem. Argonne presented a paper, at that session, in which the anomalies originally observed in that problem were reproduced in a much simplified model-problem configuration, and removed by a version of stratified source-sampling. In this paper, stratified source-sampling techniques are generalized and applied to three different Eigenvalue of the World configurations which take into account real-world statistical noise sources not included in the model problem, but which differ in the amount of neutronic coupling among the constituents of each configuration. It is concluded that, in Monte Carlo eigenvalue analysis of loosely-coupled arrays, the use of stratified source-sampling reduces the probability of encountering an anomalous result relative to conventional source-sampling methods. However, this gain in reliability is substantially less than that observed in the model-problem results.
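
    In spirit, stratified source-sampling replaces a single global draw of fission source sites with fixed-size draws from each stratum (for example, each loosely coupled unit), so that no constituent is starved of source particles by statistical fluctuation. The sketch below is a generic illustration of that idea, not the scheme used in the paper, and omits the statistical-weight bookkeeping a real code would need.

```python
import numpy as np

def stratified_source_sample(site_positions, site_weights, strata_ids, n_per_stratum, rng):
    """Pick next-generation source sites by sampling a fixed number from every stratum,
    with probabilities proportional to fission weight inside the stratum."""
    chosen = []
    for s in np.unique(strata_ids):
        mask = strata_ids == s
        p = site_weights[mask] / site_weights[mask].sum()
        idx = rng.choice(np.flatnonzero(mask), size=n_per_stratum, replace=True, p=p)
        chosen.append(site_positions[idx])
    return np.concatenate(chosen)
```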

  4. Sentiment Analysis of Twitter tweets using supervised classification technique

    Directory of Open Access Journals (Sweden)

    Pranav Waykar

    2016-05-01

    Full Text Available Making use of social media for analyzing the perceptions of the masses over a product, event or a person has gained momentum in recent times. Out of a wide array of social networks, we chose Twitter for our analysis as the opinions expressed there are concise and bear a distinctive polarity. Here, we collect the most recent tweets on users' area of interest and analyze them. The extracted tweets are then segregated as positive, negative and neutral. We do the classification in the following manner: collect the tweets using the Twitter API; then process the collected tweets to convert all letters to lowercase, eliminate special characters, etc., which makes the classification more efficient; finally, classify the processed tweets using a supervised classification technique. We make use of a Naive Bayes classifier to segregate the tweets as positive, negative and neutral. We use a set of sample tweets to train the classifier. The percentage of the tweets in each category is then computed and the result is represented graphically. The result can be used further to gain an insight into the views of the people using Twitter about a particular topic that is being searched by the user. It can help corporate houses devise strategies on the basis of the popularity of their product among the masses. It may help the consumers to make informed choices based on the general sentiment expressed by the Twitter users on a product.
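
    A minimal version of the described pipeline (collect, lowercase, vectorize, classify with Naive Bayes) can be put together with scikit-learn as below; the tiny hand-labelled training set is a stand-in for the sample tweets the authors used, and real use would first pull tweets through the Twitter API.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hand-labelled training sample (placeholders for real training tweets).
train_tweets = ["love this phone", "worst service ever", "it is okay I guess",
                "great camera and battery", "terrible update ruined it", "nothing special"]
train_labels = ["positive", "negative", "neutral",
                "positive", "negative", "neutral"]

model = make_pipeline(
    CountVectorizer(lowercase=True, stop_words="english"),  # basic preprocessing
    MultinomialNB(),                                         # Naive Bayes classifier
)
model.fit(train_tweets, train_labels)

new_tweets = ["absolutely love the new update", "this is the worst phone"]
print(list(zip(new_tweets, model.predict(new_tweets))))
```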

  5. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Lalitha Jayaraman

    2010-01-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
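
    The underlying measurement idea can be illustrated with a small sketch: compute the Pearson correlation between ink-density readings taken from the same blanket under normal printing and after smash recovery, with high correlation read as good consistency. The density values below are invented placeholders, not the study's data.

```python
import numpy as np

# Hypothetical ink-density readings along a print for one blanket, before and
# after a smash-recovery cycle; real data would come from densitometer scans.
density_normal   = np.array([1.42, 1.40, 1.43, 1.41, 1.39, 1.42, 1.40])
density_recovery = np.array([1.40, 1.37, 1.41, 1.38, 1.35, 1.39, 1.36])

# Pearson correlation between the two series: values near 1 suggest the blanket
# reproduces the same tonal pattern after smash, i.e. good consistency.
r = np.corrcoef(density_normal, density_recovery)[0, 1]
print(f"correlation coefficient r = {r:.3f}")
```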

  6. Metrology Optical Power Budgeting in SIM Using Statistical Analysis Techniques

    Science.gov (United States)

    Kuan, Gary M

    2008-01-01

    The Space Interferometry Mission (SIM) is a space-based stellar interferometry instrument, consisting of up to three interferometers, which will be capable of micro-arcsecond resolution. Alignment knowledge of the three interferometer baselines requires a three-dimensional, 14-leg truss with each leg being monitored by an external metrology gauge. In addition, each of the three interferometers requires an internal metrology gauge to monitor the optical path length differences between the two sides. Both external and internal metrology gauges are interferometry based, operating at a wavelength of 1319 nanometers. Each gauge has fiber inputs delivering measurement and local oscillator (LO) power, split into probe-LO and reference-LO beam pairs. These beams experience power loss due to a variety of mechanisms including, but not restricted to, design efficiency, material attenuation, element misalignment, diffraction, and coupling efficiency. Since the attenuation due to these sources may degrade over time, an accounting of the range of expected attenuation is needed so that an optical power margin can be bookkept. A method of statistical optical power analysis and budgeting, based on a technique developed for deep space RF telecommunications, is described in this paper and provides a numerical confidence level for having sufficient optical power relative to mission metrology performance requirements.
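
    A toy version of such a statistical power budget is sketched below: each loss mechanism on a gauge's beam path is modelled as a random variable in dB, the losses are summed over many Monte Carlo trials, and a percentile of the resulting margin is reported. All loss terms, powers and requirement values are invented for illustration and bear no relation to SIM's actual budget.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000  # Monte Carlo trials

# Hypothetical loss terms for one metrology beam path, each in dB, modelled as
# independent random variables (nominal value plus uncertainty); numbers are made up.
losses_db = (
    rng.normal(3.0, 0.3, N)    # fiber coupling
    + rng.normal(1.5, 0.2, N)  # element misalignment
    + rng.normal(2.0, 0.4, N)  # diffraction / clipping
    + rng.normal(0.8, 0.1, N)  # material attenuation
)

input_power_dbm = 10.0   # assumed launch power
required_dbm = 1.0       # assumed minimum power for fringe tracking

margin_dbm = (input_power_dbm - losses_db) - required_dbm
print(f"99% of trials keep a margin above {np.percentile(margin_dbm, 1):.2f} dB")
```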

  7. Seismic margin analysis technique for nuclear power plant structures

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choi, In Kil

    2001-04-01

    In general, the Seismic Probabilistic Risk Assessment (SPRA) and the Seismic Margin Assessment (SMA) are used for the evaluation of the realistic seismic capacity of nuclear power plant structures. Seismic PRA is a systematic process to evaluate the seismic safety of a nuclear power plant. In our country, SPRA has been used to perform the probabilistic safety assessment for the earthquake event. SMA is a simple and cost-effective way to quantify the seismic margin of individual structural elements. This study was performed to improve the reliability of SMA results and to confirm the assessment procedure. To achieve this goal, a review of the current status of the techniques and procedures was performed. Two methodologies, CDFM (Conservative Deterministic Failure Margin) sponsored by NRC and FA (Fragility Analysis) sponsored by EPRI, were developed for the seismic margin review of NPP structures. The FA method was originally developed for Seismic PRA. The CDFM approach is more amenable to use by experienced design engineers including utility staff design engineers. In this study, a detailed review of the CDFM and FA procedures was performed.

  8. An evaluation of wind turbine blade cross section analysis techniques.

    Energy Technology Data Exchange (ETDEWEB)

    Paquette, Joshua A.; Griffith, Daniel Todd; Laird, Daniel L.; Resor, Brian Ray

    2010-03-01

    The blades of a modern wind turbine are critical components central to capturing and transmitting most of the load experienced by the system. They are complex structural items composed of many layers of fiber and resin composite material and typically, one or more shear webs. Large turbine blades being developed today are beyond the point of effective trial-and-error design of the past and design for reliability is always extremely important. Section analysis tools are used to reduce the three-dimensional continuum blade structure to a simpler beam representation for use in system response calculations to support full system design and certification. One model simplification approach is to analyze the two-dimensional blade cross sections to determine the properties for the beam. Another technique is to determine beam properties using static deflections of a full three-dimensional finite element model of a blade. This paper provides insight into discrepancies observed in outputs from each approach. Simple two-dimensional geometries and three-dimensional blade models are analyzed in this investigation. Finally, a subset of computational and experimental section properties for a full turbine blade are compared.

  9. Analysis of Consistency of Printing Blankets using Correlation Technique

    Directory of Open Access Journals (Sweden)

    Balaraman Kumar

    2010-06-01

    Full Text Available This paper presents the application of an analytical tool to quantify material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at determining objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages; 1. under normal printing conditions and 2. on recovery after smash. The experiment devised exhibits a variation in tone reproduction properties of each blanket signifying the levels of inconsistency also in thickness direction. Correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over three torque levels under normal printing conditions. However on smash the recovery of blanket and its consistency was a function of manufacturing and torque levels. This study attempts to provide a new metrics for failure analysis of offset printing blankets. It also underscores the need for optimising the torque for blankets from different manufacturers.

  10. Driving forces of change in environmental indicators an analysis based on divisia index decomposition techniques

    CERN Document Server

    González, Paula Fernández; Presno, Mª José

    2014-01-01

    This book addresses several index decomposition analysis methods to assess progress made by EU countries in the last decade in relation to energy and climate change concerns. Several applications of these techniques are carried out in order to decompose changes in both energy and environmental aggregates. In addition to this, a new methodology based on classical spline approximations is introduced, which provides useful mathematical and statistical properties. Once a suitable set of determinant factors has been identified, these decomposition methods allow the researcher to quantify the respec
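
    To make the idea of index decomposition concrete, the sketch below applies the additive log-mean Divisia index (LMDI) form to a made-up two-factor identity (aggregate = activity x intensity); it shows how the total change splits exactly into an activity effect and an intensity effect. The numbers are illustrative, and the book's own applications use richer identities and country-level data.

```python
import numpy as np

def log_mean(a: float, b: float) -> float:
    """Logarithmic mean used by the LMDI (log-mean Divisia index) method."""
    return a if np.isclose(a, b) else (a - b) / (np.log(a) - np.log(b))

# Hypothetical two-factor identity: emissions = activity * intensity.
activity_0, intensity_0 = 100.0, 0.50   # base year
activity_T, intensity_T = 120.0, 0.42   # final year

E0, ET = activity_0 * intensity_0, activity_T * intensity_T
L = log_mean(ET, E0)

effect_activity = L * np.log(activity_T / activity_0)
effect_intensity = L * np.log(intensity_T / intensity_0)

print(f"total change          : {ET - E0:+.2f}")
print(f"activity effect       : {effect_activity:+.2f}")
print(f"intensity effect      : {effect_intensity:+.2f}")
print(f"sum of effects (check): {effect_activity + effect_intensity:+.2f}")
```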

  11. Applications of surface analysis techniques to photovoltaic research: Grain and grain boundary studies

    Science.gov (United States)

    Kazmerski, L. L.

    Complementary surface analysis techniques (AES, SIMS, XPS) are applied to photovoltaic devices in order to assess the limiting factors of grain and grain boundary chemistry to the performance of polycrystalline solar cells. Results of these compositional and chemical studies are directly correlated with electrical measurements (EBIC) and with resulting device performance. Examples of grain boundary passivation in polycrystalline Si and GaAs solar cells are cited. The quality of the intragrain material used in these devices is shown to be equally important to the grain boundary activity in determining overall photovoltaic performance.

  12. An Exploratory Factor Analysis Examining Traits, Perceived Fit and Job Satisfaction in Employed College Graduates

    Science.gov (United States)

    Brandon, John R.

    2011-01-01

    This study is an analysis of 24 variables associated with employee attitudes, behaviors and outcomes. A total of 140 college graduates participated in the study. Utilizing exploratory factor analysis (EFA) techniques, the research examined relationships among the following variables: perceived fit, job satisfaction, cognitive ability, vocational…

  13. Multivariate analysis of remote LIBS spectra using partial least squares, principal component analysis, and related techniques

    Energy Technology Data Exchange (ETDEWEB)

    Clegg, Samuel M [Los Alamos National Laboratory; Barefield, James E [Los Alamos National Laboratory; Wiens, Roger C [Los Alamos National Laboratory; Sklute, Elizabeth [MT HOLYOKE COLLEGE; Dyare, Melinda D [MT HOLYOKE COLLEGE

    2008-01-01

    Quantitative analysis with LIBS traditionally employs calibration curves that are complicated by the chemical matrix effects. These chemical matrix effects influence the LIBS plasma and the ratio of elemental composition to elemental emission line intensity. Consequently, LIBS calibration typically requires a priori knowledge of the unknown, in order for a series of calibration standards similar to the unknown to be employed. In this paper, three new Multivariate Analysis (MVA) techniques are employed to analyze the LIBS spectra of 18 disparate igneous and highly-metamorphosed rock samples. Partial Least Squares (PLS) analysis is used to generate a calibration model from which unknown samples can be analyzed. Principal Components Analysis (PCA) and Soft Independent Modeling of Class Analogy (SIMCA) are employed to generate a model and predict the rock type of the samples. These MVA techniques appear to exploit the matrix effects associated with the chemistries of these 18 samples.
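
    A compact illustration of the two roles described here (PLS for calibration, PCA for class structure) is given below using scikit-learn; the spectra and composition matrices are random stand-ins for the 18 rock-sample LIBS spectra, so the output only shows the workflow, not meaningful chemistry.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.decomposition import PCA

# Hypothetical stand-ins: 18 averaged LIBS spectra (rows) x 2048 spectral channels,
# with a composition matrix (e.g. wt% of a few oxides) per sample.
rng = np.random.default_rng(7)
spectra = np.abs(rng.normal(size=(18, 2048)))
compositions = np.abs(rng.normal(size=(18, 4)))

# PLS builds a calibration model: predict composition from a spectrum.
pls = PLSRegression(n_components=6).fit(spectra, compositions)
predicted = pls.predict(spectra[:1])          # composition estimate for one "unknown"

# PCA projects spectra into a low-dimensional space where rock types can be grouped.
scores = PCA(n_components=3).fit_transform(spectra)
print(predicted.round(2), scores.shape)
```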

  14. The Occupational Success of the Retarded: Critical Factors, Predictive Tests and Remedial Techniques.

    Science.gov (United States)

    Laradon Hall Occupational Center, Denver, CO.

    A job success rating scale was developed for use with 60 mentally retarded young adults (IQs under 80, ages from 18 to 30), their parents, and employers. Interviews and job histories were analyzed; an experimental test battery measuring 101 aptitude and personality variables was administered. By factor analysis and statistical procedures, 17 tests…

  15. Emotional Freedom Technique (EFT Effects on Psychoimmunological Factors of Chemically Pulmonary Injured Veterans.

    Directory of Open Access Journals (Sweden)

    Abdolreza Babamahmoodi

    2015-02-01

    Full Text Available Emotional Freedom Technique (EFT), as a new therapeutic technique in energy psychology, has positive effects on psychological and physiological symptoms, and quality of life. In this research we studied the effect of this treatment on immunological factors. This study tested whether 8-week group sessions of EFT (compared to a wait-list control group), with emphasis on patients' respiratory, psychological and immunological problems in chemically pulmonary injured veterans (N=28), can affect immunological and psychological factors. Mixed effect linear models indicated that EFT improved mental health (F=79.24, p=0) and health-related quality of life (F=13.89, p=0.001), decreased somatic symptoms (F=5.81, p=0.02), anxiety/insomnia (F=24.03, p<0.001), social dysfunction (F=21.59, p<0.001), and the frequency and severity of respiratory symptoms (F=20.38, p<0.001), and increased lymphocyte proliferation with the nonspecific mitogens Concanavalin A (Con A) (F=14.32, p=0.001) and Phytohemagglutinin (PHA) (F=12.35, p=0.002), and peripheral blood IL-17 (F=9.11, p=0.006). This study provides an initial indication that EFT may be a new therapeutic approach for improving psychological and immunological factors.

  16. Is There an Economical Running Technique? A Review of Modifiable Biomechanical Factors Affecting Running Economy.

    Science.gov (United States)

    Moore, Isabel S

    2016-06-01

    Running economy (RE) has a strong relationship with running performance, and modifiable running biomechanics are a determining factor of RE. The purposes of this review were to (1) examine the intrinsic and extrinsic modifiable biomechanical factors affecting RE; (2) assess training-induced changes in RE and running biomechanics; (3) evaluate whether an economical running technique can be recommended and; (4) discuss potential areas for future research. Based on current evidence, the intrinsic factors that appeared beneficial for RE were using a preferred stride length range, which allows for stride length deviations up to 3 % shorter than preferred stride length; lower vertical oscillation; greater leg stiffness; low lower limb moment of inertia; less leg extension at toe-off; larger stride angles; alignment of the ground reaction force and leg axis during propulsion; maintaining arm swing; low thigh antagonist-agonist muscular coactivation; and low activation of lower limb muscles during propulsion. Extrinsic factors associated with a better RE were a firm, compliant shoe-surface interaction and being barefoot or wearing lightweight shoes. Several other modifiable biomechanical factors presented inconsistent relationships with RE. Running biomechanics during ground contact appeared to play an important role, specifically those during propulsion. Therefore, this phase has the strongest direct links with RE. Recurring methodological problems exist within the literature, such as cross-comparisons, assessing variables in isolation, and acute to short-term interventions. Therefore, recommending a general economical running technique should be approached with caution. Future work should focus on interdisciplinary longitudinal investigations combining RE, kinematics, kinetics, and neuromuscular and anatomical aspects, as well as applying a synergistic approach to understanding the role of kinetics.

  17. Fluorometric Discrimination Technique of Phytoplankton Population Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Shanshan; SU Rongguo; DUAN Yali; ZHANG Cui; SONG Zhijie; WANG Xiulin

    2012-01-01

    The discrete excitation-emission-matrix fluorescence spectra (EEMS) at 12 excitation wavelengths (400, 430, 450, 460, 470, 490, 500, 510, 525, 550, 570, and 590 nm) and emission wavelengths ranging from 600-750 nm were determined for 43 phytoplankton species. A two-rank fluorescence spectra database was established by wavelet analysis and a fluorometric discrimination technique for determining phytoplankton population was developed. For laboratory-simulated mixed samples: for the samples mixed from 43 algal species (the algae of one division accounted for 25%, 50%, 75%, 85%, and 100% of the gross biomass, respectively), the average discrimination rates at the level of division were 65.0%, 87.5%, 98.6%, 99.0%, and 99.1%, with average relative contents of 18.9%, 44.5%, 68.9%, 73.4%, and 82.9%, respectively; for the samples mixed from 32 red tide algal species (the dominant species accounted for 60%, 70%, 80%, 90%, and 100% of the gross biomass, respectively), the average correct discrimination rates of the dominant species at the level of genus were 63.3%, 74.2%, 78.8%, 83.4%, and 79.4%, respectively. For the 81 laboratory mixed samples with the dominant species accounting for 75% of the gross biomass (chlorophyll), the discrimination rates of the dominant species were 95.1% and 72.8% at the level of division and genus, respectively. For the 12 samples collected from the mesocosm experiment in Maidao Bay of Qingdao in August 2007, the dominant species of 11 of the samples were recognized at the division level, and the dominant species of four of the five samples in which the dominant species accounted for more than 80% of the gross biomass were discriminated at the genus level; for the 12 samples obtained from Jiaozhou Bay in August 2007, the dominant species of all 12 samples were recognized at the division level. The technique can be directly applied to fluorescence spectrophotometers and to the development of an in situ algae fluorescence auto-analyzer for
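
    One simplified way to mimic the workflow (compress each emission spectrum into wavelet coefficients, store them as reference signatures, and assign an unknown sample to the most similar reference) is sketched below with PyWavelets; the wavelet choice, decomposition level and similarity measure are assumptions, not the parameters used in the study.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_signature(spectrum: np.ndarray, wavelet: str = "db4", level: int = 3) -> np.ndarray:
    """Compress an emission spectrum into a normalized vector of wavelet
    approximation coefficients (a crude stand-in for the spectral database entries)."""
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    approx = coeffs[0]
    return approx / np.linalg.norm(approx)

def classify(sample: np.ndarray, reference_db: dict) -> str:
    """Assign the sample to the reference taxon with the most similar signature."""
    sig = wavelet_signature(sample)
    return max(reference_db, key=lambda name: float(np.dot(sig, reference_db[name])))
```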

  18. A computerised morphometric technique for the analysis of intimal hyperplasia.

    OpenAIRE

    Tennant, M; McGeachie, J K

    1991-01-01

    The aim of this study was to design, develop and employ a method for the acquisition of a significant data base of thickness measurements. The integration of standard histological techniques (step serial sectioning), modern computer technology and a personally developed software package (specifically designed for thickness measurement) produced a novel technique suitable for the task. The technique allowed the elucidation of a larger data set from tissue samples. Thus a detailed and accurate ...

  19. COMPARATIVE ANALYSIS OF SATELLITE IMAGE PRE-PROCESSING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    T. Sree Sharmila

    2013-01-01

    Satellite images are corrupted by noise during acquisition and transmission. Removing the noise by attenuating high-frequency image components also removes some important details. In order to retain the useful information and improve the visual appearance, effective denoising and resolution enhancement techniques are required. In this research, a Hybrid Directional Lifting (HDL) technique is proposed to retain the important details of the image and improve the visual appearance. A Discrete Wavelet Transform (DWT) based interpolation technique is developed for enhancing the resolution of the denoised image. The performance of the proposed techniques is tested on Land Remote-Sensing Satellite (LANDSAT) images, using the quantitative performance measure Peak Signal to Noise Ratio (PSNR) and computation time to show the significance of the proposed techniques. The PSNR of the HDL technique increases by 1.02 dB compared to the standard denoising technique, and the DWT based interpolation technique increases it by 3.94 dB. The experimental results reveal that the newly developed image denoising and resolution enhancement techniques improve the image visual quality with rich textures.
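
    For reference, the PSNR figure of merit quoted in this record is conventionally computed as below; this is a generic sketch (8-bit images assumed), not code from the cited work:

    ```python
    import numpy as np

    def psnr(reference: np.ndarray, test: np.ndarray, max_value: float = 255.0) -> float:
        """Peak Signal-to-Noise Ratio in dB between a reference and a processed image."""
        mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
        if mse == 0:
            return float("inf")  # identical images
        return 10.0 * np.log10((max_value ** 2) / mse)

    # Example: a denoised image should score a higher PSNR against the clean
    # reference than the noisy input does.
    # gain_db = psnr(clean, denoised) - psnr(clean, noisy)
    ```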

  20. A new technique for analysing interacting factors affecting biodiversity patterns: crossed-DPCoA.

    Science.gov (United States)

    Pavoine, Sandrine; Blondel, Jacques; Dufour, Anne B; Gasc, Amandine; Bonsall, Michael B

    2013-01-01

    We developed an approach for analysing the effects of two crossed factors A and B on the functional, taxonomic or phylogenetic composition of communities. The methodology, known as crossed-DPCoA, defines a space where species, communities and the levels of the two factors are organised as a set of points. In this space, the Euclidean distance between two species-specific points is a measure of the (functional, taxonomic or phylogenetic) dissimilarity. The communities are positioned at the centroid of their constitutive species, and the levels of the two factors at the centroid of the communities associated with them. We develop two versions of crossed-DPCoA: the first moves the levels of factor B to the centre of the space and analyses the axes of highest variance in the coordinates of the levels of factor A. It is related to previous ordination approaches such as partial canonical correspondence analysis and partial non-symmetrical correspondence analysis. The second version projects all points on the orthogonal complement of the space generated by the principal axes of factor B. This second version should be preferred when there is an a priori suspicion that factors A and B are associated. We apply the two versions of crossed-DPCoA to analyse the phylogenetic composition of Central European and Mediterranean bird communities. Applying crossed-DPCoA to bird communities supports the hypothesis that allopatric speciation processes during the Quaternary occurred in open and patchily distributed landscapes, while the lack of geographic barriers to dispersal among forest habitats may explain the homogeneity of forest bird communities over the whole western Palaearctic. Generalizing several ordination analyses commonly used in ecology, crossed-DPCoA provides an approach for analysing the effects of crossed factors on functional, taxonomic and phylogenetic diversity, environmental and geographic structure of species niches, and more broadly the role of genetics on

  1. A new technique for analysing interacting factors affecting biodiversity patterns: crossed-DPCoA.

    Directory of Open Access Journals (Sweden)

    Sandrine Pavoine

    We developed an approach for analysing the effects of two crossed factors A and B on the functional, taxonomic or phylogenetic composition of communities. The methodology, known as crossed-DPCoA, defines a space where species, communities and the levels of the two factors are organised as a set of points. In this space, the Euclidean distance between two species-specific points is a measure of the (functional, taxonomic or phylogenetic) dissimilarity. The communities are positioned at the centroid of their constitutive species, and the levels of the two factors at the centroid of the communities associated with them. We develop two versions of crossed-DPCoA: the first moves the levels of factor B to the centre of the space and analyses the axes of highest variance in the coordinates of the levels of factor A. It is related to previous ordination approaches such as partial canonical correspondence analysis and partial non-symmetrical correspondence analysis. The second version projects all points on the orthogonal complement of the space generated by the principal axes of factor B. This second version should be preferred when there is an a priori suspicion that factors A and B are associated. We apply the two versions of crossed-DPCoA to analyse the phylogenetic composition of Central European and Mediterranean bird communities. Applying crossed-DPCoA to bird communities supports the hypothesis that allopatric speciation processes during the Quaternary occurred in open and patchily distributed landscapes, while the lack of geographic barriers to dispersal among forest habitats may explain the homogeneity of forest bird communities over the whole western Palaearctic. Generalizing several ordination analyses commonly used in ecology, crossed-DPCoA provides an approach for analysing the effects of crossed factors on functional, taxonomic and phylogenetic diversity, environmental and geographic structure of species niches, and more broadly the role of
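
    The fragment below is only a schematic illustration of the centroid logic described in these two records: species are assumed to already sit in a Euclidean space (for example from a principal coordinate analysis of the dissimilarities), communities sit at the abundance-weighted centroid of their species, factor levels at the centroid of their communities, and "version 1" centres the communities on their factor-B level before extracting the main axes of the factor-A levels. It is not the published crossed-DPCoA algorithm, and the weighting conventions are simplified assumptions.

    ```python
    import numpy as np

    def community_points(species_coords, abundances):
        """Communities sit at the abundance-weighted centroid of their species."""
        w = abundances / abundances.sum(axis=1, keepdims=True)
        return w @ species_coords

    def level_points(points, labels):
        """Each factor level sits at the centroid of the communities carrying that level."""
        return {lev: points[labels == lev].mean(axis=0) for lev in np.unique(labels)}

    def version1_axes(points, labels_a, labels_b, k=2):
        """Centre communities on their factor-B level, then take the axes of highest
        variance among the factor-A level centroids (plain PCA via SVD)."""
        b_centroids = level_points(points, labels_b)
        centred = points - np.vstack([b_centroids[b] for b in labels_b])
        a_centroids = np.vstack(list(level_points(centred, labels_a).values()))
        a_centroids -= a_centroids.mean(axis=0)
        _, _, vt = np.linalg.svd(a_centroids, full_matrices=False)
        return vt[:k]  # principal axes describing the effect of factor A
    ```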

  2. Reservoir stimulation techniques to minimize skin factor of Longwangmiao Fm gas reservoirs in the Sichuan Basin

    Directory of Open Access Journals (Sweden)

    Guo Jianchun

    2014-10-01

    The Lower Cambrian Longwangmiao Fm carbonatite gas reservoirs in the Leshan-Longnüsi Paleouplift in the Sichuan Basin feature strong heterogeneity, well-developed fractures and caverns, and a high content of H2S. These reservoirs are prone to damage caused by drilling fluid invasion or improper well completion, so minimizing the reservoir skin factor is key to achieving high oil and gas yields in this study area. Based on the geological characteristics of the Longwangmiao reservoirs, the binomial productivity equation was applied to demonstrate the feasibility and scientific basis of minimizing the skin factor. Given the current status of reservoir stimulation, the overall skin factors attributable to reservoir damage caused by drilling fluid invasion, improper drilling and completion modes, etc. were analyzed, which shows there is still potential for skin factor reduction. Analysis of reservoir damage factors indicates that the main skin factor of the Longwangmiao Fm reservoirs consists of that caused by drilling fluid and by improper completion modes. Along with the minimization of the skin factor caused by drilling and improper completion, a fracture-network acidizing process to achieve "non-radial & network-fracture" plug removal by making good use of natural fractures was proposed according to the characteristics of the Longwangmiao Fm carbonatite reservoirs.
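
    The binomial productivity (deliverability) equation referred to above is not reproduced in the record; a commonly used textbook form, stated here only as an illustrative assumption, relates the pressure-squared drawdown to the gas rate through a laminar coefficient A (which contains the skin factor S) and a turbulent coefficient B:

    $$p_R^2 - p_{wf}^2 = A\,q_g + B\,q_g^2, \qquad A \propto \frac{\mu Z T}{K h}\left(\ln\frac{r_e}{r_w} + S\right)$$

    Under this form, any reduction of the total skin S lowers A and therefore raises the deliverable rate for a given drawdown, which is the rationale for the skin-minimization strategy described in the abstract.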

  3. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices. This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  4. Advanced patch-clamp techniques and single-channel analysis

    NARCIS (Netherlands)

    Biskup, B; Elzenga, JTM; Homann, U; Thiel, G; Wissing, F; Maathuis, FJM

    1999-01-01

    Much of our knowledge of ion-transport mechanisms in plant cell membranes comes from experiments using voltage-clamp. This technique allows the measurement of ionic currents across the membrane, whilst the voltage is held under experimental control. The patch-clamp technique was developed to study t

  5. Character of Carbon Emission of Logistics Industry in China and Its Affecting Factors Decomposition Analysis: Based on the LMDI Technique

    Institute of Scientific and Technical Information of China (English)

    马越越; 王维国

    2013-01-01

    This paper uses the LMDI technique to establish a factor decomposition model of per capita carbon emissions in China's logistics industry, based on an analysis of the features of carbon emissions in the industry. The model is used to quantitatively analyze the impact of six factors (energy structure, energy efficiency, transportation mode, logistics development, economic growth and population) on per capita carbon emissions in China's logistics industry from 1991 to 2010. The results illustrate that economic growth is the most important factor pulling the growth of carbon emissions in the logistics industry, showing an exponential growth trend during this period. Transportation mode also plays a significant role in promoting carbon emission growth. Although energy structure and energy efficiency show a pulling effect, the effect is weak, while the logistics development factor exhibits a significant inhibitory effect on the per capita carbon emissions of the logistics industry. Therefore, the scientific and technological level of the logistics industry should be vigorously promoted in order to further develop the inhibitory effect of the logistics development factor on carbon emissions; at the same time, the logistics and transport system should be optimized towards a system that relies mainly on railways, waterways and pipelines, with road and air transport as a supplement.
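
    The record does not spell out the decomposition formula; as background, the standard additive LMDI-I form (an assumption about which LMDI variant was used) decomposes the change in emissions C between a base year 0 and year T over sub-sectors i and driving factors x_1, ..., x_n as:

    $$\Delta C = C^T - C^0 = \sum_k \Delta C_{x_k}, \qquad \Delta C_{x_k} = \sum_i L\!\left(C_i^T, C_i^0\right)\ln\!\frac{x_{k,i}^T}{x_{k,i}^0}, \qquad L(a,b) = \frac{a-b}{\ln a - \ln b}$$

    Each term ΔC_{x_k} is then read as the contribution of one driver (energy structure, energy efficiency, transport mode, logistics development, economic growth or population) to the total emission change.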

  6. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  8. Qualitative analysis of Orzooiyeh plain groundwater resources using GIS techniques

    Directory of Open Access Journals (Sweden)

    Mohsen Pourkhosravani

    2016-09-01

    Background: Unsustainable development of human societies, especially in arid and semi-arid areas, is one of the most important environmental hazards and requires the preservation of groundwater resources and the continuous study of qualitative and quantitative changes through sampling. Accordingly, this research assesses and analyzes the spatial variation of quantitative and qualitative indicators of the Orzooiyeh groundwater resources in Kerman province using a geographic information system (GIS). Methods: This study surveys the spatial variation of these indices using GIS techniques in addition to evaluating the quality of the groundwater resources in the study area. For this purpose, quality indicators and statistics such as electrical conductivity, pH, sulphate, residual total dissolved solids (TDS), sodium, calcium, magnesium and chlorine from 28 selected wells sampled by the Kerman regional water organization were used. Results: A comparison of the present results with the standard of Industrial Research of Iran and also the World Health Organization (WHO) shows that, among the measured indices, electrical conductivity and TDS in the chosen samples are higher than the national standard of Iran and the WHO guideline, while the other indices are more favourable. Conclusion: The electrical conductivity index of 64.3% of the samples was at an optimal level, 71.4% were within the Iranian national standard limit, and only 3.6% met the WHO standard. The TDS index did not reach the national standard in any of the samples, and in 82.1% of the samples this index was at the national standard limit; as per this index, only 32.1% of the samples met the WHO standard.

  9. Technique Errors and Limiting Factors in Laser Ranging to Geodetic Satellites

    Science.gov (United States)

    Appleby, G. M.; Luceri, V.; Mueller, H.; Noll, C. E.; Otsubo, T.; Wilkinson, M.

    2012-12-01

    The tracking stations of the International Laser Ranging Service (ILRS) global network provide to the Data Centres a steady stream of very precise laser range normal points to the primary geodetic spherical satellites LAGEOS (-1 and -2) and Etalon (-1 and -2). Analysis of these observations to determine instantaneous site coordinates and Earth orientation parameters provides a major contribution to ongoing international efforts to define a precise terrestrial reference frame, which itself supports research into geophysical processes at the few mm level of precision. For example, the latest realization of the reference frame, ITRF2008, used weekly laser range solutions from 1983 to 2009, the origin of the Frame being determined solely by the SLR technique. However, in the ITRF2008 publication, Altamimi et al (2011, Journal of Geodesy) point out that further improvement in the ITRF is partly dependent upon improving an understanding of sources of technique error. In this study we look at SLR station hardware configuration that has been subject to major improvements over the last four decades, at models that strive to provide accurate translations of the laser range observations to the centres of mass of the small geodetic satellites and at the considerable body of work that has been carried out via orbital analyses to determine range corrections for some of the tracking stations. Through this study, with specific examples, we start to put together an inventory of system-dependent technique errors that will be important information for SLR re-analysis towards the next realization of the ITRF.

  10. Water quality analysis of the Rapur area, Andhra Pradesh, South India using multivariate techniques

    Science.gov (United States)

    Nagaraju, A.; Sreedhar, Y.; Thejaswi, A.; Sayadi, Mohammad Hossein

    2016-11-01

    Groundwater samples from the Rapur area were collected from different sites to evaluate the major ion chemistry. The large amount of data can lead to difficulties in the integration, interpretation, and representation of the results. Two multivariate statistical methods, hierarchical cluster analysis (HCA) and factor analysis (FA), were applied to evaluate their usefulness in classifying and identifying the geochemical processes controlling groundwater geochemistry. Four statistically significant clusters were obtained from 30 sampling stations. This resulted in two important clusters, viz. cluster 1 (pH, Si, CO3, Mg, SO4, Ca, K, HCO3, alkalinity, Na, Na + K, Cl, and hardness) and cluster 2 (EC and TDS), which are released to the study area from different sources. The application of different multivariate statistical techniques, such as principal component analysis (PCA), assists in the interpretation of complex data matrices for a better understanding of the water quality of a study area. From PCA, it is clear that the first factor (factor 1), accounting for 36.2% of the total variance, had high positive loadings on EC, Mg, Cl, TDS, and hardness. Based on the PCA scores, four significant cluster groups of sampling locations were detected on the basis of the similarity of their water quality.
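
    As a rough illustration of the component-scores-then-clustering workflow described above (not the authors' exact procedure; the variable list, the number of components and the choice of Ward linkage with four clusters are assumptions taken from the abstract), a minimal Python sketch might look like this:

    ```python
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster

    # df: rows = sampling wells, columns = hydrochemical variables (EC, pH, TDS, Mg, Cl, ...)
    def water_quality_groups(df: pd.DataFrame, n_components: int = 4, n_clusters: int = 4):
        z = StandardScaler().fit_transform(df)            # remove scale differences between ions
        pca = PCA(n_components=n_components).fit(z)
        scores = pca.transform(z)                         # component scores per well
        tree = linkage(scores, method="ward")             # HCA on the scores
        clusters = fcluster(tree, t=n_clusters, criterion="maxclust")
        return pca.explained_variance_ratio_, pca.components_, clusters
    ```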

  11. Finding of Correction Factor and Dimensional Error in Bio-AM Model by FDM Technique

    Science.gov (United States)

    Manmadhachary, Aiamunoori; Ravi Kumar, Yennam; Krishnanand, Lanka

    2016-06-01

    Additive Manufacturing (AM) is a rapid manufacturing process in which input data can be provided from various sources such as 3-Dimensional (3D) Computer Aided Design (CAD), Computed Tomography (CT), Magnetic Resonance Imaging (MRI) and 3D scanner data. Biomedical Additive Manufacturing (Bio-AM) models can be manufactured from the CT/MRI data. The Bio-AM model gives a better lead on the preplanning of oral and maxillofacial surgery; however, manufacturing an accurate Bio-AM model is one of the unsolved problems. The current paper quantifies the error between the Standard Triangle Language (STL) model and the Bio-AM model of a dry mandible and establishes a correction factor for Bio-AM models built with the Fused Deposition Modelling (FDM) technique. In the present work, dry mandible CT images were acquired with a CT scanner and converted into a 3D CAD model in the form of an STL model. The data were then sent to an FDM machine for fabrication of the Bio-AM model. The difference between the Bio-AM and STL model dimensions is considered the dimensional error, and the ratio of the STL to Bio-AM model dimensions is considered the correction factor. This correction factor helps to fabricate AM models with accurate dimensions of the patient anatomy. These true-dimensional Bio-AM models increase the safety and accuracy in the pre-planning of oral and maxillofacial surgery. The correction factor for the Dimension SST 768 FDM AM machine is 1.003 and the dimensional error is limited to 0.3 %.
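
    Restating the two quantities defined in the abstract in symbols (D denotes a given linear dimension measured on the STL model and on the printed Bio-AM part):

    $$\mathrm{CF} = \frac{D_{\mathrm{STL}}}{D_{\mathrm{Bio\text{-}AM}}}, \qquad \varepsilon\,(\%) = \frac{\lvert D_{\mathrm{Bio\text{-}AM}} - D_{\mathrm{STL}}\rvert}{D_{\mathrm{STL}}} \times 100$$

    With CF = 1.003 the printed part comes out roughly 0.3 % undersized, so pre-scaling the STL geometry by the correction factor before printing should compensate for the shrinkage; this interpretation is a reading of the reported numbers rather than an explicit statement in the record.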

  12. The Analysis of PM2.5 Source Apportionment Technique's Competitiveness in China

    Science.gov (United States)

    Qian, K.; Deng, L.; An, Y. B.; Liu, S. Y.; Hao, H. Z.

    Nowadays, people in many countries pay more attention to PM2.5. PM2.5 is particulate matter with a diameter of less than 2.5 μm that causes great damage to the environment and to public health. Source apportionment techniques originated in studies of atmospheric particulate matter and use two mathematical models: the diffusion model, which studies the source of pollution, and the receptor model, which studies the pollution of an area. In this study, the competitiveness of similar technologies in various countries is analyzed by using microscopy to examine shape characteristics, the Enrichment Factor (EF) method, the Factor Analysis (FA) method and the EPA-CMB8.2 model, combined with the results of improved source analysis technology and the orthogonal matrix decomposition model.
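
    For context, the Enrichment Factor mentioned in this abstract is conventionally computed against a crustal reference element (commonly Al or Fe; the exact reference element used in the cited work is not stated):

    $$\mathrm{EF}_x = \frac{\left(C_x / C_{\mathrm{ref}}\right)_{\mathrm{sample}}}{\left(C_x / C_{\mathrm{ref}}\right)_{\mathrm{crust}}}$$

    Values close to 1 indicate a predominantly crustal origin for element x, while values much greater than 1 point to an anthropogenic contribution.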

  13. Application of Factor Analysis on the Financial Ratios of Indian Cement Industry and Validation of the Results by Cluster Analysis

    Science.gov (United States)

    De, Anupam; Bandyopadhyay, Gautam; Chakraborty, B. N.

    2010-10-01

    Financial ratio analysis is an important and commonly used tool in analyzing the financial health of a firm. Quite a large number of financial ratios, which can be categorized in different groups, are used for this analysis. However, to reduce the number of ratios to be used for financial analysis and to regroup them on the basis of empirical evidence, the Factor Analysis technique has been used successfully by different researchers during the last three decades. In this study, Factor Analysis has been applied to the audited financial data of Indian cement companies for a period of 10 years. The sample companies are listed on the Indian stock exchanges (BSE and NSE). Factor Analysis, conducted over 44 variables (financial ratios) grouped in 7 categories, resulted in 11 underlying categories (factors). Each factor is named in an appropriate manner considering the factor loadings and constituent variables (ratios). Representative ratios are identified for each such factor. To validate the results of the Factor Analysis and to reach a final conclusion regarding the representative ratios, Cluster Analysis was performed.
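
    A compact sketch of this reduce-then-validate workflow is given below. It is illustrative only: the study does not name its software, rotation or clustering algorithm, so the varimax rotation, k-means step and cluster count are assumptions, while the 44 ratios and 11 factors come from the abstract.

    ```python
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.decomposition import FactorAnalysis
    from sklearn.cluster import KMeans

    # ratios: rows = company-years, columns = the 44 financial ratios (names are placeholders)
    def reduce_and_validate(ratios: pd.DataFrame, n_factors: int = 11, n_clusters: int = 4):
        z = StandardScaler().fit_transform(ratios)                        # put ratios on a common scale
        fa = FactorAnalysis(n_components=n_factors, rotation="varimax").fit(z)
        loadings = pd.DataFrame(fa.components_.T, index=ratios.columns)   # inspect to name each factor
        scores = fa.transform(z)                                          # factor scores per company-year
        clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(scores)
        return loadings, scores, clusters                                 # clusters give a validation view
    ```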

  14. Image analysis techniques associated with automatic data base generation.

    Science.gov (United States)

    Bond, A. D.; Ramapriyan, H. K.; Atkinson, R. J.; Hodges, B. C.; Thomas, D. T.

    1973-01-01

    This paper considers some basic problems relating to automatic data base generation from imagery, the primary emphasis being on fast and efficient automatic extraction of relevant pictorial information. Among the techniques discussed are recursive implementations of some particular types of filters which are much faster than FFT implementations, a 'sequential similarity detection' technique of implementing matched filters, and sequential linear classification of multispectral imagery. Several applications of the above techniques are presented including enhancement of underwater, aerial and radiographic imagery, detection and reconstruction of particular types of features in images, automatic picture registration and classification of multiband aerial photographs to generate thematic land use maps.

  15. In Vivo Imaging Techniques: A New Era for Histochemical Analysis

    Science.gov (United States)

    Busato, A.; Feruglio, P. Fumene; Parnigotto, P.P.; Marzola, P.; Sbarbati, A.

    2016-01-01

    In vivo imaging techniques can be integrated with classical histochemistry to create an actual histochemistry of water. In particular, Magnetic Resonance Imaging (MRI), an imaging technique primarily used as a diagnostic tool in clinical/preclinical research, has excellent anatomical resolution, unlimited penetration depth and intrinsic soft tissue contrast. Thanks to technological development, MRI is capable of providing not only morphological but also, more interestingly, functional, biophysical and molecular information. In this paper we describe the main features of several advanced imaging techniques, such as MRI microscopy, Magnetic Resonance Spectroscopy, functional MRI, Diffusion Tensor Imaging and contrast-enhanced MRI, as a useful support to classical histochemistry. PMID:28076937

  16. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    of the model stays high at longer horizons. The estimated factors are strongly statistically significant according to a bootstrap resampling method which takes into account that the factors are estimated regressors. The simple three-factor model also contains substantial out-of-sample predictive power...

  17. A Second Generation Nonlinear Factor Analysis.

    Science.gov (United States)

    Etezadi-Amoli, Jamshid; McDonald, Roderick P.

    1983-01-01

    Nonlinear common factor models with polynomial regression functions, including interaction terms, are fitted by simultaneously estimating the factor loadings and common factor scores, using maximum likelihood and least squares methods. A Monte Carlo study gives support to a conjecture about the form of the distribution of the likelihood ratio…

  18. Multidimensional scaling technique for analysis of magnetic storms at Indian observatories

    Indian Academy of Sciences (India)

    M Sridharan; A M S Ramasamy

    2002-12-01

    Multidimensional scaling is a powerful technique for the analysis of data. The latitudinal dependence of geomagnetic field variation in the horizontal component (H) during magnetic storms is analysed in this paper by employing this technique.

  19. Implementation of numerical simulation techniques in analysis of the accidents in complex technological systems

    Energy Technology Data Exchange (ETDEWEB)

    Klishin, G.S.; Seleznev, V.E.; Aleoshin, V.V. [RFNC-VNIIEF (Russian Federation)

    1997-12-31

    Gas industry enterprises such as main pipelines, compressor gas transfer stations and gas extracting complexes belong to the energy intensive industry. Accidents there can result in catastrophes and great social, environmental and economic losses. Annually, according to official data, several dozen large accidents take place on pipelines in the USA and Russia. That is why prevention of accidents, analysis of the mechanisms of their development and prediction of their possible consequences are acute and important tasks nowadays. The causes of accidents are usually of a complicated character and can be presented as a complex combination of natural, technical and human factors. Mathematical and computer simulations are safe, rather effective and comparatively inexpensive methods of accident analysis. They make it possible to analyze different mechanisms of failure occurrence and development, to assess the consequences and to give recommendations to prevent failures. Besides investigation of failure cases, numerical simulation techniques play an important role in the treatment of the diagnostics results of the objects and in the further construction of mathematical prognostic simulations of the object behavior in the period of time between two inspections. While solving diagnostics tasks and in the analysis of failure cases, the techniques of theoretical mechanics, the qualitative theory of differential equations, continuum mechanics, chemical macro-kinetics and optimization techniques are implemented in the Conversion Design Bureau #5 (DB#5). Both universal and special numerical techniques and software (SW) are being developed in DB#5 for the solution of such tasks. Almost all of them are calibrated against calculations of simulated and full-scale experiments performed at the VNIIEF and MINATOM testing sites. It is worth noting that in the long years of work there has been established a fruitful and effective

  20. Categorizing the Driving Affecting Factors on Iran’s Carpet Industry competitiveness by Fuzzy Topsis Technique

    Directory of Open Access Journals (Sweden)

    F. Haghshenas Kashani

    2011-01-01

    One of the most prominent and important problems of Iranian industries is the lack of competitiveness, and the major reason among several is the absence of a defined approach to competitiveness. In this study, by testing an integrated model and presenting it as the final research model, we try to categorize the driving factors affecting the competitiveness of Iran's carpet industry. To this end, one of the newer Multi Criteria Decision Making (MCDM) techniques, Fuzzy TOPSIS, was applied. The components of the research conceptual model, which has 3 main criteria (internal resources, market situation, and innovation strength) and 44 sub-criteria, were categorized by the Fuzzy TOPSIS technique. Accordingly, "market share", "e-commerce", "knowledge creation", "industry reliability", and "exporters' expertise and skills" were recognized as the most important sub-criteria, while "customer satisfaction", "employees' education", "international certifications", and "fundamental research" were recognized as the least momentous and effective sub-criteria. These results indicate that Iran's hand-made carpet industry still has difficulties in applying marketing knowledge such as online marketing and e-commerce, and in making merchants familiar with these techniques. In addition, paying excessive attention to the quality, durability, and appearance of Iranian carpets makes managers ignore other factors such as customer satisfaction. Among the main criteria, the market-based perspective was chosen as the leading and most significant criterion. In other words, an approach of improving position in international markets is recommended for this industry.

  1. Categorizing the Driving Affecting Factors on Iran’s Carpet Industry competitiveness by Fuzzy Topsis Technique

    Directory of Open Access Journals (Sweden)

    Farideh Haghshenas

    2011-07-01

    One of the most prominent and important problems of Iranian industries is the lack of competitiveness, and the major reason among several is the absence of a defined approach to competitiveness. In this study, by testing an integrated model and presenting it as the final research model, we try to categorize the driving factors affecting the competitiveness of Iran's carpet industry. To this end, one of the newer Multi Criteria Decision Making (MCDM) techniques, Fuzzy TOPSIS, was applied. The components of the research conceptual model, which has 3 main criteria (internal resources, market situation, and innovation strength) and 44 sub-criteria, were categorized by the Fuzzy TOPSIS technique. Accordingly, "market share", "e-commerce", "knowledge creation", "industry reliability", and "exporters' expertise and skills" were recognized as the most important sub-criteria, while "customer satisfaction", "employees' education", "international certifications", and "fundamental research" were recognized as the least momentous and effective sub-criteria. These results indicate that Iran's hand-made carpet industry still has difficulties in applying marketing knowledge such as online marketing and e-commerce, and in making merchants familiar with these techniques. In addition, paying excessive attention to the quality, durability, and appearance of Iranian carpets makes managers ignore other factors such as customer satisfaction. Among the main criteria, the market-based perspective was chosen as the leading and most significant criterion. In other words, an approach of improving position in international markets is recommended for this industry.
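
    As background on the ranking method named in both records, a classical (crisp) TOPSIS skeleton is sketched below; the fuzzy variant used in the study replaces the crisp ratings with triangular fuzzy numbers and fuzzy distance measures, which is omitted here for brevity. The function and variable names are illustrative only.

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """Classical TOPSIS closeness scores.

        matrix  : (alternatives x criteria) crisp ratings
        weights : criterion weights summing to 1
        benefit : boolean array, True where larger values are better
        """
        m = matrix / np.linalg.norm(matrix, axis=0)        # vector-normalise each criterion
        v = m * weights                                    # weighted normalised matrix
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)                     # closer to 1 = better ranked

    # ranking = topsis(ratings, weights, benefit).argsort()[::-1]
    # orders the alternatives (here: sub-criteria) from most to least important.
    ```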

  2. ANALYSIS OF RELATIONS BETWEEN JUDO TECHNIQUES AND SPECIFIC MOTOR ABILITIES

    Directory of Open Access Journals (Sweden)

    Patrik Drid

    2006-06-01

    Specific physical preparation affects the development of the motor abilities required for the execution of specific movements in judo. When selecting proper specific judo exercises for a target motor ability, it is necessary to first study the structure of specific judo techniques and the activity of the individual muscle groups engaged in executing each technique. On this basis, one can understand which muscles are most engaged during the realization of individual techniques, which serves as a starting point for selecting a particular complex of specific exercises to produce the highest effects. In addition to developing particular muscle groups, the means of specific preparation affect the development of those motor abilities evaluated as indispensable for the particular qualities characteristic of judo. This paper analyses the relationship between the field of judo techniques and specific motor abilities.

  3. Comparative analysis of data mining techniques for business data

    Science.gov (United States)

    Jamil, Jastini Mohd; Shaharanee, Izwan Nizal Mohd

    2014-12-01

    Data mining is the process of employing one or more computer learning techniques to automatically analyze and extract knowledge from data contained within a database. Companies are using this tool to further understand their customers, to design targeted sales and marketing campaigns, to predict what products customers will buy and the frequency of purchase, and to spot trends in customer preferences that can lead to new product development. In this paper, we conduct a systematic approach to explore several data mining techniques in business applications. The experimental results reveal that all the data mining techniques accomplish their goals, but each technique has its own characteristics and specifications that demonstrate its accuracy, proficiency and preference.

  4. RAPD analysis: a rapid technique for differentiation of spoilage yeasts

    NARCIS (Netherlands)

    Baleiras Couto, M.M.; Vossen, J.M.B.M. van der; Hofstra, H.; Huis in 't Veld, J.H.J.

    1994-01-01

    Techniques for the identification of the spoilage yeasts Saccharomyces cerevisiae and members of the Zygosaccharomyces genus from food and beverages sources were evaluated. The use of identification systems based on physiological characteristics resulted often in incomplete identification or misiden

  5. Patient size and x-ray technique factors in head computed tomography examinations. I. Radiation doses.

    Science.gov (United States)

    Huda, Walter; Lieberman, Kristin A; Chang, Jack; Roskopf, Marsha L

    2004-03-01

    We investigated how patient age, size and composition, together with the choice of x-ray technique factors, affect radiation doses in head computed tomography (CT) examinations. Head size dimensions, cross-sectional areas, and mean Hounsfield unit (HU) values were obtained from head CT images of 127 patients. For radiation dosimetry purposes patients were modeled as uniform cylinders of water. Dose computations were performed for 18 x 7 mm sections, scanned at a constant 340 mAs, for x-ray tube voltages ranging from 80 to 140 kV. Values of mean section dose, energy imparted, and effective dose were computed for patients ranging from the newborn to adults. There was a rapid growth of head size over the first two years, followed by a more modest increase of head size until the age of 18 or so. Newborns have a mean HU value of about 50 that monotonically increases with age over the first two decades of life. Average adult A-P and lateral dimensions were 186+/-8 mm and 147+/-8 mm, respectively, with an average HU value of 209+/-40. An infant head was found to be equivalent to a water cylinder with a radius of approximately 60 mm, whereas an adult head had an equivalent radius 50% greater. Adult males head dimensions are about 5% larger than for females, and their average x-ray attenuation is approximately 20 HU greater. For adult examinations performed at 120 kV, typical values were 32 mGy for the mean section dose, 105 mJ for the total energy imparted, and 0.64 mSv for the effective dose. Increasing the x-ray tube voltage from 80 to 140 kV increases patient doses by about a factor of 5. For the same technique factors, mean section doses in infants are 35% higher than in adults. Energy imparted for adults is 50% higher than for infants, but infant effective doses are four times higher than for adults. CT doses need to take into account patient age, head size, and composition as well as the selected x-ray technique factors.

  6. Cepstrum Analysis: An Advanced Technique in Vibration Analysis of Defects in Rotating Machinery

    Directory of Open Access Journals (Sweden)

    M. Satyam

    1994-01-01

    Conventional frequency analysis of machinery vibration is not adequate to accurately find defects in gears, bearings, and blades where sidebands and harmonics are present. Such an approach is also dependent on the transmission path. On the other hand, cepstrum analysis accurately identifies harmonic and sideband families and is a better technique for fault diagnosis in gears, bearings, and turbine blades of ships and submarines. The cepstrum represents the global power content of a whole family of harmonics and sidebands, even when more than one family of sidebands is present at the same time. It is also insensitive to transmission path effects, since source and transmission path effects are additive and can be separated in the cepstrum. The concept, underlying theory, and the measurement and analysis involved in using the technique are briefly outlined. Two cases were taken to demonstrate the advantage of the cepstrum technique over spectrum analysis. An LP compressor was chosen to study the transmission path effects, and a marine gearbox having two sets of sideband families was studied to diagnose the problematic sideband and its severity.
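
    A minimal sketch of the real cepstrum computation underlying the technique described above (the windowing choice and the small logarithm floor are implementation assumptions, not taken from the article):

    ```python
    import numpy as np

    def real_cepstrum(x: np.ndarray) -> np.ndarray:
        """Real cepstrum: inverse FFT of the log magnitude spectrum.

        Periodic families of harmonics/sidebands in the spectrum collapse into
        single peaks ("rahmonics") at the corresponding quefrency (in seconds).
        """
        spectrum = np.fft.rfft(x * np.hanning(len(x)))
        log_mag = np.log(np.abs(spectrum) + 1e-12)   # small floor avoids log(0)
        return np.fft.irfft(log_mag)

    # Example: a gearbox signal whose spectrum carries sidebands spaced by f_mod
    # shows a cepstral peak near quefrency 1 / f_mod.
    ```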

  7. Quantitative Image Analysis Techniques with High-Speed Schlieren Photography

    Science.gov (United States)

    Pollard, Victoria J.; Herron, Andrew J.

    2017-01-01

    Optical flow visualization techniques such as schlieren and shadowgraph photography are essential to understanding fluid flow when interpreting acquired wind tunnel test data. Output of the standard implementations of these visualization techniques in test facilities are often limited only to qualitative interpretation of the resulting images. Although various quantitative optical techniques have been developed, these techniques often require special equipment or are focused on obtaining very precise and accurate data about the visualized flow. These systems are not practical in small, production wind tunnel test facilities. However, high-speed photography capability has become a common upgrade to many test facilities in order to better capture images of unsteady flow phenomena such as oscillating shocks and flow separation. This paper describes novel techniques utilized by the authors to analyze captured high-speed schlieren and shadowgraph imagery from wind tunnel testing for quantification of observed unsteady flow frequency content. Such techniques have applications in parametric geometry studies and in small facilities where more specialized equipment may not be available.

  8. Total reflection X-ray fluorescence as a fast multielemental technique for human placenta sample analysis

    Science.gov (United States)

    Marguí, E.; Ricketts, P.; Fletcher, H.; Karydas, A. G.; Migliori, A.; Leani, J. J.; Hidalgo, M.; Queralt, I.; Voutchkov, M.

    2017-04-01

    In the present contribution, benchtop total reflection X-ray fluorescence spectrometry (TXRF) has been evaluated as a cost-effective multielemental analytical technique for human placenta analysis. An easy and rapid sample preparation, consisting of suspending 50 mg of sample in 1 mL of a 1% Triton solution in deionized water, proved to be the most suitable for this kind of sample. However, for comparison purposes, a microwave acid digestion procedure was also applied. For both sample treatment methodologies, the limits of detection for most elements were at the low mg/kg level. Accurate and precise results were obtained using internal standardization as the quantification approach and applying a correction factor to compensate for absorption effects. The correction factor was based on the proportional ratio between the slurry preparation results and those obtained for a set of human placenta samples analysed by microwave acid digestion and ICP-AES. As a case study, the developed TXRF methodology was applied to the multielemental analysis (K, Ca, Fe, Cu, Zn, As, Se, Br, Rb and Sr) of several healthy women's placenta samples from two regions in Jamaica.
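
    The internal standardization mentioned above is conventionally written as follows (a generic textbook form; the element used as the internal standard and the sensitivity values are not given in the record):

    $$C_x = \frac{N_x}{N_{\mathrm{IS}}} \cdot \frac{S_{\mathrm{IS}}}{S_x} \cdot C_{\mathrm{IS}}$$

    Here N are net peak intensities, S relative elemental sensitivities and C_IS the known concentration of the added internal standard; the empirical correction factor derived from the digestion/ICP-AES comparison would then be applied on top of C_x to compensate for absorption effects in the slurry.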

  9. Analysis of significant factors for dengue fever incidence prediction.

    Science.gov (United States)

    Siriyasatien, Padet; Phumee, Atchara; Ongruk, Phatsavee; Jampachaisri, Katechan; Kesorn, Kraisak

    2016-04-16

    Many popular dengue forecasting techniques have been used by several researchers to extrapolate dengue incidence rates, including the K-H model, support vector machines (SVM), and artificial neural networks (ANN). The time series analysis methodology, particularly ARIMA and SARIMA, has been increasingly applied to the field of epidemiological research for dengue fever, dengue hemorrhagic fever, and other infectious diseases. The main drawback of these methods is that they do not consider other variables that are associated with the dependent variable. Additionally, new factors correlated to the disease are needed to enhance the prediction accuracy of the model when it is applied to areas of similar climates, where weather factors such as temperature, total rainfall, and humidity are not substantially different. Such drawbacks may consequently lower the predictive power for the outbreak. The predictive power of the forecasting model-assessed by Akaike's information criterion (AIC), Bayesian information criterion (BIC), and the mean absolute percentage error (MAPE)-is improved by including the new parameters for dengue outbreak prediction. This study's selected model outperforms all three other competing models with the lowest AIC, the lowest BIC, and a small MAPE value. The exclusive use of climate factors from similar locations decreases a model's prediction power. The multivariate Poisson regression, however, effectively forecasts even when climate variables are slightly different. Female mosquitoes and seasons were strongly correlated with dengue cases. Therefore, the dengue incidence trends provided by this model will assist the optimization of dengue prevention. The present work demonstrates the important roles of female mosquito infection rates from the previous season and climate factors (represented as seasons) in dengue outbreaks. Incorporating these two factors in the model significantly improves the predictive power of dengue hemorrhagic fever forecasting
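
    A hedged sketch of the multivariate Poisson regression highlighted in this abstract, using statsmodels; the predictor names (lagged female mosquito infection rate, season, and weather covariates) are placeholders inferred from the abstract rather than the study's actual variable definitions:

    ```python
    import pandas as pd
    import statsmodels.api as sm

    # df columns are placeholders: 'cases', 'season',
    # 'female_mosquito_rate_prev_season', 'rainfall', 'temperature'
    def fit_dengue_poisson(df: pd.DataFrame):
        X = pd.get_dummies(df["season"], prefix="season", drop_first=True).astype(float)
        X["female_mosquito_rate_prev_season"] = df["female_mosquito_rate_prev_season"]
        X["rainfall"] = df["rainfall"]
        X["temperature"] = df["temperature"]
        X = sm.add_constant(X)
        model = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
        # AIC/BIC, as used in the paper for model comparison, are available directly:
        print(model.aic, model.bic)
        return model
    ```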

  10. MUMAL: Multivariate analysis in shotgun proteomics using machine learning techniques

    Directory of Open Access Journals (Sweden)

    Cerqueira Fabio R

    2012-10-01

    Background: The shotgun strategy (liquid chromatography coupled with tandem mass spectrometry) is widely applied for the identification of proteins in complex mixtures. This method gives rise to thousands of spectra in a single run, which are interpreted by computational tools. Such tools normally use a protein database from which peptide sequences are extracted for matching with experimentally derived mass spectral data. After the database search, the correctness of the obtained peptide-spectrum matches (PSMs) needs to be evaluated, also by algorithms, as manual curation of these huge datasets would be impractical. The target-decoy database strategy is largely used to perform spectrum evaluation. Nonetheless, this method has been applied without considering sensitivity, i.e., only error estimation is taken into account. A recently proposed method termed MUDE treats the target-decoy analysis as an optimization problem in which sensitivity is maximized. This method demonstrates a significant increase in the retrieved number of PSMs for a fixed error rate. However, the MUDE model is constructed in such a way that linear decision boundaries are established to separate correct from incorrect PSMs. Besides, the described heuristic for solving the optimization problem has to be executed many times to achieve a significant augmentation in sensitivity. Results: Here, we propose a new method, termed MUMAL, for PSM assessment that is based on machine learning techniques. Our method can establish nonlinear decision boundaries, leading to a higher chance of retrieving more true positives. Furthermore, we need few iterations to achieve high sensitivities, strikingly shortening the running time of the whole process. Experiments show that our method achieves a considerably higher number of PSMs compared with standard tools such as MUDE, PeptideProphet, and typical target-decoy approaches. Conclusion: Our approach not only enhances the computational performance, and
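
    For readers unfamiliar with the target-decoy evaluation that MUDE and MUMAL build on, a minimal sketch of the basic error estimate is given below; it illustrates only the standard decoy-based FDR idea, not the MUMAL classifier itself, and the function names are illustrative:

    ```python
    import numpy as np

    def target_decoy_fdr(scores, is_decoy, threshold):
        """Estimate FDR at a score threshold from a concatenated target-decoy search."""
        accepted = scores >= threshold
        n_target = np.sum(accepted & ~is_decoy)
        n_decoy = np.sum(accepted & is_decoy)
        return n_decoy / max(n_target, 1)      # decoy hits approximate false target hits

    def threshold_at_fdr(scores, is_decoy, alpha=0.01):
        """Lowest threshold (most PSMs kept) whose estimated FDR stays below alpha."""
        for t in np.sort(scores):              # scan thresholds from permissive to strict
            if target_decoy_fdr(scores, is_decoy, t) <= alpha:
                return t
        return np.inf
    ```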

  11. Reticle defect sizing of optical proximity correction defects using SEM imaging and image analysis techniques

    Science.gov (United States)

    Zurbrick, Larry S.; Wang, Lantian; Konicek, Paul; Laird, Ellen R.

    2000-07-01

    Sizing of programmed defects on optical proximity correction (OPC) features is addressed using high resolution scanning electron microscope (SEM) images and image analysis techniques. A comparison and analysis of different sizing methods is made. This paper addresses the issues of OPC defect definition and discusses the experimental measurement results obtained by SEM in combination with image analysis techniques.

  12. TECHNIQUE OF THE STATISTICAL ANALYSIS OF INVESTMENT APPEAL OF THE REGION

    Directory of Open Access Journals (Sweden)

    А. А. Vershinina

    2014-01-01

    This scientific article presents a technique for the statistical analysis of the investment appeal of a region for foreign direct investment. A definition of the statistical analysis technique is given, the stages of the analysis are described, and the mathematical and statistical tools are considered.

  13. Disruptive Event Biosphere Doser Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-28

    The purpose of this report was to document the process leading to, and the results of, development of radionuclide-, exposure scenario-, and ash thickness-specific Biosphere Dose Conversion Factors (BDCFs) for the postulated postclosure extrusive igneous event (volcanic eruption) at Yucca Mountain. BDCF calculations were done for seventeen radionuclides. The selection of radionuclides included those that may be significant dose contributors during the compliance period of up to 10,000 years, as well as radionuclides of importance for up to 1 million years postclosure. The approach documented in this report takes into account human exposure during three different phases at the time of, and after, volcanic eruption. Calculations of disruptive event BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. The pathway analysis included consideration of different exposure pathway's contribution to the BDCFs. BDCFs for volcanic eruption, when combined with the concentration of radioactivity deposited by eruption on the soil surface, allow calculation of potential radiation doses to the receptor of interest. Calculation of radioactivity deposition is outside the scope of this report and so is the transport of contaminated ash from the volcano to the location of the receptor. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA), in which doses are calculated to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  14. Nominal Performance Biosphere Dose Conversion Factor Analysis

    Energy Technology Data Exchange (ETDEWEB)

    M. Wasiolek

    2000-12-21

    The purpose of this report was to document the process leading to development of the Biosphere Dose Conversion Factors (BDCFs) for the postclosure nominal performance of the potential repository at Yucca Mountain. BDCF calculations concerned twenty-four radionuclides. This selection included sixteen radionuclides that may be significant nominal performance dose contributors during the compliance period of up to 10,000 years, five additional radionuclides of importance for up to 1 million years postclosure, and three relatively short-lived radionuclides important for the human intrusion scenario. Consideration of radionuclide buildup in soil caused by previous irrigation with contaminated groundwater was taken into account in the BDCF development. The effect of climate evolution, from the current arid conditions to a wetter and cooler climate, on the BDCF values was evaluated. The analysis included consideration of different exposure pathway's contribution to the BDCFs. Calculations of nominal performance BDCFs used the GENII-S computer code in a series of probabilistic realizations to propagate the uncertainties of input parameters into the output. BDCFs for the nominal performance, when combined with the concentrations of radionuclides in groundwater allow calculation of potential radiation doses to the receptor of interest. Calculated estimates of radionuclide concentration in groundwater result from the saturated zone modeling. The integration of the biosphere modeling results (BDCFs) with the outcomes of the other component models is accomplished in the Total System Performance Assessment (TSPA) to calculate doses to the receptor of interest from radionuclides postulated to be released to the environment from the potential repository at Yucca Mountain.

  15. A New V2G Control Strategy for Load Factor Improvement Using Smoothing Technique

    Directory of Open Access Journals (Sweden)

    CHANHOM, P.

    2017-08-01

    This paper proposes a new vehicle-to-grid (V2G) control strategy for improving the load factor in the power network. To operate the proposed strategy, the available storage capacity of the PEVs' batteries is treated as a battery energy storage system (BESS) for charging and discharging an amount of power corresponding to the V2G power command. Due to the remarkable advantages of the so-called simple moving average technique, it is selected for use in the proposed V2G control strategy. In this research, to investigate the load factor improvement, the essential data, including daily-load profiles with 7-day and 14-day periods, are used for the 3 studied cases. These 3 studied cases present the power network with varying PEV locations to describe PEV usage and charging or discharging behavior. The performance of the proposed strategy is simulated and verified with the MATPOWER software. The simulation results show that the load factors of the 3 studied cases are improved. Moreover, the encouragement of energy arbitrage for PEV owners is also discussed in this paper.
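
    A minimal sketch of the moving-average smoothing idea behind the strategy (the window length is an assumption and battery energy/power limits, which the actual strategy must respect, are ignored here):

    ```python
    import numpy as np

    def v2g_command(load: np.ndarray, window: int = 24) -> np.ndarray:
        """Charging (+) / discharging (-) power command steering net load
        towards its simple moving average."""
        kernel = np.ones(window) / window
        target = np.convolve(load, kernel, mode="same")   # smoothed load profile
        return target - load                              # charge in valleys, discharge at peaks

    def load_factor(load: np.ndarray) -> float:
        return float(load.mean() / load.max())

    # net = load + v2g_command(load)   # ideally tracks the smoothed profile, so
    # load_factor(net) should be closer to 1 than load_factor(load).
    ```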

  16. Coke drums inspection and evaluation using stress and strain analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Haraguchi, Marcio Issamu [Tricom Tecnologia e Servicos de Manutencao Industrial Ltda., Piquete, SP (Brazil); Samman, Mahmod [Houston Engineering Solutions, Houston, TX (United States); Tinoco, Ediberto Bastos; Marangone, Fabio de Castro; Silva, Hezio Rosa da; Barcelos, Gustavo de Carvalho [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    Coke drums deform due to a complex combination of mechanical and thermal cyclic stresses. Bulges have progressive behavior and represent the main maintenance problem related to these drums. Bulge failures typically result in through-wall cracks, leaks, and sometimes fires. Such failures generally do not represent a great risk to personnel. Repairs needed to maintain the reliability of these vessels might require extensive interruption to operation, which in turn considerably impacts the profitability of the unit. Therefore, the condition, progression and severity of these bulges should be closely monitored. Coke drums can be inspected during turnaround with 3D Laser Scanning and Remote Visual Inspection (RVI) tools, resulting in a detailed dimensional and visual evaluation of the internal surface. A typical project has several goals: inspecting the equipment to generate maintenance or inspection recommendations, and comparing with previous results and baseline data. Until recently, coke drum structural analysis had traditionally been performed by analyzing Stress Concentration Factors (SCF) through Finite Element Analysis methods; however, this technique has some serious technical and practical limitations. To avoid these shortcomings, the new strain analysis technique PSI (Plastic Strain Index) was developed. This method, which is based on the API 579/ASME FFS standard failure limit, represents the state of the art in coke drum bulging severity assessment and has an excellent correlation with failure history. (author)

  17. Structural Image Analysis of the Brain in Neuropsychology Using Magnetic Resonance Imaging (MRI) Techniques.

    Science.gov (United States)

    Bigler, Erin D

    2015-09-01

    Magnetic resonance imaging (MRI) of the brain provides exceptional image quality for visualization and neuroanatomical classification of brain structure. A variety of image analysis techniques provide both qualitative as well as quantitative methods to relate brain structure with neuropsychological outcome and are reviewed herein. Of particular importance are more automated methods that permit analysis of a broad spectrum of anatomical measures including volume, thickness and shape. The challenge for neuropsychology is which metric to use, for which disorder and the timing of when image analysis methods are applied to assess brain structure and pathology. A basic overview is provided as to the anatomical and pathoanatomical relations of different MRI sequences in assessing normal and abnormal findings. Some interpretive guidelines are offered including factors related to similarity and symmetry of typical brain development along with size-normalcy features of brain anatomy related to function. The review concludes with a detailed example of various quantitative techniques applied to analyzing brain structure for neuropsychological outcome studies in traumatic brain injury.

  18. EXPLORATORY FACTOR ANALYSIS (EFA) IN CONSUMER BEHAVIOR AND MARKETING RESEARCH

    Directory of Open Access Journals (Sweden)

    Marcos Pascual Soler

    2012-06-01

    Exploratory Factor Analysis (EFA) is one of the most widely used statistical procedures in social research. The main objective of this work is to describe the most common practices used by researchers in the consumer behavior and marketing area. Through a literature review methodology, the practices of EFA in five consumer behavior and marketing journals (2000-2010) were analyzed. Then, the choices made by the researchers concerning the factor model, retention criteria, rotation, factor interpretation and other issues relevant to factor analysis were analyzed. The results suggest that researchers routinely conduct analyses using questionable methods. Suggestions for improving the use of factor analysis and the reporting of results are presented, and a checklist (Exploratory Factor Analysis Checklist, EFAC) is provided to help editors, reviewers, and authors improve the reporting of exploratory factor analysis.

  19. Factor analysis of serogroups botanica and aurisina of Leptospira biflexa.

    Science.gov (United States)

    Cinco, M

    1977-11-01

    Factor analysis was performed on serovars of the Botanica and Aurisina serogroups of Leptospira biflexa. The results show the arrangement of the main serovar- and serogroup-specific factors, as well as the antigens shared with serovars of heterologous serogroups.

  20. An Item Factor Analysis of the Mooney Problem Check List

    Science.gov (United States)

    Stewart, David W.; Deiker, Thomas

    1976-01-01

    Explores the factor structure of the Mooney Problem Check List (MPCL) at the junior and senior high school level by undertaking a large obverse factor analysis of item responses in three adolescent criterion groups. (Author/DEP)

  1. COMPARATIVE ANALYSIS OF TRAUMA REDUCTION TECHNIQUES IN LAPAROSCOPIC CHOLECYSTECTOMY

    Directory of Open Access Journals (Sweden)

    Anton Koychev

    2017-02-01

    Nowadays, there is no operation in the field of abdominal surgery that cannot be performed laparoscopically. Both surgeons and patients have at their disposal an increasing number of laparoscopic techniques for performing surgical interventions. The prevalence of laparoscopic cholecystectomy is due to its undeniable advantages over traditional open surgery, namely minimal invasiveness, a reduced frequency and severity of perioperative complications, an incomparably better cosmetic result, and much better medico-social and medico-economic efficiency. Single-port laparoscopic techniques for performing laparoscopic cholecystectomy are an acceptable alternative to the classical conventional multi-port techniques. The safety of laparoscopic cholecystectomy requires precise identification of anatomical structures and strict observance of the diagnostic and treatment protocols and of the criteria for selection of patients to be treated surgically by these methods.

  2. Comparative study of Authorship Identification Techniques for Cyber Forensics Analysis

    Directory of Open Access Journals (Sweden)

    Smita Nirkhi

    2013-06-01

    Full Text Available Authorship identification techniques are used to identify the most likely author of online messages from a group of potential suspects and to find evidence to support the conclusion. Cybercriminals misuse online communication for sending blackmail or spam email and then attempt to hide their true identities to avoid detection. Authorship identification of online messages is a contemporary research issue for identity tracing in cyber forensics. This is a highly interdisciplinary area, as it takes advantage of machine learning, information retrieval, and natural language processing. In this paper, a study of recent techniques and automated approaches to attributing authorship of online messages is presented. The focus of this review study is to summarize all existing authorship identification techniques used in the literature to identify authors of online messages. It also discusses evaluation criteria and parameters for authorship attribution studies and lists open questions that will attract future work in this area.

  3. Spatial epidemiological techniques in cholera mapping and analysis towards a local scale predictive modelling

    Science.gov (United States)

    Rasam, A. R. A.; Ghazali, R.; Noor, A. M. M.; Mohd, W. M. N. W.; Hamid, J. R. A.; Bazlan, M. J.; Ahmad, N.

    2014-02-01

    Cholera spatial epidemiology is the study of the spatial pattern of the disease's spread and control and of its epidemics. Previous studies have shown that multi-factorial causation such as human behaviour, ecology and other infectious risk factors influences disease outbreaks. Thus, understanding the spatial pattern and the possible interrelationships among factors behind the outbreaks is crucial and warrants an in-depth study. This study focuses on the integration of geographical information system (GIS) and epidemiological techniques in an exploratory analysis of the cholera spatial pattern and distribution in a selected district of Sabah. The spatial statistics and pattern tools in ArcGIS and Microsoft Excel software were utilized to map and analyze the reported cholera cases and other data used. Meanwhile, a cohort study, an epidemiological technique, was applied to investigate multiple outcomes of the disease exposure. The general spatial pattern of cholera was highly clustered, showing that the disease spreads easily from one place or person to others, especially within 1500 meters of an infected person or location. Although the cholera outbreaks in the districts are not critical, the disease could become endemic in crowded areas, unhygienic environments, and areas close to contaminated water. It was also strongly believed that the coastal water of the study areas has a possible relationship with cholera transmission and phytoplankton bloom, since these areas recorded higher case numbers. GIS demonstrates a vital spatial epidemiological technique for determining the distribution pattern and elucidating hypothesis generation about the disease. The next research step would be to apply advanced geo-analysis methods and other disease risk factors to produce a significant local-scale predictive risk model of the disease in Malaysia.

  4. OPINION MINING AND SENTIMENT ANALYSIS TECHNIQUES: A RECENT SURVEY

    OpenAIRE

    Ms. Kalyani D. Gaikwad*, Prof. Sonawane V.R

    2016-01-01

    Sentiment analysis (also known as opinion mining) refers to the use of natural language processing, text analysis and computational linguistics to identify and extract subjective information in source materials. Sentiment analysis is widely applied to reviews and social media for a variety of applications, ranging from marketing to customer service. The difficulties of performing sentiment analysis in this domain can be overcome by leveraging on common-sense knowledge bases. Opinion Mining is...

  5. Data Mining Techniques: A Source for Consumer Behavior Analysis

    CERN Document Server

    Raorane, Abhijit

    2011-01-01

    Various studies on consumer purchasing behaviors have been presented and used in real problems. Data mining techniques are expected to be a more effective tool for analyzing consumer behaviors. However, data mining methods have disadvantages as well as advantages, so it is important to select appropriate techniques to mine databases. The objective of this paper is to understand consumer behavior and the consumer's psychological condition at the time of purchase, and to examine how a suitable data mining method can be applied to improve on conventional methods. Moreover, in an experiment, association rule mining is employed to mine rules for trusted customers using sales data from the supermarket industry.
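
    As a rough illustration of the association rule mining mentioned in the experiment, the toy sketch below computes support, confidence and lift for item pairs over a handful of made-up supermarket baskets; the transactions and thresholds are hypothetical and the full Apriori algorithm is not reproduced.

```python
# Illustrative association-rule mining over hypothetical supermarket baskets;
# the paper's actual dataset and thresholds are not reproduced here.
from itertools import combinations

transactions = [
    {"bread", "milk"},
    {"bread", "butter", "milk"},
    {"beer", "bread"},
    {"beer", "diapers", "milk"},
    {"bread", "butter"},
]
n = len(transactions)

def support(itemset):
    """Fraction of baskets that contain every item in the itemset."""
    return sum(itemset <= t for t in transactions) / n

min_support, min_confidence = 0.4, 0.6
items = sorted({i for t in transactions for i in t})

# One-pass check of all item pairs (a toy stand-in for the Apriori algorithm).
for a, b in combinations(items, 2):
    s_ab = support({a, b})
    if s_ab < min_support:
        continue
    for lhs, rhs in ((a, b), (b, a)):
        conf = s_ab / support({lhs})
        lift = conf / support({rhs})
        if conf >= min_confidence:
            print(f"{lhs} -> {rhs}: support={s_ab:.2f} confidence={conf:.2f} lift={lift:.2f}")
```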

  6. An Information Diffusion Technique for Fire Risk Analysis

    Institute of Scientific and Technical Information of China (English)

    刘静; 黄崇福

    2004-01-01

    There are many kinds of fires occurring under different conditions. For a specific site, it is difficult to collect sufficient data for analyzing the fire risk. In this paper, we suggest an information diffusion technique to analyze fire risk with a small sample. The information distribution method is applied to change crisp observations into fuzzy sets, and then to effectively construct a fuzzy relationship between fire and surroundings. With the data of Shanghai in winter, we show how to use the technique to analyze the fire risk.
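
    The normal information diffusion idea described here, in which each crisp observation is spread over a set of monitoring points with a Gaussian kernel and the normalized contributions are pooled into a probability estimate, can be sketched as follows. The sample values and the bandwidth heuristic below are illustrative and do not reproduce the paper's calibrated coefficients.

```python
# Minimal sketch of the normal information diffusion estimator for a small
# sample of annual fire counts; the values are hypothetical.
import numpy as np

x = np.array([3.0, 5.0, 4.0, 7.0, 6.0])          # hypothetical small sample
u = np.arange(0.0, 11.0)                         # monitoring points
# Heuristic bandwidth; the published diffusion coefficients depend on sample size.
h = 1.5 * (x.max() - x.min()) / (len(x) - 1)

# Diffuse each observation over the monitoring space with a Gaussian kernel,
# normalize each observation's diffused information to sum to 1, then average.
f = np.exp(-(x[:, None] - u[None, :]) ** 2 / (2 * h ** 2))
f /= f.sum(axis=1, keepdims=True)
p = f.sum(axis=0) / len(x)                       # estimated probability at each point

# Exceedance probability P(X >= u), often used as the risk estimate.
exceed = p[::-1].cumsum()[::-1]
for ui, pi, ei in zip(u, p, exceed):
    print(f"u={ui:4.1f}  p={pi:.3f}  P(X>=u)={ei:.3f}")
```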

  7. Spacecraft Electrical Power System (EPS) generic analysis tools and techniques

    Science.gov (United States)

    Morris, Gladys M.; Sheppard, Mark A.

    1992-01-01

    An overview is provided of the analysis tools and techniques used in modeling the Space Station Freedom electrical power system, as well as future space vehicle power systems. The analysis capabilities of the Electrical Power System (EPS) are described and the EPS analysis tools are surveyed.

  8. Krylov Subspace Method with Communication Avoiding Technique for Linear System Obtained from Electromagnetic Analysis

    National Research Council Canada - National Science Library

    IKUNO, Soichiro; CHEN, Gong; YAMAMOTO, Susumu; ITOH, Taku; ABE, Kuniyoshi; NAKAMURA, Hiroaki

    2016-01-01

    Krylov subspace method and the variable preconditioned Krylov subspace method with communication avoiding technique for a linear system obtained from electromagnetic analysis are numerically investigated. In the k...
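
    For readers unfamiliar with this family of solvers, the sketch below implements plain conjugate gradients, the simplest Krylov subspace method, on a small symmetric positive definite test system; the communication-avoiding and variable-preconditioned variants studied in the paper are not reproduced.

```python
# A plain conjugate gradient iteration, shown only to illustrate the Krylov
# subspace family of solvers; the test system is hypothetical.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # small SPD test matrix
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))          # close to np.linalg.solve(A, b)
```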

  9. Tuck Jump Assessment: An Exploratory Factor Analysis in a College Age Population.

    Science.gov (United States)

    Lininger, Monica R; Smith, Craig A; Chimera, Nicole J; Hoog, Philipp; Warren, Meghan

    2017-03-01

    Lininger, MR, Smith, CA, Chimera, NJ, Hoog, P, and Warren, M. Tuck Jump Assessment: An exploratory factor analysis in a college age population. J Strength Cond Res 31(3): 653-659, 2017-Due to the high rate of noncontact lower extremity injuries that occur in the collegiate setting, medical personnel are implementing screening mechanisms to identify those athletes that may be at risk for certain injuries before starting a sports season. The tuck jump assessment (TJA) was created as a "clinician friendly" tool to identify lower extremity landing technique flaws during a plyometric activity. There are 10 technique flaws that are assessed as either having the apparent deficit or not during the TJA. Technique flaws are then summed up for an overall score. Through expert consensus, these 10 technique flaws have been grouped into 5 modifiable risk factors: ligament dominance, quadriceps dominance, leg dominance or residual injury deficits, trunk dominance ("core" dysfunction), and technique perfection. Research has not investigated the psychometric properties of the TJA technique flaws or the modifiable risk factors. The present study is a psychometric analysis of the TJA technique flaws to measure the internal structure using an exploratory factor analysis (EFA) using data from collegiate athletes (n = 90) and a general college cohort (n = 99). The EFA suggested a 3 factor model accounting for 46% of the variance. The 3 factors were defined as fatigue, distal landing pattern, and proximal control. The results differ from the 5 modifiable risk categories as previously suggested. These results may question the use of a single score, a unidimensional construct, of the TJA for injury screening.

  10. Analysis of Factors Affecting the Quality of an E-commerce Website Using Factor Analysis

    Directory of Open Access Journals (Sweden)

    Saurabh Mishra

    2014-12-01

    Full Text Available The purpose of this study is to identify factors which affect the quality and effectiveness of an e-commerce website, which also majorly affect customer satisfaction and ultimately customer retention and loyalty. This research paper examines a set of 23 variables and integrates them into 4 factors which affect the quality of a website. An online questionnaire survey was conducted to generate statistics regarding the preferences of e-commerce website users. The 23 variables taken from the customer survey are generalized into 4 major factors using exploratory factor analysis: content, navigation, services and interface design. The research mainly consists of the responses of students in the age group of 18-25 years and considers different B2C commercial websites. The identified variables are important with respect to the current competition in the market, as the service of an e-commerce website also plays a major role in ensuring customer satisfaction. Further research in this domain can be done for the mobile versions of websites.

  11. Radioassay of granulocyte chemotaxis. Studies of human granulocytes and chemotactic factors. [⁵¹Cr tracer technique]

    Energy Technology Data Exchange (ETDEWEB)

    Gallin, J.I.

    1974-01-01

    The above studies demonstrate that the ⁵¹Cr radiolabel chemotactic assay is a relatively simple and objective means for studying leukocyte chemotaxis in both normal and pathological conditions. Application of this method to studies of normal human chemotaxis revealed a relatively narrow range of normal and little day-to-day variability. Analysis of this variability revealed that there is more variability among the response of different granulocytes to a constant chemotactic stimulus than among the chemotactic activity of different sera to a single cell source. Utilizing the ⁵¹Cr radioassay, the abnormal granulocyte chemotactic behavior reported in Chediak-Higashi syndrome and in a patient with recurrent pyogenic infections and mucocutaneous candidiasis has been confirmed. The ⁵¹Cr chemotactic assay has also been used to assess the generation of chemotactic activity from human serum and plasma. The in vitro generation of two distinct chemotactic factors was examined: the complement product (C5a) and kallikrein, an enzyme of the kinin-generating pathway. Kinetic analysis of complement-related chemotactic factor formation, utilizing immune complexes or endotoxin to activate normal sera in the presence or absence of EGTA, as well as kinetic analysis of activation of C2-deficient human serum, provided an easy means of distinguishing the classical (antibody-mediated) complement pathway from the alternate pathway. Such kinetic analysis is necessary to detect clinically important abnormalities since, after 60 min of generation time, normal chemotactic activity may be present despite complete absence or inhibition of one complement pathway. The chemotactic factor generated by either pathway of complement activation appears to be predominantly attributable to C5a.

  12. Comprehensive evaluation of formulation factors for ocular penetration of fluoroquinolones in rabbits using cassette dosing technique

    Science.gov (United States)

    Sharma, Charu; Biswas, Nihar R; Ojha, Shreesh; Velpandian, Thirumurthy

    2016-01-01

    Objective Corneal permeability of drugs is an important factor used to assess the efficacy of topical preparations. Transcorneal penetration of drugs from aqueous formulation is governed by various physiological, physiochemical, and formulation factors. In the present study, we investigated the effect of formulation factors like concentration, pH, and volume of instillation across the cornea using cassette dosing technique for ophthalmic fluoroquinolones (FQs). Materials and methods Sterile cocktail formulations were prepared using four congeneric ophthalmic FQs (ofloxacin, sparfloxacin, pefloxacin mesylate, and gatifloxacin) at concentrations of 0.025%, 0.5%, and 0.1%. Each formulation was adjusted to different pH ranges (4.5, 7.0, and 8.0) and assessed for transcorneal penetration in vivo in rabbit’s cornea (n=4 eyes) at three different volumes (12.5, 25, and 50 μL). Aqueous humor was aspirated through paracentesis after applying local anesthesia at 0, 5, 15, 30, 60, 120, and 240 minutes postdosing. The biosamples collected from a total of 27 groups were analyzed using liquid chromatography–tandem mass spectroscopy to determine transcorneal permeability of all four FQs individually. Results Increase in concentration showed an increase in penetration up to 0.05%; thereafter, the effect of concentration was found to be dependent on volume of instillation as we observed a decrease in transcorneal penetration. The highest transcorneal penetration of all FQs was observed at pH 7.0 at concentration 0.05% followed by 0.025% at pH 4.5. Lastly, increasing the volume of instillation from 12.5 to 50 μL showed a significant fall in transcorneal penetration. Conclusion The study concludes that formulation factors showed discernible effect on transcorneal permeation; therefore, due emphasis should be given on drug development and design of ophthalmic formulation. PMID:26955263

  13. Exploring Technostress: Results of a Large Sample Factor Analysis

    OpenAIRE

    Steponas Jonušauskas; Agota Giedre Raisiene

    2016-01-01

    With reference to the results of a large sample factor analysis, the article aims to propose a frame for examining technostress in a population. The survey and principal component analysis of the sample, consisting of 1013 individuals who use ICT in their everyday work, were implemented in the research. 13 factors combine 68 questions and explain 59.13 per cent of the answer dispersion. Based on the factor analysis, the questionnaire was reframed and prepared to reasonably analyze the respondents’ an...

  14. Distal wound complications following pedal bypass: analysis of risk factors.

    Science.gov (United States)

    Robison, J G; Ross, J P; Brothers, T E; Elliott, B M

    1995-01-01

    Wound complications of the pedal incision continue to compromise successful limb salvage following aggressive revascularization. Significant distal wound disruption occurred in 14 of 142 (9.8%) patients undergoing pedal bypass with autogenous vein for limb salvage between 1986 and 1993. One hundred forty-two pedal bypass procedures were performed for rest pain in 66 patients and tissue necrosis in 76. Among the 86 men and 56 women, 76% were diabetic and 73% were black. All but eight patients had a history of diabetes and/or tobacco use. Eight wounds were successfully managed with maintenance of patent grafts from 5 to 57 months. Exposure of a patent graft precipitated amputation in three patients, as did graft occlusion in an additional patient. One graft was salvaged by revision to the peroneal artery and one was covered by a local bipedicled flap. Multiple regression analysis identified three factors associated with wound complications at the pedal incision site: diabetes mellitus (p = 0.03), age > 70 years (p = 0.03), and rest pain (p = 0.05). Ancillary techniques ("pie-crusting") to reduce skin tension resulted in no distal wound problems among 15 patients considered to be at greatest risk for wound breakdown. Attention to the technique of distal graft tunneling, a wound closure that reduces tension, and control of swelling by avoiding dependency and using gentle elastic compression assume crucial importance in minimizing pedal wound complications following pedal bypass.
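
    The study reports a multiple regression identifying diabetes, age over 70 years, and rest pain as risk factors. One common way to set up such a risk-factor analysis for a binary complication outcome is a logistic regression, sketched below on synthetic data; the original dataset and exact model specification are not reproduced.

```python
# Illustrative risk-factor analysis for a binary wound-complication outcome;
# the data below are synthetic and only mimic the reported cohort proportions.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 142
diabetes = rng.binomial(1, 0.76, n)
age_over_70 = rng.binomial(1, 0.35, n)
rest_pain = rng.binomial(1, 0.46, n)

# Simulate an outcome with assumed effect sizes (hypothetical).
logit = -3.0 + 1.0 * diabetes + 0.9 * age_over_70 + 0.8 * rest_pain
complication = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([diabetes, age_over_70, rest_pain]))
model = sm.Logit(complication, X).fit(disp=False)
print(model.summary(xname=["const", "diabetes", "age_over_70", "rest_pain"]))
```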

  15. Using Multilevel Factor Analysis with Clustered Data: Investigating the Factor Structure of the Positive Values Scale

    Science.gov (United States)

    Huang, Francis L.; Cornell, Dewey G.

    2016-01-01

    Advances in multilevel modeling techniques now make it possible to investigate the psychometric properties of instruments using clustered data. Factor models that overlook the clustering effect can lead to underestimated standard errors, incorrect parameter estimates, and model fit indices. In addition, factor structures may differ depending on…

  16. Creation of reversed phase high-performance liquid chromatographic technique to assay platelet-activating factor

    Institute of Scientific and Technical Information of China (English)

    杨云梅; 曹红翠; 徐哲荣; 陈晓明

    2004-01-01

    Objective: To establish a new assay for platelet-activating factor (PAF), to compare it with the bio-assay, and to discuss its significance in some diseases of elderly people such as cerebral infarction and coronary heart disease. Methods: PAF levels were measured in 100 controls, 23 elderly patients with cerebral infarction and 65 cases with coronary heart disease by a reversed phase high-performance liquid chromatographic technique (rHPLC). Results: rHPLC is more convenient, sensitive and specific, and less confounded, compared with the bio-assay. The level of plasma PAF in patients with cerebral infarction was higher than that in the controls (P<0.01), and in patients with coronary heart disease. Conclusion: Detection of PAF with rHPLC is more reliable and more accurate. The new assay has important significance in PAF research.

  18. Merkel Cell Carcinoma: An Update of Key Imaging Techniques, Prognostic Factors, Treatment, and Follow-up.

    Science.gov (United States)

    Llombart, B; Kindem, S; Chust, M

    2017-03-01

    Merkel cell carcinoma, though rare, is one of the most aggressive tumors a dermatologist faces. More than a third of patients with this diagnosis die from the disease. Numerous researchers have attempted to identify clinical and pathologic predictors to guide prognosis, but their studies have produced inconsistent results. Because the incidence of Merkel cell carcinoma is low and it appears in patients of advanced age, prospective studies have not been done and no clear treatment algorithm has been developed. This review aims to provide an exhaustive, up-to-date account of Merkel cell carcinoma for the dermatologist. We describe prognostic factors and the imaging techniques that are most appropriate for evaluating disease spread. We also discuss current debates on treating Merkel cell carcinoma.

  19. Applying data-mining techniques in honeypot analysis

    CSIR Research Space (South Africa)

    Veerasamy, N

    2006-07-01

    Full Text Available This paper proposes the use of data mining techniques to analyse the data recorded by the honeypot. This data can also be used to train Intrusion Detection Systems (IDS) in identifying attacks. Since the training is based on real data...

  20. UPLC-ICP-MS - a fast technique for speciation analysis

    DEFF Research Database (Denmark)

    Bendahl, L.; Sturup, S.; Gammelgaard, Bente;

    2005-01-01

    Ultra performance liquid chromatography is a new development of the HPLC separation technique that allows separations on column materials at high pressures up to 10(8) Pa using particle diameters of 1.7 mu m. This increases the efficiency, the resolution and the speed of the separation. Four aque...

  1. Analysis on Poe's Unique Techniques to Achieve Aestheticism

    Institute of Scientific and Technical Information of China (English)

    孔佳鸣

    2008-01-01

    Edgar Allan Poe was one of the most important poets in American poetic history for his unremitting pursuit of 'ideal beauty'. This essay shows, by various examples chosen from his poems, that his aestheticism was evident in his versification techniques. His poetic theory and practice set an enduring example for the development of English poetry.

  2. Tape Stripping Technique for Stratum Corneum Protein Analysis

    DEFF Research Database (Denmark)

    Clausen, Maja-Lisa; Slotved, H.-C.; Krogfelt, Karen Angeliki

    2016-01-01

    The aim of this study was to investigate the amount of protein in stratum corneum in atopic dermatitis (AD) patients and healthy controls, using tape stripping technique. Furthermore, to compare two different methods for protein assessment. Tape stripping was performed in AD patients and healthy ...

  3. Infrared Contrast Analysis Technique for Flash Thermography Nondestructive Evaluation

    Science.gov (United States)

    Koshti, Ajay

    2014-01-01

    The paper deals with infrared flash thermography inspection to detect and analyze delamination-like anomalies in nonmetallic materials. It provides information on an IR Contrast technique that involves extracting normalized contrast versus time evolutions from the flash thermography infrared video data. The paper provides the analytical model used in the simulation of infrared image contrast. The contrast evolution simulation is achieved through calibration on measured contrast evolutions from many flat-bottom holes in the subject material. The paper also provides formulas to calculate values of the thermal measurement features from the measured contrast evolution curve. Many thermal measurement features of the contrast evolution that relate to the anomaly characteristics are calculated. The measurement features and the contrast simulation are used to evaluate flash thermography inspection data in order to characterize the delamination-like anomalies. In addition, the contrast evolution prediction is matched to the measured anomaly contrast evolution to provide an assessment of the anomaly depth and width in terms of the depth and diameter of the corresponding equivalent flat-bottom hole (EFBH) or equivalent uniform gap (EUG). The paper describes an anomaly edge detection technique, called the half-max technique, which is also used to estimate the width of an indication. The EFBH/EUG and half-max width estimations are used to assess anomaly size. The paper also provides some information on the "IR Contrast" software application, the half-max technique, and the IR Contrast feature imaging application, which are based on the models provided in this paper.
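
    The half-max width idea referred to above can be illustrated in a few lines: given a one-dimensional contrast profile across an indication, the width is taken between the two points where the contrast falls to half its peak value. The profile below is synthetic, not flash-thermography data.

```python
# Sketch of a half-max (full width at half maximum) estimate on a 1-D contrast
# profile across an indication; the profile values here are synthetic.
import numpy as np

def half_max_width(x, profile):
    """Full width at half maximum of a single-peaked contrast profile."""
    half = profile.max() / 2.0
    above = np.where(profile >= half)[0]
    left, right = above[0], above[-1]

    # Linear interpolation at the two half-max crossings for sub-sample width.
    def cross(i_lo, i_hi):
        return np.interp(half, [profile[i_lo], profile[i_hi]], [x[i_lo], x[i_hi]])

    x_left = x[left] if left == 0 else cross(left - 1, left)
    x_right = x[right] if right == len(x) - 1 else cross(right + 1, right)
    return x_right - x_left

x = np.linspace(-10.0, 10.0, 201)            # position in mm (hypothetical)
profile = np.exp(-x**2 / (2 * 2.0**2))       # synthetic contrast profile
print(half_max_width(x, profile))            # ~4.7 mm for a Gaussian with sigma = 2
```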

  4. Multivariate factor analysis of Girgentana goat milk composition

    Directory of Open Access Journals (Sweden)

    Pietro Giaccone

    2010-01-01

    kidding and parity on common factors, while no differences were found between goats with one or more kids. The multivariate factor analysis technique was effective in describing the quality of Girgentana milk with a low number of new latent variables. These new variables have been useful in the study of the effect of some technical factors, such as parity and season of kidding, on the quantitative and qualitative aspects of milk production in this goat breed.

  5. A System of Systems Interface Hazard Analysis Technique

    Science.gov (United States)

    2007-03-01

    The report applies HAZOP guide words to software and system interface analysis and illustrates them with an example system-of-systems architecture table. The HAZOP is planned by establishing the analysis goals, definitions, worksheets, schedule, and process; guide words characterize interface faults such as a subtle incorrect output, whose value is wrong but cannot be detected.

  6. Limitations of transient power loads on DEMO and analysis of mitigation techniques

    Energy Technology Data Exchange (ETDEWEB)

    Maviglia, F., E-mail: francesco.maviglia@euro-fusion.org [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); Federici, G. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Strohmayer, G. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Wenninger, R. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Bachmann, C. [EUROfusion Consortium, PPPT Department, Boltzmannstr. 2, Garching (Germany); Albanese, R. [Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); Ambrosino, R. [Consorzio CREATE University Napoli Parthenope, Naples (Italy); Li, M. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Loschiavo, V.P. [Consorzio CREATE, University Napoli Federico II – DIETI, 80125 Napoli (Italy); You, J.H. [Max-Planck-Institut fur Plasmaphysik, Boltzmannstr. 2, Garching (Germany); Zani, L. [CEA, IRFM, F-13108 St Paul-Lez-Durance (France)

    2016-11-01

    Highlights: • A parametric thermo-hydraulic analysis of the candidate DEMO divertor is presented. • The operational space assessment is presented under static and transient heat loads. • Strike points sweeping is analyzed as a divertor power exhaust mitigation technique. • Results are presented on sweeping installed power required, AC losses and thermal fatigue. - Abstract: The present European standard DEMO divertor target technology is based on a water-cooled tungsten mono-block with a copper alloy heat sink. This paper presents the assessment of the operational space of this technology under static and transient heat loads. A transient thermo-hydraulic analysis was performed using the code RACLETTE, which allowed a broad parametric scan of the target geometry and coolant conditions. The limiting factors considered were the coolant critical heat flux (CHF), and the temperature limits of the materials. The second part of the work is devoted to the study of the plasma strike point sweeping as a mitigation technique for the divertor power exhaust. The RACLETTE code was used to evaluate the impact of a large range of sweeping frequencies and amplitudes. A reduced subset of cases, which complied with the constraints, was benchmarked with a 3D FEM model. A reduction of the heat flux to the coolant, up to a factor ∼4, and lower material temperatures were found for an incident heat flux in the range (15–30) MW/m². Finally, preliminary assessments were performed on the installed power required for the sweeping, the AC losses in the superconductors and thermal fatigue analysis. No evident show stoppers were found.

  7. Preliminary analysis techniques for ring and stringer stiffened cylindrical shells

    Science.gov (United States)

    Graham, J.

    1993-03-01

    This report outlines methods of analysis for the buckling of thin-walled circumferentially and longitudinally stiffened cylindrical shells. Methods of analysis for the various failure modes are presented in one cohesive package. Where applicable, more than one method of analysis for a failure mode is presented along with standard practices. The results of this report are primarily intended for use in launch vehicle design in the elastic range. A Microsoft Excel worksheet with accompanying macros has been developed to automate the analysis procedures.

  8. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...
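
    A toy illustration of the matrix view of failure propagation that underlies this kind of automated FMEA is sketched below: direct fault propagation between components is encoded as a boolean adjacency matrix and closed transitively to list the effects reachable from each failure. The component names and links are hypothetical, and the sketch is not the authors' algorithm.

```python
# Toy sketch of matrix-based failure propagation: reach[i, j] is True if a
# failure in component i can eventually affect component j.
import numpy as np

components = ["sensor", "controller", "actuator", "plant"]
direct = np.zeros((4, 4), dtype=bool)
direct[0, 1] = True    # sensor fault propagates to controller
direct[1, 2] = True    # controller fault propagates to actuator
direct[2, 3] = True    # actuator fault propagates to plant

# Warshall-style transitive closure of the direct-propagation matrix.
reach = direct.copy()
for k in range(len(components)):
    reach |= np.outer(reach[:, k], reach[k, :])

for i, src in enumerate(components):
    effects = [components[j] for j in range(len(components)) if reach[i, j]]
    print(f"failure in {src:10s} -> {effects}")
```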

  9. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  10. Empirical Analysis of Data Mining Techniques for Social Network Websites

    Directory of Open Access Journals (Sweden)

    S.G.S Fernando

    2014-02-01

    Full Text Available Social networks allow users to collaborate with others. People of similar backgrounds and interests meet and cooperate using these social networks, enabling them to share information across the world. Social networks contain millions of unprocessed raw data items, and by analyzing this data new knowledge can be gained. Since this data is dynamic and unstructured, traditional data mining techniques will not be appropriate. Web data mining is an interesting field with a vast number of applications. The growth of online social networks has significantly increased the data content available, because profile holders have become more active producers and distributors of such data. This paper identifies and analyzes existing web mining techniques used to mine social network data.

  11. Method development for arsenic analysis by modification in spectrophotometric technique

    Directory of Open Access Journals (Sweden)

    M. A. Tahir

    2012-01-01

    Full Text Available Arsenic is a non-metallic constituent, present naturally in groundwater due to some minerals and rocks. Arsenic is not geologically uncommon and occurs in natural water as arsenate and arsenite. Additionally, arsenic may occur from industrial discharges or insecticide application. The World Health Organization (WHO) and the Pakistan Standard Quality Control Authority have recommended a permissible limit of 10 ppb for arsenic in drinking water. Arsenic at lower concentrations can be determined in water by using high-tech instruments like the atomic absorption spectrometer (hydride generation). Because arsenic concentrations at the low limit of 1 ppb cannot be determined easily with a simple spectrophotometric technique, the spectrophotometric technique using silver diethyldithiocarbamate was modified to achieve better results, down to an arsenic concentration of 1 ppb.

  12. Analysis of Requirement Engineering Processes, Tools/Techniques and Methodologies

    Directory of Open Access Journals (Sweden)

    Tousif ur Rehman

    2013-02-01

    Full Text Available Requirement engineering is an integral part of the software development lifecycle, since the basis for developing successful software depends on comprehending its requirements in the first place. Requirement engineering involves a number of processes for gathering requirements in accordance with the needs and demands of users and stakeholders of the software product. In this paper, we have reviewed the prominent processes, tools and technologies used in the requirement gathering phase. The study is useful for perceiving the current state of affairs pertaining to requirement engineering research and for understanding the strengths and limitations of existing requirement engineering techniques. The study also summarizes the best practices and how to use a blend of requirement engineering techniques as an effective methodology to successfully conduct the requirement engineering task. The study also highlights the importance of security requirements: although they are part of the non-functional requirements, they are considered fundamental to secure software development.

  14. Pathways of distinction analysis: a new technique for multi-SNP analysis of GWAS data.

    Science.gov (United States)

    Braun, Rosemary; Buetow, Kenneth

    2011-06-01

    Genome-wide association studies (GWAS) have become increasingly common due to advances in technology and have permitted the identification of differences in single nucleotide polymorphism (SNP) alleles that are associated with diseases. However, while typical GWAS analysis techniques treat markers individually, complex diseases (cancers, diabetes, and Alzheimer's, amongst others) are unlikely to have a single causative gene. Thus, there is a pressing need for multi-SNP analysis methods that can reveal system-level differences in cases and controls. Here, we present a novel multi-SNP GWAS analysis method called Pathways of Distinction Analysis (PoDA). The method uses GWAS data and known pathway-gene and gene-SNP associations to identify pathways that permit, ideally, the distinction of cases from controls. The technique is based upon the hypothesis that, if a pathway is related to disease risk, cases will appear more similar to other cases than to controls (or vice versa) for the SNPs associated with that pathway. By systematically applying the method to all pathways of potential interest, we can identify those for which the hypothesis holds true, i.e., pathways containing SNPs for which the samples exhibit greater within-class similarity than across classes. Importantly, PoDA improves on existing single-SNP and SNP-set enrichment analyses, in that it does not require the SNPs in a pathway to exhibit independent main effects. This permits PoDA to reveal pathways in which epistatic interactions drive risk. In this paper, we detail the PoDA method and apply it to two GWAS: one of breast cancer and the other of liver cancer. The results obtained strongly suggest that there exist pathway-wide genomic differences that contribute to disease susceptibility. PoDA thus provides an analytical tool that is complementary to existing techniques and has the power to enrich our understanding of disease genomics at the systems level.
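
    The core intuition, that samples should look more like their own class than the other class in the genotype space spanned by a pathway's SNPs, can be illustrated with a simplified nearest-neighbour score as below. This is a toy version on synthetic genotypes, not the exact PoDA statistic.

```python
# Simplified pathway-level within-class vs between-class similarity check;
# genotypes are synthetic minor-allele counts (0/1/2), not GWAS data.
import numpy as np

rng = np.random.default_rng(0)
n_cases, n_controls, n_snps = 40, 40, 25
cases = rng.binomial(2, 0.45, size=(n_cases, n_snps))      # shifted allele frequency
controls = rng.binomial(2, 0.30, size=(n_controls, n_snps))
G = np.vstack([cases, controls]).astype(float)
y = np.array([1] * n_cases + [0] * n_controls)

D = np.abs(G[:, None, :] - G[None, :, :]).sum(axis=2)       # pairwise L1 distance
np.fill_diagonal(D, np.inf)                                  # leave-one-out

same = np.array([D[i, y == y[i]].min() for i in range(len(y))])
other = np.array([D[i, y != y[i]].min() for i in range(len(y))])

# Positive score: samples tend to be nearer their own class for this pathway's
# SNPs, hinting at a pathway-level case/control difference.
score = (other - same).mean()
print(f"pathway separation score: {score:.2f}")
```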

  15. Magnetic resonance elastography (MRE) in cancer: Technique, analysis, and applications

    Science.gov (United States)

    Pepin, Kay M.; Ehman, Richard L.; McGee, Kiaran P.

    2015-01-01

    Tissue mechanical properties are significantly altered with the development of cancer. Magnetic resonance elastography (MRE) is a noninvasive technique capable of quantifying tissue mechanical properties in vivo. This review describes the basic principles of MRE and introduces some of the many promising MRE methods that have been developed for the detection and characterization of cancer, evaluation of response to therapy, and investigation of the underlying mechanical mechanisms associated with malignancy. PMID:26592944

  16. Analysis of Acoustic Emission Signals using Wavelet Transformation Technique

    Directory of Open Access Journals (Sweden)

    S.V. Subba Rao

    2008-07-01

    Full Text Available Acoustic emission (AE) monitoring is carried out during proof pressure testing of pressure vessels to find the occurrence of any crack growth-related phenomenon. While carrying out AE monitoring, it is often found that the background noise is very high. Along with the noise, the signal includes various phenomena related to crack growth, rubbing of fasteners, leaks, etc. Due to the presence of noise, it becomes difficult to identify the signature of the original signals related to the above phenomena. Through various filtering/thresholding techniques, it was found that the original signals were getting filtered out along with the noise. The wavelet transformation technique is found to be more appropriate to analyse AE signals under such situations. The wavelet transformation technique is used to de-noise the AE data, and the de-noised signal is classified to identify a signature based on the type of phenomenon. Defence Science Journal, 2008, 58(4), pp. 559-564, DOI: http://dx.doi.org/10.14429/dsj.58.1677
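
    A minimal wavelet de-noising sketch in the spirit of the approach described above is shown below, using the PyWavelets package on a synthetic burst-plus-noise signal (not real acoustic emission data); the wavelet, decomposition level and threshold rule are illustrative choices.

```python
# Wavelet de-noising sketch on a synthetic AE-like burst buried in noise.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2048)
burst = np.exp(-((t - 0.5) ** 2) / 1e-4) * np.sin(2 * np.pi * 150 * t)  # clean burst
signal = burst + 0.3 * rng.normal(size=t.size)                          # add noise

# Decompose, soft-threshold the detail coefficients, and reconstruct.
coeffs = pywt.wavedec(signal, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise estimate from finest level
thresh = sigma * np.sqrt(2 * np.log(signal.size))       # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[: t.size]

# Compare residual error before and after de-noising.
print(float(np.abs(signal - burst).std()), float(np.abs(denoised - burst).std()))
```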

  17. An ASIC Low Power Primer Analysis, Techniques and Specification

    CERN Document Server

    Chadha, Rakesh

    2013-01-01

    This book provides an invaluable primer on the techniques utilized in the design of low power digital semiconductor devices. Readers will benefit from the hands-on approach, which starts from the ground up, explaining with basic examples what power is, how it is measured and how it impacts on the design process of application-specific integrated circuits (ASICs). The authors use both the Unified Power Format (UPF) and Common Power Format (CPF) to describe in detail the power intent for an ASIC and then guide readers through a variety of architectural and implementation techniques that will help meet the power intent. From analyzing system power consumption, to techniques that can be employed in a low power design, to a detailed description of two alternate standards for capturing the power directives at various phases of the design, this book is filled with information that will give ASIC designers a competitive edge in low-power design. Starts from the ground up and explains what power is, how it is measur...

  18. A COMPARISON OF SOME STATISTICAL TECHNIQUES FOR ROAD ACCIDENT ANALYSIS

    NARCIS (Netherlands)

    OPPE, S INST ROAD SAFETY RES, SWOV

    1992-01-01

    At the TRRL/SWOV Workshop on Accident Analysis Methodology, held in Amsterdam in 1988, the need to establish a methodology for the analysis of road accidents was firmly stated by all participants. Data from different countries cannot be compared because there is no agreement on research methodology,

  19. Twitter Sentiment Analysis of Movie Reviews using Machine Learning Techniques.

    Directory of Open Access Journals (Sweden)

    Akshay Amolik

    2015-12-01

    Full Text Available Sentiment analysis is concerned with the analysis of emotions and opinions expressed in text, and is also referred to as opinion mining. Sentiment analysis finds and justifies the sentiment of a person with respect to a given source of content. Social media contain a huge amount of sentiment data in the form of tweets, blogs, status updates, posts, etc. Sentiment analysis of this largely generated data is very useful for expressing the opinion of the mass. Twitter sentiment analysis is tricky compared to broad sentiment analysis because of slang words, misspellings and repeated characters. We know that the maximum length of each tweet on Twitter is 140 characters, so it is very important to identify the correct sentiment of each word. In this project we propose a highly accurate model of sentiment analysis of tweets with respect to the latest reviews of upcoming Bollywood or Hollywood movies. With the help of a feature vector and classifiers such as support vector machines and Naïve Bayes, these tweets are classified as positive, negative or neutral to give the sentiment of each tweet.
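
    A small illustration of the kind of classifier pipeline named in the abstract (a feature vector fed to Naïve Bayes or a support vector machine) is given below with scikit-learn; the handful of labelled tweets is made up purely for demonstration and does not reflect the authors' dataset or feature design.

```python
# Illustrative tweet-polarity classification with TF-IDF features and the two
# model families mentioned in the abstract; the data are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

tweets = [
    "loved the movie, absolutely brilliant",
    "what a waste of time, terrible acting",
    "stunning visuals and a great soundtrack",
    "boring plot, i almost fell asleep",
]
labels = ["positive", "negative", "positive", "negative"]

for clf in (MultinomialNB(), LinearSVC()):
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    model.fit(tweets, labels)
    print(type(clf).__name__, model.predict(["the soundtrack was brilliant"]))
```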

  20. Facilitating the analysis of immunological data with visual analytic techniques.

    Science.gov (United States)

    Shih, David C; Ho, Kevin C; Melnick, Kyle M; Rensink, Ronald A; Kollmann, Tobias R; Fortuno, Edgardo S

    2011-01-02

    Visual analytics (VA) has emerged as a new way to analyze large datasets through interactive visual display. We demonstrated the utility and the flexibility of a VA approach in the analysis of biological datasets. Examples of these datasets in immunology include flow cytometry, Luminex data, and genotyping (e.g., single nucleotide polymorphism) data. Contrary to the traditional information visualization approach, VA restores analytical power to the hands of the analyst by allowing the analyst to engage in a real-time data exploration process. We selected the VA software called Tableau after evaluating several VA tools. Two types of analysis tasks, analysis within and between datasets, were demonstrated in the video presentation using an approach called paired analysis. Paired analysis, as defined in VA, is an analysis approach in which a VA tool expert works side-by-side with a domain expert during the analysis. The domain expert is the one who understands the significance of the data and asks the questions that the collected data might address. The tool expert then creates visualizations to help find patterns in the data that might answer these questions. The short lag-time between hypothesis generation and the rapid visual display of the data is the main advantage of a VA approach.

  1. A Survey of Techniques for Security Architecture Analysis

    Science.gov (United States)

    2003-05-01

    Techniques referenced in the survey include failure modes and effects analysis (FMEA), failure propagation graphs (FPG), fault tree analysis (FTA), and hazard and operability (HAZOP) studies, with reference to the Information Assurance Technical Framework (IATF). The architecture view represents logical places within an information system where people can perform their work by means of software acting on their behalf, and describes the resources used to support the DIE, including, for example, hardware, software, communication networks, applications and qualified staff.

  2. Improvements in analysis techniques for segmented mirror arrays

    Science.gov (United States)

    Michels, Gregory J.; Genberg, Victor L.; Bisson, Gary R.

    2016-08-01

    The employment of actively controlled segmented mirror architectures has become increasingly common in the development of current astronomical telescopes. Optomechanical analysis of such hardware presents unique issues compared to that of monolithic mirror designs. The work presented here is a review of current capabilities and improvements in the methodology of the analysis of mechanically induced surface deformation of such systems. The recent improvements include the capability to differentiate surface deformation at the array and segment level. This differentiation, which allows surface deformation analysis at the individual segment level, offers useful insight into the mechanical behavior of the segments that is unavailable from analysis solely at the parent array level. In addition, the capability to characterize the full displacement vector deformation of collections of points allows analysis of mechanical disturbance predictions of assembly interfaces relative to other assembly interfaces. This capability, called racking analysis, allows engineers to develop designs for segment-to-segment phasing performance in assembly integration, 0g release, and thermal stability of operation. The performance predicted by racking has the advantage of being comparable to the measurements used in assembly of hardware. Approaches to all of the above issues are presented and demonstrated by example with SigFit, a commercially available tool integrating mechanical analysis with optical analysis.

  3. A dynamic factor model for the analysis of multivariate time series

    NARCIS (Netherlands)

    Molenaar, P.C.M.

    1985-01-01

    Describes the new statistical technique of dynamic factor analysis (DFA), which accounts for the entire lagged covariance function of an arbitrary 2nd-order stationary time series. DFA is shown to be applicable to a relatively short stretch of observations and is therefore considered worthwhile for
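
    As a hedged illustration of fitting a dynamic factor model to a short multivariate series, the sketch below simulates four series driven by one AR(1) latent factor and estimates a one-factor model with statsmodels; the data and settings are hypothetical and do not reproduce the article's estimation method.

```python
# Dynamic factor model sketch on simulated data (illustrative only).
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

rng = np.random.default_rng(0)
T = 120
factor = np.zeros(T)
for t in range(1, T):                        # AR(1) latent factor
    factor[t] = 0.7 * factor[t - 1] + rng.normal()
loadings = np.array([1.0, 0.8, 0.5, 0.3])    # assumed loadings for the simulation
Y = factor[:, None] * loadings + rng.normal(scale=0.5, size=(T, 4))
data = pd.DataFrame(Y, columns=["y1", "y2", "y3", "y4"])

model = DynamicFactor(data, k_factors=1, factor_order=1)
result = model.fit(disp=False)
print(result.summary())
```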

  4. Applications of Radar Interferometric Techniques to Assess Natural Hazards and their Controlling Factors

    Science.gov (United States)

    Sultan, M.; Becker, R.; Gebremichael, E.; Othman, A.; Emil, M.; Ahmed, M.; Elkadiri, R.; Pankratz, H. G.; Chouinard, K.

    2015-12-01

    Radar interferometric techniques including Persistent Scatterer (PS), Small BAseline Subset (SBAS), and two and three pass (differential interferometry) methods were applied to Synthetic Aperture Radar (SAR) datasets. These include the European Space Agency (ESA) ERS-1, ERS-2, Environmental satellite (Envisat), and Phased Array type L-band Synthetic Aperture Radar (PALSAR) to conduct the following: (1) map the spatial distribution of land deformation associated with a wide range of geologic settings, (2) quantify the rates of the observed land deformation, and (3) identify the factors controlling the observed deformation. The research topics/areas include: (1) subsidence associated with sediment compaction in a Delta setting (Nile Delta, Egypt), (2) deformation in a rifting setting (Red Sea rifting along the Red Sea coastal zone and proximal basement outcrops in Egypt and Saudi Arabia), (3) deformation associated with salt dome intrusion and the dissolution of sabkha deposits (Jazan area in Saudi Arabia), (4) mass transport associated with debris flows (Jazan area in Saudi Arabia), and (5) deformation preceding, contemporaneous with, or following large earthquakes (in Nepal; magnitude: 7.8; date: April, 25, 2015) and medium earthquakes (in Harrat Lunayyir volcanic field, central Saudi Arabia; magnitude: 5.7; date: May 19, 2009). The identification of the factor(s) controlling the observed deformation was attained through spatial correlation of extracted radar velocities with relevant temporal and static ground based and remotely sensed geological and cultural data sets (e.g., lithology, structure, precipitation, land use, and earthquake location, magnitude, and focal mechanism) in a Geographical Information System (GIS) environment.

  5. A Novel Technique of Measuring SOA Differential Carrier Lifetime and α-Factor Using SOA Optical Modulation Response

    Institute of Scientific and Technical Information of China (English)

    Ki-Hyuk Lee; Woo-Young Choi

    2003-01-01

    We demonstrate a new technique of measuring differential carrier lifetime and linewidth enhancement factor in a semiconductor optical amplifier. In our method, the optical responses and fiber transfer functions of a self-gain modulated SOA are measured and, from these, values of carrier lifetimes and linewidth enhancement factors are determined for various SOA input optical powers.

  6. FACTOR ANALYSIS OF THE ELKINS HYPNOTIZABILITY SCALE

    Science.gov (United States)

    Elkins, Gary; Johnson, Aimee K.; Johnson, Alisa J.; Sliwinski, Jim

    2015-01-01

    Assessment of hypnotizability can provide important information for hypnosis research and practice. The Elkins Hypnotizability Scale (EHS) consists of 12 items and was developed to provide a time-efficient measure for use in both clinical and laboratory settings. The EHS has been shown to be a reliable measure with support for convergent validity with the Stanford Hypnotic Susceptibility Scale, Form C (r = .821, p < .001). The current study examined the factor structure of the EHS, which was administered to 252 adults (51.3% male; 48.7% female). Average time of administration was 25.8 minutes. Four factors selected on the basis of the best theoretical fit accounted for 63.37% of the variance. The results of this study provide an initial factor structure for the EHS. PMID:25978085

  7. ANALYSIS OF EXTERNAL FACTORS AFFECTING THE PRICING

    Directory of Open Access Journals (Sweden)

    Irina A. Kiseleva

    2013-01-01

    Full Text Available The article considers the external factors influencing the formation of tariffs for commercial services. The external environment is known to be very diverse and changeable. Currently, pricing has become one of the key processes of the strategic development of a company. Pricing in the service sector, in turn, is highly susceptible to changes in the external environment; its components directly or indirectly affect the services market, changing its established economic processes. As a rule, firms providing services cannot influence changes in external factors. However, the service market is very flexible, which enables businesses to reshape their pricing strategy and adapt it to the new environment.

  8. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  9. Cross-impact analysis experimentation using two techniques to ...

    African Journals Online (AJOL)

    This paper describes cross-impact analysis experimentation in which a Monte Carlo simulation [4] is used to accomplish the computational task.

  10. Analysis of the changes in keratoplasty indications and preferred techniques.

    Directory of Open Access Journals (Sweden)

    Stefan J Lang

    Full Text Available Recently, novel techniques introduced to the field of corneal surgery, e.g. Descemet membrane endothelial keratoplasty (DMEK) and corneal crosslinking, have extended the therapeutic options. Additionally, contact lens fitting has developed new alternatives. We herein investigated whether these techniques have affected the volume and spectrum of indications for keratoplasties in both a center more specialized in treating Fuchs' dystrophy (center 1) and a second center more specialized in treating keratoconus (center 2). We retrospectively reviewed the waiting lists for indication, transplantation technique and the patients' travel distances to the hospital at both centers. We reviewed a total of 3778 procedures. Fuchs' dystrophy increased at center 1 from 17% (42) to 44% (150), and from 13% (27) to 23% (62) at center 2. In center 1, DMEK increased from zero percent in 2010 to 51% in 2013. In center 2, DMEK was not performed until 2013. The percentage of patients with keratoconus slightly decreased, from 15% (36) in 2009 vs. 12% (40) in 2013 in center 1. The respective percentages in center 2 were 28% (57) and 19% (51). In both centers, the patients' travel distances increased. The results from center 1 suggest that DMEK might increase the total number of keratoplasties. The increase in travel distance suggests that this cannot be fully attributed to recruiting the less advanced patients from the hospital proximity; the increase is rather due to more referrals from other regions. The decrease of keratoconus patients in both centers is surprising and may be attributed to optimized contact lens fitting or even to the effect of the corneal crosslinking procedure.

  11. Improved Tandem Measurement Techniques for Aerosol Particle Analysis

    Science.gov (United States)

    Rawat, Vivek Kumar

    Non-spherical, chemically inhomogeneous (complex) nanoparticles are encountered in a number of natural and engineered environments, including combustion systems (which produces highly non-spherical aggregates), reactors used in gas-phase materials synthesis of doped or multicomponent materials, and in ambient air. These nanoparticles are often highly diverse in size, composition and shape, and hence require determination of property distribution functions for accurate characterization. This thesis focuses on development of tandem mobility-mass measurement techniques coupled with appropriate data inversion routines to facilitate measurement of two dimensional size-mass distribution functions while correcting for the non-idealities of the instruments. Chapter 1 provides the detailed background and motivation for the studies performed in this thesis. In chapter 2, the development of an inversion routine is described which is employed to determine two dimensional size-mass distribution functions from Differential Mobility Analyzer-Aerosol Particle Mass analyzer tandem measurements. Chapter 3 demonstrates the application of the two dimensional distribution function to compute cumulative mass distribution function and also evaluates the validity of this technique by comparing the calculated total mass concentrations to measured values for a variety of aerosols. In Chapter 4, this tandem measurement technique with the inversion routine is employed to analyze colloidal suspensions. Chapter 5 focuses on application of a transverse modulation ion mobility spectrometer coupled with a mass spectrometer to study the effect of vapor dopants on the mobility shifts of sub 2 nm peptide ion clusters. These mobility shifts are then compared to models based on vapor uptake theories. Finally, in Chapter 6, a conclusion of all the studies performed in this thesis is provided and future avenues of research are discussed.

  12. Computational Intelligence Techniques for Electro-Physiological Data Analysis

    OpenAIRE

    Riera Sardà, Alexandre

    2012-01-01

    This work contains the efforts I have made in the last years in the field of electrophysiological data analysis. Most of the work has been done at Starlab Barcelona S.L. and part of it at the Neurodynamics Laboratory of the Department of Psychiatry and Clinical Psychobiology of the University of Barcelona. The main work deals with the analysis of electroencephalography (EEG) signals, although other signals, such as electrocardiography (ECG), electrooculography (EOG) and electromyography (EMG) ...

  13. Automated Techniques for Rapid Analysis of Momentum Exchange Devices

    Science.gov (United States)

    2013-12-01

    At this point, it is necessary to introduce the concept of contiguousness. In this thesis, a state-space analysis representation is used, and the concept of contiguousness was established to ensure that the results of the analysis would allow the CMGs to reach every state in the defined space. The analysis also considers the forces at the attachment points of the RWs and CMGs throughout a spacecraft maneuver. Current pedagogy on this topic focuses on the transfer of ...

  14. Combined Technique Analysis of Punic Make-up Materials

    Energy Technology Data Exchange (ETDEWEB)

    Huq,A.; Stephens, P.; Ayed, N.; Binous, H.; Burgio, L.; Clark, R.; Pantos, E.

    2006-01-01

    Ten archaeological Punic make-up samples from Tunisia dating from the 4th to the 1st centuries BC were analyzed by several techniques including Raman microscopy and synchrotron X-ray diffraction in order to determine their compositions. Eight samples were red and found to contain either quartz and cinnabar or quartz and haematite. The remaining two samples were pink, the main diffracting phase in them being quartz. Examination of these two samples by optical microscopy and by illumination under a UV lamp suggest that the pink dye is madder. These findings reveal the identities of the materials used by Carthaginians for cosmetic and/or ritual make-up purposes.

  15. Factor Analysis for Spectral Reconnaissance and Situational Understanding

    Science.gov (United States)

    2016-07-11

    Final Report: Factor Analysis for Spectral Reconnaissance and Situational Understanding. The work addresses NP-hard design problems by associating them with corresponding estimation problems. The Army has a critical need for enhancing situational understanding for dismounted soldiers and rapidly deployed tactical ...

  16. A Factor Analysis of the BSRI and the PAQ.

    Science.gov (United States)

    Edwards, Teresa A.; And Others

    Factor analysis of the Bem Sex Role Inventory (BSRI) and the Personality Attributes Questionnaire (PAQ) was undertaken to study the independence of the masculine and feminine scales within each instrument. Both instruments were administered to undergraduate education majors. Analysis of primary first and second order factors of the BSRI indicated…

  17. Exploratory Factor Analysis of African Self-Consciousness Scale Scores

    Science.gov (United States)

    Bhagwat, Ranjit; Kelly, Shalonda; Lambert, Michael C.

    2012-01-01

    This study replicates and extends prior studies of the dimensionality, convergent, and external validity of African Self-Consciousness Scale scores with appropriate exploratory factor analysis methods and a large gender balanced sample (N = 348). Viable one- and two-factor solutions were cross-validated. Both first factors overlapped significantly…

  18. Multigroup Confirmatory Factor Analysis: Locating the Invariant Referent Sets

    Science.gov (United States)

    French, Brian F.; Finch, W. Holmes

    2008-01-01

    Multigroup confirmatory factor analysis (MCFA) is a popular method for the examination of measurement invariance and specifically, factor invariance. Recent research has begun to focus on using MCFA to detect invariance for test items. MCFA requires certain parameters (e.g., factor loadings) to be constrained for model identification, which are…

  19. Development of human behavior analysis techniques. Analysis of stress effects on the cognitive operating work

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Chul Jung; Park, Jae Hee [Korea Research Institute of Standards and Science, Taejon (Korea, Republic of)

    1995-08-01

    PSFs (Performance Shaping Factors) and performance measures were selected to evaluate the operating tasks of a nuclear power plant. The effects of PSFs on performance were studied on the basis of LOCA (Loss of Coolant Accident) and SGTR (Steam Generator Tube Rupture) task analyses. The knowledge of the relationship between PSFs and performance measures was represented in IF-THEN rule form. The result will be applied to the development of the cognitive operational simulator. (author). 64 refs.

  20. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict drugs' in vivo response, as several factors such as the meal content, the gastric emptying and possible interactions between food and drug formulations can affect a drug's pharmacokinetics. Good understanding of the effect of the in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after the administration of the standard high-fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed state media can be quite challenging, as most analytical protocols currently employed are time consuming and labour intensive. In this review, an overview of the in vivo gastric conditions and the biorelevant media used for their in vitro simulation is given. Furthermore, an analysis of the physicochemical properties of the drugs and the formulations related to food effect is given. In terms of drug analysis, the protocols currently used for fed state media sample treatment and analysis are described, and the analytical challenges and the need for more efficient and time-saving techniques for a broad spectrum of compounds are discussed.

  1. Factor Analysis of People Rather than Variables: Q and Other Two-Mode Factor Analytic Models.

    Science.gov (United States)

    Frederick, Brigitte N.

    Factor analysis attempts to study how different objects group together to form factors with the purposes of: (1) reducing the number of factorable entities (e.g., variables) with which the researcher needs to deal; (2) searching data for qualitative and quantitative differences; and (3) testing hypotheses (R. Gorsuch, 1983). While most factor…

  2. Chiral analysis of baryon form factors

    Energy Technology Data Exchange (ETDEWEB)

    Gail, T.A.

    2007-11-08

    This work presents an extensive theoretical investigation of the structure of the nucleon within the standard model of elementary particle physics. In particular, the long range contributions to a number of form factors parametrizing the interactions of the nucleon with an electromagnetic probe are calculated. The theoretical framework for those calculations is chiral perturbation theory, the exact low energy limit of Quantum Chromodynamics, which describes such long range contributions in terms of a pion cloud. In this theory, a nonrelativistic leading one loop order calculation of the form factors parametrizing the vector transition of a nucleon to its lowest lying resonance, the Δ, a covariant calculation of the isovector and isoscalar vector form factors of the nucleon at next to leading one loop order, and a covariant calculation of the isoscalar and isovector generalized vector form factors of the nucleon at leading one loop order are performed. In order to perform consistent loop calculations in the covariant formulation of chiral perturbation theory an appropriate renormalization scheme is defined in this work. All theoretical predictions are compared to phenomenology and results from lattice QCD simulations. These comparisons allow for a determination of the low energy constants of the theory. Furthermore, the possibility of chiral extrapolation, i.e. the extrapolation of lattice data from simulations at large pion masses down to the small physical pion mass, is studied in detail. Statistical as well as systematic uncertainties are estimated for all results throughout this work. (orig.)

  3. Mortality risk factors in critical post-surgical patients treated using continuous renal replacement techniques.

    Science.gov (United States)

    Estupiñán-Jiménez, J C; Castro-Rincón, J M; González, O; Lora, D; López, E; Pérez-Cerdà, F

    2015-04-01

    To determine the influence of demographic, medical, and surgical variables on 30-day mortality in patients who need continuous renal replacement therapy (CRRT). A retrospective follow-up study was conducted using the data of 112 patients admitted to the postoperative intensive care unit between August 2006 and August 2011 who required CRRT and were followed up for 30 days. The following information was collected: age, gender, history of HBP, DM, cardiovascular disease, and CKD, urgent surgery, surgical speciality, organ dysfunction according to the SOFA scale, the number of organs with dysfunction, use of mechanical ventilation, diagnosis and origin of sepsis, type of CRRT, and 30-day mortality. General linear models were used to estimate the strength of association (relative risk [RR] and 95% confidence interval [CI]) between variables and 30-day mortality. In the univariate analysis, the following variables were identified as risk factors for 30-day mortality: age (RR 1.04; 95% CI 1.01-1.06; P=.0005) and history of cardiovascular disease (RR 1.57; 95% CI 1.02-2.41; P=.039). Among the variables included in the multivariable analysis (age, history of cardiovascular disease, sepsis, and number of organs with dysfunction), only age was identified as an independent risk factor for 30-day mortality (RR 1.03; 95% CI 1.00-1.05; P=.007). Thirty-day mortality in postoperative, critically ill patients who require CRRT is high (41.07%). Age has been identified as an independent risk factor, with renal failure as the most common indication for the use of these therapies. Copyright © 2013 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.
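
    The record reports relative risks with 95% confidence intervals estimated from general linear models. As a simpler, hypothetical illustration of a univariate relative risk with a Wald-type 95% CI, the sketch below uses a made-up 2x2 table; the counts are not the study's data.

    import numpy as np

    # Hypothetical 2x2 table (NOT the study's data): exposure = cardiovascular disease,
    # outcome = 30-day mortality.
    a, b = 30, 20   # exposed: died, survived
    c, d = 16, 46   # unexposed: died, survived

    risk_exposed = a / (a + b)
    risk_unexposed = c / (c + d)
    rr = risk_exposed / risk_unexposed

    # Wald 95% confidence interval on the log-RR scale
    se_log_rr = np.sqrt(1/a - 1/(a + b) + 1/c - 1/(c + d))
    lo, hi = np.exp(np.log(rr) + np.array([-1.96, 1.96]) * se_log_rr)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")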

  4. A simple high-sensitivity technique for purity analysis of xenon gas

    CERN Document Server

    Leonard, D S; Hall, C; Kaufman, L; Langford, T; Slutsky, S; Yen, Y R

    2010-01-01

    We report on the development and performance of a high-sensitivity purity-analysis technique for gaseous xenon. The gas is sampled at macroscopic pressure from the system of interest using a UHV leak valve. The xenon present in the sample is removed with a liquid-nitrogen cold trap, and the remaining impurities are observed with a standard vacuum mass-spectroscopy device. Using calibrated samples of xenon gas spiked with known levels of impurities, we find that the minimum detectable levels of N2, O2, and methane are 1 ppb, 160 ppt, and 60 ppt, respectively. This represents an improvement of about a factor of 10,000 compared to measurements performed without a cold trap.

  5. Application of sensitivity-analysis techniques to the calculation of topological quantities

    Science.gov (United States)

    Gilchrist, Stuart

    2017-08-01

    Magnetic reconnection in the corona occurs preferentially at sites where the magnetic connectivity is either discontinuous or has a large spatial gradient. Hence there is a general interest in computing quantities (like the squashing factor) that characterize the gradient in the field-line mapping function. Here we present an algorithm for calculating certain (quasi)topological quantities using mathematical techniques from the field of "sensitivity analysis". The method is based on the calculation of a three-dimensional field-line mapping Jacobian from which all the present topological quantities of interest can be derived. We will present the algorithm and the details of a publicly available set of libraries that implement the algorithm.
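
    The record does not give formulas, but the squashing factor it mentions is conventionally defined from the 2x2 Jacobian of the field-line mapping between boundaries. The sketch below estimates that Jacobian by central finite differences for a purely hypothetical mapping function; it illustrates the definition only and is not the presented algorithm or library.

    import numpy as np

    def squashing_factor(mapping, x, y, h=1e-4):
        """Q from the 2x2 Jacobian of a field-line mapping (x, y) -> (X, Y),
        estimated here by central finite differences (a sketch, not the paper's code)."""
        Xx = (mapping(x + h, y)[0] - mapping(x - h, y)[0]) / (2 * h)
        Xy = (mapping(x, y + h)[0] - mapping(x, y - h)[0]) / (2 * h)
        Yx = (mapping(x + h, y)[1] - mapping(x - h, y)[1]) / (2 * h)
        Yy = (mapping(x, y + h)[1] - mapping(x, y - h)[1]) / (2 * h)
        norm2 = Xx**2 + Xy**2 + Yx**2 + Yy**2
        det = Xx * Yy - Xy * Yx
        return norm2 / abs(det)

    # Hypothetical smooth mapping standing in for tracing field lines between boundaries.
    demo_map = lambda x, y: (x + 0.3 * np.tanh(5 * y), y + 0.1 * x)
    print(squashing_factor(demo_map, 0.2, 0.0))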

  6. Improved analysis techniques for cylindrical and spherical double probes

    Energy Technology Data Exchange (ETDEWEB)

    Beal, Brian; Brown, Daniel; Bromaghim, Daron [Air Force Research Laboratory, 1 Ara Rd., Edwards Air Force Base, California 93524 (United States); Johnson, Lee [Jet Propulsion Laboratory, California Institute of Technology, 4800 Oak Grove Dr., Pasadena, California 91109 (United States); Blakely, Joseph [ERC Inc., 1 Ara Rd., Edwards Air Force Base, California 93524 (United States)

    2012-07-15

    A versatile double Langmuir probe technique has been developed by incorporating analytical fits to Laframboise's numerical results for ion current collection by biased electrodes of various sizes relative to the local electron Debye length. Application of these fits to the double probe circuit has produced a set of coupled equations that express the potential of each electrode relative to the plasma potential as well as the resulting probe current as a function of applied probe voltage. These equations can be readily solved via standard numerical techniques in order to determine electron temperature and plasma density from probe current and voltage measurements. Because this method self-consistently accounts for the effects of sheath expansion, it can be readily applied to plasmas with a wide range of densities and low ion temperature (T_i/T_e ≪ 1) without requiring probe dimensions to be asymptotically large or small with respect to the electron Debye length. The presented approach has been successfully applied to experimental measurements obtained in the plume of a low-power Hall thruster, which produced a quasineutral, flowing xenon plasma during operation at 200 W on xenon. The measured plasma densities and electron temperatures were in the range of 1×10^12 to 1×10^17 m^-3 and 0.5-5.0 eV, respectively. The estimated measurement uncertainty is +6%/-34% in density and ±30% in electron temperature.

  7. Techniques for hazard analysis and their use at CERN.

    Science.gov (United States)

    Nuttall, C; Schönbacher, H

    2001-01-01

    CERN, The European Organisation for Nuclear Research is situated near Geneva and has its accelerators and experimental facilities astride the Swiss and French frontiers, attracting physicists from all over the world to this unique laboratory. The main accelerator is situated in a 27 km underground ring and the experiments take place in huge underground caverns in order to detect the fragments resulting from the collision of subatomic particles at speeds approaching that of light. These detectors contain many hundreds of tons of flammable materials, mainly plastics in cables and structural components, flammable gases in the detectors themselves, and cryogenic fluids such as helium and argon. The experiments consume high amounts of electrical power; thus the dangers involved have necessitated the use of analytical techniques to identify the hazards and quantify the risks to personnel and the infrastructure. The techniques described in the paper have been developed in the process industries, where they have proved to be of great value. They have been successfully applied to CERN industrial and experimental installations and, in some cases, have been instrumental in changing the philosophy of the experimentalists and their detectors.

  8. Comparative Analysis of Data Mining Techniques for Malaysian Rainfall Prediction

    Directory of Open Access Journals (Sweden)

    Suhaila Zainudin

    2016-12-01

    Full Text Available Climate change prediction analyses the behaviour of weather over a specific time. Rainfall forecasting is a climate change task where specific features such as humidity and wind are used to predict rainfall in specific locations. Rainfall prediction can be achieved using classification tasks under data mining. Different techniques lead to different performances depending on the rainfall data representation, including representations of long-term (monthly) patterns and short-term (daily) patterns. Selecting an appropriate technique for a specific duration of rainfall is a challenging task. This study analyses multiple classifiers such as Naïve Bayes, Support Vector Machine, Decision Tree, Neural Network and Random Forest for rainfall prediction using Malaysian data. The dataset has been collected from multiple stations in Selangor, Malaysia. Several pre-processing tasks have been applied in order to resolve missing values and eliminate noise. The experimental results show that with small training data (10% of 1581 instances) Random Forest correctly classified 1043 instances. This is the strength of an ensemble of trees in Random Forest, where a group of classifiers can jointly beat a single classifier.
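
    The Selangor station data and the study's pre-processing are not reproduced here. As a minimal sketch of the kind of Random Forest classification described, the following uses scikit-learn on a synthetic feature table; the features, labels and 10% training split mirror the abstract only loosely and are assumptions.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for the station data (features such as humidity and wind).
    rng = np.random.default_rng(1)
    X = rng.normal(size=(1581, 4))                       # hypothetical daily features
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1581) > 0).astype(int)  # rain / no rain

    # Small training fraction (10%), as in the reported experiment.
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.10, random_state=0)
    clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
    print("correctly classified:", (clf.predict(X_te) == y_te).sum(), "of", len(y_te))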

  9. Microscopy Techniques for Analysis of Nickel Metal Hydride Batteries Constituents.

    Science.gov (United States)

    Carpenter, Graham J C; Wronski, Zbigniew

    2015-12-01

    With the need for improvements in the performance of rechargeable batteries has come the necessity to better characterize cell electrodes and their component materials. Electron microscopy has been shown to reveal many important features of microstructure that are becoming increasingly important for understanding the behavior of the components during the many charge/discharge cycles required in modern applications. The aim of this paper is to present an overview of how the full suite of techniques available using transmission electron microscopy (TEM) and scanning transmission electron microscopy was applied to the case of materials for the positive electrode in nickel metal hydride rechargeable battery electrodes. Embedding and sectioning of battery-grade powders with an ultramicrotome was used to produce specimens that could be readily characterized by TEM. Complete electrodes were embedded after drying, and also after dehydration from the original wet state, for examination by optical microscopy and using focused ion beam techniques. Results of these studies are summarized to illustrate the significance of the microstructural information obtained.

  10. Smart Technique for Induction Motors Diagnosis by Monitoring the Power Factor Using Only the Measured Current

    Science.gov (United States)

    Shnibha, R. A.; Albarabar, A. S.

    2012-05-01

    This paper is concerned with accurate, early and reliable induction motor (IM) fault detection and diagnosis using an enhanced power parameter measurement technique. IM protection devices typically monitor the motor current and/or voltage to provide motor protection from, e.g., current overload and over/under voltage. One of the interesting parameters to monitor is the operating power factor (PF) of the IM, which provides better under-load protection compared to motor current based approaches. The PF of the motor is determined by the level of the current and voltage that are drawn, and offers non-intrusive monitoring. Traditionally, PF estimation would require both voltage and current measurements to apply the displacement method. This paper uses a method of determining the operating PF of the IM using only the measured current and the manufacturer data that are typically available from the nameplate and/or datasheet for IM monitoring. The novelty of this work lies in detecting very low phase imbalance related faults and misalignment. Much of the previous work has dealt with detecting phase imbalance faults at higher degrees of severity, i.e. voltage drops of 10% or more. The technique was tested by empirical measurements on a test rig comprising a 1.1 kW variable speed three-phase induction motor with varying output load (no load, 25%, 50%, 75% and 100% load). One common fault was introduced: imbalance in one phase as the electrical fault. The experimental results demonstrate that the PF can be successfully applied for IM fault diagnosis, and the present study shows that fault severity detection using PF is promising. The proposed method offers a potentially reliable, non-intrusive, and inexpensive condition monitoring (CM) tool which can be implemented with real-time monitoring systems.

  11. Analytic standard errors for exploratory process factor analysis.

    Science.gov (United States)

    Zhang, Guangjian; Browne, Michael W; Ong, Anthony D; Chow, Sy Miin

    2014-07-01

    Exploratory process factor analysis (EPFA) is a data-driven latent variable model for multivariate time series. This article presents analytic standard errors for EPFA. Unlike standard errors for exploratory factor analysis with independent data, the analytic standard errors for EPFA take into account the time dependency in time series data. In addition, factor rotation is treated as the imposition of equality constraints on model parameters. Properties of the analytic standard errors are demonstrated using empirical and simulated data.

  12. A replication of a factor analysis of motivations for trapping

    Science.gov (United States)

    Schroeder, Susan; Fulton, David C.

    2015-01-01

    Using a 2013 sample of Minnesota trappers, we employed confirmatory factor analysis to replicate an exploratory factor analysis of trapping motivations conducted by Daigle, Muth, Zwick, and Glass (1998). We employed the same 25 items used by Daigle et al. and tested the same five-factor structure using a recent sample of Minnesota trappers. We also compared motivations in our sample to those reported by Daigle et al.

  13. Modern Theory of Gratings Resonant Scattering: Analysis Techniques and Phenomena

    CERN Document Server

    Sirenko, Yuriy K

    2010-01-01

    Diffraction gratings are one of the most popular objects of analysis in electromagnetic theory. The requirements of applied optics and microwave engineering lead to many new problems and challenges for the theory of diffraction gratings, which force us to search for new methods and tools for their resolution. In Modern Theory of Gratings, the authors present results of the electromagnetic theory of diffraction gratings that will constitute the base of further development of this theory, which meet the challenges provided by modern requirements of fundamental and applied science. This volume covers: spectral theory of gratings (Chapter 1) giving reliable grounds for physical analysis of space-frequency and space-time transformations of the electromagnetic field in open periodic resonators and waveguides; authentic analytic regularization procedures (Chapter 2) that, in contradistinction to the traditional frequency-domain approaches, fit perfectly for the analysis of resonant wave scattering processes; paramet...

  14. Techniques of EMG signal analysis: detection, processing, classification and applications

    Science.gov (United States)

    Hussain, M.S.; Mohd-Yasin, F.

    2006-01-01

    Electromyography (EMG) signals can be used for clinical/biomedical applications, Evolvable Hardware Chip (EHW) development, and modern human-computer interaction. EMG signals acquired from muscles require advanced methods for detection, decomposition, processing, and classification. The purpose of this paper is to illustrate the various methodologies and algorithms for EMG signal analysis to provide efficient and effective ways of understanding the signal and its nature. We further point out some of the hardware implementations using EMG, focusing on applications related to prosthetic hand control, grasp recognition, and human-computer interaction. A comparison study is also given to show the performance of various EMG signal analysis methods. This paper provides researchers a good understanding of EMG signals and their analysis procedures. This knowledge will help them develop more powerful, flexible, and efficient applications. PMID:16799694
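
    The review itself surveys many detection and classification methods. As a minimal, illustrative sketch (not taken from the paper), the following computes three common time-domain EMG features over a single analysis window of synthetic data; the threshold and window length are arbitrary assumptions.

    import numpy as np

    def emg_time_features(window, zc_threshold=0.01):
        """Common time-domain EMG features for one analysis window (illustrative only)."""
        mav = np.mean(np.abs(window))                         # mean absolute value
        rms = np.sqrt(np.mean(window**2))                     # root mean square
        signs = np.sign(window)
        crossings = np.sum((signs[:-1] * signs[1:] < 0) &
                           (np.abs(np.diff(window)) > zc_threshold))  # zero crossings
        return mav, rms, crossings

    rng = np.random.default_rng(2)
    synthetic_emg = rng.normal(scale=0.1, size=1024)          # stand-in for a recorded burst
    print(emg_time_features(synthetic_emg))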

  15. Analysis of Self-Excited Combustion Instabilities Using Decomposition Techniques

    Science.gov (United States)

    2016-07-05

    SVD), while for DMD, the data are reduced using the Arnoldi algorithm. POD decomposes data based on optimality to obtain a set of best representations... analysis is performed with the same domains that were used for the POD analysis. The DMD frequency spectra of pressure and heat-release fluctuations are... temporal data and m columns of spatial data, the POD matrix will be of size N × m. Once we obtain the POD matrix A, the SVD of A is A = UΣV^T (A4), where U is an
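
    The snippet above describes assembling an N × m snapshot matrix and taking its SVD. A minimal numpy sketch of POD by SVD on a synthetic snapshot matrix is shown below; the flow field is invented, and the DMD/Arnoldi reduction is not shown.

    import numpy as np

    # Synthetic snapshot matrix: N spatial points (rows) x m time snapshots (columns).
    N, m = 200, 64
    x = np.linspace(0, 2 * np.pi, N)[:, None]
    t = np.linspace(0, 1, m)[None, :]
    A = np.sin(x) * np.cos(20 * np.pi * t) + 0.3 * np.sin(3 * x) * np.sin(44 * np.pi * t)

    # POD via the thin SVD: columns of U are spatial modes, S gives modal energies,
    # rows of Vt carry the temporal coefficients.
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    energy = S**2 / np.sum(S**2)
    print("energy captured by first two modes:", energy[:2].sum())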

  16. Finite Element Modeling Techniques for Analysis of VIIP

    Science.gov (United States)

    Feola, Andrew J.; Raykin, J.; Gleason, R.; Mulugeta, Lealem; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.; Ethier, C. Ross

    2015-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build a FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP.

  17. Analysis of ultrasonic techniques for monitoring milk coagulation during cheesemaking

    Science.gov (United States)

    Budelli, E.; Pérez, N.; Lema, P.; Negreira, C.

    2012-12-01

    Experimental determination of time of flight and attenuation has been proposed in the literature as an alternative for monitoring the evolution of milk coagulation during cheese manufacturing. However, only laboratory-scale procedures have been described. In this work, the use of ultrasonic time of flight and attenuation to determine cutting time, and the feasibility of applying these methods at industrial scale, were analyzed. Limitations to implementing these techniques at industrial scale are shown experimentally. The main limitation of the use of time of flight is its strong dependence on temperature. Attenuation monitoring is affected by a thin layer of milk skin covering the transducer, which modifies the signal in a non-repetitive way. The results of this work can be used to develop alternative ultrasonic systems suitable for application in the dairy industry.

  18. Radio & Optical Interferometry: Basic Observing Techniques and Data Analysis

    CERN Document Server

    Monnier, John D

    2012-01-01

    Astronomers usually need the highest angular resolution possible, but the blurring effect of diffraction imposes a fundamental limit on the image quality from any single telescope. Interferometry allows light collected at widely-separated telescopes to be combined in order to synthesize an aperture much larger than an individual telescope thereby improving angular resolution by orders of magnitude. Radio and millimeter wave astronomers depend on interferometry to achieve image quality on par with conventional visible and infrared telescopes. Interferometers at visible and infrared wavelengths extend angular resolution below the milli-arcsecond level to open up unique research areas in imaging stellar surfaces and circumstellar environments. In this chapter the basic principles of interferometry are reviewed with an emphasis on the common features for radio and optical observing. While many techniques are common to interferometers of all wavelengths, crucial differences are identified that will help new practi...

  19. Metabolic Engineering: Techniques for analysis of targets for genetic manipulations

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1998-01-01

    Metabolic engineering has been defined as the purposeful modification of intermediary metabolism using recombinant DNA techniques. With this definition metabolic engineering includes: (1) inserting new pathways in microorganisms with the aim of producing novel metabolites, e.g., production ... of polyketides by Streptomyces; (2) production of heterologous peptides, e.g., production of human insulin, erythropoietin, and tPA; and (3) improvement of both new and existing processes, e.g., production of antibiotics and industrial enzymes. Metabolic engineering is a multidisciplinary approach, which involves ... Despite the prospect of obtaining major improvement through metabolic engineering, this approach is, however, not expected to completely replace the classical approach to strain improvement: random mutagenesis followed by screening. Identification of the optimal genetic changes for improvement ...

  20. The Analysis of a Phobic Child: Some Problems of Theory and Technique in Child Analysis.

    Science.gov (United States)

    Bornstein, Berta

    2014-01-01

    This paper attempts to clarify some theoretical and technical aspects of child analysis by correlating the course of treatment, the structure of the neurosis, and the technique employed in the case of a phobic boy who was in analysis over a period of three years. The case was chosen for presentation: (1) because of the discrepancy between the clinical simplicity of the symptom and the complicated ego structure behind it; (2) because of the unusual clearness with which the patient brought to the fore the variegated patterns of his libidinal demands; (3) because of the patient's attempts at transitory solutions, oscillations between perversions and symptoms, and processes of new symptom formation; (4) because the vicissitudes and stabilization of character traits could be clearly traced; (5) and finally, because of the rare opportunity to witness during treatment the change from grappling with reality by means of pathological mechanisms, to dealing with reality in a relatively conflict-free fashion.

  1. Analysis of factors affecting fattening of chickens

    OpenAIRE

    OBERMAJEROVÁ, Barbora

    2013-01-01

    Poultry meat belongs to the basic assortment of human nutrition. The meat of intensively fattened poultry is a source of easily digestible proteins, lipids, mineral substances and vitamins. The aim of this bachelor's thesis was to prepare a literature review focused on the intensity of growth, carcass yield, and the quality and composition of broiler chicken meat. The following describes the internal and external factors that affect them, i.e. genetic foundation, hybrid combination, s...

  2. Ionospheric Plasma Drift Analysis Technique Based On Ray Tracing

    Science.gov (United States)

    Ari, Gizem; Toker, Cenk

    2016-07-01

    Ionospheric drift measurements provide important information about the variability in the ionosphere, which can be used to quantify ionospheric disturbances caused by natural phenomena such as solar, geomagnetic, gravitational and seismic activities. One of the prominent ways of measuring drift relies on instrumentation-based measurements, e.g. using an ionosonde. The drift estimation of an ionosonde depends on measuring the Doppler shift of the received signal, where the main cause of the Doppler shift is the change in the length of the propagation path of the signal between the transmitter and the receiver. Unfortunately, ionosondes are expensive devices and their installation and maintenance require special care. Furthermore, the ionosonde network over the world, or even Europe, is not dense enough to obtain a global or continental drift map. In order to overcome the difficulties related to an ionosonde, we propose a technique to perform ionospheric drift estimation based on ray tracing. First, a two-dimensional TEC map is constructed by using the IONOLAB-MAP tool, which spatially interpolates the VTEC estimates obtained from the EUREF CORS network. Next, a three-dimensional electron density profile is generated by inputting the TEC estimates to the IRI-2015 model. Eventually, a close-to-real electron density profile is obtained in which ray tracing can be performed. These profiles can be constructed periodically with a period as low as 30 seconds. By processing two consecutive snapshots together and calculating the propagation paths, we estimate the drift over any coordinate of concern. We test our technique by comparing the results to the drift measurements taken at the DPS ionosonde at Pruhonice, Czech Republic. This study is supported by TUBITAK 115E915 and Joint TUBITAK 114E092 and AS CR14/001 projects.

  3. Sentiment analysis of Arabic tweets using text mining techniques

    Science.gov (United States)

    Al-Horaibi, Lamia; Khan, Muhammad Badruddin

    2016-07-01

    Sentiment analysis has become a flourishing field of text mining and natural language processing. Sentiment analysis aims to determine whether a text is written to express positive, negative, or neutral emotions about a certain domain. Most sentiment analysis researchers focus on English texts, with very limited resources available for other complex languages, such as Arabic. In this study, the target was to develop an initial model that performs satisfactorily and measures Arabic Twitter sentiment using a machine learning approach, with Naïve Bayes and Decision Tree as classification algorithms. The dataset used contains more than 2,000 Arabic tweets collected from Twitter. We performed several experiments to check the performance of the two classifiers using different combinations of text-processing functions. We found that available facilities for Arabic text processing need to be made from scratch or improved to develop accurate classifiers. The small functionalities developed by us in a Python language environment helped improve the results and proved that sentiment analysis in the Arabic domain needs a lot of work on the lexicon side.
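
    The study's Arabic corpus and its text-processing functions are not included in this record. A minimal scikit-learn sketch of the two classifiers named in the abstract, run on a toy placeholder dataset (all texts and labels below are invented), might look like this:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.pipeline import make_pipeline

    # Toy labeled texts (placeholders, NOT the study's Arabic tweet corpus).
    texts = ["great service highly recommended", "terrible experience very bad",
             "not sure how I feel about this", "love it fantastic", "awful never again",
             "pretty average nothing special"]
    labels = ["pos", "neg", "neu", "pos", "neg", "neu"]

    for clf in (MultinomialNB(), DecisionTreeClassifier(random_state=0)):
        model = make_pipeline(CountVectorizer(), clf)        # bag-of-words + classifier
        model.fit(texts, labels)
        print(type(clf).__name__, model.predict(["bad service", "fantastic experience"]))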

  4. 48 CFR 15.404-1 - Proposal analysis techniques.

    Science.gov (United States)

    2010-10-01

    ... of the offeror's cost trends, on the basis of current and historical cost or pricing data; (C... the FAR looseleaf edition), Cost Accounting Standards. (v) Review to determine whether any cost data... required. (2) Price analysis shall be used when certified cost or pricing data are not required...

  5. Instrumental Neutron Activation Analysis Technique using Subsecond Radionuclides

    DEFF Research Database (Denmark)

    Nielsen, H.K.; Schmidt, J.O.

    1987-01-01

    The fast irradiation facility Mach-1 installed at the Danish DR 3 reactor has been used in boron determinations by means of Instrumental Neutron Activation Analysis using 12B with 20-ms half-life. The performance characteristics of the system are presented and boron determinations of NBS standard...

  6. Novel microstructures and technologies applied in chemical analysis techniques

    NARCIS (Netherlands)

    Spiering, Vincent L.; Spiering, V.L.; van der Moolen, Johannes N.; Burger, Gert-Jan; Burger, G.J.; van den Berg, Albert

    1997-01-01

    Novel glass and silicon microstructures and their application in chemical analysis are presented. The micro technologies comprise (deep) dry etching, thin layer growth and anodic bonding. With this combination it is possible to create high resolution electrically isolating silicon dioxide structures

  7. Investigating product development strategy in beverage industry using factor analysis

    Directory of Open Access Journals (Sweden)

    Naser Azad

    2013-03-01

    Full Text Available Selecting a product development strategy that is associated with the company's current service or product innovation, based on customers' needs and a changing environment, plays an important role in increasing demand, market share, sales and profits. Therefore, it is important to extract effective variables associated with product development to improve the performance measurement of firms. This paper investigates important factors influencing product development strategies using factor analysis. The proposed model investigates 36 variables and, using factor analysis, extracts the six most influential factors: information sharing, intelligence information, exposure strategy, differentiation, research and development strategy and market survey. The first strategy, partnership, includes five sub-factors: product development partnership, partnership with foreign firms, customers' perception of competitors' products, customer involvement in product development, inter-agency coordination, a customer-oriented approach to innovation and transmission of product development change, where inter-agency coordination is considered the most important factor. Internal strengths are the most influential element of the second strategy, intelligence information. The third factor, introducing strategy, includes four sub-criteria, of which consumer buying behavior is the most influential. Differentiation is the next important factor, with five components, of which knowledge and expertise in product innovation is the most important. Research and development strategy, with four sub-criteria, is most strongly driven by reducing the product development cycle. Finally, market survey strategy is the last factor, with three components, of which finding new markets plays the most important role.

  8. Determination of Volatile Organic Compounds in the Atmosphere Using Two Complementary Analysis Techniques.

    Science.gov (United States)

    Alonso, L; Durana, N; Navazo, M; García, J A; Ilardia, J L

    1999-08-01

    During a preliminary field campaign of volatile organic compound (VOC) measurements carried out in an urban area, two complementary analysis techniques were applied to establish the technical and scientific bases for a strategy to monitor and control VOCs and photochemical oxidants in the Autonomous Community of the Basque Country. Integrated sampling was conducted using Tenax sorbent tubes and laboratory analysis by gas chromatography, and grab sampling and in situ analysis also were conducted using a portable gas chromatograph. With the first technique, monocyclic aromatic hydrocarbons appeared as the compounds with the higher mean concentrations. The second technique allowed the systematic analysis of eight chlorinated and aromatic hydrocarbons. Results of comparing both techniques, as well as the additional information obtained with the second technique, are included.

  9. Housing Price Forecastability: A Factor Analysis

    DEFF Research Database (Denmark)

    Bork, Lasse; Møller, Stig Vinther

    2016-01-01

    We examine U.S. housing price forecastability using principal component analysis (PCA), partial least squares (PLS), and sparse PLS (SPLS). We incorporate information from a large panel of 128 economic time series and show that macroeconomic fundamentals have strong predictive power for future...

  10. Housing price forecastability: A factor analysis

    DEFF Research Database (Denmark)

    Møller, Stig Vinther; Bork, Lasse

    2017-01-01

    We examine U.S. housing price forecastability using principal component analysis (PCA), partial least squares (PLS), and sparse PLS (SPLS). We incorporate information from a large panel of 128 economic time series and show that macroeconomic fundamentals have strong predictive power for future...

  11. Biomechanical analysis technique choreographic movements (for example, "grand battman jete"

    Directory of Open Access Journals (Sweden)

    Batieieva N.P.

    2015-04-01

    Full Text Available Purpose: biomechanical analysis of the execution of the choreographic movement "grand battman jete". Material: the study involved students (n = 7) of the department of classical choreography, faculty of choreography. Results: a biomechanical analysis of the choreographic movement "grand battman jete" (a classical exercise) was carried out, and kinematic characteristics (path, velocity, acceleration, force) of the centre of mass (CM) of body segments of the performer (foot, shin, thigh) were obtained. A biokinematic (phase) model was built. The energy characteristics (mechanical work and kinetic energy) of the leg segments when performing the choreographic movement "grand battman jete" were determined. Conclusions: it was found that the ability of an athlete and coach-choreographer to analyze the biomechanics of movement has a positive effect on the improvement of choreographic training of qualified athletes in gymnastics (sport, art), figure skating and dance sports.

  12. Transient analysis techniques in performing impact and crash dynamic studies

    Science.gov (United States)

    Pifko, A. B.; Winter, R.

    1989-01-01

    Because of the emphasis being placed on crashworthiness as a design requirement, increasing demands are being made by various organizations to analyze a wide range of complex structures that must perform safely when subjected to severe impact loads, such as those generated in a crash event. The ultimate goal of crashworthiness design and analysis is to produce vehicles with the ability to reduce the dynamic forces experienced by the occupants to specified levels, while maintaining a survivable envelope around them during a specified crash event. DYCAST is a nonlinear structural dynamic finite element computer code that grew out of the PLANS system of finite element programs for static nonlinear structural analysis. The essential features of DYCAST are outlined.

  13. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  14. Stalked protozoa identification by image analysis and multivariable statistical techniques

    OpenAIRE

    Amaral, A.L.; Ginoris, Y. P.; Nicolau, Ana; M.A.Z. Coelho; Ferreira, E. C.

    2008-01-01

    Protozoa are considered good indicators of the treatment quality in activated sludge systems as they are sensitive to physical, chemical and operational processes. Therefore, it is possible to correlate the predominance of certain species or groups and several operational parameters of the plant. This work presents a semiautomatic image analysis procedure for the recognition of the stalked protozoa species most frequently found in wastewater treatment plants by determinin...

  15. Elemental analysis of silver coins by PIXE technique

    Energy Technology Data Exchange (ETDEWEB)

    Tripathy, B.B. [Department of Physics, Silicon Institute of Technology, Patia, Bhubaneswar 751 024 (India); Rautray, Tapash R. [Department of Dental Biomaterials, School of Dentistry, Kyungpook National University, 2-188-1 Samduk -dong, Jung-gu, Daegu 700 412 (Korea, Republic of); ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India)], E-mail: tapash.rautray@gmail.com; Rautray, A.C. [ARASMIN, G. Udayagiri, Kandhamal, Orissa 762 100 (India); Vijayan, V. [Praveen Institute of Radiation Technology, Flat No. 9A, Avvai Street, New Perungalathur, Chennai 600 063 (India)

    2010-03-15

    Elemental analysis of nine Indian silver coins during British rule was carried out by proton induced X-ray emission spectroscopy. Eight elements, namely Cr, Fe, Ni, Cu, Zn, As, Ag, and Pb were determined in the present study. Ag and Cu were found to be the major elements, Zn was the only minor element and all other elements are present at the trace level. The variation of the elemental concentration may be due to the use of different ores for making coins.

  16. Signs and symptoms of acute mania: a factor analysis

    Directory of Open Access Journals (Sweden)

    de Silva Varuni A

    2011-08-01

    Full Text Available Abstract Background The major diagnostic classifications consider mania as a uni-dimensional illness. Factor analytic studies of acute mania are fewer compared to schizophrenia and depression. Evidence from factor analysis suggests more categories or subtypes than what is included in the classification systems. Studies have found that these factors can predict differences in treatment response and prognosis. Methods The sample included 131 patients consecutively admitted to an acute psychiatry unit over a period of one year. It included 76 (58%) males. The mean age was 44.05 years (SD = 15.6). Patients met International Classification of Diseases-10 (ICD-10) clinical diagnostic criteria for a manic episode. Patients with a diagnosis of mixed bipolar affective disorder were excluded. Participants were evaluated using the Young Mania Rating Scale (YMRS). Exploratory factor analysis (principal component analysis) was carried out and factors with an eigenvalue > 1 were retained. The significance level for interpretation of factor loadings was 0.40. The unrotated component matrix identified five factors. Oblique rotation was then carried out to identify three factors which were clinically meaningful. Results Unrotated principal component analysis extracted five factors. These five factors explained 65.36% of the total variance. Oblique rotation extracted 3 factors. Factor 1, corresponding to 'irritable mania', had significant loadings of irritability, increased motor activity/energy and disruptive aggressive behaviour. Factor 2, corresponding to 'elated mania', had significant loadings of elevated mood, language abnormalities/thought disorder, increased sexual interest and poor insight. Factor 3, corresponding to 'psychotic mania', had significant loadings of abnormalities in thought content, appearance, poor sleep and speech abnormalities. Conclusions Our findings identified three clinically meaningful factors corresponding to 'elated mania', 'irritable mania
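
    The item-level data are not available in this record. As an illustrative sketch of the extraction step described (principal component extraction with eigenvalue > 1 retention), the following numpy code runs on synthetic item scores; the oblique rotation step is omitted and all data are invented.

    import numpy as np

    # Synthetic item scores standing in for the YMRS items (n subjects x p items).
    rng = np.random.default_rng(3)
    n, p = 131, 11
    latent = rng.normal(size=(n, 3))                  # three hypothetical underlying factors
    loadings_true = rng.normal(scale=0.7, size=(3, p))
    items = latent @ loadings_true + rng.normal(scale=0.8, size=(n, p))

    # Principal component extraction from the correlation matrix,
    # retaining components with eigenvalue > 1 (Kaiser criterion).
    R = np.corrcoef(items, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(R)
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    keep = eigvals > 1.0
    loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])   # unrotated component loadings
    print("components retained:", keep.sum())
    print("proportion of variance explained:", eigvals[keep].sum() / p)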

  17. The Application of the Model Correction Factor Method to a Reliability Analysis of a Composite Blade Structure

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimiroy; Friis-Hansen, Peter; Berggreen, Christian

    2009-01-01

    This paper presents a reliability analysis of a composite blade profile. The so-called Model Correction Factor technique is applied as an effective alternate approach to the response surface technique. The structural reliability is determined by use of a simplified idealised analytical model which...

  18. Comparative Analysis of Automatic Vehicle Classification Techniques: A Survey

    Directory of Open Access Journals (Sweden)

    Kanwal Yousaf

    2012-09-01

    Full Text Available Vehicle classification has emerged as a significant field of study because of its importance in a variety of applications such as surveillance, security systems, traffic congestion avoidance and accident prevention. So far numerous algorithms have been implemented for classifying vehicles. Each algorithm follows different procedures for detecting vehicles from videos. By evaluating some of the commonly used techniques, we highlight the most beneficial methodology for classifying vehicles. In this paper we point out the working of several video-based vehicle classification algorithms and compare these algorithms on the basis of different performance metrics such as classifiers, classification methodology or principles, and vehicle detection ratio. After comparing these parameters we conclude that the Hybrid Dynamic Bayesian Network (HDBN) classification algorithm is far better than the other algorithms due to its nature of estimating the simplest features of vehicles from different videos. HDBN detects vehicles by following the important stages of feature extraction, selection and classification. It extracts rear-view information of vehicles rather than other information such as the distance between the wheels and the height of the wheel.

  19. Chromatographic finger print analysis of Naringi crenulata by HPTLC technique

    Institute of Scientific and Technical Information of China (English)

    Subramanian Sampathkumar; Ramakrishnan N

    2011-01-01

    Objective: To establish the fingerprint profile of Naringi crenulata (N. crenulata) (Roxb.) Nicols. using high performance thin layer chromatography (HPTLC) technique. Methods: Preliminary phytochemical screening was done and HPTLC studies were carried out. CAMAG HPTLC system equipped with Linomat V applicator, TLC scanner 3, Reprostar 3 and WIN CATS-4 software was used. Results: The results of preliminary phytochemical studies confirmed the presence of protein, lipid, carbohydrate, reducing sugar, phenol, tannin, flavonoid, saponin, triterpenoid, alkaloid, anthraquinone and quinone. HPTLC finger printing of ethanolic extract of stem revealed 10 spots with Rf values in the range of 0.08 to 0.65; bark showed 8 peaks with Rf values in the range of 0.07 to 0.63 and the ethanol extract of leaf revealed 8 peaks with Rf values in the range of 0.09 to 0.49, respectively. The purity of sample was confirmed by comparing the absorption spectra at start, middle and end position of the band. Conclusions: It can be concluded that HPTLC finger printing of N. crenulata may be useful in differentiating the species from the adulterant and act as a biochemical marker for this medicinally important plant in the pharmaceutical industry and plant systematic studies.

  20. Skills and Vacancy Analysis with Data Mining Techniques

    Directory of Open Access Journals (Sweden)

    Izabela A. Wowczko

    2015-11-01

    Full Text Available Through recognizing the importance of a qualified workforce, skills research has become one of the focal points in economics, sociology, and education. Great effort is dedicated to analyzing labor demand and supply, and actions are taken at many levels to match one with the other. In this work we concentrate on skills needs, a dynamic variable dependent on many aspects such as geography, time, or the type of industry. Historically, skills in demand were easy to evaluate since transitions in that area were fairly slow, gradual, and easy to adjust to. In contrast, current changes are occurring rapidly and might take an unexpected turn. Therefore, we introduce a relatively simple yet effective method of monitoring skills needs straight from the source—as expressed by potential employers in their job advertisements. We employ open source tools such as RapidMiner and R as well as easily accessible online vacancy data. We demonstrate selected techniques, namely classification with k-NN and information extraction from a textual dataset, to determine effective ways of discovering knowledge from a given collection of vacancies.

  1. Ionospheric Behaviour Analysis over Thailand Using Radio Occultation Technique.

    Directory of Open Access Journals (Sweden)

    Ahmed Wasiu Akande

    2015-11-01

    Full Text Available With advances in science and technology in the field of space and atmospheric science, and in order to obtain accurate results, the radio occultation technique is used to investigate the electron density and Total Electron Content present in the equatorial region, particularly over Thailand. In this research, radio occultation data obtained from UCAR/CDAAC were used to observe daily, monthly, seasonal and full-year 2013 ionospheric TEC and electron density variations due to changes and instability of solar activity from time to time. It was observed that TEC was high (the ionosphere was more disturbed or violent) in May and spread over a wide range of altitudes, and that the summer season had the highest TEC value for the year 2013, which means that at this period GNSS measurements were more prone to error. It was noted that ionospheric variations or fluctuations were at a maximum between 200 km and 450 km altitude. The results of the study show that ionospheric perturbation effects or irregularities depend on season and solar activity.

  2. Quantitative analysis of genomic element interactions by molecular colony technique.

    Science.gov (United States)

    Gavrilov, Alexey A; Chetverina, Helena V; Chermnykh, Elina S; Razin, Sergey V; Chetverin, Alexander B

    2014-03-01

    Distant genomic elements were found to interact within the folded eukaryotic genome. However, the used experimental approach (chromosome conformation capture, 3C) enables neither determination of the percentage of cells in which the interactions occur nor demonstration of simultaneous interaction of >2 genomic elements. Each of the above can be done using in-gel replication of interacting DNA segments, the technique reported here. Chromatin fragments released from formaldehyde-cross-linked cells by sodium dodecyl sulfate extraction and sonication are distributed in a polyacrylamide gel layer followed by amplification of selected test regions directly in the gel by multiplex polymerase chain reaction. The fragments that have been cross-linked and separate fragments give rise to multi- and monocomponent molecular colonies, respectively, which can be distinguished and counted. Using in-gel replication of interacting DNA segments, we demonstrate that in the material from mouse erythroid cells, the majority of fragments containing the promoters of active β-globin genes and their remote enhancers do not form complexes stable enough to survive sodium dodecyl sulfate extraction and sonication. This indicates that either these elements do not interact directly in the majority of cells at a given time moment, or the formed DNA-protein complex cannot be stabilized by formaldehyde cross-linking.

  3. Mapping Proxy Sensitivity: A New Technique for Compositional Analysis of Cultured Biominerals and Inorganically Precipitated Materials

    Science.gov (United States)

    Gagnon, A. C.; DePaolo, D. J.; DeYoreo, J.; Spero, H. J.; Russell, A. D.

    2011-12-01

    Mineral composition is controlled by a host of environmental factors during precipitation. To build accurate paleo-reconstructions we need to separate the impact of each parameter on proxy behavior and use these data to build a chemical-scale understanding of mineral growth. Biomineral culture and inorganic precipitation experiments, where growth parameters can be manipulated independently, are uniquely suited to calibrate proxies and probe mechanism. Culture and precipitation experiments often involve overgrowth of an initial material. For example, seed crystals are used to control mineralogy and avoid nucleation during inorganic precipitation, while culture experiments in marine organisms typically start with wild specimens. New growth corresponding to the experimental conditions must be resolved from the initial material. Separation is typically achieved using microanalysis, skeletal dissection, or estimates of the initial mass and composition. Each approach imposes limits on the accuracy, precision or types of materials that can be analyzed. Slow growth rates and complicated geometries can make these techniques especially challenging when applied to biominerals. We present a method of compositional analysis for use in biological culture and inorganic growth experiments that overcomes many of these challenges. This method relies on growth in a mixed element stable isotope spike, requires neither the initial mass nor the initial composition to be known, harnesses the precision and sensitivity of bulk analysis, and applies even when it is impossible to physically identify newly grown material. Error analysis suggests this method can significantly improve the precision of metal/calcium measurements in experimentally grown material compared to current methods. Furthermore, the method can isolate different events through time, separating, for example, the impact of day and night cycles on biomineral composition. We will present metal/calcium ratios measured using the

  4. Novel techniques for the analysis of the TOA radiometric uncertainty

    Science.gov (United States)

    Gorroño, Javier; Banks, Andrew; Gascon, Ferran; Fox, Nigel P.; Underwood, Craig I.

    2016-10-01

    In the framework of the European Copernicus programme, the European Space Agency (ESA) has launched the Sentinel-2 (S2) Earth Observation (EO) mission, which provides optical high spatial resolution imagery over land and coastal areas. As part of this mission, a tool (named S2-RUT, from Sentinel-2 Radiometric Uncertainty Tool) estimates the radiometric uncertainty associated with each pixel, using as input the top-of-atmosphere (TOA) reflectance factor images provided by ESA. The initial version of the tool has been implemented (code and user guide available) and integrated as part of the Sentinel Toolbox. The tool required the study of several radiometric uncertainty sources as well as the calculation and validation of the combined standard uncertainty in order to estimate the TOA reflectance factor uncertainty per pixel. Here we describe recent research aimed at accommodating novel uncertainty contributions to the TOA reflectance uncertainty estimates in future versions of the tool. The two contributions that we explore are the radiometric impact of the spectral knowledge and the uncertainty propagation of the resampling associated with the orthorectification process. The former is produced by the uncertainty associated with the spectral calibration as well as the spectral variations across the instrument focal plane and the instrument degradation. The latter results from the propagation of the focal plane image into the provided orthoimage. The uncertainty propagation depends on the radiance levels in the pixel neighbourhood and the pixel correlation in the temporal and spatial dimensions. Special effort has been made in studying non-stable scenarios and in the comparison with different interpolation methods.
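
    The internal uncertainty model of S2-RUT is not detailed in this record. As a generic, hypothetical illustration of combining independent per-pixel standard uncertainty contributions into a combined standard uncertainty (a GUM-style root-sum-of-squares), with invented contribution names and values:

    import numpy as np

    # Hypothetical per-pixel relative standard uncertainty contributions (k=1), in percent.
    contributions = {
        "instrument_noise": 0.6,
        "absolute_calibration": 0.9,
        "spectral_knowledge": 0.3,      # illustrative placeholder for the new term
        "resampling": 0.4,              # illustrative placeholder for the new term
    }

    # Combination assuming independent contributions: root-sum-of-squares.
    u_combined = np.sqrt(sum(u**2 for u in contributions.values()))
    print(f"combined standard uncertainty: {u_combined:.2f} % (k=1)")
    print(f"expanded uncertainty (k=2):    {2 * u_combined:.2f} %")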

  5. Development of advanced techniques for identification of flow stress and friction parameters for metal forming analysis

    Science.gov (United States)

    Cho, Hyunjoong

    The accuracy of process simulation in metal forming by the finite element method depends on the accuracy of the flow stress data and friction value that are input to FEM programs. Therefore, it is essential that these input values are determined using reliable tests and evaluation methods. This study presents the development of an inverse analysis methodology and its application to determine flow stress data of bulk and sheet materials at room and elevated temperatures. The inverse problem is defined as the minimization of the differences between the experimental measurements and the corresponding FEM predictions. Rigid-viscoplastic FEM is used to analyze the metal flow, while a numerical optimization algorithm adjusts the material parameters used in the simulation until the calculated response matches the measured data within a specified tolerance. The use of the developed inverse analysis methodology has been demonstrated by applying it to selected reference rheological tests: the cylinder compression test, ring compression test, instrumented indentation test, modified limiting dome height test, and sheet hydraulic bulge test. Furthermore, using the determined material property data, full 3-D finite element simulation models have been developed as examples of industrial applications for orbital forming and thermoforming processes, for reliable process simulation. As a result of this study, it was shown that the developed inverse analysis methodology could identify both the material parameters and friction factors from one set of tests simultaneously. Therefore, this technique can offer a systematic and cost-effective way of determining material property data for simulation of metal forming processes.
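
    The thesis couples full FEM simulations with the optimizer, which cannot be reproduced here. The sketch below illustrates the same inverse-analysis idea with a cheap analytical stand-in for the forward model (a hypothetical Hollomon-type flow stress law) and scipy's least-squares optimizer; all parameter values and the noise level are invented.

    import numpy as np
    from scipy.optimize import least_squares

    # Stand-in "forward model": Hollomon-type flow stress law sigma = K * eps**n.
    # In the actual methodology this forward model would be a full FEM simulation.
    def model(params, strain):
        K, n = params
        return K * strain**n

    strain = np.linspace(0.02, 0.5, 25)
    true_params = (520.0, 0.21)                       # hypothetical material
    rng = np.random.default_rng(4)
    measured = model(true_params, strain) + rng.normal(scale=3.0, size=strain.size)

    # Inverse problem: adjust parameters until the simulated response matches measurements.
    residuals = lambda p: model(p, strain) - measured
    fit = least_squares(residuals, x0=[300.0, 0.1])
    print("identified K, n:", fit.x)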

  6. Comparison of different techniques for time-frequency analysis of internal combustion engine vibration signals

    Institute of Scientific and Technical Information of China (English)

    Yang JIN; Zhi-yong HAO; Xu ZHENG

    2011-01-01

    In this study, we report an analysis of cylinder head vibration signals at a steady engine speed using the short-time Fourier transform (STFT). Three popular time-frequency analysis techniques, i.e., STFT, analytic wavelet transform (AWT) and S transform (ST), have been examined. AWT and ST are often applied in engine signal analyses. In particular, an AWT expression in terms of the quality factor Q and an analytical relationship between ST and AWT have been derived. The time-frequency resolution of a Gaussian-function-windowed STFT was studied via numerical simulation. Based on the simulation, the empirical limits for the lowest distinguishable frequency as well as the time and frequency resolutions were determined. These can provide insights for window width selection, spectrogram interpretation and artifact identification. Gaussian-function-windowed STFTs were applied to several cylinder head vibration signals. The spectrograms of the same signals from ST and AWT were also determined for comparison. The results indicate that the uniform resolution of the STFT is not necessarily a disadvantage for time-frequency analysis of vibration signals when the engine is in a stationary state, because it can more accurately localize the frequency components excited by transient excitations without much loss of time resolution.
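
    A Gaussian-windowed STFT of the kind examined in the paper can be computed with standard signal-processing tools. The snippet below is a minimal sketch on a synthetic two-burst signal (the sampling rate, window length and signal are illustrative assumptions); the width of the Gaussian window sets the trade-off between time and frequency resolution.

        import numpy as np
        from scipy import signal

        # Synthetic stand-in for a cylinder head vibration signal: two short
        # transient bursts at different frequencies (illustrative data only).
        fs = 20000.0                              # assumed sampling rate, Hz
        t = np.arange(0.0, 0.5, 1.0 / fs)
        x = (np.sin(2 * np.pi * 1500 * t) * np.exp(-((t - 0.1) / 0.01) ** 2)
             + np.sin(2 * np.pi * 4000 * t) * np.exp(-((t - 0.3) / 0.01) ** 2))

        # Gaussian-windowed STFT; the window standard deviation (in samples)
        # controls the time-frequency resolution trade-off.
        nperseg = 512
        f, tau, Zxx = signal.stft(x, fs=fs, window=("gaussian", nperseg / 8),
                                  nperseg=nperseg, noverlap=nperseg - 32)
        spectrogram = np.abs(Zxx) ** 2
        print(spectrogram.shape)                  # (frequency bins, time frames)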

  7. Factors Effecting Unemployment: A Cross Country Analysis

    Directory of Open Access Journals (Sweden)

    Aurangzeb

    2013-01-01

    Full Text Available This paper investigates the macroeconomic determinants of unemployment for India, China and Pakistan for the period 1980 to 2009. The investigation was conducted through cointegration, Granger causality and regression analysis. The variables selected for the study are unemployment, inflation, gross domestic product, the exchange rate and the population growth rate. The results of the regression analysis showed a significant impact of all the variables for all three countries. Pakistan's GDP showed a positive relation with the unemployment rate, which is attributed to the poverty level and the underutilization of foreign investment. The Granger causality results showed that bidirectional causality does not exist between any of the variables for any of the three countries. The cointegration results showed that long-term relationships do exist among the variables for all the models. It is recommended that the distribution of income in Pakistan be improved in order for growth to have a positive impact on the employment rate.
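
    The cointegration and Granger causality tests used in the paper are available in standard econometrics libraries. The sketch below runs them on synthetic series standing in for one country's unemployment and GDP (the data, lag length and seed are assumptions for illustration), showing how such tests are typically set up.

        import numpy as np
        import pandas as pd
        from statsmodels.tsa.stattools import coint, grangercausalitytests

        # Synthetic annual series standing in for one country's unemployment
        # rate and GDP over 30 years (illustrative data only).
        rng = np.random.default_rng(0)
        gdp = np.cumsum(rng.normal(0.5, 1.0, 30))
        unemployment = 0.4 * gdp + rng.normal(0.0, 0.5, 30)

        # Engle-Granger cointegration test: a small p-value points to a
        # long-run relationship between the two series.
        t_stat, p_value, _ = coint(unemployment, gdp)
        print("cointegration p-value: %.3f" % p_value)

        # Granger causality test: does GDP help predict unemployment?
        data = pd.DataFrame({"unemployment": unemployment, "gdp": gdp})
        results = grangercausalitytests(data[["unemployment", "gdp"]], maxlag=2)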

  8. Preliminary Analysis of ULPC Light Curves Using Fourier Decomposition Technique

    CERN Document Server

    Ngeow, Chow-Choong; Kanbur, Shashi; Barrett, Brittany; Lin, Bin

    2013-01-01

    Recent work on Ultra Long Period Cepheids (ULPCs) has suggested their usefulness as a distance indicator, but has not commented on their relationship to other types of variable stars. In this work, we use Fourier analysis to quantify the structure of ULPC light curves and compare them to Classical Cepheids and Mira variables. Our preliminary results suggest that the low-order Fourier parameters of ULPCs follow the continuous trend defined by Classical Cepheids after the resonance around 10 days. However, their Fourier parameters also overlap with those of Miras, which makes the classification of long-period variable stars based on light curve information alone difficult.
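
    Fourier decomposition of a folded light curve is typically a linear least-squares fit from which low-order amplitude ratios and phase differences are derived. The sketch below uses one common convention (cosine series with R_k1 = A_k/A_1 and phi_k1 = phi_k - k*phi_1); the paper's exact conventions and data are not reproduced here, and the example light curve is synthetic.

        import numpy as np

        def fourier_parameters(phase, mag, order=4):
            """Fit mag(phi) = A0 + sum_k [a_k cos(2*pi*k*phi) + b_k sin(2*pi*k*phi)]
            by linear least squares and return low-order Fourier parameters."""
            cols = [np.ones_like(phase)]
            for k in range(1, order + 1):
                cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
            coeffs, *_ = np.linalg.lstsq(np.column_stack(cols), mag, rcond=None)
            a, b = coeffs[1::2], coeffs[2::2]
            amp = np.hypot(a, b)                 # amplitudes A_k
            phi = np.arctan2(-b, a)              # phases in A_k*cos(2*pi*k*phase + phi_k)
            R21, R31 = amp[1] / amp[0], amp[2] / amp[0]
            phi21 = (phi[1] - 2 * phi[0]) % (2 * np.pi)
            phi31 = (phi[2] - 3 * phi[0]) % (2 * np.pi)
            return R21, R31, phi21, phi31

        # Example on a synthetic, slightly asymmetric folded light curve.
        phase = np.linspace(0.0, 1.0, 200, endpoint=False)
        mag = 1.0 + 0.3 * np.cos(2 * np.pi * phase) + 0.1 * np.cos(4 * np.pi * phase + 0.5)
        print(fourier_parameters(phase, mag))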

  9. [Neuroimaging in psychiatry: multivariate analysis techniques for diagnosis and prognosis].

    Science.gov (United States)

    Kambeitz, J; Koutsouleris, N

    2014-06-01

    Multiple studies have successfully applied multivariate analysis to neuroimaging data, demonstrating the potential utility of neuroimaging for clinical diagnostic and prognostic purposes. This article summarizes the current state of research on the application of neuroimaging in the field of psychiatry, based on a literature review of current studies. The results of current studies indicate the potential applicability of neuroimaging data across various diagnoses, such as depression, schizophrenia, bipolar disorder and dementia. Potential applications include disease classification, differential diagnosis and prediction of disease course. The results of the studies are heterogeneous, although some report promising findings. Further multicentre studies with clearly specified patient populations are needed to systematically investigate the potential utility of neuroimaging for the clinical routine.
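
    The multivariate analyses surveyed in such studies are typically supervised classifiers trained on voxel-wise or region-wise features with cross-validation. The sketch below is a generic example of that workflow on synthetic data (not taken from any study in the review); the feature matrix, labels and pipeline choices are illustrative assumptions.

        import numpy as np
        from sklearn.svm import LinearSVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        # Synthetic stand-in for a neuroimaging dataset: 80 subjects with 500
        # voxel/region features and a binary diagnostic label.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(80, 500))
        y = rng.integers(0, 2, size=80)
        X[y == 1, :20] += 0.8                     # weak multivariate group effect

        # Typical pipeline: feature scaling + linear SVM, evaluated with
        # cross-validated classification accuracy.
        clf = make_pipeline(StandardScaler(), LinearSVC(C=1.0, dual=False))
        scores = cross_val_score(clf, X, y, cv=5)
        print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))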

  10. Behavior Change Techniques in Popular Alcohol Reduction Apps: Content Analysis

    Science.gov (United States)

    Garnett, Claire; Brown, James; West, Robert; Michie, Susan

    2015-01-01

    Background Mobile phone apps have the potential to reduce excessive alcohol consumption cost-effectively. Although hundreds of alcohol-related apps are available, there is little information about the behavior change techniques (BCTs) they contain, or the extent to which they are based on evidence or theory and how this relates to their popularity and user ratings. Objective Our aim was to assess the proportion of popular alcohol-related apps available in the United Kingdom that focus on alcohol reduction, identify the BCTs they contain, and explore whether BCTs or the mention of theory or evidence is associated with app popularity and user ratings. Methods We searched the iTunes and Google Play stores with the terms “alcohol” and “drink”, and the first 800 results were classified into alcohol reduction, entertainment, or blood alcohol content measurement. Of those classified as alcohol reduction, all free apps and the top 10 paid apps were coded for BCTs and for reference to evidence or theory. Measures of popularity and user ratings were extracted. Results Of the 800 apps identified, 662 were unique. Of these, 13.7% (91/662) were classified as alcohol reduction (95% CI 11.3-16.6), 53.9% (357/662) entertainment (95% CI 50.1-57.7), 18.9% (125/662) blood alcohol content measurement (95% CI 16.1-22.0) and 13.4% (89/662) other (95% CI 11.1-16.3). The 51 free alcohol reduction apps and the top 10 paid apps contained a mean of 3.6 BCTs (SD 3.4), with approximately 12% (7/61) not including any BCTs. The BCTs used most often were “facilitate self-recording” (54%, 33/61), “provide information on consequences of excessive alcohol use and drinking cessation” (43%, 26/61), “provide feedback on performance” (41%, 25/61), “give options for additional and later support” (25%, 15/61) and “offer/direct towards appropriate written materials” (23%, 14/61). These apps also rarely included any of the 22 BCTs frequently used in other health behavior change

  11. Analysis of Machine Learning Techniques for Heart Failure Readmissions.

    Science.gov (United States)

    Mortazavi, Bobak J; Downing, Nicholas S; Bucholz, Emily M; Dharmarajan, Kumar; Manhapra, Ajay; Li, Shu-Xia; Negahban, Sahand N; Krumholz, Harlan M

    2016-11-01

    The current ability to predict readmissions in patients with heart failure is modest at best. It is unclear whether machine learning techniques that address higher dimensional, nonlinear relationships among variables would enhance prediction. We sought to compare the effectiveness of several machine learning algorithms for predicting readmissions. Using data from the Telemonitoring to Improve Heart Failure Outcomes trial, we compared the effectiveness of random forests, boosting, random forests combined hierarchically with support vector machines or logistic regression (LR), and Poisson regression against traditional LR to predict 30- and 180-day all-cause readmissions and readmissions because of heart failure. We randomly selected 50% of patients for a derivation set; the validation set comprised the remaining patients and was assessed using 100 bootstrapped iterations. We compared C statistics for discrimination and distributions of observed outcomes in risk deciles for predictive range. In 30-day all-cause readmission prediction, the best performing machine learning model, random forests, provided a 17.8% improvement over LR (mean C statistics, 0.628 and 0.533, respectively). For readmissions because of heart failure, boosting improved the C statistic by 24.9% over LR (mean C statistics, 0.678 and 0.543, respectively). For 30-day all-cause readmission, the observed readmission rates in the lowest and highest deciles of predicted risk with random forests (7.8% and 26.2%, respectively) showed a much wider separation than with LR (14.2% and 16.4%, respectively). Machine learning methods improved the prediction of readmission after hospitalization for heart failure compared with LR and provided the greatest predictive range in observed readmission rates. © 2016 American Heart Association, Inc.
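
    The comparison reported in the paper can be reproduced in outline with standard libraries: fit a traditional logistic regression and a machine learning model on a derivation set and compare C statistics (area under the ROC curve) on a validation set. The sketch below uses synthetic data and a random forest only; the trial data, the boosting and hierarchical models, and the bootstrap procedure are not reproduced.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        # Synthetic stand-in for trial data: 1000 patients, 30 predictors and
        # a binary 30-day readmission outcome with a nonlinear signal.
        rng = np.random.default_rng(7)
        X = rng.normal(size=(1000, 30))
        logit = 0.6 * X[:, 0] - 0.4 * X[:, 1] * X[:, 2]
        y = (rng.random(1000) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

        # 50% derivation / 50% validation split, as in the study design.
        X_dev, X_val, y_dev, y_val = train_test_split(X, y, test_size=0.5, random_state=0)

        models = [("logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=500, random_state=0))]
        for name, model in models:
            model.fit(X_dev, y_dev)
            p = model.predict_proba(X_val)[:, 1]
            # The C statistic for a binary outcome equals the ROC AUC.
            print("%s C statistic: %.3f" % (name, roc_auc_score(y_val, p)))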

  12. Structural analysis of irradiated crotoxin by spectroscopic techniques

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Karina C. de; Fucase, Tamara M.; Silva, Ed Carlos S. e; Chagas, Bruno B.; Buchi, Alisson T.; Viala, Vincent L.; Spencer, Patrick J.; Nascimento, Nanci do, E-mail: kcorleto@usp.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Centro de Biotecnologia

    2013-07-01

    Snake bites are a serious public health problem, especially in subtropical countries. In Brazil, the serum, the only effective treatment in case of snake bites, is produced in horses which, despite their large size, have a reduced lifespan due to the high toxicity of the antigen. Ionizing radiation has been successfully employed to attenuate the biological activity of animal toxins. Crotoxin, the main toxic compound from Crotalus durissus terrificus (Cdt), is a heterodimeric protein composed of two subunits: crotapotin and phospholipase A{sub 2}. Previous data indicated that this protein, following the irradiation process, undergoes unfolding and/or aggregation, resulting in a much less toxic antigen. The exact mechanisms and structural modifications involved in the aggregation process are not yet clear. This work investigates the effects of ionizing radiation on crotoxin employing Infrared Spectroscopy, Circular Dichroism and Dynamic Light Scattering techniques. The infrared spectrum of lyophilized crotoxin showed peaks corresponding to the vibrational spectra of the secondary structure of crotoxin, including β-sheet, random coil, α-helix and β-turns. We calculated the area of these spectral regions after baseline adjustment and normalization using the amide I band (1590-1700 cm{sup -1}), obtaining the variation of the secondary structures of the toxin following irradiation. The Circular Dichroism spectra of native and irradiated crotoxin suggest a conformational change within the molecule after the irradiation process. These data indicate structural changes between the samples, apparently from an ordered conformation towards a random coil. The light scattering analyses indicated that the irradiated crotoxin formed multimers with an average molecular radius 100-fold larger than that of the native toxin. (author)
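
    The infrared part of the analysis, estimating secondary-structure band areas in the amide I region after baseline correction and normalisation, can be sketched as follows. The sub-band limits and the synthetic spectrum are illustrative assumptions, not the values used in the paper.

        import numpy as np

        def band_area(x, y):
            # Trapezoidal integration written out explicitly.
            return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

        def secondary_structure_fractions(wavenumber, absorbance, bands):
            """Relative sub-band areas in the amide I region (1590-1700 cm-1)
            after a linear baseline correction, normalised by the total area."""
            sel = (wavenumber >= 1590) & (wavenumber <= 1700)
            x, y = wavenumber[sel], absorbance[sel]
            order = np.argsort(x)
            x, y = x[order], y[order]
            baseline = np.interp(x, [x[0], x[-1]], [y[0], y[-1]])
            y_corr = np.clip(y - baseline, 0.0, None)
            total = band_area(x, y_corr)
            return {name: band_area(x[(x >= lo) & (x <= hi)],
                                    y_corr[(x >= lo) & (x <= hi)]) / total
                    for name, (lo, hi) in bands.items()}

        # Hypothetical sub-band assignments and a synthetic two-band spectrum.
        bands = {"beta-sheet": (1620, 1640), "random coil": (1640, 1650),
                 "alpha-helix": (1650, 1660), "beta-turn": (1660, 1690)}
        wn = np.linspace(1580, 1710, 300)
        absorb = 0.8 * np.exp(-((wn - 1632) / 10) ** 2) + 0.5 * np.exp(-((wn - 1656) / 9) ** 2) + 0.05
        print(secondary_structure_fractions(wn, absorb, bands))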

  13. Analysis technique for quantifying the effectiveness of optical-proximity-corrected photomasks and its application to defect printability

    Science.gov (United States)

    Arthur, Graham G.; Martin, Brian

    1999-04-01

    An analysis technique for quantifying the effectiveness of optical proximity corrected (OPC) photomasks is described. The methodology is able to account for reticle manufacturing tolerances and has a number of applications, including the optimization of OPC features and, in the examples described, the analysis of defect printability. The results presented here are generated using aerial image measurements from PROLITH/2, but the technique can be directly transferred to resist image measurements using 3D simulation tools such as PROLITH/3D, where other factors such as swing curve effects caused by wafer topography could also be analyzed. With inspection tools such as scanning electron or atomic force microscopes and appropriate image processing and analysis software, it should also be possible to apply this methodology to practical results.

  14. Intelligent acoustic data fusion technique for information security analysis

    Science.gov (United States)

    Jiang, Ying; Tang, Yize; Lu, Wenda; Wang, Zhongfeng; Wang, Zepeng; Zhang, Luming

    2017-08-01

    Tone is an essential component of word formation in all tonal languages, and it plays an important role in the transmission of information in speech communication. Therefore, the study of tone characteristics can be applied to the security analysis of acoustic signals, for example by means of language identification. In speech processing, the fundamental frequency (F0) is often viewed by speech synthesis researchers as representing tone. However, regular F0 values may lead to low naturalness in synthesized speech. Moreover, F0 and tone are not equivalent linguistically; F0 is just a representation of a tone. Therefore, the electroglottography (EGG) signal is collected for a deeper study of tone characteristics. In this paper, focusing on the Northern Kam language, which has nine tonal contours and five level tone types, we first collected EGG and speech signals from six natural male speakers of the Northern Kam language, and then obtained the clustering distributions of the tone curves. After summarizing the main characteristics of the tones of Northern Kam, we analyzed the relationship between EGG and speech signal parameters, laying the foundation for further security analysis of acoustic signals.
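
    F0 extraction is the usual starting point for such tone analyses (even though, as the paper notes, F0 is only a representation of tone). The sketch below is a minimal baseline estimator using an autocorrelation peak search on a synthetic voiced frame; it is not the EGG-based procedure of the paper, and the sampling rate, search range and test signal are assumptions.

        import numpy as np

        def estimate_f0(frame, fs, fmin=60.0, fmax=450.0):
            """Estimate the fundamental frequency of a short voiced frame via
            an autocorrelation peak search (a common baseline F0 tracker)."""
            frame = frame - frame.mean()
            ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
            lag_min, lag_max = int(fs / fmax), int(fs / fmin)
            lag = lag_min + np.argmax(ac[lag_min:lag_max])
            return fs / lag

        # Synthetic voiced frame at 220 Hz with one harmonic (illustration only).
        fs = 16000
        t = np.arange(0.0, 0.04, 1.0 / fs)
        frame = np.sin(2 * np.pi * 220 * t) + 0.3 * np.sin(2 * np.pi * 440 * t)
        print("estimated F0: %.1f Hz" % estimate_f0(frame, fs))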

  15. Analysis of compressive fracture in rock using statistical techniques

    Energy Technology Data Exchange (ETDEWEB)

    Blair, S.C.

    1994-12-01

    Fracture of rock in compression is analyzed using a field-theory model, and the processes of crack coalescence and fracture formation and the effect of grain-scale heterogeneities on macroscopic behavior of rock are studied. The model is based on observations of fracture in laboratory compression tests, and incorporates assumptions developed using fracture mechanics analysis of rock fracture. The model represents grains as discrete sites, and uses superposition of continuum and crack-interaction stresses to create cracks at these sites. The sites are also used to introduce local heterogeneity. Clusters of cracked sites can be analyzed using percolation theory. Stress-strain curves for simulated uniaxial tests were analyzed by studying the location of cracked sites, and partitioning of strain energy for selected intervals. Results show that the model implicitly predicts both development of shear-type fracture surfaces and a strength-vs-size relation that are similar to those observed for real rocks. Results of a parameter-sensitivity analysis indicate that heterogeneity in the local stresses, attributed to the shape and loading of individual grains, has a first-order effect on strength, and that increasing local stress heterogeneity lowers compressive strength following an inverse power law. Peak strength decreased with increasing lattice size and decreasing mean site strength, and was independent of site-strength distribution. A model for rock fracture based on a nearest-neighbor algorithm for stress redistribution is also presented and used to simulate laboratory compression tests, with promising results.
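
    The percolation-theory step, identifying clusters of cracked sites and checking whether any cluster spans the sample, can be sketched independently of the stress model. The snippet below labels clusters on a square lattice of randomly cracked sites; the crack probability is an illustrative assumption, not an output of the field-theory model described above.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(3)
        crack_probability = 0.55                  # illustrative value only
        cracked = rng.random((100, 100)) < crack_probability

        # Label 4-connected clusters of cracked sites and test whether any
        # cluster touches both the top and bottom edges (a spanning cluster,
        # used here as a proxy for a through-going fracture).
        labels, n_clusters = ndimage.label(cracked)
        top = set(labels[0, :]) - {0}
        bottom = set(labels[-1, :]) - {0}
        print("clusters:", n_clusters, "spanning cluster:", bool(top & bottom))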

  16. Structural Analysis of Composite Laminates using Analytical and Numerical Techniques

    Directory of Open Access Journals (Sweden)

    Sanghi Divya

    2016-01-01

    Full Text Available A laminated composite material consists of different layers of matrix and fibres. Its properties can vary greatly with each layer's (ply's) orientation, material properties and the number of layers itself. The present paper focuses on a novel approach of incorporating an analytical method to arrive at a preliminary ply layup order of a composite laminate, which acts as feeder data for the further detailed analysis done with FEA tools. The equations used in our MATLAB code are based on the analytical study and supply results that are, with a high degree of probability, remarkably close to the final optimized layup found through extensive FEA analysis. This significantly reduces computing time and saves considerable FEA processing, so that efficient results are obtained quickly. The output of our method also provides the user with the conditions that predict the successive failure sequence of the composite plies, a result option which is not available in popular FEM tools. The predicted results are further verified by testing the laminates in the laboratory, and the results are found to be in good agreement.
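
    The paper does not reproduce its analytical equations here, but a common analytical starting point for laminate layup studies is classical laminate theory. The sketch below assembles the ABD stiffness matrices for a given ply layup under that assumption; the material constants and the layup are illustrative values, not the paper's data.

        import numpy as np

        def abd_matrix(plies, E1, E2, G12, nu12):
            """Assemble the ABD stiffness matrices of a laminate (classical
            laminate theory). `plies` is a list of (angle_deg, thickness)
            tuples ordered from the bottom surface to the top."""
            nu21 = nu12 * E2 / E1
            denom = 1.0 - nu12 * nu21
            Q = np.array([[E1 / denom, nu12 * E2 / denom, 0.0],
                          [nu12 * E2 / denom, E2 / denom, 0.0],
                          [0.0, 0.0, G12]])
            R = np.diag([1.0, 1.0, 2.0])          # Reuter matrix
            total = sum(t for _, t in plies)
            z = -total / 2.0                      # bottom surface coordinate
            A = np.zeros((3, 3))
            B = np.zeros((3, 3))
            D = np.zeros((3, 3))
            for angle_deg, t in plies:
                c, s = np.cos(np.radians(angle_deg)), np.sin(np.radians(angle_deg))
                T = np.array([[c * c, s * s, 2 * c * s],
                              [s * s, c * c, -2 * c * s],
                              [-c * s, c * s, c * c - s * s]])
                # Transformed reduced stiffness in laminate axes.
                Qbar = np.linalg.inv(T) @ Q @ R @ T @ np.linalg.inv(R)
                z0, z1 = z, z + t
                A += Qbar * (z1 - z0)
                B += Qbar * (z1 ** 2 - z0 ** 2) / 2.0
                D += Qbar * (z1 ** 3 - z0 ** 3) / 3.0
                z = z1
            return A, B, D

        # Example: a symmetric [0/90]s layup with illustrative carbon/epoxy values.
        A, B, D = abd_matrix([(0, 0.125e-3), (90, 0.125e-3), (90, 0.125e-3), (0, 0.125e-3)],
                             E1=135e9, E2=10e9, G12=5e9, nu12=0.3)
        print(np.round(B, 6))                     # ~0 for a symmetric laminate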

  17. HPLC-MS technique for radiopharmaceuticals analysis and quality control

    Science.gov (United States)

    Macášek, F.; Búriová, E.; Brúder, P.; Vera-Ruiz, H.

    2003-01-01

    The potential of liquid chromatography with a mass spectrometric detector (MSD) was investigated for the quality control of radiopharmaceuticals, with 2-deoxy-2-[18F]fluoro-D-glucose (FDG) as an example. A screening of suitable MSD analytical lines is presented. Mass-spectrometric monitoring of the acetonitrile-aqueous ammonium formate eluent via negatively charged FDG.HCO2- ions enables isotope analysis (specific activity) of the radiopharmaceutical at m/z 227 and 226. Kryptofix® 222 provides an intense MSD signal of the positive ion associated with NH4+ at m/z 394. Expired FDG injection samples contain decomposition products, of which at least one is labelled with 18F and characterised by a negative-ion signal at m/z 207; it does not correspond to FDG fragments but to C5 decomposition products. A glucose chromatographic peak, characterised by the m/z 225 negative ion, is accompanied by a tail of a component giving a signal at m/z 227, which can belong to [18O]glucose; isobaric sorbitol signals were excluded, but FDG-glucose association occurs as co-elution in the separation of model mixtures. The latter can actually lead to a convoluted chromatographic peak, but the absence of 18F makes this inconsistent. Quantification and validation of the FDG component analysis is under way.

  18. [THE COMPARATIVE ANALYSIS OF TECHNIQUES OF IDENTIFICATION OF CORYNEBACTERIUM NON DIPHTHERIAE].

    Science.gov (United States)

    Kharseeva, G G; Voronina, N A; Mironov, A Yu; Alutina, E L

    2015-12-01

    The comparative analysis was carried out concerning the effectiveness of three techniques for the identification of Corynebacterium non diphtheriae: bacteriological, molecular genetic (16S rRNA sequencing) and mass-spectrometric (MALDI-ToF MS). The analysis covered 49 strains of Corynebacterium non diphtheriae (C. pseudodiphtheriticum, C. amycolatum, C. propinquum, C. falsenii) and 2 strains of Corynebacterium diphtheriae isolated under various pathologies from the urogenital tract and upper respiratory ways. The corynebacteria were identified using the bacteriological technique, 16S rRNA sequencing and the mass-spectrometric technique (MALDI-ToF MS). Full concordance of the results of species identification was observed in 26 (51%) of the strains of Corynebacterium non diphtheriae when using all three analysis techniques; in 43 (84.3%) strains when comparing the bacteriological technique with 16S rRNA sequencing; and in 29 (57%) when comparing mass-spectrometric analysis with 16S rRNA sequencing. The bacteriological technique is effective for the identification of Corynebacterium diphtheriae. For the precise establishment of the species of corynebacteria with variable biochemical characteristics, the molecular genetic technique of analysis should be applied. The mass-spectrometric technique (MALDI-ToF MS) requires further updating of databases in order to identify a larger spectrum of representatives of the genus Corynebacterium.

  19. Analysis of cultural development of Isfahan city Using Factor analysis method

    Directory of Open Access Journals (Sweden)

    J.Mohammadi

    2013-01-01

    Full Text Available Extended abstract. 1 - Introduction: Cultural spaces are considered as one of the main factors for development. Cultural development is a qualitative and valuable process; to assess it, quantitative indicators are used in cultural planning to pursue development objectives in the provision of goods and services. The aim of the study is to determine and analyze the cultural development level and the regional inequality of the different districts of Isfahan using the factor analysis technique. The statistical population of the study is the 14 districts of the Isfahan municipality. The dominant approach of this study is quantitative, with descriptive and analytical methods. In this study, 35 indices have been summarized by the factor analysis method, reduced to 5 factors, combined into significant ones, and delivered. 2 - Theoretical bases: The most important objectives of spatial planning, considering the limitation of resources, are optimum distributions of facilities and services among the different locations in which people live. To do this, there is a need to identify different locations in terms of the facilities and services they have, so that developed locations are specified and planners can work towards spatial equilibrium and reduce the gap in privileges between districts. The present study has been conducted to move towards equal development in Isfahan's urban districts by identifying the current situation and the manner in which the selected cultural development facilities and indices are distributed among the districts. 3 - Discussion: The cultural development of societies is evaluated by considering the changes and improvement of its indices and measured by quantitative frames. Cultural development indices are the most important tools for cultural planning in a particular district of a society. In this study, cultural development indices have been used to determine the levels of the districts. By using the factor analysis model, the share of influential factors in the cultural
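
    The reduction of 35 indices to 5 factors described above can be reproduced in outline with a standard factor analysis implementation. The sketch below runs on synthetic data standing in for the 14-districts-by-35-indices matrix; the composite ranking by the unweighted mean of factor scores is a simplification (studies usually weight factors by their explained variance), and all names and values are illustrative.

        import numpy as np
        import pandas as pd
        from sklearn.decomposition import FactorAnalysis
        from sklearn.preprocessing import StandardScaler

        # Synthetic stand-in for the study's data: 14 districts x 35 indices.
        rng = np.random.default_rng(42)
        X = StandardScaler().fit_transform(rng.normal(size=(14, 35)))

        # Reduce the 35 indices to 5 factors; varimax rotation makes the
        # factor loadings easier to interpret.
        fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
        scores = fa.fit_transform(X)              # district-by-factor scores
        loadings = pd.DataFrame(fa.components_.T,
                                index=["index_%d" % (i + 1) for i in range(35)],
                                columns=["factor_%d" % (j + 1) for j in range(5)])

        # Rank districts by a simple (unweighted) composite of factor scores.
        composite = pd.Series(scores.mean(axis=1),
                              index=["district_%d" % (d + 1) for d in range(14)])
        print(composite.sort_values(ascending=False))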

  20. Denial of Service Attack Techniques: Analysis, Implementation and Comparison

    Directory of Open Access Journals (Sweden)

    Khaled Elleithy

    2005-02-01

    Full Text Available A denial of service (DOS) attack is any type of attack on a networking structure that disables a server from servicing its clients. Attacks range from sending millions of requests to a server in an attempt to slow it down, to flooding a server with large packets of invalid data, to sending requests with an invalid or spoofed IP address. In this paper we show the implementation and analysis of three main types of attack: Ping of Death, TCP SYN Flood, and Distributed DOS. The Ping of Death attack will be simulated against a Microsoft Windows 95 computer. The TCP SYN Flood attack will be simulated against a Microsoft Windows 2000 IIS FTP Server. Distributed DOS will be demonstrated by simulating a distributed zombie program that carries out the Ping of Death attack. This paper demonstrates the potential damage from DOS attacks and analyzes the ramifications of the damage.