WorldWideScience

Sample records for density analysis applied

  1. Analysis of multi-layered films. [determining dye densities by applying a regression analysis to the spectral response of the composite transparency]

    Science.gov (United States)

    Scarpace, F. L.; Voss, A. W.

    1973-01-01

    Dye densities of multi-layered films are determined by applying a regression analysis to the spectral response of the composite transparency. The amount of dye in each layer is determined by fitting the sum of the individual dye layer densities to the measured dye densities. From this, dye content constants are calculated. Methods of calculating equivalent exposures are discussed. Equivalent exposures are a constant amount of energy over a limited band-width that will give the same dye content constants as the real incident energy. Methods of using these equivalent exposures for analysis of photographic data are presented.
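    The regression described here amounts to a linear least-squares fit of per-layer dye contributions to the measured composite density. A minimal sketch, using synthetic layer spectra and measurements rather than the paper's data, might look like:

```python
import numpy as np

# Synthetic spectral densities of three dye layers sampled at 5 wavelengths.
# Rows: wavelengths; columns: dye layers (e.g., cyan, magenta, yellow).
layer_spectra = np.array([
    [0.9, 0.1, 0.0],
    [0.6, 0.3, 0.1],
    [0.2, 0.8, 0.2],
    [0.1, 0.5, 0.7],
    [0.0, 0.1, 0.9],
])

# True dye content constants (what the regression should recover).
true_content = np.array([0.5, 1.2, 0.8])

# Measured composite density = sum of the individual layer contributions.
measured = layer_spectra @ true_content

# Regression: fit dye content so the summed layer densities match the
# measured composite densities in the least-squares sense.
content, residuals, rank, _ = np.linalg.lstsq(layer_spectra, measured, rcond=None)

print(np.round(content, 3))  # recovers [0.5, 1.2, 0.8]
```

    With noisy measurements the same call returns the least-squares estimate of the dye content constants rather than an exact recovery.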

  2. Experimental assessment of an automatic breast density classification algorithm based on principal component analysis applied to histogram data

    Science.gov (United States)

    Angulo, Antonio; Ferrer, Jose; Pinto, Joseph; Lavarello, Roberto; Guerrero, Jorge; Castaneda, Benjamín.

    2015-01-01

    Breast parenchymal density is considered a strong indicator of cancer risk. However, measures of breast density are often qualitative and require the subjective judgment of radiologists. This work proposes a supervised algorithm to automatically assign a BI-RADS breast density score to a digital mammogram. The algorithm applies principal component analysis to the histograms of a training dataset of digital mammograms to create four different spaces, one for each BI-RADS category. Scoring is achieved by projecting the histogram of the image to be classified onto the four spaces and assigning it to the closest class. In order to validate the algorithm, a training set of 86 images and a separate testing database of 964 images were built. All mammograms were acquired in the craniocaudal view from female patients without any visible pathology. Eight experienced radiologists categorized the mammograms according to the BI-RADS scale, and the mode of their evaluations was taken as ground truth. Results show better agreement between the algorithm and the ground truth for the training set (kappa = 0.74) than for the test set (kappa = 0.44), which suggests that the method may be used for BI-RADS classification but that better training is required.
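    The scheme described, per-class PCA subspaces with nearest-subspace assignment, can be sketched as follows. The two toy "classes" of 16-bin histograms, the number of components, and all parameter values are invented for illustration; they are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_subspace(histograms, n_components=2):
    """PCA space for one density class: mean histogram plus principal axes."""
    mean = histograms.mean(axis=0)
    centered = histograms - mean
    # Rows of vt are the principal directions of this class's histograms.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def residual(hist, mean, axes):
    """Distance from a histogram to its projection onto a class subspace."""
    centered = hist - mean
    projection = axes.T @ (axes @ centered)
    return np.linalg.norm(centered - projection)

# Toy training data: two classes of 16-bin histograms with different shapes.
bins = np.arange(16)
class_a = np.exp(-0.5 * ((bins - 4) / 2.0) ** 2) + 0.05 * rng.random((20, 16))
class_b = np.exp(-0.5 * ((bins - 11) / 2.0) ** 2) + 0.05 * rng.random((20, 16))
spaces = [fit_subspace(h) for h in (class_a, class_b)]

# Classify a new histogram by the closest class subspace.
query = np.exp(-0.5 * ((bins - 4) / 2.0) ** 2)
scores = [residual(query, mean, axes) for mean, axes in spaces]
print(int(np.argmin(scores)))  # class 0, the class peaked near bin 4
```

    The real algorithm builds four such spaces, one per BI-RADS category, and the "closest" criterion is whatever distance the authors chose; the projection residual used here is one common option.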

  3. Applied analysis

    CERN Document Server

    Lanczos, Cornelius

    1956-01-01

    Basic text for graduate and advanced undergraduate deals with search for roots of algebraic equations encountered in vibration and flutter problems and in those of static and dynamic stability. Other topics devoted to matrices and eigenvalue problems, large-scale linear systems, harmonic analysis and data analysis, more.

  4. Analysis of parenchymal patterns using conspicuous spatial frequency features in mammograms applied to the BI-RADS density rating scheme

    Science.gov (United States)

    Perconti, Philip; Loew, Murray

    2006-03-01

    Automatic classification of the density of breast parenchyma is shown using a measure that is correlated with human observer performance, and compared against the BI-RADS density rating. Increasingly popular in the United States, the Breast Imaging Reporting and Data System (BI-RADS) is used to draw attention to the increased screening difficulty associated with greater breast density; however, the BI-RADS rating scheme is subjective and is not intended as an objective measure of breast density. Because BI-RADS does not define density classes using a standardized measure, variability among observers is increased. The adaptive thresholding technique is a more quantitative approach for assessing percentage breast density, but considerable reader interaction is required. We calculate an objective density rating that is derived using a measure of local feature salience. Previously, this measure was shown to correlate well with radiologists' localization and discrimination of true-positive and true-negative regions of interest. Using conspicuous spatial frequency features, an objective density rating is obtained and correlated with adaptive thresholding and with the subjectively ascertained BI-RADS density ratings. Using 100 cases obtained from the University of South Florida's DDSM database, we show that an automated breast density measure can be derived that is correlated with the interactive thresholding method for continuous percentage breast density, but not with the BI-RADS density rating categories for the selected cases. Comparison between interactive thresholding and the new salience percentage density resulted in a Pearson correlation of 76.7%. Using a four-category scale equivalent to the BI-RADS density categories, a Spearman correlation coefficient of 79.8% was found.
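    The interactive-thresholding baseline the authors compare against reduces to counting breast-region pixels above a reader-chosen gray level. A sketch with a synthetic patch, a trivial whole-patch mask, and an arbitrary fixed threshold standing in for the reader's choice:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic patch: fatty tissue around gray level 60, one dense region near 180.
image = rng.normal(60, 10, size=(128, 128))
image[32:64, 32:96] = rng.normal(180, 10, size=(32, 64))
breast_mask = np.ones_like(image, dtype=bool)  # whole patch is breast here

def percent_density(image, mask, threshold):
    """Percentage of breast-region pixels whose intensity exceeds threshold."""
    breast_pixels = image[mask]
    return 100.0 * np.mean(breast_pixels > threshold)

# In interactive thresholding a reader picks the threshold; 120 is arbitrary.
print(round(percent_density(image, breast_mask, 120.0), 1))  # ~12.5
```

    The dense block covers 2048 of 16384 pixels, so the expected percentage density is 12.5%; in a real mammogram the breast mask and the threshold choice dominate the result, which is exactly the reader-interaction burden the salience measure aims to remove.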

  5. Applied longitudinal analysis

    CERN Document Server

    Fitzmaurice, Garrett M; Ware, James H

    2012-01-01

    Praise for the First Edition: ". . . [this book] should be on the shelf of everyone interested in . . . longitudinal data analysis." -Journal of the American Statistical Association   Features newly developed topics and applications of the analysis of longitudinal data. Applied Longitudinal Analysis, Second Edition presents modern methods for analyzing data from longitudinal studies and now features the latest state-of-the-art techniques. The book emphasizes practical, rather than theoretical, aspects of methods for the analysis of diverse types of lo

  6. Applied Behavior Analysis

    Science.gov (United States)

    Szapacs, Cindy

    2006-01-01

    Teaching strategies that work for typically developing children often do not work for those diagnosed with an autism spectrum disorder. However, teaching strategies that work for children with autism do work for typically developing children. In this article, the author explains how the principles and concepts of Applied Behavior Analysis can be…

  7. Applied multivariate statistical analysis

    CERN Document Server

    Härdle, Wolfgang Karl

    2015-01-01

    Focusing on high-dimensional applications, this 4th edition presents the tools and concepts used in multivariate data analysis in a style that is also accessible for non-mathematicians and practitioners.  It surveys the basic principles and emphasizes both exploratory and inferential statistics; a new chapter on Variable Selection (Lasso, SCAD and Elastic Net) has also been added.  All chapters include practical exercises that highlight applications in different multivariate data analysis fields: in quantitative financial studies, where the joint dynamics of assets are observed; in medicine, where recorded observations of subjects in different locations form the basis for reliable diagnoses and medication; and in quantitative marketing, where consumers’ preferences are collected in order to construct models of consumer behavior.  All of these examples involve high to ultra-high dimensions and represent a number of major fields in big data analysis. The fourth edition of this book on Applied Multivariate ...

  8. Online traffic flow model applying dynamic flow-density relation

    International Nuclear Information System (INIS)

    Kim, Y.

    2002-01-01

    This dissertation describes a new approach of the online traffic flow modelling based on the hydrodynamic traffic flow model and an online process to adapt the flow-density relation dynamically. The new modelling approach was tested based on the real traffic situations in various homogeneous motorway sections and a motorway section with ramps and gave encouraging simulation results. This work is composed of two parts: first the analysis of traffic flow characteristics and second the development of a new online traffic flow model applying these characteristics. For homogeneous motorway sections traffic flow is classified into six different traffic states with different characteristics. Delimitation criteria were developed to separate these states. The hysteresis phenomena were analysed during the transitions between these traffic states. The traffic states and the transitions are represented on a states diagram with the flow axis and the density axis. For motorway sections with ramps the complicated traffic flow is simplified and classified into three traffic states depending on the propagation of congestion. The traffic states are represented on a phase diagram with the upstream demand axis and the interaction strength axis which was defined in this research. The states diagram and the phase diagram provide a basis for the development of the dynamic flow-density relation. The first-order hydrodynamic traffic flow model was programmed according to the cell-transmission scheme extended by the modification of flow dependent sending/receiving functions, the classification of cells and the determination strategy for the flow-density relation in the cells. The unreasonable results of macroscopic traffic flow models, which may occur in the first and last cells in certain conditions are alleviated by applying buffer cells between the traffic data and the model. The sending/receiving functions of the cells are determined dynamically based on the classification of the
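    The first-order hydrodynamic model with the cell-transmission scheme that this work extends can be sketched as below. The triangular flow-density relation, cell size, time step, and demand are illustrative stand-ins, not the dissertation's calibrated values or its dynamic adaptation of the relation:

```python
import numpy as np

# Triangular fundamental diagram (illustrative values).
V_FREE = 30.0    # free-flow speed, m/s
W_BACK = 6.0     # congestion wave speed, m/s
RHO_JAM = 0.15   # jam density, veh/m
Q_MAX = V_FREE * W_BACK * RHO_JAM / (V_FREE + W_BACK)  # capacity flow, veh/s

DX, DT = 300.0, 5.0  # cell length (m) and time step (s); DT <= DX / V_FREE

def sending(rho):
    """Demand: flow a cell can send downstream."""
    return np.minimum(V_FREE * rho, Q_MAX)

def receiving(rho):
    """Supply: flow a cell can receive from upstream."""
    return np.minimum(W_BACK * (RHO_JAM - rho), Q_MAX)

def step(rho, inflow):
    """One cell-transmission update of the density vector."""
    # Interface flow = min of upstream demand and downstream supply.
    q = np.minimum(sending(rho[:-1]), receiving(rho[1:]))
    q = np.concatenate(([inflow], q, [sending(rho[-1])]))  # boundary flows
    return rho + (DT / DX) * (q[:-1] - q[1:])

rho = np.full(10, 0.02)  # initially light traffic on a 10-cell section
for _ in range(200):
    rho = step(rho, inflow=0.9 * Q_MAX)
print(np.round(rho, 4))  # all cells settle near 0.9 * Q_MAX / V_FREE = 0.0225
```

    The dissertation's contribution sits on top of this scheme: classifying cells by traffic state and re-determining the sending/receiving functions dynamically instead of keeping the diagram fixed as this sketch does.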

  9. Applied Behavior Analysis in Education.

    Science.gov (United States)

    Cooper, John O.

    1982-01-01

    Applied behavioral analysis in education is expanding rapidly. This article describes the dimensions of applied behavior analysis and the contributions this technology offers teachers in the area of systematic applications, direct and daily measurement, and experimental methodology. (CJ)

  10. Analytical Schwartz density applied to heavy two-electron ions

    Energy Technology Data Exchange (ETDEWEB)

    Romera, E.; Dehesa, J.S. [Universidad de Granada (Spain)]; Koga, Toshikatsu [Muroran Institute of Technology (Japan)]

    1997-01-20

    An analytical expression of the electron density function ρ(r) due to Schwartz for two-electron atomic systems is applied to a detailed study of density-dependent properties of relatively heavy two-electron ions. Comparison of the Schwartz results with those from accurate Hartree-Fock and Hylleraas wave functions shows that despite its simple yet analytical form, the Schwartz density has a quantitative applicability in the density study of two-electron atoms within the nonrelativistic framework. 13 refs., 4 tabs.

  11. Applied nonstandard analysis

    CERN Document Server

    Davis, Martin

    2005-01-01

    Geared toward upper-level undergraduates and graduate students, this text explores the applications of nonstandard analysis without assuming any knowledge of mathematical logic. It develops the key techniques of nonstandard analysis at the outset from a single, powerful construction; then, beginning with a nonstandard construction of the real number system, it leads students through a nonstandard treatment of the basic topics of elementary real analysis, topological spaces, and Hilbert space.Important topics include nonstandard treatments of equicontinuity, nonmeasurable sets, and the existenc

  12. Applied multivariate statistical analysis

    National Research Council Canada - National Science Library

    Johnson, Richard Arnold; Wichern, Dean W

    1988-01-01

    .... The authors hope that their discussions will meet the needs of experimental scientists, in a wide variety of subject matter areas, as a readable introduction to the statistical analysis of multivariate observations...

  13. Applied functional analysis

    CERN Document Server

    Oden, J Tinsley

    2010-01-01

    The textbook is designed to drive a crash course for beginning graduate students majoring in something besides mathematics, introducing mathematical foundations that lead to classical results in functional analysis. More specifically, Oden and Demkowicz want to prepare students to learn the variational theory of partial differential equations, distributions, and Sobolev spaces and numerical analysis with an emphasis on finite element methods. The 1996 first edition has been used in a rather intensive two-semester course. -Book News, June 2010

  14. Applied functional analysis

    CERN Document Server

    Griffel, DH

    2002-01-01

    A stimulating introductory text, this volume examines many important applications of functional analysis to mechanics, fluid mechanics, diffusive growth, and approximation. Detailed enough to impart a thorough understanding, the text is also sufficiently straightforward for those unfamiliar with abstract analysis. Its four-part treatment begins with distribution theory and discussions of Green's functions. Essentially independent of the preceding material, the second and third parts deal with Banach spaces, Hilbert space, spectral theory, and variational techniques. The final part outlines the

  15. The relationship between local liquid density and force applied on a tip of atomic force microscope: a theoretical analysis for simple liquids.

    Science.gov (United States)

    Amano, Ken-ichi; Suzuki, Kazuhiro; Fukuma, Takeshi; Takahashi, Ohgi; Onishi, Hiroshi

    2013-12-14

    The density of a liquid is not uniform when placed on a solid. The structured liquid pushes or pulls a probe employed in atomic force microscopy, as demonstrated in a number of experimental studies. In the present study, the relation between the force on a probe and the local density of a liquid is derived based on the statistical mechanics of simple liquids. When the probe is identical to a solvent molecule, the strength of the force is shown to be proportional to the vertical gradient of ln(ρDS) with the local liquid's density on a solid surface being ρDS. The intrinsic liquid's density on a solid is numerically calculated and compared with the density reconstructed from the force on a probe that is identical or not identical to the solvent molecule.
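    In the notation of this abstract, the stated proportionality can be written schematically as follows; the sign convention and the k_B T prefactor are a gloss on "proportional to the vertical gradient", while the paper itself derives the full relation from the statistical mechanics of simple liquids:

```latex
F(z) \;\propto\; k_{\mathrm{B}} T \,\frac{\partial}{\partial z} \ln \rho_{\mathrm{DS}}(z)
```

    Here z is the tip-surface separation and ρ_DS(z) the local liquid density on the solid surface, so the measured force profile encodes the logarithmic derivative of the density profile.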

  16. The relationship between local liquid density and force applied on a tip of atomic force microscope: A theoretical analysis for simple liquids

    International Nuclear Information System (INIS)

    Amano, Ken-ichi; Takahashi, Ohgi; Suzuki, Kazuhiro; Fukuma, Takeshi; Onishi, Hiroshi

    2013-01-01

    The density of a liquid is not uniform when placed on a solid. The structured liquid pushes or pulls a probe employed in atomic force microscopy, as demonstrated in a number of experimental studies. In the present study, the relation between the force on a probe and the local density of a liquid is derived based on the statistical mechanics of simple liquids. When the probe is identical to a solvent molecule, the strength of the force is shown to be proportional to the vertical gradient of ln(ρDS) with the local liquid's density on a solid surface being ρDS. The intrinsic liquid's density on a solid is numerically calculated and compared with the density reconstructed from the force on a probe that is identical or not identical to the solvent molecule.

  17. Conversation Analysis in Applied Linguistics

    DEFF Research Database (Denmark)

    Kasper, Gabriele; Wagner, Johannes

    2014-01-01

    For the last decade, conversation analysis (CA) has increasingly contributed to several established fields in applied linguistics. In this article, we will discuss its methodological contributions. The article distinguishes between basic and applied CA. Basic CA is a sociological endeavor concerned...

  18. Modern charge-density analysis

    CERN Document Server

    Gatti, Carlo

    2012-01-01

    Focusing on developments from the past 10-15 years, this volume presents an objective overview of the research in charge density analysis. The most promising methodologies are included, in addition to powerful interpretative tools and a survey of important areas of research.

  19. Applied analysis and differential equations

    CERN Document Server

    Cârj, Ovidiu

    2007-01-01

    This volume contains refereed research articles written by experts in the field of applied analysis, differential equations and related topics. Well-known leading mathematicians worldwide and prominent young scientists cover a diverse range of topics, including the most exciting recent developments. A broad range of topics of recent interest are treated: existence, uniqueness, viability, asymptotic stability, viscosity solutions, controllability and numerical analysis for ODE, PDE and stochastic equations. The scope of the book is wide, ranging from pure mathematics to various applied fields such as classical mechanics, biomedicine, and population dynamics.

  20. Applied survival analysis using R

    CERN Document Server

    Moore, Dirk F

    2016-01-01

    Applied Survival Analysis Using R covers the main principles of survival analysis, gives examples of how it is applied, and teaches how to put those principles to use to analyze data using R as a vehicle. Survival data, where the primary outcome is time to a specific event, arise in many areas of biomedical research, including clinical trials, epidemiological studies, and studies of animals. Many survival methods are extensions of techniques used in linear regression and categorical data, while other aspects of this field are unique to survival data. This text employs numerous actual examples to illustrate survival curve estimation, comparison of survivals of different groups, proper accounting for censoring and truncation, model variable selection, and residual analysis. Because explaining survival analysis requires more advanced mathematics than many other statistical topics, this book is organized with basic concepts and most frequently used procedures covered in earlier chapters, with more advanced topics...

  1. Speckle photography applied to the density field of a flame

    Science.gov (United States)

    Shu, J.-Z.; Li, J.-Y.

    1987-11-01

    An optical arrangement combining a set-up for taking speckle records with a shearing interferometer using a Wollaston prism is applied to the study of a fluctuating Bunsen burner flame. The simultaneous recording, in real time, of the interferogram facilitates the interpretation of the data field derived by the point-by-point analysis of the specklegram. The pattern of Young's fringes obtained by analysis of the specklegram at 16 different positions in the field of view is shown, displaying the random variation of the light deflection in the flame.

  2. Modern problems in applied analysis

    CERN Document Server

    Rogosin, Sergei

    2018-01-01

    This book features a collection of recent findings in Applied Real and Complex Analysis that were presented at the 3rd International Conference “Boundary Value Problems, Functional Equations and Applications” (BAF-3), held in Rzeszow, Poland on 20-23 April 2016. The contributions presented here develop a technique related to the scope of the workshop and touching on the fields of differential and functional equations, complex and real analysis, with a special emphasis on topics related to boundary value problems. Further, the papers discuss various applications of the technique, mainly in solid mechanics (crack propagation, conductivity of composite materials), biomechanics (viscoelastic behavior of the periodontal ligament, modeling of swarms) and fluid dynamics (Stokes and Brinkman type flows, Hele-Shaw type flows). The book is addressed to all readers who are interested in the development and application of innovative research results that can help solve theoretical and real-world problems.

  3. Analysis of electron-correlation effects in strongly correlated systems (N2 and N2+ ) by applying the density-matrix renormalization-group method and quantum information theory

    Science.gov (United States)

    Stemmle, Christian; Paulus, Beate; Legeza, Örs

    2018-02-01

    The dissociation of N2 and N2+ has been studied by using the ab initio density-matrix renormalization-group (DMRG) method. Accurate potential energy surfaces (PESs) have been obtained for the electronic ground states of N2 (X¹Σg⁺) and N2+ (X²Σg⁺) as well as for the N2+ excited state B²Σu⁺. Inherent to the DMRG approach, the eigenvalues of the reduced density matrix (ρ) and their correlation functions are at hand. Thus we can apply quantum information theory directly and investigate how the wave function changes along the PES and depict differences between the different states. Moreover, by characterizing quantum entanglement between different pairs of orbitals and analyzing the reduced density matrix, we achieved a better understanding of the multireference character featured by these systems.

  4. How Thin Is Foil? Applying Density to Find the Thickness of Aluminum Foil

    Science.gov (United States)

    Concannon, James P.

    2011-01-01

    In this activity, I show how high school students apply their knowledge of density to solve for an unknown variable, such as thickness. Students leave this activity with a better understanding of density, the knowledge that density is a characteristic property of a given substance, and the ways density can be measured. (Contains 4 figures and 1 table.)
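    The calculation behind the activity follows directly from the definition of density: volume = mass / density, and thickness = volume / area. A sketch, with the sheet's mass and dimensions made up for illustration:

```python
# Thickness of a foil sheet from its mass, area, and known density.
DENSITY_AL = 2.70  # g/cm^3, aluminum

def foil_thickness_cm(mass_g, length_cm, width_cm, density=DENSITY_AL):
    """thickness = volume / area, with volume = mass / density."""
    area = length_cm * width_cm
    return mass_g / (density * area)

# Example: a 30 cm x 30 cm sheet weighing 3.89 g.
t = foil_thickness_cm(3.89, 30.0, 30.0)
print(f"{t * 1e4:.1f} micrometers")  # about 16.0 micrometers
```

    The result, a few hundredths of a millimeter, is far below what a ruler can resolve, which is the point of the exercise.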

  5. Kernel density estimation applied to bond length, bond angle, and torsion angle distributions.

    Science.gov (United States)

    McCabe, Patrick; Korb, Oliver; Cole, Jason

    2014-05-27

    We describe the method of kernel density estimation (KDE) and apply it to molecular structure data. KDE is a quite general nonparametric statistical method suitable even for multimodal data. The method generates smooth probability density function (PDF) representations and finds application in diverse fields such as signal processing and econometrics. KDE appears to have been under-utilized as a method in molecular geometry analysis, chemo-informatics, and molecular structure optimization. The resulting probability densities have advantages over histograms and, importantly, are also suitable for gradient-based optimization. To illustrate KDE, we describe its application to chemical bond length, bond valence angle, and torsion angle distributions and show the ability of the method to model arbitrary torsion angle distributions.
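    A minimal KDE of a synthetic torsion-angle sample along these lines, using SciPy's `gaussian_kde`; note that torsion angles are periodic, so a production estimator should wrap the density at 360°, which this sketch does not:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)

# Synthetic bimodal "torsion angle" sample in degrees:
# a gauche-like mode near 60 deg and a trans-like mode near 180 deg.
angles = np.concatenate([
    rng.normal(60, 8, 500),
    rng.normal(180, 8, 1500),
])

# Gaussian KDE yields a smooth, differentiable PDF estimate with no bin
# edges to choose, which is what makes it usable in gradient-based optimization.
kde = gaussian_kde(angles)

grid = np.linspace(0, 360, 721)
pdf = kde(grid)

# The trans-like mode should dominate: three times as many samples near 180.
print(grid[int(np.argmax(pdf))])
```

    Unlike a histogram, the estimate and its gradient are defined everywhere on the grid, so the density can feed directly into a structure-optimization objective.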

  6. Temperature-dependent spectral density analysis applied to monitoring backbone dynamics of major urinary protein-I complexed with the pheromone 2-sec-butyl-4,5-dihydrothiazole

    International Nuclear Information System (INIS)

    Backbone dynamics of mouse major urinary protein I (MUP-I) was studied by 15N NMR relaxation. Data were collected at multiple temperatures for a complex of MUP-I with its natural pheromonal ligand, 2-sec-butyl-4,5-dihydrothiazole, and for the free protein. The measured relaxation rates were analyzed using reduced spectral density mapping. Graphical analysis of the spectral density values provided an unbiased qualitative picture of the internal motions. Varying the temperature greatly increased the range of analyzed spectral density values and therefore improved the reliability of the analysis. Quantitative parameters describing the dynamics on the picosecond-to-nanosecond time scale were obtained using a novel method of simultaneous data fitting at multiple temperatures. Both methods showed that backbone flexibility on the fast time scale is slightly increased upon pheromone binding, in accordance with previously reported results. Zero-frequency spectral density values revealed conformational changes on the microsecond-to-millisecond time scale. Measurements at different temperatures allowed the temperature dependence of the motional parameters to be monitored.

  7. Essentials of applied dynamic analysis

    CERN Document Server

    Jia, Junbo

    2014-01-01

    This book presents up-to-date knowledge of dynamic analysis in the engineering world. To facilitate understanding of the topics by readers with various backgrounds, general principles are linked to their applications from different angles. Topics of special interest, such as statistics of motions and loading, damping modeling and measurement, nonlinear dynamics, fatigue assessment, vibration and buckling under axial loading, structural health monitoring, human body vibrations, and vehicle-structure interactions, are also presented. The target readers include industry professionals in civil, marine and mechanical engineering, as well as researchers and students in this area.

  8. Central depression in nucleonic densities: Trend analysis in the nuclear density functional theory approach

    Science.gov (United States)

    Schuetrumpf, B.; Nazarewicz, W.; Reinhard, P.-G.

    2017-08-01

    Background: The central depression of nucleonic density, i.e., a reduction of density in the nuclear interior, has been attributed to many factors. For instance, bubble structures in superheavy nuclei are believed to be due to the electrostatic repulsion. In light nuclei, the mechanism behind the density reduction in the interior has been discussed in terms of shell effects associated with occupations of s orbits. Purpose: The main objective of this work is to reveal mechanisms behind the formation of central depression in nucleonic densities in light and heavy nuclei. To this end, we introduce several measures of the internal nucleonic density. Through statistical analysis, we study the information content of these measures with respect to nuclear matter properties. Method: We apply nuclear density functional theory with Skyrme functionals. Using the statistical tools of linear least-squares regression, we inspect correlations between various measures of central depression and model parameters, including nuclear matter properties. We study bivariate correlations with selected quantities as well as multiple correlations with groups of parameters. Detailed correlation analysis is carried out for 34Si, for which a bubble structure has been reported recently, for 48Ca, and for the N = 82, 126, and 184 isotonic chains. Results: We show that the central depression in medium-mass nuclei is very sensitive to shell effects, whereas for superheavy systems it is firmly driven by the electrostatic repulsion. An appreciable semibubble structure in proton density is predicted for 294Og, which is currently the heaviest nucleus known experimentally. Conclusion: Our correlation analysis reveals that the central density indicators in nuclei below 208Pb carry little information on parameters of nuclear matter; they are predominantly driven by shell structure.
On the other hand, in the superheavy nuclei there exists a clear relationship between the central nucleonic density and symmetry energy.

  9. Caldwell University's Department of Applied Behavior Analysis.

    Science.gov (United States)

    Reeve, Kenneth F; Reeve, Sharon A

    2016-05-01

    Since 2004, faculty members at Caldwell University have developed three successful graduate programs in Applied Behavior Analysis (i.e., PhD, MA, non-degree programs), increased program faculty from two to six members, developed and operated an on-campus autism center, and begun a stand-alone Applied Behavior Analysis Department. This paper outlines a number of strategies used to advance these initiatives, including those associated with an extensive public relations campaign. We also outline challenges that have limited our programs' growth. These strategies, along with a consideration of potential challenges, might prove useful in guiding academicians who are interested in starting their own programs in behavior analysis.

  10. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...

  11. Building an applied activation analysis centre

    International Nuclear Information System (INIS)

    Bartosek, J.; Kasparec, I.; Masek, J.

    1972-01-01

    Requirements are defined, and all available background material is reported and discussed, for building up a centre of applied activation analysis in Czechoslovakia. A detailed analysis of potential users and the centre's envisaged availability is also presented as part of the submitted study. A brief economic analysis is annexed. The study covers the situation up to the end of 1972. (J.K.)

  12. Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Johnston, J. M.; Foxx, R. M.; Jacobson, J. W.; Green, G.; Mulick, J. A.

    2006-01-01

    This article reviews the origins and characteristics of the positive behavior support (PBS) movement and examines those features in the context of the field of applied behavior analysis (ABA). We raise a number of concerns about PBS as an approach to delivery of behavioral services and its impact on how ABA is viewed by those in human services. We…

  13. Applied Behavior Analysis: Beyond Discrete Trial Teaching

    Science.gov (United States)

    Steege, Mark W.; Mace, F. Charles; Perry, Lora; Longenecker, Harold

    2007-01-01

    We discuss the problem of autism-specific special education programs representing themselves as Applied Behavior Analysis (ABA) programs when the only ABA intervention employed is Discrete Trial Teaching (DTT), and often for limited portions of the school day. Although DTT has many advantages to recommend its use, it is not well suited to teach…

  14. Evaluating the function of applied behavior analysis: a bibliometric analysis.

    OpenAIRE

    Critchfield, Thomas S

    2002-01-01

    Analysis of scholarly citations involving behavioral journals reveals that, consistent with its mission, applied behavior analysis research frequently references the basic behavioral literature but, as some have suspected, exerts narrow scholarly influence.

  15. Applied Data Analysis in Energy Monitoring System

    Directory of Open Access Journals (Sweden)

    Kychkin A.V.

    2016-08-01

    Full Text Available The organization of a software and hardware system is presented as an example of building energy monitoring for multi-sectional lighting and climate control/conditioning needs. The system's key feature is applied analysis of office energy data, which enables localized work-mode recognition for each type of hardware. Recognition is based on the overall energy consumption profile, followed by evaluation of energy consumption and workload. The applied data analysis comprises a primary data processing block, a smoothing filter, a time-stamp identification block, clusterization and classification blocks, a state-change detection block, and a statistical calculation block. The energy consumed in a time slot and the slot's time stamp are taken as the main parameters for work-mode classification. Experimental results of the applied energy data analysis, using HIL and the OpenJEVis visualization system over a chosen time period, are provided. Energy consumption, workload calculation, and identification of eight different states were carried out for two lighting sections and one emulated climate control/conditioning system from the integral energy consumption profile. The research was supported by university internal grant №2016/PI-2 «Methodology development of monitoring and heat flow utilization as low potential company energy sources».
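    The work-mode classification step described in this abstract, labeling each time slot from its consumed energy and detecting state changes, can be sketched with a simple threshold rule; the profile, the number of states, and the threshold values below are invented for illustration:

```python
import numpy as np

# Synthetic per-slot energy readings (kWh) for one lighting section over a day.
profile = np.array([0.0, 0.0, 0.1, 0.1, 0.6, 1.8, 1.9, 2.0,
                    1.9, 0.6, 0.1, 0.0])

def classify_slots(profile, idle_max=0.2, active_min=1.0):
    """Label each time slot: 0 = off/idle, 1 = transitional, 2 = active."""
    labels = np.ones(len(profile), dtype=int)
    labels[profile <= idle_max] = 0
    labels[profile >= active_min] = 2
    return labels

labels = classify_slots(profile)
# Slot indices where the label changes mark work-mode transitions.
changes = np.flatnonzero(np.diff(labels)) + 1
print(labels.tolist(), changes.tolist())
```

    The system in the paper replaces the fixed thresholds with clusterization and classification blocks learned from the consumption profile, but the slot-wise labeling and state-change detection follow the same pattern.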

  16. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    In this thesis we explore the use of functional data analysis as a method to analyse chemometric data, more specifically spectral data in metabolomics. Functional data analysis is a vibrant field in statistics. It has been rapidly expanding in both methodology and applications since it was made well...... nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the influence of different factors on specific metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply...... of functional confidence intervals for mean curves. We also discuss the many practical considerations in wavelet estimation and thresholding, and the important influence the choices can have on the resulting estimates. On a conceptual level, the purpose of this thesis is to build a stronger connection between...

  17. Applied research in uncertainty modeling and analysis

    CERN Document Server

    Ayyub, Bilal

    2005-01-01

    Uncertainty has been a concern to engineers, managers, and scientists for many years. For a long time uncertainty has been considered synonymous with random, stochastic, statistic, or probabilistic. Since the early sixties views on uncertainty have become more heterogeneous. In the past forty years numerous tools that model uncertainty, above and beyond statistics, have been proposed by several engineers and scientists. The tool/method to model uncertainty in a specific context should really be chosen by considering the features of the phenomenon under consideration, not independent of what is known about the system and what causes uncertainty. In this fascinating overview of the field, the authors provide broad coverage of uncertainty analysis/modeling and its application. Applied Research in Uncertainty Modeling and Analysis presents the perspectives of various researchers and practitioners on uncertainty analysis and modeling outside their own fields and domain expertise. Rather than focusing explicitly on...

  18. Strategic decision analysis applied to borehole seismology

    International Nuclear Information System (INIS)

    Menke, M.M.; Paulsson, B.N.P.

    1994-01-01

    Strategic Decision Analysis (SDA) is the evolving body of knowledge on how to achieve high quality in the decisions that shape an organization's future. SDA comprises philosophy, process concepts, methodology, and tools for making good decisions. It specifically incorporates many concepts and tools from economic evaluation and risk analysis. Chevron Petroleum Technology Company (CPTC) has applied SDA to evaluate and prioritize a number of its most important and most uncertain R and D projects, including borehole seismology. Before SDA, there were significant issues and concerns about the value to CPTC of continuing to work on borehole seismology. The SDA process created a cross-functional team of experts to structure and evaluate this project. A credible economic model was developed, discrete risks and continuous uncertainties were assessed, and an extensive sensitivity analysis was performed. The results, even applied to a very restricted drilling program for a few years, were good enough to demonstrate the value of continuing the project. This paper explains the SDA philosophy, concepts, and process, and demonstrates the methodology and tools using the borehole seismology project example. SDA is useful in the upstream industry not only in R and D/technology decisions, but also in major exploration and production decisions. Since a major challenge for upstream companies today is to create and realize value, the SDA approach should have very broad applicability.

  19. [Computerized image analysis applied to urology research].

    Science.gov (United States)

    Urrutia Avisrror, M

    1994-05-01

    Diagnosis with the aid of imaging techniques in urology has developed dramatically over the last few years, as state-of-the-art technology has added digital angiology to the latest generation of ultrasound apparatus. Computerized axial tomography and nuclear magnetic resonance allow very high rates of diagnostic possibilities that only a decade ago were not in routine use. Each of these examination procedures has its own limits of sensitivity and specificity, which vary as a function of the pathoanatomical characteristics of the condition to be explored, although none yet reaches absolute values. With ultrasound, CAT and NMR, identification of the various diseases relies on the analysis of densities, albeit with a significant degree of examiner subjectivity in the diagnostic judgement. The logical evolution of these techniques is to eliminate this subjective component and translate the features which characterize each disease into quantifiable parameters, a challenge made feasible by computerized analysis. Thanks to technological advances in the field of microcomputers and the decreased cost of equipment, it is currently possible for any clinical investigator with average resources to use the most sophisticated image analysis techniques for post-processing of the images obtained, opening a pathway in practical investigation that just a few years ago was exclusive to certain organizations due to the high cost involved.

  20. Functional Data Analysis Applied in Chemometrics

    DEFF Research Database (Denmark)

    Muller, Martha

    studies the `unique chemical fingerprints' (Daviss, 2005) that cellular processes create in living systems. Metabolomics is used to study the influence of nutrition on the human metabolome. Nutritional metabolomics shows great potential for the discovery of novel biomarkers of food consumption, personal...... nutritional status and metabolic phenotype. We want to understand how metabolomic spectra can be analysed using functional data analysis to detect the influence of different factors on specific metabolites. These factors can include, for example, gender, diet culture or dietary intervention. In Paper I we apply...... care, including personalised nutrition for prevention and treatment....

  1. Computerized image analysis: estimation of breast density on mammograms

    Science.gov (United States)

    Zhou, Chuan; Chan, Heang-Ping; Petrick, Nicholas; Sahiner, Berkman; Helvie, Mark A.; Roubidoux, Marilyn A.; Hadjiiski, Lubomir M.; Goodsitt, Mitchell M.

    2000-06-01

    An automated image analysis tool is being developed for estimation of mammographic breast density, which may be useful for risk estimation or for monitoring breast density change in a prevention or intervention program. A mammogram is digitized using a laser scanner and the resolution is reduced to a pixel size of 0.8 mm × 0.8 mm. Breast density analysis is performed in three stages. First, the breast region is segmented from the surrounding background by an automated breast boundary-tracking algorithm. Second, an adaptive dynamic range compression technique is applied to the breast image to reduce the range of the gray level distribution in the low frequency background and to enhance the differences in the characteristic features of the gray level histogram for breasts of different densities. Third, rule-based classification is used to classify the breast images into several classes according to the characteristic features of their gray level histogram. For each image, a gray level threshold is automatically determined to segment the dense tissue from the breast region. The area of segmented dense tissue as a percentage of the breast area is then estimated. In this preliminary study, we analyzed the interobserver variation of breast density estimation by two experienced radiologists using the BI-RADS lexicon. The radiologists' visually estimated percent breast densities were compared with the computer's calculation. The results demonstrate the feasibility of estimating mammographic breast density using computer vision techniques and its potential to improve the accuracy and reproducibility in comparison with the subjective visual assessment by radiologists.
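
    The final stage described above, thresholding the gray-level histogram and reporting dense tissue as a percentage of breast area, reduces to a few lines once a threshold has been chosen. The pixel values and threshold below are illustrative, not from the study:

```python
# Percent mammographic density from a segmented breast region:
# the fraction of breast pixels at or above a dense-tissue threshold.
# Pixel values and the threshold are illustrative assumptions.

def percent_dense(breast_pixels, threshold):
    """Dense-tissue area as a percentage of the breast area."""
    dense = sum(1 for p in breast_pixels if p >= threshold)
    return 100.0 * dense / len(breast_pixels)

# Toy breast region: 60 low-gray-level (fatty) and 40 high (dense) pixels.
pixels = [50] * 60 + [200] * 40
print(percent_dense(pixels, threshold=128))  # 40.0
```

    In the actual tool the threshold is determined automatically per image from the histogram, rather than fixed as here.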

  2. Tissue Microarray Analysis Applied to Bone Diagenesis.

    Science.gov (United States)

    Mello, Rafael Barrios; Silva, Maria Regina Regis; Alves, Maria Teresa Seixas; Evison, Martin Paul; Guimarães, Marco Aurelio; Francisco, Rafaella Arrabaca; Astolphi, Rafael Dias; Iwamura, Edna Sadayo Miazato

    2017-01-04

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens. Standard hematoxylin and eosin, periodic acid-Schiff and silver methenamine, and picrosirius red staining, and CD31 and CD34 immunohistochemistry were applied to TMA sections. Osteocyte and osteocyte lacuna counts, percent bone matrix loss, and fungal spheroid element counts could be measured and collagen fibre bundles observed in all specimens. Decalcification with 7% nitric acid proceeded more rapidly than with 0.5 M EDTA and may offer better preservation of histological and cellular structure. No endothelial cells could be detected using CD31 and CD34 immunohistochemistry. Correlation between osteocytes per lacuna and age at death may reflect reported age-related responses to microdamage. Methodological limitations and caveats, and results of the TMA analysis of post mortem diagenesis in bone are discussed, and implications for DNA survival and recovery considered.

  3. Applied Analysis at MGIMO-University

    Directory of Open Access Journals (Sweden)

    A. A. Orlov

    2014-01-01

    Full Text Available Applied analysis of international relations began to form at MGIMO-University in the 1970s. This kind of research always attracted considerable interest from the Ministry of Foreign Affairs of the USSR and other executive institutions of the government, and received their support. The Ministry of Foreign Affairs initiated the creation of a special unit at MGIMO, the Problem Research Laboratory of Systems Analysis in International Relations. The Laboratory used systems analysis and quantitative methods to produce scientific information for decision-makers to make "more informed decisions in the field of international relations in order to reduce the level of uncertainty in the assessment of the expected impact of these decisions". In 2004, the successor to the Problem Laboratory, the Center for International Studies, was transformed into a Research Coordination Council for International Studies, which in 2009 handed its functions to the Institute of International Studies. In comparison with previous periods, the Institute of International Studies has significantly increased its volume of research for the Ministry of Foreign Affairs. It has also moved functionally outside its institutional boundaries, producing unclassified research for public offer and serving as a venue for lively public discussions among IR specialists. The Institute has also gained international recognition: the "Go to think tanks" international ranking produced annually at the University of Pennsylvania has put MGIMO-University in 10th place in the category of university-based think tanks.

  4. Unsupervised Deep Learning Applied to Breast Density Segmentation and Mammographic Risk Scoring.

    Science.gov (United States)

    Kallenberg, Michiel; Petersen, Kersten; Nielsen, Mads; Ng, Andrew Y; Pengfei Diao; Igel, Christian; Vachon, Celine M; Holland, Katharina; Winkel, Rikke Rass; Karssemeijer, Nico; Lillholm, Martin

    2016-05-01

    Mammographic risk scoring has commonly been automated by extracting a set of handcrafted features from mammograms and relating the responses directly or indirectly to breast cancer risk. We present a method that learns a feature hierarchy from unlabeled data. When the learned features are used as the input to a simple classifier, two different tasks can be addressed: i) breast density segmentation, and ii) scoring of mammographic texture. The proposed model learns features at multiple scales. To control the model's capacity, a novel sparsity regularizer is introduced that incorporates both lifetime and population sparsity. We evaluated our method on three different clinical datasets. Our state-of-the-art results show that the learned breast density scores have a very strong positive relationship with manual ones, and that the learned texture scores are predictive of breast cancer. The model is easy to apply and generalizes to many other segmentation and scoring problems.
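
    As a rough illustration of combining lifetime and population sparsity, a penalty can measure how far each feature's mean activation (across samples) and each sample's mean activation (across features) sit from a small target. This exact penalty form and the target value are assumptions for illustration, not the regularizer from the paper:

```python
# Illustrative sparsity penalty combining lifetime sparsity (each
# feature active on few samples) and population sparsity (each sample
# activates few features). The squared-deviation form is an assumption.

def sparsity_penalty(activations, target=0.1):
    """activations: list of samples, each a list of feature values in [0, 1]."""
    n_samples = len(activations)
    n_feats = len(activations[0])
    # Lifetime sparsity: mean activation of each feature across samples.
    lifetime = [sum(s[j] for s in activations) / n_samples
                for j in range(n_feats)]
    # Population sparsity: mean activation within each sample.
    population = [sum(s) / n_feats for s in activations]

    def dev(means):
        return sum((m - target) ** 2 for m in means) / len(means)

    return dev(lifetime) + dev(population)

dense = [[1.0, 1.0, 1.0]] * 4        # every feature always on
sparse = [[1.0, 0.0, 0.0],
          [0.0, 1.0, 0.0],
          [0.0, 0.0, 1.0],
          [0.0, 0.0, 0.0]]           # few activations per sample/feature
print(sparsity_penalty(dense) > sparsity_penalty(sparse))  # True
```

    A training loop would add this penalty, suitably weighted, to the reconstruction loss so that dense activation patterns are discouraged.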

  5. Analysis of the interaction between experimental and applied behavior analysis.

    Science.gov (United States)

    Virues-Ortega, Javier; Hurtado-Parrado, Camilo; Cox, Alison D; Pear, Joseph J

    2014-01-01

    To study the influences between basic and applied research in behavior analysis, we analyzed the coauthorship interactions of authors who published in JABA and JEAB from 1980 to 2010. We paid particular attention to authors who published in both JABA and JEAB (dual authors) as potential agents of cross-field interactions. We present a comprehensive analysis of dual authors' coauthorship interactions using social networks methodology and key word analysis. The number of dual authors more than doubled (26 to 67) and their productivity tripled (7% to 26% of JABA and JEAB articles) between 1980 and 2010. Dual authors stood out in terms of number of collaborators, number of publications, and ability to interact with multiple groups within the field. The steady increase in JEAB and JABA interactions through coauthors and the increasing range of topics covered by dual authors provide a basis for optimism regarding the progressive integration of basic and applied behavior analysis. © Society for the Experimental Analysis of Behavior.

  6. Renormalization techniques applied to the study of density of states in disordered systems

    International Nuclear Information System (INIS)

    Ramirez Ibanez, J.

    1985-01-01

    A general scheme for real space renormalization of formal scattering theory is presented and applied to the calculation of density of states (DOS) in some finite width systems. This technique is extended in a self-consistent way, to the treatment of disordered and partially ordered chains. Numerical results of moments and DOS are presented in comparison with previous calculations. In addition, a self-consistent theory for the magnetic order problem in a Hubbard chain is derived and a parametric transition is observed. Properties of localization of the electronic states in disordered chains are studied through various decimation averaging techniques and using numerical simulations. (author) [pt

  7. Social network analysis applied to team sports analysis

    CERN Document Server

    Clemente, Filipe Manuel; Mendes, Rui Sousa

    2016-01-01

    Explaining how graph theory and social network analysis can be applied to team sports analysis, this book presents useful approaches, models and methods that can be used to characterise the overall properties of team networks and identify the prominence of each team player. Exploring the different network metrics that can be utilised in sports analysis, their possible applications and their variation from situation to situation, the respective chapters present an array of illustrative case studies. Introducing the general concepts of social network analysis and network centrality metrics, the book shows readers how to generate a methodological protocol for data collection. As such, it provides a valuable resource for students of the sport sciences, sports engineering, applied computation and the social sciences.

  8. Telemedicine - a scientometric and density equalizing analysis.

    Science.gov (United States)

    Groneberg, David A; Rahimian, Shaghayegh; Bundschuh, Matthias; Schwarzer, Mario; Gerber, Alexander; Kloft, Beatrix

    2015-01-01

    As a result of the various telemedicine projects of recent years, a large number of studies have been published in this field. However, a precise bibliometric analysis of telemedicine publications does not yet exist. The present study was conducted to establish a database of the existing approaches. Density-equalizing algorithms were used and data was retrieved from the Thomson Reuters database Web of Science. During the period from 1900 to 2006, a total of 3290 filed items were connected to telemedicine, the first being published in 1964. The studies originate from 101 countries, with the USA, Great Britain and Canada being the most productive suppliers, participating in 56.08 % of all published items. Analyzing the average citations per item for countries with more than 10 publications, Ireland ranked first (10.19/item), New Zealand second (9.5/item), followed by Finland (9.04/item); the citation rate can be taken as an indicator of research quality. The ten most productive journals include three whose main focus is telemedicine and another five whose main focus is "Information/Informatics". Of all subject categories examined for published items related to telemedicine, "Health Care Sciences & Services" ranked first by far: more than 36 % of all publications are assigned to this category, followed by "Medical Informatics" with 9.72 % and "Medicine, General & Internal" with 8.84 %. In summary, the data clearly shows a strong increase in research productivity, and science citation analysis suggests a large rise in interest in telemedicine studies.

  9. Resampling method for applying density-dependent habitat selection theory to wildlife surveys.

    Science.gov (United States)

    Tardy, Olivia; Massé, Ariane; Pelletier, Fanie; Fortin, Daniel

    2015-01-01

    Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists in randomly placing blocks over the survey area and dividing those blocks in two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is done 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection over large
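
    The resampling step described above can be sketched as follows: blocks are placed at random over the survey area, each is split into two adjacent equal sub-blocks, and animal abundance is tallied in each. The block size, survey bounds and point data below are illustrative assumptions:

```python
# Block-resampling sketch for isodar construction: random blocks over a
# survey area, each split into two adjacent sub-blocks, abundance
# counted in both. Bounds, block size and points are illustrative.
import random

def resample_abundance(points, area=(0.0, 0.0, 100.0, 100.0),
                       block=20.0, n_blocks=100, rng=None):
    """Return (left_count, right_count) pairs for randomly placed blocks,
    each divided vertically into two adjacent equal sub-blocks."""
    rng = rng or random.Random(0)
    x0, y0, x1, y1 = area
    half = block / 2.0
    pairs = []
    for _ in range(n_blocks):
        bx = rng.uniform(x0, x1 - block)
        by = rng.uniform(y0, y1 - block)
        left = sum(1 for (px, py) in points
                   if bx <= px < bx + half and by <= py < by + block)
        right = sum(1 for (px, py) in points
                    if bx + half <= px < bx + block and by <= py < by + block)
        pairs.append((left, right))
    return pairs

# Uniformly scattered toy "raccoon" sightings.
gen = random.Random(42)
points = [(gen.uniform(0, 100), gen.uniform(0, 100)) for _ in range(500)]
pairs = resample_abundance(points, rng=random.Random(1))
print(len(pairs))  # 100
```

    In the full method, differences in habitat covariates between sub-blocks would then be regressed against these paired abundances to fit candidate isodar forms.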

  10. Colilert® applied to food analysis

    Directory of Open Access Journals (Sweden)

    Maria José Rodrigues

    2014-06-01

    Full Text Available Colilert® (IDEXX) was originally developed for the simultaneous enumeration of coliforms and E. coli in water samples and has been used in routine quality control of drinking, swimming-pool, fresh, coastal and waste waters (Grossi et al., 2013). The Colilert® culture medium contains the indicator nutrient 4-Methylumbelliferyl-β-D-Glucuronide (MUG). MUG acts as a substrate for the E. coli enzyme β-glucuronidase, from which a fluorescent compound is produced. A positive MUG result produces fluorescence when viewed under an ultraviolet lamp. If the test fluorescence is equal to or greater than that of the control, the presence of E. coli is confirmed (Lopez-Roldan et al., 2013). The present work aimed to apply Colilert® to the enumeration of E. coli in different foods, comparing the results against the reference method (ISO 16649-2, 2001) for E. coli food analysis. The study was divided into two stages. During the first stage, ten different types of food were analyzed with Colilert®, including pastry, raw meat, ready-to-eat meals, yogurt, raw seabream and salmon, and cooked shrimp. Of these, the following were approved: pastry with custard; raw minced pork; "caldo-verde" soup; raw vegetable salad (lettuce and carrots); and solid yogurt. The approved foods fitted better into the tray, the colour of the wells was lighter, and the UV reading was easier. In the second stage, the foods were artificially contaminated with 2 log/g of E. coli (ATCC 25922) and analyzed. Colilert® proved to be an accurate method, and the counts were similar to those obtained with the reference method. In the present study, the Colilert® method revealed neither false-positive nor false-negative results, although the results were sometimes difficult to read due to the presence of green fluorescence in some wells. Overall, Colilert® was an easy and rapid method, but less objective and more expensive than the reference method.

  11. Moving Forward: Positive Behavior Support and Applied Behavior Analysis

    Science.gov (United States)

    Tincani, Matt

    2007-01-01

    A controversy has emerged about the relationship between positive behavior support and applied behavior analysis. Some behavior analysts suggest that positive behavior support and applied behavior analysis are the same (e.g., Carr & Sidener, 2002). Others argue that positive behavior support is harmful to applied behavior analysis (e.g., Johnston,…

  12. Functional Analysis in Applied Mathematics and Engineering

    DEFF Research Database (Denmark)

    Pedersen, Michael

    1997-01-01

    Lecture notes for the course 01245 Functional Analysis. Consists of the first part of a monograph with the same title.

  13. Bouguer correction density determination from fractal analysis using ...

    African Journals Online (AJOL)

    In this work, Bouguer density is determined using the fractal approach. This technique was applied to the gravity data of the Kwello area of the Basement Complex, north-western Nigeria. The density obtained using the fractal approach is 2500 kg m⁻³, which is lower than the conventional value of 2670 kg m⁻³ used for average ...

  14. Introduction: Conversation Analysis in Applied Linguistics

    Science.gov (United States)

    Sert, Olcay; Seedhouse, Paul

    2011-01-01

    This short, introductory paper presents an up-to-date account of works within the field of Applied Linguistics which have been influenced by a Conversation Analytic paradigm. The article reviews recent studies in classroom interaction, materials development, proficiency assessment and language teacher education. We believe that the publication of…

  15. Proteomic analysis of high-density lipoprotein

    NARCIS (Netherlands)

    Rezaee, Farhad; Casetta, Bruno; Levels, J. Han M.; Speijer, Dave; Meijers, Joost C. M.

    2006-01-01

    Plasma lipoproteins, such as high-density lipoprotein (HDL), can serve as carriers for a wide range of proteins that are involved in processes such as lipid metabolism, thrombosis, inflammation and atherosclerosis. The identification of HDL-associated proteins is essential with regards to

  16. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  17. Tissue Microarray Analysis Applied to Bone Diagenesis

    OpenAIRE

    Barrios Mello, Rafael; Regis Silva, Maria Regina; Seixas Alves, Maria Teresa; Evison, Martin; Guimarães, Marco Aurélio; Francisco, Rafaella Arrabaça; Dias Astolphi, Rafael; Miazato Iwamura, Edna Sadayo

    2017-01-01

    Taphonomic processes affecting bone post mortem are important in forensic, archaeological and palaeontological investigations. In this study, the application of tissue microarray (TMA) analysis to a sample of femoral bone specimens from 20 exhumed individuals of known period of burial and age at death is described. TMA allows multiplexing of subsamples, permitting standardized comparative analysis of adjacent sections in 3-D and of representative cross-sections of a large number of specimens....

  18. Operation feedback analysis applied to preventive maintenance

    International Nuclear Information System (INIS)

    Bouchet, J.L.; Havart, J.; Jacquot, J.P.; Lannoy, A.; Vasseur, D.

    1992-03-01

    The paper presents the contribution of operation feedback databases and their analysis to the optimization of a preventive maintenance program. The RCM approach (Reliability Centered Maintenance) is based on a functional breakdown of systems and components. There are four major stages: analysis of equipment criticality, critical failure analysis for each critical component, selection of maintenance tasks, and operation feedback analysis. In-depth knowledge of the failure and degradation mechanisms is only available through operation feedback, its analysis, and expert judgement. The validation of the information collected in the various feedback files, and its statistical processing coupled with reliability calculations, enables: (i) the definition of the causes of failure and degradation noted, and (ii) the estimation of the associated failure rate and the calculation of how this failure rate evolves as the component ages. The paper presents the approach used and the results obtained for the pilot study of the chemical and volume control system (CVCS) of 900 MW PWR nuclear power plants. About 2 500 sheets, covering many equipment types (pumps, sensors, valves, ...), were reviewed by experts, and the statistical processing provided basic reliability parameters for RCM (rates, modes, causes, subcomponents, ...). (authors). 5 tabs., 6 figs., 7 refs
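
    As a toy version of the failure-rate estimation that such feedback analysis supports, the rate in each age bin can be estimated as the number of observed failures divided by the component-years of exposure accumulated in that bin. The maintenance records, bin width and piecewise-constant estimator below are invented for illustration:

```python
# Piecewise-constant failure rate vs. component age, estimated from
# failure ages and fleet exposure. All records here are illustrative.

def rate_by_age(failure_ages, fleet_ages, bin_w=5):
    """Failure rate per age bin (failures per component-year), from the
    ages at which failures occurred and the current age of every
    component in the fleet."""
    n_bins = int(max(fleet_ages) // bin_w) + 1
    exposure = [0.0] * n_bins      # component-years accrued in each bin
    counts = [0] * n_bins
    for age in fleet_ages:
        for b in range(n_bins):
            lo, hi = b * bin_w, (b + 1) * bin_w
            exposure[b] += max(0.0, min(age, hi) - lo)
    for a in failure_ages:
        counts[int(a // bin_w)] += 1
    return [c / e if e else 0.0 for c, e in zip(counts, exposure)]

# 10 pumps, all 12 years old; failures concentrated late in life.
rates = rate_by_age([2.0, 8.0, 9.0, 11.0], [12.0] * 10)
print([round(r, 3) for r in rates])  # [0.02, 0.04, 0.05]
```

    A rising rate with age, as in this toy output, is the kind of evidence that would justify age-based preventive maintenance for the component.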

  19. Applied quantitative analysis in the social sciences

    CERN Document Server

    Petscher, Yaacov; Compton, Donald L

    2013-01-01

    To say that complex data analyses are ubiquitous in the education and social sciences might be an understatement. Funding agencies and peer-review journals alike require that researchers use the most appropriate models and methods for explaining phenomena. Univariate and multivariate data structures often require the application of more rigorous methods than basic correlational or analysis of variance models. Additionally, though a vast set of resources may exist on how to run analysis, difficulties may be encountered when explicit direction is not provided as to how one should run a model

  20. Applied Spectrophotometry: Analysis of a Biochemical Mixture

    Science.gov (United States)

    Trumbo, Toni A.; Schultz, Emeric; Borland, Michael G.; Pugh, Michael Eugene

    2013-01-01

    Spectrophotometric analysis is essential for determining the biomolecule concentration of a solution and is employed ubiquitously in biochemistry and molecular biology. The application of the Beer-Lambert-Bouguer Law is routinely used to determine the concentration of DNA, RNA or protein. There is however a significant difference in determining the…
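
    For a two-component mixture, the Beer-Lambert-Bouguer Law at two wavelengths gives a small linear system, A_i = l * sum_j eps_ij * c_j, that can be solved for the two concentrations. The molar absorptivities and absorbances below are made-up illustrative values, not data from the article:

```python
# Beer-Lambert-Bouguer analysis of a two-component mixture: absorbance
# at each wavelength is the sum of epsilon * path length * concentration
# over the species, giving a 2x2 linear system. Values are illustrative.

def mixture_concentrations(A1, A2, eps, path_cm=1.0):
    """Solve A_i = l * sum_j eps[i][j] * c_j for the concentrations c."""
    (a, b), (c_, d) = eps
    det = (a * d - b * c_) * path_cm
    c1 = (A1 * d - A2 * b) / det
    c2 = (A2 * a - A1 * c_) / det
    return c1, c2

# eps[i][j]: absorptivity of species j at wavelength i (L mol^-1 cm^-1).
eps = [[15000.0, 2000.0],
       [3000.0, 11000.0]]
c1, c2 = mixture_concentrations(A1=0.19, A2=0.25, eps=eps)
print(round(c1 * 1e6, 1), round(c2 * 1e6, 1))  # 10.0 20.0 (µmol/L)
```

    With more components or wavelengths the same idea becomes an overdetermined least-squares problem rather than an exact solve.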

  1. Density functional and neural network analysis

    DEFF Research Database (Denmark)

    Jalkanen, K. J.; Suhai, S.; Bohr, Henrik

    1997-01-01

    Density functional theory (DFT) calculations have been carried out for hydrated L-alanine, L-alanyl-L-alanine and N-acetyl L-alanine N'-methylamide and examined with respect to the effect of water on the structure, the vibrational frequencies, vibrational absorption (VA) and vibrational circular...... dichroism (VCD) intensities. The large changes due to hydration on the structures, relative stability of conformers, and in the VA and VCD spectra observed experimentally are reproduced by the DFT calculations. Furthermore a neural network was constructed for reproducing the inverse scattering data (infer...

  2. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    and dhurrin, which have not previously been characterized in blueberries. There are more than 44,500 spider species with distinct habitats and unique characteristics. Spiders are masters of producing silk webs to catch prey and using venom to neutralize. The exploration of the genetics behind these properties...... japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts; genome annotation, small RNA, and gene expression analysis. Lotus is a legume of significant...... has just started. We have assembled and annotated the first two spider genomes to facilitate our understanding of spiders at the molecular level. The need for analyzing the large and increasing amount of sequencing data has increased the demand for efficient, user friendly, and broadly applicable...

  3. Applied bioinformatics: Genome annotation and transcriptome analysis

    DEFF Research Database (Denmark)

    Gupta, Vikas

    japonicus (Lotus), Vaccinium corymbosum (blueberry), Stegodyphus mimosarum (spider) and Trifolium occidentale (clover). From a bioinformatics data analysis perspective, my work can be divided into three parts; genome annotation, small RNA, and gene expression analysis. Lotus is a legume of significant...... biology and genetics studies. We present an improved Lotus genome assembly and annotation, a catalog of natural variation based on re-sequencing of 29 accessions, and describe the involvement of small RNAs in the plant-bacteria symbiosis. Blueberries contain anthocyanins, other pigments and various...... polyphenolic compounds, which have been linked to protection against diabetes, cardiovascular disease and age-related cognitive decline. We present the first genome- guided approach in blueberry to identify genes involved in the synthesis of health-protective compounds. Using RNA-Seq data from five stages...

  4. Principal component analysis applied to remote sensing

    Directory of Open Access Journals (Sweden)

    Javier Estornell

    2013-06-01

    Full Text Available The main objective of this article is to show an application of principal component analysis (PCA), which is used in two science degrees. In particular, PCA was used to obtain land-cover information from satellite images. Three Landsat images were selected from two areas located in the municipalities of Gandia and Vallat, both in the Valencia province (Spain). In the first study area, a single Landsat image from 2005 was used. In the second study area, two Landsat images, taken in 1994 and 2000, were used to analyse the most significant changes in land cover. According to the results, the second principal component of the Gandia area image allowed the presence of vegetation to be detected. The same component in the Vallat area allowed detection of a forested area affected by a forest fire. This study thus confirmed the feasibility of using PCA in remote sensing to extract land-use information.
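
    For two image bands, the PCA used in such land-cover work amounts to centring the band values, forming the 2x2 covariance matrix, and projecting pixels onto its eigenvectors; with strongly correlated bands, PC1 captures nearly all the variance and PC2 carries the residual contrast. The band values below are illustrative, and real Landsat processing would use all bands with a library such as NumPy:

```python
# PCA of two image bands from first principles: centre, build the 2x2
# covariance matrix, project onto its eigenvectors. Values illustrative.
import math

def pca_2band(band1, band2):
    """Return PC1 scores, PC2 scores, and PC1's share of total variance."""
    n = len(band1)
    m1, m2 = sum(band1) / n, sum(band2) / n
    x = [v - m1 for v in band1]
    y = [v - m2 for v in band2]
    sxx = sum(a * a for a in x) / n
    syy = sum(b * b for b in y) / n
    sxy = sum(a * b for a, b in zip(x, y)) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] and its eigenvector.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
    theta = math.atan2(lam1 - sxx, sxy)   # angle of eigenvector (sxy, lam1-sxx)
    c, s = math.cos(theta), math.sin(theta)
    pc1 = [c * a + s * b for a, b in zip(x, y)]
    pc2 = [-s * a + c * b for a, b in zip(x, y)]
    return pc1, pc2, lam1 / tr

band1 = [10, 12, 50, 52, 30, 32]      # toy pixel values, two bands
band2 = [11, 13, 49, 53, 29, 33]
pc1, pc2, share = pca_2band(band1, band2)
print(round(share, 3))  # close to 1 for strongly correlated bands
```

    Real analyses would simply call a library routine (e.g. NumPy's eigendecomposition) over all bands; the point here is only the mechanics of the transform.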

  5. Thermal analysis applied to irradiated propolis

    Energy Technology Data Exchange (ETDEWEB)

    Matsuda, Andrea Harumi; Machado, Luci Brocardo; Mastro, N.L. del E-mail: nelida@usp.br

    2002-03-01

    Propolis is a resinous hive product collected by bees. Raw propolis requires a decontamination procedure, and irradiation appears to be a promising technique for this purpose. The valuable properties of propolis for the food and pharmaceutical industries have led to increasing interest in its technological behaviour. Thermal analysis is a chemical analysis that provides information about changes on heating, which is of great importance for technological applications. Ground propolis samples were {sup 60}Co gamma irradiated with 0 and 10 kGy. Thermogravimetry curves showed a similar multi-stage decomposition pattern for both irradiated and unirradiated samples up to 600 deg. C. Similarly, differential scanning calorimetry showed coinciding melting points for irradiated and unirradiated samples. The results suggest that irradiation up to 10 kGy does not affect the thermal properties of propolis.

  6. Microfluidic Electronic Tongue Applied to Soil Analysis

    Directory of Open Access Journals (Sweden)

    Maria L. Braunger

    2017-04-01

    Precision agriculture is crucial for increasing food output without expanding the cultivable area, which requires sensors to be deployed for controlling the level of nutrients in the soil. In this paper, we report on a microfluidic electronic tongue (e-tongue), based on impedance measurements, which is capable of distinguishing soil samples enriched with plant macronutrients. The e-tongue setup consisted of an array of sensing units made with layer-by-layer films deposited onto gold interdigitated electrodes. Significantly, the sensing units could be reused with adequate reproducibility after a simple washing procedure, indicating that there was no cross-contamination across three independent sets of measurements. High performance was achieved by treating the capacitance data with the multidimensional projection techniques Principal Component Analysis (PCA), Interactive Document Map (IDMAP), and Sammon's Mapping. While optimized performance was demonstrated with IDMAP and feature selection, using data from a limited frequency range, all soil samples could also be distinguished with the well-established PCA for measurements at a single frequency. The successful use of a simple microfluidic e-tongue for soil analysis paves the way for enhanced tools to support precision agriculture.

  7. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  8. In situ texture analysis under applied load

    International Nuclear Information System (INIS)

    Brokmeier, H.G.

    2005-01-01

    The in situ measurement of crystallographic texture is a special type of non-destructive measurement that requires dedicated equipment. Owing to its high photon flux and excellent brilliance, high-energy synchrotron radiation is an outstanding tool, particularly for fast experiments. Moreover, its high penetration power allows the investigation of standard DIN tensile samples. A loading device with a capacity of up to 20 kN was installed at the hard-wiggler beamline BW5 (HASYLAB-DESY) to perform in situ strain and in situ texture analysis. Using 100 keV X-rays, the wavelength is short enough that a 2D image-plate detector captures a wide range of the diffraction pattern within the first 10 degrees in 2-theta. Thermal neutrons are another type of radiation with high penetration power, and they are the standard method for global texture analysis of bulk samples. As an example, rectangular extruded Mg-AZ31 was investigated in an in situ tensile experiment. Samples were cut at 0 degrees, 45 degrees and 90 degrees to the extrusion direction. In situ strain studies show the lattice-dependent strains perpendicular and parallel to the loading direction. Moreover, in hexagonal Mg-AZ31 a strong influence of the initial texture on the tensile behaviour can be explained by combining texture simulation with in situ measurements. (author)

  9. Multivariate analysis applied to tomato hybrid production.

    Science.gov (United States)

    Balasch, S; Nuez, F; Palomares, G; Cuartero, J

    1984-11-01

    Twenty characters were measured on 60 tomato varieties cultivated in the open air and in a polyethylene plastic house. Data were analyzed by means of principal components, factorial discriminant methods, Mahalanobis D(2) distances and principal coordinate techniques. The factorial discriminant and Mahalanobis D(2) distance methods, both of which require collecting data plant by plant, lead to conclusions similar to the principal components method, which only requires taking data by plots. The characters that make up the principal components in both environments studied are the same, although the relative importance of each one varies within the principal components. By combining the information supplied by multivariate analysis with the mode of inheritance of the characters, crosses among cultivars can be designed that will produce heterotic hybrids with characters within previously established limits.
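A Mahalanobis D(2) distance between two varietal groups, as used in the study, can be sketched as follows; the group sizes, character counts and data below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Hypothetical data: 20 characters measured on two groups of tomato plants.
rng = np.random.default_rng(1)
group_a = rng.normal(0.0, 1.0, size=(30, 20))
group_b = rng.normal(0.5, 1.0, size=(30, 20))

# Pooled within-group covariance, then Mahalanobis D^2 between group means
pooled = (np.cov(group_a, rowvar=False) + np.cov(group_b, rowvar=False)) / 2
diff = group_a.mean(axis=0) - group_b.mean(axis=0)
d_squared = float(diff @ np.linalg.inv(pooled) @ diff)
print(round(d_squared, 2))
```

Unlike per-plot principal components, this statistic needs plant-by-plant data to estimate the within-group covariance, which is the trade-off the abstract notes.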

  10. Energy analysis applied to uranium resource estimation

    International Nuclear Information System (INIS)

    Mortimer, N.D.

    1980-01-01

    It is pointed out that fuel prices and ore costs are interdependent, and that in estimating ore costs (involving the cost of fuels used to mine and process the uranium) it is necessary to take into account the total use of energy by the entire fuel system, through the technique of energy analysis. The subject is discussed, and illustrated with diagrams, under the following heads: estimate of how total workable resources would depend on production costs; sensitivity of nuclear electricity prices to ore costs; variation of net energy requirement with ore grade for a typical PWR reactor design; variation of average fundamental cost of nuclear electricity with ore grade; variation of cumulative uranium resources with current maximum ore costs. (U.K.)
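The interdependence of ore grade and energy cost that the abstract describes can be illustrated with a toy model; the fixed per-tonne mining energy and every number below are illustrative assumptions, not figures from the paper:

```python
def net_energy_ratio(ore_grade_pct, mining_energy_per_tonne=100.0,
                     energy_yield_per_kg_u=500.0):
    """Toy energy analysis: energy delivered per unit of energy spent
    mining and milling.  Assumes (illustratively) that the energy to
    process one tonne of ore is fixed, so the energy cost per kg of
    uranium scales inversely with grade.  All numbers are hypothetical.
    """
    kg_u_per_tonne = ore_grade_pct / 100.0 * 1000.0
    energy_in_per_kg = mining_energy_per_tonne / kg_u_per_tonne
    return energy_yield_per_kg_u / energy_in_per_kg

# Leaner ores return less energy per unit of energy invested
print(net_energy_ratio(0.2), net_energy_ratio(0.02))
```

The inverse-grade scaling is the reason net energy requirements rise sharply as workable ore grades fall, which in turn feeds back into nuclear electricity prices.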

  11. Toward applied behavior analysis of life aloft

    Science.gov (United States)

    Brady, J. V.

    1990-01-01

    This article deals with systems at multiple levels, at least from cell to organization. It also deals with learning, decision making, and other behavior at multiple levels. Technological development of a human behavioral ecosystem appropriate to space environments requires an analytic and synthetic orientation, explicitly experimental in nature, dictated by scientific and pragmatic considerations, and closely approximating procedures of established effectiveness in other areas of natural science. The conceptual basis of such an approach has its roots in environmentalism, which has two main features: (1) knowledge comes from experience rather than from innate ideas, divine revelation, or other obscure sources; and (2) action is governed by consequences rather than by instinct, reason, will, beliefs, attitudes or even the currently fashionable cognitions. Without an experimentally derived data base founded upon such a functional analysis of human behavior, the overgenerality of "ecological systems" approaches renders them incapable of ensuring the successful establishment of enduring space habitats. Without an experimentally derived functional account of individual behavioral variability, a natural science of behavior cannot exist. And without a natural science of behavior, the social sciences will necessarily remain in their current status as disciplines of less than optimal precision or utility. Such a functional analysis of human performance should provide an operational account of behavior change in a manner similar to the way in which Darwin's approach to natural selection accounted for the evolution of phylogenetic lines (i.e., in descriptive, nonteleological terms). Similarly, as Darwin's account has subsequently been shown to be consonant with information obtained at the cellular level, so too should behavior principles ultimately prove to be in accord with an account of ontogenetic adaptation at a biochemical level. It would thus seem obvious that the most

  12. Exercise and Bone Density: Meta-Analysis

    National Research Council Canada - National Science Library

    Kelley, George A; Sharpe-Kelley, Kristi

    2007-01-01

    .... Since no meta-analysis had existed using individual patient data (IPD) to examine the effects of exercise on BMD, our second two-year period of funding was devoted to examining the feasibility...

  13. Digital photoelastic analysis applied to implant dentistry

    Science.gov (United States)

    Ramesh, K.; Hariprasad, M. P.; Bhuvanewari, S.

    2016-12-01

    Development of improved designs of implant systems in dentistry has necessitated the study of stress fields in the implant regions of the mandible/maxilla for a better understanding of the biomechanics involved. Photoelasticity has been used for various studies related to dental implants in view of its whole-field visualization of maximum shear stress in the form of isochromatic contours. The potential of digital photoelasticity has not been fully exploited in the field of implant dentistry. In this paper, the fringe field in the vicinity of connected implants (All-On-Four® concept) is analyzed using recent advances in digital photoelasticity. Initially, a novel 3-D photoelastic model-making procedure is proposed that closely mimics all the anatomical features of the human mandible. By choosing an appropriate orientation of the model with respect to the light path, the essential regions of interest could be analysed while keeping the model under live loading conditions. The need for a sophisticated software module to carefully identify the model domain is brought out. For data extraction, a five-step method is used and isochromatics are evaluated by twelve-fringe photoelasticity. In addition to the isochromatic fringe field, whole-field isoclinic data is also obtained for the first time in implant dentistry, which could provide important information for improving the structural stability of implant systems. Analysis is carried out for implants in the molar as well as the incisor region. In addition, the interaction effects of a loaded molar implant on the incisor area are also studied.

  14. Contact resistance problems applying ERT on low bulk density forested stony soils. Is there a solution?

    Science.gov (United States)

    Deraedt, Deborah; Touzé, Camille; Robert, Tanguy; Colinet, Gilles; Degré, Aurore; Garré, Sarah

    2015-04-01

    Electrical resistivity tomography (ERT) has often been put forward as a promising tool to quantify soil water and solute fluxes in a non-invasive way. In our experiment, we wanted to determine preferential flow processes along a forested hillslope using a saline tracer with ERT. The experiment was conducted in the Houille watershed, a subcatchment of the Meuse located in the north of the Belgian Ardennes (50°1'52.6"N, 4°53'22.5"E). The climate is continental, but the soil under the spruce (Picea abies (L.) Karst.) and Douglas fir (Pseudotsuga menziesii (Mirb.) Franco) stand remains quite dry (19% WVC on average) throughout the year. The soil is a Cambisol and the parent rock is Devonian schist covered with a variable thickness of silty loam soil. The soil density ranges from 1.13 to 1.87 g/cm3 on average. The stone content varies from 20 to 89% and the soil depth fluctuates between 70 and 130 cm. The ERT tests took place on June 1st 2012, April 1st, 2nd and 3rd 2014 and May 12th 2014. We used the 12-channel Terrameter LS (ABEM, Sweden) in the 2012 test and the DAS-1 (Multi-Phase Technologies, United States) in 2014. Different electrode configurations and arrays were adopted on different dates (transect and grid arrays; Wenner-Schlumberger, Wenner alpha and dipole-dipole configurations). During all tests, we systematically faced technical problems, mainly related to poor electrode contact. The recorded data show contact resistance values above 14873 Ω (our target value would be below 3000 Ω). Subsequently, we tried to improve the contact by predrilling the soil and pouring water into the electrode holes. The contact resistance improved to a minimum of 14040 Ω. The same procedure with liquid mud was then tested to prevent quick percolation of the water away from the electrode location. As a result, the lowest contact resistance dropped to 11745 Ω. Finally, we applied about 25 litres of saline solution (CaCl2, 0.75 g/L) homogeneously over the electrode grid. The minimum value of

  15. A national and international analysis of changing forest density.

    Directory of Open Access Journals (Sweden)

    Aapo Rautiainen

    Like cities, forests grow by spreading out or by growing denser. Both inventories taken steadily by a single nation and other inventories gathered recently from many nations by the United Nations confirm the asynchronous effects of changing area and of density, or volume per hectare. United States forests spread little after 1953, while growing density per hectare increased national volume and thus sequestered carbon. The 2010 United Nations appraisal of global forests during the briefer span of two decades after 1990 reveals a similar pattern: a slowing decline of area with growing volume means growing density in 68 nations encompassing 72% of reported global forest land and 68% of reported global carbon mass. To summarize, the nations were placed in five regions named for continents. During 1990-2010 national density grew unevenly, but nevertheless grew in all regions. Growing density was responsible for substantially increasing sequestered carbon in the European and North American regions, despite smaller changes in area. Density nudged upward in the African and South American regions as area loss outstripped the loss of carbon. For the Asian region, density grew in the first decade and fell slightly in the second as forest area expanded. The different courses of area and density disqualify area as a proxy for volume and carbon. Applying forestry methods traditionally used to measure timber volumes still offers a necessary route to measuring carbon stocks. With little expansion of forest area, managing for timber growth and density offers a way to increase carbon stocks.
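The area-versus-density decomposition above reduces to simple arithmetic: carbon stock is area times volume density times a carbon factor. The factor and all quantities below are hypothetical, chosen only to show how density growth can offset area loss:

```python
def carbon_stock(area_ha, density_m3_per_ha, carbon_t_per_m3=0.3):
    """Carbon stock as area x volume density x a carbon conversion factor.
    The 0.3 t/m^3 factor is an illustrative assumption."""
    return area_ha * density_m3_per_ha * carbon_t_per_m3

before = carbon_stock(100.0, 120.0)   # larger area, lower density
after = carbon_stock(95.0, 135.0)     # 5% less area, but a denser forest
print(before, after)
```

Because the two factors can move in opposite directions, area alone cannot serve as a proxy for carbon, which is the paper's central point.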

  16. Analysis of surface degradation of high density polyethylene (HDPE ...

    Indian Academy of Sciences (India)

    Unknown

    Analysis of surface degradation of high density polyethylene (HDPE) insulation ... ammonium chloride as the contaminant, in high density polyethylene ... liquid in the material. When diffusion is driven by the concentration gradient, and if there is no chemical change between the liquid and the material, this would result in mass...

  17. Exercise and Bone Density: Meta-Analysis

    Science.gov (United States)

    2007-01-01

    ...three each in Australia and the United Kingdom, two each in Finland and Japan, and one in... ...were used to calculate four dependent variables related to SSC performance: peak vertical velocity (V), smallest knee angle (θK), and durations of the eccentric (tE) and concentric (tC) phases. The mean of three jumps was used for statistical analysis using a multivariate analysis of variance (MANOVA).

  18. IAEA advisory group meeting on basic and applied problems of nuclear level densities

    International Nuclear Information System (INIS)

    Bhat, M.R.

    1983-06-01

    Separate entries were made in the data base for 17 of the 19 papers included. Two papers were previously included in the data base. Workshop reports are included on (1) nuclear level density theories and nuclear model reaction cross-section calculations and (2) extraction of nuclear level density information from experimental data

  19. Applied research of environmental monitoring using instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Young Sam; Moon, Jong Hwa; Chung, Young Ju

    1997-08-01

    This technical report is written as a guide book for applied research on environmental monitoring using instrumental neutron activation analysis. The contents are as follows: sampling and sample preparation for airborne particulate matter, analytical methodologies, data evaluation and interpretation, and basic statistical methods of data analysis applied in environmental pollution studies. (author). 23 refs., 7 tabs., 9 figs.

  20. Hands on applied finite element analysis application with ANSYS

    CERN Document Server

    Arslan, Mehmet Ali

    2015-01-01

    Hands on Applied Finite Element Analysis Application with Ansys is truly an extraordinary book that offers practical ways of tackling FEA problems in machine design and analysis. In this book, a good selection of 35 example problems is presented, offering students the opportunity to apply their knowledge to real engineering FEA problem solutions by guiding them with real-life hands-on experience.

  1. The Negele-Vautherin Density Matrix Expansion Applied to the Gogny Force

    Energy Technology Data Exchange (ETDEWEB)

    Dobaczewski, J. [Warsaw University; Carlsson, B. G. [University of Jyvaskyla; Kortelainen, Erno M [ORNL

    2010-01-01

    We use the Negele-Vautherin density matrix expansion to derive a local density approximation for an interaction composed of arbitrary finite-range central, spin-orbit, and tensor components. Terms that are absent in the original Negele-Vautherin approach, owing to the angle averaging of the density matrix, are fixed by requiring gauge invariance of the energy density. We obtain the Kohn-Sham interaction energies in all spin-isospin channels, including the exchange terms, expressed as functions of the local densities and their derivatives up to second (next-to-leading) order. We illustrate the method by determining the coupling constants of the Skyrme functional or Skyrme force that correspond to the finite-range Gogny central force. The resulting self-consistent solutions reproduce the Gogny-force binding energies and radii to within 1-2%.

  2. Density Functional Theory Applied to Pt and Pt Alloy Clusters and Adsorbate

    National Research Council Canada - National Science Library

    Smotkin, Eugine

    2001-01-01

    ...). Using this advanced and powerful multiprocessor SGI server, we were able to dramatically speed up and further advance our computational density functional calculations on CO adsorbed on Pt and alloy clusters...

  3. An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.

    Directory of Open Access Journals (Sweden)

    Darren Kidney

    Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements, since it only requires routine survey data.
We anticipate that the low-tech field requirements will
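A core ingredient of SECR methods is the detection function; the half-normal form sketched below is a standard choice in the SECR literature, and the parameter values and listener distances are purely illustrative:

```python
import math

def detection_prob(d, g0=0.9, sigma=500.0):
    """Half-normal detection function, standard in SECR: probability that
    a call at distance d (metres) is detected by one listening post.
    g0 and sigma here are illustrative, not estimates from the study."""
    return g0 * math.exp(-d**2 / (2 * sigma**2))

# Probability a calling group is heard by at least one of three posts,
# assuming independent detection across posts (hypothetical distances).
distances = [300.0, 800.0, 1200.0]
p_miss_all = 1.0
for d in distances:
    p_miss_all *= 1.0 - detection_prob(d)
p_any = 1.0 - p_miss_all
print(round(p_any, 3))
```

In the full likelihood, g0 and sigma are estimated jointly with density from the pattern of which posts detected which calls, and the multi-occasion extension adds an availability parameter on top.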

  4. Sensitivity analysis approaches applied to systems biology models.

    Science.gov (United States)

    Zi, Z

    2011-11-01

    With the rising application of systems biology, sensitivity analysis methods have been widely applied to study biological systems, including metabolic networks, signalling pathways and genetic circuits. Sensitivity analysis can provide valuable insights about how robust the biological responses are with respect to changes in biological parameters, and about which model inputs are the key factors affecting the model outputs. In addition, sensitivity analysis is valuable for guiding experimental analysis, model reduction and parameter estimation. Local and global sensitivity analysis approaches are the two types of sensitivity analysis commonly applied in systems biology. Local sensitivity analysis is a classic method that studies the impact of small perturbations on the model outputs. On the other hand, global sensitivity analysis approaches have been applied to understand how the model outputs are affected by large variations of the model input parameters. In this review, the author introduces the basic concepts of sensitivity analysis approaches applied to systems biology models. Moreover, the author discusses the advantages and disadvantages of different sensitivity analysis methods, how to choose a proper sensitivity analysis approach, the available sensitivity analysis tools for systems biology models and the caveats in the interpretation of sensitivity analysis results.
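A minimal sketch of the local (small-perturbation) approach described above, using an invented two-parameter model rather than any specific biological system:

```python
def model(k1, k2):
    """Toy steady-state output of a two-parameter system (hypothetical)."""
    return k1 / (k1 + k2)

def local_sensitivity(f, params, name, rel_step=1e-6):
    """Normalised local sensitivity d(ln f)/d(ln p) via a one-sided
    finite difference: perturb one parameter slightly, hold the rest."""
    base = f(**params)
    bumped = dict(params, **{name: params[name] * (1 + rel_step)})
    return (f(**bumped) - base) / (base * rel_step)

params = {"k1": 2.0, "k2": 1.0}
for name in params:
    print(name, round(local_sensitivity(model, params, name), 3))
```

Global methods instead sample the parameters over wide ranges (e.g. Sobol or Morris designs) and apportion output variance among them, which is why the two approaches answer different questions.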

  5. Physiological responses to acid stress by Saccharomyces cerevisiae when applying high initial cell density

    Science.gov (United States)

    2016-01-01

    High initial cell density is used to increase volumetric productivity and shorten production time in lignocellulosic hydrolysate fermentation. Comparison of physiological parameters in high initial cell density cultivations of Saccharomyces cerevisiae in the presence of acetic, formic, levulinic and cinnamic acids demonstrated both general and acid-specific responses of the cells. All the acids studied impaired growth and inhibited glycolytic flux, and caused oxidative stress and accumulation of trehalose. However, trehalose may play a role other than protecting yeast cells from acid-induced oxidative stress. Unlike the other acids, cinnamic acid did not cause depletion of cellular ATP, but abolished growth of the yeast on ethanol. Compared with low initial cell density, increasing the initial cell density reduced the lag phase and improved the bioconversion yield of cinnamic acid during acid adaptation. In addition, yeast cells were able to grow at elevated concentrations of acid, probably due to the increase in phenotypic cell-to-cell heterogeneity at large inoculum sizes. Furthermore, the specific growth rate and the specific rates of glucose consumption and metabolite production were significantly lower than at low initial cell density, as a result of the accumulation of a large fraction of cells that persisted in a viable but non-proliferating state. PMID:27620460

  6. Animal Research in the "Journal of Applied Behavior Analysis"

    Science.gov (United States)

    Edwards, Timothy L.; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the "Journal of Applied Behavior Analysis" and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance…

  7. Optical excitation and electron relaxation dynamics at semiconductor surfaces: a combined approach of density functional and density matrix theory applied to the silicon (001) surface

    Energy Technology Data Exchange (ETDEWEB)

    Buecking, N.

    2007-11-05

    In this work a new theoretical formalism is introduced in order to numerically simulate the phonon-induced relaxation of a non-equilibrium distribution to equilibrium at a semiconductor surface. The non-equilibrium distribution is created by an optical excitation. The approach in this thesis is to link two conventional, well-established methods into a new, more global description: while semiconductor surfaces can be investigated accurately by density functional theory, the dynamical processes in semiconductor heterostructures are successfully described by density matrix theory. In this work, the parameters for density matrix theory are determined from the results of density functional calculations. This work is organized in two parts. In Part I, the general fundamentals of the theory are elaborated, covering the fundamentals of canonical quantization as well as density functional theory and density matrix theory in second-order Born approximation. While the formalism of density functional theory for structure investigation has been established for a long time and many different codes exist, the requirements of the density matrix formalism concerning the geometry and the number of implemented bands exceed the usual capabilities of the existing codes in this field. Special attention is therefore given to the development of extensions to existing formulations of this theory, in which geometrical and fundamental symmetries of the structure and the equations are exploited. In Part II, the newly developed formalism is applied to a silicon (001) surface in a 2 x 1 reconstruction. As a first step, density functional calculations using the LDA functional are completed, from which the Kohn-Sham wave functions and eigenvalues are used to calculate interaction matrix elements for the electron-phonon coupling and the optical excitation. These matrix elements are determined for the optical transitions from valence to conduction bands and for electron-phonon processes inside the

  8. Dimensional Analysis with space discrimination applied to Fickian diffusion phenomena

    International Nuclear Information System (INIS)

    Diaz Sanchidrian, C.; Castans, M.

    1989-01-01

    Dimensional analysis with space discrimination is applied to Fickian diffusion phenomena in order to transform its partial differential equations into ordinary ones, and also to obtain Fick's second law in dimensionless form. (Author)
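For reference, the reduction the abstract alludes to is the classical similarity transformation of Fick's second law: the dimensionless variable formed from x, D and t converts the PDE into an ODE (a standard result, stated here for context):

```latex
\frac{\partial c}{\partial t} = D\,\frac{\partial^{2} c}{\partial x^{2}},
\qquad c(x,t) = f(\eta),\quad \eta = \frac{x}{\sqrt{Dt}}
\;\Longrightarrow\;
f''(\eta) + \frac{\eta}{2}\,f'(\eta) = 0 .
```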

  9. Local density of states of a disordered superconductor applied to cuprates

    Energy Technology Data Exchange (ETDEWEB)

    Kasal, R.B., E-mail: kasal@if.uff.b [Instituto de Física, Universidade Federal Fluminense, Niteroi, RJ 24210-340 (Brazil); Mello, E.V.L. de [Instituto de Física, Universidade Federal Fluminense, Niteroi, RJ 24210-340 (Brazil)

    2010-12-15

    We describe the physics of cuprate superconductors by an electronic phase separation (EPS) transition that segregates the holes into high and low density domains. This approach explains the pseudogap and superconducting phases and it also reproduces some recent scanning tunneling microscopy (STM) data.

  10. The Keldysh formalism applied to time-dependent current-density-functional theory

    NARCIS (Netherlands)

    Gidopoulos, NI; Wilson, S

    2003-01-01

    In this work we demonstrate how to derive the Kohn-Sham equations of time-dependent current-density functional theory from a generating action functional defined on a Keldysh time contour. These Kohn-Sham equations contain an exchange-correlation contribution to the vector potential. For this

  11. Earthworm Population Density in Sugarcane Cropping System Applied with Various Quality of Organic Matter

    Directory of Open Access Journals (Sweden)

    Nurhidayati Nurhidayati

    2012-12-01

    Earthworm populations in the soil are greatly affected by agricultural management, yet little is known about how the quality and quantity of added organic matter interact to influence earthworm populations in sugarcane cropping systems. This study describes the effect of various organic matter types and application rates on earthworms in a sugarcane cropping system. Earthworms were collected in April, July and December from 48 experimental plots under five kinds of organic matter application: (1) cattle manure, (2) filter cake from a sugar mill, (3) sugarcane trash, (4) a mixture of cattle manure and filter cake, and (5) a mixture of cattle manure and sugarcane trash. There were three application rates of organic matter (5, 10, and 15 ton ha-1). The treatments were arranged in a randomized factorial block design with three replications, plus one control treatment (no organic input). Earthworms were collected by monolith sampling and hand-sorting from each plot, and their density (D, indiv. m-2), biomass (B, g m-2) and B/D ratio (g/indiv.) were measured. All plots receiving organic matter input had higher earthworm density, biomass, and B/D ratio than the control. The highest earthworm population density was found in the plots receiving sugarcane trash (78 indiv. m-2) and the mixture of cattle manure and sugarcane trash (84 indiv. m-2). Increasing the application rate of organic matter increased earthworm density and biomass. Earthworm population density also appeared to be strongly influenced by the quality of the organic matter, such as its organic C, N, C/N ratio, lignin, polyphenol, and cellulose content. Earthworms preferred low-quality organic matter, presumably because it supplies more energy than high-quality organic matter. Our findings suggest that the input of low-quality organic matter at a rate of 10 ton ha-1 is important for maintaining earthworm populations and soil health in sugarcane land.
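The density, biomass and B/D metrics used above reduce to simple arithmetic on each monolith sample; the 25 x 25 cm monolith area below is an assumption for illustration, not a dimension stated in the study:

```python
def earthworm_metrics(count, total_mass_g, monolith_area_m2=0.0625):
    """Density (indiv. m^-2), biomass (g m^-2) and B/D ratio (g/indiv.)
    from one hand-sorted monolith; a 25 x 25 cm monolith is assumed."""
    density = count / monolith_area_m2
    biomass = total_mass_g / monolith_area_m2
    return density, biomass, total_mass_g / count

d, b, bd = earthworm_metrics(count=5, total_mass_g=2.0)
print(d, b, bd)   # → 80.0 32.0 0.4
```

Scaling per-monolith counts to a per-square-metre basis is what makes plots with different sampling intensities comparable.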

  12. Unsupervised deep learning applied to breast density segmentation and mammographic risk scoring

    DEFF Research Database (Denmark)

    Kallenberg, Michiel Gijsbertus J.; Petersen, Peter Kersten; Nielsen, Mads

    2016-01-01

    Mammographic risk scoring has commonly been automated by extracting a set of handcrafted features from mammograms and relating the responses directly or indirectly to breast cancer risk. We present a method that learns a feature hierarchy from unlabeled data. When the learned features are used as the input to a simple classifier, two different tasks can be addressed: i) breast density segmentation, and ii) scoring of mammographic texture. The proposed model learns features at multiple scales. To control the model's capacity, a novel sparsity regularizer is introduced that incorporates both lifetime and population sparsity. We evaluated our method on three different clinical datasets. Our state-of-the-art results show that the learned breast density scores have a very strong positive relationship with manual ones, and that the learned texture scores are predictive of breast cancer. The model is easy...
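A penalty combining lifetime (per-feature, across samples) and population (per-sample, across features) sparsity can be sketched as below; this is a generic illustration of the two notions, not the paper's exact regularizer, and the target value and activation data are invented:

```python
import numpy as np

# Toy activations of a feature layer: rows = samples, cols = features.
rng = np.random.default_rng(2)
acts = np.abs(rng.normal(size=(100, 16)))

# Hypothetical combined sparsity penalty:
#  - lifetime sparsity: each feature should fire on few samples
#    (column means pushed toward a low target)
#  - population sparsity: each sample should use few features
#    (row means pushed toward the same target)
target = 0.1
lifetime = np.mean((acts.mean(axis=0) - target) ** 2)
population = np.mean((acts.mean(axis=1) - target) ** 2)
penalty = float(lifetime + population)
print(round(penalty, 4))
```

In training, such a penalty would be added to the reconstruction or classification loss so that capacity is spent on a small, distributed set of active features.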

  13. 3D-Laser-Scanning Technique Applied to Bulk Density Measurements of Apollo Lunar Samples

    Science.gov (United States)

    Macke, R. J.; Kent, J. J.; Kiefer, W. S.; Britt, D. T.

    2015-01-01

    In order to better interpret gravimetric data from orbiters such as GRAIL and LRO to understand the subsurface composition and structure of the lunar crust, it is important to have a reliable database of the density and porosity of lunar materials. To this end, we have been surveying these physical properties in both lunar meteorites and Apollo lunar samples. To measure porosity, both grain density and bulk density are required. For bulk density, our group has historically made extensive use of sub-mm bead immersion techniques, though several factors have made this technique problematic for our work with Apollo samples. Samples allocated for measurement are often smaller than optimal for the technique, leading to large error bars. Also, for some samples we were required to use pure alumina beads instead of our usual glass beads. The alumina beads were subject to undesirable static effects, producing unreliable results. Other investigators have tested the use of 3D laser scanners for measuring the bulk volumes of meteorites. Early work, though promising, was plagued with difficulties including poor response on dark or reflective surfaces, difficulty reproducing sharp edges, and long processing times for producing shape models. Due to progress in technology, however, laser scanners have improved considerably in recent years. We tested this technique on 27 lunar samples in the Apollo collection using a scanner at NASA Johnson Space Center. We found it to be reliable and more precise than beads, with the added benefit that it involves no direct contact with the sample, enabling the study of particularly friable samples for which bead immersion is not possible.
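
    The porosity the survey targets follows directly from the two densities named above. A small Python sketch with hypothetical sample values (the mass and volumes are invented; the laser scan supplies the bulk volume, a pycnometer the grain volume):

    ```python
    # Porosity from the two densities: phi = 1 - (bulk density / grain density),
    # which reduces to 1 - V_grain / V_bulk. All numbers here are hypothetical.

    def porosity(mass_g, grain_volume_cm3, bulk_volume_cm3):
        grain_density = mass_g / grain_volume_cm3  # g/cm3, e.g. from He pycnometry
        bulk_density = mass_g / bulk_volume_cm3    # g/cm3, bulk volume from the scan
        return 1.0 - bulk_density / grain_density

    phi = porosity(mass_g=150.0, grain_volume_cm3=48.0, bulk_volume_cm3=60.0)
    print(round(phi, 3))  # 0.2
    ```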

  14. The Significance of Regional Analysis in Applied Geography.

    Science.gov (United States)

    Sommers, Lawrence M.

    Regional analysis is central to applied geographic research, contributing to better planning and policy development for a variety of societal problems facing the United States. The development of energy policy serves as an illustration of the capabilities of this type of analysis. The United States has had little success in formulating a national…

  15. An investigation of magnetic field effects on plume density and temperature profiles of an applied-field MPD thruster

    Science.gov (United States)

    Bullock, S. Ray; Myers, R. M.

    1994-01-01

    Applied-field magnetoplasmadynamic (MPD) thruster performance is below levels required for primary propulsion missions. While MPD thruster performance has been found to increase with the magnitude of the applied-field strength, there is currently little understanding of the impact of applied-field shape on thruster performance. The results of a study in which a single applied-field thruster was operated using three solenoidal magnets with diameters of 12.7, 15.2, and 30.4-cm are presented. Thruster voltage and anode power deposition were measured for each applied field shape over a range of field strengths. Plume electron number density and temperature distributions were measured using a Langmuir probe in an effort to determine the effect of field shape on plume confinement by the diverging magnetic-field for each of the three magnetic field shapes. Results show that the dependence of the measured thruster characteristics on field shape were non-monotonic and that the field shape had a significant effect on the plume density and temperature profiles.

  16. Robust functional statistics applied to Probability Density Function shape screening of sEMG data.

    Science.gov (United States)

    Boudaoud, S; Rix, H; Al Harrach, M; Marin, F

    2014-01-01

    Recent studies pointed out possible shape modifications of the Probability Density Function (PDF) of surface electromyographical (sEMG) data in several contexts, such as fatigue and increasing muscle force. Following this idea, criteria have been proposed to monitor these shape modifications, mainly using High Order Statistics (HOS) parameters like skewness and kurtosis. In experimental conditions, these parameters are confronted with small sample sizes in the estimation process, which induce errors in the estimated HOS parameters and hinder real-time, precise sEMG PDF shape monitoring. Recently, a functional formalism, the Core Shape Model (CSM), has been used to analyse shape modifications of PDF curves. In this work, taking inspiration from the CSM method, robust functional statistics are proposed to emulate both skewness and kurtosis behaviors. These functional statistics combine kernel density estimation and PDF shape distances to evaluate shape modifications even in the presence of small sample sizes. The proposed statistics are then tested, using Monte Carlo simulations, on both normal and log-normal PDFs that mimic the observed sEMG PDF shape behavior during muscle contraction. According to the obtained results, the functional statistics appear to be more robust than HOS parameters to small-sample-size effects and more accurate for sEMG PDF shape screening applications.
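
    As a rough sketch of the idea, the snippet below contrasts raw sample skewness/kurtosis with the same quantities computed from a Gaussian kernel density estimate of the PDF. It only illustrates the KDE-smoothing ingredient, not the paper's actual CSM-based shape distances, and all data are synthetic:

    ```python
    import numpy as np

    def hos(x):
        """Raw sample skewness and excess kurtosis (the HOS parameters in the abstract)."""
        x = np.asarray(x, float)
        d = x - x.mean()
        m2 = (d ** 2).mean()
        return (d ** 3).mean() / m2 ** 1.5, (d ** 4).mean() / m2 ** 2 - 3.0

    def kde_smoothed_hos(x, h=None, grid=2048):
        """Skewness/kurtosis of a Gaussian-KDE-smoothed PDF estimate."""
        x = np.asarray(x, float)
        if h is None:                                  # Silverman's rule of thumb
            h = 1.06 * x.std() * len(x) ** -0.2
        t = np.linspace(x.min() - 4 * h, x.max() + 4 * h, grid)
        dt = t[1] - t[0]
        pdf = np.exp(-0.5 * ((t[:, None] - x[None, :]) / h) ** 2).sum(axis=1)
        pdf /= pdf.sum() * dt                          # normalize so it integrates to 1
        m = (t * pdf).sum() * dt
        c = lambda k: ((t - m) ** k * pdf).sum() * dt  # central moments of the estimate
        return c(3) / c(2) ** 1.5, c(4) / c(2) ** 2 - 3.0

    rng = np.random.default_rng(0)
    x = rng.lognormal(size=50)   # small right-skewed sample, mimicking an sEMG PDF shape
    print(hos(x))
    print(kde_smoothed_hos(x))
    ```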

  17. Difficulties in applying pure Kohn-Sham density functional theory electronic structure methods to protein molecules

    Science.gov (United States)

    Rudberg, Elias

    2012-02-01

    Self-consistency-based Kohn-Sham density functional theory (KS-DFT) electronic structure calculations with Gaussian basis sets are reported for a set of 17 protein-like molecules with geometries obtained from the Protein Data Bank. It is found that in many cases such calculations do not converge due to vanishing HOMO-LUMO gaps. A sequence of polyproline I helix molecules is also studied and it is found that self-consistency calculations using pure functionals fail to converge for helices longer than six proline units. Since the computed gap is strongly correlated to the fraction of Hartree-Fock exchange, test calculations using both pure and hybrid density functionals are reported. The tested methods include the pure functionals BLYP, PBE and LDA, as well as Hartree-Fock and the hybrid functionals BHandHLYP, B3LYP and PBE0. The effect of including solvent molecules in the calculations is studied, and it is found that the inclusion of explicit solvent molecules around the protein fragment in many cases gives a larger gap, but that convergence problems due to vanishing gaps still occur in calculations with pure functionals. In order to achieve converged results, some modeling of the charge distribution of solvent water molecules outside the electronic structure calculation is needed. Representing solvent water molecules by a simple point charge distribution is found to give non-vanishing HOMO-LUMO gaps for the tested protein-like systems also for pure functionals.

  18. The interaction between theory and experiment in charge density analysis

    International Nuclear Information System (INIS)

    Coppens, Phillip

    2013-01-01

    The field of X-ray charge density analysis has gradually morphed into an area benefiting from the strong interactions between theoreticians and experimentalists, leading to new concepts of chemical bonding and of intermolecular interactions in condensed phases. Some highlights of the developments culminating in the 2013 Aminoff Award are described in this paper. (comment)

  19. Reactivity descriptors and electron density analysis for ligand ...

    Indian Academy of Sciences (India)

    We discuss the evaluation of local descriptors using relaxed as well as frozen approximations and characterize the acceptor/donor characteristics of the above ligands. The intermolecular reactivity sequence for the same systems is examined via the global and local philicity indices. In addition, electron density analysis has ...

  20. Application of texture analysis method for mammogram density classification

    Science.gov (United States)

    Nithya, R.; Santhi, B.

    2017-07-01

    Mammographic density is considered a major risk factor for developing breast cancer. This paper proposes an automated approach to classify breast tissue types in digital mammogram. The main objective of the proposed Computer-Aided Diagnosis (CAD) system is to investigate various feature extraction methods and classifiers to improve the diagnostic accuracy in mammogram density classification. Texture analysis methods are used to extract the features from the mammogram. Texture features are extracted by using histogram, Gray Level Co-Occurrence Matrix (GLCM), Gray Level Run Length Matrix (GLRLM), Gray Level Difference Matrix (GLDM), Local Binary Pattern (LBP), Entropy, Discrete Wavelet Transform (DWT), Wavelet Packet Transform (WPT), Gabor transform and trace transform. These extracted features are selected using Analysis of Variance (ANOVA). The features selected by ANOVA are fed into the classifiers to characterize the mammogram into two-class (fatty/dense) and three-class (fatty/glandular/dense) breast density classification. This work has been carried out by using the mini-Mammographic Image Analysis Society (MIAS) database. Five classifiers are employed namely, Artificial Neural Network (ANN), Linear Discriminant Analysis (LDA), Naive Bayes (NB), K-Nearest Neighbor (KNN), and Support Vector Machine (SVM). Experimental results show that ANN provides better performance than LDA, NB, KNN and SVM classifiers. The proposed methodology has achieved 97.5% accuracy for three-class and 99.37% for two-class density classification.
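
    A compact sketch of one branch of this pipeline (GLCM texture features followed by ANOVA-based selection) is given below; the co-occurrence matrix, contrast feature and F statistic are standard definitions, but the 8-level patches are synthetic stand-ins for mammogram ROIs, not MIAS data:

    ```python
    import numpy as np

    def glcm(img, dx=1, dy=0, levels=8):
        """Gray Level Co-occurrence Matrix for one pixel offset."""
        g = np.zeros((levels, levels))
        h, w = img.shape
        for y in range(h - dy):
            for x in range(w - dx):
                g[img[y, x], img[y + dy, x + dx]] += 1
        return g / g.sum()

    def glcm_contrast(p):
        """Contrast, one of the standard Haralick-style GLCM features."""
        i, j = np.indices(p.shape)
        return ((i - j) ** 2 * p).sum()

    def anova_f(feature, labels):
        """One-way ANOVA F statistic, the selection criterion named in the abstract."""
        groups = [feature[labels == c] for c in np.unique(labels)]
        grand = feature.mean()
        ssb = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
        ssw = sum(((g - g.mean()) ** 2).sum() for g in groups)
        return (ssb / (len(groups) - 1)) / (ssw / (len(feature) - len(groups)))

    # Synthetic 8-level patches standing in for smooth (fatty) vs busy (dense) ROIs:
    rng = np.random.default_rng(1)
    patches = np.concatenate([rng.integers(0, 2, (20, 16, 16)),
                              rng.integers(0, 8, (20, 16, 16))])
    labels = np.array([0] * 20 + [1] * 20)
    contrast = np.array([glcm_contrast(glcm(p)) for p in patches])
    print(anova_f(contrast, labels))  # large F -> contrast separates the two classes
    ```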

  1. Analysis of optimum density of forest roads in rural properties

    Directory of Open Access Journals (Sweden)

    Flávio Cipriano de Assis do Carmo

    2013-09-01

    Full Text Available This study analyzed the density of roads on rural properties in the south of Espírito Santo and compared it with the optimal density calculated for forestry companies in steep areas. The work was carried out on six small rural properties, based on the costs of forest roads, wood extraction, and the loss of productive area. The technical analysis included a time and motion study and productivity; the economic analysis included operational costs, production costs and returns for different productivity scenarios (180 m.ha-1, 220 m.ha-1 and 250 m.ha-1). According to the results, all the properties have road densities well above the optimum, which reflects the lack of criteria in the planning of the forest stands and results in inadequate use of the plantation area. Property 1 had the highest road density (373.92 m.ha-1) and property 5 the lowest (111.56 m.ha-1).
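
    The notion of an optimum road density can be illustrated with the classical cost-balance model (road cost grows roughly linearly with density while extraction cost falls inversely). This is a textbook sketch with invented coefficients, not the cost model used in the study:

    ```python
    import math

    # Total cost per hectare C(d) = a*d + b/d, where a is road construction and
    # maintenance cost per metre of road and b captures extraction cost growing
    # as roads get sparser. Setting dC/dd = 0 gives the optimum d* = sqrt(b/a).

    def optimal_road_density(a, b):
        return math.sqrt(b / a)

    d_star = optimal_road_density(a=2.0, b=20000.0)  # hypothetical cost coefficients
    print(d_star)  # 100.0 (m/ha)
    ```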

  2. Negative reinforcement in applied behavior analysis: an emerging technology.

    OpenAIRE

    Iwata, B A

    1987-01-01

    Although the effects of negative reinforcement on human behavior have been studied for a number of years, a comprehensive body of applied research does not exist at this time. This article describes three aspects of negative reinforcement as it relates to applied behavior analysis: behavior acquired or maintained through negative reinforcement, the treatment of negatively reinforced behavior, and negative reinforcement as therapy. A consideration of research currently being done in these area...

  3. Applied data analysis and modeling for energy engineers and scientists

    CERN Document Server

    Reddy, T Agami

    2011-01-01

    "Applied Data Analysis and Modeling for Energy Engineers and Scientists" discusses mathematical models, data analysis, and decision analysis in modeling. The approach taken in this volume focuses on the modeling and analysis of thermal systems in an engineering environment, while also covering a number of other critical areas. Other material covered includes the tools that researchers and engineering professionals will need in order to explore different analysis methods, use critical assessment skills and reach sound engineering conclusions. The book also covers process and system design and

  4. MADNESS applied to density functional theory in chemistry and nuclear physics

    International Nuclear Information System (INIS)

    Fann, G I; Harrison, R J; Beylkin, G; Jia, J; Hartman-Baker, R; Shelton, W A; Sugiki, S

    2007-01-01

    We describe some recent mathematical results in constructing computational methods that lead to the development of fast and accurate multiresolution numerical methods for solving quantum chemistry and nuclear physics problems based on Density Functional Theory (DFT). Using low-separation-rank representations of functions and operators in conjunction with representations in multiwavelet bases, we developed a multiscale solution method for integral and differential equations and integral transforms. The Poisson equation, the Schrödinger equation, and the projector on divergence-free functions provide important examples with a wide range of applications in computational chemistry, nuclear physics, computational electromagnetics and fluid dynamics. We have implemented this approach along with adaptive representations of operators and functions in the multiwavelet basis and low-separation-rank (LSR) approximation of operators and functions. These methods have been realized and implemented in a software package called Multiresolution Adaptive Numerical Evaluation for Scientific Simulation (MADNESS)

  5. LAMQS analysis applied to ancient Egyptian bronze coins

    International Nuclear Information System (INIS)

    Torrisi, L.; Caridi, F.; Giuffrida, L.; Torrisi, A.; Mondio, G.; Serafino, T.; Caltabiano, M.; Castrizio, E.D.; Paniz, E.; Salici, A.

    2010-01-01

    Some Egyptian bronze coins, dated to the 6th-7th century A.D., are analyzed through different physical techniques in order to compare their composition and morphology and to identify their origin and type of manufacture. The investigations were performed using micro-invasive analyses, such as Laser Ablation and Mass Quadrupole Spectrometry (LAMQS), X-ray Fluorescence (XRF), Laser Induced Breakdown Spectroscopy (LIBS), Electron (SEM) and Optical Microscopy, Surface Profile Analysis (SPA) and density measurements. Results indicate that the coins have a similar bulk composition, but significant differences were evidenced due to different constituents of the patina, bulk alloy composition, isotopic ratios, density and surface morphology. The results are in agreement with the archaeological expectations, indicating that the coins were produced at two different Egyptian sites: Alexandria and Antinoupolis. A group of fake coins produced in Alexandria in the same historical period is also identified.

  6. Finite difference applied to the reconstruction method of the nuclear power density distribution

    International Nuclear Information System (INIS)

    Pessoa, Paulo O.; Silva, Fernando C.; Martinez, Aquilino S.

    2016-01-01

    Highlights: • A method for reconstruction of the power density distribution is presented. • The method uses a finite-difference discretization of the 2D neutron diffusion equation. • The discretization is performed on homogeneous meshes with the dimensions of a fuel cell. • The discretization is combined with flux distributions on the four node surfaces. • The maximum reconstruction errors occur in the peripheral water region. - Abstract: In this reconstruction method, the two-dimensional (2D) neutron diffusion equation is discretized by finite differences, applied to two energy groups (2G) and meshes with fuel-pin-cell dimensions. The Nodal Expansion Method (NEM) makes use of the surface discontinuity factors of the node and provides, for the reconstruction method, the effective multiplication factor of the problem and the four surface-average fluxes of homogeneous nodes with the size of a fuel assembly (FA). The reconstruction process combines the 2D diffusion equation discretized by finite differences with the flux distributions on the four surfaces of the nodes. These distributions are obtained, for each surface, from a fourth-order one-dimensional (1D) polynomial expansion with five coefficients to be determined. The conditions necessary for determining the coefficients are three average fluxes on consecutive surfaces of three nodes and two corner fluxes between these three surface fluxes. The corner fluxes of the node are determined using a third-order 1D polynomial expansion with four coefficients. This reconstruction method uses heterogeneous nuclear parameters directly, providing the heterogeneous neutron flux distribution and the detailed nuclear power density distribution within the FAs. The results obtained with this method have good accuracy and efficiency when compared with reference values.
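
    The surface-flux expansion described above (a fourth-order 1D polynomial pinned down by three average fluxes and two corner fluxes) can be sketched as a 5×5 linear solve. The node positions and flux values below are hypothetical, chosen only to show the construction:

    ```python
    import numpy as np

    # Sketch: p(x) = sum(c_k * x**k), k = 0..4, fixed by 3 average fluxes on
    # consecutive unit-width surfaces and 2 corner fluxes at interior boundaries.

    def average_row(x0, x1):
        """Coefficient row for the condition (1/(x1-x0)) * integral_{x0}^{x1} x**k dx."""
        return [(x1**(k + 1) - x0**(k + 1)) / ((k + 1) * (x1 - x0)) for k in range(5)]

    def point_row(x):
        """Coefficient row for the condition p(x) = given corner flux."""
        return [x**k for k in range(5)]

    nodes = [0.0, 1.0, 2.0, 3.0]        # boundaries of three consecutive surfaces
    avg_flux = [1.00, 1.20, 0.90]       # average flux on each surface segment
    corner_flux = [1.15, 1.10]          # corner fluxes at the interior boundaries

    A = np.array([average_row(nodes[i], nodes[i + 1]) for i in range(3)]
                 + [point_row(nodes[1]), point_row(nodes[2])])
    rhs = np.array(avg_flux + corner_flux)
    c = np.linalg.solve(A, rhs)         # the five polynomial coefficients

    p = np.polynomial.Polynomial(c)
    print(c)
    ```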

  7. Conference report: summary of the 2010 Applied Pharmaceutical Analysis Conference.

    Science.gov (United States)

    Unger, Steve E

    2011-01-01

    This year, the Applied Pharmaceutical Analysis meeting changed its venue to the Grand Tremont Hotel in Baltimore, MD, USA. Proximity to Washington presented the opportunity to have four speakers from the US FDA. The purpose of the 4-day conference is to provide a forum in which pharmaceutical and CRO scientists can discuss and develop best practices for scientific challenges in bioanalysis and drug metabolism. This year's theme was 'Bioanalytical and Biotransformation Challenges in Meeting Global Regulatory Expectations & New Technologies for Drug Discovery Challenges'. Applied Pharmaceutical Analysis continued its tradition of highlighting new technologies and its impact on drug discovery, drug metabolism and small molecule-regulated bioanalysis. This year, the meeting included an integrated focus on metabolism in drug discovery and development. Middle and large molecule (biotherapeutics) drug development, immunoassay, immunogenicity and biomarkers were also integrated into the forum. Applied Pharmaceutical Analysis offered an enhanced diversity of topics this year while continuing to share experiences of discovering and developing new medicines.

  8. Covariance analysis for energy density functionals and instabilities

    International Nuclear Information System (INIS)

    Roca-Maza, X; Colò, G; Paar, N

    2015-01-01

    We present the covariance analysis of two successful nuclear energy density functionals (EDFs), (i) a non-relativistic Skyrme functional built from a zero-range effective interaction, and (ii) a relativistic nuclear EDF based on density-dependent meson–nucleon couplings. The covariance analysis is a useful tool for understanding the limitations of a model, the correlations between observables and the statistical errors. We show, for our selected test nucleus 208Pb, that when the constraint on a property A included in the fit is relaxed, correlations with other observables B become larger; on the other hand, when a strong constraint is imposed on A, the correlations with other properties become very small. We also provide a brief review, partly connected with the covariance analysis, of some instabilities displayed by several EDFs currently used in nuclear physics. (paper)

  9. Thermodynamic analysis of Thermophotovoltaic Efficiency and Power Density Tradeoffs

    Energy Technology Data Exchange (ETDEWEB)

    P.F. Baldasara; J.E. Reynolds; G.W. Charache; D.M. DePoy; C.T. Ballinger; T. Donovan; J.M. Borrego

    2000-02-22

    This report presents an assessment of the efficiency and power density limitations of thermophotovoltaic (TPV) energy conversion systems for both ideal (radiative-limited) and practical (defect-limited) systems. Thermodynamics is integrated into the unique process physics of TPV conversion, and used to define the intrinsic tradeoff between power density and efficiency. The results of the analysis reveal that the selection of diode bandgap sets a limit on achievable efficiency well below the traditional Carnot level. In addition it is shown that filter performance dominates diode performance in any practical TPV system and determines the optimum bandgap for a given radiator temperature. It is demonstrated that for a given radiator temperature, lower bandgap diodes enable both higher efficiency and power density when spectral control limitations are included. The goal of this work is to provide a better understanding of the basic system limitations that will enable successful long-term development of TPV energy conversion technology.

  10. Urinary density measurement and analysis methods in neonatal unit care

    Directory of Open Access Journals (Sweden)

    Maria Vera Lúcia Moreira Leitão Cardoso

    2013-09-01

    Full Text Available The objective was to assess urine collection methods using cotton in contact with the genitalia and a urinary collector to measure urinary density in newborns. This is a quantitative intervention study carried out in a neonatal unit in Fortaleza-CE, Brazil, in 2010. The sample consisted of 61 newborns randomly chosen to compose the study group. Most neonates were full term (31/50.8%) and male (33/54%). Urinary density measurements obtained with the cotton and collector methods presented statistically significant differences (p<0.05). The analysis of interquartile ranges between subgroups showed statistical differences between urinary collector/reagent strip (1005) and cotton/reagent strip (1010); however, there was no difference between urinary collector/refractometer (1008) and cotton/refractometer. Therefore, further research should be conducted with larger samples using the methods investigated in this study, whenever possible comparing urine density values to laboratory tests.

  11. Range-separated density-functional theory with random phase approximation applied to noncovalent intermolecular interactions.

    Science.gov (United States)

    Zhu, Wuming; Toulouse, Julien; Savin, Andreas; Angyán, János G

    2010-06-28

    Range-separated methods combining a short-range density functional with long-range random phase approximations (RPAs), with or without an exchange response kernel, are tested on rare-gas dimers and the S22 benchmark set of weakly interacting complexes of Jurecka et al. [Phys. Chem. Chem. Phys. 8, 1985 (2006)]. The methods are also compared to full-range RPA approaches. Both range separation and inclusion of the Hartree-Fock exchange kernel largely improve the accuracy of intermolecular interaction energies. The best results are obtained with the method called RSH+RPAx, which yields interaction energies for the S22 set with an estimated mean absolute error of about 0.5-0.6 kcal/mol, corresponding to a mean absolute percentage error of about 7%-9% depending on the reference interaction energies used. In particular, the RSH+RPAx method is found to be overall more accurate than the range-separated method based on long-range second-order Møller-Plesset (MP2) perturbation theory (RSH+MP2).
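
    The two error measures quoted above relate as follows; a minimal sketch with invented interaction energies (NOT the actual S22 values):

    ```python
    import numpy as np

    def mae_mape(pred, ref):
        """Mean absolute error (same units as inputs) and mean absolute percentage error."""
        pred, ref = np.asarray(pred, float), np.asarray(ref, float)
        err = np.abs(pred - ref)
        return err.mean(), 100.0 * (err / np.abs(ref)).mean()

    # Hypothetical interaction energies in kcal/mol, for illustration only:
    ref  = [-3.17, -5.02, -18.61, -16.11]
    pred = [-3.05, -4.60, -18.10, -15.20]
    mae, mape = mae_mape(pred, ref)
    print(round(mae, 2), round(mape, 1))  # 0.49 5.1
    ```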

  12. Machine learning applied to proton radiography of high-energy-density plasmas

    Science.gov (United States)

    Chen, Nicholas F. Y.; Kasim, Muhammad Firmansyah; Ceurvorst, Luke; Ratan, Naren; Sadler, James; Levy, Matthew C.; Trines, Raoul; Bingham, Robert; Norreys, Peter

    2017-04-01

    Proton radiography is a technique extensively used to resolve magnetic field structures in high-energy-density plasmas, revealing a whole variety of interesting phenomena such as magnetic reconnection and collisionless shocks found in astrophysical systems. Existing methods of analyzing proton radiographs give mostly qualitative results or specific quantitative parameters, such as magnetic field strength, and recent work showed that the line-integrated transverse magnetic field can be reconstructed in specific regimes where many simplifying assumptions were needed. Using artificial neural networks, we demonstrate for the first time 3D reconstruction of magnetic fields in the nonlinear regime, an improvement over existing methods, which reconstruct only in 2D and in the linear regime. A proof of concept is presented here, with mean reconstruction errors of less than 5% even after introducing noise. We demonstrate that over the long term, this approach is more computationally efficient compared to other techniques. We also highlight the need for proton tomography because (i) certain field structures cannot be reconstructed from a single radiograph and (ii) errors can be further reduced when reconstruction is performed on radiographs generated by proton beams fired in different directions.

  13. Spectral analysis and filter theory in applied geophysics

    CERN Document Server

    Buttkus, Burkhard

    2000-01-01

    This book is intended to be an introduction to the fundamentals and methods of spectral analysis and filter theory and their applications in geophysics. The principles and theoretical basis of the various methods are described, their efficiency and effectiveness evaluated, and instructions provided for their practical application. Besides the conventional methods, newer methods are discussed, such as the spectral analysis of random processes by fitting models to the observed data, maximum-entropy spectral analysis and maximum-likelihood spectral analysis, the Wiener and Kalman filtering methods, homomorphic deconvolution, and adaptive methods for nonstationary processes. Multidimensional spectral analysis and filtering, as well as multichannel filters, are given extensive treatment. The book provides a survey of the state-of-the-art of spectral analysis and filter theory. The importance and possibilities of spectral analysis and filter theory in geophysics for data acquisition, processing an...

  14. Animal research in the Journal of Applied Behavior Analysis.

    Science.gov (United States)

    Edwards, Timothy L; Poling, Alan

    2011-01-01

    This review summarizes the 6 studies with nonhuman animal subjects that have appeared in the Journal of Applied Behavior Analysis and offers suggestions for future research in this area. Two of the reviewed articles described translational research in which pigeons were used to illustrate and examine behavioral phenomena of applied significance (say-do correspondence and fluency), 3 described interventions that changed animals' behavior (self-injury by a baboon, feces throwing and spitting by a chimpanzee, and unsafe trailer entry by horses) in ways that benefited the animals and the people in charge of them, and 1 described the use of trained rats that performed a service to humans (land-mine detection). We suggest that each of these general research areas merits further attention and that the Journal of Applied Behavior Analysis is an appropriate outlet for some of these publications.

  15. Electron beam irradiation process applied to primary and secondary recycled high density polyethylene

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Jéssica R.; Moura, Eduardo de; Geraldo, Áurea B.C., E-mail: ageraldo@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    Plastic bags, packaging and furniture items are examples of the plastic utilities always present in everyday life. However, the end-of-life of plastics impacts the environment because of this ubiquity and their often long degradation times. Recycling processes are important in this scenario because they offer many solutions to this problem. Basically, four routes are known for plastic recycling: primary recycling, which consists of the re-extrusion of clean plastic scraps from a production plant; secondary recycling, which uses end-of-life products that are generally reduced in size by extrusion to obtain a shape more suitable for reprocessing (pellets and powder); tertiary recovery, which uses thermo-chemical methods to produce fuels and petrochemical feedstock; and the quaternary route, which is energy recovery carried out in appropriate reactors. In this work, high density polyethylene (HDPE) was recovered to empirically simulate the primary and secondary recycling routes, using materials ranging from pristine to 20-fold re-extruded. The final 20-fold recycled thermoplastic was irradiated in an electron beam accelerator at a dose rate of 22.4 kGy/s and absorbed doses of 50 kGy and 100 kGy. The characterization of HDPE at the distinct levels of recovery was performed by infrared spectroscopy (FTIR) and thermogravimetric degradation analysis. In HDPE recycling, degradation and crosslinking are consecutive processes; degradation is very noticeable in the 20-fold recycled product. Despite this, the 20-fold recycled product crosslinks under irradiation, and the post-irradiation product is similar in its spectroscopic and thermal degradation characteristics to pristine, irradiated HDPE. These results are discussed. (author)
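
    The irradiation parameters quoted above imply very short beam-on times, assuming the absorbed dose simply accumulates as rate × time:

    ```python
    # Beam-on time implied by dose = dose_rate * time, using the quoted
    # 22.4 kGy/s dose rate and the 50 / 100 kGy absorbed doses.

    def exposure_time_s(dose_kGy, dose_rate_kGy_per_s=22.4):
        return dose_kGy / dose_rate_kGy_per_s

    print(round(exposure_time_s(50), 2))   # 2.23
    print(round(exposure_time_s(100), 2))  # 4.46
    ```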

  16. Electron beam irradiation process applied to primary and secondary recycled high density polyethylene

    International Nuclear Information System (INIS)

    Cardoso, Jéssica R.; Moura, Eduardo de; Geraldo, Áurea B.C.

    2017-01-01

    Plastic bags, packaging and furniture items are examples of the plastic utilities always present in everyday life. However, the end-of-life of plastics impacts the environment because of this ubiquity and their often long degradation times. Recycling processes are important in this scenario because they offer many solutions to this problem. Basically, four routes are known for plastic recycling: primary recycling, which consists of the re-extrusion of clean plastic scraps from a production plant; secondary recycling, which uses end-of-life products that are generally reduced in size by extrusion to obtain a shape more suitable for reprocessing (pellets and powder); tertiary recovery, which uses thermo-chemical methods to produce fuels and petrochemical feedstock; and the quaternary route, which is energy recovery carried out in appropriate reactors. In this work, high density polyethylene (HDPE) was recovered to empirically simulate the primary and secondary recycling routes, using materials ranging from pristine to 20-fold re-extruded. The final 20-fold recycled thermoplastic was irradiated in an electron beam accelerator at a dose rate of 22.4 kGy/s and absorbed doses of 50 kGy and 100 kGy. The characterization of HDPE at the distinct levels of recovery was performed by infrared spectroscopy (FTIR) and thermogravimetric degradation analysis. In HDPE recycling, degradation and crosslinking are consecutive processes; degradation is very noticeable in the 20-fold recycled product. Despite this, the 20-fold recycled product crosslinks under irradiation, and the post-irradiation product is similar in its spectroscopic and thermal degradation characteristics to pristine, irradiated HDPE. These results are discussed. (author)

  17. The spread of behavior analysis to the applied fields 1

    OpenAIRE

    Fraley, Lawrence E.

    1981-01-01

    This paper reviews the status of applied behavioral science as it exists in the various behavioral fields and considers the role of the Association for Behavior Analysis in serving those fields. The confounding effects of the traditions of psychology are discussed. Relevant issues are exemplified in the fields of law, communications, psychology, and education, but broader generalization is implied.

  18. Context, Cognition, and Biology in Applied Behavior Analysis.

    Science.gov (United States)

    Morris, Edward K.

    Behavior analysts are having their professional identities challenged by the roles that cognition and biology are said to play in the conduct and outcome of applied behavior analysis and behavior therapy. For cogniphiliacs, cognition and biology are central to their interventions because cognition and biology are said to reflect various processes,…

  19. X-ray fluorescence spectrometry applied to soil analysis

    International Nuclear Information System (INIS)

    Salvador, Vera Lucia Ribeiro; Sato, Ivone Mulako; Scapin Junior, Wilson Santo; Scapin, Marcos Antonio; Imakima, Kengo

    1997-01-01

    This paper studies the X-ray fluorescence spectrometry applied to the soil analysis. A comparative study of the WD-XRFS and ED-XRFS techniques was carried out by using the following soil samples: SL-1, SOIL-7 and marine sediment SD-M-2/TM, from IAEA, and clay, JG-1a from Geological Survey of Japan (GSJ)

  20. Progressive-Ratio Schedules and Applied Behavior Analysis

    Science.gov (United States)

    Poling, Alan

    2010-01-01

    Establishing appropriate relations between the basic and applied areas of behavior analysis has been of long and persistent interest to the author. In this article, the author illustrates that there is a direct relation between how hard an organism will work for access to an object or activity, as indexed by the largest ratio completed under a…

  1. B. F. Skinner's Contributions to Applied Behavior Analysis

    Science.gov (United States)

    Morris, Edward K.; Smith, Nathaniel G.; Altus, Deborah E.

    2005-01-01

    Our paper reviews and analyzes B. F. Skinner's contributions to applied behavior analysis in order to assess his role as the field's originator and founder. We found, first, that his contributions fall into five categorizes: the style and content of his science, his interpretations of typical and atypical human behavior, the implications he drew…

  2. Applied Behavior Analysis: Current Myths in Public Education

    Science.gov (United States)

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  3. Applied Behavior Analysis Is a Science And, Therefore, Progressive

    Science.gov (United States)

    Leaf, Justin B.; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K.; Smith, Tristram; Weiss, Mary Jane

    2016-01-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful…

  4. Positive Behavior Support and Applied Behavior Analysis: A Familial Alliance

    Science.gov (United States)

    Dunlap, Glen; Carr, Edward G.; Horner, Robert H.; Zarcone, Jennifer R.; Schwartz, Ilene

    2008-01-01

    Positive behavior support (PBS) emerged in the mid-1980s as an approach for understanding and addressing problem behaviors. PBS was derived primarily from applied behavior analysis (ABA). Over time, however, PBS research and practice has incorporated evaluative methods, assessment and intervention procedures, and conceptual perspectives associated…

  5. Measuring the grafting density of nanoparticles in solution by analytical ultracentrifugation and total organic carbon analysis.

    Science.gov (United States)

    Benoit, Denise N; Zhu, Huiguang; Lilierose, Michael H; Verm, Raymond A; Ali, Naushaba; Morrison, Adam N; Fortner, John D; Avendano, Carolina; Colvin, Vicki L

    2012-11-06

    Many of the solution phase properties of nanoparticles, such as their colloidal stability and hydrodynamic diameter, are governed by the number of stabilizing groups bound to the particle surface (i.e., grafting density). Here, we show how two techniques, analytical ultracentrifugation (AUC) and total organic carbon analysis (TOC), can be applied separately to the measurement of this parameter. AUC directly measures the density of nanoparticle-polymer conjugates while TOC provides the total carbon content of its aqueous dispersions. When these techniques are applied to model gold nanoparticles capped with thiolated poly(ethylene glycol), the measured grafting densities across a range of polymer chain lengths, polymer concentrations, and nanoparticle diameters agree to within 20%. Moreover, the measured grafting densities correlate well with the polymer content determined by thermogravimetric analysis of solid conjugate samples. Using these tools, we examine the particle core diameter, polymer chain length, and polymer solution concentration dependence of nanoparticle grafting densities in a gold nanoparticle-poly(ethylene glycol) conjugate system.
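    The TOC side of this measurement reduces to simple stoichiometry: carbon concentration to moles of chains, chains per particle, then chains per unit core surface area. A minimal sketch, assuming (as in the paper's gold-PEG system) that all measured organic carbon belongs to the grafted polymer; the function name and all numeric inputs are illustrative, not values from the study.

```python
import math

def grafting_density_from_toc(toc_mg_per_l, carbons_per_chain,
                              particles_per_l, core_diameter_nm):
    """Chains per nm^2 from a TOC reading, assuming every measured
    organic carbon atom belongs to the grafted polymer."""
    mol_carbon = (toc_mg_per_l / 1000.0) / 12.011    # mol C per litre
    mol_chains = mol_carbon / carbons_per_chain
    chains_per_particle = mol_chains * 6.022e23 / particles_per_l
    area_nm2 = math.pi * core_diameter_nm ** 2       # sphere surface area
    return chains_per_particle / area_nm2

# illustrative inputs only (not values from the study)
sigma = grafting_density_from_toc(1.0, 100, 1e14, 10.0)
```

AUC approaches the same number from the particle side (conjugate density), which is what makes the 20% cross-technique agreement reported above meaningful.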

  6. Theoretical analysis of the density wave in a new continuum model and numerical simulation

    Science.gov (United States)

    Lai, Ling-Ling; Cheng, Rong-Jun; Li, Zhi-Peng; Ge, Hong-Xia

    2014-05-01

    Considering the effect of traffic anticipation in the real world, a new anticipation driving car-following model (AD-CF) was proposed by Zheng et al. Based on the AD-CF model, and adopting an asymptotic approximation between headway and density, a new continuum model is presented in this paper. The neutral stability condition is obtained by applying linear stability theory. Additionally, the Korteweg-de Vries (KdV) equation is derived via nonlinear analysis to describe the propagating behavior of the traffic density wave near the neutral stability line. The numerical simulation and the analytical results show that the new continuum model is capable of explaining some particular traffic phenomena.

  7. Effect of Magnetic Flux Density and Applied Current on Temperature, Velocity and Entropy Generation Distributions in MHD Pumps

    Directory of Open Access Journals (Sweden)

    M. Kiyasatfar

    2011-01-01

    Full Text Available In the present study, a simulation of steady-state, incompressible and fully developed laminar flow in a magnetohydrodynamic (MHD) pump has been conducted. The governing equations are solved numerically by the finite-difference method. The effect of the magnetic flux density and current on the flow and temperature distributions in an MHD pump is investigated. The obtained results showed that the flow and the temperature can be controlled through the applied current and the magnetic flux. Furthermore, the effects of the magnetic flux density and current on entropy generation in the MHD pump are considered. Our numerical results are in good agreement with the experimental data reported in the literature.

  8. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  9. Applying a random encounter model to estimate lion density from camera traps in Serengeti National Park, Tanzania.

    Science.gov (United States)

    Cusack, Jeremy J; Swanson, Alexandra; Coulson, Tim; Packer, Craig; Carbone, Chris; Dickman, Amy J; Kosmala, Margaret; Lintott, Chris; Rowcliffe, J Marcus

    2015-08-01

    The random encounter model (REM) is a novel method for estimating animal density from camera trap data without the need for individual recognition. It has never been used to estimate the density of large carnivore species, despite these being the focus of most camera trap studies worldwide. In this context, we applied the REM to estimate the density of female lions (Panthera leo) from camera traps implemented in Serengeti National Park, Tanzania, comparing estimates to reference values derived from pride census data. More specifically, we attempted to account for bias resulting from non-random camera placement at lion resting sites under isolated trees by comparing estimates derived from night versus day photographs, between dry and wet seasons, and between habitats that differ in their amount of tree cover. Overall, we recorded 169 and 163 independent photographic events of female lions from 7,608 and 12,137 camera trap days carried out in the dry season of 2010 and the wet season of 2011, respectively. Although all REM models considered over-estimated female lion density, models that considered only night-time events resulted in estimates that were much less biased relative to those based on all photographic events. We conclude that restricting REM estimation to periods and habitats in which animal movement is more likely to be random with respect to cameras can help reduce bias in estimates of density for female Serengeti lions. We highlight that accurate REM estimates will nonetheless be dependent on reliable measures of average speed of animal movement and camera detection zone dimensions. © 2015 The Authors. Journal of Wildlife Management published by Wiley Periodicals, Inc. on behalf of The Wildlife Society.
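    The REM itself has a closed form (Rowcliffe et al. 2008), turning a photographic rate into a density. A minimal sketch using the dry-season event and effort counts from the record; the speed and detection-zone parameters below are placeholders, not the study's measured values, which is exactly why the authors stress that reliable measures of both are critical.

```python
import math

def rem_density(y, t, v, r, theta):
    """Random encounter model density (Rowcliffe et al. 2008):
    D = (y/t) * pi / (v * r * (2 + theta)).
    y: independent photographic events; t: camera-trap days;
    v: average speed (km/day); r: detection radius (km);
    theta: detection arc (rad). Returns animals per km^2."""
    return (y / t) * math.pi / (v * r * (2.0 + theta))

# y and t from the dry season of 2010; v, r, theta are placeholder values
d_dry = rem_density(y=169, t=7608, v=12.0, r=0.01, theta=0.7)
```

Because v, r and theta enter multiplicatively in the denominator, any bias in them maps directly onto the density estimate.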

  10. Validation in Principal Components Analysis Applied to EEG Data

    Directory of Open Access Journals (Sweden)

    João Carlos G. D. Costa

    2014-01-01

    Full Text Available The well-known multivariate technique Principal Components Analysis (PCA) is usually applied to a sample, and so component scores are subjected to sampling variability. However, few studies address their stability, an important topic when the sample size is small. This work presents three validation procedures applied to PCA, based on confidence regions generated by a variant of a nonparametric bootstrap called the partial bootstrap: (i) the assessment of PC score variability by the spread and overlapping of “confidence regions” plotted around these scores; (ii) the use of the confidence region centroids as a validation set; and (iii) the definition of the number of nontrivial axes to be retained for analysis. The methods were applied to EEG data collected during a postural control protocol with twenty-four volunteers. Two axes were retained for analysis, with 91.6% of explained variance. Results showed that the area of the confidence regions provided useful insights on the variability of scores and suggested that some subjects were not distinguishable from others, which was not evident from the principal planes. In addition, potential outliers, initially suggested by an analysis of the first principal plane, could not be confirmed by the confidence regions.
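    A minimal sketch of the bootstrap-around-PCA idea, under the assumption that replicate axes are recomputed on each resample, sign-aligned with the reference axes, and the original observations are then projected onto them; the spread of the replicate scores per observation plays the role of its confidence region. The data are synthetic stand-ins for the EEG feature matrix, and this is one plausible variant, not necessarily the paper's exact partial-bootstrap recipe.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(24, 6))            # stand-in for the EEG feature matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
axes = Vt[:2]                            # two retained principal axes
scores = Xc @ axes.T
explained = (s[:2] ** 2).sum() / (s ** 2).sum()

B = 200
reps = np.empty((B,) + scores.shape)
for b in range(B):
    idx = rng.integers(0, len(X), len(X))
    Xb = X[idx] - X[idx].mean(axis=0)
    Vb = np.linalg.svd(Xb, full_matrices=False)[2][:2]
    for k in range(2):                   # sign-align replicate axes
        if Vb[k] @ axes[k] < 0:
            Vb[k] = -Vb[k]
    reps[b] = Xc @ Vb.T                  # original points on replicate axes

spread = reps.std(axis=0)                # per-subject confidence-region size
```

Overlapping spreads for two subjects correspond to the paper's observation that some volunteers were not distinguishable on the principal plane.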

  11. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical imaging, and heat and wave equations. Fo...

  12. Boston Society's 11th Annual Applied Pharmaceutical Analysis conference.

    Science.gov (United States)

    Lee, Violet; Liu, Ang; Groeber, Elizabeth; Moghaddam, Mehran; Schiller, James; Tweed, Joseph A; Walker, Gregory S

    2016-02-01

    Boston Society's 11th Annual Applied Pharmaceutical Analysis conference, Hyatt Regency Hotel, Cambridge, MA, USA, 14-16 September 2015. The Boston Society's 11th Annual Applied Pharmaceutical Analysis (APA) conference took place at the Hyatt Regency hotel in Cambridge, MA, on 14-16 September 2015. The 3-day conference affords pharmaceutical professionals, academic researchers and industry regulators the opportunity to collectively participate in meaningful and relevant discussions impacting the areas of pharmaceutical drug development. The APA conference was organized in three workshops encompassing the disciplines of regulated bioanalysis, discovery bioanalysis (encompassing new and emerging technologies) and biotransformation. The conference included a short course titled 'Bioanalytical considerations for the clinical development of antibody-drug conjugates (ADCs)', an engaging poster session, several panel and round table discussions and over 50 diverse talks from leading industry and academic scientists.

  13. Applying DEA sensitivity analysis to efficiency measurement of Vietnamese universities

    Directory of Open Access Journals (Sweden)

    Thi Thanh Huyen Nguyen

    2015-11-01

    Full Text Available The primary purpose of this study is to measure the technical efficiency of 30 doctorate-granting universities (universities or higher education institutes with PhD training programs) in Vietnam, applying the sensitivity analysis of data envelopment analysis (DEA). The study uses eight sets of input-output specifications, obtained by replacing as well as aggregating/disaggregating variables. The measurement results allow us to examine how sensitive the efficiency of these universities is to the choice of variable sets. The findings also show the impact of the variables on efficiency and its “sustainability”.

  14. Magnetic Solid Phase Extraction Applied to Food Analysis

    Directory of Open Access Journals (Sweden)

    Israel S. Ibarra

    2015-01-01

    Full Text Available Magnetic solid phase extraction has been used as a pretreatment technique for the analysis of several compounds because of its advantages over classic methods. This methodology is based on the use of magnetic solids as adsorbents for the preconcentration of different analytes from complex matrices. Magnetic solid phase extraction minimizes the use of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique in food analysis.

  15. Harmonic and applied analysis from groups to signals

    CERN Document Server

    Mari, Filippo; Grohs, Philipp; Labate, Demetrio

    2015-01-01

    This contributed volume explores the connection between the theoretical aspects of harmonic analysis and the construction of advanced multiscale representations that have emerged in signal and image processing. It highlights some of the most promising mathematical developments in harmonic analysis in the last decade brought about by the interplay among different areas of abstract and applied mathematics. This intertwining of ideas is considered starting from the theory of unitary group representations and leading to the construction of very efficient schemes for the analysis of multidimensional data. After an introductory chapter surveying the scientific significance of classical and more advanced multiscale methods, chapters cover such topics as An overview of Lie theory focused on common applications in signal analysis, including the wavelet representation of the affine group, the Schrödinger representation of the Heisenberg group, and the metaplectic representation of the symplectic group An introduction ...

  16. Equivalent Circulation Density Analysis of Geothermal Well by Coupling Temperature

    Directory of Open Access Journals (Sweden)

    Xiuhua Zheng

    2017-02-01

    Full Text Available The accurate control of the wellbore pressure not only prevents lost circulation/blowout and formation fracturing by managing the density of the drilling fluid, but also improves productivity by mitigating reservoir damage. Calculating the wellbore pressure of a geothermal well with constant parameters easily introduces large errors, as the changes of the physical, rheological and thermal properties of drilling fluids with temperature are neglected. This paper researched wellbore pressure coupling by calculating the temperature distribution with an existing model, fitting the density of the drilling fluid as a function of temperature, and establishing mathematical models to simulate the wellbore pressures, which are expressed as the variation of Equivalent Circulating Density (ECD) under different conditions. With this method, the temperature and ECDs in the wellbore of the first medium-deep geothermal well, ZK212 in the Yangyi Geothermal Field in Tibet, were determined, and a sensitivity analysis was simulated with assumed parameters, i.e., the circulating time, flow rate, geothermal gradient, diameter of the wellbore, rheological models and regimes. The results indicated that the geothermal gradient and flow rate were the most influential parameters on the temperature and ECD distribution, and additives in the drilling fluid should be added carefully as they change the properties of the drilling fluid and induce a redistribution of temperature. To ensure safe drilling and appropriate velocities when tripping pipe into the hole, the depth and diameter of the wellbore must be considered when controlling the surge pressure.
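    The ECD concept itself can be illustrated with a toy calculation: correct the static mud density for temperature, then fold the annular frictional pressure loss in as extra equivalent density. This is a sketch under assumed values; the linear density-temperature coefficient and all numbers are illustrative, not fitted data from the ZK212 well.

```python
def ecd_kg_per_m3(rho_surface, c_thermal, temp_avg, temp_ref,
                  dp_annulus_pa, tvd_m, g=9.81):
    """Equivalent circulating density sketch: mud density corrected
    linearly for temperature, plus the annular frictional pressure
    loss expressed as an equivalent static density."""
    rho_t = rho_surface * (1.0 - c_thermal * (temp_avg - temp_ref))
    return rho_t + dp_annulus_pa / (g * tvd_m)

# illustrative numbers: 1100 kg/m^3 mud, 60 degC of average heating,
# 0.5 MPa annular loss over 2000 m of true vertical depth
ecd = ecd_kg_per_m3(1100.0, 4e-4, 80.0, 20.0, 5e5, 2000.0)
```

The two competing effects in the record are visible here: heating lowers the static density term while circulation adds the frictional term.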

  17. Analysis scheme of density modulation experiments for particle confinements study

    International Nuclear Information System (INIS)

    Tanaka, K.; Michael, C.; Kawanata, K.; Tokuzawa, T.; Shoji, M.; Toi, K.; Gao, X.; Jie, Y.X.

    2005-01-01

    Density modulation experiments are one of the most powerful experimental schemes for studying particle confinement. The diffusion coefficient (D) and convection velocity (V), which cannot be evaluated separately from the particle balance in the equilibrium state, can be obtained independently. Moreover, the estimated values of D and V are determined independently of the absolute value of the particle source rate, which is difficult to obtain experimentally. However, the sensitivities and the interpretation of D and V from modulation experiments require care. In this paper, numerical techniques to solve the particle balance equation for the modulated components are described. Examples of analysis are shown for data from LHD, and interpretations of the results of modulation experiments are studied. (author)
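    In frequency space, the modulated component of a 1D particle balance reduces to a linear boundary-value problem that can be solved directly for its amplitude and phase profiles, which is the kind of numerical machinery the record describes. A slab-geometry sketch with assumed D, V, source shape and boundary conditions, not LHD parameters:

```python
import numpy as np

# Frequency-domain form of the modulated particle balance in a slab:
#   i*w*n = D*n'' - V*n' + S,   with n = 0 at both boundaries
N = 200
x = np.linspace(0.0, 1.0, N)
dx = x[1] - x[0]
D, V = 0.5, -2.0                     # assumed diffusivity and convection
w = 2.0 * np.pi * 25.0               # 25 Hz modulation frequency

# localized modulated particle source near the edge
S = np.exp(-((x - 0.95) / 0.02) ** 2)

# tridiagonal finite-difference operator for D*n'' - V*n' - i*w*n = -S
main = np.full(N, -2.0 * D / dx**2 - 1j * w)
upper = np.full(N - 1, D / dx**2 - V / (2.0 * dx))
lower = np.full(N - 1, D / dx**2 + V / (2.0 * dx))
A = np.diag(main) + np.diag(upper, 1) + np.diag(lower, -1)
A[0, :] = 0.0; A[0, 0] = 1.0         # Dirichlet boundary at x = 0
A[-1, :] = 0.0; A[-1, -1] = 1.0      # Dirichlet boundary at x = 1
rhs = -S.astype(complex)
rhs[0] = rhs[-1] = 0.0

n_mod = np.linalg.solve(A, rhs)
amplitude, phase = np.abs(n_mod), np.angle(n_mod)
```

Fitting measured amplitude and phase profiles against such solutions over a grid of (D, V) is one standard way to extract both quantities independently of the absolute source rate.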

  18. Analysis of Brick Masonry Wall using Applied Element Method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the Finite Element Method (FEM). In AEM, however, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry walls can be effectively analyzed within the framework of AEM. The composite nature of a masonry wall can be easily modelled using springs, with the brick springs and mortar springs assumed to be connected in series. The brick masonry wall is analyzed and the failure load is determined for different loading cases. The results were used to find the brick aspect ratio that best strengthens a brick masonry wall.
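    The series coupling of brick and mortar springs can be sketched in a few lines, assuming the standard AEM spring stiffness form k = E d t / a (d: spring spacing, t: element thickness, a: distance between element centroids); the numeric values are arbitrary.

```python
def spring_stiffness(E, d, t, a):
    """Stiffness of one AEM connecting spring in the standard form
    k = E * d * t / a (d: spring spacing, t: element thickness,
    a: distance between element centroids)."""
    return E * d * t / a

def series_stiffness(k_brick, k_mortar):
    """Equivalent stiffness of a brick spring and a mortar spring
    connected in series, as the record describes for masonry."""
    return 1.0 / (1.0 / k_brick + 1.0 / k_mortar)
```

The series form makes the physical point directly: the much softer mortar spring dominates the equivalent stiffness of the joint.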

  19. Analysis of concrete beams using applied element method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, the structure is analysed by dividing it into several elements, as in FEM, but the elements are connected by springs instead of nodes. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate its application, AEM is used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs does not have much influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.

  20. Crowd Analysis by Using Optical Flow and Density Based Clustering

    DEFF Research Database (Denmark)

    Santoro, Francesco; Pedro, Sergio; Tan, Zheng-Hua

    2010-01-01

    In this paper, we present a system to detect and track crowds in a video sequence captured by a camera. In a first step, we compute optical flows by means of pyramidal Lucas-Kanade feature tracking. Afterwards, density based clustering is used to group similar vectors. In the last step, a crowd tracker is applied in every frame, allowing us to detect and track the crowds. Our system gives the output as a graphic overlay, i.e. it adds arrows and colors to the original frame sequence, in order to identify crowds and their movements. For the evaluation, we check when our system detects certain…
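    The clustering step can be sketched with a minimal pure-NumPy density-based clusterer standing in for a library DBSCAN implementation, applied to synthetic (position, velocity) flow vectors for two coherent crowds; the feature layout and parameters are illustrative, not the paper's.

```python
import numpy as np

def dbscan(points, eps, min_pts):
    """Minimal density-based clustering (DBSCAN-style); -1 marks noise."""
    n = len(points)
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    neighbors = [np.flatnonzero(row <= eps) for row in dist]
    labels = np.full(n, -1)
    visited = np.zeros(n, dtype=bool)
    cluster = 0
    for i in range(n):
        if visited[i] or len(neighbors[i]) < min_pts:
            continue
        visited[i] = True
        labels[i] = cluster
        stack = [i]
        while stack:                      # grow the cluster from core points
            j = stack.pop()
            for k in neighbors[j]:
                if labels[k] == -1:
                    labels[k] = cluster
                if not visited[k]:
                    visited[k] = True
                    if len(neighbors[k]) >= min_pts:
                        stack.append(k)
        cluster += 1
    return labels

# synthetic "flow vectors" (x, y, vx, vy) for two coherent crowds
rng = np.random.default_rng(0)
crowd_a = rng.normal([0.0, 0.0, 1.0, 0.0], 0.1, size=(30, 4))
crowd_b = rng.normal([10.0, 10.0, -1.0, 0.0], 0.1, size=(30, 4))
labels = dbscan(np.vstack([crowd_a, crowd_b]), eps=1.5, min_pts=4)
```

Including the velocity components in the feature vector is what separates two crowds moving in opposite directions even when they occupy nearby image regions.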

  1. Fast ground filtering for TLS data via Scanline Density Analysis

    Science.gov (United States)

    Che, Erzhuo; Olsen, Michael J.

    2017-07-01

    Terrestrial Laser Scanning (TLS) efficiently collects 3D information based on lidar (light detection and ranging) technology. TLS has been widely used in topographic mapping, engineering surveying, forestry, industrial facilities, cultural heritage, and so on. Ground filtering is a common procedure in lidar data processing, which separates the point cloud data into ground points and non-ground points. Effective ground filtering is helpful for subsequent procedures such as segmentation, classification, and modeling. Numerous ground filtering algorithms have been developed for Airborne Laser Scanning (ALS) data. However, many of these are error-prone in application to TLS data because of its different angle of view and highly variable resolution. Further, many ground filtering techniques are limited in application within challenging topography and experience difficulty coping with features such as short vegetation and steep slopes. Lastly, due to the large size of point cloud data, operations such as data traversing, multiple iterations, and neighbor searching significantly affect the computation efficiency. In order to overcome these challenges, we present an efficient ground filtering method for TLS data via a Scanline Density Analysis, which is very fast because it exploits the grid structure storing TLS data. The process first separates the ground candidates, density features, and unidentified points based on an analysis of point density within each scanline. Second, a region growing step using the scan pattern is performed to cluster the ground candidates and further refine the ground points (clusters). In the experiment, the effectiveness, parameter robustness, and efficiency of the proposed method are demonstrated with datasets collected from an urban scene and a natural scene, respectively.
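    The per-scanline density idea can be caricatured in a few lines: bin the points of one scanline, call points in dense bins ground candidates, and leave the rest unidentified. The binning by height and the threshold here are illustrative stand-ins, not the paper's actual criterion.

```python
import numpy as np

def dense_bin_mask(z, bin_edges, thresh):
    """True for points of one scanline falling in high-density bins."""
    counts, _ = np.histogram(z, bin_edges)
    idx = np.clip(np.digitize(z, bin_edges) - 1, 0, len(counts) - 1)
    return counts[idx] >= thresh

# synthetic scanline: 50 returns piled near the ground, 5 scattered above
z = np.concatenate([np.full(50, 0.1), np.linspace(1.0, 5.0, 5)])
ground_candidates = dense_bin_mask(z, np.arange(0.0, 6.0, 0.5), 10)
```

Because the test runs independently per scanline over data already stored in a grid, no global neighbor search is needed, which is the source of the speed the record claims.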

  2. Lidar point density analysis: implications for identifying water bodies

    Science.gov (United States)

    Worstell, Bruce B.; Poppenga, Sandra K.; Evans, Gayla A.; Prince, Sandra

    2014-01-01

    Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies in the resulting void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
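    The 5-meter-radius / 23-return criterion reported above translates directly into a moving-window test on a gridded count of returns. A brute-force sketch on a synthetic 1 m grid (the study worked on the raw point cloud; the gridding here is a simplification):

```python
import numpy as np

def flag_void_cells(counts, radius=5, threshold=23):
    """Flag cells whose circular 5 m neighbourhood holds fewer than 23
    returns, the criterion the study found sufficient for delineating
    void regions. counts: 2D array of return counts on a 1 m grid."""
    ny, nx = counts.shape
    yy, xx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disk = (yy ** 2 + xx ** 2) <= radius ** 2
    padded = np.pad(counts, radius)
    voids = np.zeros((ny, nx), dtype=bool)
    for i in range(ny):
        for j in range(nx):
            window = padded[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            voids[i, j] = window[disk].sum() < threshold
    return voids

# synthetic grid: one return per cell everywhere except a 10 x 10 "lake"
counts = np.ones((20, 20))
counts[5:15, 5:15] = 0.0
voids = flag_void_cells(counts)
```

Contiguous flagged cells then delineate candidate water bodies, to which the elevation-flattening steps described above would be applied.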

  3. Dimensional analysis and extended hydrodynamic theory applied to long-rod penetration of ceramics

    Directory of Open Access Journals (Sweden)

    J.D. Clayton

    2016-08-01

    Full Text Available Principles of dimensional analysis are applied in a new interpretation of penetration of ceramic targets subjected to hypervelocity impact. The analysis results in a power series representation – in terms of inverse velocity – of normalized depth of penetration that reduces to the hydrodynamic solution at high impact velocities. Specifically considered are test data from four literature sources involving penetration of confined thick ceramic targets by tungsten long-rod projectiles. The ceramics are AD-995 alumina, aluminum nitride, silicon carbide, and boron carbide. The test data can be accurately represented by the linear form of the power series, whereby the same value of a single fitting parameter applies remarkably well for all four ceramics. Comparison of the present model with others in the literature (e.g., Tate's theory) demonstrates a target resistance stress that depends on impact velocity, linearly in the limiting case. Comparison of the present analysis with recent research involving penetration of thin ceramic tiles at lower typical impact velocities confirms the importance of target properties related to fracture and shear strength at the Hugoniot Elastic Limit (HEL) only in the latter. In contrast, in the former (i.e., the hypervelocity and thick-target experiments), the current analysis demonstrates dominant dependence of penetration depth on target mass density alone. Such comparisons suggest transitions from microstructure-controlled to density-controlled penetration resistance with increasing impact velocity and ceramic target thickness.
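    Assuming the linear (first-order) form of the series takes the shape P/L = sqrt(rho_p/rho_t) * (1 - c/v), with the hydrodynamic ratio as the high-velocity limit and c the single fitting parameter, recovering c is an ordinary least-squares problem in inverse velocity. A sketch on synthetic stand-in data, not the literature values the paper fits:

```python
import numpy as np

rho_p, rho_t = 17600.0, 3900.0       # tungsten rod, alumina target (kg/m^3)
hydro = np.sqrt(rho_p / rho_t)       # hydrodynamic limit of P/L

# synthetic stand-ins for the test data, generated from the assumed
# linear form P/L = hydro * (1 - c/v) with c_true = 900 m/s
v = np.array([2500.0, 3000.0, 3500.0, 4000.0, 4500.0])   # impact speeds
c_true = 900.0
p_over_l = hydro * (1.0 - c_true / v)

# recover the single fitting parameter c by least squares in 1/v
A = (-hydro / v)[:, None]
c_fit, *_ = np.linalg.lstsq(A, p_over_l - hydro, rcond=None)
```

The striking empirical claim above is that one such c serves all four ceramics, which is what shifts the explanatory burden from microstructure to mass density at hypervelocity.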

  4. Resolvability of regional density structure and the road to direct density inversion - a principal-component approach to resolution analysis

    Science.gov (United States)

    Płonka, Agnieszka; Fichtner, Andreas

    2017-04-01

    Lateral density variations are the source of mass transport in the Earth at all scales, acting as drivers of convective motion. However, the density structure of the Earth remains largely unknown since classic seismic observables and gravity provide only weak constraints with strong trade-offs. Current density models are therefore often based on velocity scaling, making strong assumptions on the origin of structural heterogeneities, which may not necessarily be correct. Our goal is to assess if 3D density structure may be resolvable with emerging full-waveform inversion techniques. We have previously quantified the impact of regional-scale crustal density structure on seismic waveforms with the conclusion that reasonably sized density variations within the crust can leave a strong imprint on both travel times and amplitudes, and, while this can produce significant biases in velocity and Q estimates, the seismic waveform inversion for density may become feasible. In this study we perform principal component analyses of sensitivity kernels for P velocity, S velocity, and density. This is intended to establish the extent to which these kernels are linearly independent, i.e. the extent to which the different parameters may be constrained independently. We apply the method to data from 81 events around the Iberian Peninsula, registered in total by 492 stations. The objective is to find a principal kernel that maximizes the sensitivity to density, potentially allowing density to be resolved as independently as possible. We find that surface (mostly Rayleigh) waves have significant sensitivity to density, and that the trade-off with velocity is negligible. We also show the preliminary results of the inversion.

  5. Applied research and development of neutron activation analysis

    International Nuclear Information System (INIS)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Baek, Sung Ryel; Kim, Young Gi; Jung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun; Lim, Jong Myoung

    2003-05-01

    The aims of this project are to establish a quality control system for Neutron Activation Analysis (NAA), in response to increasing industrial demand for standard analytical methods, and to prepare and validate standard operating procedures for NAA through practical testing on different analytical items. The R and D implementations of the analytical quality system, using the neutron irradiation facility and gamma-ray measurement system and the automation of the NAA facility in the HANARO research reactor, are as follows: 1) establishment of an NAA quality control system for the maintenance of best measurement capability and the promotion of utilization of the HANARO research reactor; 2) improvement of analytical sensitivity for industrial applied technologies and establishment of certified standard procedures; 3) standardization and development of Prompt Gamma-ray Activation Analysis (PGAA) technology.

  6. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  7. Improving CMD Areal Density Analysis: Algorithms and Strategies

    Directory of Open Access Journals (Sweden)

    R. E. Wilson

    2014-06-01

    Full Text Available Essential ideas, successes, and difficulties of Areal Density Analysis (ADA) for color-magnitude diagrams (CMDs) of resolved stellar populations are examined, with explanation of various algorithms and strategies for optimal performance. A CMD-generation program computes theoretical datasets with simulated observational error and a solution program inverts the problem by the method of Differential Corrections (DC) so as to compute parameter values from observed magnitudes and colors, with standard error estimates and correlation coefficients. ADA promises not only impersonal results, but also significant saving of labor, especially where a given dataset is analyzed with several evolution models. Observational errors and multiple star systems, along with various single star characteristics and phenomena, are modeled directly via the Functional Statistics Algorithm (FSA). Unlike Monte Carlo, FSA is not dependent on a random number generator. Discussions include difficulties and overall requirements, such as need for fast evolutionary computation and realization of goals within machine memory limits. Degradation of results due to influence of pixelization on derivatives, Initial Mass Function (IMF) quantization, IMF steepness, low Areal Densities (A), and large variation in A are reduced or eliminated through a variety of schemes that are explained sufficiently for general application. The Levenberg-Marquardt and MMS algorithms for improvement of solution convergence are contained within the DC program. An example of convergence, which typically is very good, is shown in tabular form. A number of theoretical and practical solution issues are discussed, as are prospects for further development.
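    Differential Corrections is essentially a Gauss-Newton loop: linearize the model about the current parameters, solve for a correction in the least-squares sense, and repeat. A generic sketch with a numerical Jacobian and a synthetic toy model (the actual ADA programs fit CMD areal densities with Levenberg-Marquardt safeguards, not this stand-in exponential):

```python
import numpy as np

def differential_corrections(model, p_start, x, y_obs, n_iter=20, h=1e-6):
    """Differential corrections as a plain Gauss-Newton loop: linearize
    the model about the current parameters (forward-difference Jacobian),
    solve for the correction in a least-squares sense, repeat."""
    p = np.array(p_start, dtype=float)
    for _ in range(n_iter):
        r = y_obs - model(x, p)
        J = np.empty((len(x), len(p)))
        for k in range(len(p)):
            dp = p.copy()
            dp[k] += h
            J[:, k] = (model(x, dp) - model(x, p)) / h
        delta = np.linalg.lstsq(J, r, rcond=None)[0]
        p += delta
    return p

# toy stand-in model (NOT a CMD model): amplitude * exp(-rate * x)
model = lambda x, p: p[0] * np.exp(-p[1] * x)
x = np.linspace(0.0, 2.0, 30)
y = model(x, np.array([2.0, 1.5]))      # noiseless synthetic "observations"
p_fit = differential_corrections(model, [1.5, 1.2], x, y)
```

Levenberg-Marquardt, mentioned in the record, damps exactly this correction step when the linearization is poor far from the solution.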

  8. Improving CMD Areal Density Analysis: Algorithms and Strategies

    Science.gov (United States)

    Wilson, R. E.

    2014-06-01

Essential ideas, successes, and difficulties of Areal Density Analysis (ADA) for color-magnitude diagrams (CMDs) of resolved stellar populations are examined, with explanation of various algorithms and strategies for optimal performance. A CMD-generation program computes theoretical datasets with simulated observational error and a solution program inverts the problem by the method of Differential Corrections (DC) so as to compute parameter values from observed magnitudes and colors, with standard error estimates and correlation coefficients. ADA promises not only impersonal results, but also significant saving of labor, especially where a given dataset is analyzed with several evolution models. Observational errors and multiple star systems, along with various single star characteristics and phenomena, are modeled directly via the Functional Statistics Algorithm (FSA). Unlike Monte Carlo, FSA is not dependent on a random number generator. Discussions include difficulties and overall requirements, such as need for fast evolutionary computation and realization of goals within machine memory limits. Degradation of results due to influence of pixelization on derivatives, Initial Mass Function (IMF) quantization, IMF steepness, low Areal Densities (A), and large variation in A are reduced or eliminated through a variety of schemes that are explained sufficiently for general application. The Levenberg-Marquardt and MMS algorithms for improvement of solution convergence are contained within the DC program. An example of convergence, which typically is very good, is shown in tabular form. A number of theoretical and practical solution issues are discussed, as are prospects for further development.

  9. Diffusing wave spectroscopy applied to material analysis and process control

    International Nuclear Information System (INIS)

    Lloyd, Christopher James

    1997-01-01

    coefficient. The inherent instability of high density suspensions instigated high speed analysis techniques capable of monitoring suspensions that were undergoing rapid change as well as suggesting novel methods for the evaluation of the state of sample dispersion. (author)

  10. A mechanistic analysis of density dependence in algal population dynamics

    Directory of Open Access Journals (Sweden)

Adrian Borlestean

    2015-04-01

Full Text Available Population density regulation is a fundamental principle in ecology, but the specific process underlying functional expression of density dependence remains to be fully elucidated. One view contends that patterns of density dependence are largely fixed across a species irrespective of environmental conditions, whereas another is that the strength and expression of density dependence are fundamentally variable depending on the nature of exogenous or endogenous constraints acting on the population. We conducted a study investigating the expression of density dependence in Chlamydomonas spp. grown under a gradient from low to high nutrient density. We predicted that the relationship between per capita growth rate (pgr) and population density would vary from concave up to concave down as nutrient density became less limiting and populations experienced weaker density regulation. Contrary to prediction, we found that the relationship between pgr and density became increasingly concave-up as nutrient levels increased. We also found that variation in pgr increased, and pgr levels reached higher maxima, in nutrient-limited environments. Most likely, these results are attributable to population growth suppression in environments with high intraspecific competition due to limited nutrient resources. Our results suggest that density regulation is strongly variable depending on exogenous and endogenous processes acting on the population, implying that expression of density dependence depends extensively on local conditions. Additional experimental work should reveal the mechanisms influencing how the expression of density dependence varies across populations through space and time.
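
The concave-up versus concave-down shapes discussed here are commonly parameterized with a theta-logistic model, pgr = r_max (1 - (N/K)^theta): theta < 1 gives a concave-up pgr-density curve and theta > 1 a concave-down one. A sketch with illustrative parameters (not fitted to the Chlamydomonas data):

```python
import numpy as np

def pgr(N, r_max=1.0, K=100.0, theta=1.0):
    # Theta-logistic per-capita growth rate as a function of density N
    return r_max * (1.0 - (N / K) ** theta)

N = np.linspace(1.0, 99.0, 99)
# Second differences have the sign of the second derivative on a uniform grid
concave_up = np.diff(pgr(N, theta=0.5), 2)    # theta < 1: concave-up curve
concave_down = np.diff(pgr(N, theta=3.0), 2)  # theta > 1: concave-down curve
```

Fitting theta to observed pgr-density pairs at each nutrient level is one standard way to quantify how curvature shifts along the gradient.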

  11. Applying Authentic Data Analysis in Learning Earth Atmosphere

    Science.gov (United States)

    Johan, H.; Suhandi, A.; Samsudin, A.; Wulan, A. R.

    2017-09-01

The aim of this research was to develop earth science learning material, especially on the earth's atmosphere, supported by science research with authentic data analysis to enhance reasoning; various earth and space science phenomena require reasoning. This research used an experimental design with a one-group pretest-posttest. 23 pre-service physics teachers participated in this research. An essay test was conducted to obtain data on reasoning ability and was analyzed quantitatively. An observation sheet was used to capture phenomena during the learning process. The results showed that students' reasoning ability improved from unidentified and no reasoning to evidence-based reasoning and inductive/deductive rule-based reasoning. Authentic data were analyzed using the Grid Analysis Display System (GrADS). Visualization from GrADS helped students correlate the concepts and brought the real condition of nature into classroom activity. It also helped students reason about phenomena related to earth and space science concepts. It can be concluded that applying authentic data analysis in the learning process can help enhance students' reasoning. This study is expected to help lecturers bring results of geoscience research into the learning process and facilitate students' understanding of concepts.

  12. Utility of the pooling approach as applied to whole genome association scans with high-density Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Gray Joanna

    2010-11-01

Full Text Available Abstract Background We report an attempt to extend the previously successful approach of combining SNP (single nucleotide polymorphism) microarrays and DNA pooling (SNP-MaP) employing high-density microarrays. Whereas earlier studies employed a range of Affymetrix SNP microarrays comprising from 10 K to 500 K SNPs, this most recent investigation used the 6.0 chip which displays 906,600 SNP probes and 946,000 probes for the interrogation of CNVs (copy number variations). The genotyping assay using the Affymetrix SNP 6.0 array is highly demanding on sample quality due to the small feature size, low redundancy, and lack of mismatch probes. Findings In the first study published so far using this microarray on pooled DNA, we found that pooled cheek swab DNA could not accurately predict real allele frequencies of the samples that comprised the pools. In contrast, the allele frequency estimates using blood DNA pools were reasonable, although inferior compared to those obtained with previously employed Affymetrix microarrays. However, it might be possible to improve performance by developing improved analysis methods. Conclusions Despite the decreasing costs of genome-wide individual genotyping, the pooling approach may have applications in very large-scale case-control association studies. In such cases, our study suggests that high-quality DNA preparations and lower density platforms should be preferred.
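
The core estimate in pooling studies of this kind is a relative allele signal computed from the two allele-specific probe intensities. A first-order sketch with invented intensities (real SNP-MaP analyses additionally apply a correction factor derived from heterozygous reference samples):

```python
import numpy as np

# Hypothetical allele-specific probe intensities from three replicate pools
intensity_A = np.array([1200.0, 1150.0, 1320.0])
intensity_B = np.array([2400.0, 2300.0, 2500.0])

# Relative allele signal (RAS): a first-order estimate of the pooled
# A-allele frequency, averaged over replicate pools
ras = intensity_A / (intensity_A + intensity_B)
freq_A = float(ras.mean())
```

Comparing such pool-based estimates against allele frequencies from individually genotyped samples is how the accuracy of cheek-swab versus blood pools was judged.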

  13. Numerical analysis of energy density and particle density in high energy heavy-ion collisions

    International Nuclear Information System (INIS)

    Fu Yuanyong; Lu Zhongdao

    2004-01-01

Energy density and particle density in high energy heavy-ion collisions are calculated separately with an infinite-series expansion method and with Gauss-Laguerre quadrature formulas, and the results of the two methods are compared; the higher-order terms and linear terms in the series expansion are also compared. The results show that the Gauss-Laguerre formulas are a good method for calculations in high energy heavy-ion collisions. (author)
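
Gauss-Laguerre quadrature approximates integrals of the Boltzmann-weighted form that thermal-model densities reduce to: the integral of e^(-x) f(x) over [0, inf) becomes a weighted sum over fixed nodes. A quick sketch checking it against a known integral:

```python
import numpy as np

# Gauss-Laguerre nodes and weights: sum_i w_i f(x_i) ~ int_0^inf e^(-x) f(x) dx
nodes, weights = np.polynomial.laguerre.laggauss(20)

# Known test integral: int_0^inf x^2 e^(-x) dx = Gamma(3) = 2
approx = float(np.sum(weights * nodes ** 2))
```

With 20 nodes the rule is exact for polynomial integrands up to degree 39, so the check above holds to machine precision; series-expansion methods would instead sum analytic terms to a chosen order.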

  14. Applied Behavior Analysis is a Science and, Therefore, Progressive.

    Science.gov (United States)

    Leaf, Justin B; Leaf, Ronald; McEachin, John; Taubman, Mitchell; Ala'i-Rosales, Shahla; Ross, Robert K; Smith, Tristram; Weiss, Mary Jane

    2016-02-01

    Applied behavior analysis (ABA) is a science and, therefore, involves progressive approaches and outcomes. In this commentary we argue that the spirit and the method of science should be maintained in order to avoid reductionist procedures, stifled innovation, and rote, unresponsive protocols that become increasingly removed from meaningful progress for individuals diagnosed with autism spectrum disorder (ASD). We describe this approach as progressive. In a progressive approach to ABA, the therapist employs a structured yet flexible process, which is contingent upon and responsive to child progress. We will describe progressive ABA, contrast it to reductionist ABA, and provide rationales for both the substance and intent of ABA as a progressive scientific method for improving conditions of social relevance for individuals with ASD.

  15. Applied research and development of neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yong Sam; Moon, Jong Hwa; Kim, Sun Ha; Bak, Sung Ryel; Park, Yong Chul; Kim, Young Ki; Chung, Hwan Sung; Park, Kwang Won; Kang, Sang Hun

    2000-05-01

This report presents the results of research and development as follows: improvement of neutron irradiation facilities and counting systems, and development of an automation system and capsules for NAA in HANARO; improvement of analytical procedures and establishment of an analytical quality control and assurance system; and applied research and development in environment, industry and human health, and its standardization. For identification and standardization of analytical methods, environmental, biological and polymer samples were analyzed and the uncertainty of measurement was estimated. Data intercomparison and proficiency tests were also performed. Using airborne particulate matter chosen as an environmental indicator, trace elemental concentrations of samples collected at urban and rural sites were determined, and then statistical calculations and factor analysis were carried out to investigate emission sources. An international cooperation research project was carried out on the utilization of nuclear techniques.
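
The factor-analysis step for emission-source investigation starts from the eigenstructure of the element correlation matrix. A sketch on a fully synthetic dataset (the samples, elements, and two latent "sources" below are invented, and only the principal-factor extraction step is shown, not rotation):

```python
import numpy as np

# Hypothetical dataset: 50 filter samples x 4 elements, generated from two
# latent emission sources plus measurement noise
rng = np.random.default_rng(1)
sources = rng.random((50, 2))
loadings = np.array([[1.0, 0.1, 0.9, 0.0],
                     [0.0, 1.0, 0.1, 0.8]])
X = sources @ loadings + 0.05 * rng.normal(size=(50, 4))

# Eigenvalues of the element correlation matrix; Kaiser's criterion keeps
# factors with eigenvalue > 1 as candidate emission sources
corr = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]
n_sources = int((eigvals > 1.0).sum())
```

Elements that load on the same retained factor are then interpreted as tracers of a common emission source.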

  16. A comparative analysis of the density distributions and the structure ...

    Indian Academy of Sciences (India)

    28.3 MeV for some cluster models and various density distributions of the 9 Li nucleus. First, we have obtained five different density distributions of the 9 Li nucleus to generate real potentials with the help of double-folding model. For these densities, we have calculated the elastic scattering angular distributions. Secondly ...

  17. A strategy to apply quantitative epistasis analysis on developmental traits.

    Science.gov (United States)

    Labocha, Marta K; Yuan, Wang; Aleman-Meza, Boanerges; Zhong, Weiwei

    2017-05-15

Genetic interactions are key to understanding complex traits and evolution. Epistasis analysis is an effective method to map genetic interactions. Large-scale quantitative epistasis analysis has been well established for single cells. However, there is a substantial lack of such studies in multicellular organisms and their complex phenotypes such as development. Here we present a method to extend quantitative epistasis analysis to developmental traits. In the nematode Caenorhabditis elegans, we applied RNA interference on mutants to inactivate two genes, used an imaging system to quantitatively measure phenotypes, and developed a set of statistical methods to extract genetic interactions from phenotypic measurement. Using two different C. elegans developmental phenotypes, body length and sex ratio, as examples, we showed that this method could accommodate various metazoan phenotypes with performances comparable to those methods in single cell growth studies. Compared with qualitative observations, this method of quantitative epistasis enabled detection of new interactions involving subtle phenotypes. For example, several sex-ratio genes were found to interact with brc-1 and brd-1, the orthologs of the human breast cancer genes BRCA1 and BARD1, respectively. We confirmed the brc-1 interactions with the following genes in DNA damage response: C34F6.1, him-3 (ortholog of HORMAD1, HORMAD2), sdc-1, and set-2 (ortholog of SETD1A, SETD1B, KMT2C, KMT2D), validating the effectiveness of our method in detecting genetic interactions. We developed a reliable, high-throughput method for quantitative epistasis analysis of developmental phenotypes.
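
Quantitative epistasis scores of this kind are commonly built on a multiplicative null model: with phenotypes normalized so wild type equals 1, no interaction predicts the double perturbation equals the product of the singles. A sketch with invented phenotype values (not the paper's measurements):

```python
# Normalized phenotype measurements, wild type = 1.0 (invented numbers)
w_x = 0.8    # relative body length after perturbing gene X alone
w_y = 0.7    # relative body length after perturbing gene Y alone
w_xy = 0.4   # relative body length after the double perturbation

# Multiplicative null model: no interaction predicts w_xy == w_x * w_y;
# a negative deviation indicates an aggravating (synthetic) interaction
epsilon = w_xy - w_x * w_y
```

In practice each w would be a mean over many imaged animals, with the statistical tests deciding whether epsilon differs significantly from zero.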

  18. Sensitivity analysis for reactivity and power density investigations in nuclear reactors

    International Nuclear Information System (INIS)

    Naguib, K.; Morcos, H.N.; Sallam, O.H.; Abdelsamei, SH.

    1993-01-01

Sensitivity analysis theory based on variational functional approaches was applied to evaluate the sensitivities of eigenvalues and power densities due to variation of the absorber concentration in the reactor core. The practical usefulness of this method is illustrated by considering test cases. The results indicate that this method is as accurate as direct calculations, yet it provides an economical means of saving computational time since it requires fewer calculations. The SARC-1/2 codes have been written in Fortran-77 to solve this problem.
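
The accuracy comparison described, variational sensitivity versus direct recalculation, can be illustrated on a one-group infinite-medium toy model where k = nu*Sigma_f / Sigma_a and the absorber adds N*sigma to Sigma_a. All numbers below are invented for illustration, not from the paper:

```python
# One-group infinite-medium sketch (illustrative values only)
nu_sigma_f = 0.10      # production cross section, 1/cm
sigma_a_fuel = 0.06    # fuel absorption, 1/cm
sigma_abs = 0.5        # absorber cross section per unit concentration
N0 = 0.02              # nominal absorber concentration

def k_inf(N):
    return nu_sigma_f / (sigma_a_fuel + N * sigma_abs)

# Analytic sensitivity dk/dN (what a variational/adjoint method returns in
# one evaluation) vs. a direct central-difference recalculation
sigma_a_total = sigma_a_fuel + N0 * sigma_abs
analytic = -nu_sigma_f * sigma_abs / sigma_a_total ** 2

h = 1e-6
central = (k_inf(N0 + h) - k_inf(N0 - h)) / (2 * h)
```

The two agree to high precision, while the adjoint route avoids re-running the flux solution for every perturbed concentration.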

  19. Exploring charge density analysis in crystals at high pressure: data collection, data analysis and advanced modelling.

    Science.gov (United States)

    Casati, Nicola; Genoni, Alessandro; Meyer, Benjamin; Krawczuk, Anna; Macchi, Piero

    2017-08-01

    The possibility to determine electron-density distribution in crystals has been an enormous breakthrough, stimulated by a favourable combination of equipment for X-ray and neutron diffraction at low temperature, by the development of simplified, though accurate, electron-density models refined from the experimental data and by the progress in charge density analysis often in combination with theoretical work. Many years after the first successful charge density determination and analysis, scientists face new challenges, for example: (i) determination of the finer details of the electron-density distribution in the atomic cores, (ii) simultaneous refinement of electron charge and spin density or (iii) measuring crystals under perturbation. In this context, the possibility of obtaining experimental charge density at high pressure has recently been demonstrated [Casati et al. (2016). Nat. Commun. 7, 10901]. This paper reports on the necessities and pitfalls of this new challenge, focusing on the species syn-1,6:8,13-biscarbonyl[14]annulene. The experimental requirements, the expected data quality and data corrections are discussed in detail, including warnings about possible shortcomings. At the same time, new modelling techniques are proposed, which could enable specific information to be extracted, from the limited and less accurate observations, like the degree of localization of double bonds, which is fundamental to the scientific case under examination.

  20. The Evidence-Based Practice of Applied Behavior Analysis.

    Science.gov (United States)

    Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie

    2014-05-01

    Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

  1. DAMQT: A package for the analysis of electron density in molecules

    Science.gov (United States)

    López, Rafael; Rico, Jaime Fernández; Ramírez, Guillermo; Ema, Ignacio; Zorrilla, David

    2009-09-01

DAMQT is a package for the analysis of the electron density in molecules and the fast computation of the density, density deformations, electrostatic potential and field, and Hellmann-Feynman forces. The method is based on the partition of the electron density into atomic fragments by means of a least deformation criterion. Each atomic fragment of the density is expanded in regular spherical harmonics times radial factors, which are piecewise represented in terms of analytical functions. This representation is used for the fast evaluation of the electrostatic potential and field generated by the electron density and nuclei, as well as for the computation of the Hellmann-Feynman forces on the nuclei. An analysis of the atomic and molecular deformations of the density can also be carried out, yielding a picture that connects with several concepts of empirical structural chemistry. Program summary: Program title: DAMQT1.0. Catalogue identifier: AEDL_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDL_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GPLv3. No. of lines in distributed program, including test data, etc.: 278 356. No. of bytes in distributed program, including test data, etc.: 31 065 317. Distribution format: tar.gz. Programming language: Fortran90 and C++. Computer: Any. Operating system: Linux, Windows (XP, Vista). RAM: 190 Mbytes. Classification: 16.1. External routines: Trolltech's Qt (4.3 or higher) (http://www.qtsoftware.com/products), OpenGL (1.1 or higher) (http://www.opengl.org/), GLUT 3.7 (http://www.opengl.org/resources/libraries/glut/). Nature of problem: Analysis of the molecular electron density and density deformations, including fast evaluation of electrostatic potential, electric field and Hellmann-Feynman forces on nuclei. Solution method: The method of Deformed Atoms in Molecules, reported elsewhere [1], is used for partitioning the molecular electron density

  2. Adjoint-Based Sensitivity and Uncertainty Analysis for Density and Composition: A User’s Guide

    International Nuclear Information System (INIS)

    Favorite, Jeffrey A.; Perkó, Zoltán; Kiedrowski, Brian C.; Perfetti, Christopher M.

    2017-01-01

The evaluation of uncertainties is essential for criticality safety. Our paper deals with material density and composition uncertainties and provides guidance on how traditional first-order sensitivity methods can be used to predict their effects. Unlike problems that deal with traditional cross-section uncertainty analysis, material density and composition-related problems are often characterized by constraints that do not allow arbitrary and independent variations of the input parameters. Their proper handling requires constrained sensitivities that take into account the interdependence of the inputs. This paper discusses how traditional unconstrained isotopic density sensitivities can be calculated using the adjoint sensitivity capabilities of the popular Monte Carlo codes MCNP6 and SCALE 6.2, and we also present the equations to be used when forward and adjoint flux distributions are available. Subsequently, we show how the constrained sensitivities can be computed using the unconstrained (adjoint-based) sensitivities as well as by applying central differences directly. We present three distinct procedures for enforcing the constraint on the input variables, each leading to different constrained sensitivities. As a guide, the sensitivity and uncertainty formulas for several frequently encountered specific cases involving densities and compositions are given. One analytic k∞ example highlights the relationship between constrained sensitivity formulas and central differences, and a more realistic numerical problem reveals similarities among the computer codes used and differences among the three methods of enforcing the constraint.
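
The "central differences with a constraint" idea can be sketched on a toy problem: perturb one composition fraction and renormalize the others so the fractions still sum to one, then difference the responses. The linear response function and all numbers below are invented stand-ins for a transport-code k-effective, and the renormalization rule shown is only one of the several constraint-enforcement choices the paper compares:

```python
import numpy as np

def R(f):
    # Toy linear response standing in for k_eff(composition)
    worth = np.array([1.0, 0.3, -0.5])
    return float(worth @ f)

f0 = np.array([0.5, 0.3, 0.2])   # composition fractions, sum to 1

def perturb(f, i, delta):
    # Raise fraction i by delta, then rescale the other fractions so the
    # composition still sums to one (constrained perturbation)
    g = f.copy()
    g[i] += delta
    others = [j for j in range(len(f)) if j != i]
    g[others] = g[others] * (1.0 - g[i]) / f[others].sum()
    return g

delta = 1e-6
constrained = (R(perturb(f0, 0, delta)) - R(perturb(f0, 0, -delta))) / (2 * delta)
```

For this linear toy response the constrained derivative differs from the unconstrained worth of 1.0 precisely because the compensating decrease of the other fractions feeds back into R.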

  3. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Science.gov (United States)

    von Hippel, Ted; von Hippel, Courtney

    2015-01-01

We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.
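
A back-of-envelope illustration of why low funding rates empty the applicant pool (our own simplification, not the survey's model): treating each year's proposal as an independent trial with success probability p, the chance of an unbroken drought is geometric.

```python
def prob_unfunded(p, n):
    # Probability of n consecutive unfunded years, one proposal per year,
    # assuming independent trials with per-proposal success rate p
    return (1.0 - p) ** n

p = 0.20                       # roughly the funding rate cited in the survey
five_year_drought = prob_unfunded(p, 5)   # about a one-in-three chance
```

Even at a 20% rate, roughly a third of investigators submitting one proposal per year would see five straight rejections, consistent with the survey's concern about sustained droughts driving researchers away.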

  4. To apply or not to apply: a survey analysis of grant writing costs and benefits.

    Directory of Open Access Journals (Sweden)

    Ted von Hippel

Full Text Available We surveyed 113 astronomers and 82 psychologists active in applying for federally funded research on their grant-writing history between January 2009 and November 2012. We collected demographic data, effort levels, success rates, and perceived non-financial benefits from writing grant proposals. We find that the average proposal takes 116 PI hours and 55 CI hours to write, although time spent writing was not related to whether the grant was funded. Effort did translate into success, however, as academics who wrote more grants received more funding. Participants indicated modest non-monetary benefits from grant writing, with psychologists reporting a somewhat greater benefit overall than astronomers. These perceptions of non-financial benefits were unrelated to how many grants investigators applied for, the number of grants they received, or the amount of time they devoted to writing their proposals. We also explored the number of years an investigator can afford to apply unsuccessfully for research grants and our analyses suggest that funding rates below approximately 20%, commensurate with current NIH and NSF funding, are likely to drive at least half of the active researchers away from federally funded research. We conclude with recommendations and suggestions for individual investigators and for department heads.

  5. Independent Component Analysis applied to Ground-based observations

    Science.gov (United States)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet's temperature profile. However, measurements of hot-Jupiter transits must achieve a high level of accuracy in the flux to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to separate systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, it has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61" Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
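
The blind-source-separation idea behind ICA can be shown on a minimal two-source toy problem (entirely synthetic, not the authors' pipeline): whiten the mixed light curves, then rotate to maximize non-Gaussianity, so that a transit-like dip separates from a periodic trend without any reference star.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 2000)
transit = ((t > 0.45) & (t < 0.55)).astype(float)   # box-shaped transit dip
telluric = np.sin(2 * np.pi * 3 * t)                # periodic atmospheric trend
S = np.vstack([transit - transit.mean(), telluric - telluric.mean()])

A = np.array([[1.0, 0.6], [0.4, 1.0]])              # unknown mixing matrix
X = A @ S                                           # the observed mixtures

# Whitening: decorrelate and rescale the observations
d, E = np.linalg.eigh(np.cov(X))
Z = np.diag(d ** -0.5) @ E.T @ X

# One rotation remains undetermined; pick the angle whose first component
# is farthest from Gaussian (largest |excess kurtosis|), the core ICA idea
def non_gaussianity(y):
    y = (y - y.mean()) / y.std()
    return abs((y ** 4).mean() - 3.0)

angles = np.linspace(0.0, np.pi, 360)
best = max(angles, key=lambda a: non_gaussianity(np.cos(a) * Z[0] + np.sin(a) * Z[1]))
Y = np.array([[np.cos(best), np.sin(best)],
              [-np.sin(best), np.cos(best)]]) @ Z   # recovered sources

recovery = abs(np.corrcoef(Y[0], S[0])[0, 1])       # close to 1 if separated
```

Production analyses use more robust estimators (e.g. FastICA with negentropy contrast), but the whiten-then-rotate structure is the same.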

  6. Applying importance-performance analysis to patient safety culture.

    Science.gov (United States)

    Lee, Yii-Ching; Wu, Hsin-Hung; Hsieh, Wan-Lin; Weng, Shao-Jen; Hsieh, Liang-Po; Huang, Chih-Hsuan

    2015-01-01

Sexton et al.'s (2006) safety attitudes questionnaire (SAQ) has been widely used to assess staff attitudes towards patient safety in healthcare organizations. However, to date there have been few studies that discuss the perceptions of patient safety from both hospital staff and upper management. The purpose of this paper is to improve and to develop better strategies regarding patient safety in healthcare organizations. The Chinese version of the SAQ, based on the Taiwan Joint Commission on Hospital Accreditation, is used to evaluate the perceptions of hospital staff. The current study applies the importance-performance analysis technique to identify the major strengths and weaknesses of the safety culture. The results show that teamwork climate, safety climate, job satisfaction, stress recognition and working conditions are major strengths and should be maintained in order to provide a better patient safety culture. On the contrary, perceptions of management and hospital handoffs and transitions are important weaknesses and should be improved immediately. Research limitations/implications: the research is restricted in generalizability, since the assessment of patient safety culture covered only physicians and registered nurses. It would be interesting to further evaluate other staff's (e.g. technicians, pharmacists and others) opinions regarding patient safety culture in the hospital. Few studies have clearly evaluated the perceptions of healthcare organization management regarding patient safety culture. Healthcare managers are enabled to take more effective actions to improve the level of patient safety by investigating key characteristics (either strengths or weaknesses) that healthcare organizations should focus on.
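
Importance-performance analysis places each dimension in a quadrant defined by the grand means of the importance and performance axes. A sketch of that classification step with invented scores (not the study's survey data):

```python
import numpy as np

# Hypothetical mean importance and performance scores per SAQ dimension
dims = ["teamwork", "safety climate", "job satisfaction", "stress recognition",
        "working conditions", "perceptions of management", "handoffs & transitions"]
importance = np.array([4.5, 4.6, 4.0, 3.2, 3.9, 4.4, 4.3])
performance = np.array([4.2, 4.1, 3.8, 3.6, 3.7, 2.9, 3.0])

# Quadrant cutoffs at the grand means of the two axes
imp_cut, perf_cut = importance.mean(), performance.mean()
quadrant = {}
for name, imp, perf in zip(dims, importance, performance):
    if imp >= imp_cut and perf >= perf_cut:
        quadrant[name] = "keep up the good work"
    elif imp >= imp_cut:
        quadrant[name] = "concentrate here"      # important but underperforming
    elif perf >= perf_cut:
        quadrant[name] = "possible overkill"
    else:
        quadrant[name] = "low priority"
```

With these illustrative numbers, the underperforming high-importance dimensions fall into the "concentrate here" quadrant, mirroring how the study flags management perceptions and handoffs as weaknesses.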

  7. SUCCESS CONCEPT ANALYSIS APPLIED TO THE INFORMATION TECHNOLOGY PROJECT MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Cassio C. Montenegro Duarte

    2012-05-01

Full Text Available This study evaluates the concept of success in project management as applicable to the IT universe, starting from the classical theory associated with the techniques of project management. It applies theoretical analysis to the context of information technology in enterprises, as well as the classic literature of traditional project management, focusing on its application in business information technology. From the literature review developed in the first part of the study, four propositions were prepared, which formed the basis for field research with three large companies that develop Information Technology projects. The methodology employed a multiple case study design. Empirical evidence suggests that the concept of success found in the classical project management literature fits the management environment of IT projects. It also showed that it is possible to create a standard model of IT projects in order to replicate it in future derivative projects, which depends on the learning acquired at the end of a long and continuous process and on sponsorship by senior management, which ultimately results in its merger into the company culture.

  8. Applied genre analysis: a multi-perspective model

    Directory of Open Access Journals (Sweden)

    Vijay K Bhatia

    2002-04-01

    Full Text Available Genre analysis can be viewed from two different perspectives: it may be seen as a reflection of the complex realities of the world of institutionalised communication, or it may be seen as a pedagogically effective and convenient tool for the design of language teaching programmes, often situated within simulated contexts of classroom activities. This paper makes an attempt to understand and resolve the tension between these two seemingly contentious perspectives to answer the question: "Is generic description a reflection of reality, or a convenient fiction invented by applied linguists?". The paper also discusses issues related to the nature and use of linguistic description in a genre-based educational enterprise, claiming that instead of using generic descriptions as models for linguistic reproduction of conventional forms to respond to recurring social contexts, as is often the case in many communication based curriculum contexts, they can be used as analytical resource to understand and manipulate complex inter-generic and multicultural realisations of professional discourse, which will enable learners to use generic knowledge to respond to novel social contexts and also to create new forms of discourse to achieve pragmatic success as well as other powerful human agendas.

  9. High-density polyethylene dosimetry by transvinylene FTIR analysis

    DEFF Research Database (Denmark)

    McLaughlin, W.L.; Silverman, J.; Al-Sheikhly, M.

    1999-01-01

The formation of transvinylene unsaturation, -CH=CH-, due to free-radical or cationic-initiated dehydrogenation by irradiation, is a basic reaction in polyethylene and is useful for dosimetry at high absorbed doses. The radiation-enhanced infrared absorption having a maximum at nu = 965 cm(-1) (lambda = 10.36 mu m) is stable in air and can be measured by Fourier-transform infrared (FTIR) spectrophotometry. The quantitative analysis is a useful means of product end-point dosimetry for radiation processing with gamma rays and electrons, where polyethylene is a component of the processed product. The useful dose range of 0.053 cm thick high-density polyethylene film (rho = 0.961 g cm(-3); melt index = 0.8 dg min(-1)), for irradiations by (60)Co gamma radiation and 2.0 and 0.4 MeV electron beams in deaerated atmosphere (N2 gas), is about 50-10(3) kGy for FTIR transvinylene analysis.
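
End-point dosimetry of this kind amounts to a calibration curve relating net absorbance at 965 cm(-1) to absorbed dose, inverted to read out an unknown dose. A sketch with invented calibration numbers (not the paper's data), assuming an approximately linear response over the working range:

```python
import numpy as np

# Hypothetical calibration: net absorbance at 965 1/cm vs. absorbed dose
dose_kGy = np.array([50.0, 200.0, 400.0, 600.0, 800.0, 1000.0])
absorbance = np.array([0.021, 0.080, 0.162, 0.238, 0.321, 0.399])

# Least-squares linear calibration, then invert it to read out dose
slope, intercept = np.polyfit(dose_kGy, absorbance, 1)

def dose_from_absorbance(a):
    return (a - intercept) / slope

estimate = dose_from_absorbance(0.200)   # dose implied by a measured A = 0.200
```

A real calibration would also carry an uncertainty budget (film thickness, baseline correction, fit residuals) alongside the point estimate.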

  10. Rotavirus - Global research density equalizing mapping and gender analysis.

    Science.gov (United States)

    Köster, Corinna; Klingelhöfer, Doris; Groneberg, David A; Schwarzer, Mario

    2016-01-02

    Rotaviruses are the leading cause of dehydration and severe diarrheal disease in infants and young children worldwide. The increasing number of related publications makes it challenging to identify the relevant scientific output. Scientometric analyses are therefore helpful to evaluate both the quantity and the quality of worldwide research activity on Rotavirus. Up to now, no in-depth global scientometric analysis of Rotavirus publications has been carried out. This study used scientometric tools and the method of density equalizing mapping to visualize differences in the worldwide research effort on Rotavirus. The aim of the study was to compare scientific output geographically and over time using an in-depth data analysis and New Quality and Quantity Indices in Science (NewQIS) tools. Furthermore, a gender analysis was part of the data interpretation. We retrieved all Rotavirus-related articles published from 1900 to 2013 from the Web of Science using a defined search term. These items were analyzed regarding quantitative and qualitative aspects and visualized with the help of bibliometric methods and the technique of density equalizing mapping to show the differences in worldwide research efforts. This work aimed to extend the current NewQIS platform. The 5906 Rotavirus-associated articles were published in 138 countries from 1900 to 2013. The USA authored 2037 articles, equal to 34.5% of all published items, followed by Japan with 576 articles and the United Kingdom, the most productive European country, with 495 articles. Furthermore, the USA established the most collaborations with other countries and was found to be at the center of an international collaborative network. We performed a gender analysis of authors per country (the threshold was set at a publishing output of more than 100 articles by more than 50 authors whose names could be

  11. Geostatistical analysis of GPS trajectory data: Space-time densities

    NARCIS (Netherlands)

    Hengl, T.; van Loon, E.E.; Shamoun-Baranes, J.; Bouten, W.; Zhang, J.; Goodchild, M.F.

    2008-01-01

    Creation of density maps and estimation of home range is problematic for observations of animal movement at irregular intervals. We propose a technique to estimate space-time densities by separately modeling animal movement paths and velocities, both as continuous fields. First the length of

  12. Bond energy decomposition analysis for subsystem density functional theory

    NARCIS (Netherlands)

    Beyhan, S.M.; Gotz, A.W.; Visscher, L.

    2013-01-01

    We employed an explicit expression for the dispersion (D) energy in conjunction with Kohn-Sham (KS) density functional theory and frozen-density embedding (FDE) to calculate interaction energies between DNA base pairs and a selected set of amino acid pairs in the hydrophobic core of a small protein

  13. Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?

    Science.gov (United States)

    Strathie, Ailsa; Walker, Guy H

    2016-03-01

    A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
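As a rough illustration of the four metrics named in this record (not the authors' tooling), the sketch below computes number of links, network density, diameter, and a degree-based sociometric status for a toy event-transition network; the event names and transitions are invented.

```python
from collections import defaultdict
from itertools import combinations

def link_metrics(transitions):
    """Compute four link-analysis metrics from (from_event, to_event) pairs."""
    nodes = sorted({v for e in transitions for v in e})
    n = len(nodes)
    edges = set(transitions)               # distinct directed links
    links = len(edges)
    density = links / (n * (n - 1))        # directed network density
    # Sociometric status: combined in/out degree normalised by (n - 1).
    deg = defaultdict(int)
    for a, b in edges:
        deg[a] += 1
        deg[b] += 1
    status = {v: deg[v] / (n - 1) for v in nodes}
    # Diameter: longest shortest path on the undirected graph (BFS).
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    def dist(s, t):
        frontier, seen, d = {s}, {s}, 0
        while frontier:
            if t in frontier:
                return d
            frontier = {w for v in frontier for w in adj[v]} - seen
            seen |= frontier
            d += 1
        return float("inf")
    diameter = max(dist(s, t) for s, t in combinations(nodes, 2))
    return {"links": links, "density": density,
            "diameter": diameter, "sociometric_status": status}

# Toy journey: brake -> coast -> power -> brake, plus power -> coast.
metrics = link_metrics([("brake", "coast"), ("coast", "power"),
                        ("power", "brake"), ("power", "coast")])
```

Differences in these metrics between drivers or journeys are what the study treats as candidate leading indicators.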

  14. THE TURBULENCE SPECTRUM OF MOLECULAR CLOUDS IN THE GALACTIC RING SURVEY: A DENSITY-DEPENDENT PRINCIPAL COMPONENT ANALYSIS CALIBRATION

    International Nuclear Information System (INIS)

    Roman-Duval, Julia; Jackson, James; Federrath, Christoph; Klessen, Ralf S.; Brunt, Christopher; Heyer, Mark

    2011-01-01

    Turbulence plays a major role in the formation and evolution of molecular clouds. Observationally, turbulent velocities are convolved with the density of an observed region. To correct for this convolution, we investigate the relation between the turbulence spectrum of model clouds and the statistics of their synthetic observations obtained from principal component analysis (PCA). We apply PCA to spectral maps generated from simulated density and velocity fields, obtained from hydrodynamic simulations of supersonic turbulence, and from fractional Brownian motion (fBm) fields with varying velocity and density spectra and density dispersion. We examine the dependence of the slope of the PCA pseudo-structure function, α_PCA, on intermittency, on the turbulence velocity (β_v) and density (β_n) spectral indexes, and on density dispersion. We find that PCA is insensitive to β_n and to the log-density dispersion σ_s, provided σ_s ≤ 2. For σ_s > 2, α_PCA increases with σ_s due to the intermittent sampling of the velocity field by the density field. The PCA calibration also depends on intermittency. We derive a PCA calibration based on fBm structures with σ_s ≤ 2 and apply it to 367 (13)CO spectral maps of molecular clouds in the Galactic Ring Survey. The average slope of the PCA structure function, <α_PCA> = 0.62 ± 0.2, is consistent with the hydrodynamic simulations and leads to a turbulence velocity exponent of <β_v> = 2.06 ± 0.6 for a non-intermittent, low density dispersion flow. Accounting for intermittency and density dispersion, the coincidence between the PCA slope of the GRS clouds and the hydrodynamic simulations suggests β_v ≅ 1.9, consistent with both Burgers and compressible intermittent turbulence.

  15. Numerical analysis of wet separation of particles by density differences

    Science.gov (United States)

    Markauskas, D.; Kruggel-Emden, H.

    2017-07-01

    Wet particle separation is widely used in mineral processing and plastic recycling to separate mixtures of particulate materials into further usable fractions due to density differences. This work presents efforts aiming to numerically analyze the wet separation of particles with different densities. In the current study the discrete element method (DEM) is used for the solid phase while the smoothed particle hydrodynamics (SPH) is used for modeling of the liquid phase. The two phases are coupled by the use of a volume averaging technique. In the current study, simulations of spherical particle separation were performed. In these simulations, a set of generated particles with two different densities is dropped into a rectangular container filled with liquid. The results of simulations with two different mixtures of particles demonstrated how separation depends on the densities of particles.
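The full DEM-SPH coupling described in this record is beyond a snippet, but the physical principle it simulates, that a particle rises or sinks in the liquid according to the sign of its density difference with the fluid, can be sketched with a Stokes-regime settling estimate. The material values below are illustrative, not taken from the paper.

```python
def stokes_terminal_velocity(rho_p, rho_f, d, mu, g=9.81):
    """Stokes-regime terminal velocity in m/s for a sphere of diameter d (m),
    particle density rho_p and fluid density rho_f (kg/m^3), fluid dynamic
    viscosity mu (Pa*s). Positive means the particle sinks, negative floats."""
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

# Example: 2 mm PET (~1380 kg/m^3) and PP (~905 kg/m^3) spheres in water.
water = dict(rho_f=1000.0, mu=1.0e-3)
v_pet = stokes_terminal_velocity(1380.0, d=2e-3, **water)  # sinks
v_pp  = stokes_terminal_velocity(905.0,  d=2e-3, **water)  # floats
```

The DEM-SPH simulations resolve this same density-driven segregation, but with full particle-particle contacts and two-way momentum exchange with the liquid.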

  16. New trends in applied harmonic analysis sparse representations, compressed sensing, and multifractal analysis

    CERN Document Server

    Cabrelli, Carlos; Jaffard, Stephane; Molter, Ursula

    2016-01-01

    This volume is a selection of written notes corresponding to courses taught at the CIMPA School: "New Trends in Applied Harmonic Analysis: Sparse Representations, Compressed Sensing and Multifractal Analysis". New interactions between harmonic analysis and signal and image processing have seen striking development in the last 10 years, and several technological deadlocks have been solved through the resolution of deep theoretical problems in harmonic analysis. New Trends in Applied Harmonic Analysis focuses on two particularly active areas that are representative of such advances: multifractal analysis, and sparse representation and compressed sensing. The contributions are written by leaders in these areas and cover both theoretical aspects and applications. This work should prove useful not only to PhD students and postdocs in mathematics and signal and image processing, but also to researchers working in related topics.

  17. Current status of neutron activation analysis and applied nuclear chemistry

    International Nuclear Information System (INIS)

    Lyon, W.S.

    1990-01-01

    A review of recent scientometric studies of citations and publication data shows the present state of NAA and applied nuclear chemistry as compared to other analytical techniques. (author) 9 refs.; 7 tabs

  18. Analysis of spatio-temporal structures of the thermospheric density

    Science.gov (United States)

    Schmidt, Michael; Bloßfeld, Mathis; Erdogan, Eren; Meraner, Andrea

    2017-04-01

    The Earth's upper atmosphere, comprising the thermosphere and the ionosphere, constitutes a dynamically coupled non-linear system in terms of chemical and physical processes. The system also interacts with the magnetosphere as well as the lower atmosphere. Several stand-alone or coupled models have been developed to reveal the behaviour of atmospheric target parameters and their interactions, such as the neutral and charged particle density of the thermosphere, from different perspectives; these are based, for instance, on purely physical or (semi-)empirical models, as well as data-assimilative approaches combining available models with new sets of observations. The thermospheric neutral density, for instance, plays a crucial role within the equation of motion of Earth-orbiting objects at low altitudes, since the drag force is one of the largest non-gravitational perturbations and a function of the thermospheric integral density. Moreover, density estimation is critical for re-entry operations, manoeuvre planning, collision avoidance, precise orbit determination (POD) and satellite lifetime planning. Several empirical thermospheric models have been used in satellite orbit determination, e.g. the JB2008 or the DTM2013 model. They all include different gas species and provide thermospheric temperature and density as functions of the instantaneous position in altitude, latitude and longitude, as well as the local solar time, solar and geomagnetic storm indices and the harmonics of the year's fraction. In this contribution we study the global spatial and temporal behaviour of the thermospheric density provided by the JB2008 and DTM2013 models. Based on these insights we set up a concept for an empirical model of the thermospheric density. In a future step, appropriate model parameters will be estimated from highly precise satellite laser ranging observations. This work is related to the DFG project INSIGHT (Interactions of Low

  19. Comparative analysis of human and bovine teeth: radiographic density

    Directory of Open Access Journals (Sweden)

    Jefferson Luis Oshiro Tanaka

    2008-12-01

    Full Text Available Since bovine teeth have been used as substitutes for human teeth in in vitro dental studies, the aim of this study was to compare the radiographic density of bovine teeth with that of human teeth to evaluate their usability for radiographic studies. Thirty bovine and twenty human teeth were cut transversally into 1 millimeter-thick slices. The slices were X-rayed using a digital radiographic system and an intraoral X-ray machine at 65 kVp and 7 mA. The exposure time (0.08 s) and the target-sensor distance (40 cm) were standardized for all the radiographs. The radiographic densities of the enamel, coronal dentin and radicular dentin of each slice were obtained separately using the "histogram" tool of Adobe Photoshop 7.0 software. The mean radiographic densities of the enamel, coronal dentin and radicular dentin were calculated as the arithmetic mean over the slices of each tooth. One-way ANOVA demonstrated statistically significant differences between the densities of bovine and human enamel and coronal dentin (p < 0.05), but not radicular dentin (p > 0.05). Based on the results, the authors concluded that: (a) the radiographic density of bovine enamel is significantly higher than that of human enamel; (b) the radiodensity of bovine coronal dentin is statistically lower than the radiodensity of human coronal dentin; bovine radicular dentin is also less radiodense than human radicular dentin, although this difference was not statistically significant; (c) bovine teeth should be used with care in radiographic in vitro studies.
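The analysis in this record reduces to two steps: a mean gray value per tissue region (the "histogram" tool) and a one-way ANOVA across groups. A self-contained sketch, with made-up pixel values rather than the study's data, might look like:

```python
def mean_gray(region):
    """Histogram-style mean gray value of a region (list of pixel values)."""
    return sum(region) / len(region)

def one_way_anova_F(*groups):
    """F statistic for a one-way ANOVA across groups of density values."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    ss_between = sum(len(g) * (mean_gray(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean_gray(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

# Hypothetical mean gray values per slice for two tissue groups.
bovine_enamel = [182, 185, 181, 184]
human_enamel  = [170, 172, 169, 171]
F = one_way_anova_F(bovine_enamel, human_enamel)
```

A large F (compared against the F distribution with k-1 and N-k degrees of freedom) corresponds to the p < 0.05 differences reported above.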

  20. A longitudinal analysis of alcohol outlet density and domestic violence.

    Science.gov (United States)

    Livingston, Michael

    2011-05-01

    A small number of studies have identified a positive relationship between alcohol outlet density and domestic violence. These studies have all been based on cross-sectional data and have been limited to the assessment of ecological correlations between outlet density and domestic violence rates. This study provides the first longitudinal examination of this relationship. Cross-sectional time-series analyses using aggregated data from small areas were conducted. The relationships between alcohol outlet density and domestic violence were assessed over time using a fixed-effects model. Controls for the spatial autocorrelation of the data were included in the model. The study uses data for 186 postcodes within the metropolitan area of Melbourne, Australia for the years 1996 to 2005. Alcohol outlet density measures for three different types of outlets (hotel/pub, packaged liquor, on-premise) were derived from liquor licensing records, and domestic violence rates were calculated from police-recorded crime data based on the victim's postcode. Alcohol outlet density was significantly associated with rates of domestic violence over time. All three licence categories were positively associated with domestic violence rates, with small effects for general (pub) and on-premise licences and a large effect for packaged liquor licences. In Melbourne, the density of liquor licences is positively associated with rates of domestic violence over time. The effects were particularly large for packaged liquor outlets, suggesting a need for licensing policies that pay more attention to off-premise alcohol availability. © 2011 The Authors, Addiction © 2011 Society for the Study of Addiction.
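The fixed-effects (within) estimator used in panel designs like this one can be sketched in a few lines: demean outcome and predictor within each unit, then fit the pooled slope. The postcodes and numbers below are invented, and the sketch omits the study's spatial-autocorrelation controls.

```python
def within_slope(data):
    """Fixed-effects (within) estimator for y ~ x with unit-specific intercepts.
    `data` maps unit id -> list of (x, y) observations over time."""
    xs, ys = [], []
    for obs in data.values():
        mx = sum(x for x, _ in obs) / len(obs)   # unit mean of x
        my = sum(y for _, y in obs) / len(obs)   # unit mean of y
        xs += [x - mx for x, _ in obs]           # demeaned predictor
        ys += [y - my for _, y in obs]           # demeaned outcome
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

# Two hypothetical postcodes: outlet density (x) and violence rate (y) by year.
panel = {
    "3001": [(10, 50), (12, 54), (14, 58)],
    "3002": [(5, 20), (6, 22), (7, 24)],
}
beta = within_slope(panel)
```

Because the demeaning absorbs each postcode's fixed level, the slope reflects within-area change over time rather than cross-sectional differences between areas.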

  1. Empirical modeling and data analysis for engineers and applied scientists

    CERN Document Server

    Pardo, Scott A

    2016-01-01

    This textbook teaches advanced undergraduate and first-year graduate students in Engineering and Applied Sciences to gather and analyze empirical observations (data) in order to aid in making design decisions. While science is about discovery, the primary paradigm of engineering and "applied science" is design. Scientists are in the discovery business and want, in general, to understand the natural world rather than to alter it. In contrast, engineers and applied scientists design products, processes, and solutions to problems. That said, statistics, as a discipline, is mostly oriented toward the discovery paradigm. Young engineers come out of their degree programs having taken courses such as "Statistics for Engineers and Scientists" without any clear idea as to how they can use statistical methods to help them design products or processes. Many seem to think that statistics is only useful for demonstrating that a device or process actually does what it was designed to do. Statistics courses emphasize creati...

  2. Genetic and Dynamic Analysis of Murine Peak Bone Density

    Science.gov (United States)

    1997-10-01

    years of age (1-3). As bone density decreases with age, the risk of osteoporotic fractures increases, especially when the density falls below the...fracture threshold. This relationship suggests that the risk of osteoporotic fracture can be defined in terms of two characteristics - the peak bone...two inbred strains of mice, C57BL/6J (B6) and C3H/HeJ (C3H), with highly significant differences in peak BMD in vertebrae (12%), tibia (20%), and

  3. RELATIONSHIPS BETWEEN ANATOMICAL FEATURES AND INTRA-RING WOOD DENSITY PROFILES IN Gmelina arborea APPLYING X-RAY DENSITOMETRY

    Directory of Open Access Journals (Sweden)

    Mario Tomazelo-Filho

    2007-12-01

    Full Text Available Four annual tree-rings (2 of juvenile wood and 2 of mature wood) were sampled from fast-growth plantations of Gmelina arborea in two climatic conditions (dry and wet tropical) in Costa Rica. Each annual tree-ring was divided into equal parts in a radial direction. For each part, X-ray density as well as vessel percentage, fiber length and width, cell wall thickness and lumen diameter were measured. Wood density and profile patterns of cell dimension demonstrated inconsistency between juvenile and mature wood and between climatic conditions. The Pearson correlation matrix showed that intra-ring wood density was positively correlated with cell wall thickness and negatively correlated with vessel percentage, fiber length, lumen diameter and width. The forward stepwise regressions determined that: (i) intra-ring wood density variation could be predicted from 76 to 96% from anatomical variation; (ii) cell wall thickness was the most important anatomical feature producing intra-ring wood density variation; and (iii) vessel percentage, fiber length, lumen diameter and width were the second most statistically significant characteristics for intra-ring wood density, however with low participation in the determination coefficient of the stepwise regressions.

  4. Measurement uncertainty analysis techniques applied to PV performance measurements

    International Nuclear Information System (INIS)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.
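The core arithmetic of such an analysis is combining standard uncertainty components by root-sum-square and quoting a coverage interval around the result. A minimal sketch, with invented component values for a PV module power measurement:

```python
import math

def combined_uncertainty(components):
    """Root-sum-square combination of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

def coverage_interval(value, components, k=2):
    """Interval expected to contain the true value (k=2 ~ 95% coverage)."""
    U = k * combined_uncertainty(components)
    return value - U, value + U

# Measured module power 100 W; hypothetical standard uncertainties (in W)
# from irradiance, temperature, and data-acquisition contributions.
lo, hi = coverage_interval(100.0, [1.5, 0.8, 0.5])
```

The pre-test analysis uses the same computation with estimated components to check whether the planned setup can meet the stated accuracy objective.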

  5. Geometric Error Analysis in Applied Calculus Problem Solving

    Science.gov (United States)

    Usman, Ahmed Ibrahim

    2017-01-01

    The paper investigates geometric errors students made as they tried to use their basic geometric knowledge in the solution of the Applied Calculus Optimization Problem (ACOP). Inaccuracies related to the drawing of geometric diagrams (visualization skills) and those associated with the application of basic differentiation concepts into ACOP…

  6. An applied general equilibrium model for Dutch agribusiness policy analysis

    NARCIS (Netherlands)

    Peerlings, J.

    1993-01-01

    The purpose of this thesis was to develop a basic static applied general equilibrium (AGE) model to analyse the effects of agricultural policy changes on Dutch agribusiness. In particular the effects on inter-industry transactions, factor demand, income, and trade are of

  7. Automated Proposition Density Analysis for Discourse in Aphasia

    Science.gov (United States)

    Fromm, Davida; Greenhouse, Joel; Hou, Kaiyue; Russell, G. Austin; Cai, Xizhen; Forbes, Margaret; Holland, Audrey; MacWhinney, Brian

    2016-01-01

    Purpose: This study evaluates how proposition density can differentiate between persons with aphasia (PWA) and individuals in a control group, as well as among subtypes of aphasia, on the basis of procedural discourse and personal narratives collected from large samples of participants. Method: Participants were 195 PWA and 168 individuals in a…
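Proposition density is conventionally estimated as propositions per 100 words, counting verbs, adjectives, adverbs, prepositions, and conjunctions in POS-tagged text. The sketch below is a naive illustration of that idea (not the authors' analyzer); tags follow the Penn Treebank set and the sample sentence is invented.

```python
def proposition_density(tagged_tokens):
    """Estimate propositions per 100 words from (word, POS tag) pairs."""
    PROP = {"VB", "VBD", "VBG", "VBN", "VBP", "VBZ",   # verbs
            "JJ", "JJR", "JJS",                        # adjectives
            "RB", "RBR", "RBS",                        # adverbs
            "IN", "CC"}                                # prepositions, conjunctions
    words = [t for t in tagged_tokens if t[1] not in {".", ","}]
    props = sum(1 for _, tag in words if tag in PROP)
    return 100.0 * props / len(words)

sample = [("the", "DT"), ("old", "JJ"), ("dog", "NN"),
          ("slept", "VBD"), ("quietly", "RB"), (".", ".")]
pd = proposition_density(sample)
```

Real implementations add rules for auxiliaries, negation, and multi-word expressions, which this sketch ignores.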

  8. Analysis of bone mineral density of human bones for strength ...

    Indian Academy of Sciences (India)

    Bone mineral density (BMD) is a medical term normally referring to the amount of mineral matter per square centimetre of bone. Twenty-five patients (18 female and 7 male, with a mean age of 71.3 years) undergoing both lumbar spine DXA scans and computed tomography imaging were evaluated to determine if HU ...

  9. Chemical bonding and charge density distribution analysis of ...

    Indian Academy of Sciences (India)

    The mid bond electron density values revealed the enhancement of covalent nature between titanium and oxygen ions and predominant ionic nature between barium and oxygen ions. Average grain sizes were estimated for the undoped and doped samples. SEM investigations showed the existence of smaller grains with ...

  10. A comparative analysis of the density distributions and the structure ...

    Indian Academy of Sciences (India)

    2017-02-16

    Feb 16, 2017 ... (OM) calculations, the density distributions and the internal structure parametrization of the 9Li nucleus. In §3, we give theoretical results of the calculations. Section 4 is devoted to our summary and conclusions. 2. Theory. 2.1 The optical model (OM). The optical potential can be written in the following form:.

  11. Chemical bonding and charge density distribution analysis of ...

    Indian Academy of Sciences (India)

    Ceramics; charge density; X-ray diffraction; bonding; microstructure. 1. Introduction. Ferroelectric perovskite materials are .... on Powder Diffraction Standards (JCPDS) database. (PDF# 05-0626). Existence of well-defined and ..... Li W, Xu Z, Chu R, Fu P and Hao J 2010 J. Alloy. Compd. 499 255. 9. Glinchuk M D, Bykov I P, ...

  12. Analysis of bone mineral density of human bones for strength ...

    Indian Academy of Sciences (India)

    indirect indicator of osteoporosis and fracture risk. This medical bone density is not the true physical “density” of the bone, which would be computed as mass per volume. Dual-energy X-ray absorptiometry (DXA, previously DEXA), a means of measuring BMD, is the most widely used and most thoroughly studied bone ...

  13. Neutronic analysis of a high power density hybrid reactor using ...

    Indian Academy of Sciences (India)

    in hexagonal geometry with ten rows of pitch length 1·25 cm in the radial direction. In the investigated blanket, selection of the potential structural materials for a high power density reactor is conducted on the basis of the following considerations. Among the candidate structural materials for fusion reactors, tungsten (W) has ...

  14. A density model based on the Modified Quasichemical Model and applied to the (NaCl + KCl + ZnCl2) liquid

    International Nuclear Information System (INIS)

    Ouzilleau, Philippe; Robelin, Christian; Chartrand, Patrice

    2012-01-01

    Highlights: ► A model for the density of multicomponent inorganic liquids. ► The density model is based on the Modified Quasichemical Model. ► Application to the (NaCl + KCl + ZnCl2) ternary liquid. ► A Kohler–Toop-like asymmetric interpolation method was used. - Abstract: A theoretical model for the density of multicomponent inorganic liquids based on the Modified Quasichemical Model has been presented previously. By introducing into the Gibbs free energy of the liquid phase temperature-dependent molar volume expressions for the pure components and pressure-dependent excess parameters for the binary (and sometimes higher-order) interactions, it is possible to reproduce, and eventually predict, the molar volume and the density of the multicomponent liquid phase using standard interpolation methods. In the present article, this density model is applied to the (NaCl + KCl + ZnCl2) ternary liquid and a Kohler–Toop-like asymmetric interpolation method is used. All available density data for the (NaCl + KCl + ZnCl2) liquid were collected and critically evaluated, and optimized pressure-dependent model parameters have been found. This new volumetric model can be used with Gibbs free energy minimization software to calculate the molar volume and the density of (NaCl + KCl + ZnCl2) ternary melts.
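To show the skeleton of such a volumetric model, the sketch below computes melt density from mole-fraction-weighted molar volumes under *ideal* mixing only; the paper's model adds temperature- and pressure-dependent excess terms via the Modified Quasichemical Model, which are omitted here. The molar volumes are rough illustrative values, not the optimized parameters of the paper.

```python
def melt_density(x, molar_mass, molar_volume):
    """Density (g/cm^3) of a molten salt mixture under ideal molar-volume
    mixing: rho = (sum x_i M_i) / (sum x_i V_i). A simplification -- the
    full model adds excess-volume contributions for binary interactions."""
    V = sum(xi * v for xi, v in zip(x, molar_volume))   # cm^3/mol
    m = sum(xi * M for xi, M in zip(x, molar_mass))     # g/mol
    return m / V

# NaCl, KCl, ZnCl2: molar masses (g/mol) and rough liquid molar volumes
# (cm^3/mol) near 800 C -- illustrative values only.
rho = melt_density(x=(0.4, 0.3, 0.3),
                   molar_mass=(58.44, 74.55, 136.29),
                   molar_volume=(37.6, 48.8, 53.7))
```

The Kohler–Toop interpolation enters when the binary excess terms are extended to the ternary composition space, a step deliberately left out of this sketch.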

  15. Analysis of the Nonlinear Density Wave Two-Phase Instability in a Steam Generator of 600MWe Liquid Metal Reactor

    International Nuclear Information System (INIS)

    Choi, Seok Ki; Kim, Seong O

    2011-01-01

    A 600 MWe demonstration reactor being developed at KAERI employs a once-through helically coiled steam generator. The helically coiled steam generator is compact and efficient for heat transfer; however, it may suffer from two-phase instability. It is well known that the density wave instability is the main source of instability among the various types occurring in a helically coiled S/G in an LMR. In the present study, a simple method for analysis of the density wave two-phase instability in a liquid metal reactor S/G is proposed, and the method is applied to the analysis of density wave instability in the S/G of a 600 MWe liquid metal reactor.

  16. How Has Applied Behavior Analysis and Behavior Therapy Changed?: An Historical Analysis of Journals

    Science.gov (United States)

    O'Donohue, William; Fryling, Mitch

    2007-01-01

    Applied behavior analysis and behavior therapy are now nearly a half century old. It is interesting to ask if and how these disciplines have changed over time, particularly regarding some of their key internal controversies (e.g., role of cognitions). We examined the first five years and the 2000-2004 five year period of the "Journal of Applied…

  17. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  18. Measurement uncertainty analysis techniques applied to PV performance measurements

    Energy Technology Data Exchange (ETDEWEB)

    Wells, C.

    1992-10-01

    The purpose of this presentation is to provide a brief introduction to measurement uncertainty analysis, outline how it is done, and illustrate uncertainty analysis with examples drawn from the PV field, with particular emphasis toward its use in PV performance measurements. The uncertainty information we know and state concerning a PV performance measurement or a module test result determines, to a significant extent, the value and quality of that result. What is measurement uncertainty analysis? It is an outgrowth of what has commonly been called error analysis. But uncertainty analysis, a more recent development, gives greater insight into measurement processes and tests, experiments, or calibration results. Uncertainty analysis gives us an estimate of the interval about a measured value or an experiment's final result within which we believe the true value of that quantity will lie. Why should we take the time to perform an uncertainty analysis? A rigorous measurement uncertainty analysis: increases the credibility and value of research results; allows comparisons of results from different labs; helps improve experiment design and identifies where changes are needed to achieve stated objectives (through use of the pre-test analysis); plays a significant role in validating measurements and experimental results, and in demonstrating (through the post-test analysis) that valid data have been acquired; reduces the risk of making erroneous decisions; demonstrates that quality assurance and quality control measures have been accomplished; and defines valid data as data having known and documented paths of: origin, including theory; measurements; traceability to measurement standards; computations; and uncertainty analysis of results.

  19. Applying FEATHERS for Travel Demand Analysis: Model Considerations

    Directory of Open Access Journals (Sweden)

    Qiong Bao

    2018-01-01

    Full Text Available Activity-based models of travel demand have received considerable attention in transportation planning and forecasting over the last few decades. FEATHERS (The Forecasting Evolutionary Activity-Travel of Households and their Environmental Repercussions), developed by the Transportation Research Institute of Hasselt University, Belgium, is a micro-simulation framework developed to facilitate the implementation of activity-based models for transport demand forecasting. In this paper, we focus on several model considerations when applying this framework. First, the way to apply FEATHERS on a more disaggregated geographical level is investigated, with the purpose of obtaining more detailed travel demand information. Next, to reduce the computation time when applying FEATHERS on a more detailed geographical level, an iteration approach is proposed to identify the minimum size of the study area needed. In addition, the effect of stochastic errors inherently included in the FEATHERS framework is investigated, and the concept of confidence intervals is applied to determine the minimum number of model runs needed to minimize this effect. In the application, the FEATHERS framework is used to investigate the potential impact of light rail initiatives on travel demand on a local network in Flanders, Belgium. In doing so, all the aforementioned model considerations are taken into account. The results indicate that by integrating a light rail network into the current public transport network, there would be a relatively positive impact on public transport-related trips, but a relatively negative impact on non-motorized-mode trips in this area. However, no significant change is found for car-related trips.

  20. Time-dependent reduced density matrix functional theory applied to laser-driven, correlated two-electron dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Brics, Martins; Kapoor, Varun; Bauer, Dieter [Institut fuer Physik, Universitaet Rostock, 18051 Rostock (Germany)

    2013-07-01

    Time-dependent density functional theory (TDDFT) with known and practicable exchange-correlation potentials does not capture highly correlated electron dynamics such as single-photon double ionization, autoionization, or nonsequential ionization. Time-dependent reduced density matrix functional theory (TDRDMFT) may remedy these problems. The key ingredients in TDRDMFT are the natural orbitals (NOs), i.e., the eigenfunctions of the one-body reduced density matrix (1-RDM), and the occupation numbers (OCs), i.e., the respective eigenvalues. The two-body reduced density matrix (2-RDM) is then expanded in NOs, and equations of motion for the NOs can be derived. If the expansion coefficients of the 2-RDM were known exactly, the problem at hand would be solved. In practice, approximations have to be made. We study the prospects of TDRDMFT following a top-down approach. We solve the exact two-electron time-dependent Schroedinger equation for a model helium atom in intense laser fields in order to study highly correlated phenomena such as the population of autoionizing states or single-photon double ionization. From the exact wave function we calculate the exact NOs, OCs, the exact expansion coefficients of the 2-RDM, and the exact potentials in the equations of motion. In that way we can identify how many NOs and which level of approximations are necessary to capture such phenomena.

  1. Analysis of flame surface density measurements in turbulent premixed combustion

    Energy Technology Data Exchange (ETDEWEB)

    Halter, Fabien [Institut PRISME, Universite d' Orleans, 45072 Orleans (France); Chauveau, Christian; Goekalp, Iskender [Institut de Combustion, Aerothermique, Reactivite et Environnement, Centre National de la Recherche Scientifique, 45071 Orleans (France); Veynante, Denis [Laboratoire E.M2.C, Centre National de la Recherche Scientifique, Ecole Centrale Paris, 92295 Chatenay-Malabry (France)

    2009-03-15

    In premixed turbulent combustion, reaction rates can be estimated from the flame surface density. This parameter, which measures the mean flame surface area available per unit volume, may be obtained from algebraic expressions or by solving a transport equation. In this study, detailed measurements were performed on a Bunsen-type burner fed with methane/air mixtures in order to determine the local flame surface density experimentally. This burner, located in a high-pressure combustion chamber, allows investigation of turbulent premixed flames under various flow, mixture, and pressure conditions. In the present work, the equivalence ratio was varied from 0.6 to 0.8 and pressure from 0.1 to 0.9 MPa. Flame front visualizations by Mie scattering laser tomography are used to obtain experimental data on the instantaneous flame front dynamics. The exact equation given by Pope is used to obtain flame surface density maps for different flame conditions. Some assumptions are made in order to access three-dimensional information from our two-dimensional experiments. Two different methodologies are proposed and tested in terms of global mass balance (what enters compared to what is burned). The detailed experimental flame surface data provided for the first time in this work should progressively allow improvement of turbulent premixed flame modeling approaches. (author)
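
    The basic quantity, flame surface length per unit area in a 2-D cut, can be illustrated with a crude transition-counting sketch on a binarized (fresh/burnt) tomography image. This is only a stand-in to fix ideas; the paper's methodology (Pope's exact equation and the 3-D corrections) is far more involved, and the field below is synthetic:

```python
def flame_surface_density(binary, dx):
    """Crude 2-D flame surface density: interface length per unit area,
    counting fresh/burnt transitions between neighbouring cells of a
    binarized image with cell size dx (in metres)."""
    ny, nx = len(binary), len(binary[0])
    edges = 0
    for i in range(ny):
        for j in range(nx):
            if j + 1 < nx and binary[i][j] != binary[i][j + 1]:
                edges += 1
            if i + 1 < ny and binary[i][j] != binary[i + 1][j]:
                edges += 1
    area = nx * ny * dx * dx
    return edges * dx / area  # metres of front per square metre, i.e. 1/m

# synthetic 4x4 field: left half burnt (1), right half fresh (0), 1 mm cells
field = [[1, 1, 0, 0]] * 4
sigma = flame_surface_density(field, dx=0.001)  # 250 1/m
```

    A planar front through a 4 mm square gives 4 mm of front over 16 mm² of image, i.e. 250 m⁻¹, which the sketch reproduces.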

  2. Applying Qualitative Hazard Analysis to Support Quantitative Safety Analysis for Proposed Reduced Wake Separation Conops

    Science.gov (United States)

    Shortle, John F.; Allocco, Michael

    2005-01-01

    This paper describes a scenario-driven hazard analysis process to identify, eliminate, and control safety-related risks. Within this process, we develop selection criteria to determine the applicability of engineering modeling to hypothesized hazard scenarios. This provides a basis for evaluating and prioritizing the scenarios as candidates for further quantitative analysis. We have applied this methodology to proposed concepts of operations for reduced wake separation for closely spaced parallel runways. For arrivals, the process identified 43 core hazard scenarios. Of these, we classified 12 as appropriate for further quantitative modeling, 24 that should be mitigated through controls, recommendations, and/or procedures (that is, scenarios not appropriate for quantitative modeling), and 7 that have the lowest priority for further analysis.

  3. Applied risk analysis to the future Brazilian electricity generation matrix

    Energy Technology Data Exchange (ETDEWEB)

    Maues, Jair; Fernandez, Eloi; Correa, Antonio

    2010-09-15

    This study compares energy conversion systems for the generation of electrical power, with an emphasis on the Brazilian energy matrix. The financial model applied in this comparison is based on the Portfolio Theory developed by Harry Markowitz. The risk-return ratio of the electrical generation mix predicted in the National Energy Plan - 2030, published in 2006 by the Brazilian Energy Research Office, is evaluated. The increase of non-traditional renewable energy in this expected generating mix, specifically residues of sugar cane plantations and wind energy, reduces not only the risk but also the average cost of the kilowatt-hour generated.
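
    The Markowitz-style computation behind such a comparison reduces to a weighted mean and a quadratic form over a covariance matrix. A minimal sketch, where the two-source mix, the costs, and the covariance matrix are all hypothetical and not taken from the study:

```python
def portfolio_stats(weights, costs, cov):
    """Markowitz-style expected cost and risk (standard deviation) of a
    generation mix: mean = w.c, variance = w' C w."""
    mean = sum(w * c for w, c in zip(weights, costs))
    var = sum(wi * wj * cov[i][j]
              for i, wi in enumerate(weights)
              for j, wj in enumerate(weights))
    return mean, var ** 0.5

# hypothetical two-source mix (hydro 70%, wind 30%); costs in $/MWh and a
# cost covariance matrix chosen purely for illustration
mean_cost, risk = portfolio_stats([0.7, 0.3], [60.0, 80.0],
                                  [[16.0, 2.0], [2.0, 25.0]])
```

    Because the covariance term can be small or negative, adding a partly uncorrelated source can lower portfolio risk even when its standalone cost variance is higher, which is the mechanism behind the paper's finding for wind and sugar-cane residues.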

  4. Signed directed social network analysis applied to group conflict

    DEFF Research Database (Denmark)

    Zheng, Quan; Skillicorn, David; Walther, Olivier

    2015-01-01

    Real-world social networks contain relationships of multiple different types, but this richness is often ignored in graph-theoretic modelling. We show how two recently developed spectral embedding techniques, for directed graphs (relationships are asymmetric) and for signed graphs (relationships...... are both positive and negative), can be combined. This combination is particularly appropriate for intelligence, terrorism, and law enforcement applications. We illustrate by applying the novel embedding technique to datasets describing conflict in North-West Africa, and show how unusual interactions can...

  5. Applied network security monitoring collection, detection, and analysis

    CERN Document Server

    Sanders, Chris

    2013-01-01

    Applied Network Security Monitoring is the essential guide to becoming an NSM analyst from the ground up. This book takes a fundamental approach to NSM, complete with dozens of real-world examples that teach you the key concepts of NSM. Network security monitoring is based on the principle that prevention eventually fails. In the current threat landscape, no matter how much you try, motivated attackers will eventually find their way into your network. At that point, it is your ability to detect and respond to that intrusion that can be the difference between a small incident and a major di

  6. Applying real options analysis to assess cleaner energy development strategies

    International Nuclear Information System (INIS)

    Cheng, Ching-Tsung; Lo, Shang-Lien; Lin, Tyrone T.

    2011-01-01

    The energy industry, which accounts for the largest portion of CO2 emissions, is facing the issue of compliance with national clean energy policy. The methodology for evaluating the energy mix policy is crucial because of the lead time embedded in power generation facility investments and the uncertainty of future electricity demand. In this paper, a modified binomial model based on sequential compound options, which accounts for the lead time and the uncertainty as a whole, is established, and a numerical example evaluating the optional strategies and the strategic value of the cleaner energy policy is also presented. It is found that the optimal decision at some nodes in the binomial tree is path dependent, which differs from the standard sequential compound option model with a lead time or time lag concept. The proposed modified binomial sequential compound real options model can be generalized and extensively applied to solve general decision problems that deal with the long lead times of many government policies as well as capital-intensive investments. - Highlights: → Introducing a flexible strategic management approach for government policy making. → Developing a modified binomial real options model based on sequential compound options. → Proposing an innovative model for managing long-term policy with lead time. → Applying the model to evaluate options across various scenarios of cleaner energy strategies.
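
    The backward-induction mechanics of a binomial real options valuation can be sketched as follows. This is a plain single (non-compound) deferral option on a recombining tree, not the paper's modified sequential compound model with lead time; all parameter values are illustrative:

```python
def binomial_option_value(v0, cost, up, down, p, rate, steps):
    """Value of the option to invest `cost` after `steps` periods in a
    project whose value v0 moves by factors `up`/`down` on a recombining
    binomial tree, with risk-neutral probability p and per-period rate."""
    disc = 1.0 / (1.0 + rate)
    # terminal payoffs: invest only if project value exceeds the cost
    values = [max(v0 * up**k * down**(steps - k) - cost, 0.0)
              for k in range(steps + 1)]
    # backward induction to the root
    for _ in range(steps):
        values = [disc * (p * values[k + 1] + (1 - p) * values[k])
                  for k in range(len(values) - 1)]
    return values[0]

# illustrative numbers: project worth 100, investment cost 100,
# 20% up/down moves, p = 0.5, zero discount rate, two periods
option_value = binomial_option_value(100, 100, 1.2, 0.8, 0.5, 0.0, 2)
```

    The sequential compound extension in the paper chains such valuations, using the option value at one stage as the underlying of the previous stage, which is where the path dependence the authors report can arise.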

  7. Linear and nonlinear analysis of density wave instability phenomena

    International Nuclear Information System (INIS)

    Ambrosini, Walter

    1999-01-01

    In this paper the mechanism of density-wave oscillations in a boiling channel with uniform and constant heat flux is analysed by linear and nonlinear analytical tools. A model developed on the basis of a semi-implicit numerical discretization of governing partial differential equations is used to provide information on the transient distribution of relevant variables along the channel during instabilities. Furthermore, a lumped parameter model and a distributed parameter model developed in previous activities are also adopted for independent confirmation of the observed trends. The obtained results are finally put in relation with the picture of the phenomenon proposed in classical descriptions. (author)

  8. Using genomic DNA-based probe-selection to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species

    Directory of Open Access Journals (Sweden)

    Townsend Henrik J

    2005-11-01

    Full Text Available Abstract High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. Thus, we have developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA on the basis of the perfect-match (PM) probe signal were then selected for subsequent B. oleracea transcriptome analysis using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus; P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed >68% of the available PM probes from the analysis but retained >96% of available A. thaliana probe-sets. Ninety-nine of these genes were then identified as significantly regulated under P stress in B. oleracea, including the homologues of P stress responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity thresholds up to 500 for probe-selection increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold. Our open-source software to create probe mask files is freely available http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal
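
    The probe-selection idea (keep only PM probes whose genomic-DNA signal clears a threshold, then keep probe-sets with enough surviving probes) can be sketched as below. The data structures, probe-set names, and signal values are illustrative, not the actual .cel-file format the authors' parser reads:

```python
def build_probe_mask(gdna_signal, threshold=400, min_probes=1):
    """Select PM probes whose genomic-DNA hybridisation signal reaches the
    threshold; retain a probe-set only if enough probes survive.
    gdna_signal maps probe-set name -> list of PM probe signals."""
    mask = {}
    for probe_set, signals in gdna_signal.items():
        kept = [i for i, s in enumerate(signals) if s >= threshold]
        if len(kept) >= min_probes:
            mask[probe_set] = kept
    return mask

# toy data: two hypothetical probe-sets with four PM probes each
signals = {"At1g01010": [520, 130, 610, 90],
           "At1g01020": [50, 80, 120, 30]}
mask = build_probe_mask(signals, threshold=400)
# only At1g01010 survives, with probes 0 and 2 retained
```

    Raising the threshold removes more individual probes (here most of them) while probe-sets survive as long as at least one good probe remains, which matches the >68% probe loss versus >96% probe-set retention reported above.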

  9. Condition Monitoring of a Process Filter Applying Wireless Vibration Analysis

    Directory of Open Access Journals (Sweden)

    Pekka KOSKELA

    2011-05-01

    Full Text Available This paper presents a novel wireless vibration-based method for monitoring the degree of feed filter clogging. In process industry, these filters are applied to prevent impurities entering the process. During operation, the filters gradually become clogged, decreasing the feed flow and, in the worst case, preventing it. The cleaning of the filter should therefore be carried out predictively in order to avoid equipment damage and unnecessary process downtime. The degree of clogging is estimated by first calculating the time domain indices from low frequency accelerometer samples and then taking the median of the processed values. Nine different statistical quantities are compared based on the estimation accuracy and criteria for operating in resource-constrained environments with particular focus on energy efficiency. The initial results show that the method is able to detect the degree of clogging, and the approach may be applicable to filter clogging monitoring.
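
    The estimation pipeline described above (time-domain indices per sample window, then the median of the processed values) can be sketched as follows. The paper compares nine statistical quantities; this sketch uses RMS as one plausible choice, and all signal values are synthetic:

```python
import math
import statistics

def time_domain_indices(samples):
    """A few common time-domain indices of a vibration window."""
    n = len(samples)
    rms = math.sqrt(sum(x * x for x in samples) / n)
    peak = max(abs(x) for x in samples)
    return {"rms": rms, "peak": peak, "crest": peak / rms}

def clogging_estimate(windows):
    """Median of per-window RMS values, as a robust indicator of the degree
    of clogging (median chosen for outlier resistance, as in the paper)."""
    return statistics.median(time_domain_indices(w)["rms"] for w in windows)

# three synthetic accelerometer windows
estimate = clogging_estimate([[1, -1], [2, -2], [3, -3]])
```

    Computing only RMS, peak, and crest factor per window keeps the per-sample cost low, which matters in the resource-constrained wireless setting the paper targets.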

  10. Performance analysis of numeric solutions applied to biokinetics of radionuclides

    International Nuclear Information System (INIS)

    Mingatos, Danielle dos Santos; Bevilacqua, Joyce da Silva

    2013-01-01

    Biokinetic models for radionuclides applied to dosimetry problems are constantly reviewed by the ICRP. The radionuclide trajectory can be represented by compartmental models, assuming constant transfer rates between compartments. A better understanding of physiological and biochemical phenomena improves the comprehension of radionuclide behavior in the human body, and, in general, more complex compartmental models are proposed, increasing the difficulty of obtaining the analytical solution for the system of first-order differential equations. Even with constant transfer rates, numerical solutions must be carefully implemented because of the almost singular character of the matrix of coefficients. In this work we compare numerical methods with different strategies for the ICRP-78 models for Thorium-228 and Uranium-234. The impact of uncertainty in the parameters of the equations is also estimated for local and global truncation errors. (author)
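
    The structure of such a comparison can be illustrated on the smallest possible compartmental model, a two-compartment chain with constant transfer rates, where the analytic (Bateman) solution is known and can be checked against a numerical scheme. The rates below are arbitrary, not ICRP-78 values:

```python
import math

def bateman(t, a01, a12, q0=1.0):
    """Analytic content of compartment 2 in the chain 1 -> 2 -> out with
    constant transfer rates a01 (1 to 2) and a12 (2 out), initial content q0
    in compartment 1. Valid for a01 != a12."""
    return q0 * a01 * (math.exp(-a01 * t) - math.exp(-a12 * t)) / (a12 - a01)

def euler_chain(t_end, a01, a12, dt=1e-4, q0=1.0):
    """Explicit-Euler integration of the same chain, for comparison with
    the analytic solution (step size governs the truncation error)."""
    q1, q2 = q0, 0.0
    for _ in range(int(round(t_end / dt))):
        dq1 = -a01 * q1
        dq2 = a01 * q1 - a12 * q2
        q1 += dt * dq1
        q2 += dt * dq2
    return q2
```

    Widely separated rate constants make the system stiff and force explicit schemes to tiny steps, which is the "almost singular" difficulty the abstract alludes to; the real models have many more compartments but the same first-order structure.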

  11. [The method of analysis of distribution of erythrocytes by density: practical guidelines].

    Science.gov (United States)

    Shukrina, E S; Nesterenko, V M; Tsvetaeva, N V; Nikulina, O F; Ataullakhanov, F I

    2014-07-01

    The article describes the phthalate method for analyzing the distribution of erythrocytes by density and demonstrates its possibilities. The distribution is obtained by centrifuging blood in micro-hematocrit capillaries in the presence of dimethyl- and dibutylphthalate mixtures of known density. The acquisition of clinically relevant parameters of the distribution (mean erythrocyte density, width of the distribution, light and heavy erythrocyte fractions, and the maximum of the distribution curve) is described. The causes of deviation of the distribution from standard values under various pathological conditions are considered, and the syndrome of erythrocyte dehydration is described in detail. A simple and accessible method for obtaining the distribution of erythrocytes by density is presented. It is demonstrated that analysis of this distribution makes it possible to determine the character of changes occurring in erythrocytes, and that monitoring its parameters allows evaluating the dynamics of the pathological process and the effectiveness of therapy.
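
    Once fractions have been measured at a series of phthalate densities, the summary parameters reduce to weighted moments of the distribution. The sketch below is a generic moment computation with made-up numbers, not the clinical protocol itself:

```python
def density_distribution_stats(densities, fractions):
    """Mean erythrocyte density and distribution width (std dev) from the
    cell fraction measured at each separating density."""
    total = sum(fractions)
    probs = [f / total for f in fractions]
    mean = sum(d * p for d, p in zip(densities, probs))
    var = sum(p * (d - mean) ** 2 for d, p in zip(densities, probs))
    return mean, var ** 0.5

# made-up example: fractions at three densities in g/mL
mean_density, width = density_distribution_stats(
    [1.08, 1.10, 1.12], [1, 2, 1])
```

    A shift of the mean toward higher densities together with a narrowed width is the kind of pattern the article associates with erythrocyte dehydration.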

  12. Sensitivity Analysis Applied in Design of Low Energy Office Building

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik

    2008-01-01

    satisfies the design requirements and objectives. In the design of sustainable Buildings it is beneficial to identify the most important design parameters in order to develop more efficiently alternative design solutions or reach optimized design solutions. A sensitivity analysis makes it possible...

  13. Applying an Activity System to Online Collaborative Group Work Analysis

    Science.gov (United States)

    Choi, Hyungshin; Kang, Myunghee

    2010-01-01

    This study determines whether an activity system provides a systematic framework to analyse collaborative group work. Using an activity system as a unit of analysis, the research examined learner behaviours, conflicting factors and facilitating factors while students engaged in collaborative work via asynchronous computer-mediated communication.…

  14. Applying Skinner's Analysis of Verbal Behavior to Persons with Dementia

    Science.gov (United States)

    Dixon, Mark; Baker, Jonathan C.; Sadowski, Katherine Ann

    2011-01-01

    Skinner's 1957 analysis of verbal behavior has demonstrated a fair amount of utility to teach language to children with autism and other various disorders. However, the learning of language can be forgotten, as is the case for many elderly suffering from dementia or other degenerative diseases. It appears possible that Skinner's operants may…

  15. Risk Analysis Applied in Oil Exploration and Production | Mbanugo ...

    African Journals Online (AJOL)

    The analysis in this work is based on the actual field data obtained from Devon Exploration and Production Inc. The Net Present Value (NPV) and the Expected Monetary Value (EMV) were computed using Excel and Visual Basic to determine the viability of these projects. Although the use of risk management techniques ...

  16. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSISAPPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    Full Text Available Carvedilol is a nonselective beta blocker/alpha-1 blocker used for treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting a mathematical model that describes the PK parameters. It also includes variables of particular importance in the drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, disease associated with the organism, or the presence of a specific polymorphism in the isoenzyme important for biotransformation of the drug. One of the most frequently used approaches in population PK analysis is Nonlinear Mixed Effects Modeling (NONMEM). The analytical method used in the data collection period is of great importance for the implementation of a population PK analysis of carvedilol, in order to obtain reliable data that can be useful in clinical practice. High performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, provide quantitative results, and monitor the efficacy of the therapy. Analytical procedures used in other studies could not be fully implemented in our research, as it was necessary to perform certain modifications and validation of the method with the aim of using the obtained results for the purpose of a population pharmacokinetic analysis. The validation process is the logical terminal phase of analytical procedure development that ensures the applicability of the procedure itself. The goal of validation is to ensure consistency of the method and accuracy of results or to confirm the selection of the analytical method for a given sample

  17. The Private Lives of Minerals: Social Network Analysis Applied to Mineralogy and Petrology

    Science.gov (United States)

    Hazen, R. M.; Morrison, S. M.; Fox, P. A.; Golden, J. J.; Downs, R. T.; Eleish, A.; Prabhu, A.; Li, C.; Liu, C.

    2016-12-01

    Comprehensive databases of mineral species (rruff.info/ima) and their geographic localities and co-existing mineral assemblages (mindat.org) reveal patterns of mineral association and distribution that mimic social networks, as commonly applied to such varied topics as social media interactions, the spread of disease, terrorism networks, and research collaborations. Applying social network analysis (SNA) to common assemblages of rock-forming igneous and regional metamorphic mineral species, we find patterns of cohesion, segregation, density, and cliques that are similar to those of human social networks. These patterns highlight classic trends in lithologic evolution and are illustrated with sociograms, in which mineral species are the "nodes" and co-existing species form "links." Filters based on chemistry, age, structural group, and other parameters highlight visually both familiar and new aspects of mineralogy and petrology. We quantify sociograms with SNA metrics, including connectivity (based on the frequency of co-occurrence of mineral pairs), homophily (the extent to which co-existing mineral species share compositional and other characteristics), network closure (based on the degree of network interconnectivity), and segmentation (as revealed by isolated "cliques" of mineral species). Exploitation of large and growing mineral data resources with SNA offers promising avenues for discovering previously hidden trends in mineral diversity-distribution systematics, as well as providing new pedagogical approaches to teaching mineralogy and petrology.
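
    The co-occurrence network underlying such an analysis is simple to construct: species are nodes and a link joins any pair found together in an assemblage, with SNA metrics such as density computed on the result. A minimal plain-Python sketch with toy assemblages (not data from mindat.org):

```python
from itertools import combinations
from collections import Counter

def cooccurrence_network(assemblages, min_count=1):
    """Mineral co-occurrence network: returns a dict mapping species pairs
    to the number of assemblages in which they occur together."""
    pair_counts = Counter()
    for minerals in assemblages:
        for a, b in combinations(sorted(set(minerals)), 2):
            pair_counts[(a, b)] += 1
    return {pair: n for pair, n in pair_counts.items() if n >= min_count}

def network_density(edges, nodes):
    """Fraction of possible links that are present, a basic SNA cohesion
    metric (1.0 would mean every species co-occurs with every other)."""
    n = len(nodes)
    return 2 * len(edges) / (n * (n - 1))

# toy assemblages, purely illustrative
rocks = [["quartz", "albite", "biotite"],
         ["quartz", "albite"],
         ["quartz", "calcite"]]
edges = cooccurrence_network(rocks)
nodes = {m for rock in rocks for m in rock}
```

    Edge weights (here, the pair counts) feed the connectivity metric described above, while filtering edges by chemistry or age reproduces the filtered sociograms the authors describe.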

  18. The industrial computerized tomography applied to the rock analysis

    International Nuclear Information System (INIS)

    Tetzner, Guaraciaba de Campos

    2008-01-01

    This work studies possible technical applications of computerized tomography (CT) using a device developed at the Radiation Technology Center (CTR), Institute for Energy and Nuclear Research (IPEN-CNEN/SP). The equipment consists of a gamma radiation source (60Co), a scintillation detector of sodium iodide doped with thallium (NaI(Tl)), a mechanical system to move the object (rotation and translation) and a computer system. This operating system was designed and developed by the CTR-IPEN-CNEN/SP team using national resources and technology. The first validation test of the equipment was carried out using a cylindrical sample of polypropylene (phantom) with two cylindrical cavities (holes) of 5 x 25 cm (diameter and length). In these tests, the holes were filled with materials of different density (air, oil and metal), whose attenuation coefficients are well known. The goal of this first test was to assess the response quality of the equipment. The present report compares the CTR-IPEN-CNEN/SP computerized tomography equipment, which uses a gamma radiation source (60Co), with equipment of the Department of Geosciences at the University of Texas (CTUT), which uses an X-ray source (450 kV and 3.2 mA). As a result, the images obtained and the comprehensive study of the usefulness of the equipment developed here strengthened the proposition that the development of industrial computerized tomography is an important step toward consolidating the national technology. (author)

  19. POWER SPECTRUM DENSITY (PSD) ANALYSIS OF AUTOMOTIVE PEDAL-PAD

    Directory of Open Access Journals (Sweden)

    AHMED RITHAUDDEEN YUSOFF

    2016-04-01

    Full Text Available Vibration at the pedal-pad may contribute to discomfort of the foot plantar fascia during driving, because vibration is transmitted through the mount, chassis, and pedal to the foot plantar fascia. This experimental study was conducted to estimate the peak value of the power spectral density of the vertical vibration input at the foot. The power spectral density is calculated over the frequency range of 13 Hz to 18 Hz. The experiment used 12 subjects testing three sizes of pedal-pad: small, medium and large. The results show that the peak occurs at a resonance frequency of 15 Hz. The PSD values at that resonance frequency are 0.251 (m/s²)²/Hz for the small pedal-pad, 0.387 (m/s²)²/Hz for the medium pedal-pad, and 0.483 (m/s²)²/Hz for the large pedal-pad. The results indicate that during driving, foot vibration when interacting with the large pedal-pad contributes a higher stimulus than with the small and medium pedal-pads. Pedal-pad size therefore plays an important role in pedal design in terms of vibration transfer from pedal-pads to the feet, particularly to provide comfort to the driver while driving.
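
    The PSD computation itself can be sketched with a plain DFT periodogram; a synthetic 15 Hz sine stands in for the measured pedal vibration (the study's actual signals and windowing are not reproduced here):

```python
import math

def psd_onesided(samples, fs):
    """One-sided power spectral density via a plain DFT periodogram,
    PSD[k] in (units²)/Hz at frequency k*fs/N; no windowing or averaging."""
    n = len(samples)
    psd = []
    for k in range(n // 2 + 1):
        re = sum(x * math.cos(2 * math.pi * k * i / n)
                 for i, x in enumerate(samples))
        im = -sum(x * math.sin(2 * math.pi * k * i / n)
                  for i, x in enumerate(samples))
        scale = 1.0 if k in (0, n // 2) else 2.0  # fold negative frequencies
        psd.append(scale * (re * re + im * im) / (fs * n))
    return psd

# synthetic pedal signal: unit 15 Hz sine sampled at 100 Hz for 1 s
fs, f0 = 100, 15
sig = [math.sin(2 * math.pi * f0 * i / fs) for i in range(fs)]
psd = psd_onesided(sig, fs)
peak_hz = max(range(len(psd)), key=lambda k: psd[k]) * fs / len(sig)
```

    The peak lands at 15 Hz with PSD 0.5 (units²)/Hz, consistent with a unit-amplitude sine whose mean-square power of 0.5 is concentrated in a single 1 Hz bin.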

  20. Low-density lipoprotein apheresis: an evidence-based analysis.

    Science.gov (United States)

    2007-01-01

    To assess the effectiveness and safety of low-density lipoprotein (LDL) apheresis performed with the heparin-induced extracorporeal LDL precipitation (HELP) system for the treatment of patients with refractory homozygous (HMZ) and heterozygous (HTZ) familial hypercholesterolemia (FH). BACKGROUND ON FAMILIAL HYPERCHOLESTEROLEMIA: Familial hypercholesterolemia is a genetic autosomal dominant disorder that is caused by several mutations in the LDL-receptor gene. The reduced number or absence of functional LDL receptors results in impaired hepatic clearance of circulating low-density lipoprotein cholesterol (LDL-C) particles, which results in extremely high levels of LDL-C in the bloodstream. Familial hypercholesterolemia is characterized by excess LDL-C deposits in tendons and arterial walls, early onset of atherosclerotic disease, and premature cardiac death. Familial hypercholesterolemia occurs in both HTZ and HMZ forms. Heterozygous FH is one of the most common monogenic metabolic disorders in the general population, occurring in approximately 1 in 500 individuals. Nevertheless, HTZ FH is largely undiagnosed and an accurate diagnosis occurs in only about 15% of affected patients in Canada. Thus, it is estimated that there are approximately 3,800 diagnosed and 21,680 undiagnosed cases of HTZ FH in Ontario. In HTZ FH patients, half of the LDL receptors do not work properly or are absent, resulting in plasma LDL-C levels 2- to 3-fold higher than normal (range 7-15mmol/L or 300-500mg/dL). Most HTZ FH patients are not diagnosed until middle age when either they or one of their siblings present with symptomatic coronary artery disease (CAD). Without lipid-lowering treatment, 50% of males die before the age of 50 and 25% of females die before the age of 60, from myocardial infarction or sudden death. 
In contrast to the HTZ form, HMZ FH is rare (occurring in 1 case per million persons) and more severe, with a 6- to 8-fold elevation in plasma LDL-C levels (range 15-25mmol

  1. On the relation between applied behavior analysis and positive behavioral support.

    Science.gov (United States)

    Carr, James E; Sidener, Tina M

    2002-01-01

    Anderson and Freeman (2000) recently defined positive behavioral support (PBS) as a systematic approach to the delivery of clinical and educational services that is rooted in behavior analysis. However, the recent literature contains varied definitions of PBS as well as discrepant notions regarding the relation between applied behavior analysis and PBS. After summarizing common definitional characteristics of PBS from the literature, we conclude that PBS is composed almost exclusively of techniques and values originating in applied behavior analysis. We then discuss the relations between applied behavior analysis and PBS that have been proposed in the literature. Finally, we discuss possible implications of considering PBS a field separate from applied behavior analysis.

  2. Applied Hierarchical Cluster Analysis with Average Linkage Algorithm

    Directory of Open Access Journals (Sweden)

    Cindy Cahyaning Astuti

    2017-11-01

    Full Text Available This research was conducted in Sidoarjo District, using secondary data from the book "Kabupaten Sidoarjo Dalam Angka 2016". The authors chose 12 variables that can represent sub-district characteristics in Sidoarjo, covering four sectors: geography, education, agriculture and industry. To assess how evenly geography, education, agriculture and industry are distributed across the sub-districts, an analysis is required to classify sub-districts based on these characteristics. Hierarchical cluster analysis is the analytical technique used to classify or categorize the objects of each case into relatively homogeneous groups expressed as clusters. The results are expected to provide information about dominant and non-dominant sub-district characteristics in the four sectors, based on the clusters formed.
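
    The average-linkage algorithm named in the title repeatedly merges the two clusters with the smallest mean pairwise distance. A didactic plain-Python sketch (real analyses would use a statistics package; the 1-D "sub-district indicator" values are invented):

```python
def average_linkage(points, n_clusters):
    """Agglomerative clustering with average linkage: merge the two
    clusters whose members have the smallest mean pairwise distance,
    until n_clusters remain. Returns lists of point indices."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    clusters = [[i] for i in range(len(points))]
    while len(clusters) > n_clusters:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = sum(dist(points[a], points[b])
                        for a in clusters[i] for b in clusters[j])
                d /= len(clusters[i]) * len(clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)
    return [sorted(c) for c in clusters]

# invented 1-D indicator values for five "sub-districts"
data = [(1.0,), (1.2,), (5.0,), (5.3,), (9.0,)]
groups = average_linkage(data, 3)  # three homogeneous groups
```

    With 12 variables per sub-district the points simply become 12-tuples; average linkage is often preferred over single linkage because it resists chaining of elongated clusters.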

  3. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Andriola, L.; Ceccacci, R.

    2001-01-01

    The environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It allows organizations to identify and deal with environmental issues and to gain clear knowledge of their environmental performance. Schools can be counted among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to take this first step. An instrument has therefore been defined that is simple yet complete and coherent with the reference standards, letting schools shape their own process for elaborating the initial environmental review. This instrument consists essentially of cards that, once completed, facilitate the drafting of the environmental analysis report.

  4. Applying Financial Portfolio Analysis to Government Program Portfolios

    Science.gov (United States)

    2007-06-01

    containing programs lacking EVM data. Analysis of larger EVM supported portfolios requires significantly more computation (exponential growth) and...The rule had managers fill their portfolios with technology stocks during a period of rapid technology growth. While this strategy generated...himself points out, "The Rational Man, like the unicorn, does not exist" (Markowitz, 1959). The various investor assumptions presented above break down

  5. Pair distribution function analysis applied to decahedral gold nanoparticles

    International Nuclear Information System (INIS)

    Nakotte, H; Silkwood, C; Kiefer, B; Karpov, D; Fohtung, E; Page, K; Wang, H-W; Olds, D; Manna, S; Fullerton, E E

    2017-01-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
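
    The raw ingredient of a PDF or Debye-type calculation is the histogram of interatomic pair distances, from which the probability of finding two atoms a given distance apart follows. A minimal sketch with a toy four-atom configuration (no decahedral model, normalization, or instrument effects):

```python
import math
from itertools import combinations

def pair_distance_histogram(coords, bin_width=0.5, r_max=10.0):
    """Unnormalised histogram of all interatomic pair distances below
    r_max, the starting point of a pair distribution function analysis."""
    nbins = int(r_max / bin_width)
    hist = [0] * nbins
    for a, b in combinations(coords, 2):
        r = math.dist(a, b)
        if r < r_max:
            hist[int(r / bin_width)] += 1
    return hist

# toy configuration: four atoms on a unit square (not a decahedron)
square = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
hist = pair_distance_histogram(square, bin_width=0.5, r_max=2.0)
```

    For a candidate nanoparticle model, the same pair-distance bookkeeping, with proper normalization and convolution with instrument resolution, produces the simulated PDF that is compared against the total scattering data, as done for the four decahedral models above.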

  6. Pair distribution function analysis applied to decahedral gold nanoparticles

    Science.gov (United States)

    Nakotte, H.; Silkwood, C.; Page, K.; Wang, H.-W.; Olds, D.; Kiefer, B.; Manna, S.; Karpov, D.; Fohtung, E.; Fullerton, E. E.

    2017-11-01

    The five-fold symmetry of face-centered cubic (fcc) derived nanoparticles is inconsistent with the translational symmetry of a Bravais lattice and generally explained by multiple twinning of a tetrahedral subunit about a (joint) symmetry axis, with or without structural modification to the fcc motif. Unlike in bulk materials, five-fold twinning in cubic nanoparticles is common and strongly affects their structural, chemical, and electronic properties. To test and verify theoretical approaches, it is therefore pertinent that the local structural features of such materials can be fully characterized. The small size of nanoparticles severely limits the application of traditional analysis techniques, such as Bragg diffraction. A complete description of the atomic arrangement in nanoparticles therefore requires a departure from the concept of translational symmetry, and prevents fully evaluating all the structural features experimentally. We describe how recent advances in instrumentation, together with the increasing power of computing, are shaping the development of alternative analysis methods of scattering data for nanostructures. We present the application of Debye scattering and pair distribution function (PDF) analysis towards modeling of the total scattering data for the example of decahedral gold nanoparticles. PDF measurements provide a statistical description of the pair correlations of atoms within a material, allowing one to evaluate the probability of finding two atoms within a given distance. We explored the sensitivity of existing synchrotron x-ray PDF instruments for distinguishing four different simple models for our gold nanoparticles: a multiply twinned fcc decahedron with either a single gap or multiple distributed gaps, a relaxed body-centered orthorhombic (bco) decahedron, and a hybrid decahedron. The data simulations of the models were then compared with experimental data from synchrotron x-ray total scattering. We present our experimentally
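The Debye scattering equation at the heart of such total-scattering simulations can be sketched as follows. This is a minimal single-element illustration with toy coordinates and unit scattering factors, not the authors' simulation code:

```python
import numpy as np

def debye_intensity(positions, q_values):
    """Debye scattering equation for N atoms of one element (unit form factor):
    I(Q) = sum_{i,j} sin(Q r_ij) / (Q r_ij); the i == j terms each contribute 1."""
    # pairwise interatomic distances r_ij
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    out = np.empty(len(q_values))
    for k, q in enumerate(q_values):
        # np.sinc(x) = sin(pi x)/(pi x), so sinc(q*d/pi) = sin(q*d)/(q*d);
        # it also handles the diagonal r_ij = 0 limit correctly (value 1)
        out[k] = np.sum(np.sinc(q * d / np.pi))
    return out

# toy "particle": two gold atoms at the fcc nearest-neighbour distance (~2.88 A)
positions = np.array([[0.0, 0.0, 0.0], [2.88, 0.0, 0.0]])
q_values = np.array([1.0, 2.0, 4.0])
intensity = debye_intensity(positions, q_values)
```

A real nanoparticle calculation sums over thousands of atoms with Q-dependent atomic form factors; the PDF is then obtained by Fourier transforming the normalized intensity.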

  7. Ion beam analysis techniques applied to large scale pollution studies

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, D.D.; Bailey, G.; Martin, J.; Garton, D.; Noorman, H.; Stelcer, E.; Johnson, P. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1993-12-31

Ion Beam Analysis (IBA) techniques are ideally suited to analysing the thousands of filter papers a year that may originate from a large-scale aerosol sampling network. They are fast, multi-elemental and, for the most part, non-destructive, so other analytical methods such as neutron activation and ion chromatography can be performed afterwards. ANSTO, in collaboration with the NSW EPA, Pacific Power and the Universities of NSW and Macquarie, has established a large-area fine aerosol sampling network covering nearly 80,000 square kilometres of NSW with 25 fine particle samplers. This network, known as ASP, was funded by the Energy Research and Development Corporation (ERDC) and commenced sampling on 1 July 1991. The cyclone sampler at each site has a 2.5 µm particle diameter cut-off and runs for 24 hours every Sunday and Wednesday, using one Gillman 25 mm diameter stretched Teflon filter for each day. These filters are ideal targets for ion beam analysis work. Currently ANSTO receives 300 filters per month from this network for analysis using its accelerator-based ion beam techniques on the 3 MV Van de Graaff accelerator. One week a month of accelerator time is dedicated to this analysis. Four simultaneous accelerator-based IBA techniques are used at ANSTO to analyse for the following 24 elements: H, C, N, O, F, Na, Al, Si, P, S, Cl, K, Ca, Ti, V, Cr, Mn, Fe, Cu, Ni, Co, Zn, Br and Pb. The IBA techniques proved invaluable in identifying sources of fine particles and their spatial and seasonal variations across the large area sampled by the ASP network. 3 figs.

  8. Bacterial meningitis: a density-equalizing mapping analysis of the global research architecture.

    Science.gov (United States)

    Pleger, Niklas; Kloft, Beatrix; Quarcoo, David; Zitnik, Simona; Mache, Stefanie; Klingelhoefer, Doris; Groneberg, David A

    2014-09-30

Bacterial meningitis is caused by a variety of pathogens and poses an important public health threat all over the world. Despite the necessity of developing customized public health-related research projects, a thorough study of global meningitis research has not been conducted so far. Therefore, the aim of this work was a combined density-equalizing mapping and scientometric study. To evaluate the global scientific efforts, bibliometric methods, density-equalizing algorithms and large-scale data analysis of the Web of Science were applied for the period between 1900 and 2007. In this period, 7998 publications on bacterial meningitis were found. With a number of 2698, most publications were written by U.S. authors, followed by the UK (912), Germany (749) and France (620). This dominance is also evident in the international cooperation patterns. The specific citation analyses reveal that the nation with the highest average citation rate (citations per publication) was Norway (26.36), followed by Finland (24.16) and the U.S. (24.06). This study illustrates the architecture of global research on bacterial meningitis and points to the need for customized research programs with a focus on local public health issues in countries with a low development index but high incidences, to target this global public health problem.

  9. Spectrographic Image Analysis Applied to Gabor and Wavelet Transforms

    Science.gov (United States)

    Lees, J. M.

    2009-12-01

    I present a new package for analysis of two-dimensional fields commonly produced by spectrum analysis of harmonic tremor and other seismic signals recorded during volcanic unrest. Exploding volcanoes often exhibit considerable variability in seismic signature - events can range considerably in amplitude, bandwidth and source time function. Harmonic behavior is common on numerous volcanoes and examples show the presence of frequency gliding interspersed with impulsive and emergent explosive signals. The intermittent behavior poses a serious challenge for signal processing and automated analysis for extracting important volcano seismo-acoustic parameters. Here we introduce image processing methods to estimate ridge continuity on spectrograms and to form the basis for pattern recognition on varying quasi-harmonic tremor. Standard routines such as pixel opening and closing, threshold estimation and edge detection will be illustrated. For example, distinguishing harmonic tremor from chugging may be an important element of a hazard reduction program at volcano observatories. Patterns revealed in spectrograms and wavelet transforms have been used in the past to extract physical parameters associated with ongoing eruptions. In this paper an additional level of abstraction is used for pattern recognition. Application of these automated methods for extracting the structure underlying seismic behavior may provide critical time sensitive information used for mitigation.

  10. A study of influence of material properties on magnetic flux density induced in magneto rheological damper through finite element analysis

    Directory of Open Access Journals (Sweden)

    Gurubasavaraju T. M.

    2018-01-01

Full Text Available Magnetorheological fluids are smart materials that respond to an external stimulus by changing their rheological properties. The damper performance (damping force) depends on the magnetic flux density induced at the annular gap. The magnetic flux density developed at the fluid flow gap of an MR damper by the externally applied current also depends on the material properties of the damper's components (such as the piston head, outer cylinder and piston rod). The present paper discusses the influence of different materials selected for the components of the MR damper on the induced magnetic flux, using magnetostatic analysis. Different materials, such as magnetic and low carbon steels, are considered for the piston head of the MR damper, and the magnetic flux density induced at the fluid flow gap (filled with MR fluid) is computed for different DC currents applied to the electromagnetic coil. The developed magnetic flux is then used to calculate the damping force analytically for each case. Low carbon steel has a higher magnetic permeability, hence maximum magnetic flux can pass through the piston head, which leads to a higher magnetic flux density at the annular gap. From the analysis results it is observed that the magnetic steel and low carbon steel piston heads provide the maximum magnetic flux density and, consequently, the highest damping force.

  11. Framework for applying probabilistic safety analysis in nuclear regulation

    International Nuclear Information System (INIS)

    Dimitrijevic, V.B.

    1997-01-01

The traditional regulatory framework has served well to assure the protection of public health and safety. It has been recognized, however, that in a few circumstances this deterministic framework has led to extensive expenditure on matters that have little to do with the safe and reliable operation of the plant. The development of plant-specific PSAs has offered a new and powerful analytical tool for evaluating the safety of the plant. Using PSA insights as an aid to decision making in the regulatory process is now known as 'risk-based' or 'risk-informed' regulation. Numerous activities in the U.S. nuclear industry are focusing on applying this new approach to modify regulatory requirements. In addition, other approaches to regulation are in the developmental phase and are being evaluated. One is based on performance monitoring and results and is known as performance-based regulation. The other, called the blended approach, combines traditional deterministic principles with PSA insights and performance results. (author)

  12. Structural analysis of fuel rod applied to pressurized water reactors

    Energy Technology Data Exchange (ETDEWEB)

    Faria, Danilo P.; Pinheiro, Andre Ricardo M.; Lotto, André A., E-mail: danilo.pinheiro@marinha.mil.br [Centro Tecnológico da Marinha em São Paulo (CTMSP), São Paulo, SP (Brazil)

    2017-07-01

The design of fuel assemblies applied to Pressurized Water Reactors (PWR) has several requirements and acceptance criteria that must be met for licensing. In the case of PWR fuel rods, an important structural requirement is to maintain radial stability when submitted to the external pressure of the coolant. In the framework of the Accident Tolerant Fuel (ATF) program, new materials have been studied to replace zirconium-based alloys as cladding, including iron-based alloys. In this sense, efforts have been made to evaluate the behavior of these materials under PWR conditions. The present work aims to evaluate the cold collapse pressure of a stainless steel thin-walled tube, similar to that used as cladding material for fuel rods, by comparing numerical and experimental results. As a result of the simulations, it was observed that the collapse pressure has a value intermediate between those found by regulatory requirements and by analytical calculations. The experiment was carried out to validate the computational model using test specimens of thin-walled, empty tubes. The test specimens were sealed at both ends by welding and subjected to a high-pressure device until the tubes collapsed. Preliminary results obtained from the experiments with the empty test specimens indicate that the computational model can be validated for stainless steel cladding, considering the difference between the collapse pressure indicated in the regulatory document and the actual limit pressure concerning radial instability of tubes with the studied characteristics. (author)

  13. Stability analysis of a new lattice hydrodynamic model by considering lattice's self-anticipative density effect

    Science.gov (United States)

    Zhang, Geng; Sun, Di-Hua; Liu, Hui; Chen, Dong

    2017-11-01

In this paper, a new lattice hydrodynamic model is proposed that considers the difference between a lattice site's current density and its anticipative density. The influence of the lattice's self-anticipative density on traffic stability is revealed through linear stability theory, and it is shown that the lattice's self-anticipative density can improve the stability of traffic flow. To describe the phase transition of traffic flow, the mKdV equation near the critical point is derived using the nonlinear analysis method. The propagating behavior of the density wave in the unstable region can be described by the kink-antikink soliton of the mKdV equation. Numerical simulation validates the analytical results and shows that traffic jams can be suppressed efficiently by considering the lattice's self-anticipative density in the modified lattice hydrodynamic model.

  14. Applying model analysis to a resource-based analysis of the Force and Motion Conceptual Evaluation

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2014-07-01

    Full Text Available Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information regarding the results of investigations using these question clusters than normalized gain graphs. We provide examples from two different institutions to show how the use of model analysis with our redefined clusters can provide previously hidden insight into the effectiveness of instruction.

  15. Density of mixed alkali borate glasses: A structural analysis

    International Nuclear Information System (INIS)

    Doweidar, H.; El-Damrawi, G.M.; Moustafa, Y.M.; Ramadan, R.M.

    2005-01-01

Density of mixed alkali borate glasses has been correlated with the glass structure. It is assumed that in such glasses each alkali oxide associates with a proportional quantity of B2O3. The number of BO3 and BO4 units related to each type of alkali oxide depends on the total concentration of alkali oxide. It is concluded that in mixed alkali borate glasses the volumes of the structural units related to an alkali ion are the same as in the corresponding binary alkali borate glass. This reveals that each type of alkali oxide forms its own borate matrix and behaves as if unaffected by the presence of the other alkali oxide. Similar conclusions are valid for borate glasses with three types of alkali oxide.

  16. Periodic Density Functional Theory Solver using Multiresolution Analysis with MADNESS

    Science.gov (United States)

    Harrison, Robert; Thornton, William

    2011-03-01

We describe the first implementation of the all-electron Kohn-Sham density functional periodic solver (DFT) using multi-wavelets and fast integral equations using MADNESS (multiresolution adaptive numerical environment for scientific simulation; http://code.google.com/p/m-a-d-n-e-s-s). The multiresolution nature of a multi-wavelet basis allows for fast computation with guaranteed precision. By reformulating the Kohn-Sham eigenvalue equation into the Lippmann-Schwinger equation, we can avoid using the derivative operator, which allows better control of overall precision for the all-electron problem. Other highlights include the development of periodic integral operators with low-rank separation, an adaptable model potential for the nuclear potential, and an implementation for Hartree-Fock exchange. This work was supported by NSF project OCI-0904972 and made use of resources at the Center for Computational Sciences at Oak Ridge National Laboratory under contract DE-AC05-00OR22725.

  17. Density Functional Theory using Multiresolution Analysis with MADNESS

    Science.gov (United States)

    Thornton, Scott; Harrison, Robert

    2012-02-01

    We describe the first implementation of the all-electron Kohn-Sham density functional periodic solver (DFT) using multi-wavelets and fast integral equations using MADNESS (multiresolution adaptive numerical environment for scientific simulation; http://code.google.com/p/m-a-d-n-e-s-s). The multiresolution nature of a multi-wavelet basis allows for fast computation with guaranteed precision. By reformulating the Kohn-Sham eigenvalue equation into the Lippmann-Schwinger equation, we can avoid using the derivative operator which allows better control of overall precision for the all-electron problem. Other highlights include the development of periodic integral operators with low-rank separation, an adaptable model potential for the nuclear potential, and an implementation for Hartree-Fock exchange.

  18. The video densitometric analysis of the radiographic density and contrast

    International Nuclear Information System (INIS)

    Yoo, Young Sun; Lee, Sang Rae

    1992-01-01

Generally, the patient's absorbed dose and the readability of radiograms are affected by the exposure time and kVp, which are related to the radiographic density and contrast. The investigator carried out studies to determine the adequate levels of exposure time and kVp needed to obtain better readability of radiograms. In these studies, radiograms of a dried human mandible were compared with each other by video densitometry among various combinations of exposure times (5, 6, 8, 12, 15, 19, 24, 30, 38, 48 and 60) and kVp levels (60, 65, 70, 80 and 90). The obtained results were as follows: 1. As exposure time and kVp were increased, the radiographic density of the radiograms increased. 2. The subject contrast was increased where the aluminum step wedge was thin, and reduced in the reversed condition. For the thin aluminum step wedge, subject contrast was higher at lower kilovoltage than at higher kilovoltage. 3. Contrast was increased at lower kilovoltage with longer exposure time, and at higher kilovoltage with shorter exposure time. 4. At short exposure times, better readability of each reading item was obtained as the kilovoltage increased, but in the opposite condition, increasing the exposure time worsened the readability of the radiograms. Since the X-ray machines in current dental clinics are fixed between 60-70 kVp and 10 mA, good radiograms can be obtained by varying the exposure time. But according to the conclusions of these studies, better radiograms can be obtained by using filtered high kVp, so that the absorbed dose to the patient and the exposure time can be reduced.

  19. A photoemission moments model using density functional and transfer matrix methods applied to coating layers on surfaces: Theory

    Science.gov (United States)

    Jensen, Kevin L.; Finkenstadt, Daniel; Shabaev, Andrew; Lambrakos, Samuel G.; Moody, Nathan A.; Petillo, John J.; Yamaguchi, Hisato; Liu, Fangze

    2018-01-01

Recent experimental measurements of a bulk material covered with a small number of graphene layers reported by Yamaguchi et al. [NPJ 2D Mater. Appl. 1, 12 (2017)] (on bialkali) and Liu et al. [Appl. Phys. Lett. 110, 041607 (2017)] (on copper) and the needs of emission models in beam optics codes have led to substantial changes in a Moments model of photoemission. The changes account for (i) a barrier profile and density of states factor based on density functional theory (DFT) evaluations, (ii) a Drude-Lorentz model of the optical constants and laser penetration depth, and (iii) a transmission probability evaluated by an Airy Transfer Matrix Approach. Importantly, the DFT results lead to a surface barrier profile of a shape similar to both resonant barriers and reflectionless wells: the associated quantum mechanical transmission probabilities are shown to be comparable to those recently required to enable the Moments (and Three Step) model to match experimental data, but for reasons very different than the assumption by conventional wisdom that a barrier is responsible. The substantial modifications of the Moments model components, motivated by computational materials methods, are developed. The results prepare the Moments model for use in treating heterostructures and discrete energy level systems (e.g., quantum dots) proposed for decoupling the opposing metrics of performance that undermine the performance of advanced light sources like the x-ray Free Electron Laser. The consequences of the modified components on quantum yield, emittance, and emission models needed by beam optics codes are discussed.

  20. Power density investigation on the press-pack IGBT 3L-HB-VSCs applied to large wind turbine

    DEFF Research Database (Denmark)

    Senturk, Osman Selcuk; Munk-Nielsen, Stig; Teodorescu, Remus

    2011-01-01

With three different DC-side and AC-side connections, the three-level H-bridge voltage source converters (3L-HB-VSCs) are alternatives to 3L neutral-point-clamped VSCs (3L-NPC-VSCs) for interfacing large wind turbines with electricity grids. In order to assess their feasibility for large wind...... capabilities, DC capacitor sizes, converter cabinet volumes of the three 3L-HB-VSCs utilizing press-pack IGBTs are investigated in order to quantify and compare the power densities of the 3L-HB-VSCs employed as grid-side converters. Also, the suitable transformer types for the 3L-HB-VSCs are determined...... and comparatively studied in terms of volume and weight in order to estimate the size effects of the 3L-HB-VSC topology on the whole wind turbine connection system. Finally, based on the power density and transformer-size investigations, the feasibility of each 3L-HB-VSC is discussed....

1. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry]

    International Nuclear Information System (INIS)

    Angland, P.; Haberberger, D.; Ivancic, S. T.; Froula, D. H.

    2017-01-01

Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and statistical uncertainty calculation are based on a minimization of the χ2 test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.
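The annealing loop described — perturb the parameters, accept any move that lowers χ2, and occasionally accept a worse move with a temperature-dependent probability — can be sketched generically. The two-parameter exponential profile below is a hypothetical stand-in for the paper's eight-parameter density profile:

```python
import math
import random

def chi_squared(params, xs, data, sigma, model):
    """Chi-squared statistic between model(x, params) and the measured data."""
    return sum((model(x, params) - d) ** 2 / s ** 2
               for x, d, s in zip(xs, data, sigma))

def anneal(xs, data, sigma, model, p0, step=0.1, t0=1.0, cool=0.995,
           iters=3000, seed=1):
    rng = random.Random(seed)
    params = list(p0)
    cost = chi_squared(params, xs, data, sigma, model)
    best_params, best_cost = list(params), cost
    temp = t0
    for _ in range(iters):
        cand = [p + rng.gauss(0.0, step) for p in params]
        c = chi_squared(cand, xs, data, sigma, model)
        # always accept improvements; accept worse moves with Boltzmann probability
        if c < cost or rng.random() < math.exp(-(c - cost) / temp):
            params, cost = cand, c
            if c < best_cost:
                best_params, best_cost = list(cand), c
        temp = max(temp * cool, 1e-9)  # geometric cooling schedule
    return best_params, best_cost

# hypothetical profile n(x) = n0 * exp(-x/L); "data" generated with n0=2, L=1
# (the guard L > 0.05 rejects unphysical scale lengths by returning infinite cost)
model = lambda x, p: p[0] * math.exp(-x / p[1]) if p[1] > 0.05 else float("inf")
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
data = [2.0 * math.exp(-x) for x in xs]
sigma = [0.05] * len(xs)
fit, cost = anneal(xs, data, sigma, model, p0=[1.0, 2.0])
```

The returned `cost` can never exceed the χ2 of the starting guess, since the best-so-far parameters are tracked separately from the random walk.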

  2. Statistical analysis applied to safety culture self-assessment

    International Nuclear Information System (INIS)

    Macedo Soares, P.P.

    2002-01-01

Interviews and opinion surveys are instruments used to assess the safety culture in an organization as part of the Safety Culture Enhancement Programme. Specific statistical tools are used to analyse the survey results. This paper presents an example of an opinion survey with the corresponding application of the statistical analysis and the conclusions obtained. Survey validation, frequency statistics, the Kolmogorov-Smirnov non-parametric test, Student's t-test and ANOVA means-comparison tests, and the LSD post-hoc multiple comparison test are discussed. (author)
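As an illustration of one of the listed tests, a pooled-variance two-sample t statistic (Student's t-test) can be computed directly from two groups of survey scores; the scores below are invented:

```python
from statistics import mean, variance

def t_statistic(a, b):
    """Pooled-variance two-sample t statistic (Student's t-test).
    variance() is the sample variance (n-1 denominator)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

# hypothetical 1-5 survey scores from two departments
group_a = [4, 5, 4, 3, 5, 4]
group_b = [2, 3, 2, 3, 2, 3]
t = t_statistic(group_a, group_b)
```

The resulting statistic is compared against a t distribution with na + nb - 2 degrees of freedom to obtain a p-value; a large |t| indicates a significant difference between group means.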

  3. Applying temporal network analysis to the venture capital market

    Science.gov (United States)

    Zhang, Xin; Feng, Ling; Zhu, Rongqian; Stanley, H. Eugene

    2015-10-01

    Using complex network theory to study the investment relationships of venture capital firms has produced a number of significant results. However, previous studies have often neglected the temporal properties of those relationships, which in real-world scenarios play a pivotal role. Here we examine the time-evolving dynamics of venture capital investment in China by constructing temporal networks to represent (i) investment relationships between venture capital firms and portfolio companies and (ii) the syndication ties between venture capital investors. The evolution of the networks exhibits rich variations in centrality, connectivity and local topology. We demonstrate that a temporal network approach provides a dynamic and comprehensive analysis of real-world networks.
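A minimal way to represent such a temporal investment network is as a sequence of yearly snapshots, with a centrality measure tracked over time. The firms, portfolio companies, and years below are invented for illustration:

```python
from collections import defaultdict

# hypothetical timestamped investment ties: (venture_firm, company, year)
events = [("A", "x", 2010), ("A", "y", 2011), ("B", "x", 2011),
          ("B", "z", 2012), ("A", "z", 2012), ("C", "y", 2012)]

# one bipartite snapshot per year: firm -> set of companies invested in that year
snapshots = defaultdict(lambda: defaultdict(set))
for firm, company, year in events:
    snapshots[year][firm].add(company)

# degree centrality (number of distinct portfolio companies) per firm, per year
degree_by_year = {year: {firm: len(cos) for firm, cos in nets.items()}
                  for year, nets in sorted(snapshots.items())}
```

Comparing `degree_by_year` across successive years exposes exactly the kind of evolving connectivity the study analyses; richer measures (betweenness, clustering) follow the same snapshot pattern.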

  4. Methods of economic analysis applied to fusion research. Final report

    International Nuclear Information System (INIS)

    1983-01-01

    In this and previous efforts ECON has provided economic assessment of a fusion research program. This phase of study focused on two tasks, the first concerned with the economics of fusion in an economy that relies heavily upon synthetic fuels, and the second concerned with the overall economic effects of pursuing soft energy technologies instead of hard technologies. This report is organized in two parts, the first entitled An Economic Analysis of Coproduction of Fusion-Electric Energy and Other Products, and the second entitled Arguments Associated with the Choice of Potential Energy Futures

  5. Differential network analysis applied to preoperative breast cancer chemotherapy response.

    Directory of Open Access Journals (Sweden)

    Gregor Warsow

    Full Text Available In silico approaches are increasingly considered to improve breast cancer treatment. One of these treatments, neoadjuvant TFAC chemotherapy, is used in cases where application of preoperative systemic therapy is indicated. Estimating response to treatment allows or improves clinical decision-making and this, in turn, may be based on a good understanding of the underlying molecular mechanisms. Ever increasing amounts of high throughput data become available for integration into functional networks. In this study, we applied our software tool ExprEssence to identify specific mechanisms relevant for TFAC therapy response, from a gene/protein interaction network. We contrasted the resulting active subnetwork to the subnetworks of two other such methods, OptDis and KeyPathwayMiner. We could show that the ExprEssence subnetwork is more related to the mechanistic functional principles of TFAC therapy than the subnetworks of the other two methods despite the simplicity of ExprEssence. We were able to validate our method by recovering known mechanisms and as an application example of our method, we identified a mechanism that may further explain the synergism between paclitaxel and doxorubicin in TFAC treatment: Paclitaxel may attenuate MELK gene expression, resulting in lower levels of its target MYBL2, already associated with doxorubicin synergism in hepatocellular carcinoma cell lines. We tested our hypothesis in three breast cancer cell lines, confirming it in part. In particular, the predicted effect on MYBL2 could be validated, and a synergistic effect of paclitaxel and doxorubicin could be demonstrated in the breast cancer cell lines SKBR3 and MCF-7.

  6. [Failure mode effect analysis applied to preparation of intravenous cytostatics].

    Science.gov (United States)

    Santos-Rubio, M D; Marín-Gil, R; Muñoz-de la Corte, R; Velázquez-López, M D; Gil-Navarro, M V; Bautista-Paloma, F J

    2016-01-01

To proactively identify risks in the preparation of intravenous cytostatic drugs, and to prioritise and establish measures to improve safety procedures. Failure Mode Effect Analysis methodology was used. A multidisciplinary team identified potential failure modes of the procedure through a brainstorming session. The impact associated with each failure mode was assessed with the Risk Priority Number (RPN), which involves three variables: occurrence, severity, and detectability. Improvement measures were established for all identified failure modes, with those with RPN>100 considered critical. The final RPN (theoretical) that would result from the proposed measures was also calculated and the process was redesigned. A total of 34 failure modes were identified. The initial accumulated RPN was 3022 (range: 3-252), and after the recommended actions the final RPN was 1292 (range: 3-189). RPN scores >100 were obtained for 13 failure modes; only the dispensing sub-process was free of critical points (RPN>100). A final reduction in RPN of >50% was achieved for 9 failure modes. This prospective risk analysis methodology allows the weaknesses of the procedure to be prioritised and the use of resources to be optimized, and it substantially improved the safety of the preparation of cytostatic drugs through the introduction of double checking and intermediate product labelling. Copyright © 2015 SECA. Published by Elsevier Espana. All rights reserved.
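The RPN arithmetic described above is simple to sketch; the failure modes and their 1-10 scores below are invented for illustration, with the study's RPN>100 threshold used to flag critical items:

```python
# Risk Priority Number: RPN = occurrence x severity x detectability (each 1-10)
failure_modes = {
    "wrong dose calculated":     (3, 9, 7),   # hypothetical scores
    "label mix-up":              (2, 8, 4),
    "microbial contamination":   (2, 10, 8),
}

rpn = {name: o * s * d for name, (o, s, d) in failure_modes.items()}

# failure modes above the study's criticality threshold
critical = {name for name, score in rpn.items() if score > 100}
```

After improvement measures are defined, the same product is recomputed with the new (theoretical) occurrence/severity/detectability scores to quantify the expected risk reduction.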

  7. SPI Trend Analysis of New Zealand Applying the ITA Technique

    Directory of Open Access Journals (Sweden)

    Tommaso Caloiero

    2018-03-01

Full Text Available A natural temporary imbalance of water availability, consisting of persistent lower-than-average or higher-than-average precipitation, can cause extreme dry and wet conditions that adversely impact agricultural yields, water resources, infrastructure, and human systems. In this study, dry and wet periods in New Zealand were expressed using the Standardized Precipitation Index (SPI). First, both the short-term (3 and 6 months) and the long-term (12 and 24 months) SPI were estimated, and then, possible trends in the SPI values were detected by means of a new graphical technique, the Innovative Trend Analysis (ITA), which allows the trend identification of the low, medium, and high values of a series. Results show that, in every area currently subject to drought, an increase in this phenomenon can be expected. Specifically, the results of this paper highlight that agricultural regions on the eastern side of the South Island, as well as the north-eastern regions of the North Island, are the most consistently vulnerable areas. In fact, in these regions, the trend analysis mainly showed a general reduction in all the values of the SPI: that is, a tendency toward heavier droughts and weaker wet periods.
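The core of the ITA technique — compare the sorted first half of a series with the sorted second half, value range by value range — can be sketched with a short synthetic SPI-like series (invented values drifting toward drier conditions):

```python
def ita(series):
    """Innovative Trend Analysis: split the series into two equal halves,
    sort each, and pair them up. Pairs where the second half exceeds the
    first indicate an increasing trend in that value range; the mean
    difference gives an overall trend indicator."""
    n = len(series) // 2
    first = sorted(series[:n])
    second = sorted(series[n:2 * n])
    diffs = [b - a for a, b in zip(first, second)]
    trend = sum(diffs) / n  # > 0 increasing, < 0 decreasing
    return first, second, trend

# synthetic monthly SPI values; later values drift negative (toward drought)
series = [0.5, -0.2, 0.1, 0.8, -0.5, 0.3, -0.9, -0.4, -1.2, 0.0, -0.6, -1.0]
first, second, trend = ita(series)
```

In the graphical form of the method, `second` is scattered against `first` and points below the 1:1 line (as here, with `trend` < 0) signal a decreasing trend, i.e. a tendency toward heavier droughts.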

  8. Principles of Micellar Electrokinetic Capillary Chromatography Applied in Pharmaceutical Analysis

    Directory of Open Access Journals (Sweden)

    Árpád Gyéresi

    2013-02-01

Full Text Available Since its introduction, capillary electrophoresis has shown great potential in areas where electrophoretic techniques have rarely been used before, including the analysis of pharmaceutical substances. The large majority of pharmaceutical substances are neutral from an electrophoretic point of view; consequently, separations by classic capillary zone electrophoresis, where separation is based on the differences between the analytes' own electrophoretic mobilities, are hard to achieve. Micellar electrokinetic capillary chromatography, a hybrid method that combines chromatographic and electrophoretic separation principles, extends the applicability of capillary electrophoretic methods to neutral analytes. In micellar electrokinetic capillary chromatography, surfactants are added to the buffer solution in concentrations above their critical micellar concentration; consequently, micelles are formed that undergo electrophoretic migration like any other charged particle. The separation is based on the differential partitioning of an analyte between a two-phase system: the mobile aqueous phase and the micellar pseudostationary phase. The present paper aims to summarize the basic aspects regarding the separation principles and practical applications of micellar electrokinetic capillary chromatography, with particular attention to those relevant in pharmaceutical analysis.
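The partitioning described above is commonly quantified by the MEKC retention factor, computed from three migration times: the analyte's, an electroosmotic-flow marker's, and a micelle marker's. A minimal sketch with hypothetical migration times:

```python
def mekc_retention_factor(t_r, t_0, t_mc):
    """MEKC retention factor: k = (tR - t0) / (t0 * (1 - tR/tmc)),
    where t0 is the migration time of an EOF (aqueous-phase) marker and
    tmc that of a micelle marker. As tmc -> infinity this reduces to the
    familiar chromatographic k = (tR - t0) / t0."""
    return (t_r - t_0) / (t_0 * (1.0 - t_r / t_mc))

# hypothetical migration times in minutes
k = mekc_retention_factor(t_r=8.0, t_0=4.0, t_mc=20.0)
```

A neutral analyte that spends more time partitioned into the micellar pseudostationary phase migrates closer to tmc and shows a larger k.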

  9. Dynamical Systems Analysis Applied to Working Memory Data

    Directory of Open Access Journals (Sweden)

    Fidan Gasimova

    2014-07-01

    Full Text Available In the present paper we investigate weekly fluctuations in working memory capacity (WMC) assessed over a period of two years. We use dynamical systems analysis, specifically a second-order linear differential equation, to model weekly variability in WMC in a sample of 112 9th graders. In our longitudinal data we use a B-spline imputation method to deal with missing data. The results show a significant negative frequency parameter in the data, indicating a cyclical pattern in weekly memory updating (MU) performance across time. We use a multilevel modeling approach to capture individual differences in model parameters and find that a higher initial performance level and a slower improvement on the MU task are associated with a slower frequency of oscillation. Additionally, we conduct a simulation study examining the analysis procedure's performance using different numbers of B-spline knots and values of time-delay embedding dimensions. Results show that the number of knots in the B-spline imputation influences accuracy more than the number of embedding dimensions.
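    The second-order linear differential equation model behind this kind of analysis can be illustrated with a small simulation. The sketch below regresses an estimated second derivative on the level and first derivative to recover the frequency parameter; finite-difference derivatives are a simplification of the time-delay-embedding approach the abstract mentions, and all parameter values are made up for illustration:

```python
import numpy as np

# Model: x'' = eta * x + zeta * x'. A negative eta implies oscillation
# with angular frequency ~ sqrt(-eta); zeta controls damping.
t = np.arange(0.0, 20.0, 0.1)
eta_true, zeta_true = -4.0, -0.1                 # oscillation + mild damping
x = np.exp(zeta_true * t / 2) * np.cos(np.sqrt(-eta_true) * t)

dt = t[1] - t[0]
x1 = np.gradient(x, dt)                          # finite-difference x'
x2 = np.gradient(x1, dt)                         # finite-difference x''

# least-squares regression of x'' on x and x' recovers the parameters
A = np.column_stack([x, x1])
(eta_hat, zeta_hat), *_ = np.linalg.lstsq(A, x2, rcond=None)
print(eta_hat < 0)  # a negative frequency parameter -> cyclical pattern
```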

  10. Neutron activation analysis applied to nutritional and foodstuff studies

    Energy Technology Data Exchange (ETDEWEB)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de, E-mail: vmaihara@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); Avegliano, Roseane P., E-mail: pagliaro@usp.b [Universidade de Sao Paulo (USP), SP (Brazil). Coordenadoria de Assistencia Social. Div. de Alimentacao

    2009-07-01

    Neutron Activation Analysis, NAA, has been successfully used on a regular basis in several areas of nutrition and foodstuff research. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small sample quantities and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition which have been carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP will be presented: a Brazilian total diet study on nutritional element dietary intakes of the Sao Paulo state population; a study of trace elements in maternal milk; and the determination of essential trace elements in some edible mushrooms. (author)

  11. Downside Risk analysis applied to the Hedge Funds universe

    Science.gov (United States)

    Perelló, Josep

    2007-09-01

    Hedge Funds are considered one of the portfolio management sectors that has shown the fastest growth over the past decade. Optimal Hedge Fund management requires appropriate risk metrics. The classic CAPM theory and its Sharpe ratio fail to capture some crucial aspects due to the strongly non-Gaussian character of Hedge Fund statistics. A possible way out of this problem, while keeping the CAPM simplicity, is the so-called Downside Risk analysis. One important benefit lies in distinguishing between good and bad returns, that is: returns greater or lower than the investor's goal. We revisit the most popular Downside Risk indicators and provide new analytical results on them. We compute these measures by taking the Credit Suisse/Tremont Investable Hedge Fund Index Data, with the Gaussian case as a benchmark. In this way, an unusual transversal reading of the existing Downside Risk measures is provided.
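    One standard Downside Risk indicator of the kind this abstract revisits is the downside deviation, which penalizes only returns below the investor's goal. The sketch below shows the generic textbook definition with made-up sample returns and target; it is not the paper's analytical results:

```python
import numpy as np

def downside_deviation(returns, target=0.0):
    """Downside deviation: root-mean-square of the shortfall of 'bad'
    returns below the investor's target. 'Good' returns above the target
    are ignored, unlike in the symmetric standard deviation underlying
    the Sharpe ratio."""
    r = np.asarray(returns, dtype=float)
    shortfall = np.minimum(r - target, 0.0)   # zero for returns >= target
    return np.sqrt(np.mean(shortfall ** 2))

# hypothetical monthly returns; only the two losses contribute
returns = [0.02, -0.01, 0.03, -0.04, 0.01]
print(round(downside_deviation(returns), 6))  # 0.018439
```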

  12. Neutron activation analysis applied to nutritional and foodstuff studies

    International Nuclear Information System (INIS)

    Maihara, Vera A.; Santos, Paola S.; Moura, Patricia L.C.; Castro, Lilian P. de; Avegliano, Roseane P.

    2009-01-01

    Neutron Activation Analysis, NAA, has been successfully used on a regularly basis in several areas of nutrition and foodstuffs. NAA has become an important and useful research tool due to the methodology's advantages. These include high accuracy, small quantities of samples and no chemical treatment. This technique allows the determination of important elements directly related to human health. NAA also provides data concerning essential and toxic concentrations in foodstuffs and specific diets. In this paper some studies in the area of nutrition which have been carried out at the Neutron Activation Laboratory of IPEN/CNEN-SP will be presented: a Brazilian total diet study: nutritional element dietary intakes of Sao Paulo state population; a study of trace element in maternal milk and the determination of essential trace elements in some edible mushrooms. (author)

  13. Applying importance-performance analysis to evaluate banking service quality

    Directory of Open Access Journals (Sweden)

    André Luís Policani Freitas

    2012-11-01

    Full Text Available In an increasingly competitive market, the identification of the most important aspects and the measurement of service quality as perceived by customers are important actions taken by organizations seeking competitive advantage. In particular, this scenario is typical of the Brazilian banking sector. In this context, this article presents an exploratory case study in which Importance-Performance Analysis (IPA) was used to identify the strong and weak points of services provided by a bank. In order to check the reliability of the questionnaire, Cronbach's alpha and correlation analyses were used. The results are presented and some actions have been defined in order to improve the quality of services.
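    IPA identifies strong and weak points by placing each attribute in one of four quadrants according to its mean importance and mean performance relative to the grand means. The sketch below is a hypothetical illustration of that quadrant logic; the attribute ratings and crosshair placement are assumptions, not the article's data:

```python
import numpy as np

def ipa_quadrants(importance, performance):
    """Assign each attribute to an Importance-Performance Analysis
    quadrant, using the grand means as the crosshair."""
    imp = np.asarray(importance, dtype=float)
    perf = np.asarray(performance, dtype=float)
    ci, cp = imp.mean(), perf.mean()
    labels = []
    for i, p in zip(imp, perf):
        if i >= ci and p < cp:
            labels.append("concentrate here")       # weak point
        elif i >= ci and p >= cp:
            labels.append("keep up the good work")  # strong point
        elif i < ci and p >= cp:
            labels.append("possible overkill")
        else:
            labels.append("low priority")
    return labels

# four hypothetical bank-service attributes, one per quadrant
labels = ipa_quadrants([4.8, 4.6, 3.0, 2.9], [3.1, 4.5, 4.4, 3.0])
print(labels)
```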

  14. Quantitative analysis of caffeine applied to pharmaceutical industry

    Science.gov (United States)

    Baucells, M.; Ferrer, N.; Gómez, P.; Lacort, G.; Roura, M.

    1993-03-01

    The direct determination of some compounds like caffeine in pharmaceutical samples, without sample pretreatment and without separation of these compounds from the matrix (acetyl salicylic acid, paracetamol,…), is very worthwhile. It enables analysis to be performed quickly and without the problems associated with sample manipulation. The samples were diluted directly in KBr powder. We used both diffuse reflectance (DRIFT) and transmission techniques in order to measure the intensity of the caffeine peaks in the pharmaceutical matrix. Limits of detection and determination, relative standard deviation and recovery, using caffeine in the same matrix as in the pharmaceutical product, are reported. Two methods for the quantification of caffeine were used: calibration line and standard addition techniques.

  15. Current density and polarization curves for radial flow field patterns applied to PEMFCs (Proton Exchange Membrane Fuel Cells)

    International Nuclear Information System (INIS)

    Cano-Andrade, S.; Hernandez-Guerrero, A.; Spakovsky, M.R. von; Damian-Ascencio, C.E.; Rubio-Arana, J.C.

    2010-01-01

    A numerical solution of the current density and velocity fields of a 3-D PEM radial-configuration fuel cell is presented. The energy, momentum and electrochemical equations are solved using a computational fluid dynamics (CFD) code based on a finite volume scheme. There are three cases of principal interest for this radial model: four, eight and twelve channels placed in a symmetrical path over the flow field plate. The current-voltage curves for the three proposed models are presented, and the main factors that affect the behavior of each of the curves are discussed. Velocity contours are presented for the three different models, showing how the fuel cell behavior is affected by the velocity variations in the radial configuration. All these results are presented for the case of high relative humidity. The favorable results obtained for this unconventional geometry seem to indicate that it could replace the conventional commercial geometries currently in use.

  16. Power Spectral Density Specification and Analysis of Large Optical Surfaces

    Science.gov (United States)

    Sidick, Erkin

    2009-01-01

    The 2-dimensional Power Spectral Density (PSD) can be used to characterize the mid- and high-spatial-frequency components of the surface height errors of an optical surface. We found it necessary to have a complete, easy-to-use approach for specifying and evaluating the PSD characteristics of large optical surfaces, one that allows the surface quality of a large optical surface to be specified based on simulated results using a PSD function, and the measured surface profile data of the same optic to be evaluated against those predicted by the simulations during the specification-derivation process. This paper provides a complete mathematical description of PSD error and proposes a new approach in which a 2-dimensional (2D) PSD is converted into a 1-dimensional (1D) one by azimuthally averaging the 2D-PSD. The 1D-PSD calculated this way has the same unit and the same profile as the original PSD function, thus allowing one to compare the two directly.
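    The azimuthal-averaging step can be sketched as follows: every pixel of the 2D-PSD is assigned to a radial spatial-frequency bin, and the mean PSD value per bin gives the 1D profile, preserving the unit of the original PSD. Radial binning in integer pixel units is an assumption of this illustration, not necessarily the paper's implementation:

```python
import numpy as np

def azimuthal_average(psd2d):
    """Collapse a 2-D PSD into a 1-D profile by averaging over all
    azimuth angles at each radial frequency (in pixel units). Taking a
    mean, not a sum, keeps the unit of the original PSD."""
    ny, nx = psd2d.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(x - nx // 2, y - ny // 2).astype(int)   # radial bin per pixel
    sums = np.bincount(r.ravel(), weights=psd2d.ravel())
    counts = np.bincount(r.ravel())
    return sums / np.maximum(counts, 1)                  # mean PSD per radius

# radially symmetric test surface: the 1-D profile follows the radial law
ny = nx = 64
y, x = np.indices((ny, nx))
r = np.hypot(x - nx // 2, y - ny // 2)
psd2d = 1.0 / (1.0 + r ** 2)
profile = azimuthal_average(psd2d)
print(profile[0])  # 1.0 at zero spatial frequency
```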

  17. Applied and computational harmonic analysis on graphs and networks

    Science.gov (United States)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.

  18. Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation

    Directory of Open Access Journals (Sweden)

    Marlen Promann

    2015-03-01

    Full Text Available Libraries are implementing discovery layers to offer better user experiences. While usability tests have been helpful in evaluating the success or failure of implementing discovery layers in the library context, the focus has remained on their relative interface benefits over traditional federated search. The informal, site- and context-specific usability tests have offered little to test the rigor of discovery layers against the user goals, motivations and workflows they have been designed to support. This study proposes hierarchical task analysis (HTA) as an important complementary evaluation method to usability testing of discovery layers. Relevant literature is reviewed for discovery layers and the HTA method. As no previous application of HTA to the evaluation of discovery layers was found, this paper presents the application of HTA as an expert-based and workflow-centered (e.g. retrieving a relevant book or a journal article) method for evaluating discovery layers. Purdue University's Primo by Ex Libris was used to map eleven use cases as HTA charts. Nielsen's Goal Composition theory was used as an analytical framework to evaluate the goal charts from two perspectives: (a) users' physical interactions (i.e. clicks), and (b) users' cognitive steps (i.e. decision points for what to do next). A brief comparison of HTA and usability test findings is offered as a way of conclusion.

  19. Perturbation Method of Analysis Applied to Substitution Measurements of Buckling

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Rolf

    1966-11-15

    Calculations with two-group perturbation theory on substitution experiments with homogenized regions show that a condensation of the results into a one-group formula is possible, provided that a transition region is introduced in a proper way. In heterogeneous cores the transition region comes in as a consequence of a new cell concept. By making use of progressive substitutions the properties of the transition region can be regarded as fitting parameters in the evaluation procedure. The thickness of the region is approximately equal to the sum of 1/(1/τ + 1/L²)^(1/2) for the test and reference regions. Consequently a region where L² >> τ, e.g. D₂O, contributes with √τ to the thickness. In cores where τ >> L², e.g. H₂O assemblies, the thickness of the transition region is determined by L. Experiments on rod lattices in D₂O and on test regions of D₂O alone (where B² = -1/L²) are analysed. The lattice measurements, where the pitches differed by a factor of √2, gave excellent results, whereas the determination of the diffusion length in D₂O by this method was not quite successful. Even regions containing only one test element can be used in a meaningful way in the analysis.
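    The per-region thickness term quoted in this abstract, 1/(1/τ + 1/L²)^(1/2), and its limiting behaviors are easy to evaluate numerically. In the sketch below the Fermi age τ and diffusion area L² values are illustrative placeholders, not taken from the report:

```python
import math

def transition_thickness(tau, L2):
    """One region's contribution to the transition-region thickness:
    1 / sqrt(1/tau + 1/L^2), with tau the Fermi age and L2 the diffusion
    area (both in cm^2, illustrative units)."""
    return 1.0 / math.sqrt(1.0 / tau + 1.0 / L2)

# When L^2 >> tau (e.g. a D2O-like region) the term tends to sqrt(tau);
# when tau >> L^2 (e.g. an H2O-like assembly) it tends to L.
print(round(transition_thickness(125.0, 1.0e4), 2))  # close to sqrt(125) ~ 11.18
print(round(transition_thickness(1.0e4, 64.0), 2))   # close to sqrt(64) = 8
```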

  20. Soft tissue cephalometric analysis applied to Himachali ethnic population

    Directory of Open Access Journals (Sweden)

    Isha Aggarwal

    2016-01-01

    Full Text Available Introduction: Modern society considers facial attractiveness an important physical attribute. The great variance in the soft tissue drape of the human face complicates accurate assessment of the soft tissue profile, and it is a known fact that facial features of different ethnic groups differ significantly. This study was undertaken to establish norms for the Himachali ethnic population. Materials and Methods: The sample comprised lateral cephalograms taken in natural head position of 100 normal individuals (50 males, 50 females). The cephalograms were analyzed by Arnett soft tissue cephalometric analysis for orthodontic diagnosis and treatment planning. Student's t-test was used to compare the means of the two groups. Results: Statistically significant differences were found between Himachali males and females in certain key parameters. Males have thicker soft tissue structures and a more acute nasolabial angle than females. Males have longer faces, and females have a greater interlabial gap and maxillary incisor exposure. Males have more deep-set facial structures than females. Conclusions: Statistically significant differences were found between Himachali males and females in certain key parameters. Differences were also noted between other ethnic groups and Himachali faces.

  1. Applying revised gap analysis model in measuring hotel service quality.

    Science.gov (United States)

    Lee, Yu-Cheng; Wang, Yu-Che; Chien, Chih-Hung; Wu, Chia-Huei; Lu, Shu-Chiung; Tsai, Sang-Bing; Dong, Weiwei

    2016-01-01

    The number of tourists coming to Taiwan has grown by 10-20 % annually since 2010, driven by an increasing number of foreign visitors, particularly after deregulation first admitted tourist groups, and later individual tourists, from mainland China. The purpose of this study is to propose a revised gap model to evaluate and improve service quality in the Taiwanese hotel industry. Thus, service quality could be clearly measured through gap analysis, which was more effective for offering direction in developing and improving service quality. The HOLSERV instrument was used to identify and analyze service gaps from the perceptions of internal and external customers. The sample for this study included three main categories of respondents: tourists, employees, and managers. The results show that five gaps influenced tourists' evaluations of service quality. In particular, the study revealed that Gap 1 (management perceptions vs. customer expectations) and Gap 9 (service provider perceptions of management perceptions vs. service delivery) were more critical than the others in affecting perceived service quality, making service delivery the main area of improvement. This study contributes toward an evaluation of the service quality of the Taiwanese hotel industry from the perspectives of customers, service providers, and managers, which is considerably valuable for hotel managers. It was the aim of this study to explore all of these together in order to better understand the possible gaps in the hotel industry in Taiwan.

  2. Relativistic analysis of nuclear ground state densities at 135 to 200 ...

    Indian Academy of Sciences (India)

    Abstract. A relativistic analysis of p + 40Ca elastic scattering with different nuclear ground state target densities at 135 to 200 MeV is presented in this paper. It is found that the IGO densities are more consistent in reproducing the data over the energy range considered here. The reproduction of spin-rotation-function data ...

  3. Differentiating defects in red oak lumber by discriminant analysis using color, shape, and density

    Science.gov (United States)

    B. H. Bond; D. Earl Kline; Philip A. Araman

    2002-01-01

    Defect color, shape, and density measures aid in the differentiation of knots, bark pockets, stain/mineral streak, and clearwood in red oak, (Quercus rubra). Various color, shape, and density measures were extracted for defects present in color and X-ray images captured using a color line scan camera and an X-ray line scan detector. Analysis of variance was used to...

  4. Functional analysis of the cross-section form and X-ray density of human ulnae

    International Nuclear Information System (INIS)

    Hilgen, B.

    1981-01-01

    On 20 ulnae the form of the cross-sections and the distribution of the X-ray density were investigated at five different cross-section heights. The analysis of the cross-section forms was carried out using plane contraction figures; the X-ray density was established by means of the equidensity line method. (orig.)

  5. Statistical analysis and Kalman filtering applied to nuclear materials accountancy

    International Nuclear Information System (INIS)

    Annibal, P.S.

    1990-08-01

    Much theoretical research has been carried out on the development of statistical methods for nuclear material accountancy. In practice, physical, financial and time constraints mean that the techniques must be adapted to give an optimal performance in plant conditions. This thesis aims to bridge the gap between theory and practice, to show the benefits to be gained from a knowledge of the facility operation. Four different aspects are considered; firstly, the use of redundant measurements to reduce the error on the estimate of the mass of heavy metal in an 'accountancy tank' is investigated. Secondly, an analysis of the calibration data for the same tank is presented, establishing bounds for the error and suggesting a means of reducing them. Thirdly, a plant-specific method of producing an optimal statistic from the input, output and inventory data, to help decide between 'material loss' and 'no loss' hypotheses, is developed and compared with existing general techniques. Finally, an application of the Kalman Filter to materials accountancy is developed, to demonstrate the advantages of state-estimation techniques. The results of the analyses and comparisons illustrate the importance of taking into account a complete and accurate knowledge of the plant operation, measurement system, and calibration methods, to derive meaningful results from statistical tests on materials accountancy data, and to give a better understanding of critical random and systematic error sources. The analyses were carried out on the head-end of the Fast Reactor Reprocessing Plant, where fuel from the prototype fast reactor is cut up and dissolved. However, the techniques described are general in their application. (author)

  6. Improving the flash flood frequency analysis applying dendrogeomorphological evidences

    Science.gov (United States)

    Ruiz-Villanueva, V.; Ballesteros, J. A.; Bodoque, J. M.; Stoffel, M.; Bollschweiler, M.; Díez-Herrero, A.

    2009-09-01

    Flash floods are one of the natural hazards that cause major damage worldwide. Especially in Mediterranean areas they cause high economic losses every year. In mountain areas with high stream gradients, flood events are characterized by extremely high flow and debris transport rates. Flash flood analysis in mountain areas presents specific scientific challenges. On one hand, there is a lack of information on precipitation and discharge due to a lack of spatially well distributed gauge stations with long records. On the other hand, gauge stations may not record correctly during extreme events, when they are damaged or the discharge exceeds the recordable level. In such cases, no systematic data are available to improve the understanding of the spatial and temporal occurrence of the process. Since historical documentation is normally scarce or even completely missing in mountain areas, tree-ring analysis can provide an alternative approach. Flash floods may influence trees in different ways: (1) tilting of the stem through the unilateral pressure of the flowing mass or individual boulders; (2) root exposure through erosion of the banks; (3) injuries and scars caused by boulders and wood transported in the flow; (4) decapitation of the stem and resulting candelabra growth through the severe impact of boulders; (5) stem burial through deposition of material. The trees react to these disturbances with specific growth changes such as an abrupt change of the yearly increment and anatomical changes like reaction wood or callus tissue. In this study, we sampled 90 cross sections and 265 increment cores of trees heavily affected by past flash floods in order to date past events and to reconstruct recurrence intervals in two torrent channels located in the Spanish Central System. The first study site is located along the Pelayo River, a torrent in natural conditions. Based on the external disturbances of trees and their geomorphological position, 114 Pinus pinaster (Ait

  7. Spatial analysis of NDVI readings with difference sampling density

    Science.gov (United States)

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  8. Potential density and tree survival: an analysis based on South ...

    African Journals Online (AJOL)

    Finally, we present a tree survival analysis, based on the Weibull distribution function, for the Nelshoogte replicated CCT study, which has been observed for almost 40 years after planting and provides information about tree survival in response to planting espacements ranging from 494 to 2 965 trees per hectare.

  9. Applied Drama and the Higher Education Learning Spaces: A Reflective Analysis

    Science.gov (United States)

    Moyo, Cletus

    2015-01-01

    This paper explores Applied Drama as a teaching approach in Higher Education learning spaces. The exploration takes a reflective analysis approach by first examining the impact that Applied Drama has had on my career as a Lecturer/Educator/Teacher working in Higher Education environments. My engagement with Applied Drama practice and theory is…

  10. Exploratory Factor Analysis as a Construct Validation Tool: (Mis)applications in Applied Linguistics Research

    Science.gov (United States)

    Karami, Hossein

    2015-01-01

    Factor analysis has been frequently exploited in applied research to provide evidence about the underlying factors in various measurement instruments. A close inspection of a large number of studies published in leading applied linguistic journals shows that there is a misconception among applied linguists as to the relative merits of exploratory…

  11. Probability Density Components Analysis: A New Approach to Treatment and Classification of SAR Images

    Directory of Open Access Journals (Sweden)

    Osmar Abílio de Carvalho Júnior

    2014-04-01

    Full Text Available Speckle noise (salt and pepper) is inherent to synthetic aperture radar (SAR), causing a characteristic noise-like granular aspect that complicates image classification. In SAR image analysis, spatial information can be a particular benefit for denoising and for mapping classes characterized by a statistical distribution of pixel intensities from a complex and heterogeneous spectral response. This paper proposes Probability Density Components Analysis (PDCA), a new alternative that combines filtering and frequency histograms to improve the classification procedure for single-channel synthetic aperture radar (SAR) images. This method was tested on L-band SAR data from the Advanced Land Observation System (ALOS) Phased-Array Synthetic-Aperture Radar (PALSAR) sensor. The study area is located in the Brazilian Amazon rainforest, northern Rondônia State (municipality of Candeias do Jamari), containing forest and land-use patterns. The proposed algorithm uses a moving window over the image, estimating the probability density curve in different image components. Therefore, a single input image generates an output with multiple components. Initially the multi-component data should be treated by noise-reduction methods, such as maximum noise fraction (MNF) or noise-adjusted principal components (NAPC). Both methods enable noise reduction as well as the ordering of multi-component data in terms of image quality. In this paper, NAPC applied to the multi-components provided large reductions in noise levels, and color composites based on the first NAPCs enhance the classification of different surface features. In the spectral classification, the Spectral Correlation Mapper and Minimum Distance were used. The results obtained were similar to the visual interpretation of optical images from TM-Landsat and Google Maps.
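    The moving-window idea behind PDCA can be sketched as follows: for every pixel, a local frequency histogram is estimated over a small window, so a single-channel image becomes a multi-component stack with one component per histogram bin. Window size, bin edges and the random test image are assumptions for illustration; the actual algorithm and SAR data differ:

```python
import numpy as np

def pdca_components(img, win=3, bins=4, vmin=0.0, vmax=1.0):
    """For each pixel, estimate the local probability per histogram bin
    over a win x win window, producing one output component per bin."""
    pad = win // 2
    padded = np.pad(img, pad, mode="edge")
    edges = np.linspace(vmin, vmax, bins + 1)
    out = np.zeros(img.shape + (bins,))
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = padded[i:i + win, j:j + win]
            hist, _ = np.histogram(window, bins=edges)
            out[i, j] = hist / window.size       # local probability per bin
    return out

img = np.random.default_rng(0).random((8, 8))    # stand-in for a SAR tile
comps = pdca_components(img)
print(comps.shape)  # (8, 8, 4): one image per density component
```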

  12. Evaluation of optical and electronic properties of silicon nano-agglomerates embedded in SRO: applying density functional theory

    Science.gov (United States)

    Espinosa-Torres, Néstor D.; la Luz, David Hernández-de; Flores-Gracia, José Francisco J.; Luna-López, José A.; Martínez-Juárez, Javier; Vázquez-Valerdi, Diana E.

    2014-09-01

    In atomic-scale and nanoscale systems such as clusters or agglomerates constituted by a few to fewer than 100 atoms, quantum confinement effects are very important. Their optical and electronic properties are often dependent on the size of the systems and the way in which the atoms in these clusters are bonded. Generally, these nanostructures display optical and electronic properties significantly different from those found in the corresponding bulk materials. Silicon agglomerates embedded in silicon-rich oxide (SRO) films have optical properties which have been reported to depend directly on silicon nanocrystal size. Furthermore, the room-temperature photoluminescence (PL) of SRO has repeatedly generated huge interest due to its possible applications in optoelectronic devices. However, a plausible emission mechanism has not been widely accepted by the scientific community. In this work, we present a short review of the experimental results on silicon nanoclusters in SRO, considering different growth techniques. We focus mainly on their size, Raman spectra, and photoluminescence spectra. With this as background, we employed density functional theory with the B3LYP functional and a 6-31G* basis set to calculate the optical and electronic properties of silicon clusters (constituted by 15 to 20 silicon atoms). With the theoretical calculation of the structural and optical properties of silicon clusters, it is possible to evaluate the contribution of silicon agglomerates to the luminescent emission mechanism found experimentally in thin SRO films.

  13. Evaluation of optical and electronic properties of silicon nano-agglomerates embedded in SRO: applying density functional theory.

    Science.gov (United States)

    Espinosa-Torres, Néstor D; la Luz, David Hernández-de; Flores-Gracia, José Francisco J; Luna-López, José A; Martínez-Juárez, Javier; Vázquez-Valerdi, Diana E

    2014-01-01

    In atomic-scale and nanoscale systems such as clusters or agglomerates constituted by a few to fewer than 100 atoms, quantum confinement effects are very important. Their optical and electronic properties are often dependent on the size of the systems and the way in which the atoms in these clusters are bonded. Generally, these nanostructures display optical and electronic properties significantly different from those found in the corresponding bulk materials. Silicon agglomerates embedded in silicon-rich oxide (SRO) films have optical properties which have been reported to depend directly on silicon nanocrystal size. Furthermore, the room-temperature photoluminescence (PL) of SRO has repeatedly generated huge interest due to its possible applications in optoelectronic devices. However, a plausible emission mechanism has not been widely accepted by the scientific community. In this work, we present a short review of the experimental results on silicon nanoclusters in SRO, considering different growth techniques. We focus mainly on their size, Raman spectra, and photoluminescence spectra. With this as background, we employed density functional theory with the B3LYP functional and a 6-31G* basis set to calculate the optical and electronic properties of silicon clusters (constituted by 15 to 20 silicon atoms). With the theoretical calculation of the structural and optical properties of silicon clusters, it is possible to evaluate the contribution of silicon agglomerates to the luminescent emission mechanism found experimentally in thin SRO films.

  14. Stereoscopy of dust density waves under microgravity: Velocity distributions and phase-resolved single-particle analysis

    Energy Technology Data Exchange (ETDEWEB)

    Himpel, Michael, E-mail: himpel@physik.uni-greifswald.de; Killer, Carsten; Melzer, André [Institute of Physics, Ernst-Moritz-Arndt-University, 17489 Greifswald (Germany); Bockwoldt, Tim; Piel, Alexander [IEAP, Christian-Albrechts-Universität Kiel, D-24098 Kiel (Germany); Ole Menzel, Kristoffer [ABB Switzerland Ltd, Corporate Research Center, 5405 Dättwil (Switzerland)

    2014-03-15

    Experiments on dust-density waves have been performed in dusty plasmas under the microgravity conditions of parabolic flights. Three-dimensional measurements of a dust density wave on a single particle level are presented. The dust particles have been tracked for many oscillation periods. A Hilbert analysis is applied to obtain trajectory parameters such as oscillation amplitude and three-dimensional velocity amplitude. While the transverse motion is found to be thermal, the velocity distribution in wave propagation direction can be explained by harmonic oscillations with added Gaussian (thermal) noise. Additionally, it is shown that the wave properties can be reconstructed by means of a pseudo-stroboscopic approach. Finally, the energy dissipation mechanism from the kinetic oscillation energy to thermal motion is discussed and presented using phase-resolved analysis.
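    The Hilbert analysis used here to extract oscillation amplitudes can be sketched with a numpy-only analytic-signal construction: the magnitude of the analytic signal gives the instantaneous amplitude (envelope) of an oscillating coordinate. The synthetic 30 Hz sine below is a stand-in for a tracked dust-particle trajectory, not the authors' data:

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via FFT: zero the negative frequencies and double
    the positive ones (the standard Hilbert-transform construction)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:n // 2] = 2.0
    h[n // 2] = 1.0              # n is even in this example
    return np.fft.ifft(X * h)

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
x = 2.0 * np.sin(2.0 * np.pi * 30.0 * t)      # amplitude 2, 30 oscillations
envelope = np.abs(analytic_signal(x))          # instantaneous amplitude
print(round(float(np.median(envelope)), 3))    # recovers the amplitude, ~2.0
```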

  15. Sensitivity analysis of optimized nuclear energy density functional

    International Nuclear Information System (INIS)

    Mondal, C.; Agrawal, B.K.; De, J.N.; Samaddar, S.K.

    2016-01-01

    Because the exact nature of the nuclear force is unknown, the parameters of nuclear models have been optimized by fitting different kinds of data, e.g., properties of finite nuclei as well as of neutron stars. Since information about the exact correspondence between the parameters of a model and the fitted data is missing, a plethora of nuclear models has emerged. This information can be extracted by studying correlations between the different parameters and the fitted data in all combinations within the framework of covariance analysis. This not only minimizes the number of model parameters, but also helps to prevent the unnecessary addition of redundant data.

  16. Beyond Time Out and Table Time: Today's Applied Behavior Analysis for Students with Autism

    Science.gov (United States)

    Boutot, E. Amanda; Hume, Kara

    2010-01-01

    Recent mandates related to the implementation of evidence-based practices for individuals with autism spectrum disorder (ASD) require that autism professionals both understand and are able to implement practices based on the science of applied behavior analysis (ABA). The use of the term "applied behavior analysis" and its related concepts…

  18. Polycystic ovary syndrome: analysis of the global research architecture using density equalizing mapping.

    Science.gov (United States)

    Brüggmann, Dörthe; Berges, Lea; Klingelhöfer, Doris; Bauer, Jan; Bendels, Michael; Louwen, Frank; Jaque, Jenny; Groneberg, David A

    2017-06-01

    Polycystic ovary syndrome (PCOS) is the most common cause of female infertility worldwide. Although the related research output is constantly growing, no detailed global map of the scientific architecture has so far been created encompassing quantitative, qualitative, socioeconomic and gender aspects. We used the NewQIS platform to assess all PCOS-related publications indexed between 1900 and 2014 in the Web of Science, and applied density equalizing mapping projections, scientometric techniques and economic benchmarking procedures. A total of 6261 PCOS-specific publications and 703 international research collaborations were found. The USA was identified as the most active country in total and collaborative research activity. In the socioeconomic analysis, the USA was also ranked first (25.49 PCOS-related publications per gross domestic product [GDP]/capita), followed by the UK, Italy and Greece. When research activity was related to population size, Scandinavian countries and Greece were leading the field. For many highly productive countries, gender analysis revealed a high ratio of female scientists working on PCOS with the exception of Japan. In this study, we have created the first picture of global PCOS research, which largely differs from other gynaecologic conditions and indicates that most related research and collaborations originate from high-income countries. Copyright © 2017 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.

  19. Density functional theory-broken symmetry (DFT-BS) methodology applied to electronic and magnetic properties of bioinorganic prosthetic groups.

    Science.gov (United States)

    Mouesca, Jean-Marie

    2014-01-01

    The goal of this "how to" chapter is to present, as simply and practically as possible, some of the concepts, key issues, and practices behind the so-called broken-symmetry (BS) state, which is widely used within the density functional theory (DFT) community (for a very nice but thoughtful introduction to DFT (without equations!), read Perdew et al. (J Chem Theory Comput 5:902-908, 2009)) to compute energetic as well as spectroscopic properties of (poly-)radicals, bioinorganic clusters (especially those containing transition metal ions), etc. Such properties encompass exchange coupling constants J (molecular magnetism) but also, among other things, g-tensors and hyperfine coupling tensors A (from electron paramagnetic resonance), isomer shifts δ and quadrupolar tensors ΔEQ (from Mössbauer), etc. Hopefully, this chapter will appeal to those DFT practitioners who would like to understand the basics behind the BS state and help them "demystify" some of the issues involved with it. More technical issues will only be alluded to, and appropriate references will be given for those interested in going beyond this mere introduction. This chapter is not, however, a review of the field; consequently, it is primarily based on my own experience. The goal here (in the spirit of a "how to" chapter) is to accompany the reader's thoughts in a progressive way through increasingly complex issues rather than encumbering those thoughts with overly complicated mathematical details (the few derivations that are given will therefore be explicit). Moreover, I emphasize in this chapter the interplay between the computation of BS states on the one hand, and the derivation of phenomenological models on the other, whose parameters can be supplied from appropriate BS states. Finally, this chapter is dedicated to Louis Noodleman (Scripps Research Institute, CA, USA), pioneer (Noodleman, J Chem Phys 74:5737-5743, 1981; Noodleman, Chem Phys 109:131-143, 1986) and…

  20. Sensitivity analysis of crustal correction and its error propagation to upper mantle residual gravity and density anomalies

    DEFF Research Database (Denmark)

    Herceg, Matija; Artemieva, Irina; Thybo, Hans

    2013-01-01

    We investigate the effect of crustal structure heterogeneity, and of uncertainty in its determination, on the stripped gravity field. The analysis is based on interpretation of residual upper mantle gravity anomalies, which are calculated by subtracting (stripping) the gravitational effect of the crust… Uncertainties arise from (i) uncertainties in the velocity-density conversion and (ii) uncertainties in knowledge of the crustal structure (thickness and average Vp velocities of individual crustal layers, including the sedimentary cover). In this study, we address both sources of possible uncertainty by applying different conversions from velocity to density and by introducing variations into the crustal structure corresponding to the uncertainty of its resolution by high-quality and low-quality seismic models. We examine the propagation of these uncertainties into determinations of lithospheric mantle density. Given the relatively small range of expected density variations in the lithospheric mantle, knowledge of the uncertainties associated with incomplete knowledge of the density structure of the crust is of utmost importance for further progress in such studies… The residual…

  1. Applying Full Spectrum Analysis to a Raman Spectroscopic Assessment of Fracture Toughness of Human Cortical Bone.

    Science.gov (United States)

    Makowski, Alexander J; Granke, Mathilde; Ayala, Oscar D; Uppuganti, Sasidhar; Mahadevan-Jansen, Anita; Nyman, Jeffry S

    2017-10-01

    A decline in the inherent quality of bone tissue is a contributor to the age-related increase in fracture risk. Although this is well known, the important biochemical factors of bone quality have yet to be identified using Raman spectroscopy (RS), a nondestructive, inelastic light-scattering technique. To identify potential RS predictors of fracture risk, we applied principal component analysis (PCA) to 558 Raman spectra (370-1720 cm-1) of human cortical bone acquired from 62 female and male donors (nine spectra each) spanning adulthood (age range = 21-101 years). Spectra were analyzed prior to R-curve, nonlinear fracture mechanics that delineate crack initiation (Kinit) from crack growth toughness (Kgrow). The traditional ν1 phosphate peak per amide I peak (mineral-to-matrix ratio) weakly correlated with Kinit (r = 0.341, p = 0.0067) and overall crack growth toughness (J-int: r = 0.331, p = 0.0086). Sub-peak ratios of the amide I band that are related to the secondary structure of type 1 collagen did not correlate with the fracture toughness properties. In the full spectrum analysis, one principal component (PC5) correlated with all of the mechanical properties (Kinit: r = -0.467, Kgrow: r = -0.375, and J-int: r = -0.428; p < 0.05). When predictors of toughness, namely age and/or volumetric bone mineral density (vBMD), were included in general linear models as covariates, several PCs helped explain 45.0% (PC5) to 48.5% (PC7), 31.4% (PC6), and 25.8% (PC7) of the variance in Kinit, Kgrow, and J-int, respectively. Deriving spectral features from full spectrum analysis may improve the ability of RS, a clinically viable technology, to assess fracture risk.
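The full-spectrum PCA idea, correlating principal-component scores with a mechanical property, can be illustrated on synthetic data. The loading shape, noise levels, and "toughness" model below are invented for the sketch; they are not the study's data.

```python
import numpy as np

# Synthetic stand-in for full-spectrum PCA: spectra driven by one latent
# tissue factor plus noise; "toughness" depends on the same factor.
rng = np.random.default_rng(1)
n_spectra, n_bins = 60, 300
latent = rng.standard_normal(n_spectra)                  # hidden factor per donor
loading = np.sin(np.linspace(0.0, 3.0 * np.pi, n_bins))  # assumed spectral signature
spectra = np.outer(latent, loading) + 0.1 * rng.standard_normal((n_spectra, n_bins))
toughness = -0.5 * latent + 0.2 * rng.standard_normal(n_spectra)

# PCA via SVD of the mean-centred data matrix; PC scores = U * S.
X = spectra - spectra.mean(axis=0)
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S

# The first PC recovers the latent factor, so its scores correlate with
# the mechanical property (the sign of a PC is arbitrary, hence abs).
r = float(np.corrcoef(scores[:, 0], toughness)[0, 1])
print(abs(r) > 0.8)  # → True
```

In the study itself, the component scores are further entered into general linear models alongside covariates such as age and vBMD.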

  2. Analysis of the photophysical properties of zearalenone using density functional theory

    Science.gov (United States)

    The intrinsic photophysical properties of the resorcylic acid moiety of zearalenone offer a convenient label free method to determine zearalenone levels in contaminated agricultural products. Density functional theory and steady-state fluorescence methods were applied to investigate the role of stru...

  3. Unsupervised neural spike sorting for high-density microelectrode arrays with convolutive independent component analysis.

    Science.gov (United States)

    Leibig, Christian; Wachtler, Thomas; Zeck, Günther

    2016-09-15

    Unsupervised identification of action potentials in multi-channel extracellular recordings, in particular from high-density microelectrode arrays with thousands of sensors, is an unresolved problem. While independent component analysis (ICA) achieves rapid unsupervised sorting, it ignores the convolutive structure of extracellular data, thus limiting the unmixing to a subset of neurons. Here we present a spike sorting algorithm based on convolutive ICA (cICA) to retrieve a larger number of accurately sorted neurons than with instantaneous ICA while accounting for signal overlaps. Spike sorting was applied to datasets with varying signal-to-noise ratios (SNR: 3-12) and 27% spike overlaps, sampled at either 11.5 or 23 kHz on 4365 electrodes. We demonstrate how the instantaneity assumption in ICA-based algorithms has to be relaxed in order to improve the spike sorting performance for high-density microelectrode array recordings. Reformulating the convolutive mixture as an instantaneous mixture by modeling several delayed samples jointly is necessary to increase signal-to-noise ratio. Our results emphasize that different cICA algorithms are not equivalent. Spike sorting performance was assessed with ground-truth data generated from experimentally derived templates. The presented spike sorter was able to extract ≈90% of the true spike trains with an error rate below 2%. It was superior to two alternative (c)ICA methods (≈80% accurately sorted neurons) and comparable to a supervised sorting. Our new algorithm represents a fast solution to overcome the current bottleneck in spike sorting of large datasets generated by simultaneous recording with thousands of electrodes. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Introduction to applied statistical signal analysis guide to biomedical and electrical engineering applications

    CERN Document Server

    Shiavi, Richard

    2007-01-01

    Introduction to Applied Statistical Signal Analysis is designed for the experienced individual with a basic background in mathematics, science, and computing. With this background, the reader will coast through the practical introduction and move on to signal analysis techniques commonly used in a broad range of engineering areas such as biomedical engineering, communications, geophysics, and speech. Introduction to Applied Statistical Signal Analysis intertwines theory and implementation with practical examples and exercises. Topics presented in detail include: mathematical…

  5. Inclusive Elementary Classroom Teacher Knowledge of and Attitudes toward Applied Behavior Analysis and Autism Spectrum Disorder and Their Use of Applied Behavior Analysis

    Science.gov (United States)

    McCormick, Jennifer A.

    2011-01-01

    The purpose of this study was to examine inclusive elementary teacher knowledge and attitude toward Autism Spectrum Disorder (ASD) and applied behavior analysis (ABA) and their use of ABA. Furthermore, this study examined if knowledge and attitude predicted use of ABA. A survey was developed and administered through a web-based program. Of the…

  6. Sensitivity analysis of crustal correction for calculation of lithospheric mantle density from gravity data

    DEFF Research Database (Denmark)

    Herceg, Matija; Artemieva, Irina; Thybo, Hans

    2016-01-01

    We investigate how uncertainties in the seismic and density structure of the crust propagate to uncertainties in mantle density structure. The analysis is based on interpretation of residual upper-mantle gravity anomalies, which are calculated by subtracting (stripping) the gravitational effect of the crust from the observed satellite gravity field data (GOCE Direct release 3). The residual mantle gravity anomalies thus calculated are caused mainly by a heterogeneous density distribution in the upper mantle. Given a relatively small range of expected compositional density variations in the lithospheric mantle, knowledge of the uncertainties associated with incomplete information on crustal structure is of utmost importance for progress in gravity modelling. Uncertainties in the residual upper-mantle gravity anomalies result chiefly from uncertainties in (i) the seismic VP velocity-density conversion…

  7. A joint probability density function of wind speed and direction for wind energy analysis

    International Nuclear Information System (INIS)

    Carta, Jose A.; Ramirez, Penelope; Bueno, Celia

    2008-01-01

    A very flexible joint probability density function of wind speed and direction is presented in this paper for use in wind energy analysis. A method that enables angular-linear distributions to be obtained with specified marginal distributions has been used for this purpose. For the marginal distribution of wind speed we use a singly truncated from below Normal-Weibull mixture distribution. The marginal distribution of wind direction comprises a finite mixture of von Mises distributions. The proposed model is applied in this paper to wind direction and wind speed hourly data recorded at several weather stations located in the Canary Islands (Spain). The suitability of the distributions is judged from the coefficient of determination R 2 . The conclusions reached are that the joint distribution proposed in this paper: (a) can represent unimodal, bimodal and bitangential wind speed frequency distributions, (b) takes into account the frequency of null winds, (c) represents the wind direction regimes in zones with several modes or prevailing wind directions, (d) takes into account the correlation between wind speeds and its directions. It can therefore be used in several tasks involved in the evaluation process of the wind resources available at a potential site. We also conclude that, in the case of the Canary Islands, the proposed model provides better fits in all the cases analysed than those obtained with the models used in the specialised literature on wind energy
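One ingredient of the proposed joint model, a finite mixture of von Mises densities for wind direction, can be sketched as follows. The mixture weights and parameters below are illustrative, not fitted values from the Canary Islands data.

```python
import math

# Finite mixture of von Mises densities for wind direction (radians).
def von_mises_pdf(theta, mu, kappa):
    # I0(kappa) via the series expansion of the modified Bessel function.
    i0 = sum((kappa / 2.0) ** (2 * k) / math.factorial(k) ** 2 for k in range(50))
    return math.exp(kappa * math.cos(theta - mu)) / (2.0 * math.pi * i0)

def mixture_pdf(theta, weights, mus, kappas):
    return sum(w * von_mises_pdf(theta, m, k) for w, m, k in zip(weights, mus, kappas))

# Two prevailing wind directions (illustrative parameters).
weights, mus, kappas = [0.6, 0.4], [0.5, 3.5], [2.0, 5.0]

# Numerical check: the mixture integrates to 1 over the full circle.
n = 2000
total = sum(
    mixture_pdf(2.0 * math.pi * i / n, weights, mus, kappas) for i in range(n)
) * 2.0 * math.pi / n
print(round(total, 3))  # → 1.0
```

In the paper's construction, this angular marginal is coupled with a truncated Normal-Weibull mixture for speed to form the joint angular-linear density.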

  8. Comparative analysis of methods of cartographic representation of population density in lithuania

    OpenAIRE

    Šturaitė, Aurelija

    2016-01-01

    Residents are one of the conditions for a state's existence; therefore, analysis of demographic characteristics is important not only in a theoretical but also in a practical respect. Population density is one of the most important demographic characteristics, containing and reflecting the economic, social, political and sometimes cultural meaning of a region. Mapping of population density has dual objectives: on the one hand, representing the quantity and distribution of residents in the analyzed area; on the ot…

  9. Bulk density estimation using a 3-dimensional image acquisition and analysis system

    Directory of Open Access Journals (Sweden)

    Heyduk Adam

    2016-01-01

    The paper presents a concept of dynamic bulk density estimation of a particulate matter stream using a 3-d image analysis system and a conveyor belt scale. A method of image acquisition should be adjusted to the type of scale. The paper presents some laboratory results of static bulk density measurements using the MS Kinect time-of-flight camera and OpenCV/Matlab software. Measurements were made for several different size classes.

  10. Independent principal component analysis for simulation of soil water content and bulk density in a Canadian Watershed

    Directory of Open Access Journals (Sweden)

    Alaba Boluwade

    2016-09-01

    Accurate characterization of soil properties such as soil water content (SWC) and bulk density (BD) is vital for hydrologic processes; thus, it is important to estimate θ (water content) and ρ (soil bulk density) among other soil surface parameters involved in water retention and infiltration, runoff generation, water erosion, etc. Spatial estimation of these soil properties is important in guiding agricultural management decisions. These soil properties vary in both space and time and are correlated. It is therefore important to find an efficient and robust technique to simulate spatially correlated variables. Methods such as principal component analysis (PCA) and independent component analysis (ICA) can be used for the joint simulation of spatially correlated variables, but they are not without their flaws. This study applied a variant of PCA called independent principal component analysis (IPCA), which combines the strengths of both PCA and ICA, for spatial simulation of SWC and BD using the soil data set from an 11 km2 Castor watershed in southern Quebec, Canada. Diagnostic checks using the histograms and cumulative distribution functions (cdf) of both raw and back-transformed simulations show good agreement. The results from this study therefore have potential in the characterization of water content variability and bulk density variation for precision agriculture.

  11. Non-regularized inversion method from light scattering applied to ferrofluid magnetization curves for magnetic size distribution analysis

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2014-01-01

    A numerical inversion method known from the analysis of light scattering by colloidal dispersions is now applied to magnetization curves of ferrofluids. The distribution of magnetic particle sizes or dipole moments is determined without assuming that the distribution is unimodal or of a particular shape. The inversion method enforces positive number densities via a non-negative least squares procedure. It is tested successfully on experimental and simulated data for ferrofluid samples with known multimodal size distributions. The created computer program MINORIM is made available on the web. - Highlights: • A method from light scattering is applied to analyze ferrofluid magnetization curves. • A magnetic size distribution is obtained without prior assumption of its shape. • The method is tested successfully on ferrofluids with a known size distribution. • The practical limits of the method are explored with simulated data including noise. • This method is implemented in the program MINORIM, freely available online
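The core of such a non-regularized inversion, solving for non-negative number densities by non-negative least squares, can be sketched with SciPy. The saturating basis functions below are a stand-in; the actual ferrofluid kernel would be the Langevin magnetization response, and all values are synthetic.

```python
import numpy as np
from scipy.optimize import nnls

# Represent a measured curve as a linear combination of basis responses
# from discrete particle-size classes, and solve for the non-negative
# number density of each class with non-negative least squares.
x = np.linspace(0.1, 10.0, 200)            # field-like variable (arbitrary units)
sizes = np.linspace(1.0, 5.0, 20)          # candidate size classes (assumed)
basis = np.array([x * s / (1.0 + x * s) for s in sizes]).T  # saturating responses

true_density = np.zeros(sizes.size)
true_density[[4, 14]] = [1.0, 0.5]         # known bimodal distribution (synthetic)
signal = basis @ true_density

# NNLS enforces positivity without assuming a unimodal or parametric shape.
recovered, residual = nnls(basis, signal)
print(residual < 1e-6, bool(recovered.min() >= 0.0))  # → True True
```

Because no particular distribution shape is imposed, the recovered vector can reproduce multimodal size distributions directly from the data.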

  12. Applying transactional analysis and personality assessment to improve patient counseling and communication skills.

    Science.gov (United States)

    Lawrence, Lesa

    2007-08-15

    To teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling to improve communication. A lecture series for a required pharmacy communications class was developed to teach pharmacy students how to apply transactional analysis and personality assessment to patient counseling. Students were asked to apply these techniques and to report their experiences. A personality self-assessment was also conducted. After attending the lecture series, students were able to apply the techniques and demonstrated an understanding of the psychological factors that may affect patient communication, an appreciation for the diversity created by different personality types, the ability to engage patients based on adult-to-adult interaction cues, and the ability to adapt the interactive patient counseling model to different personality traits. Students gained a greater awareness of transactional analysis and personality assessment by applying these concepts. This understanding will help students communicate more effectively with patients.

  13. Tapped density optimisation for four agricultural wastes - Part II: Performance analysis and Taguchi-Pareto

    Directory of Open Access Journals (Sweden)

    Ajibade Oluwaseyi Ayodele

    2016-01-01

    In this second part of our discussion of tapped density optimisation for four agricultural wastes (particles of coconut, periwinkle, palm kernel and egg shells), a performance analysis is made for comparative purposes. This paper pioneers a study direction in which optimisation of process variables is pursued using the Taguchi method integrated with the Pareto 80-20 rule. Negative percentage improvements resulted when the optimal tapped density was compared with the average tapped density. However, the performance analysis between the optimal tapped density and the peak tapped density values yielded positive percentage improvements for the four filler particles. The performance analysis results validate the effectiveness of using the Taguchi method in improving the tapped density properties of the filler particles. Applying the Pareto 80-20 rule to the table of parameters and levels produced revised tables that helped to identify, for each parameter, the factor levels that are economical for optimality. The Pareto 80-20 rule also produced revised S/N response tables, which identify the S/N ratios relevant to optimality.
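For a larger-the-better response such as tapped density, the Taguchi signal-to-noise ratio compared across trials can be sketched as follows; the trial values are invented for illustration.

```python
import math

# Larger-the-better Taguchi S/N ratio: SN = -10 log10( mean(1 / y^2) ).
# Higher S/N indicates a factor-level combination closer to the optimum.
def sn_larger_is_better(values):
    return -10.0 * math.log10(sum(1.0 / v**2 for v in values) / len(values))

trial_a = [0.82, 0.85, 0.84]   # tapped densities, g/cm^3 (made-up trial)
trial_b = [0.74, 0.80, 0.77]   # made-up trial with lower densities

# The trial with consistently higher tapped density scores a higher S/N.
print(sn_larger_is_better(trial_a) > sn_larger_is_better(trial_b))  # → True
```

An S/N response table then averages such ratios over each factor level to rank the parameters, which is where the Pareto 80-20 screening is applied.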

  14. A Statistical Analysis for Estimating Fish Number Density with the Use of a Multibeam Echosounder

    Science.gov (United States)

    Schroth-Miller, Madeline L.

    Fish number density can be estimated from the normalized second moment of acoustic backscatter intensity [Denbigh et al., J. Acoust. Soc. Am. 90, 457-469 (1991)]. This method assumes that the distribution of fish scattering amplitudes is known and that the fish are randomly distributed following a Poisson volume distribution within regions of constant density. It is most useful at low fish densities, relative to the resolution of the acoustic device being used, since the estimators quickly become noisy as the number of fish per resolution cell increases. New models that include noise contributions are considered. The methods were applied to an acoustic assessment of juvenile Atlantic Bluefin Tuna, Thunnus thynnus. The data were collected using a 400 kHz multibeam echo sounder during the summer months of 2009 in Cape Cod, MA. Due to the high resolution of the multibeam system used, the large size (approx. 1.5 m) of the tuna, and the spacing of the fish in the school, we expect there to be low fish densities relative to the resolution of the multibeam system. Results of the fish number density based on the normalized second moment of acoustic intensity are compared to fish packing density estimated using aerial imagery that was collected simultaneously.
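The normalized-second-moment estimator referred to above can be illustrated by simulation: for unit-amplitude scatterers whose number per resolution cell is Poisson with mean n, the normalized second moment of intensity is M = ⟨I²⟩/⟨I⟩² = 2 + 1/n, so n ≈ 1/(M − 2). The simulation below is a minimal sketch under those assumptions, not the paper's processing chain.

```python
import numpy as np

# Simulate echoes: each resolution cell sums a Poisson number of
# unit-amplitude scatterers with random phases; intensity is |sum|^2.
rng = np.random.default_rng(2)
n_true, cells = 3.0, 500_000

counts = rng.poisson(n_true, cells)                  # scatterers per cell
cell_id = np.repeat(np.arange(cells), counts)
phases = rng.uniform(0.0, 2.0 * np.pi, counts.sum())
re = np.bincount(cell_id, weights=np.cos(phases), minlength=cells)
im = np.bincount(cell_id, weights=np.sin(phases), minlength=cells)
intensity = re**2 + im**2

# Normalized second moment and the density estimate n = 1 / (M - 2).
M = float(np.mean(intensity**2) / np.mean(intensity) ** 2)
n_est = 1.0 / (M - 2.0)
print(round(n_est, 1))   # typically close to n_true = 3.0
```

The estimator's noisiness at higher densities is visible here too: as n grows, M approaches 2 and the estimate 1/(M − 2) becomes increasingly sensitive to sampling error.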

  15. Model Proposition for the Fiscal Policies Analysis Applied in Economic Field

    Directory of Open Access Journals (Sweden)

    Larisa Preda

    2007-05-01

    This paper presents a study of fiscal policy as applied to economic development. Correlations between macroeconomic and fiscal indicators constitute the first step in our analysis. The next step is the proposal of a new model for fiscal and budgetary choices. This model is applied to the data of the Romanian case.

  16. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  17. Advantages and Drawbacks of Applying Periodic Time-Variant Modal Analysis to Spur Gear Dynamics

    DEFF Research Database (Denmark)

    Pedersen, Rune; Santos, Ilmar; Hede, Ivan Arthur

    2010-01-01

    …to ensure sufficient accuracy of the results. The method of time-variant modal analysis is applied, and the changes in the fundamental and the parametric resonance frequencies as a function of the rotational speed of the gears are found. By obtaining the stationary and parametric parts of the time… …of applying the methodology to wind turbine gearboxes are addressed and elucidated.

  18. Applied behavior analysis: understanding and changing behavior in the community-a representative review.

    Science.gov (United States)

    Luyben, Paul D

    2009-01-01

    Applied behavior analysis, a psychological discipline, has been characterized as the science of behavior change (Chance, 2006). Research in applied behavior analysis has been published for approximately 40 years since the initial publication of the Journal of Applied Behavior Analysis in 1968. The field now encompasses a wide range of human behavior. Although much of the published research centers on problem behaviors that occur in schools and among people with disabilities, a substantial body of knowledge has emerged in community settings. This article provides a review of the behavioral community research published in the Journal of Applied Behavior Analysis as representative of this work, including research in the areas of home and family, health, safety, community involvement and the environment, recreation and sports, crime and delinquency, and organizations. In the interest of space, research in schools and with people with disabilities has been excluded from this review.

  19. Applying ABC Analysis to the Navy’s Inventory Management System

    Science.gov (United States)

    2014-09-01

    Master's thesis: Applying ABC Analysis to the Navy's Inventory Management System, by Benjamin May, Lieutenant. Keywords: Parts, Navy Enterprise Resource Planning, ABC Inventory Classification. 101 pages. Approved for public release; distribution is unlimited.

  20. Electron density analysis of 1-butyl-3-methylimidazolium chloride ionic liquid.

    Science.gov (United States)

    del Olmo, Lourdes; Morera-Boado, Cercis; López, Rafael; García de la Vega, José M

    2014-06-01

    An analysis of the electron density of different conformers of the 1-butyl-3-methylimidazolium chloride (bmimCl) ionic liquid has been obtained using DFT with the BVP86 density functional, within the frameworks of Bader's atoms in molecules (AIM), the localized orbital locator (LOL), natural bond orbitals (NBO), and deformed atoms in molecules (DAM). We also present an analysis of the reduced density gradients, which delivers the non-covalent interaction regions and allows the nature of the intermolecular interactions to be understood. The most polar conformer can be characterized as ionic by the AIM, LOL, and DAM methods, while the most stable and least polar conformer shows shared-type interactions. The NBO method makes it possible to understand what causes the stabilization of the most stable conformer, based on analysis of the second-order perturbative energy and the charge transferred among the natural orbitals involved in the interaction.

  1. Breast Density and Risk of Breast Cancer in Asian Women: A Meta-analysis of Observational Studies.

    Science.gov (United States)

    Bae, Jong-Myon; Kim, Eun Hee

    2016-11-01

    The established theory that breast density is an independent predictor of breast cancer risk is based on studies targeting white women in the West. More Asian women than Western women have dense breasts, but the incidence of breast cancer is lower among Asian women. This meta-analysis investigated the association between breast density in mammography and breast cancer risk in Asian women. PubMed and Scopus were searched, and the final date of publication was set as December 31, 2015. The effect size in each article was calculated using the interval-collapse method. Summary effect sizes (sESs) and 95% confidence intervals (CIs) were calculated by conducting a meta-analysis applying a random effect model. To investigate the dose-response relationship, random effect dose-response meta-regression (RE-DRMR) was conducted. Six analytical epidemiology studies in total were selected, including one cohort study and five case-control studies. A total of 17 datasets were constructed by type of breast density index and menopausal status. In analyzing the subgroups of premenopausal vs. postmenopausal women, the percent density (PD) index was confirmed to be associated with a significantly elevated risk for breast cancer (sES, 2.21; 95% CI, 1.52 to 3.21; I² = 50.0%). The RE-DRMR results showed that the risk of breast cancer increased 1.73 times for each 25% increase in PD in postmenopausal women (95% CI, 1.20 to 2.47). In Asian women, breast cancer risk increased with breast density measured using the PD index, regardless of menopausal status. We propose the further development of a breast cancer risk prediction model based on the application of PD in Asian women.
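Random-effects pooling of the kind applied here is commonly done with the DerSimonian-Laird estimator; the sketch below illustrates the mechanics on invented study data, not the six studies of this meta-analysis.

```python
import math

# DerSimonian-Laird random-effects pooling on log effect sizes. The three
# "studies" below are made up for illustration.
def random_effects(log_es, variances):
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, log_es)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_es))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_es) - 1)) / c)         # between-study variance
    w_re = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_es)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))

es, lo, hi = random_effects(
    log_es=[math.log(1.2), math.log(3.5), math.log(2.0)],
    variances=[0.02, 0.02, 0.02],
)
print(round(es, 2), round(lo, 2), round(hi, 2))  # → 2.03 1.11 3.73
```

Heterogeneous inputs inflate tau², which widens the pooled confidence interval, the behaviour the abstract's I² = 50.0% is summarizing.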

  2. Analysis of average density difference effect in a new two-lane lattice model

    Science.gov (United States)

    Zhang, Geng; Sun, Di-Hua; Zhao, Min; Liu, Wei-Ning; Cheng, Sen-Lin

    2015-11-01

    A new lattice model is proposed by taking the average density difference effect into account for two-lane traffic system according to Transportation Cyber-physical Systems. The influence of average density difference effect on the stability of traffic flow is investigated through linear stability theory and nonlinear reductive perturbation method. The linear analysis results reveal that the unstable region would be reduced by considering the average density difference effect. The nonlinear kink-antikink soliton solution derived from the mKdV equation is analyzed to describe the properties of traffic jamming transition near the critical point. Numerical simulations confirm the analytical results showing that traffic jam can be suppressed efficiently by considering the average density difference effect for two-lane traffic system.

  3. Genetic analysis of male reproductive success in relation to density in the zebrafish, Danio rerio

    Directory of Open Access Journals (Sweden)

    Jordan William C

    2006-04-01

    Background: We used behavioural and genetic data to investigate the effects of density on male reproductive success in the zebrafish, Danio rerio. Based on previous measurements of aggression and courtship behaviour by territorial males, we predicted that they would sire more offspring than non-territorial males. Results: Microsatellite analysis of paternity showed that at low densities territorial males had higher reproductive success than non-territorial males. However, at high density territorial males were no more successful than non-territorials, and the sex difference in the opportunity for sexual selection, based on the parameter Imates, was low. Conclusion: Male zebrafish exhibit two distinct mating tactics: territoriality and active pursuit of females. Male reproductive success is density dependent and the opportunity for sexual selection appears to be weak in this species.

  4. Error Analysis of a Fractional Time-Stepping Technique for Incompressible Flows with Variable Density

    KAUST Repository

    Guermond, J.-L.

    2011-01-01

    In this paper we analyze the convergence properties of a new fractional time-stepping technique for the solution of the variable density incompressible Navier-Stokes equations. The main feature of this method is that, contrary to other existing algorithms, the pressure is determined by just solving one Poisson equation per time step. First-order error estimates are proved, and stability of a formally second-order variant of the method is established. © 2011 Society for Industrial and Applied Mathematics.
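
    The variable-density scheme itself is beyond a short example, but its central idea, recovering an incompressible velocity with a single Poisson solve per step, can be sketched with a generic constant-density pressure projection on a periodic grid. The discretization choices and names below are illustrative, not the paper's algorithm:

```python
import numpy as np

def pressure_projection(u, v, dx):
    """Make (u, v) discretely divergence-free with ONE Poisson solve.

    Periodic grid, central differences for div/grad, spectral Poisson
    solve using modified wavenumbers consistent with central differences.
    A generic projection sketch, not the paper's variable-density scheme.
    """
    n = u.shape[0]

    def ddx(f, axis):
        return (np.roll(f, -1, axis) - np.roll(f, 1, axis)) / (2 * dx)

    div = ddx(u, 0) + ddx(v, 1)                 # divergence of the predictor
    k = 2 * np.pi * np.fft.fftfreq(n, d=dx)
    km = np.sin(k * dx) / dx                    # modified wavenumber of ddx
    km2 = km[:, None] ** 2 + km[None, :] ** 2
    km2[0, 0] = 1.0                             # avoid division by zero
    p_hat = -np.fft.fft2(div) / km2             # solve  lap(p) = div
    p_hat[0, 0] = 0.0                           # pressure mean is arbitrary
    p = np.real(np.fft.ifft2(p_hat))
    return u - ddx(p, 0), v - ddx(p, 1)         # subtract grad(p)
```

    Projecting a random velocity field drives its discrete divergence to machine precision, which illustrates why one Poisson solve per time step suffices to enforce incompressibility.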

  5. Development of the female voxel phantom, NAOMI, and its application to calculations of induced current densities and electric fields from applied low frequency magnetic and electric fields

    International Nuclear Information System (INIS)

    Dimbylow, Peter

    2005-01-01

    This paper outlines the development of a 2 mm resolution voxel model, NAOMI (aNAtOMIcal model), designed to be representative of the average adult female. The primary medical imaging data were derived from a high-resolution MRI scan of a 1.65 m tall, 23 year old female subject with a mass of 58 kg. The model was rescaled to a height of 1.63 m and a mass of 60 kg, the dimensions of the International Commission on Radiological Protection reference adult female. There are 41 tissue types in the model. The application of NAOMI to the calculations of induced current densities and electric fields from applied low frequency magnetic and electric fields is described. Comparisons are made with values from the male voxel model, NORMAN. The calculations were extended from 50 Hz up to 10 MHz. External field reference levels are compared with the ICNIRP guidelines

  6. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    International Nuclear Information System (INIS)

    Reimund, Kevin K.

    2015-01-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and river water. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1 + √(w⁻¹)), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the "maximum power density operating pressure" requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.
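
    The closed-form optimum above invites a quick numerical check; the sketch below (illustrative numbers, not values from the paper) confirms that the energy-optimal pressure π/(1 + √(w⁻¹)) falls below the power-optimal Δπ/2:

```python
import math

def optimal_pressure(pi_draw, w):
    """Idealized energy-density-maximizing pressure from the abstract:
    P* = pi / (1 + sqrt(1/w)), w = water mass fraction of the draw."""
    return pi_draw / (1.0 + math.sqrt(1.0 / w))

pi_draw = 28.0    # bar, assumed seawater-like osmotic pressure
w = 0.965         # assumed water mass fraction of the draw solution
p_star = optimal_pressure(pi_draw, w)   # ~13.9 bar
p_power = pi_draw / 2.0                 # Delta-pi/2 with a near-fresh feed
```

    Because √(w⁻¹) > 1 for any w < 1, the denominator exceeds 2, so the energy-optimal pressure always sits below Δπ/2, consistent with the abstract.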

  7. Thermodynamic analysis of energy density in pressure retarded osmosis: The impact of solution volumes and costs

    Energy Technology Data Exchange (ETDEWEB)

    Reimund, Kevin K. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; McCutcheon, Jeffrey R. [Univ. of Connecticut, Storrs, CT (United States). Dept. of Chemical and Biomolecular Engineering; Wilson, Aaron D. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-08-01

    A general method was developed for estimating the volumetric energy efficiency of pressure retarded osmosis via pressure-volume analysis of a membrane process. The resulting model requires only the osmotic pressure, π, and mass fraction, w, of water in the concentrated and dilute feed solutions to estimate the maximum achievable specific energy density, u, as a function of operating pressure. The model is independent of any membrane or module properties. This method utilizes equilibrium analysis to specify the volumetric mixing fraction of concentrated and dilute solution as a function of operating pressure, and provides results for the total volumetric energy density of similar order to more complex models for the mixing of seawater and river water. Within the framework of this analysis, the total volumetric energy density is maximized, for an idealized case, when the operating pressure is π/(1 + √(w⁻¹)), which is lower than the maximum power density operating pressure, Δπ/2, derived elsewhere, and is a function of the solute osmotic pressure at a given mass fraction. It was also found that a minimum of 1.45 kmol of ideal solute is required to produce 1 kWh of energy, while a system operating at the “maximum power density operating pressure” requires at least 2.9 kmol. Utilizing this methodology, it is possible to examine the effects of volumetric solution cost, operation of a module at various pressures, and operation of a constant-pressure module with various feeds.

  8. Critical Analysis of a Website: A Critique based on Critical Applied Linguistics and Critical Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Rina Agustina

    2013-05-01

    E-learning resources are easily found by browsing the internet; many are free of charge and provide various learning materials. Spellingcity.com was one such e-learning website for teaching and learning English spelling, vocabulary and writing, which offered various games and activities for young learners, 6 to 8 year old learners in particular. This paper aimed to analyse the website from two different views: (1) critical applied linguistics (CAL) and (2) critical discourse analysis (CDA). After analysing the website using CAL and CDA, it was found that the website was adequate for beginners, in that it provided fun learning through games as well as challenging learners to test their vocabulary. Despite these strengths, several issues required further thought in terms of learners' broader knowledge; for example, some of the learning materials focused on states in America, which is quite difficult for EFL learners who lack adequate general knowledge. Thus, the findings imply that the website could be used as supporting learning material, accompanying textbooks and vocabulary exercise books.

  9. A review of the technology and process on integrated circuits failure analysis applied in communications products

    Science.gov (United States)

    Ming, Zhimao; Ling, Xiaodong; Bai, Xiaoshu; Zong, Bo

    2016-02-01

    The failure analysis of integrated circuits plays a very important role in improving the reliability of communications products. This paper mainly introduces the failure analysis technologies and process for integrated circuits used in communications products. The many technologies available for failure analysis include optical microscopy, infrared microscopy, acoustic microscopy, liquid crystal hot-spot detection, microanalysis, electrical measurement, microprobing, chemical etching and ion etching. Integrated circuit failure analysis depends on accurate confirmation and analysis of the chip failure mode, the search for the root failure cause, the summary of the failure mechanism and the implementation of improvement measures. Through failure analysis, the reliability of integrated circuits and the product yield can be improved.

  10. [Particle Size and Number Density Online Analysis for Particle Suspension with Polarization-Differentiation Elastic Light Scattering Spectroscopy].

    Science.gov (United States)

    Chen, Wei-kang; Fang, Hui

    2016-03-01

    The basic principle of techniques based on polarization-differentiation elastic light scattering spectroscopy is that, under linearly polarized incident light, singly scattered light from superficial biological tissue and diffusively scattered light from deep tissue can be separated according to their different polarization characteristics. The novelty of this paper is to apply this method to the detection of particle suspensions, realizing simultaneous measurement of particle size and number density in the natural state. We designed and built a coaxial cage optical system and measured the backscatter signal at a specified angle from a polystyrene microsphere suspension. By controlling the polarization direction of the incident light with a linear polarizer and adjusting the polarization direction of the collected light with another linear polarizer, we obtained the parallel-polarized and cross-polarized elastic light scattering spectra. The difference between the two is the differential polarized elastic light scattering spectrum, which includes only the single-scattering information of the particles. We compared this spectrum to the Mie scattering calculation and extracted the particle size. We then analyzed the cross-polarized spectrum using the particle size already extracted, based on approximate expressions that take account of light diffusion, from which we obtained the number density of the particle suspension. We compared our experimental outcomes with the manufacturer-provided values and further analyzed the influence of the particle diameter standard deviation on the number density extraction, thereby verifying the experimental method. Potential applications of the method include on-line particle quality monitoring for particle manufacture as well as fat and protein density detection in milk products.

  11. Quantitative determination of alveolar bone density using digital image analysis of microradiographs

    International Nuclear Information System (INIS)

    Jaeger, A.; Radlanski, R.J.; Taufall, D.; Klein, C.; Steinhoefel, N.; Doeler, W.

    1990-01-01

    Horizontal 100 μm ground sections of 20 alveolar bone specimens from adult human mandibles obtained from autopsies were prepared for microradiography. Quantitative analysis of bone density of the alveolar cortex was performed using a semiautomatic digital image analysis system (KONTRON). The results demonstrated that bone density was higher in the lingual than in the labial alveolar cortex (p < 0.05). In addition, the coronal portion of alveolar cortical bone was significantly more porous than the medial and apical ones (p < 0.05). These variances were primarily due to increased canal size rather than to an increased number of canals. No significant age dependent changes in bone density could be determined. (author)

  12. Applied Behavior Analysis: Its Impact on the Treatment of Mentally Retarded Emotionally Disturbed People.

    Science.gov (United States)

    Matson, Johnny L.; Coe, David A.

    1992-01-01

    This article reviews applications of the applied behavior analysis ideas of B. F. Skinner and others to persons with both mental retardation and emotional disturbance. The review examines implications of behavior analysis for operant conditioning and radical behaviorism, schedules of reinforcement, and emotion and mental illness. (DB)

  13. Improving Skill Development: An Exploratory Study Comparing a Philosophical and an Applied Ethical Analysis Technique

    Science.gov (United States)

    Al-Saggaf, Yeslam; Burmeister, Oliver K.

    2012-01-01

    This exploratory study compares and contrasts two types of critical thinking techniques; one is a philosophical and the other an applied ethical analysis technique. The two techniques analyse an ethically challenging situation involving ICT that a recent media article raised to demonstrate their ability to develop the ethical analysis skills of…

  14. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  15. Sociosexuality Education for Persons with Autism Spectrum Disorders Using Principles of Applied Behavior Analysis

    Science.gov (United States)

    Wolfe, Pamela S.; Condo, Bethany; Hardaway, Emily

    2009-01-01

    Applied behavior analysis (ABA) has emerged as one of the most effective empirically based strategies for instructing individuals with autism spectrum disorders (ASD). Four ABA-based strategies that have been found effective are video modeling, visual strategies, social script fading, and task analysis. Individuals with ASD often struggle with…

  16. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  17. Ethnic density effects for adult mental health: systematic review and meta-analysis of international studies.

    Science.gov (United States)

    Bécares, Laia; Dewey, Michael E; Das-Munshi, Jayati

    2017-12-14

    Despite increased ethnic diversity in more economically developed countries, it is unclear whether residential concentration of ethnic minority people (ethnic density) is detrimental or protective for mental health. This is the first systematic review and meta-analysis covering the international literature, assessing ethnic density associations with mental health outcomes. We systematically searched Medline, PsycINFO, Sociological Abstracts, Web of Science from inception to 31 March 2016. We obtained additional data from study authors. We conducted random-effects meta-analysis taking into account clustering of estimates within datasets. Meta-regression assessed heterogeneity in studies due to ethnicity, country, generation, and area-level deprivation. Our main exposure was ethnic density, defined as the residential concentration of own racial/ethnic minority group. Outcomes included depression, anxiety and the common mental disorders (CMD), suicide, suicidality, psychotic experiences, and psychosis. We included 41 studies in the review, with meta-analysis of 12 studies. In the meta-analyses, we found a large reduction in relative odds of psychotic experiences [odds ratio (OR) 0.82 (95% confidence interval (CI) 0.76-0.89)] and suicidal ideation [OR 0.88 (95% CI 0.79-0.98)] for each 10 percentage-point increase in own ethnic density. For CMD, depression, and anxiety, associations were indicative of protective effects of own ethnic density; however, results were not statistically significant. Findings from narrative review were consistent with those of the meta-analysis. The findings support consistent protective ethnic density associations across countries and racial/ethnic minority populations as well as mental health outcomes. This may suggest the importance of the social environment in patterning detrimental mental health outcomes in marginalized and excluded population groups.
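
    A random-effects pooling of odds ratios of the kind reported above can be sketched with the DerSimonian-Laird estimator; the per-study numbers below are hypothetical, not the review's data:

```python
import math

def dersimonian_laird(y, v):
    """Random-effects pooled estimate (DerSimonian-Laird) of effect sizes y
    with within-study variances v. Returns (pooled estimate, standard error)."""
    w = [1.0 / vi for vi in v]
    fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)       # between-study variance
    w_re = [1.0 / (vi + tau2) for vi in v]
    pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    return pooled, math.sqrt(1.0 / sum(w_re))

# hypothetical per-study ORs for a 10-point increase in own ethnic density
ors = [0.82, 0.76, 0.91, 0.85]
log_or = [math.log(o) for o in ors]
var = [0.01, 0.02, 0.015, 0.03]       # assumed within-study variances
est, se = dersimonian_laird(log_or, var)
pooled_or = math.exp(est)             # pooled OR below 1 -> protective
```

    Pooling is done on the log-odds scale and exponentiated at the end, the usual convention for odds ratios.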

  18. Density of States FFA analysis of SU(3) lattice gauge theory at a finite density of color sources

    Directory of Open Access Journals (Sweden)

    Mario Giuliani

    2017-10-01

    We present a Density of States calculation with the Functional Fit Approach (DoS FFA) in SU(3) lattice gauge theory with a finite density of static color sources. The DoS FFA uses a parameterized density of states and determines the parameters of the density by fitting data from restricted Monte Carlo simulations with an analytically known function. We discuss the implementation of DoS FFA and the results for a qualitative picture of the phase diagram in a model which is a further step towards implementing DoS FFA in full QCD. We determine the curvature κ in the μ–T phase diagram and find a value close to the results published for full QCD.

  19. Genetic determinant of trabecular bone score (TBS) and bone mineral density: A bivariate analysis.

    Science.gov (United States)

    Ho-Pham, Lan T; Hans, Didier; Doan, Minh C; Mai, Linh D; Nguyen, Tuan V

    2016-11-01

    This study sought to estimate the extent of genetic influence on the variation in trabecular bone score (TBS). We found that genetic factors accounted for ~45% of the variance in TBS, and that the co-variation between TBS and bone density is partially determined by genetic factors. Trabecular bone score has emerged as an important predictor of fragility fracture, but the factors underlying individual differences in TBS have not been explored. In this study, we sought to determine the genetic contribution to the variation of TBS in the general population. The study included 556 women and 189 men from 265 families, with a mean age of 53 years (SD 11). We measured lumbar spine bone mineral density (LSBMD; Hologic Horizon) and then derived the TBS from the same Hologic scan used to derive BMD. A biometric model was applied to the data to partition the variance of TBS into two components: one due to additive genetic factors, and one due to environmental factors. The index of heritability was estimated as the ratio of genetic variance to the total variance of a trait. Bivariate genetic analysis was conducted to estimate the genetic correlation between TBS and BMD measurements. TBS was strongly correlated with LSBMD (r=0.73; P<0.001). On average, TBS was higher in men than in women after adjusting for age and height, both of which are significantly associated with TBS and LSBMD. The age- and height-adjusted index of heritability of TBS was 0.46 (95% CI, 0.39-0.54), not much different from that of LSBMD (0.44; 95% CI, 0.31-0.55). Moreover, the genetic correlation between TBS and LSBMD was 0.35 (95% CI, 0.21-0.46), and between TBS and femoral neck BMD it was 0.21 (95% CI, 0.10-0.33). Approximately 45% of the variance in TBS is under genetic influence, an effect magnitude similar to that of LSBMD. This finding provides a scientific justification for the search for specific genetic variants that may be associated with TBS and fracture risk.
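
    The two quantities at the core of this analysis, the heritability index and the genetic correlation, reduce to simple variance ratios; a minimal sketch with illustrative variance components:

```python
def heritability(var_genetic, var_environment):
    """Index of heritability: genetic share of the total trait variance."""
    return var_genetic / (var_genetic + var_environment)

def genetic_correlation(cov_genetic, var_g_x, var_g_y):
    """Genetic correlation between two traits in a bivariate model."""
    return cov_genetic / (var_g_x * var_g_y) ** 0.5

# with standardized variance components, h2 is read off directly
h2_tbs = heritability(0.46, 0.54)     # reproduces the reported h2 = 0.46
```

    In practice the variance components come from fitting the biometric model to family data; the functions above only show how the reported indices are defined.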

  20. Alternative definitions of the frozen energy in energy decomposition analysis of density functional theory calculations.

    Science.gov (United States)

    Horn, Paul R; Head-Gordon, Martin

    2016-02-28

    In energy decomposition analysis (EDA) of intermolecular interactions calculated via density functional theory, the initial supersystem wavefunction defines the so-called "frozen energy" including contributions such as permanent electrostatics, steric repulsions, and dispersion. This work explores the consequences of the choices that must be made to define the frozen energy. The critical choice is whether the energy should be minimized subject to the constraint of fixed density. Numerical results for Ne2, (H2O)2, BH3-NH3, and ethane dissociation show that there can be a large energy lowering associated with constant density orbital relaxation. By far the most important contribution is constant density inter-fragment relaxation, corresponding to charge transfer (CT). This is unwanted in an EDA that attempts to separate CT effects, but it may be useful in other contexts such as force field development. An algorithm is presented for minimizing single determinant energies at constant density both with and without CT by employing a penalty function that approximately enforces the density constraint.

  1. Analysis of the weld strength of the High Density Polyethylene (HDPE)

    African Journals Online (AJOL)

    An analysis was carried out to determine the strength of welded joints in High Density Polyethylene (HDPE) dam liners. Samples of welded joints were collected and subjected to tensile tests and a creep test. It was observed that the welded joints from field-welded samples were much weaker and had a very low straining ...

  2. Evaluation of bitterness in white wine applying descriptive analysis, time-intensity analysis, and temporal dominance of sensations analysis.

    Science.gov (United States)

    Sokolowsky, Martina; Fischer, Ulrich

    2012-06-30

    Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis for bitter taste in white wines is still largely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is static measurement amongst other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from the bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance differentiated the wines significantly with regard to all measured bitterness parameters obtained from the three sensory techniques. Comparing the information from all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine. Copyright © 2011 Elsevier B.V. All rights reserved.
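
    Parameter extraction from a time-intensity curve of the kind described above typically yields maximum intensity, time-to-maximum, area under the curve, and duration; a sketch with assumed parameter names and an assumed 5% duration threshold:

```python
import numpy as np

def ti_parameters(t, intensity, threshold_frac=0.05):
    """Extract common time-intensity parameters from a bitterness curve.
    Parameter names and the 5% duration threshold are assumptions."""
    t = np.asarray(t, dtype=float)
    intensity = np.asarray(intensity, dtype=float)
    imax = float(intensity.max())                     # maximum intensity
    tmax = float(t[np.argmax(intensity)])             # time to maximum
    # trapezoidal area under the curve
    auc = float(np.sum((intensity[1:] + intensity[:-1]) / 2 * np.diff(t)))
    above = intensity >= threshold_frac * imax        # perceptible window
    duration = float(t[above][-1] - t[above][0]) if above.any() else 0.0
    return {"Imax": imax, "Tmax": tmax, "AUC": auc, "duration": duration}

params = ti_parameters([0, 1, 2, 3], [0, 10, 5, 0])   # toy curve
```

    Parameters extracted this way can then be compared across panelists and wines with analysis of variance, as done in the study.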

  3. Supplementary graphical analysis for the multi-density expansion of associating fluids

    International Nuclear Information System (INIS)

    Chang, Jaeeon

    2014-01-01

    We present a detailed analysis of Wertheim's multi-density formulation for the thermodynamic properties of associating fluids with a single attraction site. Graphical expressions are explicitly illustrated for the partition function, multi-densities and direct correlation functions, and they are compared with those from the classical single-density formulation of simple fluids. The steric incompatibility among three monomers greatly simplifies the cluster integrals of associating fluids, which allows only dimerizing association. Graphical expressions for the pressure and the Helmholtz energy are derived by using functional derivatives, which provide a theoretical basis for TPT and SAFT equations of state.

  4. Lagrangian analysis of two-phase hydrodynamic and nuclear-coupled density-wave oscillations

    International Nuclear Information System (INIS)

    Lahey, R.T. Jr.; Yadigaroglu, G.

    1974-01-01

    The mathematical technique known as the ''method of characteristics'' has been used to construct an exact, analytical solution to predict the onset of density-wave oscillations in diabatic two-phase systems, such as Boiling Water Nuclear Reactors (BWR's). Specifically, heater wall dynamics, boiling boundary dynamics and nuclear kinetics have been accounted for in this analysis. Emphasis is placed on giving the reader a clear physical understanding of the phenomena of two-phase density-wave oscillations. Explanations are presented in terms of block diagram logic, and phasor representations of the various pressure drop perturbations are given. (U.S.)

  5. Transport analysis of high radiation and high density plasmas in the ASDEX Upgrade tokamak

    Directory of Open Access Journals (Sweden)

    Casali L.

    2014-01-01

    Future fusion reactors foreseen in the “European road map”, such as DEMO, will operate under more demanding conditions than present devices. They will require high divertor and core radiation by impurity seeding to reduce heat loads on the divertor target plates. In addition, DEMO will have to work at high core densities to reach adequate fusion performance. The performance of fusion reactors depends on three essential parameters: temperature, density and energy confinement time. The latter characterizes the loss rate due to both radiation and transport processes. The DEMO scenarios described above have not been investigated so far, but are now being addressed at the ASDEX Upgrade tokamak. In this work we present the transport analysis of such scenarios. For plasmas with high radiation by impurity seeding, transport analysis taking into account the radiation distribution shows no change in transport during impurity seeding; the observed confinement improvement is an effect of higher pedestal temperatures which extend to the core via profile stiffness. A non-coronal radiation model was developed and compared to the bolometric measurements in order to provide a reliable radiation profile for transport calculations. For high-density plasmas with pellets, the analysis of kinetic profiles reveals a transient phase at the start of the pellet fuelling due to a slower density build-up compared to the temperature decrease. The low particle diffusion can explain the confinement behaviour.

  6. Towards factor analysis exploration applied to positron emission tomography functional imaging for breast cancer characterization

    International Nuclear Information System (INIS)

    Rekik, W.; Ketata, I.; Sellami, L.; Ben slima, M.; Ben Hamida, A.; Chtourou, K.; Ruan, S.

    2011-01-01

    This paper aims to explore factor analysis applied to dynamic sequences of medical images obtained with a nuclear imaging modality, Positron Emission Tomography (PET). This modality allows information on physiological phenomena to be obtained through the examination of radiotracer evolution over time. Factor analysis of dynamic medical image sequences (FADMIS) estimates the underlying fundamental spatial distributions by factor images and the associated so-called fundamental functions (describing the signal variations) by factors. The method is based on an orthogonal analysis followed by an oblique analysis. The results of the FADMIS are physiological curves showing the evolution over time of the radiotracer within homogeneous tissue distributions. This functional analysis of dynamic nuclear medicine images is considered very efficient for cancer diagnostics. In fact, it could be applied to cancer characterization and vascularization assessment, as well as to the evaluation of response to therapy.

  7. International publication trends in the Journal of Applied Behavior Analysis: 2000-2014.

    Science.gov (United States)

    Martin, Neil T; Nosik, Melissa R; Carr, James E

    2016-06-01

    Dymond, Clarke, Dunlap, and Steiner's (2000) analysis of international publication trends in the Journal of Applied Behavior Analysis (JABA) from 1970 to 1999 revealed low numbers of publications from outside North America, leading the authors to express concern about the lack of international involvement in applied behavior analysis. They suggested that a future review would be necessary to evaluate any changes in international authorship in the journal. As a follow-up, we analyzed non-U.S. publication trends in the most recent 15 years of JABA and found similar results. We discuss potential reasons for the relative paucity of international authors and suggest potential strategies for increasing non-U.S. contributions to the advancement of behavior analysis. © 2015 Society for the Experimental Analysis of Behavior.

  8. Optimization and analysis of 3D nanostructures for power-density enhancement in ultra-thin photovoltaics under oblique illumination.

    Science.gov (United States)

    Shen, Bing; Wang, Peng; Menon, Rajesh

    2014-03-10

    Nanostructures have the potential to significantly increase the output power-density of ultra-thin photovoltaic devices by scattering incident sunlight into resonant guided modes. We applied a modified version of the direct-binary-search algorithm to design such nanostructures in order to maximize the output power-density under oblique-illumination conditions. We show that with appropriate design of nanostructured cladding layers, it is possible for a 10nm-thick organic absorber to produce an average peak power-density of 4 mW/cm² with incident polar angle ranging from -90° to 90° and incident azimuthal angle ranging from -23.5° to 23.5°. Using careful modal and spectral analysis, we further show that an optimal trade-off of absorption at λ~510 nm among various angles of incidence is essential to excellent performance under oblique illumination. Finally, we show that the optimized device with no sun tracking can produce on an average 7.23 times more energy per year than that produced by a comparable unpatterned device with an optimal anti-reflection coating.
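
    Direct binary search itself is easy to sketch: visit the binary design variables in shuffled order, flip each one, and keep a flip only when the merit function improves. Below, a trivial target-matching merit stands in for the electromagnetic solver used in the paper; all names are illustrative:

```python
import random

def direct_binary_search(pattern, merit, max_passes=20, seed=0):
    """Direct-binary-search sketch: flip one binary variable at a time,
    keeping a flip only if the merit improves; stop when a full pass
    yields no improvement."""
    rng = random.Random(seed)
    best = merit(pattern)
    for _ in range(max_passes):
        improved = False
        order = list(range(len(pattern)))
        rng.shuffle(order)
        for i in order:
            pattern[i] ^= 1
            m = merit(pattern)
            if m > best:
                best, improved = m, True
            else:
                pattern[i] ^= 1     # revert the unhelpful flip
        if not improved:
            break
    return pattern, best

# toy merit: match a target pattern (stand-in for an electromagnetic solve)
target = [1, 0, 1, 1, 0, 0, 1, 0]
merit = lambda p: -sum((a - b) ** 2 for a, b in zip(p, target))
solution, best_merit = direct_binary_search([0] * 8, merit)
```

    In the paper's "modified" version the merit is the angle-averaged output power-density, evaluated by a full-wave solve per candidate flip; the greedy flip-and-keep structure is the same.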

  9. Analysis of the IMAGE RPI electron density data and CHAMP plasmasphere electron density reconstructions with focus on plasmasphere modelling

    Science.gov (United States)

    Gerzen, T.; Feltens, J.; Jakowski, N.; Galkin, I.; Reinisch, B.; Zandbergen, R.

    2016-09-01

    The electron density of the topside ionosphere and the plasmasphere contributes essentially to the overall Total Electron Content (TEC) budget affecting Global Navigation Satellite Systems (GNSS) signals. The plasmasphere can cause half or even more of the GNSS range error budget due to ionospheric propagation errors. This paper presents a comparative study of different plasmasphere and topside ionosphere data aiming at establishing an appropriate database for plasmasphere modelling. We analyze electron density profiles along the geomagnetic field lines derived from the Imager for Magnetopause-to-Aurora Global Exploration (IMAGE) satellite/Radio Plasma Imager (RPI) records of remote plasma sounding with radio waves. We compare these RPI profiles with 2D reconstructions of the topside ionosphere and plasmasphere electron density derived from GNSS-based TEC measurements onboard the Challenging Minisatellite Payload (CHAMP) satellite. Most of the coincidences between IMAGE profiles and CHAMP reconstructions are detected in the region with L-shell between 2 and 5. In general the CHAMP reconstructed electron densities are below the IMAGE profile densities, with a median of the CHAMP minus IMAGE residuals around -588 cm⁻³. Additionally, a comparison is made with electron densities derived from passive radio wave RPI measurements onboard the IMAGE satellite. Over the available 2001-2005 period of IMAGE measurements, the considered combined data from the active and passive RPI operations cover the region within a latitude range of ±60°N, all longitudes, and an L-shell ranging from 1.2 to 15. In the coincidence regions (mainly 2 ⩽ L ⩽ 4), we check the agreement between the available active and passive RPI data. The comparison shows that the measurements are well correlated, with a median residual of ∼52 cm⁻³. The RMS and STD values of the relative residuals are around 22% and 21%, respectively. In summary, the results encourage the application of IMAGE RPI data for plasmasphere modelling.

  10. Visualization and analysis of pulsed ion beam energy density profile with infrared imaging

    Science.gov (United States)

    Isakova, Y. I.; Pushkarev, A. I.

    2018-03-01

    Infrared imaging was used as a surface temperature-mapping tool to characterize the energy density distribution of intense pulsed ion beams on a thin metal target. The technique enables measurement of the total ion beam energy and of the energy density distribution over the beam cross section, and allows one to optimize the operation of an ion diode and to control the target irradiation mode. The diagnostics was tested on the TEMP-4M accelerator at TPU, Tomsk, Russia, and on the TEMP-6 accelerator at DUT, Dalian, China. It was applied in studies of the dynamics of target cooling in vacuum after irradiation and in experiments with target ablation. Errors caused by target ablation and target cooling during measurements have been analyzed. For Fluke Ti10 and Fluke Ti400 infrared cameras, the technique achieves a surface energy density sensitivity of 0.05 J/cm2 and a spatial resolution of 1-2 mm. The thermal imaging diagnostics does not require expensive consumable materials. The measurement time does not exceed 0.1 s; therefore, this diagnostics can be used for prompt evaluation of the energy density distribution of a pulsed ion beam and for automation of the irradiation process.
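The temperature-to-energy-density conversion behind this diagnostic can be sketched as follows. This is a minimal illustration, not the authors' code: the material constants (stainless steel) and the foil thickness are assumed values, and a synthetic Gaussian temperature map stands in for a real IR frame.

```python
import numpy as np

# Sketch: recovering a pulsed-beam energy density map from an IR temperature
# rise map, assuming the beam energy is deposited in a thin foil of known
# heat capacity. Material values and thickness are illustrative assumptions.

RHO = 7.9        # g/cm^3, density of stainless steel (assumed target)
C_P = 0.46       # J/(g*K), specific heat (assumed)
THICKNESS = 0.01 # cm (100 um foil, assumed)

def energy_density_map(delta_t):
    """Surface energy density (J/cm^2) from the temperature rise map (K)."""
    return RHO * C_P * THICKNESS * delta_t

def total_energy(delta_t, pixel_area_cm2):
    """Total beam energy (J): integrate the energy density over the image."""
    return float(np.sum(energy_density_map(delta_t)) * pixel_area_cm2)

# A synthetic Gaussian beam spot with a 50 K peak temperature rise
x = np.linspace(-2, 2, 101)          # cm
xx, yy = np.meshgrid(x, x)
dt = 50.0 * np.exp(-(xx**2 + yy**2))
w = energy_density_map(dt)
E_tot = total_energy(dt, 0.04**2)    # pixel pitch 0.04 cm
print(round(w.max(), 3), "J/cm^2 peak;", round(E_tot, 2), "J total")
```

The same two functions would apply directly to a background-subtracted camera frame, provided the frame is captured before lateral heat conduction smears the profile (hence the sub-0.1 s measurement time noted above).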

  11. Analysis of the Effect of Electron Density Perturbations Generated by Gravity Waves on HF Communication Links

    Science.gov (United States)

    Fagre, M.; Elias, A. G.; Chum, J.; Cabrera, M. A.

    2017-12-01

    In the present work, ray tracing of high frequency (HF) signals under disturbed ionospheric conditions is analyzed, particularly in the presence of electron density perturbations generated by gravity waves (GWs). The three-dimensional numerical ray tracing code of Jones and Stephenson, based on Hamilton's equations, which is commonly used to study radio propagation through the ionosphere, is employed. An electron density perturbation model is implemented in this code, based on atmospheric GWs generated at a height of 150 km in the thermosphere and propagating up into the ionosphere. The motion of the neutral gas at these altitudes induces disturbances in the background plasma which affect HF signal propagation. To model the GWs realistically and analyze their propagation and dispersion characteristics, a GW ray tracing method with kinematic viscosity and thermal diffusivity was applied. The IRI-2012, HWM14 and NRLMSISE-00 models were incorporated to provide the electron density, wind velocities, neutral temperature and total mass density needed by the ray tracing codes. Preliminary results of gravity wave effects on ground range and reflection height are presented for the low- and mid-latitude ionosphere.
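The reflection-height effect studied here can be illustrated with a far simpler model than full Hamiltonian ray tracing: for vertical incidence a ray reflects where the plasma frequency reaches the wave frequency, and a GW-like density perturbation shifts that height. The Chapman-layer parameters and the 5% sinusoidal perturbation below are assumptions for illustration only, not values from the paper.

```python
import math

# Sketch: reflection condition f = fp(h) for a vertically incident HF ray
# in a Chapman layer, with an optional gravity-wave-like density wiggle.

def plasma_freq_hz(ne_m3):
    """Approximate plasma frequency (Hz) for electron density in m^-3."""
    return 8.98 * math.sqrt(ne_m3)

def chapman_ne(h_km, nm=1e12, hm=300.0, scale=50.0):
    """Chapman-layer electron density (m^-3); parameters are assumed."""
    z = (h_km - hm) / scale
    return nm * math.exp(0.5 * (1.0 - z - math.exp(-z)))

def reflection_height(f_hz, perturb=0.0):
    """Lowest height (km) where fp >= f, scanning upward in 0.1 km steps."""
    h = 100.0
    while h < 300.0:
        ne = chapman_ne(h) * (1.0 + perturb * math.sin(2 * math.pi * h / 30.0))
        if plasma_freq_hz(ne) >= f_hz:
            return h
        h += 0.1
    return None  # frequency penetrates the layer

f = 5e6                                   # 5 MHz sounding frequency
h0 = reflection_height(f)                 # unperturbed reflection height
h1 = reflection_height(f, perturb=0.05)   # with a 5% GW density perturbation
print(h0, h1)
```

The full 3D code integrates Hamilton's equations instead of scanning heights, but the underlying physics of the height shift reported in the abstract is the same: the perturbation moves the altitude at which the reflection condition is met.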

  12. Reconstruction and analysis of temperature and density spatial profiles in inertial confinement fusion implosion cores

    International Nuclear Information System (INIS)

    Mancini, R. C.

    2007-01-01

    We discuss several methods for the extraction of temperature and density spatial profiles in inertial confinement fusion implosion cores based on the analysis of the x-ray emission from spectroscopic tracers added to the deuterium fuel. The ideas rely on (1) detailed spectral models that take into account collisional-radiative atomic kinetics, Stark-broadened line shapes, and radiation transport calculations, (2) the availability of narrow-band, gated pinhole and slit x-ray images, and space-resolved line spectra of the core, and (3) several data analysis and reconstruction methods that include a multi-objective search and optimization technique based on a novel application of Pareto genetic algorithms to plasma spectroscopy. The spectroscopic analysis yields the spatial profiles of temperature and density in the core at the collapse of the implosion, and also the extent of shell material mixing into the core. Results are illustrated with data recorded in implosion experiments driven by the OMEGA and Z facilities.

  13. Generation of Native Chromatin Immunoprecipitation Sequencing Libraries for Nucleosome Density Analysis.

    Science.gov (United States)

    Lorzadeh, Alireza; Lopez Gutierrez, Rodrigo; Jackson, Linda; Moksa, Michelle; Hirst, Martin

    2017-12-12

    We present a modified native chromatin immunoprecipitation sequencing (ChIP-seq) experimental protocol compatible with a Gaussian mixture distribution based analysis methodology (nucleosome density ChIP-seq; ndChIP-seq) that enables the generation of combined measurements of micrococcal nuclease (MNase) accessibility with histone modification genome-wide. Nucleosome position and local density, and the posttranslational modification of their histone subunits, act in concert to regulate local transcription states. The combined measurements of nucleosome accessibility and histone modification generated by ndChIP-seq allow simultaneous interrogation of these features. The ndChIP-seq methodology is applicable to small numbers of primary cells that are inaccessible to cross-linking-based ChIP-seq protocols. Together, these features enable the measurement of histone modification in combination with local nucleosome density, yielding new insights into shared mechanisms that regulate RNA transcription within rare primary cell populations.

  14. Thermodynamic, energy efficiency, and power density analysis of reverse electrodialysis power generation with natural salinity gradients.

    Science.gov (United States)

    Yip, Ngai Yin; Vermaas, David A; Nijmeijer, Kitty; Elimelech, Menachem

    2014-05-06

    Reverse electrodialysis (RED) can harness the Gibbs free energy of mixing when fresh river water flows into the sea for sustainable power generation. In this study, we carry out a thermodynamic and energy efficiency analysis of RED power generation, and assess the membrane power density. First, we present a reversible thermodynamic model for RED and verify that the theoretical maximum extractable work in a reversible RED process is identical to the Gibbs free energy of mixing. Work extraction in an irreversible process with maximized power density using a constant-resistance load is then examined to assess the energy conversion efficiency and power density. With equal volumes of seawater and river water, energy conversion efficiency of ∼ 33-44% can be obtained in RED, while the rest is lost through dissipation in the internal resistance of the ion-exchange membrane stack. We show that imperfections in the selectivity of typical ion exchange membranes (namely, co-ion transport, osmosis, and electro-osmosis) can detrimentally lower efficiency by up to 26%, with co-ion leakage being the dominant effect. Further inspection of the power density profile during RED revealed inherent ineffectiveness toward the end of the process. By judicious early discontinuation of the controlled mixing process, the overall power density performance can be considerably enhanced by up to 7-fold, without significant compromise to the energy efficiency. Additionally, membrane resistance was found to be an important factor in determining the power densities attainable. Lastly, the performance of an RED stack was examined for different membrane conductivities and intermembrane distances simulating high performance membranes and stack design. By thoughtful selection of the operating parameters, an efficiency of ∼ 37% and an overall gross power density of 3.5 W/m(2) represent the maximum performance that can potentially be achieved in a seawater-river water RED system with low
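The theoretical ceiling verified in this study, that reversible RED work equals the Gibbs free energy of mixing, can be sketched numerically for an ideal dilute NaCl pair. The concentrations and temperature below are illustrative assumptions, not the paper's exact values.

```python
import math

# Sketch: ideal-solution Gibbs free energy of mixing for a seawater/river-water
# pair, the thermodynamic upper bound on RED work extraction.

R = 8.314  # J/(mol K), gas constant

def gibbs_mixing(c_s, c_r, v_s, v_r, temp=298.0):
    """Free energy released on mixing (J), ideal dilute NaCl solutions.

    c_s, c_r: salt concentrations (mol/m^3); v_s, v_r: volumes (m^3).
    The factor 2 accounts for full dissociation into Na+ and Cl-.
    """
    c_m = (c_s * v_s + c_r * v_r) / (v_s + v_r)  # mixture concentration
    return 2.0 * R * temp * (v_s * c_s * math.log(c_s / c_m)
                             + v_r * c_r * math.log(c_r / c_m))

# Equal volumes of seawater (~0.6 M) and river water (~0.015 M)
dg = gibbs_mixing(600.0, 15.0, 1.0, 1.0)
print(round(dg / 3.6e6, 3), "kWh per 2 m^3 mixed")
```

In the paper's terms, the ~33-44% conversion efficiency means roughly a third to a half of this `dg` is extractable as electrical work in a constant-resistance irreversible process; the rest is dissipated in the stack's internal resistance.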

  15. Analysis of Tensegrity Structures with Redundancies, by Implementing a Comprehensive Equilibrium Equations Method with Force Densities

    Directory of Open Access Journals (Sweden)

    Miltiades Elliotis

    2016-01-01

    Full Text Available A general approach is presented to analyze tensegrity structures by examining their equilibrium. It belongs to the class of equilibrium equations methods with force densities. The redundancies are treated by employing Castigliano’s second theorem, which gives the additional required equations. The partial derivatives, which appear in the additional equations, are numerically replaced by statically acceptable internal forces which are applied on the structure. For both statically determinate and indeterminate tensegrity structures, the properties of the resulting linear system of equations give an indication about structural stability. This method requires a relatively small number of computations, it is direct (there is no iterative procedure or calculation of auxiliary parameters) and is characterized by its simplicity. It is tested on both 2D and 3D tensegrity structures. Results obtained with the method compare favorably with those obtained by the Dynamic Relaxation Method or the Adaptive Force Density Method.
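The core idea of the force density method the abstract builds on can be shown in a few lines: once a force density q = (member force)/(member length) is assigned to each member, nodal equilibrium is linear in the free-node coordinates. The geometry, loads, and q values below are a hypothetical minimal example, not one of the paper's test structures.

```python
import numpy as np

# Sketch of the force density method at its simplest: one free node tied by
# three cables to fixed anchors. Equilibrium of the free node,
#   sum_m q_m * (x_anchor_m - x_free) + p = 0,
# is LINEAR in x_free, so no iteration is needed.

anchors = np.array([[0.0, 0.0],
                    [4.0, 0.0],
                    [2.0, 3.0]])       # fixed anchor coordinates (assumed)
q = np.array([1.0, 1.0, 2.0])          # force densities of the three cables
p = np.array([0.0, -1.0])              # external load on the free node

# Closed-form solution of the linear equilibrium equation
x_free = (q @ anchors + p) / q.sum()
print(x_free)

# Verify the equilibrium residual vanishes
residual = (q[:, None] * (anchors - x_free)).sum(axis=0) + p
assert np.allclose(residual, 0.0)
```

In a full tensegrity analysis the same equation is assembled for every free node into one linear system whose matrix properties, as the abstract notes, indicate structural stability; redundancies then contribute the extra Castigliano equations.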

  16. Maternal depression research: socioeconomic analysis and density-equalizing mapping of the global research architecture.

    Science.gov (United States)

    Brüggmann, Dörthe; Wagner, Christina; Klingelhöfer, Doris; Schöffel, Norman; Bendels, Michael; Louwen, Frank; Jaque, Jenny; Groneberg, David A

    2017-02-01

    Maternal depression is one of the most common complications of pregnancy and the postpartum period, affecting women all over the world. So far, no detailed map of the worldwide maternal depression research architecture has been constructed that encompasses aspects of research activity, quality, and socioeconomic features. Using the NewQIS platform, density-equalizing mapping projections, scientometric techniques, and economic benchmarking procedures were applied to evaluate global maternal depression research for the period between 1900 and 2012. In total, 7330 related publications and 3335 international collaborations were identified. The USA was the most active country in terms of collaborations and total research activity. In the socioeconomic analysis of research activity in high-income countries, Australia was ranked first with an average of 412.05 maternal depression-related publications per 1000 billion US$ GDP (Q1), followed by the UK (Q1 = 373.51) and Canada (Q1 = 306.32). The group of upper-middle-income countries was led by South Africa (Q1 = 145.67), followed by Turkey (Q1 = 91.8). China authored 11.95 maternal depression-related publications per 1000 billion US$ GDP. The USA had the highest activity of maternal depression research per GDP in billion US$ per capita (Q2 = 60.86). When research activity was related to population size (Q3 = publications per million inhabitants), Australia (Q3 = 26.44) led the field, followed by Norway (Q3 = 18.48). Gender analysis revealed a relatively high proportion of female scientists in this field of research, with pronounced differences between individual subject areas. In summary, we here present the first picture of the global scientific development in maternal depression research over a period of more than 100 years. The research landscape is clearly dominated by North American and Western European countries, with only minor

  17. Spectral analysis, vibrational assignments, NBO analysis, NMR, UV-Vis, hyperpolarizability analysis of 2-aminofluorene by density functional theory.

    Science.gov (United States)

    Jone Pradeepa, S; Sundaraganesan, N

    2014-05-05

    In this investigation, a combined experimental and theoretical study of the molecular structure, vibrational analysis and NBO analysis of 2-aminofluorene is reported. The FT-IR spectrum was recorded in the range 4000-400 cm(-1). The FT-Raman spectrum was recorded in the range 4000-50 cm(-1). The molecular geometry, vibrational spectra, and natural bond orbital (NBO) analysis were calculated for 2-aminofluorene using Density Functional Theory (DFT) with the B3LYP/6-31G(d,p) model chemistry. (13)C and (1)H NMR chemical shifts of 2-aminofluorene were calculated using the GIAO method. The computed vibrational and NMR spectra were compared with the experimental results. The total energy distribution (TED) was derived to deepen the understanding of the different vibrational modes contributing to each wavenumber. The experimental UV-Vis spectrum was recorded in the region 400-200 nm and correlated with the spectrum simulated with a suitably solvated B3LYP/6-31G(d,p) model. The HOMO-LUMO energies were computed with the time-dependent DFT approach. The nonlinearity of the title compound was confirmed by hyperpolarizability analysis. The Molecular Electrostatic Potential (MEP) was also investigated theoretically. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. The Notions of foreignization and domestication applied to film translation : analysis of subtitles in cartoon "Ratatouille"

    OpenAIRE

    Judickaitė, Ligita

    2009-01-01

    This paper shows how Venuti's theory on foreignization and domestication can be applied to film translation and presents the analysis of culture-specific items' translation in the Lithuanian subtitles of cartoon Ratatouille. The translation analysis considers 135 culture-specific items that can be divided into two groups, which are the names of occupations of the people who work in the kitchen and the names of food items, dishes and drinks. The cartoon also contains other culture-specific wor...

  19. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    Science.gov (United States)

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  20. Stress wave analysis: applied to rotating machines; Stress wave analysis: aplicado a maquinas rotativas

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Paulo Garcia de [Invensys Brasil Ltda., Sao Paulo, SP (Brazil)

    2009-11-01

    Stress wave analysis is the analysis of data (stress profiles, i.e., ultrasound spectra) collected by high-frequency acoustic sensors. Monitoring and analysis of rotating equipment is a crucial element of predictive maintenance and condition-based maintenance (CBM) projects and, in a broader context, of performance management and optimization of assets. This article discusses the application of stress wave analysis to rotating machines in the context of asset optimization and CBM. (author)

  1. A case study in the misrepresentation of applied behavior analysis in autism: the gernsbacher lectures.

    Science.gov (United States)

    Morris, Edward K

    2009-01-01

    I know that most men, including those at ease with problems of the greatest complexity, can seldom accept the simplest and most obvious truth if it be such as would oblige them to admit the falsity of conclusions which they have proudly taught to others, and which they have woven, thread by thread, into the fabrics of their life. (Tolstoy, 1894) This article presents a case study in the misrepresentation of applied behavior analysis for autism based on Morton Ann Gernsbacher's presentation of a lecture titled "The Science of Autism: Beyond the Myths and Misconceptions." Her misrepresentations involve the characterization of applied behavior analysis, descriptions of practice guidelines, reviews of the treatment literature, presentations of the clinical trials research, and conclusions about those trials (e.g., children's improvements are due to development, not applied behavior analysis). The article also reviews applied behavior analysis' professional endorsements and research support, and addresses issues in professional conduct. It ends by noting the deleterious effects that misrepresenting any research on autism (e.g., biological, developmental, behavioral) have on our understanding and treating it in a transdisciplinary context.

  2. UK Parents' Beliefs about Applied Behaviour Analysis as an Approach to Autism Education

    Science.gov (United States)

    Denne, Louise D.; Hastings, Richard P.; Hughes, J. Carl

    2017-01-01

    Research into factors underlying the dissemination of evidence-based practice is limited within the field of Applied Behaviour Analysis (ABA). This is pertinent, particularly in the UK where national policies and guidelines do not reflect the emerging ABA evidence base, or policies and practices elsewhere. Theories of evidence-based practice in…

  3. Using Applied Behaviour Analysis as Standard Practice in a UK Special Needs School

    Science.gov (United States)

    Foran, Denise; Hoerger, Marguerite; Philpott, Hannah; Jones, Elin Walker; Hughes, J. Carl; Morgan, Jonathan

    2015-01-01

    This article describes how applied behaviour analysis can be implemented effectively and affordably in a maintained special needs school in the UK. Behaviour analysts collaborate with classroom teachers to provide early intensive behaviour education for young children with autism spectrum disorders (ASD), and function based behavioural…

  4. Evolution of Applied Behavior Analysis in the Treatment of Individuals With Autism

    Science.gov (United States)

    Wolery, Mark; Barton, Erin E.; Hine, Jeffrey F.

    2005-01-01

    Two issues of each volume of the Journal of Applied Behavior Analysis were reviewed to identify research reports focusing on individuals with autism. The identified articles were analyzed to describe the ages of individuals with autism, the settings in which the research occurred, the nature of the behaviors targeted for intervention, and the…

  5. Lovaas Model of Applied Behavior Analysis. What Works Clearinghouse Intervention Report

    Science.gov (United States)

    What Works Clearinghouse, 2010

    2010-01-01

    The "Lovaas Model of Applied Behavior Analysis" is a type of behavioral therapy that initially focuses on discrete trials: brief periods of one-on-one instruction, during which a teacher cues a behavior, prompts the appropriate response, and provides reinforcement to the child. Children in the program receive an average of 35 to 40 hours…

  6. Applied Behavior Analysis Programs for Autism: Sibling Psychosocial Adjustment during and Following Intervention Use

    Science.gov (United States)

    Cebula, Katie R.

    2012-01-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither…

  7. Applied Behavior Analysis in Autism Spectrum Disorders: Recent Developments, Strengths, and Pitfalls

    Science.gov (United States)

    Matson, Johnny L.; Turygin, Nicole C.; Beighley, Jennifer; Rieske, Robert; Tureck, Kimberly; Matson, Michael L.

    2012-01-01

    Autism has become one of the most heavily researched topics in the field of mental health and education. While genetics has been the most studied of all topics, applied behavior analysis (ABA) has also received a great deal of attention, and has arguably yielded the most promising results of any research area to date. The current paper provides a…

  8. A Self-Administered Parent Training Program Based upon the Principles of Applied Behavior Analysis

    Science.gov (United States)

    Maguire, Heather M.

    2012-01-01

    Parents often respond to challenging behavior exhibited by their children in such a way that unintentionally strengthens it. Applied behavior analysis (ABA) is a research-based science that has been proven effective in remediating challenging behavior in children. Although many parents could benefit from using strategies from the field of ABA with…

  9. A National UK Census of Applied Behavior Analysis School Provision for Children with Autism

    Science.gov (United States)

    Griffith, G. M.; Fletcher, R.; Hastings, R. P.

    2012-01-01

    Over more than a decade, specialist Applied Behavior Analysis (ABA) schools or classes for children with autism have developed in the UK and Ireland. However, very little is known internationally about how ABA is defined in practice in school settings, the characteristics of children supported in ABA school settings, and the staffing structures…

  10. Structure analysis of interstellar clouds - II. Applying the Delta-variance method to interstellar turbulence

    NARCIS (Netherlands)

    Ossenkopf, V.; Krips, M.; Stutzki, J.

    Context. The Delta-variance analysis is an efficient tool for measuring the structural scaling behaviour of interstellar turbulence in astronomical maps. It has been applied both to simulations of interstellar turbulence and to observed molecular cloud maps. In Paper I we proposed essential

  11. SOLUTION TO THE PROBLEMS OF APPLIED MECHANICS USING THE SIMILARITY THEORY AND DIMENSIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Dzhinchvelashvili Guram Avtandilovich

    2016-06-01

    Full Text Available In the present work the author presents a book by Professor G.S. Vardanyan in which the foundations of similarity theory and dimensional analysis are systematically set out. The book discusses the possibilities of these methods and their application in solving not only fundamental problems of solid mechanics, but also applied tasks, in particular, complicated problems of engineering seismology.

  12. CROP DENSITY AND IRRIGATION WITH SALINE WATER

    OpenAIRE

    Feinerman, Eli

    1983-01-01

    The economic implications of plant density for irrigation water use under saline conditions are investigated, utilizing the relevant physical and biological relationships. The analysis considers a single crop and is applied to cotton data. The results suggest that treating plant density as an endogenous control variable has a substantial impact on profits and on the optimal quantities and qualities of the applied irrigation water.

  13. Energy decomposition analysis based on a block-localized wavefunction and multistate density functional theory

    OpenAIRE

    Mo, Yirong; Bao, Peng; Gao, Jiali

    2011-01-01

    An interaction energy decomposition analysis method based on the block-localized wavefunction (BLW-ED) approach is described. The first main feature of the BLW-ED method is that it combines concepts of valence bond and molecular orbital theories such that the intermediate and physically intuitive electron-localized states are variationally optimized by self-consistent field calculations. Furthermore, the block-localization scheme can be used both in wave function theory and in density functio...

  14. Principal component analysis of the CT density histogram to generate parametric response maps of COPD

    Science.gov (United States)

    Zha, N.; Capaldi, D. P. I.; Pike, D.; McCormack, D. G.; Cunningham, I. A.; Parraga, G.

    2015-03-01

    Pulmonary x-ray computed tomography (CT) may be used to characterize emphysema and airways disease in patients with chronic obstructive pulmonary disease (COPD). One analysis approach, parametric response mapping (PRM), utilizes registered inspiratory and expiratory CT image volumes and CT-density-histogram thresholds, but there is no consensus regarding the threshold values used, or their clinical meaning. Principal component analysis (PCA) of the CT density histogram can be exploited to quantify emphysema using data-driven CT-density-histogram thresholds. Thus, the objective of this proof-of-concept demonstration was to develop a PRM approach using PCA-derived thresholds in COPD patients and ex-smokers without airflow limitation. Methods: Fifteen COPD ex-smokers and 5 normal ex-smokers were evaluated. Thoracic CT images were acquired at full inspiration and at full expiration, and these images were non-rigidly co-registered. PCA was performed on the CT density histograms, and the components with eigenvalues greater than one were summed. Since the values of the principal component curve correlate directly with the variability in the sample, the maximum and minimum points on the curve were used as threshold values for the PCA-adjusted PRM technique. Results: A significant correlation was determined between conventional and PCA-adjusted PRM with 3He MRI apparent diffusion coefficient (p<0.001), with CT RA950 (p<0.0001), and with 3He MRI ventilation defect percent, a measurement of both small airways disease (p=0.049 and p=0.06, respectively) and emphysema (p=0.02). Conclusions: PRM generated using PCA thresholds of the CT density histogram showed significant correlations with CT and 3He MRI measurements of emphysema, but not airways disease.
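The PCA-threshold idea can be sketched as follows: each subject's CT density histogram is one observation, PCA across subjects yields a principal-component "curve" over the HU bins, and the extrema of that curve serve as data-driven density thresholds. The synthetic histograms and all parameters below are assumptions standing in for real CT data.

```python
import numpy as np

# Sketch: data-driven histogram thresholds from PCA of per-subject CT
# density histograms. Synthetic Gaussian histograms replace real data.

rng = np.random.default_rng(0)
bins = np.linspace(-1000, 0, 101)          # HU bin centers
centers = rng.normal(-850, 30, size=20)    # per-subject histogram peaks
hists = np.exp(-((bins[None, :] - centers[:, None]) / 80.0) ** 2)
hists /= hists.sum(axis=1, keepdims=True)  # normalize each histogram

# PCA via SVD of the mean-centered histogram matrix; the paper sums the
# components with eigenvalue > 1, here we simply take the leading component.
centered = hists - hists.mean(axis=0)
_, s, vt = np.linalg.svd(centered, full_matrices=False)
pc1 = vt[0]                                # principal-component curve over HU

# The extrema of the PC curve define the histogram thresholds (in HU)
lo_thresh = bins[np.argmin(pc1)]
hi_thresh = bins[np.argmax(pc1)]
print(sorted((lo_thresh, hi_thresh)))
```

The resulting pair of HU values would then replace the fixed literature thresholds (e.g. -950 HU) in the PRM classification step.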

  15. MRSA: A Density-Equalizing Mapping Analysis of the Global Research Architecture

    Directory of Open Access Journals (Sweden)

    Johann P. Addicks

    2014-09-01

    Full Text Available Methicillin-resistant Staphylococcus aureus (MRSA) has evolved as an alarming public health threat due to its global spread as a hospital and community pathogen. Despite this role, a scientometric analysis has not been performed yet. Therefore, the NewQIS platform was used to conduct a combined density-equalizing mapping and scientometric study. As database, the Web of Science was used, and all entries between 1961 and 2007 were analyzed. In total, 7671 entries were identified. Density equalizing mapping demonstrated a distortion of the world map for the benefit of the USA as leading country with a total output of 2374 publications, followed by the UK (1030) and Japan (862). Citation rate analysis revealed Portugal as leading country with a rate of 35.47 citations per article, followed by New Zealand and Denmark. Country cooperation network analyses showed 743 collaborations with US-UK being most frequent. Network citation analyses indicated the publications that arose from the cooperation of USA and France as well as USA and Japan as the most cited (75.36 and 74.55 citations per collaboration article, respectively). The present study provides the first combined density-equalizing mapping and scientometric analysis of MRSA research. It illustrates the global MRSA research architecture. It can be assumed that this highly relevant topic for public health will achieve even greater dimensions in the future.

  16. MRSA: a density-equalizing mapping analysis of the global research architecture.

    Science.gov (United States)

    Addicks, Johann P; Uibel, Stefanie; Jensen, Anna-Maria; Bundschuh, Matthias; Klingelhoefer, Doris; Groneberg, David A

    2014-09-30

    Methicillin-resistant Staphylococcus aureus (MRSA) has evolved as an alarming public health threat due to its global spread as a hospital and community pathogen. Despite this role, a scientometric analysis has not been performed yet. Therefore, the NewQIS platform was used to conduct a combined density-equalizing mapping and scientometric study. As database, the Web of Science was used, and all entries between 1961 and 2007 were analyzed. In total, 7671 entries were identified. Density equalizing mapping demonstrated a distortion of the world map for the benefit of the USA as leading country with a total output of 2374 publications, followed by the UK (1030) and Japan (862). Citation rate analysis revealed Portugal as leading country with a rate of 35.47 citations per article, followed by New Zealand and Denmark. Country cooperation network analyses showed 743 collaborations with US-UK being most frequent. Network citation analyses indicated the publications that arose from the cooperation of USA and France as well as USA and Japan as the most cited (75.36 and 74.55 citations per collaboration article, respectively). The present study provides the first combined density-equalizing mapping and scientometric analysis of MRSA research. It illustrates the global MRSA research architecture. It can be assumed that this highly relevant topic for public health will achieve even greater dimensions in the future.

  17. An ecological analysis of food outlet density and prevalence of type II diabetes in South Carolina counties.

    Science.gov (United States)

    AlHasan, Dana M; Eberth, Jan Marie

    2016-01-05

    Studies suggest that built environments with high densities of fast food restaurants and convenience stores and low densities of super stores and grocery stores are related to obesity, type II diabetes mellitus, and other chronic diseases. Since few studies assess these relationships at the county level, we aim to examine fast food restaurant density, convenience store density, super store density, and grocery store density and the prevalence of type II diabetes among counties in South Carolina. Pearson's correlations between the four types of food outlet densities (fast food restaurants, convenience stores, super stores, and grocery stores) and the prevalence of type II diabetes were computed. The relationship between each of these food outlet densities and the prevalence of type II diabetes was mapped, and OLS regression analysis was completed adjusting for county-level rates of obesity, physical inactivity, density of recreation facilities, unemployment, households with no car and limited access to stores, education, and race. We found a significant negative relationship between fast food restaurant density and the prevalence of type II diabetes, and a significant positive relationship between convenience store density and the prevalence of type II diabetes. In the adjusted analysis, however, food outlet density (of any type) was not associated with the prevalence of type II diabetes. This ecological analysis thus showed no adjusted associations between fast food restaurant, convenience store, super store, or grocery store densities and the prevalence of type II diabetes. Consideration of environmental, social, and cultural determinants, as well as individual behaviors, is needed in future research.
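The two-step county-level analysis described here, an unadjusted Pearson correlation followed by OLS with covariates, can be sketched on synthetic data. Variable names mirror the study's design; the numbers and the single covariate (obesity rate) are illustrative assumptions, not the study's data.

```python
import numpy as np

# Sketch: Pearson correlation between a food-outlet density and diabetes
# prevalence, then OLS adjusting for a confounder. Data are synthetic.

rng = np.random.default_rng(1)
n = 46                                   # South Carolina has 46 counties
obesity = rng.normal(32, 4, n)           # % obese (assumed)
fast_food = rng.normal(0.7, 0.2, n)      # outlets per 1,000 residents (assumed)
diabetes = 0.3 * obesity + rng.normal(0, 1.5, n)  # prevalence driven by obesity

# Step 1: unadjusted Pearson correlation
r = np.corrcoef(fast_food, diabetes)[0, 1]

# Step 2: adjusted OLS -> diabetes ~ intercept + fast_food + obesity
X = np.column_stack([np.ones(n), fast_food, obesity])
beta, *_ = np.linalg.lstsq(X, diabetes, rcond=None)
print(round(r, 3), np.round(beta, 3))
```

Because the synthetic diabetes rates depend only on obesity, the adjusted fast-food coefficient hovers near zero while the obesity coefficient recovers the true effect, echoing the study's finding that the unadjusted associations vanish once county-level covariates enter the model.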

  18. The Pressure and Magnetic Flux Density Analysis of Helical-Type DC Electromagnetic Pump

    International Nuclear Information System (INIS)

    Lee, Geun Hyeong; Kim, Hee Reyoung

    2016-01-01

    The developed pressure is produced solely by electromagnetic force, which eliminates the possibility of contact with impurities; highly reactive materials such as alkali metals are therefore well suited to electromagnetic pumps. The heavy ion accelerator facility of the Rare Isotope Science Project (RISP) in Korea is being designed to use liquid lithium to increase acceleration efficiency by decreasing the charge state. A helical-type DC electromagnetic pump was employed for a charge stripper that decreases the charge state of heavy ions. The pump specification was a developed pressure of 15 bar at a flow rate of 6 cc/s at 200℃. The pressure of the DC electromagnetic pump was analyzed with respect to input current and number of duct turns. The developed pressure was almost proportional to the input current because the relatively low flow rate made the back electromotive force and the hydraulic pressure drop negligible. The pressure and magnetic flux density of the helical-type DC electromagnetic pump were analyzed: the pressure was proportional to the input current and the number of duct turns, and the magnetic flux density was higher when a ferromagnetic core was applied. It appears that the number of duct turns could be increased and a ferromagnetic core applied in order to increase the pressure of a DC electromagnetic pump at constant input current

  19. Improved parameterization of interatomic potentials for rare gas dimers with density-based energy decomposition analysis.

    Science.gov (United States)

    Zhou, Nengjie; Lu, Zhenyu; Wu, Qin; Zhang, Yingkai

    2014-06-07

    We examine interatomic interactions for rare gas dimers using the density-based energy decomposition analysis (DEDA) in conjunction with computational results from CCSD(T) at the complete basis set (CBS) limit. The unique DEDA capability of separating frozen density interactions from density relaxation contributions is employed to yield clean interaction components, and the results are found to be consistent with the typical physical picture that density relaxations play a very minimal role in rare gas interactions. Equipped with each interaction component as a reference, we develop a new three-term molecular mechanical force field to describe rare gas dimers: a smeared charge multipole model for electrostatics with charge penetration effects, a B3LYP-D3 dispersion term for asymptotically correct long-range attractions that is screened at short-range, and a Born-Mayer exponential function for the repulsion. The resulting force field not only reproduces rare gas interaction energies calculated at the CCSD(T)/CBS level, but also yields each interaction component (electrostatic or van der Waals) in very good agreement with its corresponding reference value.
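    A minimal numeric sketch of this three-term functional form is shown below, keeping the Born-Mayer repulsion and a dispersion term screened at short range (a Tang-Toennies damping function stands in for the screening, and the smeared-charge electrostatic term is omitted for brevity). The parameters are illustrative, not the paper's fitted values.

    ```python
    import math

    # Sketch of a three-term rare-gas pair potential:
    #   E(r) = A*exp(-b*r)        (Born-Mayer repulsion)
    #        - f6(r) * C6 / r^6   (dispersion, screened at short range)
    # Parameters below are hypothetical, in atomic-like units.
    A, b, C6 = 1500.0, 3.2, 6.5

    def tang_toennies_f6(r, b):
        """Tang-Toennies damping: -> 0 at short range, -> 1 at long range."""
        s = sum((b * r) ** k / math.factorial(k) for k in range(7))
        return 1.0 - math.exp(-b * r) * s

    def dimer_energy(r):
        repulsion = A * math.exp(-b * r)
        dispersion = -tang_toennies_f6(r, b) * C6 / r ** 6
        return repulsion + dispersion

    # The curve shows a single shallow van der Waals minimum
    rs = [2.5 + 0.01 * i for i in range(400)]
    energies = [dimer_energy(r) for r in rs]
    r_min = rs[energies.index(min(energies))]
    print(f"minimum near r = {r_min:.2f}, E = {min(energies):.6f}")
    ```

    Fitting `A`, `b`, and `C6` to each DEDA component separately, rather than to the total energy alone, is the component-by-component strategy the abstract describes.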

  20. Analysis of Mid-Latitude Plasma Density Irregularities in the Presence of Finite Larmor Radius Effects

    Science.gov (United States)

    Sotnikov, V. I.; Kim, T. C.; Mishin, E. V.; Kil, H.; Kwak, Y. S.; Paraschiv, I.

    2017-12-01

    Ionospheric irregularities cause scintillations of electromagnetic signals that can severely affect navigation and transionospheric communication, in particular during space storms. At mid-latitudes the source of F-region Field Aligned Irregularities (FAI) is yet to be determined. They can be created in enhanced subauroral flow channels (SAI/SUBS), where strong gradients of electric field, density and plasma temperature are present. Another important source of FAI is connected with medium-scale travelling ionospheric disturbances (MSTIDs). The associated shear flows and plasma density troughs point to interchange and Kelvin-Helmholtz type instabilities as a possible source of plasma irregularities. A model of the nonlinear development of these instabilities, based on a two-fluid hydrodynamic description including finite Larmor radius effects, will be presented. This approach makes it possible to resolve density irregularities on the meter scale. A numerical code in C was developed to solve the derived nonlinear equations for the analysis of interchange and flow velocity shear instabilities in the ionosphere. This code will be used to analyze the competition between interchange and Kelvin-Helmholtz instabilities in the mid-latitude region. The high-resolution simulations with continuous density and velocity profiles will be driven by ambient conditions corresponding to the in situ data obtained during the 2016 Daejeon (Korea) and MU (Japan) radar campaign and data collected simultaneously by the Swarm satellites as they passed over Korea and Japan. PA approved #: 88ABW-2017-3641

  1. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  2. Quantitative Assessment of Mammary Gland Density in Rodents Using Digital Image Analysis

    Directory of Open Access Journals (Sweden)

    Thompson Henry J

    2011-06-01

    Full Text Available Abstract Background Rodent models have been used extensively to study mammary gland development and for studies of toxicology and carcinogenesis. Mammary gland gross morphology can be visualized via the excision of intact mammary gland chains following fixation and staining with carmine, using a tissue preparation referred to as a whole mount. Methods are described for the automated collection of digital images from an entire mammary gland whole mount and for the interrogation of the digital data using a "masking" technique available with Image-Pro® Plus image analysis software (Media Cybernetics, Silver Spring, MD). Results Parallel to mammographic analysis in humans, measurements of rodent mammary gland density were derived from area-based or volume-based algorithms and included: total circumscribed mammary fat pad mass, mammary epithelial mass, and epithelium-free fat pad mass. These values permitted estimation of the absolute mass of mammary epithelium as well as breast density. The biological plausibility of these measurements was evaluated in mammary whole mounts from rats and mice. During mammary gland development, absolute epithelial mass increased linearly without significant changes in mammographic density. Treatment of rodents with tamoxifen or 9-cis-retinoic acid, ovariectomy, and diet-induced obesity decreased both absolute epithelial mass and mammographic density. The area and volumetric methods gave similar results. Conclusions Digital image analysis can be used for screening agents for potential impact on reproductive toxicity or carcinogenesis as well as for mechanistic studies, particularly of cumulative effects on mammary epithelial mass, and for translational studies of mechanisms that explain the relationship between epithelial mass and cancer risk.
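    The area-based masking measurements described above reduce to simple arithmetic on binary masks. The sketch below uses toy arrays in place of segmented whole-mount images; the mask shapes and sizes are illustrative assumptions only.

    ```python
    import numpy as np

    # Toy binary masks standing in for segmented whole-mount images: the fat
    # pad mask circumscribes the gland; the epithelium mask marks stained
    # (carmine-positive) pixels.  Real masks would come from thresholding the
    # digitized whole mount; these arrays are illustrative only.
    fat_pad = np.zeros((100, 100), dtype=bool)
    fat_pad[10:90, 10:90] = True                    # 6400-pixel fat pad

    epithelium = np.zeros_like(fat_pad)
    epithelium[30:50, 30:70] = True                 # 800-pixel epithelial region

    # Area-based analogues of the paper's three measurements
    fat_pad_area = fat_pad.sum()
    epithelial_area = (epithelium & fat_pad).sum()
    epithelium_free_area = fat_pad_area - epithelial_area
    percent_density = 100.0 * epithelial_area / fat_pad_area  # "mammographic" density

    print(f"density = {percent_density:.1f}%")
    ```

    The volumetric variant would weight each pixel by an optical-density estimate of tissue thickness instead of counting pixels.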

  3. An Analysis of Methods Section of Research Reports in Applied Linguistics

    Directory of Open Access Journals (Sweden)

    Patrícia Marcuzzo

    2011-10-01

    Full Text Available This work aims at identifying the analytical categories and research procedures adopted in the analysis of research articles in Applied Linguistics/EAP, in order to propose a systematization of research procedures in Genre Analysis. For that purpose, 12 research reports and interviews with four authors were analyzed. The analysis showed that the studies concentrate on investigating either the macrostructure or the microstructure of research articles in different fields. Studies of the microstructure report exclusively the analysis of grammatical elements, and studies of the macrostructure investigate the language with the purpose of identifying patterns of organization in written discourse. If the objective of these studies is in fact to develop a genre analysis that contributes to the teaching of reading and writing in EAP, they should include an ethnographic perspective that analyzes the genre in its context.

  4. [Statistical analysis of articles in "Chinese journal of applied physiology" from 1999 to 2008].

    Science.gov (United States)

    Du, Fei; Fang, Tao; Ge, Xue-ming; Jin, Peng; Zhang, Xiao-hong; Sun, Jin-li

    2010-05-01

    To evaluate the academic level and influence of "Chinese Journal of Applied Physiology", a statistical analysis of the fund-sponsored articles published over the past ten years was performed. The articles of "Chinese Journal of Applied Physiology" from 1999 to 2008 were investigated. The number and percentage of fund-sponsored articles, the funding organizations, and the author regions were quantitatively analyzed using bibliometric methods. The number of fund-sponsored articles increased steadily. The proportion of funding from local governments was significantly higher in the latter five years. Most of the articles came from institutes located in Beijing, Zhejiang and Tianjin. "Chinese Journal of Applied Physiology" has a good academic level and social influence.

  5. A study in the founding of applied behavior analysis through its publications.

    Science.gov (United States)

    Morris, Edward K; Altus, Deborah E; Smith, Nathaniel G

    2013-01-01

    This article reports a study of the founding of applied behavior analysis through its publications. Our methods included hand searches of sources (e.g., journals, reference lists), search terms (i.e., early, applied, behavioral, research, literature), inclusion criteria (e.g., the field's applied dimension), and challenges to their face and content validity. Our results were 36 articles published between 1959 and 1967 that we organized into 4 groups: 12 in 3 programs of research and 24 others. Our discussion addresses (a) limitations in our method (e.g., the completeness of our search), (b) challenges to the validity of our methods and results (e.g., convergent validity), and (c) priority claims about the field's founding. We conclude that the claims are irresolvable because identification of the founding publications depends significantly on methods and because the field's founding was an evolutionary process. We close with suggestions for future research.

  6. Response and reliability analysis of nonlinear uncertain dynamical structures by the probability density evolution method

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Peng, Yongbo; Sichani, Mahdi Teimouri

    2016-01-01

    The paper deals with the response and reliability analysis of hysteretic or geometric nonlinear uncertain dynamical systems of arbitrary dimensionality driven by stochastic processes. The approach is based on the probability density evolution method proposed by Li and Chen (Stochastic dynamics...... of structures, 1st edn. Wiley, London, 2009; Probab Eng Mech 20(1):33–44, 2005), which circumvents the dimensional curse of traditional methods for the determination of non-stationary probability densities based on Markov process assumptions and the numerical solution of the related Fokker–Planck and Kolmogorov......–Feller equations. The main obstacle of the method is that a multi-dimensional convolution integral needs to be carried out over the sample space of a set of basic random variables, for which reason the number of these need to be relatively low. In order to handle this problem an approach is suggested, which...

  7. From Metal Cluster to Metal Nanowire: A Topological Analysis of Electron Density and Band Structure Calculation

    Directory of Open Access Journals (Sweden)

    Yu Wang

    2002-01-01

    Full Text Available Abstract: We investigate a theoretical model of a molecular metal wire constructed from linear polynuclear metal complexes. In particular we study the linear Crn metal complexes and a Cr molecular metal wire. The electron density distributions of the model nanowire and the linear Crn metal complexes, with n = 3, 5, and 7, are calculated by employing the CRYSTAL98 package with topological analysis. The preliminary results indicate that the bonding types between any two neighboring Cr atoms are all the same, namely the polarized open-shell interaction. The pattern of electron density distribution in the metal complexes resembles that of the model Cr nanowire as the number of metal ions increases. The conductivity of the model Cr nanowire is also tested by performing a band structure calculation.

  8. Applying behavior analysis to school violence and discipline problems: Schoolwide positive behavior support

    Science.gov (United States)

    Anderson, Cynthia M.; Kincaid, Donald

    2005-01-01

    School discipline is a growing concern in the United States. Educators frequently are faced with discipline problems ranging from infrequent but extreme problems (e.g., shootings) to less severe problems that occur at high frequency (e.g., bullying, insubordination, tardiness, and fighting). Unfortunately, teachers report feeling ill prepared to deal effectively with discipline problems in schools. Further, research suggests that many commonly used strategies, such as suspension, expulsion, and other reactive strategies, are not effective for ameliorating discipline problems and may, in fact, make the situation worse. The principles and technology of behavior analysis have been demonstrated to be extremely effective for decreasing problem behavior and increasing social skills exhibited by school children. Recently, these principles and techniques have been applied at the level of the entire school, in a movement termed schoolwide positive behavior support. In this paper we review the tenets of schoolwide positive behavior support, demonstrating the relation between this technology and applied behavior analysis. PMID:22478439

  9. Finite mixture model applied in the analysis of a turbulent bistable flow on two parallel circular cylinders

    Energy Technology Data Exchange (ETDEWEB)

    Paula, A.V. de, E-mail: vagtinski@mecanica.ufrgs.br [PROMEC – Programa de Pós Graduação em Engenharia Mecânica, UFRGS – Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil); Möller, S.V., E-mail: svmoller@ufrgs.br [PROMEC – Programa de Pós Graduação em Engenharia Mecânica, UFRGS – Universidade Federal do Rio Grande do Sul, Porto Alegre, RS (Brazil)

    2013-11-15

    This paper presents a study of the bistable phenomenon which occurs in the turbulent flow impinging on circular cylinders placed side-by-side. Time series of axial and transversal velocity obtained with the constant temperature hot wire anemometry technique in an aerodynamic channel are used as input data in a finite mixture model, to classify the observed data according to a family of probability density functions. Wavelet transforms are applied to analyze the unsteady turbulent signals. Results of flow visualization show that the flow is predominantly two-dimensional. A double-well energy model is suggested to describe the behavior of the bistable phenomenon in this case. -- Highlights: ► Bistable flow on two parallel cylinders is studied with hot wire anemometry as a first step for the application on the analysis to tube bank flow. ► The method of maximum likelihood estimation is applied to hot wire experimental series to classify the data according to PDF functions in a mixture model approach. ► Results show no evident correlation between the changes of flow modes with time. ► An energy model suggests the presence of more than two flow modes.
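    The finite-mixture classification described above amounts to fitting a family of probability density functions to the velocity samples by maximum likelihood. A minimal sketch for a two-component (two flow modes) Gaussian mixture, fitted by the EM algorithm on synthetic bistable-like data, is shown below; the data and component shapes are assumptions for illustration, not the hot-wire measurements.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # Synthetic bimodal "velocity" samples standing in for a hot-wire series:
    # two flow modes with different mean velocities (values are illustrative).
    data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(2.0, 0.5, 500)])

    # Two-component Gaussian mixture fitted by EM (maximum likelihood)
    mu = np.array([-1.0, 1.0])          # initial guesses
    sigma = np.array([1.0, 1.0])
    w = np.array([0.5, 0.5])

    def gauss(x, m, s):
        return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

    for _ in range(50):
        # E-step: responsibility of each mode for each sample
        resp = w[:, None] * gauss(data[None, :], mu[:, None], sigma[:, None])
        resp /= resp.sum(axis=0)
        # M-step: update weights, means, standard deviations
        nk = resp.sum(axis=1)
        w = nk / len(data)
        mu = (resp * data).sum(axis=1) / nk
        sigma = np.sqrt((resp * (data - mu[:, None]) ** 2).sum(axis=1) / nk)

    print("flow-mode means:", np.sort(mu))
    ```

    The fitted responsibilities classify each sample to a flow mode, which is the basis for studying how long the flow dwells in each well of the double-well energy model.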

  10. Apollo remote analysis system applied to surface and underwater in-situ elemental analysis

    International Nuclear Information System (INIS)

    Evans, L.G.; Bielefeld, M.J.; Eller, E.L.; Schmadebeck, R.L.; Trombka, J.I.; Mustafa, M.G.; Senftle, F.E.; Heath, R.L.; Stehling, K.; Vadus, J.

    1976-01-01

    The surveying of the elemental composition of bulk samples over extended areas in near real-time would be an invaluable tool for surface and underwater environmental analysis. However, few techniques provide such a capability. Based on the experience from the orbital gamma-ray spectrometer experiments on Apollo 15 and 16 in which elemental composition of large portions of the moon were determined, an analysis system has been developed for terrestrial applications, which can fulfill these requirements. A portable, compact pulsed neutron generator and NaI(Tl) detector system coupled to associated electronics under mini-computer control can provide the timing and spectral characteristics necessary to determine elemental composition for many applications. Field trials of the system for underwater elemental analysis are planned during the next year

  11. Laboratory Performance of Five Selected Soil Moisture Sensors Applying Factory and Own Calibration Equations for Two Soil Media of Different Bulk Density and Salinity Levels

    Science.gov (United States)

    Matula, Svatopluk; Báťková, Kamila; Legese, Wossenu Lemma

    2016-01-01

    Non-destructive soil water content determination is a fundamental component of many agricultural and environmental applications. The accuracy and cost of the sensors define the measurement scheme and the ability to fit natural heterogeneous conditions. The aim of this study was to evaluate five commercially available and relatively inexpensive sensors, usually grouped with impedance and FDR sensors. ThetaProbe ML2x (impedance) and ECH2O EC-10, ECH2O EC-20, ECH2O EC-5, and ECH2O TE (all FDR) were tested on silica sand and loess of defined characteristics under controlled laboratory conditions. The calibrations were carried out at nine consecutive soil water contents, from dry to saturated conditions (pure water and saline water). The gravimetric method was used as the reference method for the statistical evaluation (ANOVA with significance level 0.05). Generally, the results showed that our own calibrations led to more accurate soil moisture estimates. Variance component analysis ranked the factors contributing to the total variation as follows: calibration (42%), sensor type (29%), material (18%), and dry bulk density (11%). All the tested sensors performed very well within the whole range of water content, especially the sensors ECH2O EC-5 and ECH2O TE, which also performed surprisingly well in saline conditions. PMID:27854263
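    The comparison between a factory calibration equation and one fitted against the gravimetric reference can be sketched as a pair of linear calibrations scored by RMSE. All readings and coefficients below are synthetic assumptions, not the study's data.

    ```python
    import numpy as np

    # Synthetic calibration data: raw sensor readout vs. volumetric water
    # content from the gravimetric reference method.  Values are illustrative.
    raw = np.array([0.10, 0.20, 0.35, 0.50, 0.65, 0.80])
    theta_ref = np.array([0.02, 0.07, 0.15, 0.23, 0.31, 0.39])   # m3/m3

    # Hypothetical "factory" linear equation vs. our own least-squares fit
    theta_factory = 0.60 * raw - 0.08
    slope, intercept = np.polyfit(raw, theta_ref, 1)
    theta_own = slope * raw + intercept

    rmse = lambda a, b: np.sqrt(np.mean((a - b) ** 2))
    print(f"factory RMSE = {rmse(theta_factory, theta_ref):.4f}")
    print(f"own-fit RMSE = {rmse(theta_own, theta_ref):.4f}")
    ```

    Fitting per material and bulk density, as in the study, would simply repeat this fit for each soil medium.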

  12. Applying circular economy innovation theory in business process modeling and analysis

    Science.gov (United States)

    Popa, V.; Popa, L.

    2017-08-01

    The overall aim of this paper is to develop a new conceptual framework for business process modeling and analysis using circular economy innovation theory as a source for business knowledge management. The last part of the paper presents the authors' proposed basic structure for new business models applying circular economy innovation theories. For people working on new innovative business models in the field of the circular economy, this paper provides new ideas for clustering their concepts.

  13. The 2D Spectral Intrinsic Decomposition Method Applied to Image Analysis

    Directory of Open Access Journals (Sweden)

    Samba Sidibe

    2017-01-01

    Full Text Available We propose a new method for autoadaptive image decomposition and recomposition based on the two-dimensional version of the Spectral Intrinsic Decomposition (SID). We introduce a faster diffusivity function for the computation of the mean envelope operator, which provides the components of the SID algorithm for any signal. The 2D version of the SID algorithm is implemented and applied to several well-known test images. We extracted relevant components and obtained promising results in image analysis applications.

  14. Applying the Goal-Question-Indicator-Metric (GQIM) Method to Perform Military Situational Analysis

    Science.gov (United States)

    2016-05-11

    When developing situational awareness in support of military operations, the U.S. armed forces use a mnemonic, or memory aide, to... Applying the Goal-Question-Indicator-Metric (GQIM) Method to Perform Military Situational Analysis, Douglas Gray, May 2016. Acknowledgments: The subject matter covered in this technical note evolved from an excellent question from Capt. Tomomi Ogasawara, Japan Ground Self-Defense Force.

  15. Assimilation of tourism satellite accounts and applied general equilibrium models to inform tourism policy analysis

    OpenAIRE

    Rossouw, Riaan; Saayman, Melville

    2011-01-01

    Historically, tourism policy analysis in South Africa has posed challenges to accurate measurement. The primary reason for this is that tourism is not designated as an 'industry' in standard economic accounts. This paper therefore demonstrates the relevance and need for applied general equilibrium (AGE) models to be completed and extended through an integration with tourism satellite accounts (TSAs) as a tool for policy makers (especially tourism policy makers) in South Africa. The paper sets...

  16. Applied behavior analysis as intervention for autism: definition, features and philosophical concepts

    Directory of Open Access Journals (Sweden)

    Síglia Pimentel Höher Camargo

    2013-11-01

    Full Text Available Autism spectrum disorder (ASD) is a lifelong pervasive developmental disorder with no known cause or cure. However, educational and behavioral interventions with a foundation in applied behavior analysis (ABA) have been shown to improve a variety of skill areas such as communication, social, academic, and adaptive behaviors of individuals with ASD. The goal of this work is to present the definition, features and philosophical concepts that underlie ABA and make this science an effective intervention method for people with autism.

  17. The x-rays fluorescence applied to the analysis of alloys

    International Nuclear Information System (INIS)

    Gutierrez, D.A.

    1997-01-01

    This work is based on the use of X-ray fluorescence, a non-destructive testing technique, with the purpose of establishing a routine method for the compositional control of the industrial samples used. The analysis combines the Rasberry-Heinrich and Claisse-Thinh algorithms, together with the numerical implementation of techniques unusual in this type of analysis, such as linear programming applied to the solution of overdetermined systems of equations, and relaxation methods to facilitate convergence to the solutions. (author) [es

  18. Classical linear-control analysis applied to business-cycle dynamics and stability

    Science.gov (United States)

    Wingrove, R. C.

    1983-01-01

    Linear control analysis is applied as an aid in understanding the fluctuations of business cycles in the past, and to examine monetary policies that might improve stabilization. The analysis shows how different policies change the frequency and damping of the economic system dynamics, and how they modify the amplitude of the fluctuations that are caused by random disturbances. Examples are used to show how policy feedbacks and policy lags can be incorporated, and how different monetary strategies for stabilization can be analytically compared. Representative numerical results are used to illustrate the main points.
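    As a minimal sketch of this kind of analysis, consider a stylized second-order business-cycle model with a stabilization feedback: the closed-loop eigenvalues give the frequency and damping of the fluctuations, and increasing the policy gain raises the damping. The model structure and coefficients are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    # Stylized second-order "economy" x' = A x with a policy feedback u = -K x
    # entering through B.  Coefficients are hypothetical; the point is how the
    # feedback gain changes the damping of the closed-loop cyclical mode.
    A = np.array([[0.0, 1.0],
                  [-1.0, 0.05]])      # open loop: a slowly growing oscillation
    B = np.array([[0.0], [1.0]])

    def modes(k):
        """Natural frequency and damping ratio of the closed loop for gain k."""
        K = np.array([[0.0, k]])                   # rate (derivative) feedback
        eig = np.linalg.eigvals(A - B @ K)
        wn = np.abs(eig[0])                        # natural frequency
        zeta = -eig[0].real / wn                   # damping ratio
        return wn, zeta

    for k in (0.0, 0.5, 1.0):
        wn, zeta = modes(k)
        print(f"k={k}: natural frequency {wn:.2f}, damping ratio {zeta:+.2f}")
    ```

    With zero gain the damping ratio is negative (fluctuations grow); raising the gain moves the eigenvalues leftward, damping the cycle without changing its natural frequency in this particular model.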

  19. The Applied Behavior Analysis Research Paradigm and Single-Subject Designs in Adapted Physical Activity Research.

    Science.gov (United States)

    Haegele, Justin A; Hodge, Samuel Russell

    2015-10-01

    There are basic philosophical and paradigmatic assumptions that guide scholarly research endeavors, including the methods used and the types of questions asked. Through this article, kinesiology faculty and students with interests in adapted physical activity are encouraged to understand the basic assumptions of applied behavior analysis (ABA) methodology for conducting, analyzing, and presenting research of high quality in this paradigm. The purposes of this viewpoint paper are to present information fundamental to understanding the assumptions undergirding research methodology in ABA, describe key aspects of single-subject research designs, and discuss common research designs and data-analysis strategies used in single-subject studies.

  20. On the use of the term 'frequency' in applied behavior analysis.

    Science.gov (United States)

    Carr, James E; Nosik, Melissa R; Luke, Molli M

    2018-04-01

    There exists a terminological problem in applied behavior analysis: the term frequency has been used as a synonym for both rate (the number of responses per time) and count (the number of responses). To guide decisions about the use and meaning of frequency, we surveyed the usage of frequency in contemporary behavior-analytic journals and textbooks and found that the predominant usage of frequency was as count, not rate. Thus, we encourage behavior analysts to use frequency as a synonym for count. © 2018 Society for the Experimental Analysis of Behavior.

  1. A parameter estimation and identifiability analysis methodology applied to a street canyon air pollution model

    DEFF Research Database (Denmark)

    Ottosen, T. B.; Ketzel, Matthias; Skov, H.

    2016-01-01

    Mathematical models are increasingly used in environmental science thus increasing the importance of uncertainty and sensitivity analyses. In the present study, an iterative parameter estimation and identifiability analysis methodology is applied to an atmospheric model – the Operational Street...... of the identifiability analysis, showed that some model parameters were significantly more sensitive than others. The application of the determined optimal parameter values was shown to successfully equilibrate the model biases among the individual streets and species. It was as well shown that the frequentist approach...

  2. Fatigue Analysis of Tubesheet/Shell Juncture Applying the Mitigation Factor for Over-conservatism

    International Nuclear Information System (INIS)

    Kang, Deog Ji; Kim, Kyu Hyoung; Lee, Jae Gon

    2009-01-01

    If the environmental fatigue requirements are applied to the primary components of a nuclear power plant, to which the present ASME Code fatigue curves are applied, some locations with a high CUF (Cumulative Usage Factor) are anticipated not to meet the code criteria. The application of environmental fatigue damage is still particularly controversial for plants with 60-year design lives. It is therefore necessary to develop a detailed fatigue analysis procedure that identifies the conservatisms in the procedure and lowers the cumulative usage factor. Several factors are being considered to mitigate the conservatism, such as three-dimensional finite element modeling. In the present analysis, actual pressure transient data, instead of conservative maximum and minimum pressure data, were applied as one of the mitigation factors. Unlike in the general method, individual transient events were considered instead of grouped transient events. The tubesheet/shell juncture in the steam generator assembly is one of the weak locations and was therefore selected as the target for evaluating the mitigation factor in the present analysis

  3. Continuous Wavelet and Hilbert-Huang Transforms Applied for Analysis of Active and Reactive Power Consumption

    Directory of Open Access Journals (Sweden)

    Avdakovic Samir

    2014-08-01

    Full Text Available Analysis of power consumption presents a very important issue for power distribution system operators. Some power system processes such as planning, demand forecasting, development, etc., require a complete understanding of the behaviour of power consumption for the observed area, which requires appropriate techniques for the analysis of available data. In this paper, two different time-frequency techniques are applied to the analysis of hourly values of active and reactive power consumption from one real power distribution transformer substation in an urban part of the city of Sarajevo. Using the continuous wavelet transform (CWT) with the wavelet power spectrum and the global wavelet spectrum, some properties of the analysed time series are determined. Then, empirical mode decomposition (EMD) and the Hilbert-Huang Transform (HHT) are applied to the analysis of the same time series, and the results showed that both applied approaches can provide very useful information about the behaviour of power consumption for the observed time interval and different period (frequency) bands. It can also be noticed that the results obtained by the global wavelet spectrum and the marginal Hilbert spectrum are very similar, thus confirming that both approaches could be used for identification of the main properties of active and reactive power consumption time series.
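    A minimal sketch of the CWT/global-wavelet-spectrum step is shown below: a Morlet wavelet power spectrum of a synthetic hourly consumption-like series, which recovers the daily (24 h) periodicity. The series is synthetic and the implementation is a bare-bones convolution, not the paper's toolchain; the scale-to-period conversion uses the standard Morlet relation.

    ```python
    import numpy as np

    # Synthetic "hourly active power" series with a daily (24 h) cycle plus
    # noise, standing in for the substation measurements (illustrative only).
    rng = np.random.default_rng(2)
    t = np.arange(24 * 60)                          # 60 days of hourly data
    series = 1.0 + 0.5 * np.sin(2 * np.pi * t / 24) + 0.1 * rng.normal(size=t.size)
    series -= series.mean()

    def morlet_power(sig, period, w0=6.0):
        """Mean wavelet power at one Fourier period via Morlet convolution."""
        scale = period * (w0 + np.sqrt(2 + w0 ** 2)) / (4 * np.pi)
        u = np.arange(-4 * scale, 4 * scale + 1) / scale
        wavelet = np.pi ** -0.25 * np.exp(1j * w0 * u) * np.exp(-u ** 2 / 2)
        coef = np.convolve(sig, np.conj(wavelet)[::-1], mode="same") / np.sqrt(scale)
        return np.mean(np.abs(coef) ** 2)           # global wavelet spectrum

    periods = np.arange(6, 49)                      # 6 h to 48 h
    gws = np.array([morlet_power(series, p) for p in periods])
    print("dominant period:", periods[np.argmax(gws)], "hours")
    ```

    On real substation data the global wavelet spectrum would typically also show weekly and seasonal bands alongside the daily peak.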

  4. PRO-ELICERE: A Hazard Analysis Automation Process Applied to Space Systems

    Directory of Open Access Journals (Sweden)

    Tharcius Augusto Pivetta

    2016-07-01

    Full Text Available In the last decades, critical systems have increasingly been developed using computers and software, even in the space area, where the project approach is usually very conservative. In projects involving rockets, satellites and their facilities, such as ground support systems and simulators, among other operations critical to the space mission, a hazard analysis must be applied. The ELICERE process was created to perform hazard analysis mainly on computer-based critical systems, in order to define or evaluate their safety and dependability requirements, strongly based on the Hazard and Operability Study and Failure Mode and Effect Analysis techniques. It aims to improve the project design or to understand the potential hazards of existing systems, improving their functional and non-functional requirements. The main goal of the ELICERE process is thus to ensure the safety and dependability goals of a space mission. The process was initially created to operate manually and incrementally. A software tool called PRO-ELICERE has since been developed to facilitate the analysis process and to store the results for reuse in other system analyses. To illustrate how ELICERE and its tool work, a small space case study is presented, based on a hypothetical rocket of the Cruzeiro do Sul family, developed by the Instituto de Aeronáutica e Espaço in Brazil.

  5. The evolution of applied harmonic analysis models of the real world

    CERN Document Server

    Prestini, Elena

    2016-01-01

    A sweeping exploration of the development and far-reaching applications of harmonic analysis such as signal processing, digital music, optics, radio astronomy, crystallography, medical imaging, spectroscopy, and more. Featuring a wealth of illustrations, examples, and material not found in other harmonic analysis books, this unique monograph skillfully blends together historical narrative with scientific exposition to create a comprehensive yet accessible work. While only an understanding of calculus is required to appreciate it, there are more technical sections that will charm even specialists in harmonic analysis. From undergraduates to professional scientists, engineers, and mathematicians, there is something for everyone here. The second edition of The Evolution of Applied Harmonic Analysis contains a new chapter on atmospheric physics and climate change, making it more relevant for today’s audience. Praise for the first edition: "…can be thoroughly recommended to any reader who is curious about the ...

  6. Analysis of LSD in human body fluids and hair samples applying ImmunElute columns.

    Science.gov (United States)

    Röhrich, J; Zörntlein, S; Becker, J

    2000-01-10

    Immunoaffinity extraction units (LSD ImmunElute) are commercially available for the analysis of lysergic acid diethylamide (LSD) in urine. The ImmunElute resin contains immobilized monoclonal antibodies to LSD. We applied the ImmunElute procedure to serum and also to human hair samples. For hair analysis, the samples were first extracted with methanol under sonication; the extracts were then purified using the ImmunElute resin. LSD analysis was carried out with HPLC and fluorescence detection. The immunoaffinity extraction provides highly purified extracts for chromatographic analysis. The limit of detection (signal-to-noise ratio = 3) was determined, and the method was applied to hair samples from drug abusers (n = 11). One of these samples tested positive, with an amount of 110 pg LSD in 112 mg of extracted hair, corresponding to a concentration of 1 pg/mg.

  7. The Dynamic Mechanical Analysis of Highly Filled Rice Husk Biochar/High-Density Polyethylene Composites

    Directory of Open Access Journals (Sweden)

    Qingfa Zhang

    2017-11-01

    Full Text Available In this study, rice husk biochar/high-density polyethylene (HDPE) composites were prepared via melt mixing followed by extrusion. The effects of biochar content and testing temperature on the dynamic mechanical analysis (DMA) behavior of the composites were studied. The morphology of the rice husk biochar and the composites was examined by scanning electron microscopy (SEM). The results showed that the biochar had a positive effect on the dynamic viscoelasticity, creep resistance and stress relaxation properties of the composites, although the creep resistance and stress relaxation of the composites decreased with increasing temperature. SEM analysis showed that the HDPE component was embedded in the pores of the rice husk biochar, and it is believed that a strong interfacial interaction was achieved.

  8. A menu-driven software package of Bayesian nonparametric (and parametric) mixed models for regression analysis and density estimation.

    Science.gov (United States)

    Karabatsos, George

    2017-02-01

    Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected

  9. Principal component analysis based unsupervised feature extraction applied to budding yeast temporally periodic gene expression.

    Science.gov (United States)

    Taguchi, Y-H

    2016-01-01

    The recently proposed principal component analysis (PCA) based unsupervised feature extraction (FE) has successfully been applied to various bioinformatics problems, ranging from biomarker identification to the screening of disease-causing genes using gene expression/epigenetic profiles. However, the conditions required for its successful use, and the mechanisms by which it outperforms other supervised methods, are unknown, because PCA based unsupervised FE has only been applied to challenging (i.e. not well known) problems. In this study, PCA based unsupervised FE was applied to an extensively studied organism, budding yeast. When applied to two gene expression profiles expected to be temporally periodic, the yeast metabolic cycle (YMC) and the yeast cell division cycle (YCDC), PCA based unsupervised FE outperformed simple but powerful conventional methods such as sinusoidal fitting in several respects: (i) feasible biological term enrichment without assuming periodicity for YMC; (ii) identification of periodic profiles whose period was half as long as the cell division cycle for YMC; and (iii) the identification of no more than 37 genes associated with the enrichment of biological terms related to the cell division cycle in the integrated analysis of seven YCDC profiles, for which sinusoidal fittings failed. The explanation for the differences between the methods, and the conditions necessary for success, were determined by comparing PCA based unsupervised FE with fittings to various periodic (artificial, thus pre-defined) profiles. Furthermore, four popular unsupervised clustering algorithms applied to YMC were not as successful as PCA based unsupervised FE. PCA based unsupervised FE is thus a useful and effective unsupervised method for investigating YMC and YCDC. This study also identified why the unsupervised method, which needs no pre-judged criteria, outperformed supervised methods requiring human-defined criteria.
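    The abstract does not specify the selection rule, so the following is only a minimal sketch of the general idea behind PCA based unsupervised FE: compute gene-side principal component loadings and flag genes whose loadings are outliers, with no class labels or assumed periodicity. All thresholds and the synthetic data are invented for illustration.

```python
import numpy as np

def pca_unsupervised_fe(X, n_pc=2, sd_thresh=3.0):
    # Center each gene's profile, take the SVD, and flag genes whose
    # loadings on the first n_pc principal components are outliers.
    Xc = X - X.mean(axis=1, keepdims=True)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    loadings = U[:, :n_pc]
    z = (loadings - loadings.mean(axis=0)) / loadings.std(axis=0)
    return np.where((np.abs(z) > sd_thresh).any(axis=1))[0]

# Synthetic check: 100 gene profiles over 12 time points; the first
# 5 genes carry a periodic signal, the rest are pure noise.
rng = np.random.default_rng(0)
t = np.arange(12)
X = rng.normal(0.0, 0.1, size=(100, 12))
X[:5] += np.sin(2 * np.pi * t / 12)
selected = pca_unsupervised_fe(X)
```

    Note that the selection never uses the sinusoidal form of the signal; the periodic genes are recovered only because they dominate the leading principal components.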

  10. Neutron activation analysis as applied to instrumental analysis of trace elements from seawater

    International Nuclear Information System (INIS)

    Boniforti, R.; Moauro, A.; Madaro, M.

    1983-01-01

    Particulate matter collected from the coastal area delimited by the mouth of the river Volturno and the Sabaudia lake has been analyzed by instrumental neutron activation analysis for its content of twenty-two trace elements. The results for surface water and bottom water are reported separately, thus evidencing the effect of sampling depth on the concentration of many elements. The necessity of accurately 'cleaning' the filters before use is stressed.

  11. New approach to detect and classify stroke in skull CT images via analysis of brain tissue densities.

    Science.gov (United States)

    Rebouças Filho, Pedro P; Sarmento, Róger Moura; Holanda, Gabriel Bandeira; de Alencar Lima, Daniel

    2017-09-01

    Cerebral vascular accident (CVA), also known as stroke, is an important health problem that affects 16 million people worldwide every year. About 30% of those who have a stroke die and 40% remain with serious physical limitations. However, recovery in the damaged region is possible if treatment is performed immediately. In the case of a stroke, computed tomography (CT) is the most appropriate technique to confirm the occurrence and to investigate its extent and severity. Stroke is an emergency for which early identification and measures are difficult; however, computer-aided diagnosis (CAD) can play an important role by obtaining information imperceptible to the human eye. Thus, this work proposes a new method for extracting features based on radiological density patterns of the brain, called Analysis of Brain Tissue Density (ABTD). The proposed method is a specific approach applied to CT images to identify and classify the occurrence of stroke. The results of the proposed ABTD extractor were compared with extractors already established in the literature, such as features from the Gray-Level Co-occurrence Matrix (GLCM), Local Binary Patterns (LBP), Central Moments (CM), Statistical Moments (SM), Hu's Moments (HM) and Zernike's Moments (ZM). Using a database of 420 CT images of the skull, each extractor was combined with classifiers such as MLP, SVM, kNN, OPF and Bayesian to classify whether a CT image represented a healthy brain or one with an ischemic or hemorrhagic stroke. ABTD had the shortest extraction time and the highest average accuracy (99.30%) when combined with OPF using the Euclidean distance, and the average accuracy values for all classifiers were higher than 95%. These results demonstrate that the ABTD method is a useful algorithm for extracting features that can potentially be integrated with CAD systems to assist in stroke diagnosis. Copyright © 2017 Elsevier B.V. All rights reserved.
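    The ABTD extractor itself is not described in the abstract, so the sketch below only illustrates the general pipeline it belongs to: summarize brain-window densities (Hounsfield units) as a feature vector, then feed those features to a simple classifier. The histogram features, the k-nearest-neighbour classifier, and the synthetic "slices" are all invented stand-ins, not the published method.

```python
import numpy as np

def density_features(image, bins=8, lo=0.0, hi=80.0):
    # Normalized histogram of radiological densities (Hounsfield units)
    # inside the brain window; hyperdense blood shifts mass upward.
    h, _ = np.histogram(image, bins=bins, range=(lo, hi))
    return h / h.sum()

def knn_predict(train_X, train_y, x, k=3):
    # Minimal k-nearest-neighbour vote with Euclidean distance.
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# Synthetic 'slices': healthy tissue ~30 HU; hyperdense lesions ~60 HU.
rng = np.random.default_rng(1)
healthy = [rng.normal(30, 5, (32, 32)) for _ in range(10)]
stroke = []
for _ in range(10):
    img = rng.normal(30, 5, (32, 32))
    img[8:16, 8:16] = rng.normal(60, 5, (8, 8))   # lesion patch
    stroke.append(img)
X = np.array([density_features(i) for i in healthy + stroke])
y = np.array([0] * 10 + [1] * 10)
probe = rng.normal(30, 5, (32, 32))
probe[0:8, 0:8] = rng.normal(60, 5, (8, 8))
pred = knn_predict(X, y, density_features(probe))
```

    Because the features are histograms of densities rather than spatial maps, the lesion is detected even though it appears in a different location in the probe image than in the training images.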

  12. A statistical analysis of the elastic distortion and dislocation density fields in deformed crystals

    KAUST Repository

    Mohamed, Mamdouh S.

    2015-05-18

    The statistical properties of the elastic distortion fields of dislocations in deforming crystals are investigated using the method of discrete dislocation dynamics to simulate dislocation structures and dislocation density evolution under tensile loading. Probability distribution functions (PDF) and pair correlation functions (PCF) of the simulated internal elastic strains and lattice rotations are generated for tensile strain levels up to 0.85%. The PDFs of simulated lattice rotation are compared with sub-micrometer resolution three-dimensional X-ray microscopy measurements of rotation magnitudes and deformation length scales in 1.0% and 2.3% compression strained Cu single crystals to explore the linkage between experiment and the theoretical analysis. The statistical properties of the deformation simulations are analyzed through determinations of the Nye and Kröner dislocation density tensors. The significance of the magnitudes and the length scales of the elastic strain and the rotation parts of dislocation density tensors are demonstrated, and their relevance to understanding the fundamental aspects of deformation is discussed.

  13. Density functional theory analysis of hexagonal close-packed elemental metal photocathodes

    Directory of Open Access Journals (Sweden)

    Tuo Li

    2015-07-01

    Full Text Available A density functional theory based analysis of photoemission from hexagonal close-packed (hcp) metals is presented, and the calculated values of the rms transverse momentum (Δp_{T}) are in good agreement with the available experimental data on Be [Phys. Rev. Lett. 111, 237401 (2013)] and Mg [Proceedings of LINAC 2002, Gyeongju, Korea (2002)]. The lattice constants and work functions of the hcp metals are also examined and are consistent with the available experimental values. In addition, emission from (0001)-oriented Be is examined owing to the presence of a strong surface state.

  14. Breast cancer research output, 1945-2008: a bibliometric and density-equalizing analysis

    LENUS (Irish Health Repository)

    Glynn, Ronan W

    2010-12-22

    Abstract Introduction Breast cancer is the most common form of cancer among women, with an estimated 194,280 new cases diagnosed in the United States in 2009 alone. The primary aim of this work was to provide an in-depth evaluation of research yield in breast cancer from 1945 to 2008, using large-scale data analysis, bibliometric indicators of production and quality, and density-equalizing mapping. Methods Data were retrieved from the Web of Science (WOS) Science Citation Index Expanded database; this was searched using the Boolean operator 'OR'

  15. Covariance analysis of finite temperature density functional theory: symmetric nuclear matter

    International Nuclear Information System (INIS)

    Rios, A; Maza, X Roca

    2015-01-01

    We study symmetric nuclear matter at finite temperature, with particular emphasis on the liquid-gas phase transition. We use a standard covariance analysis to propagate statistical uncertainties from the density functional to the thermodynamic properties. We use four functionals with known covariance matrices to obtain as wide a set of results as possible. Our findings suggest that thermodynamical properties are very well constrained by fitting data at zero temperature. The propagated statistical errors in the liquid-gas phase transition parameters are relatively small. (paper)
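    The covariance analysis the authors refer to is standard linear (first-order) error propagation: given the gradient g of an observable A with respect to the functional parameters and the parameter covariance matrix C from the fit, var(A) = g C gᵀ. A minimal sketch with invented numbers (not the functionals or covariances used in the paper):

```python
import numpy as np

def propagate(grad, cov):
    # Linear error propagation: var(A) = g C g^T for an observable A
    # with parameter gradient g and parameter covariance matrix C.
    g = np.asarray(grad, dtype=float)
    return float(g @ np.asarray(cov, dtype=float) @ g)

# Toy example: A depends on two correlated functional parameters.
C = np.array([[0.04, 0.01],
              [0.01, 0.09]])   # hypothetical parameter covariance
g = np.array([1.0, -0.5])      # hypothetical dA/dp
sigma_A = np.sqrt(propagate(g, C))
```

    The off-diagonal element of C matters: here the positive correlation between the parameters partially cancels against the opposite-signed gradient components, reducing the propagated uncertainty below the uncorrelated estimate.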

  16. Harmonic analysis of the ionospheric electron densities retrieved from FORMOSAT-3/COSMIC radio occultation measurements

    Science.gov (United States)

    Masoumi, S.; Safari, A.; Sharifi, M.; Sam Khaniani, A.

    2011-12-01

    In order to investigate regular variations of the ionosphere, least-squares harmonic estimation is applied to the time series of ionospheric electron densities over Iran derived from about five years of Global Positioning System radio occultation (GPS RO) observations by the FORMOSAT-3/COSMIC satellites. Although the obtained results differ slightly from the expected ones due to the low horizontal resolution of RO measurements, the high vertical resolution of the observations enables us to detect not only Total Electron Content (TEC) variations but also periodic patterns of electron densities at different altitudes of the ionosphere. Dominant diurnal and annual signals, together with their Fourier series decompositions, and also periods close to 27 days are obtained, consistent with previous analyses of TEC. In the equatorial anomaly band, the annual component is weaker than its Fourier decomposition periods. In particular, the semiannual period dominates the annual component, which is probably due to the effect of the geomagnetic field. Examining the frequencies at different local times shows that the semiannual signal is more significant than the annual one in the daytime, while the annual frequency is dominant at night. From the phases of the components, it is revealed that the annual signal has its maximum in summer at high altitudes and in winter at lower altitudes, suggesting the effect of neutral compositions in the lower atmosphere. Further, the semiannual component peaks around equinox during the day, while its maximum mostly occurs at solstice at night. Since RO measurements can be used to derive TEC along the signal path between a GPS satellite and a receiver, further study of the potential of these observations for predicting electron densities, and their application to the ionospheric correction of single-frequency receivers, is suggested.
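    Least-squares harmonic estimation of the kind used here amounts to fitting a cosine/sine pair at each candidate period and reading off the amplitudes. A minimal sketch on a synthetic daily series with annual and 27-day components (the periods, amplitudes and noise level are invented for illustration):

```python
import numpy as np

def ls_harmonic(t, y, periods):
    # Least-squares harmonic estimation: fit a constant plus a
    # cosine/sine pair for each candidate period; return amplitudes.
    cols = [np.ones_like(t)]
    for P in periods:
        w = 2 * np.pi / P
        cols += [np.cos(w * t), np.sin(w * t)]
    coef, *_ = np.linalg.lstsq(np.column_stack(cols), y, rcond=None)
    return np.hypot(coef[1::2], coef[2::2])   # amplitude per period

# Synthetic electron-density-like series: annual and 27-day signals.
t = np.arange(0.0, 3650.0)            # ten years, daily sampling
rng = np.random.default_rng(2)
y = (2.0 * np.sin(2 * np.pi * t / 365.25)
     + 0.5 * np.cos(2 * np.pi * t / 27.0)
     + 0.1 * rng.normal(size=t.size))
amps = ls_harmonic(t, y, [365.25, 27.0, 14.0])
```

    The amplitude at the absent 14-day candidate period stays at the noise level, which is how significant periods are separated from insignificant ones in this approach.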

  17. Analysis of Observation Data of Earth-Rockfill Dam Based on Cloud Probability Distribution Density Algorithm

    Directory of Open Access Journals (Sweden)

    Han Liwei

    2014-07-01

    Full Text Available Monitoring data on an earth-rockfill dam constitute a form of spatial data. Such data include much uncertainty owing to the limitations of the measurement information, material parameters, load, geometry, initial conditions, boundary conditions and the calculation model, so the cloud probability density of the monitoring data must be addressed. In this paper, the cloud theory model was used to handle the uncertain transition between a qualitative concept and its quantitative description. An improved algorithm for the cloud probability distribution density, based on a backward cloud generator, was then proposed. This was used to effectively convert parcels of accurate data into concepts that can be described by proper qualitative linguistic values. Such a qualitative description is expressed as the cloud numerical characteristics {Ex, En, He}, which represent the characteristics of all cloud drops. The algorithm was then applied to analyze the observation data of a piezometric tube in an earth-rockfill dam. Experimental results showed that the proposed algorithm is feasible: it reveals the changing regularity of the piezometric tube's water level, and damage from seepage in the dam body can be detected.
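    The paper's improved algorithm is not given in the abstract, but one common textbook form of a backward cloud generator estimates {Ex, En, He} from the drops alone: Ex from the mean, En from the first absolute moment, and He from the variance in excess of En². The sketch below implements that baseline form and checks it against drops generated forward from known characteristics; it is not the authors' improved variant.

```python
import numpy as np

def backward_cloud(x):
    # Estimate {Ex, En, He} from cloud drops alone: Ex from the mean,
    # En from the first absolute moment, He from the excess variance.
    ex = x.mean()
    en = np.sqrt(np.pi / 2.0) * np.abs(x - ex).mean()
    he = np.sqrt(max(x.var(ddof=1) - en**2, 0.0))
    return ex, en, he

# Forward/backward check: generate drops from known {Ex, En, He}.
rng = np.random.default_rng(3)
Ex, En, He = 10.0, 2.0, 0.5
en_i = rng.normal(En, He, 20000)           # per-drop entropy
drops = rng.normal(Ex, np.abs(en_i))       # normal cloud drops
ex_hat, en_hat, he_hat = backward_cloud(drops)
```

    The He estimate is the delicate one: it is a difference of two nearly equal quantities (sample variance and En²), which is one motivation for improved backward cloud generators such as the one proposed in the paper.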

  18. How to: Using Mode Analysis to Quantify, Analyze, and Interpret the Mechanisms of High-Density Collective Motion

    Directory of Open Access Journals (Sweden)

    Arianna Bottinelli

    2017-12-01

    Full Text Available While methods from statistical mechanics were some of the earliest analytical tools used to understand collective motion, the field has substantially expanded in scope beyond phase transitions and fluctuating order parameters. In part, this expansion is driven by the increasing variety of systems being studied, which in turn, has increased the need for innovative approaches to quantify, analyze, and interpret a growing zoology of collective behaviors. For example, concepts from material science become particularly relevant when considering the collective motion that emerges at high densities. Here, we describe methods originally developed to study inert jammed granular materials that have been borrowed and adapted to study dense aggregates of active particles. This analysis is particularly useful because it projects difficult-to-analyze patterns of collective motion onto an easier-to-interpret set of eigenmodes. Carefully viewed in the context of non-equilibrium systems, mode analysis identifies hidden long-range motions and localized particle rearrangements based solely on the knowledge of particle trajectories. In this work, we take a “how to” approach and outline essential steps, diagnostics, and know-how used to apply this analysis to study densely-packed active systems.
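    At its core, the mode analysis described here projects particle trajectories onto the eigenvectors of a displacement covariance matrix. A minimal sketch, with a synthetic "dense aggregate" invented for illustration (jittering particles on a grid plus one collective sloshing motion along x):

```python
import numpy as np

def displacement_modes(traj):
    # Diagonalize the covariance matrix of particle displacements from
    # their mean positions; eigenvectors are the collective modes.
    # traj shape: (n_frames, n_particles, 2)
    disp = (traj - traj.mean(axis=0)).reshape(traj.shape[0], -1)
    cov = disp.T @ disp / traj.shape[0]
    evals, evecs = np.linalg.eigh(cov)
    return evals[::-1], evecs[:, ::-1]     # sort by descending variance

# 25 jittering particles plus a collective x-sloshing motion, which
# should dominate the top mode.
rng = np.random.default_rng(6)
n_f, n_p = 200, 25
grid = np.stack(np.meshgrid(np.arange(5.0), np.arange(5.0)), -1).reshape(-1, 2)
traj = grid[None] + 0.02 * rng.normal(size=(n_f, n_p, 2))
traj[:, :, 0] += 0.3 * np.sin(np.linspace(0, 8 * np.pi, n_f))[:, None]
evals, evecs = displacement_modes(traj)
```

    As in the article, the long-range collective motion shows up as a single large-eigenvalue mode whose eigenvector is spatially extended (here, a nearly uniform x-translation), while the uncorrelated jitter fills the remaining low-variance modes.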

  19. Density-viscosity product of small-volume ionic liquid samples using quartz crystal impedance analysis.

    Science.gov (United States)

    McHale, Glen; Hardacre, Chris; Ge, Rile; Doy, Nicola; Allen, Ray W K; MacInnes, Jordan M; Bown, Mark R; Newton, Michael I

    2008-08-01

    Quartz crystal impedance analysis has been developed as a technique to assess whether room-temperature ionic liquids are Newtonian fluids and as a small-volume method for determining the values of their viscosity-density product, rho eta. Changes in the impedance spectrum of a 5-MHz fundamental frequency quartz crystal induced by a water-miscible room-temperature ionic liquid, 1-butyl-3-methylimidazolium trifluoromethylsulfonate ([C4mim][OTf]), were measured. From coupled frequency shift and bandwidth changes as the concentration was varied from 0 to 100% ionic liquid, it was determined that this liquid provided a Newtonian response. A second, water-immiscible ionic liquid, 1-butyl-3-methylimidazolium bis(trifluoromethanesulfonyl)imide ([C4mim][NTf2]), with concentration varied using methanol, was tested and also found to provide a Newtonian response. In both cases, the values of the square root of the viscosity-density product deduced from the small-volume quartz crystal technique were consistent with those measured using a viscometer and density meter. The third harmonic of the crystal was found to provide the closest agreement between the two measurement methods; the pure ionic liquids had the largest difference, of approximately 10%. In addition, 18 pure ionic liquids were tested, and for 11 of these, good-quality frequency shift and bandwidth data were obtained; all of these had a Newtonian response. The frequency shift of the third harmonic was found to vary linearly with the square root of the viscosity-density product of the pure ionic liquids up to a value of square root(rho eta) of approximately 18 kg m(-2) s(-1/2), but with a slope 10% smaller than that predicted by the Kanazawa and Gordon equation. It is envisaged that the quartz crystal technique could be used in a high-throughput microfluidic system for characterizing ionic liquids.
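    The Kanazawa-Gordon equation referenced at the end predicts the frequency shift of a quartz crystal loaded by a Newtonian liquid, Δf = −f₀^{3/2} √(ρη / (π ρ_q μ_q)), where ρ_q and μ_q are the density and shear modulus of AT-cut quartz. A quick sketch evaluating it for water on a 5 MHz crystal (the measured slopes in the paper fall about 10% below this prediction):

```python
import math

RHO_Q = 2648.0    # density of quartz, kg/m^3
MU_Q = 2.947e10   # shear modulus of AT-cut quartz, Pa

def kanazawa_shift(f0, rho_eta):
    # Kanazawa-Gordon frequency shift of a quartz crystal in contact
    # with a Newtonian liquid of density-viscosity product rho*eta.
    return -f0**1.5 * math.sqrt(rho_eta / (math.pi * RHO_Q * MU_Q))

# Water at 25 C on a 5 MHz crystal: rho ~ 997 kg/m^3, eta ~ 0.89 mPa s
df = kanazawa_shift(5.0e6, 997.0 * 0.89e-3)   # roughly -0.7 kHz
```

    Because Δf scales with √(ρη), a single calibrated crystal can report the viscosity-density product directly, which is exactly what makes the small-volume technique in the paper attractive for ionic liquids.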

  20. Dispersion interactions in density-functional theory: An adiabatic-connection analysis

    Science.gov (United States)

    Strømsheim, Marie D.; Kumar, Naveen; Coriani, Sonia; Sagvolden, Espen; Teale, Andrew M.; Helgaker, Trygve

    2011-11-01

    We present an analysis of the dispersion interaction energy and forces in density-functional theory from the point of view of the adiabatic connection between the Kohn-Sham non-interacting and fully interacting systems. Accurate coupled-cluster singles-doubles-perturbative-triples [CCSD(T)] densities are computed for the helium dimer and used to construct the exchange-correlation potential of Kohn-Sham theory, showing agreement with earlier results presented for the Hartree-Fock-Kohn-Sham method [M. Allen and D. J. Tozer, J. Chem. Phys. 117, 11113 (2002), 10.1063/1.1522715]. The accuracy of the methodology utilized to determine these solutions is checked by calculation of the Hellmann-Feynman forces based on the Kohn-Sham densities, which are compared with analytic CCSD(T) forces. To ensure that this comparison is valid in a finite atomic-orbital basis set, we employ floating Gaussian basis functions throughout and all results are counterpoise corrected. The subtle charge-rearrangement effects associated with the dispersion interaction are highlighted as the origin of a large part of the dispersion force. To recover the exchange-correlation components of the interaction energy, adiabatic connections are constructed for the supermolecular system and for its constituent atoms; subtraction of the resulting adiabatic-connection curves followed by integration over the interaction strength recovers the exchange-correlation contribution relevant to the density-functional description of the dispersion interaction. The results emphasize the long-ranged, dynamically correlated nature of the dispersion interaction between closed-shell species. An alternative adiabatic-connection path is also explored, where the electronic interactions are introduced in a manner that emphasizes the range of the electronic interactions, highlighting their purely long-ranged nature, consistent with the success of range-separated hybrid approaches in this context.

  1. Statistical Techniques Applied to Aerial Radiometric Surveys (STAARS): cluster analysis. National Uranium Resource Evaluation

    International Nuclear Information System (INIS)

    Pirkle, F.L.; Stablein, N.K.; Howell, J.A.; Wecksung, G.W.; Duran, B.S.

    1982-11-01

    One objective of the aerial radiometric surveys flown as part of the US Department of Energy's National Uranium Resource Evaluation (NURE) program was to ascertain the regional distribution of near-surface radioelement abundances. Some method for identifying groups of observations with similar radioelement values was therefore required. It is shown in this report that cluster analysis can identify such groups even when no a priori knowledge of the geology of an area exists. A method of convergent k-means cluster analysis coupled with a hierarchical cluster analysis is used to classify 6991 observations (three radiometric variables at each observation location) from the Precambrian rocks of the Copper Mountain, Wyoming, area. Another method, one that combines a principal components analysis with a convergent k-means analysis, is applied to the same data. These two methods are compared with a convergent k-means analysis that utilizes available geologic knowledge. All three methods identify four clusters. Three of the clusters represent background values for the Precambrian rocks of the area, and one represents outliers (anomalously high ²¹⁴Bi). A segmentation of the data corresponding to geologic reality as discovered by other methods has been achieved based solely on analysis of aerial radiometric data. The techniques employed are composites of classical clustering methods designed to handle the special problems presented by large data sets. 20 figures, 7 tables
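    The building block of the report's composite methods is convergent k-means: Lloyd iterations that repeat until the assignments stop changing. A minimal sketch on synthetic three-variable radiometric observations (the deterministic farthest-point seeding and the cluster parameters are invented for the example, not taken from the report):

```python
import numpy as np

def farthest_point_init(X, k):
    # Deterministic seeding: start from the first observation, then
    # repeatedly add the observation farthest from all chosen centers.
    centers = [X[0]]
    for _ in range(1, k):
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[d.argmax()])
    return np.array(centers)

def kmeans(X, k, iters=100):
    # Convergent (Lloyd) k-means: iterate until assignments stabilize.
    centers = farthest_point_init(X, k)
    labels = np.full(len(X), -1)
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        new = d.argmin(axis=1)
        if np.array_equal(new, labels):
            break                         # assignments converged
        labels = new
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Three synthetic radioelement populations (K, U, Th counts per point).
rng = np.random.default_rng(4)
means = np.array([[1.0, 1.0, 1.0], [4.0, 1.0, 2.0], [1.0, 5.0, 3.0]])
X = np.vstack([rng.normal(m, 0.3, (50, 3)) for m in means])
labels = kmeans(X, 3)
```

    In the report this basic scheme is combined with hierarchical clustering or a principal components pre-step precisely because plain k-means is sensitive to seeding on large, messy survey data.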

  2. An evaluation of an operating BWR piping system damping during earthquake by applying auto regressive analysis

    International Nuclear Information System (INIS)

    Kitada, Yoshio; Ichiki, Tadaharu; Makiguchi, Morio; Komori, Akio.

    1986-01-01

    The observation of equipment and piping systems installed in an operating nuclear power plant during earthquakes is very important for evaluating and confirming the adequacy and the safety margin expected at the design stage. By analyzing observed earthquake records, valuable data can be obtained concerning the behavior of these systems in earthquakes, and information can be extracted about their aseismic design parameters. From these viewpoints, an earthquake observation system was installed in a reactor building of an operating plant. Up to now, the records of three earthquakes have been obtained with this system. In this paper, an example of the analysis of earthquake records is shown; the main purpose of the analysis was the evaluation of the vibration modes, natural frequencies and damping factors of a piping system. Prior to the earthquake record analysis, an eigenvalue analysis of this piping system was performed. Auto-regressive analysis was applied to the observed acceleration time history obtained from a piping system installed in an operating BWR. The results of the earthquake record analysis agreed well with the results of the eigenvalue analysis. (Kako, I.)

  3. Is whole body bone mineral density measured by the dual energy X-ray absorptiometry applied to evaluate risk of osteoporosis among Japanese adult females?

    International Nuclear Information System (INIS)

    Sakai, Yumiko; Koike, George; Numata, Makoto; Taneda, Kiyoshi; Jingu, Sumie

    2010-01-01

    To measure whole body fat accurately, dual energy X-ray absorptiometry (DXA) is widely utilized; simultaneously, the bone mineral density (BMD) of the whole body can also be measured. BMD is important information for diagnosing osteoporosis. However, the use of whole body BMD for this diagnosis is not established: the guideline for prevention and treatment of osteoporosis recommends that lumbar and/or hip BMD be used. Although it is possible to measure whole body BMD and lumbar and/or hip BMD separately at the same visit, this inevitably exposes patients to more X-rays. Therefore, the aim of this study is to elucidate the relationship between whole body BMD and lumbar BMD, and to find the cut-off point of whole body BMD for osteoporosis screening. Two hundred and thirty-six Japanese adult females were enrolled in this study. The whole body BMD and lumbar BMD of each subject were measured with a Delphi W densitometer (Hologic, USA). One hundred and sixty-five subjects were judged as having possible osteoporosis (less than 80% of the young adult mean (YAM) of lumbar BMD and/or definite fracture of the lumbar vertebrae). The cut-off point of whole body BMD for screening possible osteoporosis was estimated by receiver operating characteristic (ROC) analysis. The cut-off point of whole body BMD was 84% of YAM, equivalent to 80% of YAM of lumbar BMD, with a sensitivity of 0.84 and a specificity of 0.79, indicating that whole body BMD could be used for osteoporosis screening. (author)
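    A ROC-based cut-off of the kind reported here is often chosen by maximizing the Youden index (sensitivity + specificity − 1) over candidate thresholds. The abstract does not say which criterion the authors used, so the sketch below is a generic illustration on synthetic %YAM data; the group means and spreads are invented, with only the group sizes echoing the abstract.

```python
import numpy as np

def youden_cutoff(values, is_case):
    # Screen by "value <= cutoff is positive" (low BMD flags possible
    # osteoporosis); pick the cutoff maximizing sensitivity+specificity-1.
    best_c, best_j, best_sens, best_spec = None, -1.0, 0.0, 0.0
    for c in np.unique(values):
        pred = values <= c
        sens = np.mean(pred[is_case])        # true-positive rate
        spec = np.mean(~pred[~is_case])      # true-negative rate
        if sens + spec - 1.0 > best_j:
            best_c, best_j = c, sens + spec - 1.0
            best_sens, best_spec = sens, spec
    return best_c, best_sens, best_spec

# Synthetic whole body BMD as %YAM: 165 possible-osteoporosis subjects
# and 71 others (group sizes as in the abstract; distributions invented).
rng = np.random.default_rng(5)
bmd = np.concatenate([rng.normal(78, 6, 165), rng.normal(92, 6, 71)])
case = np.array([True] * 165 + [False] * 71)
cut, sens, spec = youden_cutoff(bmd, case)
```

    Scanning every observed value as a candidate threshold is exactly a walk along the empirical ROC curve; the Youden optimum is the point farthest above the chance diagonal.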

  4. An evaluation of an operating BWR piping system damping during earthquake by applying auto regressive analysis

    International Nuclear Information System (INIS)

    Kitada, Y.; Makiguchi, M.; Komori, A.; Ichiki, T.

    1985-01-01

    The records of three earthquakes which had induced significant response in the piping system were obtained with the earthquake observation system. In the present paper, first, the eigenvalue analysis results for the piping system, based on the piping support (boundary) conditions, are described, and second, the frequency and damping factor evaluation results for each vibrational mode are described. In the present study, the auto-regressive (AR) analysis method is used to evaluate natural frequencies and damping factors. The AR analysis applied here can evaluate natural frequencies and damping factors directly from earthquake records observed on a piping system, without any information on the input motions to the system. (orig./HP)
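    AR-based modal identification of the kind described works by fitting an autoregressive model to the measured time history and converting the roots of its characteristic polynomial into natural frequencies and damping ratios. A minimal sketch on a synthetic two-mode free-decay signal (the model order, sampling rate, and modal values are invented for illustration, not taken from the plant data):

```python
import numpy as np

def ar_modal_params(y, dt, order=4):
    # Fit an AR(order) model by least squares, then convert the roots
    # of the characteristic polynomial to frequencies (Hz) and damping.
    A = np.column_stack([y[order - 1 - i:len(y) - 1 - i] for i in range(order)])
    coef, *_ = np.linalg.lstsq(A, y[order:], rcond=None)
    poles = np.roots(np.concatenate(([1.0], -coef)))
    modes = []
    for p in poles:
        if p.imag > 1e-12:                 # keep one of each conjugate pair
            s = np.log(p) / dt             # map z-plane pole to s-plane
            wn = abs(s)
            modes.append((wn / (2 * np.pi), -s.real / wn))
    return sorted(modes)                   # (frequency, damping ratio)

# Free decay with two modes: 5 Hz at 2% damping, 12 Hz at 5% damping.
dt = 0.005
t = np.arange(0, 4.0, dt)
y = (np.exp(-0.02 * 2 * np.pi * 5 * t) * np.sin(2 * np.pi * 5 * t)
     + 0.5 * np.exp(-0.05 * 2 * np.pi * 12 * t) * np.sin(2 * np.pi * 12 * t))
modes = ar_modal_params(y, dt)
```

    Note that the fit uses only the response record itself, which mirrors the key advantage stated in the abstract: no information about the input motion is required.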

  5. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    Science.gov (United States)

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.

  6. Flexible Multibody Dynamics Finite Element Formulation Applied to Structural Progressive Collapse Analysis

    Directory of Open Access Journals (Sweden)

    Rodolfo André Kuche Sanches

    Full Text Available Abstract This paper presents a two-dimensional frame finite element methodology for flexible multibody dynamic systems and applies it to building progressive collapse analysis. The proposed methodology employs a frame element with Timoshenko kinematics, and the dynamic governing equation is solved based on the stationary potential energy theorem, written in terms of nodal positions and generalized vector components instead of displacements and rotations. The bodies are discretized by loose finite elements, which are assembled by Lagrange multipliers in order to make dynamical detachment possible. Due to the absence of rotation variables, time integration is carried out by the classical Newmark algorithm, which proves to be stable for the position-based formulation. The accuracy of the proposed formulation is verified with simple examples, and its capabilities regarding progressive collapse analysis are demonstrated in a more complete building analysis.

  7. Applied Behavior Analysis, Autism, and Occupational Therapy: A Search for Understanding.

    Science.gov (United States)

    Welch, Christie D; Polatajko, H J

    2016-01-01

    Occupational therapists strive to be mindful, competent practitioners and continuously look for ways to improve practice. Applied behavior analysis (ABA) has strong evidence of effectiveness in helping people with autism achieve goals, yet it does not seem to be implemented in occupational therapy practice. To better understand whether ABA could be an evidence-based option to expand occupational therapy practice, the authors conducted an iterative, multiphase investigation of relevant literature. Findings suggest that occupational therapists apply developmental and sensory approaches to autism treatment. The occupational therapy literature does not reflect any use of ABA despite its strong evidence base. Occupational therapists may currently avoid using ABA principles because of a perception that ABA is not client centered. ABA principles and occupational therapy are compatible, and the two could work synergistically. Copyright © 2016 by the American Occupational Therapy Association, Inc.

  8. Stability analysis of multi-infeed HVDC system applying VSC-HVDC

    DEFF Research Database (Denmark)

    Liu, Yan; Chen, Zhe

    2010-01-01

    This paper presents a general model of a dual-infeed HVDC system applying VSC-HVDC, which can be used as an element in a large multi-infeed HVDC system. The model may have a different structure under different grid faults because of the action of breakers, and hence a different power flow. Simulations were carried out in PSCAD/EMTDC to verify the theoretical analysis. Simulation results indicate that this dual-infeed HVDC system can realize higher stability than a single-infeed HVDC system, and that different control strategies on a VSC-HVDC link may result in different influences on AC voltage and active power oscillation during transients.

  9. Global analysis of quadrupole shape invariants based on covariant energy density functionals

    Science.gov (United States)

    Quan, S.; Chen, Q.; Li, Z. P.; Nikšić, T.; Vretenar, D.

    2017-05-01

    Background: The coexistence of different geometric shapes at low energies is a universal structural phenomenon that occurs over the entire chart of nuclides. Studies of shape coexistence are important for understanding the microscopic origin of collectivity and modifications of shell structure in exotic nuclei far from stability. Purpose: The aim of this work is to provide a systematic analysis of characteristic signatures of coexisting nuclear shapes in different mass regions, using a global self-consistent theoretical method based on universal energy density functionals and the quadrupole collective model. Method: The low-energy excitation spectrum and quadrupole shape invariants of the two lowest 0^+ states of even-even nuclei are obtained as solutions of a five-dimensional collective Hamiltonian (5DCH) model, with parameters determined by constrained self-consistent mean-field calculations based on the relativistic energy density functional PC-PK1 and a finite-range pairing interaction. Results: The theoretical excitation energies of the 2_1^+, 4_1^+, 0_2^+, 2_2^+, and 2_3^+ states, as well as the B(E2; 0_1^+ → 2_1^+) values, are in very good agreement with the corresponding experimental values for 621 even-even nuclei. Quadrupole shape invariants have been employed to investigate shape coexistence, and the distribution of possible shape-coexisting nuclei is consistent with results obtained in recent theoretical studies and available data. Conclusions: The present analysis has shown that, when based on a universal and consistent microscopic framework of nuclear density functionals, shape invariants provide distinct indicators and reliable predictions for the occurrence of low-energy coexisting shapes. This method is particularly useful for studies of shape coexistence in regions far from stability where few data are available.

  10. Energy decomposition analysis based on a block-localized wavefunction and multistate density functional theory.

    Science.gov (United States)

    Mo, Yirong; Bao, Peng; Gao, Jiali

    2011-04-21

    An interaction energy decomposition analysis method based on the block-localized wavefunction (BLW-ED) approach is described. The first main feature of the BLW-ED method is that it combines concepts of valence bond and molecular orbital theories such that the intermediate and physically intuitive electron-localized states are variationally optimized by self-consistent field calculations. Furthermore, the block-localization scheme can be used both in wave function theory and in density functional theory, providing a useful tool to gain insights on intermolecular interactions that would otherwise be difficult to obtain using the delocalized Kohn-Sham DFT. These features allow broad applications of the BLW method to energy decomposition (BLW-ED) analysis for intermolecular interactions. In this perspective, we outline theoretical aspects of the BLW-ED method, and illustrate its applications in hydrogen-bonding and π-cation intermolecular interactions as well as metal-carbonyl complexes. Future prospects on the development of a multistate density functional theory (MSDFT) are presented, making use of block-localized electronic states as the basis configurations.

  11. Comparison of Bone Mineral Density between Urban and Rural Areas: Systematic Review and Meta-Analysis.

    Directory of Open Access Journals (Sweden)

    Mika Matsuzaki

    Full Text Available Studies from high income countries (HIC) have generally shown higher osteoporotic fracture rates in urban areas than rural areas. Low bone mineral density (BMD) increases susceptibility to fractures. This review aimed to assess whether urbanicity is consistently associated with lower BMD globally. Ovid MEDLINE, EMBASE, and Global Health (-April 2013) were searched for articles investigating differences in bone mineral content (BMC) or BMD between urban and rural areas. Ratio of means (RoM) of BMD was used to estimate effect sizes in meta-analysis, with an exception for one study that only presented BMC data. Fifteen articles from eleven distinct populations were included in the review; seven populations from four high income countries and four from three low and middle income countries (LMIC). Meta-analysis showed conflicting evidence for an urban-rural difference in BMD; studies from high income countries generally showed higher BMD in rural areas, while the results were more mixed in studies from low and middle income countries (HIC RoM = 0.05; 95% CI: 0.03 to 0.06; LMIC RoM = -0.04; 95% CI: -0.1 to 0.01). Urban-rural differences in bone mineral density may be context-specific. BMD may be higher in urban areas in some lower income countries. More studies with robust designs and analytical techniques are needed to understand the mechanisms underlying the effects of urbanization on bone mass accrual and loss.
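    The ratio-of-means effect size used in the record above is conventionally pooled on the log scale with inverse-variance weights. A minimal illustrative sketch, using hypothetical group summaries rather than the review's actual data:

```python
import math

def log_rom(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Log ratio of means and its standard error (delta-method approximation)."""
    lr = math.log(mean_a / mean_b)
    se = math.sqrt(sd_a**2 / (n_a * mean_a**2) + sd_b**2 / (n_b * mean_b**2))
    return lr, se

# Hypothetical urban vs. rural BMD summaries: (mean g/cm^2, sd, n) per group.
studies = [((0.95, 0.12, 150), (1.00, 0.11, 140)),
           ((0.90, 0.10, 200), (0.93, 0.13, 180))]
pairs = [log_rom(*urban, *rural) for urban, rural in studies]
weights = [1.0 / se**2 for _, se in pairs]
pooled_log = sum(w * lr for w, (lr, _) in zip(weights, pairs)) / sum(weights)
rom = math.exp(pooled_log)  # pooled ratio of means; < 1 means lower urban BMD
```

    Exponentiating the pooled log-ratio recovers the RoM scale reported in the abstract.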

  12. Energy decomposition analysis based on a block-localized wavefunction and multistate density functional theory

    Science.gov (United States)

    Bao, Peng

    2013-01-01

    An interaction energy decomposition analysis method based on the block-localized wavefunction (BLW-ED) approach is described. The first main feature of the BLW-ED method is that it combines concepts of valence bond and molecular orbital theories such that the intermediate and physically intuitive electron-localized states are variationally optimized by self-consistent field calculations. Furthermore, the block-localization scheme can be used both in wave function theory and in density functional theory, providing a useful tool to gain insights on intermolecular interactions that would otherwise be difficult to obtain using the delocalized Kohn–Sham DFT. These features allow broad applications of the BLW method to energy decomposition (BLW-ED) analysis for intermolecular interactions. In this perspective, we outline theoretical aspects of the BLW-ED method, and illustrate its applications in hydrogen-bonding and π–cation intermolecular interactions as well as metal–carbonyl complexes. Future prospects on the development of a multistate density functional theory (MSDFT) are presented, making use of block-localized electronic states as the basis configurations. PMID:21369567

  13. Dispersion-based short-time Fourier transform applied to dispersive wave analysis

    Science.gov (United States)

    Hong, Jin-Chul; Sun, Kyung Ho; Kim, Yoon Young

    2005-05-01

    Although time-frequency analysis is effective for characterizing dispersive wave signals, the time-frequency tilings of most conventional analysis methods do not take dispersion phenomena into account. An adaptive time-frequency analysis method is introduced whose time-frequency tiling is determined with respect to the wave dispersion characteristics. In the dispersion-based time-frequency tiling, each time-frequency atom is adaptively rotated in the time-frequency plane, depending on the local wave dispersion. Although this idea can be useful in various problems, it had not previously been applied to the analysis of dispersive wave signals. In this work, the adaptive time-frequency method was applied to the analysis of dispersive elastic waves measured in waveguide experiments, and a theoretical investigation of its time-frequency resolution was presented. The time-frequency resolution of the proposed transform was then compared with that of the standard short-time Fourier transform to show its effectiveness in dealing with dispersive wave signals. In addition, to facilitate the adaptive time-frequency analysis of experimentally measured signals whose dispersion relations are not known, an iterative scheme for determining these relations was developed. The validity of the present approach in dealing with dispersive waves was verified experimentally.

  14. Applied problems of the identification and regression analysis in stochastic structures with multimodal properties

    Directory of Open Access Journals (Sweden)

    Kulikov Vladimir

    2016-01-01

    Full Text Available We have been developing an approach based on the identification of multimodal distribution laws of complex structures in medicine, biology, the chemistry of ultrapure materials, and membrane technology, as well as in technical applications. The method is based on the formulation and solution of inverse problems of mathematical physics for the respective probability density functions. The algorithmic tools used are verified on model limited-scope samples. For the stochastic structures and systems under study, the method is supplemented with an original regression analysis option that takes the identified stochastic laws into account by mapping numerical parameters into the binary space. The proposed approach has been tested on clinical material in practical medicine.

  15. IAEA-ASSET's root cause analysis method applied to sodium leakage incident at Monju

    International Nuclear Information System (INIS)

    Watanabe, Norio; Hirano, Masashi

    1997-08-01

    The present study applied the ASSET (Analysis and Screening of Safety Events Team) methodology (this method identifies occurrences such as component failures and operator errors, identifies their respective direct/root causes, and determines corrective actions) to the analysis of the sodium leakage incident at Monju, based on reports published mainly by the Science and Technology Agency, aiming at systematic identification of direct/root causes and corrective actions, and discussed the effectiveness and problems of the ASSET methodology. The results revealed the following seven occurrences and showed the direct/root causes and contributing factors for each: failure of the thermometer well tube, delayed reactor manual trip, inadequate continuous monitoring of leakage, misjudgment of leak rate, non-required operator action (turbine trip), retarded emergency sodium drainage, and retarded securing of the ventilation system. Most of the occurrences stemmed from deficiencies in emergency operating procedures (EOPs), which were mainly caused by defects in the EOP preparation process and operator training programs. The corrective actions already proposed in the published reports were reviewed, identifying issues to be studied further, and possible corrective actions were discussed for these issues. The present study also demonstrated the effectiveness of the ASSET methodology and pointed out some of its problems, for example in delineating causal relations among occurrences, in applying it to detailed and systematic analysis of event direct/root causes, and in determining concrete measures. (J.P.N.)

  16. ELUCID—Exploring the Local Universe with the reConstructed Initial Density Field. II. Reconstruction Diagnostics, Applied to Numerical Halo Catalogs

    Energy Technology Data Exchange (ETDEWEB)

    Tweed, Dylan; Yang, Xiaohu; Li, Shijie; Jing, Y. P. [Center for Astronomy and Astrophysics, Shanghai Jiao Tong University, Shanghai 200240 (China); Wang, Huiyuan [Key Laboratory for Research in Galaxies and Cosmology, Department of Astronomy, University of Science and Technology of China, Hefei, Anhui 230026 (China); Cui, Weiguang [Departamento de Física Teórica, Módulo 15, Facultad de Ciencias, Universidad Autónoma de Madrid, E-28049 Madrid (Spain); Zhang, Youcai [Shanghai Astronomical Observatory, Nandan Road 80, Shanghai 200030 (China); Mo, H. J., E-mail: dtweed@sjtu.edu.cn [Department of Astronomy, University of Massachusetts, Amherst MA, 01003-9305 (United States)

    2017-05-20

    The ELUCID project aims to build a series of realistic cosmological simulations that reproduce the spatial and mass distributions of the galaxies observed in the Sloan Digital Sky Survey. This requires powerful reconstruction techniques to create constrained initial conditions (ICs). We test the reconstruction method by applying it to several N-body simulations. We use two medium-resolution simulations, each of which produced three additional constrained N-body simulations. We compare the resulting friends-of-friends catalogs by using the particle indexes as tracers, and quantify the quality of the reconstruction by varying the main smoothing parameter. The cross-identification method we use proves to be efficient, and the results suggest that the most massive reconstructed halos are effectively traced from the same Lagrangian regions in the ICs. A preliminary time-dependence analysis indicates that high-mass-end halos converge only at a redshift close to the reconstruction redshift. This suggests that, for earlier snapshots, only collections of progenitors may be effectively cross-identified.

  17. PARAFOVEAL CAPILLARY DENSITY AFTER PLAQUE RADIOTHERAPY FOR CHOROIDAL MELANOMA: Analysis of Eyes Without Radiation Maculopathy.

    Science.gov (United States)

    Say, Emil Anthony T; Samara, Wasim A; Khoo, Chloe T L; Magrath, George N; Sharma, Priya; Ferenczy, Sandor; Shields, Carol L

    2016-09-01

    .0-0.1) (Snellen equivalent 20/20) in the fellow eye (P = 0.0252). Optical coherence tomography angiography allows qualitative and quantitative analysis of parafoveal capillary density. After plaque radiotherapy for choroidal melanoma, in eyes with normal macular features on ophthalmoscopy and optical coherence tomography, there is a statistically significant decrease in parafoveal capillary density and logMAR visual acuity in irradiated eyes compared with fellow eyes. These subclinical ischemic findings represent the commencement of radiation maculopathy.

  18. Applying HAZOP analysis in assessing remote handling compatibility of ITER port plugs

    International Nuclear Information System (INIS)

    Duisings, L.P.M.; Til, S. van; Magielsen, A.J.; Ronden, D.M.S.; Elzendoorn, B.S.Q.; Heemskerk, C.J.M.

    2013-01-01

    Highlights: ► We applied HAZOP analysis to assess the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. ► We identified several weak points in the general upper port plug maintenance concept. ► We made clear recommendations on redesign of the port plug design, operational sequence, and Hot Cell equipment. ► The HAZOP approach used for the ECH UL port can also be applied to ITER port plugs in general. -- Abstract: This paper describes the application of a Hazard and Operability Analysis (HAZOP) methodology in assessing the criticality of remote handling maintenance activities on port plugs in the ITER Hot Cell facility. As part of the ECHUL consortium, the remote handling team at the DIFFER Institute is developing maintenance tools and procedures for critical components of the ECH Upper Launcher (UL). Based on NRG's experience with nuclear risk analysis and Hot Cell procedures, early versions of these tool concepts and maintenance procedures were subjected to a HAZOP analysis. The analysis identified several weak points in the general upper port plug maintenance concept and led to clear recommendations on redesigns of the port plug design, the operational sequence, and ITER Hot Cell equipment. The paper describes the HAZOP methodology and illustrates its application with specific procedures: the Steering Mirror Assembly (SMA) replacement and the exchange of the Mid Shield Optics (MSO) in the ECH UPL. A selection of recommended changes to the launcher design associated with the accessibility, maintainability, and manageability of replaceable components is presented.

  19. A risk analysis approach applied to field surveillance in utility meters in legal metrology

    Science.gov (United States)

    Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.

    2018-03-01

    Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies due to self-verification in Brazil. They play a major role in the economy, since electricity, gas, and water are the main inputs to industries' production processes. Therefore, to optimize the resources allocated to controlling these devices, the present study applied a risk analysis to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.

  20. Applied behavior analysis treatment of autism: the state of the art.

    Science.gov (United States)

    Foxx, Richard M

    2008-10-01

    The treatment of individuals with autism is associated with fad, controversial, unsupported, disproven, and unvalidated treatments. Eclecticism is not the best approach for treating and educating children and adolescents who have autism. Applied behavior analysis (ABA) uses methods derived from scientifically established principles of behavior and incorporates all of the factors identified by the US National Research Council as characteristic of effective interventions in educational and treatment programs for children who have autism. ABA is a primary method of treating aberrant behavior in individuals who have autism. The only interventions that have been shown to produce comprehensive, lasting results in autism have been based on the principles of ABA.

  1. Time-of-arrival analysis applied to ELF/VLF wave generation experiments at HAARP

    Science.gov (United States)

    Moore, R. C.; Fujimaru, S.

    2012-12-01

    Time-of-arrival (TOA) analysis is applied to observations performed during ELF/VLF wave generation experiments at the High-frequency Active Auroral Research Program (HAARP) HF transmitter in Gakona, Alaska. In 2012, a variety of ELF/VLF wave generation techniques were employed to identify the dominant source altitude for each case. Observations were performed for beat-wave modulation, AM modulation, STF modulation, ICD modulation, and cubic frequency modulation, among others. For each of these cases, we identify the dominant ELF/VLF source altitude and compare the experimental results with theoretical HF heating predictions.
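    At its core, time-of-arrival analysis measures the delay between a reference modulation waveform and the received signal, commonly via the peak of their cross-correlation. A generic, noise-free sketch (not the HAARP processing chain; the waveform and sample rate are invented for illustration):

```python
import numpy as np

def estimate_delay(tx, rx, fs):
    """Estimate the arrival delay of rx relative to tx, in seconds,
    from the peak of the full cross-correlation."""
    corr = np.correlate(rx, tx, mode="full")
    lag = int(np.argmax(corr)) - (len(tx) - 1)  # lag in samples
    return lag / fs

fs = 2000.0                                   # sample rate (Hz), illustrative
t = np.arange(0.0, 1.0, 1.0 / fs)
# Reference signal: a 75 Hz tone under a Gaussian envelope centered at 0.2 s.
tx = np.sin(2 * np.pi * 75.0 * t) * np.exp(-((t - 0.2) ** 2) / 0.002)
delay_samples = 37                            # true delay: 37 / 2000 = 18.5 ms
rx = np.concatenate((np.zeros(delay_samples), tx))[: len(tx)]
print(estimate_delay(tx, rx, fs))             # 0.0185
```

    In practice the received signal is noisy and band-limited, so the correlation peak is an estimate rather than an exact recovery, but the lag-of-peak principle is the same.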

  2. Bone mineral density is decreased in fibromyalgia syndrome: a systematic review and meta-analysis.

    Science.gov (United States)

    Upala, Sikarin; Yong, Wai Chung; Sanguankeo, Anawin

    2017-04-01

    Previous studies have shown that fibromyalgia syndrome (FMS) is associated with a low level of physical activity and exercise, which may lead to an increased risk of osteoporosis. However, studies of bone mineral density (BMD) in fibromyalgia have shown conflicting results. Thus, we conducted a systematic review and meta-analysis to better characterize the association between FMS and BMD. A comprehensive search of the MEDLINE and EMBASE databases was performed from inception through May 2016. The inclusion criterion was observational studies assessing the association between fibromyalgia and bone mineral density in adult subjects. Fibromyalgia was diagnosed in accordance with the American College of Rheumatology criteria for the diagnosis of fibromyalgia syndrome. BMD was measured at the lumbar spine and femoral neck by dual-energy X-ray absorptiometry. The pooled mean difference (MD) of BMD at each site and its 95% confidence interval (CI) were calculated using a random-effects, generic inverse variance method. The between-study heterogeneity of effect size was quantified using the Q statistic and I². Data were extracted from four observational studies involving 680 subjects. At the lumbar spine (L2-L4), BMD is significantly decreased in patients with FMS compared with controls, with a pooled MD of -0.02 (95% CI -0.03 to -0.01, P value = 0.003, I² = 0%) (Fig. 1). At the femoral neck, BMD is not significantly decreased in patients with FMS compared with controls, with a pooled MD of 0.01 (95% CI -0.02 to 0.01, P value = 0.23, I² = 0%) (Fig. 2). In this meta-analysis, we observe that BMD at the lumbar spine is decreased in FMS compared with normal individuals. Patients with FMS should be assessed for risk of osteoporosis. Fig. 1 Forest plot of bone mineral density at the lumbar spine, for patients with and without fibromyalgia syndrome. CI, confidence interval. Fig. 2 Forest plot of bone mineral density at the femoral neck, for patients with and without fibromyalgia
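    The generic inverse-variance pooling used in this record can be sketched in a few lines. The example below uses fixed-effect weights, which coincide with the random-effects estimate when I² = 0, as reported above; the per-study numbers are hypothetical, not the meta-analysis data:

```python
import math

def inverse_variance_pool(effects, ses):
    """Inverse-variance pooling of effect sizes.
    Returns (pooled effect, pooled SE, Cochran's Q, I^2 in percent)."""
    weights = [1.0 / se**2 for se in ses]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    # Cochran's Q and I^2 quantify between-study heterogeneity.
    q = sum(w * (e - pooled) ** 2 for w, e in zip(weights, effects))
    df = len(effects) - 1
    i2 = max(0.0, (q - df) / q) * 100.0 if q > 0 else 0.0
    return pooled, pooled_se, q, i2

# Hypothetical lumbar-spine BMD mean differences (g/cm^2) and standard errors.
md, se, q, i2 = inverse_variance_pool([-0.02, -0.03, -0.01, -0.02],
                                      [0.010, 0.015, 0.012, 0.009])
ci = (md - 1.96 * se, md + 1.96 * se)   # 95% confidence interval
```

    With these illustrative inputs, Q is below its degrees of freedom, so I² is truncated to 0%, mirroring the homogeneity reported in the abstract.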

  3. Mammographic density and ageing: A collaborative pooled analysis of cross-sectional data from 22 countries worldwide.

    Science.gov (United States)

    Burton, Anya; Maskarinec, Gertraud; Perez-Gomez, Beatriz; Vachon, Celine; Miao, Hui; Lajous, Martín; López-Ridaura, Ruy; Rice, Megan; Pereira, Ana; Garmendia, Maria Luisa; Tamimi, Rulla M; Bertrand, Kimberly; Kwong, Ava; Ursin, Giske; Lee, Eunjung; Qureshi, Samera A; Ma, Huiyan; Vinnicombe, Sarah; Moss, Sue; Allen, Steve; Ndumia, Rose; Vinayak, Sudhir; Teo, Soo-Hwang; Mariapun, Shivaani; Fadzli, Farhana; Peplonska, Beata; Bukowska, Agnieszka; Nagata, Chisato; Stone, Jennifer; Hopper, John; Giles, Graham; Ozmen, Vahit; Aribal, Mustafa Erkin; Schüz, Joachim; Van Gils, Carla H; Wanders, Johanna O P; Sirous, Reza; Sirous, Mehri; Hipwell, John; Kim, Jisun; Lee, Jong Won; Dickens, Caroline; Hartman, Mikael; Chia, Kee-Seng; Scott, Christopher; Chiarelli, Anna M; Linton, Linda; Pollan, Marina; Flugelman, Anath Arzee; Salem, Dorria; Kamal, Rasha; Boyd, Norman; Dos-Santos-Silva, Isabel; McCormack, Valerie

    2017-06-01

    Mammographic density (MD) is one of the strongest breast cancer risk factors. Its age-related characteristics have been studied in women in western countries, but whether these associations apply to women worldwide is not known. We examined cross-sectional differences in MD by age and menopausal status in over 11,000 breast-cancer-free women aged 35-85 years, from 40 ethnicity- and location-specific population groups across 22 countries in the International Consortium on Mammographic Density (ICMD). MD was read centrally using a quantitative method (Cumulus) and its square-root metrics were analysed using meta-analysis of group-level estimates and linear regression models of pooled data, adjusted for body mass index, reproductive factors, mammogram view, image type, and reader. In all, 4,534 women were premenopausal, and 6,481 postmenopausal, at the time of mammography. A large age-adjusted difference in percent MD (PD) between post- and premenopausal women was apparent (-0.46 cm [95% CI: -0.53, -0.39]) and appeared greater in women with lower breast cancer risk profiles; variation across population groups due to heterogeneity (I2) was 16.5%. Among premenopausal women, the √PD difference per 10-year increase in age was -0.24 cm (95% CI: -0.34, -0.14; I2 = 30%), reflecting a compositional change (lower dense area and higher non-dense area, with no difference in breast area). In postmenopausal women, the corresponding difference in √PD (-0.38 cm [95% CI: -0.44, -0.33]; I2 = 30%) was additionally driven by increasing breast area. The study is limited by different mammography systems and its cross-sectional rather than longitudinal nature. Declines in MD with increasing age are present premenopausally, continue postmenopausally, and are most pronounced over the menopausal transition. These effects were highly consistent across diverse groups of women worldwide, suggesting that they result from an intrinsic biological, likely hormonal, mechanism common to women. If

  4. Mammographic density and ageing: A collaborative pooled analysis of cross-sectional data from 22 countries worldwide.

    Directory of Open Access Journals (Sweden)

    Anya Burton

    2017-06-01

    Full Text Available Mammographic density (MD) is one of the strongest breast cancer risk factors. Its age-related characteristics have been studied in women in western countries, but whether these associations apply to women worldwide is not known. We examined cross-sectional differences in MD by age and menopausal status in over 11,000 breast-cancer-free women aged 35-85 years, from 40 ethnicity- and location-specific population groups across 22 countries in the International Consortium on Mammographic Density (ICMD). MD was read centrally using a quantitative method (Cumulus) and its square-root metrics were analysed using meta-analysis of group-level estimates and linear regression models of pooled data, adjusted for body mass index, reproductive factors, mammogram view, image type, and reader. In all, 4,534 women were premenopausal, and 6,481 postmenopausal, at the time of mammography. A large age-adjusted difference in percent MD (PD) between post- and premenopausal women was apparent (-0.46 cm [95% CI: -0.53, -0.39]) and appeared greater in women with lower breast cancer risk profiles; variation across population groups due to heterogeneity (I2) was 16.5%. Among premenopausal women, the √PD difference per 10-year increase in age was -0.24 cm (95% CI: -0.34, -0.14; I2 = 30%), reflecting a compositional change (lower dense area and higher non-dense area, with no difference in breast area). In postmenopausal women, the corresponding difference in √PD (-0.38 cm [95% CI: -0.44, -0.33]; I2 = 30%) was additionally driven by increasing breast area. The study is limited by different mammography systems and its cross-sectional rather than longitudinal nature. Declines in MD with increasing age are present premenopausally, continue postmenopausally, and are most pronounced over the menopausal transition. These effects were highly consistent across diverse groups of women worldwide, suggesting that they result from an intrinsic biological, likely hormonal, mechanism common to

  5. Preliminary analysis of MHD-Brayton cycle applied to fusion reactors (CFAR)

    Energy Technology Data Exchange (ETDEWEB)

    Ishikawa, M. [Kyoto Univ. (Japan). Dept. of Electrical Engineering; Inui, Y. [Kyoto Univ. (Japan). Dept. of Electrical Engineering; Umoto, J. [Kyoto Univ. (Japan). Dept. of Electrical Engineering; Yoshikawa, K. [Institute of Atomic Energy, Kyoto University, Gokasho, Uji, Kyoto 611 (Japan)

    1995-03-01

    High performance non-equilibrium magnetohydrodynamic (MHD) disk generators applied to fusion reactors are designed, and simplified cycle analyses are carried out, which show the following. (1) Disk-type MHD generators of high performance can be designed, resulting in an enthalpy extraction ratio of 50%-57%. The maximum value of magnetic flux density ranges from 5.4 to 7.9 T, depending on the maximum temperature of the MHD working gas. (2) Two MHD-Brayton systems are proposed: (a) a simple MHD system and (b) an MHD-gas turbine combined system. The cycle efficiency of the first system ranges from 39.6% to 63.6%, while the second system yields 54.0%-67.8%. The efficiency depends strongly on the maximum temperature of the MHD working gas and on the pressure recovery ratio of the diffuser. (3) A concept of blanket design is briefly described. A detailed study of the overall fusion reactor, including neutronics calculation of the blanket, is required as future work. (orig.)

  6. System Analysis Applied to Autonomy: Application to Human-Rated Lunar/Mars Landers

    Science.gov (United States)

    Young, Larry A.

    2006-01-01

    System analysis is an essential technical discipline for the modern design of spacecraft and their associated missions. Specifically, system analysis is a powerful aid in identifying and prioritizing the required technologies needed for mission and/or vehicle development efforts. Maturation of intelligent systems technologies, and their incorporation into spacecraft systems, are dictating the development of new analysis tools, and incorporation of such tools into existing system analysis methodologies, in order to fully capture the trade-offs of autonomy on vehicle and mission success. A "system analysis of autonomy" methodology will be outlined and applied to a set of notional human-rated lunar/Mars lander missions toward answering these questions: 1. what is the optimum level of vehicle autonomy and intelligence required? and 2. what are the specific attributes of an autonomous system implementation essential for a given surface lander mission/application in order to maximize mission success? Future human-rated lunar/Mars landers, though nominally under the control of their crew, will, nonetheless, be highly automated systems. These automated systems will range from mission/flight control functions, to vehicle health monitoring and prognostication, to life-support and other "housekeeping" functions. The optimum degree of autonomy afforded to these spacecraft systems/functions has profound implications from an exploration system architecture standpoint.

  7. The self-consistent charge density functional tight binding method applied to liquid water and the hydrated excess proton: benchmark simulations.

    Science.gov (United States)

    Maupin, C Mark; Aradi, Bálint; Voth, Gregory A

    2010-05-27

    The self-consistent charge density functional tight binding (SCC-DFTB) method is a relatively new approximate electronic structure method that is increasingly used to study biologically relevant systems in aqueous environments. Several gas-phase cluster calculations indicate, in some instances, an ability to predict geometries, energies, and vibrational frequencies in reasonable agreement with high-level ab initio calculations. However, to date, there has been little validation of the method for bulk water properties, and no validation for the properties of the hydrated excess proton in water. Presented here is a detailed SCC-DFTB analysis of the latter two systems. This work focuses on the ability of the original SCC-DFTB method, and a modified version that includes a hydrogen bonding damping function (HBD-SCC-DFTB), to describe the structural, energetic, and dynamical nature of these aqueous systems. The SCC-DFTB and HBD-SCC-DFTB results are compared to experimental data and Car-Parrinello molecular dynamics (CPMD) simulations using the HCTH/120 gradient-corrected exchange-correlation energy functional. All simulations for these systems contained 128 water molecules, plus one additional proton in the case of the excess proton system, and were carried out in a periodic simulation box with Ewald long-range electrostatics. The original SCC-DFTB is shown to reproduce the liquid water structure poorly, while HBD-SCC-DFTB represents bulk water somewhat more closely due to an improved ability to describe hydrogen bonding energies. Both SCC-DFTB methods are found to underestimate the water dimer interaction energy, resulting in a low heat of vaporization and a significantly elevated water oxygen diffusion coefficient compared to experiment. The addition of an excess hydrated proton to the bulk water resulted in the Zundel cation (H5O2+) being the stable form of the charge defect, which

  8. Graphical Analysis of PET Data Applied to Reversible and Irreversible Tracers

    Energy Technology Data Exchange (ETDEWEB)

    Logan, Jean

    1999-11-18

    Graphical analysis refers to the transformation of multiple time measurements of plasma and tissue uptake data into a linear plot, the slope of which is related to the number of available tracer binding sites. This type of analysis allows easy comparisons among experiments. No particular model structure is assumed, however it is assumed that the tracer is given by bolus injection and that both tissue uptake and the plasma concentration of unchanged tracer are monitored following tracer injection. The requirement of plasma measurements can be eliminated in some cases when a reference region is available. There are two categories of graphical methods which apply to two general types of ligands--those which bind reversibly during the scanning procedure and those which are irreversible or trapped during the time of the scanning procedure.
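    For a reversibly binding tracer, the graphical (Logan-type) transformation plots the running integral of tissue uptake divided by instantaneous tissue activity against the running integral of the plasma input divided by tissue activity; once the plot becomes linear, its slope estimates the total distribution volume. A minimal sketch on synthetic one-tissue-compartment data (the curve shapes and rate constants are invented for illustration):

```python
import numpy as np

def logan_slope(t, ct, cp, t_star):
    """Slope of the graphical (Logan) plot after time t_star.
    t: frame times; ct: tissue activity; cp: plasma activity of
    unchanged tracer. The slope estimates the distribution volume."""
    # Running integrals by the trapezoidal rule.
    int_ct = np.concatenate(([0.0], np.cumsum(np.diff(t) * (ct[1:] + ct[:-1]) / 2)))
    int_cp = np.concatenate(([0.0], np.cumsum(np.diff(t) * (cp[1:] + cp[:-1]) / 2)))
    mask = (t >= t_star) & (ct > 0)
    x = int_cp[mask] / ct[mask]
    y = int_ct[mask] / ct[mask]
    slope, _intercept = np.polyfit(x, y, 1)
    return slope

# One-tissue model dCt/dt = K1*Cp - k2*Ct with Cp = exp(-k2*t),
# which has the closed form Ct = K1 * t * exp(-k2*t); V_T = K1/k2 = 2.
t = np.linspace(0.0, 90.0, 400)     # minutes
k1, k2 = 0.2, 0.1
cp = np.exp(-k2 * t)
ct = k1 * t * np.exp(-k2 * t)
vt = logan_slope(t, ct, cp, t_star=20.0)   # ≈ 2.0
```

    For the one-tissue model the plot is exactly linear, so the recovered slope matches K1/k2 up to numerical integration error.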

  9. Escalation research: providing new frontiers for applying behavior analysis to organizational behavior.

    Science.gov (United States)

    Goltz, S M

    2000-01-01

    Decision fiascoes such as escalation of commitment, the tendency of decision makers to "throw good money after bad," can have serious consequences for organizations and are therefore of great interest in applied research. This paper discusses the use of behavior analysis in organizational behavior research on escalation. Among the most significant aspects of behavior-analytic research on escalation is that it has indicated that both the patterns of outcomes that decision makers have experienced for past decisions and the patterns of responses that they make are critical for understanding escalation. This research has also stimulated the refinement of methods by researchers to better assess decision making and the role reinforcement plays in it. Finally, behavior-analytic escalation research has not only indicated the utility of reinforcement principles for predicting more complex human behavior but has also suggested some additional areas for future exploration of decision making using behavior analysis.

  10. Selection of Forklift Unit for Warehouse Operation by Applying Multi-Criteria Analysis

    Directory of Open Access Journals (Sweden)

    Predrag Atanasković

    2013-07-01

    Full Text Available This paper presents research related to the choice of the criteria that can be used to perform an optimal selection of the forklift unit for warehouse operation. The analysis has been done with the aim of exploring the requirements and defining relevant criteria that are important when investment decision is made for forklift procurement, and based on the conducted research by applying multi-criteria analysis, to determine the appropriate parameters and their relative weights that form the input data and database for selection of the optimal handling unit. This paper presents an example of choosing the optimal forklift based on the selected criteria for the purpose of making the relevant investment decision.

  11. An Appraisal of Social Network Theory and Analysis as Applied to Public Health: Challenges and Opportunities.

    Science.gov (United States)

    Valente, Thomas W; Pitts, Stephanie R

    2017-03-20

    The use of social network theory and analysis methods as applied to public health has expanded greatly in the past decade, yielding a significant academic literature that spans almost every conceivable health issue. This review identifies several important theoretical challenges that confront the field but also provides opportunities for new research. These challenges include (a) measuring network influences, (b) identifying appropriate influence mechanisms, (c) the impact of social media and computerized communications, (d) the role of networks in evaluating public health interventions, and (e) ethics. Next steps for the field are outlined and the need for funding is emphasized. Recently developed network analysis techniques, technological innovations in communication, and changes in theoretical perspectives to include a focus on social and environmental behavioral influences have created opportunities for new theory and ever broader application of social networks to public health topics.

  12. Feasibility evaluation of two solar cooling systems applied to a cuban hotel. Comparative analysis

    International Nuclear Information System (INIS)

    Díaz Torres, Yamile; Valdivia Nodal, Yarelis; Monteagudo Yanes, José Pedro; Miranda Torres, Yudit

    2016-01-01

    The article presents a technical and economic feasibility analysis of two solar cooling configurations for a Cuban hotel. Both hybrid HVAC schemes interconnect a vapor-compression water chiller in parallel with a smaller-capacity chiller: first with a solar absorption cooling system (SACS), and then with a photovoltaic solar cooling system (PSC). Both were simulated taking into account the weather conditions in the region, thermodynamic calculation methodologies, and the principles that govern these technologies. The results show that the use of these alternatives contributes to reducing the energy consumption and environmental impact of heating, ventilation and air conditioning (HVAC) systems. The economic analysis highlights that the PSC is more favorable than the SACS in terms of cooling cost generation (CCG), but the energy assessment indicates that the SACS has higher thermal performance for the case study to which it is applied. (author)

  13. Analysis of the concept of nursing educational technology applied to the patient

    Directory of Open Access Journals (Sweden)

    Aline Cruz Esmeraldo Áfio

    2014-04-01

    Full Text Available This study analyzes the concept of educational technology, as produced by nursing and applied to the patient. Rodgers' evolutionary method of concept analysis was used, identifying antecedents, attributes, and consequences. Thirteen articles were selected for analysis, in which the following antecedents were identified: knowledge deficiency, shortage of nursing professionals' time, the need to optimize nursing work, and the need to achieve the patients' goals. Attributes: tool, strategy, innovative approach, pedagogical approach, mediator of knowledge, creative way to encourage the acquisition of skills, health production instrument. Consequences: improved quality of life, encouragement of healthy behavior, empowerment, reflection, and bonding. The study emphasizes the importance of educational technologies for nursing care and for boosting health education activities.

  14. [Analysis of the articles published in Chinese Journal of Applied Physiology between 2000 and 2006].

    Science.gov (United States)

    Wang, Wei-Qiu; Pang, Xiang-Juan

    2008-08-01

    To analyze the characteristics of articles published in the Chinese Journal of Applied Physiology between 2000 and 2006, evaluate the academic level and popularity of the journal, and provide evidence for journal reform. Using CNKI and manual searches together with bibliometric methods, a comprehensive analysis of the journal's publications from 2000 to 2006 was made. The journal published 968 articles in this period, averaging 34.57 articles per issue and 3.11 pages per article; among the columns, original articles ranked first (66.22% of the total). Citations increased year by year (100% of articles carried citations in 2004-2006), with English-language references accounting for 76.52% on average. Publication time lag ranged from 60 to 510 days, averaging 196.51 days. Both the level and the number of fund-supported articles increased, reaching 97 articles in 2005. The professional positions and academic degrees of the authors rose steadily, and Beijing contributed the most authors (162, or 16.74% of the total). The Chinese Journal of Applied Physiology has published high-quality articles and is one of the most important information resources for physiological research.

  15. DATE analysis: A general theory of biological change applied to microarray data.

    Science.gov (United States)

    Rasnick, David

    2009-01-01

    In contrast to conventional data mining, which searches for specific subsets of genes (extensive variables) to correlate with specific phenotypes, DATE analysis correlates intensive state variables calculated from the same datasets. At the heart of DATE analysis are two biological equations of state not dependent on genetic pathways. This result distinguishes DATE analysis from other bioinformatics approaches. The dimensionless state variable F quantifies the relative overall cellular activity of test cells compared to well-chosen reference cells. The variable pi(i) is the fold-change in the expression of the ith gene of test cells relative to reference. It is the fraction phi of the genome undergoing differential expression, not the magnitude pi, that controls biological change. The state variable phi is equivalent to the control strength of metabolic control analysis. For tractability, DATE analysis assumes a linear system of enzyme-connected networks and exploits the small average contribution of each cellular component. This approach was validated by reproducible values of the state variables F, RNA index, and phi calculated from random subsets of transcript microarray data. Using published microarray data, F, RNA index, and phi were correlated with: (1) the blood-feeding cycle of the malaria parasite, (2) embryonic development of the fruit fly, (3) temperature adaptation of killifish, (4) exponential growth of cultured S. pneumoniae, and (5) human cancers. DATE analysis was also applied to aCGH data from the great apes. A good example of the power of DATE analysis is its application to genomically unstable cancers, which have been refractory to data mining strategies.
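
    The fraction phi can be illustrated on a pair of expression vectors; the plain-ratio fold-change and the 2-fold cutoff below are assumptions for illustration, not the paper's calibration:

    ```python
    def fold_changes(test, ref):
        """pi_i: expression of the ith gene in test cells relative to reference."""
        return [t / r for t, r in zip(test, ref)]

    def phi(test, ref, cutoff=2.0):
        """Fraction of the genome undergoing differential expression: the share
        of genes whose fold-change exceeds `cutoff` in either direction
        (the cutoff value is an illustrative assumption)."""
        pis = fold_changes(test, ref)
        changed = sum(1 for p in pis if p >= cutoff or p <= 1.0 / cutoff)
        return changed / len(pis)
    ```

    With three of ten genes changed at least 2-fold, phi = 0.3 regardless of how large those individual fold-changes are, which is the paper's point that phi, not pi, carries the signal.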

  16. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM{sup TM}) staged approach allowed the company to visualize the value and risk associated with the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas, and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results were used to populate the strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)

  17. TRICARE Applied Behavior Analysis (ABA) Benefit: Comparison with Medicaid and Commercial Benefits.

    Science.gov (United States)

    Maglione, Margaret; Kadiyala, Srikanth; Kress, Amii; Hastings, Jaime L; O'Hanlon, Claire E

    2017-01-01

    This study compared the Applied Behavior Analysis (ABA) benefit provided by TRICARE as an early intervention for autism spectrum disorder with similar benefits in Medicaid and commercial health insurance plans. The sponsor, the Office of the Under Secretary of Defense for Personnel and Readiness, was particularly interested in how a proposed TRICARE reimbursement rate decrease from $125 per hour to $68 per hour for ABA services performed by a Board Certified Behavior Analyst compared with reimbursement rates (defined as third-party payment to the service provider) in Medicaid and commercial health insurance plans. Information on ABA coverage in state Medicaid programs was collected from Medicaid state waiver databases; subsequently, Medicaid provider reimbursement data were collected from state Medicaid fee schedules. Applied Behavior Analysis provider reimbursement in the commercial health insurance system was estimated using Truven Health MarketScan® data. A weighted mean U.S. reimbursement rate was calculated for several services using cross-state information on the number of children diagnosed with autism spectrum disorder. Locations of potential provider shortages were also identified. Medicaid and commercial insurance reimbursement rates varied considerably across the United States. This project concluded that the proposed $68-per-hour reimbursement rate for services provided by a board certified analyst was more than 25 percent below the U.S. mean.
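
    The cross-state comparison rests on a weighted mean. A minimal sketch with invented state figures (the study's actual inputs come from Medicaid fee schedules and MarketScan data):

    ```python
    def weighted_mean_rate(rates, weights):
        """Mean reimbursement rate weighted by the number of children
        diagnosed with ASD in each state (weights are illustrative)."""
        return sum(r * w for r, w in zip(rates, weights)) / sum(weights)

    def percent_below(proposed, mean_rate):
        """How far a proposed rate sits below the weighted mean, in percent."""
        return 100.0 * (mean_rate - proposed) / mean_rate
    ```

    A state with three times the diagnosed population pulls the mean three times as hard, which is how a flat proposed rate can end up more than 25 percent below the U.S. mean even when it exceeds some individual state rates.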

  18. Topological data analysis (TDA) applied to reveal pedogenetic principles of European topsoil system.

    Science.gov (United States)

    Savic, Aleksandar; Toth, Gergely; Duponchel, Ludovic

    2017-05-15

    Recent developments in applied mathematics are bringing new tools capable of synthesizing knowledge across disciplines and of finding hidden relationships between variables. One such technique is topological data analysis (TDA), a fusion of classical exploration techniques, such as principal component analysis (PCA), with a topological point of view applied to the clustering of results. Various phenomena have already received new interpretations thanks to TDA, from the proper choice of sports teams to cancer treatments. For the first time, this technique has been applied in soil science to show the interaction between physical and chemical soil attributes and the main soil-forming factors, such as climate and land use. The topsoil data set of the Land Use/Land Cover Area Frame survey (LUCAS) was used as a comprehensive database consisting of approximately 20,000 samples, each described by 12 physical and chemical parameters. After the application of TDA, the results obtained were cross-checked against known grouping parameters, including five types of land cover, nine types of climate and the organic carbon content of soil. Some of the grouping characteristics observed using standard approaches were confirmed by TDA (e.g., organic carbon content), but novel subtle relationships (e.g., the magnitude of the anthropogenic effect in soil formation) were discovered as well. The importance of this finding is that TDA is a unique mathematical technique capable of extracting complex relations hidden in soil science data sets, giving the opportunity to see the influence of physicochemical, biotic and abiotic factors on topsoil formation through fresh eyes.

  19. Validation of AMPX-KENO code for criticality analysis under various moderator density condition

    International Nuclear Information System (INIS)

    Ahn, Joon Gi; Hwang, Hae Ryang; Kim, Hyeong Heon; Lee, Seong Hee

    1992-01-01

    A nuclear criticality safety analysis must be performed for the storage and handling facilities of fissionable materials, and the calculational method used to determine the effective multiplication factor must also be validated by comparison with proper experimental data. Benchmark calculations were performed for the criticality analysis of a new fuel storage facility using the AMPX-KENO computer code system. The references for the benchmark calculations are the critical experiments performed by the Nuclear Safety Department of the French Atomic Energy Commission to study the problems raised by the accidental sprinkling of a mist into a fuel storage. The bias and statistical uncertainties of the calculational method that will be applied in the criticality analysis of the new fuel storage facility were also evaluated.

  20. Sample density applied to the georeferenced monitoring of defoliating caterpillars in soybean crop

    Directory of Open Access Journals (Sweden)

    Cinei Teresinha Riffel

    2012-12-01

    Full Text Available Knowledge of the spatial and temporal distribution of insect pests in soybean crops, obtained through precision agriculture tools, has been identified as an important strategy in integrated pest management (IPM). The objective of this study was therefore to evaluate the influence of sample density on the monitoring of defoliating caterpillars in soybean. The experiment was conducted in a 48.0 ha experimental area in the municipality of Júlio de Castilhos, RS, Brazil, during the 2008/2009 growing season. Georeferenced monitoring was carried out on three regular sampling grids (50x50 m, 71x71 m and 100x100 m) and also using the traditional sampling method. Throughout the crop cycle, five evaluations of caterpillar infestation were performed for each sampling grid, two at the vegetative stage and three at the reproductive stage, using the beat cloth method. To analyze the spatial and temporal distribution of caterpillars in the area, the data were submitted to descriptive statistics and geostatistical analysis, using semivariograms and kriging to produce thematic maps. The results indicate that the evaluated sampling grids make it possible to characterize the spatial distribution of caterpillars and to model their spatial variability in the soybean crop. Georeferenced sampling and monitoring, with subsequent thematic mapping, constitute a potential addition to IPM strategies.
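
    The geostatistical step can be illustrated with an empirical semivariogram over georeferenced counts (a sketch only; the study fitted semivariogram models before kriging, and the point layout and tolerance handling here are invented):

    ```python
    import math

    def empirical_semivariance(points, values, lag, tol):
        """gamma(h): mean of 0.5 * (z_i - z_j)^2 over all point pairs whose
        separation distance falls within lag +/- tol."""
        terms = []
        for i in range(len(points)):
            for j in range(i + 1, len(points)):
                (x1, y1), (x2, y2) = points[i], points[j]
                if abs(math.hypot(x1 - x2, y1 - y2) - lag) <= tol:
                    terms.append(0.5 * (values[i] - values[j]) ** 2)
        return sum(terms) / len(terms) if terms else float("nan")
    ```

    Computing gamma for a sequence of lags (here, multiples of the 50 m grid spacing) yields the experimental semivariogram to which a model is fitted before kriging.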

  1. MHD-IPS analysis of relationship among solar wind density, temperature, and flow speed

    Science.gov (United States)

    Hayashi, Keiji; Tokumaru, Munetoshi; Fujiki, Ken'ichi

    2016-08-01

    The solar wind properties near the Sun are a decisive factor for the properties in the rest of the heliosphere. As such, determining realistic plasma density and temperature near the Sun is very important in models of the solar wind, specifically magnetohydrodynamic (MHD) models. We previously developed a tomographic analysis to reconstruct three-dimensional solar wind structures that simultaneously satisfy line-of-sight-integrated solar wind speed derived from interplanetary scintillation (IPS) observation data and the nonlinear MHD equations. In this study, we report a new type of our IPS-MHD tomography that seeks a three-dimensional MHD solution of the solar wind additionally matching near-Earth and/or Ulysses in situ measurement data for each Carrington rotation period. In this new method, parameterized relation functions of plasma density and temperature at 50 Rs are optimized through an iterative forward model minimizing the discrepancy with the in situ measurements. Satisfying three constraints, the derived 50 Rs maps of plasma quantities provide realistic observation-based information on the state of the solar wind near the Sun that cannot be well determined otherwise. The optimized plasma quantities exhibit long-term variations over solar cycles 21 to 24. The differences in plasma quantities derived from the optimized and original IPS-MHD tomography correlate with the source-surface magnetic field strength, which can in the future give new quantitative constraints and requirements for models of coronal heating and acceleration.

  2. Scientometric analysis and combined density-equalizing mapping of environmental tobacco smoke (ETS) research.

    Directory of Open Access Journals (Sweden)

    Karin Vitzthum

    Full Text Available BACKGROUND: Passive exposure to environmental tobacco smoke (ETS) is estimated to exert a major burden of disease. Currently, numerous countries have taken legal action to protect the population against ETS, and numerous studies have been conducted in this field. Scientometric methods should therefore be used to analyze the accumulated data, since no such approach has been available so far. METHODS AND RESULTS: A combination of scientometric methods and novel visualizing procedures was used, including density-equalizing mapping and radar charting techniques. 6,580 ETS-related studies published between 1900 and 2008 were identified in the ISI database. Using different scientometric approaches, a continuous increase of both quantitative and qualitative parameters was found. The combination with density-equalizing calculations demonstrated a leading position of the United States (2,959 items published) in terms of quantitative research activities. Charting techniques demonstrated numerous bi- and multilateral networks between different countries and institutions in this field. Again, a leading position of American institutions was found. CONCLUSIONS: This is the first comprehensive scientometric analysis of data on global scientific activities in the field of environmental tobacco smoke research. The present findings can be used as a benchmark for funding allocation processes.

  3. Propositional idea density in older men's written language: findings from the HIMS study using computerised analysis.

    Science.gov (United States)

    Spencer, Elizabeth; Ferguson, Alison; Craig, Hugh; Colyvas, Kim; Hankey, Graeme J; Flicker, Leon

    2015-02-01

    Decline in linguistic function has been associated with decline in cognitive function in previous research. This research investigated the informativeness of written language samples of Australian men aged 76 to 93 years from the Health in Men's Study (HIMS), using the Computerised Propositional Idea Density Rater (CPIDR 5.1). In total, 60,255 words in 1,147 comments were analysed using a linear mixed model for statistical analysis. Results indicated no relationship with education level (p = 0.79). Participants for whom English was not their first language showed slightly lower Propositional Idea Density (PD) scores (by 0.018 propositions per word). For those for whom English was their first language, mean PD was 0.494 propositions per word for comments under 60 words and 0.526 for comments over 60 words; text length was found to have an effect (p < 0.0001). The mean PD was higher than previously reported for men and lower than previously reported for a similar cohort of Australian women.
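
    At its core, PD is a ratio of propositions to words. CPIDR derives proposition counts from part-of-speech tags; the toy word list below stands in for that tagger and is purely illustrative:

    ```python
    # Toy stand-in for the POS classes (verbs, adjectives, adverbs,
    # prepositions, conjunctions) that CPIDR counts as propositions.
    PROPOSITION_WORDS = {"ran", "sat", "old", "big", "quickly", "slowly",
                         "and", "but", "on", "under"}

    def idea_density(text):
        """Propositions per word for a text sample (crude sketch of PD)."""
        words = text.lower().split()
        if not words:
            return 0.0
        props = sum(1 for w in words if w in PROPOSITION_WORDS)
        return props / len(words)
    ```

    On this scale, the study's reported means of roughly 0.49 to 0.53 propositions per word mean that about half of the words in a comment carry propositional content.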

  4. Direct Visualization of Orbital Flipping in Volborthite by Charge Density Analysis Using Detwinned Data

    Science.gov (United States)

    Sugawara, Kento; Sugimoto, Kunihisa; Fujii, Tatsuya; Higuchi, Takafumi; Katayama, Naoyuki; Okamoto, Yoshihiko; Sawa, Hiroshi

    2018-02-01

    The distribution of d-orbital valence electrons in volborthite [Cu3V2O7(OH)2 • 2H2O] was investigated by charge density analysis using multipole model refinement. Diffraction data were obtained by synchrotron radiation single-crystal X-ray diffraction experiments, and data reduction by detwinning of the multiple structural domains was performed using our in-house software. Using these high-quality data, we demonstrated that the water molecules in volborthite are located by hydrogen bonding in cavities formed between kagome lattice layers of CuO4(OH)2 and pillars of V2O7. Final multipole refinements before and after the structural phase transition directly visualized the deformation electron density of the valence electrons, allowing direct visualization of the flipping of the d(x2-y2) orbital, the highest 3d orbital occupied by the d9 electrons in volborthite. The developed techniques and software can be employed for investigations of the structural properties of systems with multiple structural domains.

  5. Electron-density critical points analysis and catastrophe theory to forecast structure instability in periodic solids.

    Science.gov (United States)

    Merli, Marcello; Pavese, Alessandro

    2018-03-01

    The critical points analysis of electron density, i.e. ρ(x), from ab initio calculations is used in combination with the catastrophe theory to show a correlation between ρ(x) topology and the appearance of instability that may lead to transformations of crystal structures, as a function of pressure/temperature. In particular, this study focuses on the evolution of coalescing non-degenerate critical points, i.e. such that ∇ρ(x_c) = 0 and λ1, λ2, λ3 ≠ 0 [λ being the eigenvalues of the Hessian of ρ(x) at x_c], towards degenerate critical points, i.e. ∇ρ(x_c) = 0 and at least one λ equal to zero. The catastrophe theory formalism provides a mathematical tool to model ρ(x) in the neighbourhood of x_c and allows one to rationalize the occurrence of instability in terms of electron-density topology and Gibbs energy. The phase/state transitions that TiO2 (rutile structure), MgO (periclase structure) and Al2O3 (corundum structure) undergo because of pressure and/or temperature are here discussed. An agreement of 3-5% is observed between the theoretical model and experimental pressure/temperature of transformation.

  6. Clinical usefulness of the clock drawing test applying Rasch analysis in predicting cognitive impairment.

    Science.gov (United States)

    Yoo, Doo Han; Lee, Jae Shin

    2016-07-01

    [Purpose] This study examined the clinical usefulness of the clock drawing test applying Rasch analysis for predicting the level of cognitive impairment. [Subjects and Methods] A total of 187 stroke patients with cognitive impairment were enrolled in this study. The 187 patients were evaluated with the clock drawing test developed through Rasch analysis, along with the mini-mental state examination as a cognitive evaluation tool. An analysis of variance was performed to examine the significance of the mini-mental state examination and the clock drawing test according to the general characteristics of the subjects. Receiver operating characteristic analysis was performed to determine the cutoff point for cognitive impairment and to calculate sensitivity and specificity values. [Results] Comparison of the clock drawing test with the mini-mental state examination showed significant differences according to gender, age, education, and affected side. A total clock drawing test score of 10.5, selected as the cutoff point to identify cognitive impairment, showed sensitivity, specificity, Youden index, positive predictive, and negative predictive values of 86.4%, 91.5%, 0.8, 95%, and 88.2%, respectively. [Conclusion] The clock drawing test is believed to be useful in assessments and interventions based on its excellent ability to identify cognitive disorders.
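
    The cutoff selection step can be sketched as a scan of the Youden index over candidate cutoffs. The scores below are synthetic, and the convention that lower scores indicate impairment is an assumption for illustration:

    ```python
    def roc_metrics(scores, labels, cutoff):
        """Sensitivity, specificity and Youden index at a given cutoff.
        Scores at or below the cutoff are called impaired (positive)."""
        tp = sum(1 for s, y in zip(scores, labels) if s <= cutoff and y)
        fn = sum(1 for s, y in zip(scores, labels) if s > cutoff and y)
        tn = sum(1 for s, y in zip(scores, labels) if s > cutoff and not y)
        fp = sum(1 for s, y in zip(scores, labels) if s <= cutoff and not y)
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        return sens, spec, sens + spec - 1

    def best_cutoff(scores, labels):
        """Cutoff maximizing the Youden index over observed score values."""
        return max(set(scores), key=lambda c: roc_metrics(scores, labels, c)[2])
    ```

    A scan of this kind is how a cutoff such as the reported 10.5 would be selected from the empirical score distribution.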

  7. Performance analysis of damaged buildings applying scenario of related non-linear analyses and damage coefficient

    Directory of Open Access Journals (Sweden)

    Ćosić Mladen

    2015-01-01

    Full Text Available The paper presents a methodology developed for analyzing damage to structures exposed to accidental and seismic actions. The procedure is based on non-linear numerical analysis, taking into account the principles of Performance-Based Seismic Design (PBSD). The stiffness matrix from the analysis of vertical actions is used as the initial stiffness matrix in a non-linear analysis that simulates the collapse of individual ground-floor columns, forming a number of possible scenarios. At the end of each column-collapse simulation, the resulting stiffness matrix is used as the initial stiffness matrix for Non-linear Static Pushover Analysis (NSPA) of bi-directional seismic action (X and Y directions). Target displacement analyses were conducted using the Capacity Spectrum Method (CSM). The structure's condition was assessed based on the calculated global and inter-storey drifts and on the damage coefficient developed here. The damage level of the building was established using an integrated approach based on global and inter-storey drifts, so that, depending on the level of displacement at which the drifts are identified, a more reliable answer can be obtained. Applying the damage coefficient, a prompt, reliable and accurate indication can be obtained of the damage level of the entire structure in the capacity domain, from the elastic and non-linear ranges to the collapse state.
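
    A minimal sketch of the drift computation, paired with an illustrative damage coefficient that simply normalizes peak drift between assumed elastic and collapse limits (the paper's actual coefficient definition is not reproduced here):

    ```python
    def interstorey_drifts(displacements, heights):
        """Inter-storey drift ratio per storey, from lateral displacements
        at each floor level (ground assumed at zero) and storey heights."""
        drifts, prev = [], 0.0
        for d, h in zip(displacements, heights):
            drifts.append((d - prev) / h)
            prev = d
        return drifts

    def damage_coefficient(drifts, yield_drift, collapse_drift):
        """Illustrative coefficient: peak drift normalized between the
        elastic limit and the collapse drift, clipped to [0, 1]."""
        peak = max(abs(d) for d in drifts)
        x = (peak - yield_drift) / (collapse_drift - yield_drift)
        return min(1.0, max(0.0, x))
    ```

    A value of 0 then corresponds to response within the elastic range and 1 to the collapse state, matching the capacity-domain reading described above.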

  8. Cloud Computing and Internet of Things Concepts Applied on Buildings Data Analysis

    Directory of Open Access Journals (Sweden)

    Hebean Florin-Adrian

    2017-12-01

    Full Text Available Initially used and developed for the IT industry, the cloud computing and Internet of Things concepts are now found in many sectors of activity, the building industry being one of them. They can be defined as a global computing, monitoring and analysis network, composed of hardware and software resources, with the ability to allocate and dynamically relocate shared resources in accordance with user requirements. Data analysis and process optimization techniques based on these new concepts are increasingly used in the building industry, especially for the optimal operation of building installations and for increasing occupant comfort. The multitude of building data taken from HVAC sensors, from automation and control systems, and from the other systems connected to the network is optimally managed by these new analysis techniques. Such techniques can identify and manage issues that arise in the operation of building installations, including critical alarms, nonfunctional equipment, occupant comfort problems (for example, temperature deviations above or below the set point), and other issues related to equipment maintenance. In this study, a new approach to building control is presented, and a generalized methodology for applying data analysis to building services data is described. This methodology is then demonstrated using two case studies.

  9. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high-resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings that correctly segmented 67% of the total possible diatom valves and fragments from broad fields of view: 183 light microscope images containing 255 diatom particles were examined; of these 255 particles, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software. Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded 216 particles in 68 seconds, highlighting an approximately five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has the potential to be an effective tool for assisting taxonomists with diatom enumeration by completing a large portion of the analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.

  10. Vibrational spectroscopy and density functional theory analysis of 3-O-caffeoylquinic acid

    Science.gov (United States)

    Mishra, Soni; Tandon, Poonam; Eravuchira, Pinkie J.; El-Abassy, Rasha M.; Materny, Arnulf

    2013-03-01

    Density functional theory (DFT) calculations were performed to investigate the geometric, vibrational, and electronic properties of the chlorogenic acid isomer 3-CQA ((1R,3R,4S,5R)-3-{[(2E)-3-(3,4-dihydroxyphenyl)prop-2-enoyl]oxy}-1,4,5-trihydroxycyclohexanecarboxylic acid), a major phenolic compound in coffee. DFT calculations with the 6-311G(d,p) basis set give results in good agreement with the measured vibrational spectra. The electrostatic potential mapped onto an isodensity surface has been obtained. A natural bond orbital (NBO) analysis has been performed in order to study intramolecular bonding, interactions among bonds, and delocalization of unpaired electrons. HOMO-LUMO studies give insight into the interaction of the molecule with other species. The calculated HOMO and LUMO energies indicate that charge transfer occurs within the molecule.

  11. eRDF Analyser: An interactive GUI for electron reduced density function analysis

    Directory of Open Access Journals (Sweden)

    Janaki Shanmugam

    2017-01-01

    eRDF Analyser is an interactive MATLAB GUI for reduced density function (RDF) or pair distribution function (PDF) analysis of amorphous and polycrystalline materials to study their local structure. It is developed as an integrated tool with an easy-to-use interface that offers a streamlined approach to extract RDF from electron diffraction data without the need for external routines. The software incorporates recent developments in scattering factor parameterisation and an automated fitting routine for the atomic scattering curve. It also features an automated optimisation routine for determination of the position of the centre of diffraction patterns recorded using both central and off-centre locations of the incident beam. It is available in both open source code (MATLAB m-file) and executable form.

  12. eRDF Analyser: An interactive GUI for electron reduced density function analysis

    Science.gov (United States)

    Shanmugam, Janaki; Borisenko, Konstantin B.; Chou, Yu-Jen; Kirkland, Angus I.

    eRDF Analyser is an interactive MATLAB GUI for reduced density function (RDF) or pair distribution function (PDF) analysis of amorphous and polycrystalline materials to study their local structure. It is developed as an integrated tool with an easy-to-use interface that offers a streamlined approach to extract RDF from electron diffraction data without the need for external routines. The software incorporates recent developments in scattering factor parameterisation and an automated fitting routine for the atomic scattering curve. It also features an automated optimisation routine for determination of the position of the centre of diffraction patterns recorded using both central and off-centre locations of the incident beam. It is available in both open source code (MATLAB m-file) and executable form.
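
    The core numerical step behind RDF extraction in tools of this kind is a sine transform of the reduced scattering intensity. A minimal sketch of that step (not the eRDF Analyser code; normalization conventions vary between implementations, and the function name is an assumption):

```python
import numpy as np

def reduced_density_function(s, phi, r):
    """Sine transform of the reduced intensity phi(s), up to normalization.

    G(r) = 8*pi * integral phi(s) * sin(2*pi*s*r) ds, evaluated as a
    Riemann sum on a uniform s grid (s in 1/angstrom, r in angstrom).
    """
    ds = s[1] - s[0]
    S, R = np.meshgrid(s, r)
    return 8 * np.pi * np.sum(phi * np.sin(2 * np.pi * S * R), axis=1) * ds

# Sanity check: a reduced intensity oscillating as sin(2*pi*s*r0)
# should transform into a G(r) that peaks at r = r0.
s = np.linspace(0.0, 20.0, 4001)
phi = np.sin(2 * np.pi * s * 2.0)          # r0 = 2.0 angstrom
r = np.linspace(0.5, 4.0, 351)
G = reduced_density_function(s, phi, r)
print(r[np.argmax(G)])  # peak near 2.0
```

In practice phi(s) is obtained by fitting and subtracting the atomic scattering background from the azimuthally averaged diffraction intensity, which is the part the automated fitting routine handles.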

  13. Energy decomposition analysis of single bonds within Kohn-Sham density functional theory.

    Science.gov (United States)

    Levine, Daniel S; Head-Gordon, Martin

    2017-11-28

    An energy decomposition analysis (EDA) for single chemical bonds is presented within the framework of Kohn-Sham density functional theory, based on spin projection equations that are exact within wave function theory. Chemical bond energies can then be understood in terms of stabilization caused by spin-coupling, augmented by dispersion, polarization, and charge transfer, in competition with destabilizing Pauli repulsions. The EDA reveals distinguishing features of chemical bonds ranging across nonpolar, polar, ionic, and charge-shift bonds. The effect of electron correlation is assessed by comparison with Hartree-Fock results. Substituent effects are illustrated by comparing the C-C bond in ethane against that in bis(diamantane), and the dispersion stabilization in the latter is quantified. Finally, three metal-metal bonds in experimentally characterized compounds are examined: a [Formula: see text]-[Formula: see text] dimer, the Zn-Zn bond in dizincocene, and the Mn-Mn bond in dimanganese decacarbonyl.

  14. Assessing climate model software quality: a defect density analysis of three models

    Directory of Open Access Journals (Sweden)

    J. Pipitone

    2012-08-01

    A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated. Thus, in order to trust a climate model, one must trust that the software it is built from is built correctly. Our study explores the nature of software quality in the context of climate modelling. We performed an analysis of defect reports and defect fixes in several versions of leading global climate models by collecting defect data from bug tracking systems and version control repository comments. We found that the climate models all have very low defect densities compared to well-known, similarly sized open-source projects. We discuss the implications of our findings for the assessment of climate model software trustworthiness.
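
    Defect density itself is a simple ratio, conventionally reported per thousand source lines of code (KLOC). A small illustrative helper (the project names and counts below are invented, not the study's data):

```python
def defect_density(defects, sloc):
    """Post-release defect density in defects per KLOC."""
    return defects / (sloc / 1000.0)

# Hypothetical counts in the spirit of the study's comparison.
projects = {
    "climate-model-A": (30, 400_000),
    "open-source-B": (900, 350_000),
}
for name, (defects, sloc) in projects.items():
    print(name, round(defect_density(defects, sloc), 3))
```

The comparison only makes sense when "defect" is counted the same way across projects, which is why the study mines both bug trackers and version-control comments.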

  15. Global SAXS Data Analysis for Multilamellar Vesicles: Evolution of the Scattering Density Profile (SDP) Model

    Energy Technology Data Exchange (ETDEWEB)

    Heftberger, Peter [University of Graz, Institute of Molecular Biosciences, Austria]; Kollmitzer, Benjamin [University of Graz, Institute of Molecular Biosciences, Austria]; Heberle, Frederick A. [ORNL]; Pan, Jianjun [ORNL]; Rappolt, Michael [University of Leeds, UK]; Amenitsch, Heinz [Graz University of Technology]; Kucerka, Norbert [Atomic Energy of Canada Limited (AECL), Canadian Neutron Beam Centre (CNBC) and Comenius University]; Katsaras, John [ORNL]; Pabst, Georg [University of Graz, Institute of Molecular Biosciences, Austria]

    2014-01-01

    The highly successful scattering density profile (SDP) model, used to jointly analyze small-angle X-ray and neutron scattering data from unilamellar vesicles, has been adapted for use with data from fully hydrated, liquid crystalline multilamellar vesicles (MLVs). Using a genetic algorithm, this new method is capable of providing high-resolution structural information, as well as determining bilayer elastic bending fluctuations from standalone X-ray data. Structural parameters such as bilayer thickness and area per lipid were determined for a series of saturated and unsaturated lipids, as well as binary mixtures with cholesterol. The results are in good agreement with previously reported SDP data, which used both neutron and X-ray data. The inclusion of deuterated and non-deuterated MLV neutron data in the analysis improved the lipid backbone information but did not improve, within experimental error, the structural data regarding bilayer thickness and area per lipid.

  16. Common reduced spaces of representation applied to multispectral texture analysis in cosmetology

    Science.gov (United States)

    Corvo, Joris; Angulo, Jesus; Breugnot, Josselin; Borbes, Sylvie; Closs, Brigitte

    2016-03-01

    Principal component analysis (PCA) is a multivariate data analysis technique widely used in fields such as biology, ecology and economics to reduce data dimensionality while retaining the most important information. It is becoming standard practice in multispectral/hyperspectral imaging, since such multivariate data generally suffer from a high level of redundancy. Nevertheless, by definition, PCA is meant to be applied to a single multispectral/hyperspectral image at a time. When several images have to be treated, running a PCA on each image would generate image-specific reduced spaces, which is not suitable for comparison between results. Thus, we focus on two PCA-based algorithms that define common reduced spaces of representation. The first method comes from the literature and is computed from the barycenter covariance matrix. The second algorithm, which we designed, corrects standard PCA using permutations and inversions of eigenvectors. These dimensionality reduction methods are used within the context of a cosmetological study of a foundation make-up. The available data are in-vivo multispectral images of skin acquired on different volunteers in time series. The main purpose of the study is to characterize make-up degradation, especially in terms of texture analysis. Results are validated by statistical prediction of the time elapsed since applying the product. The PCA algorithms produce eigenimages that separately enhance skin components (pores, radiance, vessels...). From these eigenimages, we extract morphological texture descriptors and attempt a time prediction. The accuracy obtained with the common reduced spaces outperforms that of classical PCA. In this paper, we detail how PCA is extended to the multiple-group case and explain the advantages of common reduced spaces when studying several multispectral images.
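
    The first of the two approaches, a shared basis computed from the barycenter (averaged) covariance matrix, can be sketched as follows. This is a schematic reconstruction, not the authors' code; all names and the toy data are assumptions.

```python
import numpy as np

def common_pca_basis(images, n_components):
    """Shared PCA basis from the barycenter (average) covariance matrix.

    images: list of (n_pixels, n_bands) arrays, one per multispectral
    image. Returns (mean_spectrum, components) so that every image can
    be projected into the same reduced space.
    """
    covs = [np.cov(X, rowvar=False) for X in images]
    C = np.mean(covs, axis=0)               # barycenter covariance
    w, V = np.linalg.eigh(C)                # eigenvalues ascending
    order = np.argsort(w)[::-1][:n_components]
    mean_spectrum = np.mean([X.mean(axis=0) for X in images], axis=0)
    return mean_spectrum, V[:, order]

def project(X, mean_spectrum, components):
    """Project one image's pixel spectra into the common reduced space."""
    return (X - mean_spectrum) @ components

rng = np.random.default_rng(0)
imgs = [rng.normal(size=(500, 6)) for _ in range(3)]  # 3 toy 6-band images
mu, W = common_pca_basis(imgs, n_components=2)
Y = project(imgs[0], mu, W)
print(Y.shape)  # (500, 2)
```

Because every image is projected onto the same eigenvectors, texture descriptors computed on the resulting eigenimages are directly comparable across volunteers and time points, which is the property per-image PCA lacks.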

  17. IMPORTANCE OF APPLYING DATA ENVELOPMENT ANALYSIS IN CASE OF HIGHER EDUCATIONAL INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Labas Istvan

    2015-07-01

    Today it is increasingly true that, owing to scarce resources and limitless user demands, higher educational institutions must be characterized by strong goal rationality. In Hungary, the funding of the higher educational system is currently undergoing a complete transformation, so leadership has to reckon continuously with changes in the environment and, in tune with them, modify existing goals. It is becoming more and more important to measure the effectiveness of organizations, and of organizational units pursuing the same or similar activities, relative to each other. Benchmarking supports this procedure: it is a tool of analysis and planning that allows organizations to be compared with the best of their competitors. Applied to higher educational institutions, it is a procedure that compares the processes and results of institutions' different functional areas in order to bring to light opportunities for rationalization as well as for quality and performance improvement. Practices already developed and applied by other organizations can be managed and used as breakthrough possibilities on the way to more effective management. The main goal of my monograph is to show an application of the Data Envelopment Analysis (DEA) method in higher education. DEA is a performance measurement methodology, part of benchmarking, that uses linear programming. By means of its application, the effectiveness of different decision-making units can be compared numerically. In our rapidly varying environment, managerial decision making can be largely supported by information that numerically identifies which organizational units and activities are effective or less effective. Its advantage is that
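
    The DEA computation described above reduces to one small linear program per decision-making unit (DMU). A sketch of the textbook input-oriented CCR model using scipy (a generic formulation, not the monograph's implementation; the data below are invented):

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_input(X, Y):
    """Input-oriented CCR efficiency score for each DMU.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs).
    For DMU o:  min theta
                s.t.  sum_j lam_j * x_j <= theta * x_o
                      sum_j lam_j * y_j >= y_o,   lam >= 0
    """
    n, m = X.shape
    _, s = Y.shape
    scores = []
    for o in range(n):
        # Decision variables: [theta, lam_1 .. lam_n].
        c = np.r_[1.0, np.zeros(n)]
        # Inputs:  X^T lam - theta * x_o <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Outputs: -Y^T lam <= -y_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(m), -Y[o]],
                      bounds=[(0, None)] * (n + 1))
        scores.append(res.fun)
    return np.array(scores)

# Two hypothetical units producing one output from one input:
# the second uses twice the input for the same output.
X = np.array([[2.0], [4.0]])
Y = np.array([[1.0], [1.0]])
scores = dea_ccr_input(X, Y)
print(scores)  # about [1.0, 0.5]
```

A score of 1 marks a unit on the efficient frontier; a score of 0.5 says the unit could, in principle, produce its outputs with half its inputs.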

  18. Type 2 Diabetes Research Yield, 1951-2012: Bibliometrics Analysis and Density-Equalizing Mapping.

    Directory of Open Access Journals (Sweden)

    Fiona Geaney

    The objective of this paper is to provide a detailed evaluation of type 2 diabetes mellitus research output from 1951-2012, using large-scale data analysis, bibliometric indicators and density-equalizing mapping. Data were retrieved from the Science Citation Index Expanded database, one of the seven curated databases within Web of Science. Using the Boolean operators "OR", "AND" and "NOT", a search strategy was developed to estimate the total number of published items. Only studies with an English abstract were eligible. Type 1 diabetes and gestational diabetes items were excluded. Specific software developed for the database analysed the data. Information including titles, authors' affiliations and publication years was extracted from all files and exported to Excel. Density-equalizing mapping was conducted as described by Groenberg-Kloft et al., 2008. A total of 24,783 items were published and cited 476,002 times. The greatest number of outputs was published in 2010 (n=2,139). The United States contributed 28.8% of the overall output, followed by the United Kingdom (8.2%) and Japan (7.7%). Bilateral cooperation was most common between the United States and the United Kingdom (n=237). Harvard University produced 2% of all publications, followed by the University of California (1.1%). The leading journals were Diabetes, Diabetologia and Diabetes Care, contributing 9.3%, 7.3% and 4.0% of the research yield, respectively. In conclusion, the volume of research is rising in parallel with the increasing global burden of disease due to type 2 diabetes mellitus. Bibliometric analysis provides useful information to scientists and funding agencies involved in the development and implementation of research strategies to address global health issues.

  19. Precision of periprosthetic bone mineral density measurements using Hologic Windows versus DOS-based analysis software.

    Science.gov (United States)

    Shetty, Nitin R; Hamer, Andrew J; Stockley, Ian; Eastell, Richard; Wilkinson, J Mark

    2006-01-01

    Dual energy X-ray absorptiometry (DXA) is a precise tool for measuring bone mineral density (BMD) around total joint prostheses. The Hologic "metal-removal hip" analysis package (Hologic Inc., Waltham, MA) is a Microsoft DOS-based analysis platform that has undergone a change in the operating platform to a Microsoft Windows-based system that has also incorporated changes to DXA image manipulation on-screen. We evaluated the impact of these changes on instrument precision by analysis of sequential DXA scans taken on the same day using the Hologic QDR-4500A fan beam densitometer (Hologic Inc.) in 29 subjects after total hip arthroplasty. The coefficient of variation percentage (CV%) for the net pelvic region was 3.04 for Windows versus 2.36 for DOS (p>0.05). The CV% for the net femoral region was 1.75 for Windows versus 1.51 for DOS (p>0.05). Absolute BMD values for the net pelvic and net femoral regions were similar (Bland-Altman, Windows minus DOS; pelvic region mean=-1.0%; femoral region mean=1.3%; p>0.05 for both comparisons). Our results suggest that scans analyzed using each platform may be used interchangeably without the need for a calibration correction.
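
    Short-term precision figures such as the CV% values above are conventionally computed from duplicate same-day scans with the RMS method. A sketch of that standard calculation (the BMD values below are invented for illustration, not the study's data):

```python
import numpy as np

def precision_cv_percent(scan1, scan2):
    """Short-term precision from duplicate scans (RMS method).

    scan1, scan2: paired same-day BMD measurements (g/cm^2) per subject.
    SD_rms = sqrt(sum(d_i^2) / (2*n));  CV% = 100 * SD_rms / grand mean.
    """
    d = np.asarray(scan1) - np.asarray(scan2)
    sd_rms = np.sqrt(np.sum(d**2) / (2 * len(d)))
    grand_mean = np.mean(np.r_[scan1, scan2])
    return 100.0 * sd_rms / grand_mean

scan1 = np.array([1.00, 0.95, 1.10])
scan2 = np.array([1.02, 0.93, 1.08])
cv = precision_cv_percent(scan1, scan2)
print(round(cv, 2))  # 1.4
```

Comparing two analysis platforms then amounts to comparing the CV% each produces on the same pairs of scans, as the study does for the pelvic and femoral regions.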

  20. Bone Density and Clinical Periodontal Attachment in Postmenopausal Women: A Systematic Review and Meta-Analysis.

    Science.gov (United States)

    Penoni, D C; Fidalgo, T K S; Torres, S R; Varela, V M; Masterson, D; Leão, A T T; Maia, L C

    2017-03-01

    Osteoporosis is a systemic skeletal disease characterized by low bone mineral density (BMD) and has been considered a risk factor for periodontal disease. The aim of this systematic review and meta-analysis was to verify the scientific evidence for the association of periodontal attachment loss with low BMD in postmenopausal women. A systematic search of the literature was performed in databases until August 2016, in accordance with Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Eligibility criteria included studies that compared clinical attachment loss (CAL) between postmenopausal women with low and normal BMD. Studies using similar methodology, with lower and higher risk of bias, were pooled into 3 different meta-analyses to compare CAL among women with normal BMD, osteoporosis, and osteopenia. In the first meta-analysis, mean CAL was compared among groups. In the other 2 meta-analyses, the mean percentages of sites with CAL ≥4 mm and ≥6 mm were respectively compared among groups. From 792 unique citations, 26 articles were selected for the qualitative synthesis. Eleven of the studies were appraised as presenting low risk of bias, and the association between low BMD and CAL was observed in 10 of these studies. Thirteen cross-sectional articles were included in the meta-analysis for osteoporosis and 9 in the osteopenia analysis. Women with low BMD presented greater mean CAL than those with normal BMD (osteoporosis = 0.34 mm [95% CI, 0.20-0.49]). They also presented greater mean percentages of sites with CAL ≥4 mm (osteoporosis = 3.04 [95% CI, 1.23-4.85], P = 0.001; osteopenia = 1.74 [95% CI, 0.36-3.12], P = 0.01) and with CAL ≥6 mm (osteoporosis = 5.07 [95% CI, 2.74-7.40]). Women with osteoporosis or osteopenia may exhibit greater CAL compared with women with normal BMD.

  1. High-density lipoprotein cholesterol increase and non-cardiovascular mortality: a meta-analysis.

    Science.gov (United States)

    Burillo, Elena; Andres, Eva Maria; Mateo-Gallego, Rocio; Fiddyment, Sarah; Jarauta, Estibaliz; Cenarro, Ana; Civeira, Fernando

    2010-09-01

    Many observational prospective studies have confirmed the inverse relationship between high-density lipoprotein (HDL) cholesterol and coronary heart disease. However, the potential benefit of pharmacologically increasing HDL cholesterol has not been clearly demonstrated; moreover, an increase in total mortality has been reported in some interventions. The objective of this meta-analysis was to determine the relationship between HDL cholesterol increase and non-cardiovascular mortality in randomised trials. The authors searched Medline up to December 2008. Four reviewers identified randomised trials in which, through different types of interventions, the HDL cholesterol increase in the treatment group was >4% compared to the control group, both groups reported non-cardiovascular mortality separately, and the duration of the study was at least one year. Data on HDL cholesterol concentrations and deaths were collected as they appeared in the original studies. If necessary, reviewers calculated data using trial information. The meta-regression analysis included 44 articles corresponding to 107,773 participants. The analysis showed an association between HDL cholesterol increase and non-cardiovascular mortality (p=0.023); however, the association disappeared when the ILLUMINATE (Investigation of Lipid Level Management to Understand its Impact in Atherosclerotic Events) trial was excluded from the analysis (p=0.972). The meta-regression results suggest that increases in HDL cholesterol of up to 40% are not associated with higher non-cardiovascular death. The increase in adverse events observed in some trials where HDL cholesterol was raised in large amounts could be related to the drug mechanisms more than to the HDL cholesterol increase itself.

  2. Prognostic significance of tumor-associated macrophages density in gastric cancer: a systematic review and meta-analysis.

    Science.gov (United States)

    Liu, Jiu Y; Yang, Xiao J; Geng, Xia F; Huang, Chao Q; Yu, Yang; Li, Yan

    2016-10-01

    Tumor-associated macrophages (TAM) play a dual role in the development of gastric cancer (GC). This study aims to analyze the prognostic value of TAM density in GC patients. We conducted a meta-analysis of 11 studies (N=1043) to investigate the correlation between TAM density and the overall survival (OS) or disease-free survival (DFS) of GC patients. Pooled hazard ratios (HRs) and 95% confidence intervals (CIs) were calculated with the STATA statistical software. The HR of OS of GC patients with high-density TAM is 1.56 (95% CI: 0.90~2.22). Subgroup analysis by ethnicity also revealed no significant effect of TAM density on OS among either Asians or Caucasians (Asians: HR=1.47, 95% CI: 0.76~2.18). This meta-analysis provides empirical evidence that TAM density is not an independent predictor of the survival of GC patients.

  3. Analysis of the series resistance and interface state densities in metal semiconductor structures

    Energy Technology Data Exchange (ETDEWEB)

    Gueler, G [Department of Physics, Faculty of Education, University of Adiyaman, 02100 Adiyaman (Turkey); Guellue, Oe [Department of Physics, Faculty of Sciences and Arts, Atatuerk University, 25240 Erzurum (Turkey); Karatas, S [Department of Physics, Faculty of Sciences and Arts, University of Kahramanmaras Suetcue Imam, 46100 Kahramanmaras (Turkey); Bakkaloglu, Oe F, E-mail: skaratas@ksu.edu.t [Department of Engineering Physics, Faculty of Engineering Physics, University of Gaziantep, 27310 Gaziantep (Turkey)

    2009-03-01

    The electrical properties of the Co/n-Si metal-semiconductor (MS) Schottky structure were investigated at room temperature using current-voltage (I-V) characteristics. The characteristic parameters of the structure, such as barrier height, ideality factor and series resistance, were determined from the I-V measurements. The barrier height values obtained from Norde's function were compared with those from the Cheung functions, and good agreement was found between the barrier heights from both methods. The series resistance values calculated with Cheung's two methods were also compared and found to agree with each other. However, the series resistance values obtained from the Cheung functions and Norde's function do not agree, because the Cheung functions are applied only to the non-linear region (high-voltage region) of the forward-bias I-V characteristics. Furthermore, the energy distribution of the interface state density was determined from the forward-bias I-V characteristics by taking into account the bias dependence of the effective barrier height. The results show the presence of a thin interfacial layer between the metal and the semiconductor.
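
    The first Cheung function mentioned above, dV/d(ln I) = n*kT/q + Rs*I, yields the ideality factor and series resistance from a straight-line fit. A sketch on synthetic forward-bias data (the diode parameters are invented for illustration, not the paper's measurements):

```python
import numpy as np

KT_Q = 0.02585  # thermal voltage kT/q in volts near 300 K

def cheung_fit(V, I):
    """Extract n and Rs from the first Cheung function.

    dV/d(ln I) = n*kT/q + Rs*I, so a linear fit of dV/d(ln I)
    against I gives Rs (slope) and n (intercept / (kT/q)).
    """
    dv_dlni = np.gradient(V, np.log(I))
    slope, intercept = np.polyfit(I, dv_dlni, 1)
    return intercept / KT_Q, slope   # (n, Rs)

# Synthetic forward-bias I-V curve for a diode with known parameters.
n_true, rs_true, i0 = 1.8, 50.0, 1e-9
I = np.logspace(-6, -3, 200)                     # amperes
V = n_true * KT_Q * np.log(I / i0) + rs_true * I
n_est, rs_est = cheung_fit(V, I)
print(round(n_est, 2), round(rs_est, 1))  # close to 1.8 and 50.0
```

The fit uses the high-current region where the Rs*I term is appreciable, which is exactly why, as the abstract notes, Cheung-derived values need not match Norde's function, which uses the whole curve.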

  4. UNCERT: geostatistics, uncertainty analysis and visualization software applied to groundwater flow and contaminant transport modeling

    International Nuclear Information System (INIS)

    Wingle, W.L.; Poeter, E.P.; McKenna, S.A.

    1999-01-01

    UNCERT is a 2D and 3D geostatistics, uncertainty analysis and visualization software package applied to ground water flow and contaminant transport modeling. It is a collection of modules that provides tools for linear regression, univariate statistics, semivariogram analysis, inverse-distance gridding, trend-surface analysis, simple and ordinary kriging and discrete conditional indicator simulation. Graphical user interfaces for MODFLOW and MT3D, ground water flow and contaminant transport models, are provided for streamlined data input and result analysis. Visualization tools are included for displaying data input and output. These include, but are not limited to, 2D and 3D scatter plots, histograms, box and whisker plots, 2D contour maps, surface renderings of 2D gridded data and 3D views of gridded data. By design, UNCERT's graphical user interface and visualization tools facilitate model design and analysis. There are few built-in restrictions on data set sizes, and each module (with two exceptions) can be run in either graphical or batch mode. UNCERT is in the public domain and is available from the World Wide Web with complete on-line and printable (PDF) documentation. UNCERT is written in ANSI-C with a small amount of FORTRAN77, for UNIX workstations running X-Windows and Motif (or Lesstif). This article discusses the features of each module and demonstrates how they can be used individually and in combination. The tools are applicable to a wide range of fields and are currently used by researchers in ground water, mining, mathematics, chemistry and geophysics, among other disciplines. (Copyright (c) 1999 Elsevier Science B.V., Amsterdam. All rights reserved.)
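
    Of the modules listed, inverse-distance gridding is the simplest to sketch. A minimal version (illustrative only, not UNCERT's implementation):

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation, a simple gridding scheme.

    z(q) = sum_i w_i z_i / sum_i w_i  with  w_i = 1 / d(q, x_i)^power.
    Query points closer than `eps` to a sample return that sample's value.
    """
    out = np.empty(len(xy_query))
    for k, q in enumerate(np.asarray(xy_query, dtype=float)):
        d = np.linalg.norm(xy_known - q, axis=1)
        if d.min() < eps:
            out[k] = z_known[np.argmin(d)]   # exact hit on a data point
        else:
            w = 1.0 / d**power
            out[k] = np.dot(w, z_known) / w.sum()
    return out

pts = np.array([[0.0, 0.0], [1.0, 0.0]])
vals = np.array([10.0, 20.0])
est = idw(pts, vals, [[0.5, 0.0], [0.0, 0.0]])
print(est)  # [15. 10.]
```

Kriging, by contrast, derives the weights from the fitted semivariogram rather than from distance alone, which is why the semivariogram-analysis module precedes the kriging module in a typical workflow.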

  5. A systematic review and meta-analysis of the association between eating disorders and bone density.

    Science.gov (United States)

    Robinson, L; Aldridge, V; Clark, E M; Misra, M; Micali, N

    2016-06-01

    This meta-analysis investigates the effect of an eating disorder on bone mineral density in two eating disorder subtypes. Following conflicting findings in previous literature, this study finds that not only anorexia nervosa, but also bulimia nervosa has a detrimental effect on BMD. Key predictors of this relationship are discussed. This systematic review and meta-analysis investigates bone mineral density (BMD) in individuals with anorexia nervosa (AN) and bulimia nervosa (BN) in comparison to healthy controls (HCs). AN has been associated with low BMD and a risk of fractures, and mixed results have been obtained for the relationship between BN and BMD. Deciphering the effects of these two ED subtypes on BMD will separate the effect of low body weight (a characteristic of AN) from the effects of periods of restrictive eating and malnutrition, which are common to both AN and BN. We conducted a systematic search through the electronic databases MedLine, EMBASE and PsychInfo and the Cochrane Library to investigate and quantify this relationship. We screened 544 articles and included 27 studies in a random-effects meta-analysis and calculated the standardised mean difference (SMD) in BMD between women with a current diagnosis of AN (n = 785) vs HCs (n = 979) and a current diagnosis of BN (n = 187) vs HCs (n = 350). The outcome measures investigated were spinal, hip, femoral neck and whole body BMD measured by DXA or DPA scanning. A meta-regression investigated the effect of factors including age, duration since diagnosis, duration of amenorrhea and BMI on BMD. The mean BMI of participants was 16.65 kg/m(2) (AN), 21.16 kg/m(2) (BN) and 22.06 kg/m(2) (HC). Spine BMD was lowest in AN subjects (SMD, -3.681; 95 % CI, -4.738, -2.625; p < 0.0001), but also lower in BN subjects compared with HCs (SMD, -0.472; 95 % CI, -0.688, -0.255; p < 0.0001). Hip, whole body and femoral neck BMD were reduced to a statistically significant level in AN but not BN.
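
    The random-effects pooling used in meta-analyses of this kind is commonly the DerSimonian-Laird estimator. A compact sketch of that generic method (not the authors' code; the per-study effects below are invented):

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Random-effects pooling (DerSimonian-Laird) of per-study SMDs.

    effects: per-study standardised mean differences.
    variances: their within-study variances.
    Returns (pooled_effect, 95% CI lower, 95% CI upper).
    """
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                          # fixed-effect weights
    y_fe = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fe) ** 2)      # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - df) / c)        # between-study variance
    w_re = 1.0 / (v + tau2)
    mu = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return mu, mu - 1.96 * se, mu + 1.96 * se

effects = [-0.6, -0.4, -0.5]            # hypothetical per-study SMDs
variances = [0.04, 0.05, 0.03]
mu, lo, hi = dersimonian_laird(effects, variances)
print(round(mu, 3), round(lo, 3), round(hi, 3))
```

When the studies are homogeneous, tau^2 is truncated to zero and the estimate coincides with the fixed-effect pooled SMD.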

  6. LOGICAL CONDITIONS ANALYSIS METHOD FOR DIAGNOSTIC TEST RESULTS DECODING APPLIED TO COMPETENCE ELEMENTS PROFICIENCY

    Directory of Open Access Journals (Sweden)

    V. I. Freyman

    2015-11-01

    Subject of Research. The features of representing education results in competence-based educational programs are analyzed. The importance of decoding and proficiency estimation for elements and components of the discipline parts of competences is shown, and the purpose and objectives of the research are formulated. Methods. The paper uses methods of mathematical logic, Boolean algebra, and parametric analysis of the results of complex diagnostic tests that control the proficiency of discipline competence elements. Results. A method of logical conditions analysis is created. It makes it possible to formulate logical conditions for determining the proficiency of each discipline competence element controlled by a complex diagnostic test. The normalized test result is divided into non-crossing zones, and for each zone a logical condition about the proficiency of the controlled elements is formulated. Summarized characteristics of the test result zones are given. An example of forming logical conditions for a diagnostic test with preset features is provided. Practical Relevance. The proposed method of logical conditions analysis is applied in the decoding algorithm of proficiency test diagnosis for discipline competence elements. It makes it possible to automate the search for elements with insufficient proficiency, and it is also usable for estimating the education results of a discipline or a component of a competence-based educational program.

  7. Multilayers quantitative X-ray fluorescence analysis applied to easel paintings.

    Science.gov (United States)

    de Viguerie, Laurence; Sole, V Armando; Walter, Philippe

    2009-12-01

    X-ray fluorescence spectrometry (XRF) allows a rapid and simple determination of the elemental composition of a material. As a non-destructive tool, it has been extensively used for analysis in art and archaeology since the early 1970s. Whereas it is commonly used for qualitative analysis, recent efforts have been made to develop quantitative treatment even with portable systems. However, the interpretation of the results obtained with this technique can turn out to be problematic in the case of layered structures such as easel paintings. The use of differential X-ray attenuation enables modelling of the various layers: the absorption of X-rays through different layers results in a modification of the intensity ratios between the different characteristic lines. This work focuses on the possibility of using XRF with the fundamental parameters method to reconstruct the composition and thickness of the layers. The method was tested on several multilayer standards and gives a maximum error of 15% for thicknesses and of 10% for concentrations. On a rather inhomogeneous painting test sample, the XRF analysis provides an average value. The method was applied in situ to estimate the thickness of the layers of a painting by Marco d'Oggiono, a pupil of Leonardo da Vinci.
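
    The differential-attenuation idea reduces, for a single line at normal incidence, to the Beer-Lambert law I = I0*exp(-mu_rho*rho*t). A toy inversion for layer thickness (the full fundamental-parameters method also handles geometry, multiple lines, and secondary fluorescence; all values below are invented):

```python
import numpy as np

def layer_thickness_um(i_ratio, mu_rho, density):
    """Covering-layer thickness from attenuation of one fluorescence line.

    i_ratio : measured/unattenuated line intensity I/I0.
    mu_rho  : mass attenuation coefficient of the layer at the line
              energy, in cm^2/g (tabulated values, e.g. NIST XCOM).
    density : layer density, in g/cm^3.
    Beer-Lambert: I = I0 * exp(-mu_rho * density * t), solved for t
    and converted from cm to micrometres.
    """
    return -np.log(i_ratio) / (mu_rho * density) * 1e4

# Round trip: a 10 um layer with mu_rho = 120 cm^2/g and density
# 6 g/cm^3 attenuates the line to exp(-0.72), about 49% of I0.
t = layer_thickness_um(np.exp(-120.0 * 6.0 * 10e-4), 120.0, 6.0)
print(round(t, 6))  # 10.0
```

Because lines of different energies are attenuated differently by the same layer, the ratio between lines carries the thickness information even when I0 itself is not directly measurable.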

  8. Experimental and Numerical Study of the Micromix Combustion Principle Applied for Hydrogen and Hydrogen-Rich Syngas as Fuel with Increased Energy Density for Industrial Gas Turbine Applications

    OpenAIRE

    Funke, Harald H.-W.; Dickhoff, Jens; Keinz, Jan; Anis, Haj Ayed; Parente, Alessandro; Hendrick, Patrick

    2014-01-01

    The Dry Low NOx (DLN) Micromix combustion principle with increased energy density is adapted for the industrial gas turbine APU GTCP 36-300 using hydrogen and hydrogen-rich syngas with a composition of 90 %-Vol. hydrogen (H2) and 10 %-Vol. carbon-monoxide (CO). Experimental and numerical studies of several combustor geometries for hydrogen and syngas show the successful advance of the DLN Micromix combustion from pure hydrogen to hydrogen-rich syngas. The impact of the different fuel properti...

  9. Practical considerations for sensitivity analysis after multiple imputation applied to epidemiological studies with incomplete data

    Science.gov (United States)

    2012-01-01

    Background Multiple imputation as usually implemented assumes that data are Missing At Random (MAR), meaning that the underlying missing data mechanism, given the observed data, is independent of the unobserved data. To explore the sensitivity of the inferences to departures from the MAR assumption, we applied the method proposed by Carpenter et al. (2007). This approach aims to approximate inferences under a Missing Not At Random (MNAR) mechanism by reweighting estimates obtained after multiple imputation, where the weights depend on the assumed degree of departure from the MAR assumption. Methods The method is illustrated with epidemiological data from a surveillance system of hepatitis C virus (HCV) infection in France during the 2001–2007 period. The subpopulation studied included 4343 HCV-infected patients who reported drug use. Risk factors for severe liver disease were assessed. After performing complete-case and multiple imputation analyses, we applied the sensitivity analysis to 3 risk factors for severe liver disease: past excessive alcohol consumption, HIV co-infection and infection with HCV genotype 3. Results In these data, the association between severe liver disease and HIV was underestimated if, given the observed data, the chance of observing HIV status is high when it is positive. Inferences for the two other risk factors were robust to plausible local departures from the MAR assumption. Conclusions We have demonstrated the practical utility of, and advocate, a pragmatic, widely applicable approach to exploring plausible departures from the MAR assumption after multiple imputation. We have developed guidelines for applying this approach to epidemiological studies. PMID:22681630

  10. Multivariate analysis in the frequency domain applied to the Laguna Verde Nuclear Power Plant

    International Nuclear Information System (INIS)

    Castillo D, R.; Ortiz V, J.; Calleros M, G.

    2006-01-01

    Noise analysis is an auxiliary tool for detecting abnormal operating conditions of equipment, instruments or systems that affect the dynamic behavior of the reactor. The normalized power spectral density (NPSD) has usually been used to monitor the behavior of some components of the reactor, for example the jet pumps, the recirculation pumps, the flow control valves in the recirculation loops, etc. A change in behavior is determined by individual analysis of the NPSD of the signals of the components under study. An alternative analysis that can provide more information on the component under surveillance is multivariate autoregressive (MAR) analysis, which reveals the relationships that exist among diverse signals of the reactor systems in the time domain. In the frequency domain, the relative power contribution (RPC) quantifies the influence of the system variables on a variable of interest. The RPC therefore allows determining, for a peak shown in the NPSD of a variable, the influence of other variables at that frequency of interest. This facilitates, in principle, tracking the important physical parameters during an event and studying their interrelation. In this work, as an example of the application of the RPC, two events that occurred at the Laguna Verde plant are analyzed: the control rod blocking alarms due to high scale in the average power monitors, in which a power peak of 12% peak-to-peak amplitude was present, and the power oscillations event. The main result obtained from the analysis of the control rod blocking alarm event was that the power peak observed in the signals of the average power monitors was caused by the movement of the flow control valve of recirculation loop B. In the other event, the oscillation, the results show the mechanism of the oscillation of
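The NPSD used throughout this record can be illustrated with a minimal sketch. The function below is illustrative, not the plant's surveillance code: the signal is expressed as relative deviation from its mean before a direct (O(n²)) discrete Fourier transform, so that signals from different power levels are comparable.

```python
import cmath
import math

def npsd(signal, dt):
    """Normalized power spectral density (NPSD) of a fluctuation signal.
    The signal is normalized to its mean before the DFT; returns the
    frequency axis and the (one-sided, unscaled) spectral density."""
    n = len(signal)
    mean = sum(signal) / n
    x = [(s - mean) / mean for s in signal]      # normalized fluctuation
    freqs, psd = [], []
    for k in range(n // 2):
        X = sum(x[j] * cmath.exp(-2j * math.pi * k * j / n) for j in range(n))
        freqs.append(k / (n * dt))
        psd.append((abs(X) ** 2) * dt / n)
    return freqs, psd
```

A peak in the NPSD of one signal would then be interrogated with the MAR/RPC machinery to find which other system variable drives it; in practice a Welch-type averaged estimator would replace this single-block DFT.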

  11. Laws' masks descriptors applied to bone texture analysis: an innovative and discriminant tool in osteoporosis

    International Nuclear Information System (INIS)

    Rachidi, M.; Marchadier, A.; Gadois, C.; Lespessailles, E.; Chappard, C.; Benhamou, C.L.

    2008-01-01

    The objective of this study was to explore Laws' masks analysis to describe structural variations of trabecular bone due to osteoporosis on high-resolution digital radiographs and to check its dependence on the spatial resolution. Laws' masks are well established as one of the best methods for texture analysis in image processing and are used in various applications, but not in bone tissue characterisation. This method is based on masks that aim to filter the images. From each mask, five classical statistical parameters can be calculated. The study was performed on 182 healthy postmenopausal women with no fractures and 114 age-matched women with fractures [26 hip fractures (HFs), 29 vertebrae fractures (VFs), 29 wrist fractures (WFs) and 30 other fractures (OFs)]. For all subjects radiographs were obtained of the calcaneus with a new high-resolution X-ray device with direct digitisation (BMA, D3A, France). The lumbar spine, femoral neck, and total hip bone mineral density (BMD) were assessed by dual-energy X-ray absorptiometry. In terms of reproducibility, the best results were obtained with the TR E5E5 mask, especially for three parameters: "mean", "standard deviation" and "entropy", with in vivo mid-term root mean square average coefficients of variation (RMSCV%) of 1.79, 4.24 and 2.05, respectively. The "mean" and "entropy" parameters had a better reproducibility but "standard deviation" showed a better discriminant power. Thus, for univariate analysis, the difference between subjects with fractures and controls was significant (P<10⁻³) and significant for each fracture group independently (P<10⁻⁴ for HF, P=0.025 for VF and P<10⁻³ for OF). After multivariate analysis with adjustment for age and total hip BMD, the difference concerning the "standard deviation" parameter remained statistically significant between the control group and the HF and VF groups (P<10⁻⁵ and P=0.04, respectively). No significant correlation between these Laws' masks parameters and
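The mechanics of Laws' masks are simple to sketch. Laws' 1-D vectors (here the "edge" vector E5) are combined by outer product into 2-D masks such as the E5E5 mask mentioned above; the image is convolved with the mask and statistics ("mean", "standard deviation", etc.) are computed on the filtered output. This is a generic textbook sketch, not the study's exact pipeline:

```python
E5 = [-1, -2, 0, 2, 1]               # Laws' 1-D "edge" vector (sums to zero)

def outer(u, v):
    """Outer product of two 1-D Laws vectors -> a 2-D mask."""
    return [[a * b for b in v] for a in u]

E5E5 = outer(E5, E5)                 # the 5x5 E5E5 Laws mask

def convolve2d_valid(img, mask):
    """'Valid' 2-D convolution (no padding) of img with a square mask."""
    k = len(mask)
    out = []
    for i in range(len(img) - k + 1):
        row = []
        for j in range(len(img[0]) - k + 1):
            row.append(sum(mask[a][b] * img[i + a][j + b]
                           for a in range(k) for b in range(k)))
        out.append(row)
    return out

def texture_stats(filtered):
    """Two of the classical statistics computed on the filtered image."""
    vals = [v for row in filtered for v in row]
    n = len(vals)
    mean = sum(vals) / n
    sd = (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
    return mean, sd
```

Because the E5 vector sums to zero, a structureless (constant) region produces a zero response, so the statistics react only to genuine texture such as the trabecular pattern.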

  12. Applying a social network analysis (SNA) approach to understanding radiologists' performance in reading mammograms

    Science.gov (United States)

    Tavakoli Taba, Seyedamir; Hossain, Liaquat; Heard, Robert; Brennan, Patrick; Lee, Warwick; Lewis, Sarah

    2017-03-01

    Rationale and objectives: Observer performance has been widely studied through examining the characteristics of individuals. Applying a systems perspective to understanding the system's output requires a study of the interactions between observers. This research explains a mixed methods approach applying a social network analysis (SNA) together with a more traditional approach of examining personal/individual characteristics in understanding observer performance in mammography. Materials and Methods: Using social network theories and measures to understand observer performance, we designed a social networks survey instrument for collecting personal and network data about observers involved in mammography performance studies. We present the results of a study by our group in which 31 Australian breast radiologists reviewed 60 mammographic cases (comprising 20 abnormal and 40 normal cases) and then completed an online questionnaire about their social networks and personal characteristics. A jackknife free response operating characteristic (JAFROC) method was used to measure the performance of the radiologists. JAFROC was tested against various personal and network measures to verify the theoretical model. Results: The results from this study suggest a strong association between social networks and observer performance for Australian radiologists. Network factors accounted for 48% of the variance in observer performance, in comparison to 15.5% for personal characteristics for this study group. Conclusion: This study suggests a strong new direction for research into improving observer performance. Future studies in observer performance should consider social networks' influence as part of their research paradigm, with equal or greater vigour than traditional constructs of personal characteristics.

  13. Essays on environmental policy analysis: Computable general equilibrium approaches applied to Sweden

    International Nuclear Information System (INIS)

    Hill, M.

    2001-01-01

    This thesis consists of three essays within the field of applied environmental economics, with the common basic aim of analyzing effects of Swedish environmental policy. Starting out from Swedish environmental goals, the thesis assesses a range of policy-related questions. The objective is to quantify policy outcomes by constructing and applying numerical models especially designed for environmental policy analysis. Static and dynamic multi-sectoral computable general equilibrium models are developed in order to analyze the following issues: (1) the costs and benefits of a domestic carbon dioxide (CO2) tax reform, with special attention given to how these costs and benefits depend on the structure of the tax system and, furthermore, how they depend on policy-induced changes in 'secondary' pollutants; (2) the effects of allowing for emission permit trading through time when the long-term domestic environmental goal is specified in CO2 stock terms; and (3) the effects on long-term projected economic growth and welfare that are due to damages from emission flow and accumulation of 'local' pollutants (nitrogen oxides and sulfur dioxide), as well as the outcome of environmental policy when costs and benefits are considered in an integrated environmental-economic framework.

  14. Applying Different Independent Component Analysis Algorithms and Support Vector Regression for IT Chain Store Sales Forecasting

    Science.gov (United States)

    Dai, Wensheng

    2014-01-01

    Sales forecasting is one of the most important issues in managing information technology (IT) chain store sales since an IT chain store has many branches. Integrating a feature extraction method and a prediction tool, such as support vector regression (SVR), is a useful approach for constructing an effective sales forecasting scheme. Independent component analysis (ICA) is a novel feature extraction technique and has been widely applied to deal with various forecasting problems. But, up to now, only the basic ICA method (i.e., the temporal ICA model) has been applied to the sales forecasting problem. In this paper, we utilize three different ICA methods including spatial ICA (sICA), temporal ICA (tICA), and spatiotemporal ICA (stICA) to extract features from the sales data and compare their performance in sales forecasting for an IT chain store. Experimental results from real sales data show that the sales forecasting scheme integrating stICA and SVR outperforms the comparison models in terms of forecasting error. The stICA is a promising tool for extracting effective features from branch sales data and the extracted features can improve the prediction performance of SVR for sales forecasting. PMID:25165740
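The core of any ICA front end is whitening followed by a rotation that maximizes non-Gaussianity. The sketch below is a deliberately simple stand-in for the paper's tICA/sICA/stICA pipeline (which would typically use FastICA from a library such as scikit-learn): for two mixed series it whitens via the exact 2x2 eigen-decomposition and then grid-searches the rotation angle that maximizes |excess kurtosis|. All names are illustrative.

```python
import math

def whiten2(x1, x2):
    """Whiten a pair of signals: rotate onto the eigenvectors of the 2x2
    covariance matrix, then scale each axis to unit variance."""
    n = len(x1)
    m1, m2 = sum(x1) / n, sum(x2) / n
    x1 = [a - m1 for a in x1]
    x2 = [a - m2 for a in x2]
    c11 = sum(a * a for a in x1) / n
    c22 = sum(a * a for a in x2) / n
    c12 = sum(a * b for a, b in zip(x1, x2)) / n
    tr, det = c11 + c22, c11 * c22 - c12 * c12
    l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)
    l2 = tr / 2 - math.sqrt(tr * tr / 4 - det)
    t = 0.5 * math.atan2(2 * c12, c11 - c22)
    c, s = math.cos(t), math.sin(t)
    y1 = [(c * a + s * b) / math.sqrt(l1) for a, b in zip(x1, x2)]
    y2 = [(-s * a + c * b) / math.sqrt(l2) for a, b in zip(x1, x2)]
    return y1, y2

def excess_kurtosis(x):
    # assumes x is zero-mean with unit variance (true after whitening)
    n = len(x)
    return sum(v ** 4 for v in x) / n - 3.0

def corr(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) *
                    sum((b - mv) ** 2 for b in v))
    return num / den

def ica2(x1, x2, steps=360):
    """Separate two mixtures: whiten, then grid-search the rotation that
    maximizes |excess kurtosis| -- a brute-force stand-in for FastICA,
    adequate for two sources."""
    y1, y2 = whiten2(x1, x2)
    def rotate(t):
        return [math.cos(t) * a + math.sin(t) * b for a, b in zip(y1, y2)]
    best_t = max((abs(excess_kurtosis(rotate(math.pi * i / steps))),
                  math.pi * i / steps) for i in range(steps))[1]
    s1 = rotate(best_t)
    s2 = [-math.sin(best_t) * a + math.cos(best_t) * b for a, b in zip(y1, y2)]
    return s1, s2
```

In the paper's scheme the recovered independent components (here s1, s2) would be the features fed to the SVR forecaster; temporal vs. spatial ICA amounts to applying the decomposition to the data matrix or its transpose.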

  15. The Process of Laying Concrete and Analysis of Operations Applying the Lean Approach

    Directory of Open Access Journals (Sweden)

    Vidmantas Gramauskas

    2012-11-01

    Full Text Available The paper considers the Lean philosophy of 'Just in Time', value stream mapping and total quality management principles, applying them to construction. In order to follow these principles, a case study was performed, observing and recording the process of laying concrete in three houses where a lower ground floor was cast using fiber concrete. The collected data were required to fragment the process of laying concrete into smaller operations and examine each operation independently. The examination of operations was expressed in certain units of measurement – time, the number of workers, cubic meters of concrete used, space area, etc. This information helped to distinguish useful operations from useless actions bringing no value to the product. The previously mentioned methods can be applied to useless operations to reduce their duration or even eliminate them. The main focus is on splitting the process of laying concrete into smaller operations, operation analysis and the adaptation of Lean principles. The implementation of the Lean system can reduce waste and increase the value of the final product.

  16. Color changes in wood during heating: kinetic analysis by applying a time-temperature superposition method

    Science.gov (United States)

    Matsuo, Miyuki; Yokoyama, Misao; Umemura, Kenji; Gril, Joseph; Yano, Ken'ichiro; Kawai, Shuichi

    2010-04-01

    This paper deals with the kinetics of the color properties of hinoki (Chamaecyparis obtusa Endl.) wood. Specimens cut from the wood were heated at 90-180°C as an accelerated aging treatment. The specimens, completely dried and heated in the presence of oxygen, allowed us to evaluate the effects of thermal oxidation on wood color change. Color properties measured by a spectrophotometer showed similar behavior irrespective of the treatment temperature with each time scale. Kinetic analysis using the time-temperature superposition principle, which uses the whole data set, was successfully applied to the color changes. The calculated values of the apparent activation energy in terms of L*, a*, b*, and ΔE*ab were 117, 95, 114, and 113 kJ/mol, respectively, which are similar to the values reported in the literature for other properties such as the physical and mechanical properties of wood.
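Time-temperature superposition of this kind rests on the Arrhenius shift factor: heating for time t at temperature T is equivalent to heating for t·a_T at a reference temperature. A minimal sketch, with illustrative function names (the paper fits the shift factors to the whole data set rather than computing them pointwise):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def shift_factor(Ea, T, Tref):
    """Arrhenius time-temperature shift factor a_T for apparent
    activation energy Ea (J/mol); T, Tref in kelvin."""
    return math.exp(-Ea / R * (1.0 / T - 1.0 / Tref))

def activation_energy(aT, T, Tref):
    """Invert the relation: recover Ea from an observed shift factor."""
    return -R * math.log(aT) / (1.0 / T - 1.0 / Tref)
```

With Ea ≈ 113 kJ/mol (the ΔE*ab value above), a treatment at 180°C proceeds orders of magnitude faster than at the 90°C reference, which is what makes the accelerated-aging master curve possible.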

  17. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Full Text Available Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth constraints. Existing shortest path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply the model-based Q-Routing technique for routing in a dynamic network. To analyze the correctness of the Q-Routing algorithm mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for the given metrics.
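The classic Q-Routing update (in the style of Boyan and Littman; the paper's model-based variant may differ in details) can be sketched directly. When node x forwards a packet bound for destination d to neighbour y, y reports its best remaining delivery-time estimate, and x nudges its own estimate toward the observed transmission and queueing delay plus that report:

```python
def q_routing_update(Q, x, y, d, trans_delay, queue_delay, alpha=0.5):
    """One Q-routing update.  Q[node][dest][neighbour] estimates the
    delivery time via that neighbour; alpha is the learning rate."""
    best_from_y = min(Q[y][d].values()) if Q[y][d] else 0.0
    old = Q[x][d][y]
    Q[x][d][y] = old + alpha * (trans_delay + queue_delay + best_from_y - old)
    return Q[x][d][y]
```

Because each update uses only locally observable delays and the neighbour's current estimate, the scheme adapts to the topology changes and variable link costs described above without recomputing global shortest paths.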

  18. User Centered Design as a Framework for Applying Conversation Analysis in Hearing Aid Consultations

    DEFF Research Database (Denmark)

    Egbert, Maria; Matthews, Ben

    2011-01-01

    Recurrent issues in applying CA results to change in institutional practices concern the degree to which the CA researcher is involved and what aspects of the change process CA researchers are involved in. This paper presents a methodology from innovation studies called User Centered Design (Buur and Bagger, 1999) and, more recently, Participatory Innovation (Buur and Matthews, 2008), which is uniquely compatible with conversation analysis. Designers following this approach study how a 'user' of goods or services interacts with products and other interaction partners in order to derive ideas for innovation. Although this methodological convergence of disciplines is rooted in different traditions, it augurs well for successful cooperation. This paper reports on such a collaboration carried out within a federally funded research center for innovation. We present principles of the interdisciplinary...

  19. Econometrics analysis of consumer behaviour: a linear expenditure system applied to energy

    International Nuclear Information System (INIS)

    Giansante, C.; Ferrari, V.

    1996-12-01

    In the economics literature, the specification of expenditure systems is a well-known subject. The problem is to define a coherent representation of consumer behaviour through functional forms that are easy to calculate. In this work the Stone-Geary Linear Expenditure System and its multi-level decision process version are used. The Linear Expenditure System is characterized by an easy-to-calculate estimation procedure, and its multi-level specification allows substitution and complementarity relations between goods. Moreover, the utility function separability condition on which the Utility Tree Approach is based justifies the use of an estimation procedure in two or more steps. This allows a high degree of disaggregation of expenditure categories, impossible to reach with the plain Linear Expenditure System. The analysis is applied to energy sectors.
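The Stone-Geary LES itself is compact enough to state in code. Each good receives its "subsistence" expenditure p_i·γ_i plus a fixed share β_i of the supernumerary income left above total subsistence; the sketch below (illustrative names, not the paper's estimation code) computes the expenditure vector:

```python
def les_expenditures(prices, gammas, betas, income):
    """Stone-Geary Linear Expenditure System:
    expenditure_i = p_i * gamma_i + beta_i * (M - sum_j p_j * gamma_j),
    where gamma_i are subsistence quantities and the betas (marginal
    budget shares) sum to one."""
    assert abs(sum(betas) - 1.0) < 1e-9
    subsistence = sum(p * g for p, g in zip(prices, gammas))
    supernumerary = income - subsistence
    return [p * g + b * supernumerary
            for p, g, b in zip(prices, gammas, betas)]
```

Since the betas sum to one, the expenditures automatically exhaust the budget, which is the adding-up coherence property the abstract refers to; the multi-level version applies the same form first across broad groups (e.g., energy vs. non-energy) and then within each group.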

  20. Applied behavior analysis programs for autism: sibling psychosocial adjustment during and following intervention use.

    Science.gov (United States)

    Cebula, Katie R

    2012-05-01

    Psychosocial adjustment in siblings of children with autism whose families were using a home-based, applied behavior analysis (ABA) program was compared to that of siblings in families who were not using any intensive autism intervention. Data gathered from parents, siblings and teachers indicated that siblings in ABA families experienced neither significant drawbacks nor benefits in terms of their behavioral adjustment, sibling relationship quality and self-concept compared to control group siblings, either during or following intervention use. Parents and siblings perceived improvements in sibling interaction since the outset of ABA, with parents somewhat more positive in their views than were siblings. Social support was associated with better sibling outcomes in all groups. Implications for supporting families using ABA are considered.

  1. Applied behavior analysis is ideal for the development of a land mine detection technology using animals.

    Science.gov (United States)

    Jones, B M

    2011-01-01

    The detection and subsequent removal of land mines and unexploded ordnance (UXO) from many developing countries are slow, expensive, and dangerous tasks, but have the potential to improve the well-being of millions of people. Consequently, those involved with humanitarian mine and UXO clearance are actively searching for new and more efficient detection technologies. Remote explosive scent tracing (REST) using trained dogs has the potential to be one such technology. However, details regarding how best to train, test, and deploy dogs in this role have never been made publicly available. This article describes how the key characteristics of applied behavior analysis, as described by Baer, Wolf and Risley (1968, 1987), served as important objectives for the research and development of the behavioral technology component of REST while the author worked in humanitarian demining.

  2. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Daele, Timothy, Van; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration... and Pronzato (1997) and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring... identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort.

  3. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    Energy Technology Data Exchange (ETDEWEB)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia, E-mail: menezes@cdtn.b, E-mail: dovenir@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil). Lab. de Ativacao Neutronica (Brazil)

    2009-07-01

    The neutron activation technique - using several methods - has been applied in 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the responsibility of the Laboratory to provide and assure the quality of the measurements. The first step to assure the quality of results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN in order to ensure uniform conditions of analysis and to avoid contamination by elements present everywhere. Some of the procedures are based on methods described in the literature; others are based on many years of experience preparing samples from many kinds of matrices. The procedures described are related to geological materials - soil, sediment, rock, gems, clay, archaeological ceramics and ore - biological materials - hair, fish, plants, food - water, etc. Analytical results in sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  4. CONTROL AND STABILITY ANALYSIS OF THE GMC ALGORITHM APPLIED TO pH SYSTEMS

    Directory of Open Access Journals (Sweden)

    Manzi J.T.

    1998-01-01

    Full Text Available This paper deals with the control of the neutralization processes of strong acid-strong base and weak acid-strong base systems using the Generic Model Control (GMC) algorithm. The control strategy is applied to a pilot plant where hydrochloric acid-sodium hydroxide and acetic acid-sodium hydroxide systems are neutralized. The GMC algorithm includes a nonlinear model of the process in the controller structure. The paper also provides a stability analysis of the controller for some of the uncertainties involved in the system. The results indicate that the controller stabilizes the system for a large range of uncertainties, but the performance may deteriorate when the system is submitted to large disturbances.
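The GMC idea is that the control input is chosen so the closed loop follows a prescribed reference trajectory dy/dt = K1(ysp − y) + K2∫(ysp − y)dt. The sketch below applies it to an illustrative first-order process dy/dt = a·y + b·u (not the paper's pH neutralization model, whose titration nonlinearity is far stronger):

```python
def gmc_control(y, ysp, ierr, a, b, k1=2.0, k2=1.0):
    """Generic Model Control law for a process modelled as dy/dt = a*y + b*u:
    pick u so that dy/dt equals the GMC reference trajectory
    k1*(ysp - y) + k2 * integral(ysp - y)."""
    desired = k1 * (ysp - y) + k2 * ierr
    return (desired - a * y) / b

def simulate_gmc(a=-1.0, b=2.0, ysp=1.0, dt=0.01, steps=2000):
    """Euler simulation of the closed loop; returns the final output."""
    y, ierr = 0.0, 0.0
    for _ in range(steps):
        u = gmc_control(y, ysp, ierr, a, b)
        y += dt * (a * y + b * u)
        ierr += dt * (ysp - y)
    return y
```

With a perfect model the closed loop is exactly the linear reference trajectory (here critically damped, since s² + k1·s + k2 has a double root at −1); the stability analysis in the paper concerns what happens when the embedded model parameters are uncertain.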

  5. [Matrix analysis of the client's voice: QFD applied to healthcare management].

    Science.gov (United States)

    Lorenzo, Susana; Mira, José; Olarte, Mayerly; Guerrero, Johana; Guerrero, Johann; Moyano, Silvia

    2004-01-01

    To apply quality function deployment (QFD) methodology to identify clients' needs by relating complaints with perceived quality domains. A hospital within the Public Health Service of Madrid. Matrix analysis based on the QFD model was performed, using the surveys (1998-2002) conducted in the hospital with the Servqhos questionnaire and a sample of 363 complaints made in 2002. The complaints analyzed were selected using a non-probabilistic sampling method. QFD methodology was highly useful, allowing complaints to be related to the results of a perceived quality questionnaire and identification of the attributes with the greatest influence on patient satisfaction. It also allowed us to identify areas for improvement according to clients' needs.

  6. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    International Nuclear Information System (INIS)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia

    2009-01-01

    The neutron activation technique - using several methods - has been applied in 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the responsibility of the Laboratory to provide and assure the quality of the measurements. The first step to assure the quality of results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN in order to ensure uniform conditions of analysis and to avoid contamination by elements present everywhere. Some of the procedures are based on methods described in the literature; others are based on many years of experience preparing samples from many kinds of matrices. The procedures described are related to geological materials - soil, sediment, rock, gems, clay, archaeological ceramics and ore - biological materials - hair, fish, plants, food - water, etc. Analytical results in sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  7. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social networks analysis. In order to illustrate these methods in action, two cases based in materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  8. Challenges in the implementation of a quality management system applied to radiometric analysis

    International Nuclear Information System (INIS)

    Dias, Danila C.S.; Bonifacio, Rodrigo L.; Nascimento, Marcos R.L.; Silva, Nivaldo C. da; Taddei, Maria Helena T.

    2015-01-01

    The concept of quality in laboratories has been well established as an essential factor in the search for reliable results. Since its first version was published (1999), ISO/IEC 17025 has been applied in the industrial and research fields, in a wide range of laboratory analyses. However, the implementation of a Quality Management System still poses great challenges to institutions and companies. The purpose of this work is to expose the constraints related to the implementation of ISO/IEC 17025 applied to analytical assays of radionuclides, accomplished by studying the case of the Pocos de Caldas Laboratory of the Brazilian Commission for Nuclear Energy. In this lab, a project for accreditation of techniques involving the determination of radionuclides in water, soil, sediment and food samples has been conducted since 2011. The challenges presented by this project arise from the administrative view, where the governmental nature of the institution translates into uneven availability of resources, and from the organizational view, where the QMS requires inevitable changes in the organizational culture. It is important to point out that when it comes to accreditation of analyses involving radioactive elements, many aspects must be treated carefully due to their very particular nature. Among these concerns are the determination of analysis uncertainties, accessibility to international proficiency studies, international radioactive samples and CRM transportation, the study of parameters in the validation of analytical methods, and the lack of documentation and specialized personnel regarding quality in radiometric measurements. Through an effective management system, the institution is overcoming these challenges, moving toward ISO/IEC 17025 accreditation. (author)

  9. Challenges in the implementation of a quality management system applied to radiometric analysis

    Energy Technology Data Exchange (ETDEWEB)

    Dias, Danila C.S.; Bonifacio, Rodrigo L.; Nascimento, Marcos R.L.; Silva, Nivaldo C. da; Taddei, Maria Helena T., E-mail: danilacdias@gmail.com [Comissao Nacional de Energia Nuclear (LAPOC/CNEN-MG), Pocos de Caldas, MG (Brazil). Laboratorio de Pocos de Caldas

    2015-07-01

    The concept of quality in laboratories has been well established as an essential factor in the search for reliable results. Since its first version was published (1999), ISO/IEC 17025 has been applied in the industrial and research fields, in a wide range of laboratory analyses. However, the implementation of a Quality Management System still poses great challenges to institutions and companies. The purpose of this work is to expose the constraints related to the implementation of ISO/IEC 17025 applied to analytical assays of radionuclides, accomplished by studying the case of the Pocos de Caldas Laboratory of the Brazilian Commission for Nuclear Energy. In this lab, a project for accreditation of techniques involving the determination of radionuclides in water, soil, sediment and food samples has been conducted since 2011. The challenges presented by this project arise from the administrative view, where the governmental nature of the institution translates into uneven availability of resources, and from the organizational view, where the QMS requires inevitable changes in the organizational culture. It is important to point out that when it comes to accreditation of analyses involving radioactive elements, many aspects must be treated carefully due to their very particular nature. Among these concerns are the determination of analysis uncertainties, accessibility to international proficiency studies, international radioactive samples and CRM transportation, the study of parameters in the validation of analytical methods, and the lack of documentation and specialized personnel regarding quality in radiometric measurements. Through an effective management system, the institution is overcoming these challenges, moving toward ISO/IEC 17025 accreditation. (author)

  10. Early Seizure Detection by Applying Frequency-Based Algorithm Derived from the Principal Component Analysis.

    Science.gov (United States)

    Lee, Jiseon; Park, Junhee; Yang, Sejung; Kim, Hani; Choi, Yun Seo; Kim, Hyeon Jin; Lee, Hyang Woon; Lee, Byung-Uk

    2017-01-01

    The use of automatic electrical stimulation in response to early seizure detection has been introduced as a new treatment for intractable epilepsy. For the effective application of this method as a successful treatment, improving the accuracy of the early seizure detection is crucial. In this paper, we proposed the application of a frequency-based algorithm derived from principal component analysis (PCA), and demonstrated improved efficacy for early seizure detection in a pilocarpine-induced epilepsy rat model. A total of 100 ictal electroencephalograms (EEG) during spontaneous recurrent seizures from 11 epileptic rats were finally included for the analysis. PCA was applied to the covariance matrix of a conventional EEG frequency band signal. Two PCA results were compared: one from the initial segment of seizures (5 s of seizure onset) and the other from the whole segment of seizures. In order to compare the accuracy, we obtained the specific threshold satisfying the target performance from the training set, and compared the False Positive (FP), False Negative (FN), and Latency (Lat) of the PCA-based feature derived from the initial segment of seizures to the other six features in the testing set. The PCA-based feature derived from the initial segment of seizures performed significantly better than the other features with a 1.40% FP, zero FN, and 0.14 s Lat. These results demonstrated that the proposed frequency-based feature from PCA that captures the characteristics of the initial phase of seizure was effective for early detection of seizures. Experiments with rat ictal EEGs showed an improved early seizure detection rate with PCA applied to the covariance of the initial 5 s segment of visual seizure onset instead of using the whole seizure segment or other conventional frequency bands.
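The PCA step at the heart of this method reduces to two operations: form the covariance matrix of the frequency-band features, then extract its leading eigenvector. A minimal stand-alone sketch (illustrative function names; the study's feature extraction and thresholding are more elaborate):

```python
def covariance(X):
    """Covariance matrix of X (rows = time windows, cols = band features)."""
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in X) / n
             for j in range(d)] for i in range(d)]

def principal_component(C, iters=200):
    """Leading eigenvector of C by power iteration -- the direction of
    maximum variance across the frequency-band features."""
    d = len(C)
    v = [1.0] * d
    for _ in range(iters):
        w = [sum(C[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

Projecting incoming EEG band features onto this component yields the scalar detection feature that is then compared against the trained threshold.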

  11. A comparison of quantitative reconstruction techniques for PIXE-tomography analysis applied to biological samples

    Energy Technology Data Exchange (ETDEWEB)

    Beasley, D.G., E-mail: dgbeasley@ctn.ist.utl.pt [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Alves, L.C. [IST/C2TN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Barberet, Ph.; Bourret, S.; Devès, G.; Gordillo, N.; Michelet, C. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Le Trequesser, Q. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Institut de Chimie de la Matière Condensée de Bordeaux (ICMCB, UPR9048) CNRS, Université de Bordeaux, 87 avenue du Dr. A. Schweitzer, Pessac F-33608 (France); Marques, A.C. [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal); Seznec, H. [Univ. Bordeaux, CENBG, UMR 5797, F-33170 Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, F-33170 Gradignan (France); Silva, R.C. da [IST/IPFN, Universidade de Lisboa, Campus Tecnológico e Nuclear, E.N.10, 2686-953 Sacavém (Portugal)

    2014-07-15

    The tomographic reconstruction of biological specimens requires robust algorithms, able to deal with low density contrast and low element concentrations. At the IST/ITN microprobe facility new GPU-accelerated reconstruction software, JPIXET, has been developed, which can significantly increase the speed of quantitative reconstruction of Proton Induced X-ray Emission Tomography (PIXE-T) data. It has a user-friendly graphical user interface for pre-processing, data analysis and reconstruction of PIXE-T and Scanning Transmission Ion Microscopy Tomography (STIM-T). The reconstruction of PIXE-T data is performed using either an algorithm based on a GPU-accelerated version of the Maximum Likelihood Expectation Maximisation (MLEM) method or a GPU-accelerated version of the Discrete Image Space Reconstruction Algorithm (DISRA) (Sakellariou (2001) [2]). The original DISRA, its accelerated version, and the MLEM algorithm, were compared for the reconstruction of a biological sample of Caenorhabditis elegans – a small worm. This sample was analysed at the microbeam line of the AIFIRA facility of CENBG, Bordeaux. A qualitative PIXE-T reconstruction was obtained using the CENBG software package TomoRebuild (Habchi et al. (2013) [6]). The effects of pre-processing and experimental conditions on the elemental concentrations are discussed.
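The MLEM method named in this record has a compact textbook form (this sketch is not JPIXET's GPU implementation): each voxel is scaled by the back-projected ratio of measured to currently predicted projections, normalized by the voxel's sensitivity. Non-negativity is preserved automatically, which suits the low-concentration biological data described above.

```python
def mlem_step(x, A, y):
    """One MLEM update for the non-negative linear model y ≈ A x
    (rows of A are projection weights, y the measured projections)."""
    proj = [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(y))]
    new = list(x)
    for j in range(len(x)):
        sens = sum(A[i][j] for i in range(len(y)))    # voxel sensitivity
        back = sum(A[i][j] * y[i] / proj[i]
                   for i in range(len(y)) if proj[i] > 0)
        if sens > 0:
            new[j] = x[j] * back / sens
    return new

def mlem(A, y, iters=500):
    """Iterate MLEM from a flat initial image."""
    x = [1.0] * len(A[0])
    for _ in range(iters):
        x = mlem_step(x, A, y)
    return x
```

For consistent noiseless data the iteration converges to the exact solution; with noisy PIXE-T data one stops early or regularizes, which is where implementations such as the accelerated DISRA differ.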

  12. Bayesian statistics applied to neutron activation data for reactor flux spectrum analysis

    International Nuclear Information System (INIS)

    Chiesa, Davide; Previtali, Ezio; Sisti, Monica

    2014-01-01

    Highlights: • Bayesian statistics to analyze the neutron flux spectrum from activation data. • Rigorous statistical approach for accurate evaluation of the neutron flux groups. • Cross section and activation data uncertainties included for the problem solution. • Flexible methodology applied to analyze different nuclear reactor flux spectra. • The results are in good agreement with the MCNP simulations of neutron fluxes. - Abstract: In this paper, we present a statistical method, based on Bayesian statistics, to analyze the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation experiment performed at the TRIGA Mark II reactor of Pavia University (Italy) in four irradiation positions characterized by different neutron spectra. In order to evaluate the neutron flux spectrum, subdivided into energy groups, a system of linear equations containing the group effective cross sections and the activation rate data has to be solved. However, since the system's coefficients are experimental data affected by uncertainties, a rigorous statistical approach is essential for an accurate evaluation of the neutron flux groups. For this purpose, we applied Bayesian statistical analysis, which makes it possible to include the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, was used to define the statistical model of the problem and solve it. The first analysis involved the determination of the thermal, resonance-intermediate and fast flux components, and the dependence of the results on the choice of prior distribution was investigated to confirm the reliability of the Bayesian analysis. After that, the main resonances of the activation cross sections were analyzed to implement multi-group models with finer energy subdivisions that would allow determination of the
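    The unfolding step described above — inferring group fluxes from activation rates R = σφ under measurement uncertainty and a positivity prior — can be sketched with a toy random-walk Metropolis sampler. The three-group cross sections, fluxes and 2% uncertainties below are entirely hypothetical; the paper itself uses a dedicated hierarchical-model MCMC package:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical 3-group effective cross sections (rows: isotopes, cols: groups)
    sigma = np.array([[5.0, 1.0, 0.1],
                      [0.5, 3.0, 0.2],
                      [0.1, 0.4, 2.0]])
    phi_true = np.array([2.0, 1.0, 0.5])   # "true" group fluxes (arbitrary units)
    rates = sigma @ phi_true               # simulated activation rate data
    rate_err = 0.02 * rates                # assumed 2 % measurement uncertainty

    def log_post(phi):
        """Gaussian likelihood on the activation rates, flat positive prior."""
        if np.any(phi <= 0):
            return -np.inf
        resid = (rates - sigma @ phi) / rate_err
        return -0.5 * np.sum(resid ** 2)

    # Random-walk Metropolis sampling of the posterior flux groups
    phi, lp, samples = np.ones(3), log_post(np.ones(3)), []
    for i in range(20000):
        prop = phi + rng.normal(scale=0.02, size=3)
        lp_new = log_post(prop)
        if np.log(rng.random()) < lp_new - lp:   # accept/reject step
            phi, lp = prop, lp_new
        if i > 5000:                             # discard burn-in
            samples.append(phi)
    post_mean = np.mean(samples, axis=0)         # posterior flux estimate
    ```

    The posterior spread of `samples` directly provides the flux-group uncertainties that a point-estimate matrix inversion would not.
    
    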

  13. 3D MRI for Quantitative Analysis of Quadrant Percent Breast Density: Correlation with Quadrant Location of Breast Cancer.

    Science.gov (United States)

    Chen, Jeon-Hor; Liao, Fuyi; Zhang, Yang; Li, Yifan; Chang, Chia-Ju; Chou, Chen-Pin; Yang, Tsung-Lung; Su, Min-Ying

    2017-07-01

    Breast cancer occurs more frequently in the upper outer (UO) quadrant, but whether this higher cancer incidence is related to the greater amount of dense tissue is not known. Magnetic resonance imaging acquires three-dimensional volumetric images and is the most suitable among all breast imaging modalities for regional quantification of density. This study applied a magnetic resonance imaging-based method to measure quadrant percent density (QPD), and evaluated its association with the quadrant location of the developed breast cancer. A total of 126 cases with pathologically confirmed breast cancer were reviewed. Only women who had unilateral breast cancer located in a clear quadrant were selected for analysis. A total of 84 women, including 47 Asian women and 37 western women, were included. An established computer-aided method was used to segment the diseased breast and the contralateral normal breast, and to separate the dense and fatty tissues. Then, a breast was further separated into four quadrants using the nipple and the centroid as anatomic landmarks. The tumor was segmented using a computer-aided method to determine its quadrant location. The distribution of cancer quadrant location, the quadrant with the highest QPD, and the proportion of cancers occurring in the highest QPD were analyzed. The highest incidence of cancer occurred in the UO quadrant (36 out of 84, 42.9%). The highest QPD was also noted most frequently in the UO quadrant (31 out of 84, 36.9%). When correlating the highest QPD with the quadrant location of breast cancer, only 17 women out of 84 (20.2%) had breast cancer occurring in the quadrant with the highest QPD. The results showed that the development of breast cancer in a specific quadrant could not be explained by the density in that quadrant, and further studies are needed to find the biological reasons accounting for the higher breast cancer incidence in the UO quadrant. 
Copyright © 2017 The Association of University Radiologists
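    The landmark-based quadrant partition described in the abstract can be sketched as follows. This is a hypothetical 2-D illustration (the study works on 3-D MRI volumes, and the exact partition rule is an assumption): the nipple-centroid line defines one axis, its perpendicular the other, and each breast pixel is assigned to a quadrant by the signs of its projections onto the two axes.

    ```python
    import numpy as np

    def quadrant_percent_density(breast_mask, dense_mask, nipple, centroid):
        """Percent dense tissue per quadrant (hypothetical 2-D sketch).

        breast_mask : boolean mask of the segmented breast
        dense_mask  : mask (0/1) of the segmented dense tissue
        nipple, centroid : (row, col) anatomic landmarks
        """
        ys, xs = np.nonzero(breast_mask)
        axis1 = np.array(nipple, float) - np.array(centroid, float)
        axis1 /= np.linalg.norm(axis1)               # nipple-centroid axis
        axis2 = np.array([-axis1[1], axis1[0]])      # perpendicular axis
        rel = np.stack([ys, xs], axis=1) - np.array(centroid, float)
        # Quadrant index 0..3 from the signs of the two projections
        q = (rel @ axis1 >= 0).astype(int) * 2 + (rel @ axis2 >= 0).astype(int)
        qpd = np.zeros(4)
        for k in range(4):
            sel = q == k
            if sel.any():
                qpd[k] = 100.0 * dense_mask[ys[sel], xs[sel]].mean()
        return qpd
    ```
    
    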

  14. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Directory of Open Access Journals (Sweden)

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of the Bouguer gravity anomaly with topography. The other method, the Parasnis approach, is based on a minimum correlation of the Bouguer gravity anomaly with the Bouguer correction. The precision of Bouguer density estimates was investigated with both methods on simple 2D synthetic models, under the assumption that the free-air anomaly consists of a topographic effect, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano area, Guadeloupe (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except the edifice area, where the average topographic density estimate is 2.21 g/cm3; Bouguer density estimates from the previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density at La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
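    The Nettleton criterion — choose the trial density whose Bouguer anomaly is least correlated with topography — can be sketched as below. Only the infinite-slab Bouguer correction is applied (no terrain correction), and the station data and density grid are hypothetical:

    ```python
    import numpy as np

    def nettleton_density(fa_anom, elev, densities):
        """Nettleton estimate of topographic (Bouguer) density.

        fa_anom   : free-air anomaly at the stations (mGal)
        elev      : station elevations (m)
        densities : trial densities (g/cm^3)
        """
        G = 6.674e-11                                   # gravitational constant (SI)
        best = None
        for rho in densities:
            # Infinite-slab Bouguer correction, converted to mGal
            bc = 2 * np.pi * G * (rho * 1000.0) * elev * 1e5
            ba = fa_anom - bc                           # trial Bouguer anomaly
            r = abs(np.corrcoef(ba, elev)[0, 1])        # correlation with topography
            if best is None or r < best[1]:
                best = (rho, r)                         # keep least-correlated density
        return best[0]
    ```

    The Parasnis variant is algebraically similar: it regresses the free-air anomaly against the Bouguer correction term and takes the slope as the density estimate.
    
    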

  15. Retrieval of Stratospheric O3 and NO2 Density Profiles From a DOAS Analysis of UV-Visible Limb Scatter Measured by OSIRIS

    Science.gov (United States)

    Haley, C. S.; Sioris, C. E.; von Savigny, C.; McDade, I. C.; Griffioen, E.; McLinden, C. A.; Llewellyn, E. J.; ODIN Team

    2001-12-01

    Space-based atmospheric remote sounding measurements of minor species in the stratosphere using UV-visible radiances have traditionally been of two types: occultation measurements (POAM, SAGE) and nadir measurements (TOMS, GOME). These types of measurements are limited by either restricted spatial coverage (occultation) or poor vertical resolution (nadir). A new type of measurement, with the potential of providing both good spatial coverage and good vertical resolution, is limb scattered sunlight. This type of measurement has been used to recover O3 in the mesosphere and NO2 in the stratosphere from SME measurements and recently has been used to retrieve stratospheric ozone density profiles from SOLSE/LORE measurements. A number of new instruments are employing this method, including OSIRIS, onboard the recently launched Odin satellite, and SCIAMACHY, which will be launched on ENVISAT in late 2001. Though the measurements themselves are relatively straightforward to make, the process of retrieving minor species densities and other information from the radiances is complicated by the viewing geometry. A method that has proved successful for analysing UV-visible measurements made with ground-based instruments and with satellite nadir-viewing instruments is Differential Optical Absorption Spectroscopy (DOAS). With the DOAS method, absorption features are used to recover slant column densities, which are then converted to vertical column densities through calculated air mass factors. Here we present the application of the DOAS method to limb scattered sunlight. In this application, apparent column densities are calculated through an analysis of measured limb radiances in a similar fashion to the calculation of slant column densities in other applications. The complication here is that there is no straightforward relationship between these apparent column densities and the vertical column density. We describe how these apparent column densities
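    The core DOAS fitting step can be sketched as a linear least-squares problem: the measured optical depth ln(I0/I) is modelled as absorption cross sections scaled by column densities plus a low-order polynomial that absorbs broadband scattering. The spectra below are synthetic and in arbitrary units:

    ```python
    import numpy as np

    wl = np.linspace(400.0, 450.0, 200)        # wavelength grid (nm)
    w = (wl - 425.0) / 25.0                    # scaled wavelength in [-1, 1]
    sigma = 1.0 + np.sin(wl / 2.0)             # fake differential cross section (arb. units)
    scd_true = 3.0                             # "true" slant column density (arb. units)
    # Optical depth ln(I0/I): absorber signature plus a broadband (linear) part
    tau = sigma * scd_true + 0.5 + 0.2 * w

    # Design matrix: cross section plus quadratic polynomial (broadband closure)
    A = np.column_stack([sigma, np.ones_like(w), w, w ** 2])
    coef, *_ = np.linalg.lstsq(A, tau, rcond=None)
    scd_fit = coef[0]                          # fitted slant column density
    ```

    In a real retrieval the fitted slant (or, in limb geometry, apparent) column densities are then converted to vertical columns via calculated air mass factors, VCD = SCD / AMF; the abstract's point is that for limb scatter no single simple AMF relationship exists.
    
    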

  16. Association between low bone mineral density and fibromyalgia: a meta-analysis.

    Science.gov (United States)

    Lee, Young Ho; Song, Gwan Gyu

    2017-11-01

    We aimed to evaluate the relationship between bone mineral density (BMD) and fibromyalgia (FM). Meta-analyses were performed comparing BMD in FM patients and healthy controls, and in FM patient subgroups based on ethnicity, BMD site, age, sex, and measurement method. Twelve studies including 695 FM patients and 784 controls were selected. Meta-analysis by ethnicity revealed a significantly lower BMD in the FM group in Caucasian populations [standardized mean difference (SMD) = -0.144, 95% CI = -0.271-(-0.017), p = 0.026], but not in Turkish populations. Subgroup analysis by BMD site showed that BMD was significantly lower in the FM group than in the control group in the lumbar spine [SMD = -0.588 (medium), 95% CI = -1.142-(-0.033), p = 0.038], but not in the femur neck and hip. Stratification by measurement method revealed a significantly lower BMD in the FM group by dual X-ray absorptiometry and dual-photon absorptiometry [SMD = -0.531 (medium), 95% CI = -1.040-(-0.023), p = 0.041 and SMD = -0.315 (small), 95% CI = -0.544-(-0.085), p = 0.007, respectively], but not by quantitative ultrasound. Subgroup analysis by sex, menopause status, and age revealed a significantly lower BMD in the female FM group [SMD = -0.588 (medium), 95% CI = -1.142-(-0.033), p = 0.038], but not in the pre-menopausal group or the subgroup with mean age greater than 50 years. Our meta-analysis demonstrated that BMD was significantly lower in FM patients in Caucasian and female populations.
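    The effect measure pooled throughout this record, the standardized mean difference, can be computed as in this sketch (Hedges' small-sample correction plus inverse-variance fixed-effect pooling; a random-effects model would add a between-study variance term to each weight):

    ```python
    import numpy as np

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        """Standardized mean difference (Hedges' g) and its sampling variance."""
        # Pooled standard deviation across the two groups
        sp = np.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
        d = (m1 - m2) / sp                       # Cohen's d
        j = 1 - 3.0 / (4 * (n1 + n2) - 9)        # small-sample correction factor
        g = j * d
        var = (n1 + n2) / (n1 * n2) + g ** 2 / (2 * (n1 + n2))
        return g, var

    def pool_fixed(effects, variances):
        """Inverse-variance fixed-effect pooled SMD with 95 % CI."""
        w = 1.0 / np.asarray(variances)
        g = np.sum(w * np.asarray(effects)) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        return g, (g - 1.96 * se, g + 1.96 * se)
    ```

    By the usual convention (also used in the abstract), |g| near 0.2 is a small effect and near 0.5 a medium one.
    
    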

  17. Statistical analysis of modal parameters of a suspension bridge based on Bayesian spectral density approach and SHM data

    Science.gov (United States)

    Li, Zhijun; Feng, Maria Q.; Luo, Longxi; Feng, Dongming; Xu, Xiuli

    2018-01-01

    Uncertainty in modal parameter estimation appears to a significant extent in structural health monitoring (SHM) practice in civil engineering, owing to environmental influences and modeling errors, and sound methodologies are needed to handle it. Bayesian inference can provide a promising and feasible identification solution for the purposes of SHM. However, there has been relatively little research on the application of the Bayesian spectral method to modal identification using SHM data sets. To extract modal parameters from the large data sets collected by an SHM system, the Bayesian spectral density algorithm was applied to address the uncertainty of mode extraction from the output-only response of a long-span suspension bridge. The posterior most probable values of the modal parameters and their uncertainties were estimated through Bayesian inference. A long-term variation and statistical analysis was performed using sensor data collected from the SHM system of the suspension bridge over a one-year period. The t location-scale distribution was shown to be a better candidate function for the frequencies of the lower modes, whereas the Burr distribution provided the best fit to the higher modes, which are sensitive to temperature. In addition, wind-induced variation of the modal parameters was investigated: both the damping ratios and the modal forces increased during periods of typhoon excitation, and the modal damping ratios exhibited significant correlation with the spectral intensities of the corresponding modal forces.
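    As a purely illustrative contrast, the quantity the Bayesian spectral density approach models probabilistically — the power spectral density of the output-only response — can be estimated with a simple averaged periodogram and its peak picked for a modal frequency. The simulated signal and modal frequency below are made up:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    fs, f0 = 100.0, 2.5                           # sampling rate (Hz), modal frequency (Hz)
    t = np.arange(0.0, 600.0, 1.0 / fs)
    # Simulated output-only response: one modal component plus broadband noise
    x = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.normal(size=t.size)

    # Averaged periodogram over non-overlapping Hann-windowed segments
    nseg, nfft = 30, 2000
    segs = x[: nseg * nfft].reshape(nseg, nfft)
    psd = np.mean(np.abs(np.fft.rfft(segs * np.hanning(nfft), axis=1)) ** 2, axis=0)
    freqs = np.fft.rfftfreq(nfft, 1.0 / fs)
    f_hat = freqs[np.argmax(psd[1:]) + 1]         # peak frequency, skipping the DC bin
    ```

    A Bayesian spectral density method goes further than this point estimate: it fits a modal model to such spectra and returns posterior uncertainties for frequency and damping, which is what enables the distribution fitting (t location-scale, Burr) discussed above.
    
    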

  18. Applied Electromagnetics

    International Nuclear Information System (INIS)

    Yamashita, H.; Marinova, I.; Cingoski, V.

    2002-01-01

    These proceedings contain papers relating to the 3rd Japanese-Bulgarian-Macedonian Joint Seminar on Applied Electromagnetics. Included are the following groups: Numerical Methods I; Electrical and Mechanical System Analysis and Simulations; Inverse Problems and Optimizations; Software Methodology; Numerical Methods II; Applied Electromagnetics

  19. Applying Chemical Imaging Analysis to Improve Our Understanding of Cold Cloud Formation

    Science.gov (United States)

    Laskin, A.; Knopf, D. A.; Wang, B.; Alpert, P. A.; Roedel, T.; Gilles, M. K.; Moffet, R.; Tivanski, A.

    2012-12-01

    The impact that atmospheric ice nucleation has on the global radiation budget is one of the least understood problems in atmospheric sciences. This is in part due to incomplete understanding of the various ice nucleation pathways that lead to ice crystal formation from pre-existing aerosol particles. Studies investigating the ice nucleation propensity of laboratory-generated particles indicate that individual particle types are highly selective in their ice nucleating efficiency. This description of heterogeneous ice nucleation presents a challenge when applied to the atmosphere, which contains a complex mixture of particles. Here, we employ a combination of micro-spectroscopic and optical single-particle analytical methods to relate particle physical and chemical properties to observed water uptake and ice nucleation. Field-collected particles from urban environments impacted by anthropogenic and marine emissions and aging processes are investigated. Single-particle characterization is provided by computer-controlled scanning electron microscopy with energy-dispersive analysis of X-rays (CCSEM/EDX) and scanning transmission X-ray microscopy with near-edge X-ray absorption fine structure spectroscopy (STXM/NEXAFS). A particle-on-substrate approach coupled to a vapor-controlled cooling stage and a microscope system is applied to determine the onsets of water uptake and ice nucleation, including immersion freezing and deposition ice nucleation, as a function of temperature (T) as low as 200 K and relative humidity (RH) up to water saturation. We observe for urban aerosol particles that for T > 230 K the oxidation level affects initial water uptake and that subsequent immersion freezing depends on particle mixing state, e.g. by the presence of insoluble particles. For T cloud formation. Initial results applying single particle IN analysis using CCSEM/EDX and STXM/NEXAFS reveal that a significant amount of IN are coated by organics and, thus, are similar to the

  20. Effects of Growth Hormone Replacement Therapy on Bone Mineral Density in Growth Hormone Deficient Adults: A Meta-Analysis

    OpenAIRE

    Xue, Peng; Wang, Yan; Yang, Jie; Li, Yukun

    2013-01-01

    Objectives. Growth hormone deficient patients exhibit reduced bone mineral density compared with healthy controls, but previous research has been inconclusive about the effect of growth hormone replacement therapy on bone in growth hormone deficient adults. The aim of this study was to determine whether growth hormone replacement therapy could elevate bone mineral density in growth hormone deficient adults. Methods. In this meta-analysis, searches of Medline, Embase, and The Cochr...