WorldWideScience

Sample records for scatterplots

  1. Optical mammography combined with fluorescence imaging: lesion detection using scatterplots

    NARCIS (Netherlands)

    Leproux, Anaïs; van der Voort, Marjolein; van der Mark, Martin B.; Harbers, Rik; van de Ven, Stephanie M. W. Y.; van Leeuwen, Ton G.

    2011-01-01

    Using scatterplots of 2 or 3 parameters, diffuse optical tomography and fluorescence imaging are combined to improve detectability of breast lesions. Small or low contrast phantom-lesions that were missed in the optical and fluorescence images were detected in the scatterplots. In patient

  2. The nature of correlation perception in scatterplots.

    Science.gov (United States)

    Rensink, Ronald A

    2017-06-01

    For scatterplots with gaussian distributions of dots, the perception of Pearson correlation r can be described by two simple laws: a linear one for discrimination, and a logarithmic one for perceived magnitude (Rensink & Baldridge, 2010). The underlying perceptual mechanisms, however, remain poorly understood. To cast light on these, four different distributions of datapoints were examined. The first had 100 points with equal variance in both dimensions. Consistent with earlier results, just noticeable difference (JND) was a linear function of the distance away from r = 1, and the magnitude of perceived correlation a logarithmic function of this quantity. In addition, these laws were linked, with the intercept of the JND line being the inverse of the bias in perceived magnitude. Three other conditions were also examined: a dot cloud with 25 points, a horizontal compression of the cloud, and a cloud with a uniform distribution of dots. Performance was found to be similar in all conditions. The generality and form of these laws suggest that what underlies correlation perception is not a geometric structure such as the shape of the dot cloud, but the shape of the probability distribution of the dots, likely inferred via a form of ensemble coding. It is suggested that this reflects the ability of observers to perceive the information entropy in an image, with this quantity used as a proxy for Pearson correlation.
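    The two laws lend themselves to a compact numerical sketch. The following Python fragment is a hedged illustration only: the constants k, r0, and c are hypothetical placeholders, not the values fitted by Rensink.

```python
import math

def jnd(r, k=0.25, r0=1.0):
    # Linear law for discrimination: the just-noticeable difference grows
    # linearly with the distance from r = 1 (k and r0 are hypothetical
    # constants, stand-ins for the empirically fitted values).
    return k * (r0 - r)

def perceived_magnitude(r, c=6.0):
    # Logarithmic law for perceived magnitude, normalized so that r = 0
    # maps to 0 and r = 1 maps to 1 (c is a hypothetical constant).
    return 1.0 - math.log(1.0 + c * (1.0 - r)) / math.log(1.0 + c)

# Discrimination is easier near r = 1 (smaller JND) ...
assert jnd(0.9) < jnd(0.3)
# ... and perceived correlation underestimates r in the midrange.
assert perceived_magnitude(0.5) < 0.5
```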

  3. Extending a scatterplot for displaying group structure in multivariate ...

    African Journals Online (AJOL)

    The power of canonical variate analysis (CVA) biplots, when regarded as extensions of ordinary scatterplots to describe variation and group structure in multivariate observations, is demonstrated by presenting a case study from the South African wood pulp industry. It is shown how multidimensional standards specified by ...

  4. scatterplot3d - An R Package for Visualizing Multivariate Data

    Directory of Open Access Journals (Sweden)

    Uwe Ligges

    2003-05-01

    Full Text Available Scatterplot3d is an R package for the visualization of multivariate data in a three-dimensional space. R is a "language for data analysis and graphics". In this paper we discuss the features of the package. It is designed exclusively using already existing functions of R and its graphics system, and thus demonstrates the extensibility of the R graphics system. Additionally, some examples on generated and real-world data are provided.
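    scatterplot3d itself draws through R's base graphics, but the core idea — projecting 3-D coordinates onto a 2-D plotting plane along an oblique axis — can be sketched in a few lines. The function name and the angle_deg/scale_y parameters below are invented for this toy Python illustration, not part of the package's API.

```python
import math

def project_3d_to_2d(points, angle_deg=40.0, scale_y=0.5):
    # Toy oblique projection: the y axis is drawn at angle_deg to the
    # horizontal and foreshortened by scale_y, similar to the default
    # "box" view of many 3-D scatter tools (parameters are hypothetical).
    a = math.radians(angle_deg)
    out = []
    for x, y, z in points:
        px = x + scale_y * y * math.cos(a)
        py = z + scale_y * y * math.sin(a)
        out.append((px, py))
    return out

pts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
proj = project_3d_to_2d(pts)
# The origin stays at the origin; the z axis maps straight up.
assert proj[0] == (0.0, 0.0)
assert proj[3] == (0.0, 1.0)
```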

  5. Comparative eye-tracking evaluation of scatterplots and parallel coordinates

    Directory of Open Access Journals (Sweden)

    Rudolf Netzel

    2017-06-01

    Full Text Available We investigate task performance and reading characteristics for scatterplots (Cartesian coordinates) and parallel coordinates. In a controlled eye-tracking study, we asked 24 participants to assess the relative distance of points in multidimensional space, depending on the diagram type (parallel coordinates or a horizontal collection of scatterplots), the number of data dimensions (2, 4, 6, or 8), and the relative distance between points (15%, 20%, or 25%). For a given reference point and two target points, we instructed participants to choose the target point that was closer to the reference point in multidimensional space. We present a visual scanning model that describes different strategies to solve this retrieval task for both diagram types, and propose corresponding hypotheses that we test using task completion time, accuracy, and gaze positions as dependent variables. Our results show that scatterplots outperform parallel coordinates significantly in 2 dimensions; however, the task was solved more quickly and more accurately with parallel coordinates in 8 dimensions. The eye-tracking data further show significant differences between Cartesian and parallel coordinates, as well as between different numbers of dimensions. For parallel coordinates, there is a clear trend toward shorter fixations and longer saccades with increasing number of dimensions. Using an area-of-interest (AOI) based approach, we identify different reading strategies for each diagram type: for parallel coordinates, the participants' gaze frequently jumped back and forth between pairs of axes, while axes were rarely focused on when viewing Cartesian coordinates. We further found that participants' attention is biased: toward the center of the whole plot for parallel coordinates, and skewed to the center/left side for Cartesian coordinates. We anticipate that these results may support the design of more effective visualizations for multidimensional data.
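    The retrieval task itself is simple to state in code. A minimal Python sketch (point values and names are illustrative, not from the study's materials):

```python
import math

def closer_target(reference, target_a, target_b):
    # The study's retrieval task: which of two targets is closer to the
    # reference point in multidimensional (Euclidean) space?
    def dist(p, q):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))
    return "A" if dist(reference, target_a) < dist(reference, target_b) else "B"

# An 8-dimensional example: no single pair of axes reveals the answer on
# its own, which is why the reading strategy across views matters.
ref = (0.5,) * 8
a = (0.6, 0.4, 0.5, 0.5, 0.6, 0.4, 0.5, 0.5)
b = (0.9, 0.1, 0.8, 0.2, 0.9, 0.1, 0.8, 0.2)
assert closer_target(ref, a, b) == "A"
```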

  6. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
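    The first two pattern tests are easy to sketch. The following Python fragment is not the analysis code used in the paper; it merely illustrates why a monotonic but nonlinear input-output relationship is flagged more strongly by the rank correlation (2) than by the ordinary correlation coefficient (1).

```python
import math
import random

def pearson(xs, ys):
    # (1) Linear relationships: ordinary correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)

def spearman(xs, ys):
    # (2) Monotonic relationships: Pearson correlation of the ranks
    # (ties ignored for simplicity; the inputs here are continuous).
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        for rank, i in enumerate(order):
            r[i] = float(rank)
        return r
    return pearson(ranks(xs), ranks(ys))

random.seed(1)
x = [random.uniform(0, 1) for _ in range(200)]
y_monotone = [xi ** 3 + 0.01 * random.gauss(0, 1) for xi in x]

# Monotonic but nonlinear: the rank correlation is closer to 1 than
# the ordinary correlation coefficient.
assert spearman(x, y_monotone) > pearson(x, y_monotone)
```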

  7. ScatterJn: An ImageJ Plugin for Scatterplot-Matrix Analysis and Classification of Spatially Resolved Analytical Microscopy Data

    Directory of Open Access Journals (Sweden)

    Fabian Zeitvogel

    2016-02-01

    Full Text Available We present ScatterJn, an ImageJ (and Fiji) plugin for scatterplot-based exploration and analysis of analytical microscopy data. In contrast to commonly used scatterplot tools, it handles more than two input images (or image stacks, respectively) by creating a matrix of pairwise scatterplots. The tool offers the possibility to manually classify pixels by selecting regions of datapoints in the scatterplots as well as in the spatial domain. We demonstrate its functioning using a set of elemental maps acquired by SEM-EDX mapping of a soil sample. The plugin is available at https://savannah.nongnu.org/projects/scatterjn.

  8. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations, 2. Robustness of Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C.; Kleijnen, J.P.C.

    1999-03-24

    Procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses are described and illustrated. These procedures attempt to detect increasingly complex patterns in scatterplots and involve the identification of (i) linear relationships with correlation coefficients, (ii) monotonic relationships with rank correlation coefficients, (iii) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (iv) trends in variability as defined by variances and interquartile ranges, and (v) deviations from randomness as defined by the chi-square statistic. A sequence of example analyses with a large model for two-phase fluid flow illustrates how the individual procedures can differ in the variables that they identify as having effects on particular model outcomes. The example analyses indicate that the use of a sequence of procedures is a good analysis strategy and provides some assurance that an important effect is not overlooked.

  9. A FORTRAN 77 Program and User's Guide for the Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Helton, Jon C.; Shortencarier, Michael J.

    1999-08-01

    A description and user's guide are given for a computer program, PATTRN, developed at Sandia National Laboratories for use in sensitivity analyses of complex models. This program is intended for use in the analysis of input-output relationships in Monte Carlo analyses when the input has been selected using random or Latin hypercube sampling. Procedures incorporated into the program are based upon attempts to detect increasingly complex patterns in scatterplots and involve the detection of linear relationships, monotonic relationships, trends in measures of central tendency, trends in measures of variability, and deviations from randomness. The program was designed to be easy to use and portable.

  10. Extending a scatterplot for displaying group structure in multivariate ...

    African Journals Online (AJOL)

    : A case study. S Gardner∗. NJ Le Roux†. T Rypstra‡. JPJ Swart‡. Received: 3 March 2005; Revised: 31 May 2005; Accepted: 6 June 2005. Abstract. The power of canonical variate analysis (CVA) biplots, when regarded as extensions of or-.

  11. Shape Perception in 3-D Scatterplots Using Constant Visual Angle Glyphs

    DEFF Research Database (Denmark)

    Stenholt, Rasmus; Madsen, Claus B.

    2012-01-01

    the same amount of screen space. For perceptual reasons, we call this approach constant visual angle glyphs, or CVA glyphs. The use of CVA glyphs implies some desirable perceptual consequences, which have not been previously described or discussed in existing literature: CVA glyphs not only have...

  12. On the Benefits of Using Constant Visual Angle Glyphs in Interactive Exploration of 3D Scatterplots

    DEFF Research Database (Denmark)

    Stenholt, Rasmus

    2014-01-01

    Visual exploration of clouds of data points is an important application of virtual environments. The common goal of this activity is to use the strengths of human perception to identify interesting structures in data, which are often not detected using traditional, computational analysis method...

  13. Radiometric Normalization of Temporal Images Combining Automatic Detection of Pseudo-Invariant Features from the Distance and Similarity Spectral Measures, Density Scatterplot Analysis, and Robust Regression

    Directory of Open Access Journals (Sweden)

    Ana Paula Ferreira de Carvalho

    2013-05-01

    Full Text Available Radiometric precision is difficult to maintain in orbital images due to several factors (atmospheric conditions, Earth-sun distance, detector calibration, illumination, and viewing angles). These unwanted effects must be removed for radiometric consistency among temporal images, leaving only land-leaving radiances, for optimum change detection. A variety of relative radiometric correction techniques were developed for the correction or rectification of images of the same area through use of reference targets whose reflectance does not change significantly with time, i.e., pseudo-invariant features (PIFs). This paper proposes a new technique for radiometric normalization, which uses three sequential methods for an accurate PIF selection: spectral measures of temporal data (spectral distance and similarity), density scatterplot analysis (ridge method), and robust regression. The spectral measures used are the spectral angle (Spectral Angle Mapper, SAM), spectral correlation (Spectral Correlation Mapper, SCM), and Euclidean distance. The spectral measures between the spectra at times t1 and t2 are calculated for each pixel. After classification using threshold values, it is possible to define points with the same spectral behavior, including PIFs. The distance and similarity measures are complementary and can be calculated together. The ridge method uses a density plot generated from images acquired on different dates for the selection of PIFs. In a density plot, the invariant pixels together form a high-density ridge, while variant pixels (clouds and land cover changes) are spread out at low density, facilitating their exclusion. Finally, the selected PIFs are subjected to a robust regression (M-estimate) between pairs of temporal bands for the detection and elimination of outliers, and to obtain the optimal linear equation for a given set of target points. The robust regression is insensitive to outliers, i.e., observations that appear to deviate strongly from the rest of the data, which in our case are change areas. The sequential methods enable one to select, by different attributes, a number of invariant targets over the brightness range of the images.
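    The three spectral measures are standard and can be sketched directly. A minimal Python illustration (the toy spectra are invented; the paper applies these measures per pixel across temporal image bands):

```python
import math

def spectral_angle(s1, s2):
    # SAM: angle (radians) between the two spectra viewed as vectors.
    dot = sum(a * b for a, b in zip(s1, s2))
    n1 = math.sqrt(sum(a * a for a in s1))
    n2 = math.sqrt(sum(b * b for b in s2))
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def spectral_correlation(s1, s2):
    # SCM: Pearson correlation between the two spectra (mean-centred).
    n = len(s1)
    m1, m2 = sum(s1) / n, sum(s2) / n
    c1 = [a - m1 for a in s1]
    c2 = [b - m2 for b in s2]
    return sum(a * b for a, b in zip(c1, c2)) / (
        math.sqrt(sum(a * a for a in c1)) * math.sqrt(sum(b * b for b in c2)))

def euclidean(s1, s2):
    # Plain Euclidean distance between the two spectra.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(s1, s2)))

t1 = [0.10, 0.20, 0.40, 0.30]   # pixel spectrum at time t1 (toy values)
t2 = [0.12, 0.22, 0.41, 0.31]   # nearly unchanged at t2 -> PIF candidate
t3 = [0.40, 0.10, 0.20, 0.55]   # changed pixel

# All three measures agree: t2 is spectrally close to t1, t3 is not.
assert spectral_angle(t1, t2) < spectral_angle(t1, t3)
assert spectral_correlation(t1, t2) > spectral_correlation(t1, t3)
assert euclidean(t1, t2) < euclidean(t1, t3)
```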

  14. Interrelations between Three Proxies of Health Care Need at the Small Area Level: An Urban/Rural Comparison

    National Research Council Canada - National Science Library

    S. Barnett; P. Roderick; D. Martin; I. Diamond; H. Wrigley

    2002-01-01

    ...) and socioeconomic characteristics, and 1991-1996 data on all cause premature mortality. The interrelations between the three widely used proxies of health care need are examined using correlation coefficients and scatterplots...

  15. Exploring Naval Tactics with UAVs in an Island Complex Using Agent-Based Simulation

    Science.gov (United States)

    2007-06-01

    [Garbled figure/table extract: factor ranges for the enemy agent (Estealth 0-75, Enextwp 50-100, Ecover 0-100), "Figure 16. Scatterplot matrix for the screening experiment," and a correlation table over the factors TOT, Uspeed, Usensor, Ustealth, Unextwp, Uprocess, Espeed, Esensor, Estealth, Enextwp, and Ecover.]

  16. Microprocessor aided data acquisition at VEDAS

    Energy Technology Data Exchange (ETDEWEB)

    Ziem, P.; Drescher, B.; Kapper, K.; Kowallik, R.

    1985-08-01

    Three microprocessor systems have been developed to support data acquisition in nuclear physics multiparameter experiments. A bit-slice processor accumulates up to 256 1-dim spectra and 16 2-dim spectra. A microprocessor, based on the AM 29116 ALU, performs a fast consistency check on the coincidence data. A VME-Bus double-processor displays a colored scatterplot.

  17. Long-term Trends in Coral Reef Fish Yields and Exploitation Rates ...

    African Journals Online (AJOL)

    Daisy Ouya

    tonnes/km2/year) were used to analyse temporal trends in catches of the major demersal families of coral reef fish (e.g., Siganidae, Lethrinidae, Lutjanidae, Scaridae, Acanthuridae, Serranidae and 'others'). A locally-weighted scatterplot smoother (LOWESS) (Cleveland, 1979) was used to fit smoothed trend lines to the full ...
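    A LOWESS fit estimates each point from a weighted linear regression over its nearest neighbours, using tricube weights. A minimal single-pass Python sketch, omitting the robustness iterations of Cleveland's full algorithm (the jagged test series is invented for illustration):

```python
def lowess(xs, ys, frac=0.5):
    # Minimal LOWESS sketch (after Cleveland, 1979): for each x, fit a
    # weighted linear regression to its k nearest neighbours, with
    # tricube weights. No robustness iterations, unlike the full method.
    n = len(xs)
    k = max(2, int(frac * n))
    smoothed = []
    for x0 in xs:
        dists = sorted(abs(x - x0) for x in xs)
        h = dists[k - 1] or 1e-12    # bandwidth = k-th nearest distance
        w = [(1 - min(1.0, abs(x - x0) / h) ** 3) ** 3 for x in xs]
        sw = sum(w)
        mx = sum(wi * x for wi, x in zip(w, xs)) / sw
        my = sum(wi * y for wi, y in zip(w, ys)) / sw
        sxx = sum(wi * (x - mx) ** 2 for wi, x in zip(w, xs))
        sxy = sum(wi * (x - mx) * (y - my) for wi, x, y in zip(w, xs, ys))
        b = sxy / sxx if sxx else 0.0
        smoothed.append(my + b * (x0 - mx))
    return smoothed

xs = [float(i) for i in range(21)]
ys = [x + (1.0 if i % 2 else -1.0) for i, x in enumerate(xs)]  # jagged trend
sm = lowess(xs, ys, frac=0.4)
# The smoother recovers the underlying linear trend in the interior.
assert abs(sm[10] - 10.0) < 1.0
```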

  18. Cadmium versus phosphate in the world ocean

    NARCIS (Netherlands)

    Baar, Hein J.W. de; Saager, Paul M.; Nolting, Rob F.; Meer, Jaap van der

    1994-01-01

    Cadmium (Cd) is one of the best studied trace metals in seawater and at individual stations exhibits a more or less linear relation with phosphate. The compilation of all data from all oceans taken from over 30 different published sources into one global dataset yields only a broad scatterplot of Cd

  19. Visualizing Qualitative Information

    Science.gov (United States)

    Slone, Debra J.

    2009-01-01

    The abundance of qualitative data in today's society and the need to easily scrutinize, digest, and share this information calls for effective visualization and analysis tools. Yet, no existing qualitative tools have the analytic power, visual effectiveness, and universality of familiar quantitative instruments like bar charts, scatter-plots, and…

  20. Examining Student Conceptions of Covariation: A Focus on the Line of Best Fit

    Science.gov (United States)

    Casey, Stephanie A.

    2015-01-01

    The purpose of this research study was to learn about students' conceptions concerning the line of best fit just prior to their introduction to the topic. Task-based interviews were conducted with thirty-three students, focused on five tasks that asked them to place the line of best fit on a scatterplot and explain their reasoning throughout the…

  1. Sensitivity analysis of navy aviation readiness based sparing model

    Science.gov (United States)

    2017-09-01

    [Garbled front-matter extract: disclaimer fragment and table-of-contents entries, including "NAVARM Configuration," "Simulation by Visual Basic for Applications," "Correlation and Scatterplot Matrix," and "Appendix B. OAT SA Graphs of MIS/BON."]

  2. Evaluation of Purine Salvage as a Chemotherapeutic Target in the Plasmodium yoelii Rodent Model

    Science.gov (United States)

    2008-03-01

    parasite must adapt to the mosquito environment where temperature fluctuates and nutrients are limited. Gametocytogenesis requires growth arrest and...enabling global median location normalization; (ii) using a global normalization using the scatter-plot smoother loess; (iii) a 2-dimension spatial...normalization using the loess function. Selection of differentially expressed genes was performed using the package limma that is based on the

  3. Measurement and Modeling of Energetic-Material Mass Transfer to Soil-Pore Water

    Science.gov (United States)

    2006-05-01

    K(Se) = Ksat · Se^l · [1 − (1 − Se^(1/m))^m]^2, where Ksat is the hydraulic conductivity at saturation, and l is a pore-connectivity parameter estimated to be about 0.5 as...the influence of particle size, it is really hard to interpret the real influence of this parameter. This behavior is confirmed by the scatterplots

  4. The half-half plot

    NARCIS (Netherlands)

    Einmahl, J.H.J.; Gantner, M.

    2012-01-01

    The Half-Half (HH) plot is a new graphical method to investigate qualitatively the shape of a regression curve. The empirical HH-plot counts observations in the lower and upper quarter of a strip that moves horizontally over the scatterplot. The plot displays jumps clearly and reveals further

  5. A randomized trial in a massive online open course shows people don’t know what a statistically significant relationship looks like, but they can learn

    Directory of Open Access Journals (Sweden)

    Aaron Fisher

    2014-10-01

    Full Text Available Scatterplots are the most common way for statisticians, scientists, and the public to visually detect relationships between measured variables. At the same time, and despite widely publicized controversy, P-values remain the most commonly used measure to statistically justify relationships identified between variables. Here we measure the ability to detect statistically significant relationships from scatterplots in a randomized trial of 2,039 students in a statistics massive open online course (MOOC). Each subject was shown a random set of scatterplots and asked to visually determine if the underlying relationships were statistically significant at the P < 0.05 level. Subjects correctly classified only 47.4% (95% CI [45.1%–49.7%]) of statistically significant relationships, and 74.6% (95% CI [72.5%–76.6%]) of non-significant relationships. Adding visual aids such as a best fit line or scatterplot smooth increased the probability a relationship was called significant, regardless of whether the relationship was actually significant. Classification of statistically significant relationships improved on repeat attempts of the survey, although classification of non-significant relationships did not. Our results suggest: (1) that evidence-based data analysis can be used to identify weaknesses in theoretical procedures in the hands of average users, (2) data analysts can be trained to improve detection of statistically significant results with practice, but (3) data analysts have incorrect intuition about what statistically significant relationships look like, particularly for small effects. We have built a web tool for people to compare scatterplots with their corresponding p-values which is available here: http://glimmer.rstudio.com/afisher/EDA/.
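    The statistical rule the subjects were asked to mimic visually reduces to a simple formula. A hedged Python sketch of the t statistic behind the P-value for a Pearson correlation (the sample sizes and r values below are illustrative, not taken from the trial):

```python
import math

def correlation_t(r, n):
    # The t statistic behind the P-value for a Pearson correlation:
    # t = r * sqrt((n - 2) / (1 - r^2)), with n - 2 degrees of freedom.
    return r * math.sqrt((n - 2) / (1.0 - r * r))

# A small effect (r = 0.10) is statistically significant with n = 2000,
# even though its scatterplot looks like pure noise ...
assert correlation_t(0.10, 2000) > 1.96
# ... while a visually suggestive r = 0.40 is not significant at n = 20
# (the two-sided 5% critical value for 18 degrees of freedom is ~2.10).
assert correlation_t(0.40, 20) < 2.10
```

This mismatch between visual salience and significance is exactly the "incorrect intuition, particularly for small effects" the abstract describes.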

  6. GeoXp : An R Package for Exploratory Spatial Data Analysis

    Directory of Open Access Journals (Sweden)

    Thibault Laurent

    2012-04-01

    Full Text Available We present GeoXp, an R package implementing interactive graphics for exploratory spatial data analysis. We use a data set concerning public schools of the French Midi-Pyrénées region to illustrate the use of these exploratory techniques based on the coupling between a statistical graph and a map. Besides elementary plots like boxplots, histograms or simple scatterplots, GeoXp also couples maps with Moran scatterplots, variogram clouds, Lorenz curves and other graphical tools. In order to make the most of the multidimensionality of the data, GeoXp includes dimension reduction techniques such as principal components analysis and cluster analysis whose results are also linked to the map.
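    A Moran scatterplot plots each standardized observation against its spatial lag, i.e. the weighted mean of its neighbours' standardized values. A minimal, map-free Python sketch; the line-of-four-regions weight matrix is invented for illustration, while GeoXp builds the real plot in R from actual spatial weights:

```python
def moran_scatterplot_points(values, weights):
    # Points of a Moran scatterplot: standardized value z_i on the x axis,
    # spatial lag (weighted mean of neighbours' z) on the y axis.
    # `weights` is a row-standardised neighbour matrix (list of lists).
    n = len(values)
    m = sum(values) / n
    sd = (sum((v - m) ** 2 for v in values) / n) ** 0.5
    z = [(v - m) / sd for v in values]
    lag = [sum(w * zj for w, zj in zip(row, z)) for row in weights]
    return list(zip(z, lag))

# Four regions on a line, each neighbouring the adjacent ones (toy data).
vals = [1.0, 2.0, 3.0, 4.0]
W = [
    [0.0, 1.0, 0.0, 0.0],
    [0.5, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 1.0, 0.0],
]
pts = moran_scatterplot_points(vals, W)
# Positive spatial autocorrelation: low values sit next to low values and
# high next to high, so points fall in the lower-left/upper-right quadrants.
assert pts[0][0] < 0 and pts[0][1] < 0
assert pts[3][0] > 0 and pts[3][1] > 0
```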

  7. Effects of Body Armor Fit on Marksmanship Performance

    Science.gov (United States)

    2016-09-01

    ...size). In all configurations, TPs also wore an Advanced Combat Helmet. [Figure residue: scatterplot of stature (mm) against chest circumference (mm).]

  8. Testing the attraction effect on two information visualization datasets

    OpenAIRE

    Dimara, Evanthia; Bezerianos, Anastasia; Dragicevic, Pierre

    2016-01-01

    The attraction effect is a well-studied cognitive bias in decision making research, where people's choice between two options is influenced by the presence of an irrelevant (dominated) third option. In another article, we report on two crowdsourced experiments showing compelling evidence that the attraction effect can generalize to visualizations. However, we also conducted another experiment between the two where we failed to observe an effect. This experiment used scatterplots generated fro...

  9. Glutathione as a Biomarker in Parkinson's Disease: Associations with Aging and Disease Severity

    OpenAIRE

    Mischley, Laurie K; Standish, Leanna J.; Weiss, Noel S.; Padowski, Jeannie M.; Kavanagh, Terrance J.; White, Collin C.; Rosenfeld, Michael E.

    2016-01-01

    Objectives. Oxidative stress contributes to Parkinson's disease (PD) pathophysiology and progression. The objective was to describe central and peripheral metabolites of redox metabolism and to describe correlations between glutathione (Glu) status, age, and disease severity. Methods. 58 otherwise healthy individuals with PD were examined during a single study visit. Descriptive statistics and scatterplots were used to evaluate normality and distribution of this cross-sectional sample. Blood ...

  10. Medical Robotic and Telesurgical Simulation and Education Research

    Science.gov (United States)

    2013-09-01

    exercises with latency. The scatterplots shown in Figures 9 and 10 illustrate the relationships between subjects who received the same latency values...of a Novel Robotic Surgery Simulator, Journal of Urology. Janetschek G, Bartsch G, & Kavoussi LR. (1998). Transcontinental interactive laparoscopic...Once logged into each system, the instructor or the student navigates the instructional materials using the menu systems illustrated in Figure 2

  11. Metabolism of DMSP, DMS and DMSO by the cultivable bacterial community associated with the DMSP-producing dinoflagellate Scrippsiella trochoidea

    Digital Repository Service at National Institute of Oceanography (India)

    Hatton, A.D.; Shenoy, D.M.; Hart, M.C.; Mogg, A.; Green, D.H.

    radiation balance by changing the reflectivity of clouds (Charlson et al. 1987), has stimulated considerable research into this gas. Furthermore, and perhaps more significantly, in the marine environment DMS and its precursors represent important sources... of phytoplankton biomass from field measurements and satellite observation were hoped to provide a reliable proxy of DMS emissions. However, plots of DMS against chlorophyll a resemble scatter-plots, and do not show a sufficiently reliable correlation (Liss et...

  12. Random Number Simulations Reveal How Random Noise Affects the Measurements and Graphical Portrayals of Self-Assessed Competency

    Directory of Open Access Journals (Sweden)

    Edward Nuhfer

    2016-01-01

    Full Text Available Self-assessment measures of competency are blends of an authentic self-assessment signal that researchers seek to measure and random disorder or "noise" that accompanies that signal. In this study, we use random number simulations to explore how random noise affects critical aspects of self-assessment investigations: reliability, correlation, critical sample size, and the graphical representations of self-assessment data. We show that graphical conventions common in the self-assessment literature introduce artifacts that invite misinterpretation. Troublesome conventions include: (y minus x) vs. (x) scatterplots; (y minus x) vs. (x) column graphs aggregated as quantiles; line charts that display data aggregated as quantiles; and some histograms. Graphical conventions that generate minimal artifacts include scatterplots with a best-fit line that depict (y) vs. (x) measures (self-assessed competence vs. measured competence) plotted by individual participant scores, and (y) vs. (x) scatterplots of collective average measures of all participants plotted item-by-item. This last graphic convention attenuates noise and improves the definition of the signal. To provide relevant comparisons across varied graphical conventions, we use a single dataset derived from paired measures of 1154 participants' self-assessed competence and demonstrated competence in science literacy. Our results show that different numerical approaches employed in investigating and describing self-assessment accuracy are not equally valid. By modeling this dataset with random numbers, we show how recognizing the varied expressions of randomness in self-assessment data can improve the validity of numeracy-based descriptions of self-assessment.
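    The artifact the authors describe in difference-vs-x plots is easy to reproduce with random numbers: even when x and y are independent noise, the difference y − x correlates strongly and negatively with x. A minimal Python simulation (variable names are illustrative):

```python
import random

def corr(xs, ys):
    # Plain Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

random.seed(42)
n = 5000
x = [random.random() for _ in range(n)]   # "measured competence": pure noise
y = [random.random() for _ in range(n)]   # "self-assessed": independent noise

# Plotting (y - x) against x shows a strong negative trend (about -0.71 in
# expectation for equal variances), an artifact inviting misinterpretation;
# the plain y vs. x plot correctly shows (almost) nothing.
diff = [yi - xi for xi, yi in zip(x, y)]
assert corr(diff, x) < -0.6
assert abs(corr(y, x)) < 0.1
```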

  13. Spectral characterization of tissues in high spectral and spatial resolution MR images: Implications for a classification-based synthetic CT algorithm.

    Science.gov (United States)

    Wood, Abbie M; Shea, Steven M; Medved, Milica; Karczmar, Gregory S; Surucu, Murat; Gros, Sebastien; Small, William; Roeske, John

    2017-05-01

    To characterize the spectral parameters of tissues with high spectral and spatial resolution magnetic resonance images to be used as a foundation for a classification-based synthetic CT algorithm. A phantom was constructed consisting of a section of fresh beef leg with bone embedded in 1% agarose gel. The high spectral and spatial (HiSS) resolution MR imaging sequence used had 1.0 mm in-plane resolution and 11.1 Hz spectral resolution. This sequence was used to image the phantom and one patient. Post-processing was performed off-line with IDL and included Fourier transformation of the time-domain data, labeling of fat and water peaks, and fitting the magnitude spectra with Lorentzian functions. Images of the peak height and peak integral of both the water and fat resonances were generated and analyzed. Several regions-of-interest (ROIs) were identified in phantom: bone marrow, cortical bone, adipose tissue, muscle, agar gel, and air; in the patient, no agar gel was present but an ROI of saline in the bladder was analyzed. All spectra were normalized by the noise within each voxel; thus, all parameters are reported in terms of signal-to-noise (SNR). The distributions of tissue spectral parameters were analyzed and scatterplots generated. Water peak height in cortical bone was compared to air using a nonparametric t-test. Composition of the various ROIs in terms of water, fat, or fat and water was also reported. In phantom, the scatterplot of peak height (water versus fat) showed good separation of bone marrow and adipose tissue. Water versus fat integral scatterplot showed better separation of muscle and cortical bone than the peak height scatterplot. In the patient data, the distributions of water and fat peak heights were similar to that in phantom, with more overlap of bone marrow and cortical bone than observed in phantom. 
    The relationship between bone marrow and cortical bone was better separated for peak integral than for peak height in the patient data.

  14. Webcharts – A Web-based Charting Library for Custom Interactive Data Visualization

    Directory of Open Access Journals (Sweden)

    Nathan Bryant

    2016-07-01

    Full Text Available Webcharts is a JavaScript library built on top of D3.js that creates reusable, flexible, interactive charts that are highly customizable. Webcharts provides a method for creating commonly-used charts, including bar charts, scatterplots, and timelines, through a simple configuration scheme. Charts created with Webcharts allow users to dynamically manipulate chart data, appearance, and behavior both through callback functions and input elements that are tied to chart objects. This approach allows users to create reusable charts that range from simple static graphics to complex interactive data exploration tools with custom user interfaces, all using the same library.

  15. Interactive Data Visualization using Mondrian

    Directory of Open Access Journals (Sweden)

    Martin Theus

    2002-11-01

    Full Text Available This paper presents the Mondrian data visualization software. In addition to standard plots like histograms, barcharts, scatterplots or maps, Mondrian offers advanced plots for high-dimensional categorical (mosaic plots) and continuous data (parallel coordinates). All plots are linked and offer various interaction techniques. A special focus is on the seamless integration of categorical data. Unique is Mondrian's special selection technique, which allows advanced selections in complex data sets. Besides loading data from local (ASCII) files it can connect to databases, avoiding a local copy of the data on the client machine. Mondrian is written in 100% pure JAVA.

  16. A primer in biological data analysis and visualization using R

    CERN Document Server

    Hartvigsen, Gregg

    2014-01-01

    R is a popular programming language that statisticians use to perform a variety of statistical computing tasks. Rooted in Gregg Hartvigsen's extensive experience teaching biology, this text is an engaging, practical, and lab-oriented introduction to R for students in the life sciences. Underscoring the importance of R and RStudio to the organization, computation, and visualization of biological statistics and data, Hartvigsen guides readers through the processes of entering data into R, working with data in R, and using R to express data in histograms, boxplots, barplots, scatterplots, before/

  17. Multivariable modeling and multivariate analysis for the behavioral sciences

    CERN Document Server

    Everitt, Brian S

    2009-01-01

    Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences shows students how to apply statistical methods to behavioral science data in a sensible manner. Assuming some familiarity with introductory statistics, the book analyzes a host of real-world data to provide useful answers to real-life issues.The author begins by exploring the types and design of behavioral studies. He also explains how models are used in the analysis of data. After describing graphical methods, such as scatterplot matrices, the text covers simple linear regression, locally weighted regression, multip

  18. Statistical analysis of medical data using SAS

    CERN Document Server

    Der, Geoff

    2005-01-01

    An Introduction to SAS; Describing and Summarizing Data; Basic Inference; Scatterplots, Correlation, Simple Regression and Smoothing; Analysis of Variance and Covariance; Multiple Regression; Logistic Regression; The Generalized Linear Model; Generalized Additive Models; Nonlinear Regression Models; The Analysis of Longitudinal Data I; The Analysis of Longitudinal Data II: Models for Normal Response Variables; The Analysis of Longitudinal Data III: Non-Normal Response; Survival Analysis; Analysis of Multivariate Data: Principal Components and Cluster Analysis; References

  19. MaTSE: the gene expression time-series explorer.

    Science.gov (United States)

    Craig, Paul; Cannon, Alan; Kukla, Robert; Kennedy, Jessie

    2013-01-01

    High throughput gene expression time-course experiments provide a perspective on biological functioning recognized as having huge value for the diagnosis, treatment, and prevention of diseases. There are, however, significant challenges to properly exploiting these data due to their massive scale and complexity. In particular, existing techniques are ill suited to finding patterns of changing activity over a limited interval of an experiment's time frame. The Time-Series Explorer (TSE) was developed to overcome this limitation by allowing users to explore their data by controlling an animated scatter-plot view. MaTSE improves and extends TSE by allowing users to visualize data with missing values, cross-reference multiple conditions, highlight gene groupings, and collaborate by sharing their findings. MaTSE was developed using an iterative software development cycle that involved a high level of user feedback and evaluation. The resulting software combines a variety of visualization and interaction techniques which work together to allow biologists to explore their data and reveal temporal patterns of gene activity. These include a scatter-plot that can be animated to view different temporal intervals of the data, a multiple coordinated view framework to support the cross-referencing of multiple experimental conditions, a novel method for highlighting overlapping groups in the scatter-plot, and a pattern browser component that can be used with scatter-plot box queries to support cooperative visualization. A final evaluation demonstrated the tool's effectiveness in allowing users to find unexpected temporal patterns and the benefits of functionality such as the overlay of gene groupings and the ability to store patterns. We have developed a new exploratory analysis tool, MaTSE, that allows users to find unexpected patterns of temporal activity in gene expression time-series data. Overall, the study also served to demonstrate the benefits of an iterative software

  20. Noise modelling and estimation of hyperspectral data from airborne imaging spectrometers

    Directory of Open Access Journals (Sweden)

    I. Pippi

    2006-06-01

    Full Text Available The definition of noise models suitable for hyperspectral data is slightly different depending on whether whisk-broom or push-broom instruments are dealt with. Focussing on the latter type (e.g., VIRS-200), the noise is intrinsically non-stationary in the raw digital counts. After calibration, i.e. removing the variability effects due to different gains and offsets of detectors, the noise will exhibit stationary statistics, at least spatially. Hence, separable 3D processes correlated across track (x), along track (y) and in wavelength (λ), modelled as auto-regressive with generalized Gaussian (GG) statistics, have been found to be adequate. Estimation of model parameters from the true data is accomplished through robust techniques relying on linear regressions calculated on scatter-plots of local statistics. An original procedure was devised to detect areas within the scatter-plot corresponding to statistically homogeneous pixels. Results on VIRS-200 data show that the noise is heavy-tailed (tails longer than those of a Gaussian PDF) and somewhat correlated along and across track by slightly different extents. Spectral correlation has been investigated as well and found to depend both on the sparseness (spectral sampling) and on the wavelength values of the bands that have been selected.
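
The parameter-estimation step described above, linear regression on a scatterplot of local statistics, can be sketched under simplified assumptions: a piecewise-constant scene, purely additive Gaussian noise, and fixed 16x16 blocks, none of which come from the paper itself. Each homogeneous block contributes one (local mean, local standard deviation) point; for additive noise the fitted line is flat and its level estimates the noise sigma.

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 5.0
# Piecewise-constant "scene" plus additive Gaussian noise (a simplification
# of the generalized-Gaussian, signal-dependent models used in practice).
scene = np.repeat(rng.uniform(50, 200, size=(8, 8)), 16, axis=0).repeat(16, axis=1)
image = scene + rng.normal(0.0, sigma, scene.shape)

# Local statistics on 16x16 blocks: each homogeneous block contributes one
# (mean, std) point to the scatterplot.
blocks = image.reshape(8, 16, 8, 16).swapaxes(1, 2).reshape(64, -1)
local_mean = blocks.mean(axis=1)
local_std = blocks.std(axis=1, ddof=1)

# Linear regression on the scatterplot of local statistics: for purely
# additive noise the slope is ~0 and the line's level estimates sigma.
slope, intercept = np.polyfit(local_mean, local_std, 1)
```

A nonzero slope would instead indicate signal-dependent noise, which is why the scatterplot of local statistics is diagnostic.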

  1. The Attraction Effect in Information Visualization.

    Science.gov (United States)

    Dimara, Evanthia; Bezerianos, Anastasia; Dragicevic, Pierre

    2017-01-01

    The attraction effect is a well-studied cognitive bias in decision making research, where one's choice between two alternatives is influenced by the presence of an irrelevant (dominated) third alternative. We examine whether this cognitive bias, so far only tested with three alternatives and simple presentation formats such as numerical tables, text and pictures, also appears in visualizations. Since visualizations can be used to support decision making - e.g., when choosing a house to buy or an employee to hire - a systematic bias could have important implications. In a first crowdsourced experiment, we indeed partially replicated the attraction effect with three alternatives presented as a numerical table, and observed similar effects when they were presented as a scatterplot. In a second experiment, we investigated whether the effect extends to larger sets of alternatives, where the number of alternatives is too large for numerical tables to be practical. Our findings indicate that the bias persists for larger sets of alternatives presented as scatterplots. We discuss implications for future research on how to further study and possibly alleviate the attraction effect.

  2. The bivariate statistical analysis of environmental (compositional) data.

    Science.gov (United States)

    Filzmoser, Peter; Hron, Karel; Reimann, Clemens

    2010-09-01

    Environmental sciences usually deal with compositional (closed) data. Whenever the concentration of chemical elements is measured, the data will be closed, i.e. the relevant information is contained in the ratios between the variables rather than in the data values reported for the variables. Data closure has severe consequences for statistical data analysis. Most classical statistical methods are based on the usual Euclidean geometry - compositional data, however, do not plot into Euclidean space because they have their own geometry which is not linear but curved in the Euclidean sense. This has severe consequences for bivariate statistical analysis: correlation coefficients computed in the traditional way are likely to be misleading, and the information contained in scatterplots must be used and interpreted differently from sets of non-compositional data. As a solution, the ilr transformation applied to a variable pair can be used to display the relationship and to compute a measure of stability. This paper discusses how this measure is related to the usual correlation coefficient and how it can be used and interpreted. Moreover, recommendations are provided for how the scatterplot can still be used, and which alternatives exist for displaying the relationship between two variables.
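
For a two-part subcomposition, the ilr transformation reduces to a single scaled log-ratio coordinate, z = ln(x/y)/sqrt(2). A minimal sketch on simulated compositional data (the data and the closure constant are invented for illustration); note that the ilr coordinate is unchanged by closure, which is exactly why it sidesteps the closure problem:

```python
import numpy as np

rng = np.random.default_rng(2)
# Simulated 3-part composition closed to 100% (e.g., element concentrations).
raw = np.exp(rng.normal(size=(500, 3)))
comp = 100.0 * raw / raw.sum(axis=1, keepdims=True)

def ilr_pair(x, y):
    """ilr coordinate of the 2-part subcomposition (x, y): all relevant
    information sits in the log-ratio, not in the raw closed values."""
    return np.log(x / y) / np.sqrt(2.0)

z = ilr_pair(comp[:, 0], comp[:, 1])
# Pearson correlation on the raw closed parts: potentially misleading,
# per the paper's argument, and shown here only for contrast.
r_raw = np.corrcoef(comp[:, 0], comp[:, 1])[0, 1]
```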

  3. A State-Dependent Quantification of Climate Sensitivity Based on Paleodata of the Last 2.1 Million Years

    Science.gov (United States)

    Köhler, Peter; Stap, Lennert B.; von der Heydt, Anna S.; de Boer, Bas; van de Wal, Roderik S. W.; Bloch-Johnson, J.

    2017-11-01

    The evidence from both data and models indicates that specific equilibrium climate sensitivity S[X]—the global annual mean surface temperature change (ΔTg) as a response to a change in radiative forcing X (ΔR[X])—is state dependent. Such a state dependency implies that the best fit in the scatterplot of ΔTg versus ΔR[X] is not a linear regression but can be some nonlinear or even nonsmooth function. While for the conventional linear case the slope (gradient) of the regression is correctly interpreted as the specific equilibrium climate sensitivity S[X], the interpretation is not straightforward in the nonlinear case. We here explain how such a state-dependent scatterplot needs to be interpreted and provide a theoretical understanding—or generalization—of how to quantify S[X] in the nonlinear case. Finally, from data covering the last 2.1 Myr we show that—due to state dependency—the specific equilibrium climate sensitivity which considers radiative forcing of CO2 and land ice sheet (LI) albedo, S[CO2,LI], is larger during interglacial states than during glacial conditions by more than a factor of 2.
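
The distinction between the conventional slope and a state-dependent S[X] can be illustrated on a toy scatterplot: fit both a line and a curve, and read the sensitivity as the local gradient of the curve at a given state. All numbers below are invented for illustration, not taken from the paleodata:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic paleo-style scatter: temperature responds nonlinearly to forcing,
# so a single regression slope misstates the sensitivity S[X] = dT/dR.
dR = rng.uniform(-8.0, 2.0, 400)                          # forcing anomaly (W/m2)
dT = 0.5 * dR + 0.05 * dR**2 + rng.normal(0, 0.3, 400)    # assumed state dependence

# Conventional (linear) estimate: one global slope for the whole scatterplot.
S_linear = np.polyfit(dR, dT, 1)[0]

# State-dependent estimate: local gradient of a quadratic fit, evaluated
# at the state of interest.
b2, b1, _ = np.polyfit(dR, dT, 2)
S_glacial = 2 * b2 * (-6.0) + b1       # sensitivity near a glacial state (dR ~ -6)
S_interglacial = 2 * b2 * 0.0 + b1     # sensitivity near an interglacial state (dR ~ 0)
```

With this construction the interglacial sensitivity exceeds the glacial one, qualitatively mirroring the paper's factor-of-2 result, while the single linear slope lands in between and describes neither state.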

  4. Different rates of DNA replication at early versus late S-phase sections: multiscale modeling of stochastic events related to DNA content/EdU (5-ethynyl-2'deoxyuridine) incorporation distributions.

    Science.gov (United States)

    Li, Biao; Zhao, Hong; Rybak, Paulina; Dobrucki, Jurek W; Darzynkiewicz, Zbigniew; Kimmel, Marek

    2014-09-01

    Mathematical modeling allows relating molecular events to single-cell characteristics assessed by multiparameter cytometry. In the present study we labeled newly synthesized DNA in A549 human lung carcinoma cells with 15-120 min pulses of EdU. All DNA was stained with DAPI and cellular fluorescence was measured by laser scanning cytometry. The frequency of cells in the ascending (left) side of the "horseshoe"-shaped EdU/DAPI bivariate distributions reports the rate of DNA replication at the time of entrance to S phase, while their frequency in the descending (right) side is a marker of DNA replication rate at the time of transition from S to G2 phase. To understand the connection between molecular-scale events and scatterplot asymmetry, we developed a multiscale stochastic model, which simulates DNA replication and cell cycle progression of individual cells and produces in silico EdU/DAPI scatterplots. For each S-phase cell the time points at which replication origins are fired are modeled by a non-homogeneous Poisson process (NHPP). Shifted gamma distributions are assumed for the durations of cell cycle phases (G1, S, and G2/M). Depending on whether the rate of DNA synthesis is an increasing or decreasing function, simulated EdU/DAPI bivariate graphs show predominance of cells in the left (early-S) or right (late-S) side of the horseshoe distribution. Assuming the NHPP rate estimated from independent experiments, simulated EdU/DAPI graphs are nearly indistinguishable from those experimentally observed. This finding proves consistency between the S-phase DNA-replication rate based on molecular-scale analyses and cell population kinetics ascertained from EdU/DAPI scatterplots, and demonstrates that the DNA replication rate at entrance to S is relatively slow compared with its rather abrupt termination during the S to G2 transition.
Our approach opens a possibility of similar modeling to study the effect of anticancer drugs on DNA replication/cell cycle progression and also to quantify other
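
The firing of replication origins is modeled above as a non-homogeneous Poisson process (NHPP). A standard way to simulate an NHPP is Lewis-Shedler thinning; the sketch below uses a hypothetical decreasing rate function, not the rate estimated in the study:

```python
import numpy as np

rng = np.random.default_rng(4)

def nhpp_thinning(rate, t_max, rate_max):
    """Simulate event times of a non-homogeneous Poisson process on
    [0, t_max] by thinning a homogeneous process of intensity rate_max,
    keeping each candidate with probability rate(t) / rate_max."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_max:
            return np.array(events)
        if rng.uniform() < rate(t) / rate_max:
            events.append(t)

# Hypothetical decreasing origin-firing rate over an 8-hour S phase,
# mimicking fast replication at S entry and slower activity later.
rate = lambda t: 30.0 * np.exp(-t / 4.0)
origins = nhpp_thinning(rate, 8.0, 30.0)
```

Thinning requires only that `rate_max` bound the rate function from above; here the bound is tight at t = 0.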

  5. Finding That College Students Cluster in Majors Based on Differing Patterns of Spatial Visualization and Language Processing Speeds

    Directory of Open Access Journals (Sweden)

    Richard M. Oldrieve

    2014-03-01

    Full Text Available For over 30 years, researchers such as Eisenberg and McGinty have investigated the relationship between 3-D visualization skills and choice of college major. Results of the present study support the fact that science and math majors tend to do well on a measure of 3-D visualization. Going beyond these earlier studies, the present study investigated whether a measure of Rapid Automatic Naming of Objects—which is normally used to screen for elementary school students who might struggle with speech, language, literacy, and numeracy—would further differentiate the choice of majors by college students. Far more research needs to be conducted, but results indicated that college students differentially clustered in scatterplot quadrants defined by the two screening assessments. Furthermore, several of these clusters, plus a statistical multiplier, may lead to a new understanding of students with phonological processing differences, learning disabilities, and speech and language impairments.

  6. Forecasting of Households Consumption Expenditure with Nonparametric Regression: The Case of Turkey

    Directory of Open Access Journals (Sweden)

    Aydin Noyan

    2016-11-01

    Full Text Available The relationship between household income and expenditure is important for understanding the economic dynamics of households. In this study, the relationship between household consumption expenditure and household disposable income was analyzed by Locally Weighted Scatterplot Smoothing (LOWESS) regression, a nonparametric method, using the R programming language. The study aimed to determine the relationship between the variables directly, without making the assumptions commonly required in conventional parametric regression. According to the findings, expenditure at first increased rapidly as income and household size increased together, and then the speed of the increase declined. This pattern can be explained by compulsory consumption expenditure being relatively greater in small households. Expenditure is also relatively higher at middle and high income levels than at the low income level. However, when household size changes, the change in expenditure is limited at the middle income level and most limited at high income levels.
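
Locally weighted scatterplot smoothing fits, at each point, a linear regression weighted by proximity (tricube weights). The study used R; the sketch below re-implements the basic idea (without robustness iterations) on invented income/expenditure data:

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Minimal locally weighted scatterplot smoothing: a tricube-weighted
    linear fit around each point, without robustness iterations."""
    n = len(x)
    k = max(2, int(frac * n))          # number of neighbours in each local fit
    fitted = np.empty(n)
    for i, x0 in enumerate(x):
        d = np.abs(x - x0)
        idx = np.argsort(d)[:k]        # the k nearest points
        w = (1 - (d[idx] / d[idx].max()) ** 3) ** 3    # tricube weights
        # np.polyfit minimizes sum((w * residual)**2), so pass sqrt(w).
        b, a = np.polyfit(x[idx], y[idx], 1, w=np.sqrt(w))
        fitted[i] = b * x0 + a
    return fitted

# Hypothetical income/expenditure relation: fast rise, then flattening.
rng = np.random.default_rng(5)
income = np.sort(rng.uniform(1.0, 10.0, 200))
expenditure = np.log1p(income) + rng.normal(0, 0.05, 200)
smooth = lowess(income, expenditure, frac=0.3)
```

The `frac` parameter trades smoothness against locality: larger fractions approach a single global linear fit, smaller ones track the scatter more closely.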

  7. Morphological variation between isolates of the nematode Haemonchus contortus from sheep and goat populations in Malaysia and Yemen.

    Science.gov (United States)

    Gharamah, A A; Rahman, W A; Siti Azizah, M N

    2014-03-01

    Haemonchus contortus is a highly pathogenic nematode parasite of sheep and goats. This work was conducted to investigate the population and host variations of the parasitic nematode H. contortus of sheep and goats from Malaysia and Yemen. Eight morphological characters were investigated, namely the total body length, cervical papillae, right spicule, left spicule, right barb, left barb, gubernaculum and cuticular ridge (synlophe) pattern. Statistical analysis showed the presence of morphological variation between populations of H. contortus from Malaysia and Yemen, with minor variation in the synlophe pattern of these isolates. Isolates from each country were grouped together in the scatterplots with no host isolation. Body, cervical papillae and spicule lengths were the most important characters that distinguished between populations of the two countries. This variation between Malaysia and Yemen may be attributed to geographical isolation and the possible presence of a different isolate of this worm in each country.

  8. [Postural control related to age in patients with benign positional paroxysmal vertigo].

    Science.gov (United States)

    Oliva Domínguez, M; Bartual Magro, J; Dañino González, J L; Dañino González, G; Roquette Gaona, J; Bartual Pastor, J

    2005-10-01

    To study the relationship between age and posturography in patients with benign positional paroxysmal vertigo. Prospective study performed in 65 patients with benign positional paroxysmal vertigo (BPPV) in any variant. Sensory Organization Test outcomes were recorded and compared with their equivalents in a control group by means of scatter-plot diagrams and regression lines. For the statistical study, the Mann-Whitney U-test was used. The slope of the regression line for the composite score was -0.0934 in the NORMAL group and -0.4284 in the BPPV group. This difference is due to conditions 5 and 6, and the results were statistically significant. BPPV patients have worse postural control than the control group. The difference increases with patient age. It is due to a failure in conditions 5 and 6, suggesting a vestibular origin.

  9. Postural control according to the age in patients with benign paroxysmal positional vertigo.

    Science.gov (United States)

    Oliva Domínguez, M; Bartual Magro, J; Dañino González, J L; Dañino González, G; Roquette Gaona, J; Bartual Pastor, J

    2005-01-01

    To study the relationship between age and postural control in patients with benign paroxysmal positional vertigo. Prospective study performed in 65 diagnosed patients with benign paroxysmal positional vertigo (BPPV) in any of its variants. The results of the Sensory Organization Test are compared with their equivalents in a control group by means of scatterplots and regression lines. For the statistical study, the Mann-Whitney U test was used. In the NORMAL group, the regression line for the composite score has a 0.0934 slope; in the BPPV group, 0.4284. This difference is fundamentally due to conditions 5 and 6, the results being statistically significant. Patients with BPPV have worse postural control than the control group. This difference is more pronounced the older the patient is, and is of vestibular origin.

  10. Hierarchical aggregation for information visualization: overview, techniques, and design guidelines.

    Science.gov (United States)

    Elmqvist, Niklas; Fekete, Jean-Daniel

    2010-01-01

    We present a model for building, visualizing, and interacting with multiscale representations of information visualization techniques using hierarchical aggregation. The motivation for this work is to make visual representations more visually scalable and less cluttered. The model allows for augmenting existing techniques with multiscale functionality, as well as for designing new visualization and interaction techniques that conform to this new class of visual representations. We give some examples of how to use the model for standard information visualization techniques such as scatterplots, parallel coordinates, and node-link diagrams, and discuss existing techniques that are based on hierarchical aggregation. This yields a set of design guidelines for aggregated visualizations. We also present a basic vocabulary of interaction techniques suitable for navigating these multiscale visualizations.
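
A simple instance of hierarchical aggregation for scatterplots replaces the points in each cell of a 2^d x 2^d grid with one glyph (centroid plus count); varying the depth d moves between coarse, uncluttered overviews and detailed views. A minimal grid-based sketch, not the paper's general model:

```python
import numpy as np

def aggregate(points, depth):
    """Aggregate 2D points in [0,1)^2 on a 2**depth x 2**depth grid: each
    occupied cell becomes one glyph (centroid, count). Lower depth gives a
    coarser, less cluttered view; higher depth approaches the raw plot."""
    k = 2 ** depth
    cells = np.floor(points * k).clip(0, k - 1).astype(int)
    keys = cells[:, 0] * k + cells[:, 1]
    glyphs = {}
    for key in np.unique(keys):
        member = points[keys == key]
        glyphs[key] = (member.mean(axis=0), len(member))
    return glyphs

rng = np.random.default_rng(6)
pts = rng.uniform(0, 1, size=(1000, 2))
coarse = aggregate(pts, 1)   # at most 4 glyphs: the overview level
fine = aggregate(pts, 4)     # up to 256 glyphs: a more detailed level
```

Stacking the levels for d = 0, 1, 2, ... yields exactly the kind of multiscale hierarchy the model describes, with drill-down corresponding to descending one level.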

  11. MLDS: Maximum Likelihood Difference Scaling in R

    Directory of Open Access Journals (Sweden)

    Kenneth Knoblauch

    2008-01-01

    Full Text Available The MLDS package in the R programming language can be used to estimate perceptual scales based on the results of psychophysical experiments using the method of difference scaling. In a difference scaling experiment, observers compare two supra-threshold differences (a,b) and (c,d) on each trial. The approach is based on a stochastic model of how the observer decides which perceptual difference (or interval), (a,b) or (c,d), is greater, and the parameters of the model are estimated using a maximum likelihood criterion. We also propose a method to test the model by evaluating the self-consistency of the estimated scale. The package includes an example in which an observer judges the differences in correlation between scatterplots. The example may be readily adapted to estimate perceptual scales for arbitrary physical continua.

  12. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
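
Two of the listed procedures, examination of scatterplots backed by correlation analysis and the rank transformation, can be sketched on a toy monotone nonlinear model (the model, inputs, and sample size are invented):

```python
import numpy as np

rng = np.random.default_rng(7)
# Sampled uncertain inputs propagated through a hypothetical model:
# the output depends strongly (and nonlinearly) on x1, not at all on x2.
x1 = rng.uniform(0, 1, 300)
x2 = rng.uniform(0, 1, 300)
y = np.exp(4 * x1) + rng.normal(0, 1, 300)

def rank(a):
    """Rank transformation: the standard trick for monotone nonlinear relations."""
    return np.argsort(np.argsort(a)).astype(float)

pearson_x1 = np.corrcoef(x1, y)[0, 1]
spearman_x1 = np.corrcoef(rank(x1), rank(y))[0, 1]   # rank (Spearman) correlation
spearman_x2 = np.corrcoef(rank(x2), rank(y))[0, 1]   # near zero: x2 is inert
```

In practice one would first inspect the y-versus-x scatterplots; the correlations then quantify what the plots suggest, with the rank version robust to the monotone nonlinearity.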

  13. Measuring magnetic correlations in nanoparticle assemblies

    DEFF Research Database (Denmark)

    Beleggia, Marco; Frandsen, Cathrine

    2014-01-01

    We illustrate how to extract correlations between magnetic moments in assemblies of nanoparticles from, e.g., electron holography data providing the combined knowledge of particle size distribution, inter-particle distances, and magnitude and orientation of each magnetic moment within a nanoparticle superstructure. We show, based on simulated data, how to build a radial/angular pair distribution function f(r,θ) encoding the spatial and angular difference between every pair of magnetic moments. A scatter-plot of f(r,θ) reveals the degree of structural and magnetic order present, and hence...
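
The radial/angular pair distribution can be sketched by computing, for every pair of particles, the inter-particle distance r and the absolute angular difference θ between the two moments, then binning. Positions and moment angles below are simulated, not holography data:

```python
import numpy as np

rng = np.random.default_rng(8)
# Hypothetical nanoparticle assembly: positions on a jittered grid,
# moments nearly aligned (ferromagnetic-like order).
gx, gy = np.meshgrid(np.arange(6.0), np.arange(6.0))
pos = np.c_[gx.ravel(), gy.ravel()] + rng.normal(0, 0.05, (36, 2))
phi = rng.normal(0.0, 0.1, 36)               # moment orientations (radians)

# Every unordered pair contributes one (r, theta) point: inter-particle
# distance and angular difference between the two moments.
i, j = np.triu_indices(36, k=1)
r = np.linalg.norm(pos[i] - pos[j], axis=1)
theta = np.abs(np.arctan2(np.sin(phi[i] - phi[j]), np.cos(phi[i] - phi[j])))

# f(r, theta) as a 2D histogram: peaks in r reveal structural order,
# concentration of theta near 0 reveals magnetic order.
f, r_edges, t_edges = np.histogram2d(r, theta, bins=(12, 9))
```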

  14. Weapons of Maths Instruction: A Thousand Years of Technological Stasis in Arrowheads from the South Scandinavian Middle Mesolithic

    Directory of Open Access Journals (Sweden)

    Kevan Edinborough

    2005-11-01

    Full Text Available This paper presents some results from my doctoral research into the evolution of bow-arrow technology using archaeological data from the south Scandinavian Mesolithic (Edinborough 2004. A quantitative approach is used to describe the morphological variation found in samples taken from over 3600 armatures from nine Danish and Swedish lithic assemblages. A linked series of statistical techniques determines the two most significant metric variables across the nine arrowhead assemblages in terms of the cultural transmission of arrowhead technology. The resultant scatterplot uses confidence ellipses to reveal highly distinctive patterns of morphological variation that are related to population-specific technological traditions. A population-level hypothesis of a socially constrained transmission mechanism is presented that may explain the unusually long period of technological stasis demonstrated by six of the nine arrowhead phase-assemblages.

  15. Jacobi Fiber Surfaces for Bivariate Reeb Space Computation.

    Science.gov (United States)

    Tierny, Julien; Carr, Hamish

    2017-01-01

    This paper presents an efficient algorithm for the computation of the Reeb space of an input bivariate piecewise linear scalar function f defined on a tetrahedral mesh. By extending and generalizing algorithmic concepts from the univariate case to the bivariate one, we report the first practical, output-sensitive algorithm for the exact computation of such a Reeb space. The algorithm starts by identifying the Jacobi set of f, the bivariate analogs of critical points in the univariate case. Next, the Reeb space is computed by segmenting the input mesh along the new notion of Jacobi Fiber Surfaces, the bivariate analog of critical contours in the univariate case. We additionally present a simplification heuristic that enables the progressive coarsening of the Reeb space. Our algorithm is simple to implement and most of its computations can be trivially parallelized. We report performance numbers demonstrating orders of magnitude speedups over previous approaches, enabling for the first time the tractable computation of bivariate Reeb spaces in practice. Moreover, unlike range-based quantization approaches (such as the Joint Contour Net), our algorithm is parameter-free. We demonstrate the utility of our approach by using the Reeb space as a semi-automatic segmentation tool for bivariate data. In particular, we introduce continuous scatterplot peeling, a technique which enables the reduction of the cluttering in the continuous scatterplot, by interactively selecting the features of the Reeb space to project. We provide a VTK-based C++ implementation of our algorithm that can be used for reproduction purposes or for the development of new Reeb space based visualization techniques.

  16. Assessing the role of pavement macrotexture in preventing crashes on highways.

    Science.gov (United States)

    Pulugurtha, Srinivas S; Kusam, Prasanna R; Patel, Kuvleshay J

    2010-02-01

    The objective of this article is to assess the role of pavement macrotexture in preventing crashes on highways in the State of North Carolina. Laser profilometer data obtained from the North Carolina Department of Transportation (NCDOT) for highways comprising four corridors are processed to calculate pavement macrotexture at 100-m (approximately 330-ft) sections according to the American Society for Testing and Materials (ASTM) standards. Crash data collected over the same lengths of the corridors were integrated with the calculated pavement macrotexture for each section. Scatterplots were generated to assess the role of pavement macrotexture on crashes and logarithm of crashes. Regression analyses were conducted by considering predictor variables such as million vehicle miles of travel (as a function of traffic volume and length), the number of interchanges, the number of at-grade intersections, the number of grade-separated interchanges, and the number of bridges, culverts, and overhead signs along with pavement macrotexture to study the statistical significance of the relationship between pavement macrotexture and crashes (both linear and log-linear) when compared to other predictor variables. The scatterplots and regression analyses indicate a more statistically significant relationship between pavement macrotexture and logarithm of crashes than between pavement macrotexture and crashes. The coefficient for pavement macrotexture, in general, is negative, indicating that the number of crashes or logarithm of crashes decreases as macrotexture increases. The relation between pavement macrotexture and logarithm of crashes is generally stronger than between most other predictor variables and crashes or logarithm of crashes. Based on results obtained, it can be concluded that maintaining pavement macrotexture greater than or equal to 1.524 mm (0.06 in.) as a threshold limit would possibly reduce crashes and provide safe transportation to road users on highways.

  17. An analysis of wildfire frequency and burned area relationships with human pressure and climate gradients in the context of fire regime

    Science.gov (United States)

    Jiménez-Ruano, Adrián; Rodrigues Mimbrero, Marcos; de la Riva Fernández, Juan

    2017-04-01

    Understanding fire regime is a crucial step towards achieving a better knowledge of the wildfire phenomenon. This study proposes a method for the analysis of fire regime based on multidimensional scatterplots (MDS). MDS are a visual approach that allows direct comparison among several variables and fire regime features so that we are able to unravel spatial patterns and relationships within the region of analysis. Our analysis is conducted in Spain, one of the most fire-affected areas within the Mediterranean region. Specifically, the Spanish territory has been split into three regions - Northwest, Hinterland and Mediterranean - considered as representative fire regime zones according to MAGRAMA (Spanish Ministry of Agriculture, Environment and Food). The main goal is to identify key relationships between fire frequency and burnt area, two of the most common fire regime features, with socioeconomic activity and climate. In this way we will be able to better characterize fire activity within each fire region. Fire data for the period 1974-2010 were retrieved from the General Statistics Forest Fires database (EGIF). Specifically, fire frequency and burnt area size were examined for each region and fire season (summer and winter). Socioeconomic activity was defined in terms of human pressure on wildlands, i.e. the presence and intensity of anthropogenic activity near wildland or forest areas. Human pressure was built from GIS spatial information about land use (wildland-agriculture and wildland-urban interface) and demographic potential. Climate variables (average maximum temperature and annual precipitation) were extracted from MOTEDAS (Monthly Temperature Dataset of Spain) and MOPREDAS (Monthly Precipitation Dataset of Spain) datasets and later reclassified into ten categories. All these data were resampled to fit the 10x10 km grid used as spatial reference for fire data. Climate and socioeconomic variables were then explored by means of MDS to find the extent to

  18. Analysis of full-waveform LiDAR data for classification of an orange orchard scene

    Science.gov (United States)

    Fieber, Karolina D.; Davenport, Ian J.; Ferryman, James M.; Gurney, Robert J.; Walker, Jeffrey P.; Hacker, Jorg M.

    2013-08-01

    Full-waveform laser scanning data acquired with a Riegl LMS-Q560 instrument were used to classify an orange orchard into orange trees, grass and ground using waveform parameters alone. Gaussian decomposition was performed on this data, captured during the National Airborne Field Experiment in November 2006, using a custom peak-detection procedure and a trust-region-reflective algorithm for fitting Gaussian functions. Calibration was carried out using waveforms returned from a road surface, and the backscattering coefficient γ was derived for every waveform peak. The processed data were then analysed according to the number of returns detected within each waveform and classified into three classes based on pulse width and γ. For single-peak waveforms the scatterplot of γ versus pulse width was used to distinguish between ground, grass and orange trees. In the case of multiple returns, the relationship between first (or first plus middle) and last return γ values was used to separate ground from other targets. Refinement of this classification, and further sub-classification into grass and orange trees, was performed using the γ versus pulse width scatterplots of last returns. In all cases the separation was carried out using a decision tree with empirical relationships between the waveform parameters. Ground points were successfully separated from orange tree points. The most difficult class to separate and verify was grass, but those points in general corresponded well with the grass areas identified in the aerial photography. The overall accuracy reached 91%, using photography and relative elevation as ground truth. The overall accuracy for two classes, orange tree and combined class of grass and ground, yielded 95%. Finally, the backscattering coefficient γ of single-peak waveforms was also used to derive reflectance values of the three classes. The reflectance of the orange tree class (0.31) and ground class (0.60) are consistent with published values at the
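
Gaussian decomposition of a waveform amounts to locating peaks and estimating an amplitude, position, and width for each return. The sketch below uses a crude FWHM-based width estimate on a simulated two-return waveform; the study itself refined such initial estimates with a trust-region-reflective least-squares fit, which is not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(10)

def gauss(t, a, mu, s):
    return a * np.exp(-0.5 * ((t - mu) / s) ** 2)

# Synthetic two-return waveform (e.g., canopy + ground) plus receiver noise.
t = np.arange(0.0, 60.0, 0.5)
wave = gauss(t, 1.0, 20.0, 2.0) + gauss(t, 0.6, 40.0, 3.0) \
     + rng.normal(0, 0.001, t.size)

def decompose(t, w, thresh=0.2):
    """Crude Gaussian decomposition: find local maxima above a threshold,
    then estimate sigma from the half-maximum crossings (FWHM = 2.355 sigma).
    Real pipelines refine these estimates with nonlinear least squares."""
    peaks = [i for i in range(1, len(w) - 1)
             if w[i] > thresh and w[i] >= w[i - 1] and w[i] > w[i + 1]]
    out = []
    for i in peaks:
        half = w[i] / 2.0
        lo, hi = i, i
        while lo > 0 and w[lo] > half:          # walk down the leading edge
            lo -= 1
        while hi < len(w) - 1 and w[hi] > half:  # walk down the trailing edge
            hi += 1
        out.append((w[i], t[i], (t[hi] - t[lo]) / 2.355))  # (amp, mu, sigma)
    return out

returns = decompose(t, wave)
```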

  19. Research design and statistics in biomechanics and motor control.

    Science.gov (United States)

    Mullineaux, D R; Bartlett, R M; Bennett, S

    2001-10-01

    Biomechanics and motor control researchers measure how the body moves and interacts with its environment. The aim of this review paper is to consider some key issues in research methods in biomechanics and motor control. The review is organized into four sections: proposing, conducting, analysing and reporting research. In the first of these, we emphasize the importance of defining a worthy research question and of planning the study before its implementation to prevent later difficulties in the analysis and interpretation of data. In the second section, we cover selection of trial sizes and suggest that using three trials or more may be beneficial to provide more 'representative' and valid data. The third section on analysis of data concentrates on effect size statistics, qualitative and numerical trend analysis and cross-correlations. As sample sizes are often small, the use of effect size is recommended to support the results of statistical significance testing. In using cross-correlations, we recommend that scatterplots of one variable against the other, with the identified time lag included, be inspected to confirm that the linear relationship assumption underpinning this statistic is met and, if appropriate, that a linearity transformation be applied. Finally, we consider important information related to the issues above that should be included when reporting research. We recommend reporting checks or corrections for violations of underpinning assumptions, and the effect of these checks or corrections, to assist in advancing knowledge in biomechanics and motor control.
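
    To illustrate the cross-correlation recommendation, the sketch below finds the time lag maximizing the Pearson correlation between two signals and reports the correlation of the lag-aligned series, whose scatterplot can then be inspected for linearity. The signals are synthetic stand-ins, not data from the review.

```python
import numpy as np

def xcorr_best_lag(x, y, max_lag):
    """Pearson r of pairs (x[t], y[t + lag]) for each lag; returns (best_lag, best_r)."""
    best_lag, best_r = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        if lag >= 0:
            a, b = x[: len(x) - lag], y[lag:]
        else:
            a, b = x[-lag:], y[: len(y) + lag]
        r = np.corrcoef(a, b)[0, 1]
        if r > best_r:
            best_lag, best_r = lag, r
    return best_lag, best_r

# Two kinematic signals where y lags x by 5 samples.
rng = np.random.default_rng(1)
t = np.arange(200)
x = np.sin(2 * np.pi * t / 40) + rng.normal(0, 0.05, t.size)
y = np.roll(x, 5)  # y(t) = x(t - 5)

lag, r = xcorr_best_lag(x, y, max_lag=10)
# After aligning at `lag`, plot the paired values against each other; a roughly
# linear scatter supports the assumption underpinning the cross-correlation.
```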

  20. Quantifying histone and RNA polymerase II post-translational modification dynamics in mother and daughter cells.

    Science.gov (United States)

    Stasevich, Timothy J; Sato, Yuko; Nozaki, Naohito; Kimura, Hiroshi

    2014-12-01

    Post-translational histone modifications are highly correlated with transcriptional activity, but the relative timing of these marks and their dynamic interplay during gene regulation remains controversial. To shed light on this problem and clarify the connections between histone modifications and transcription, we demonstrate how FabLEM (Fab-based Live Endogenous Modification labeling) can be used to simultaneously track histone H3 Lysine 9 acetylation (H3K9ac) together with RNA polymerase II Serine 2 and Serine 5 phosphorylation (RNAP2 Ser2ph/Ser5ph) in single living cells and their progeny. We provide a detailed description of the FabLEM methodology, including helpful tips for preparing and loading fluorescently conjugated antigen binding fragments (Fab) into cells for optimal results. We also introduce simple procedures for analyzing and visualizing FabLEM data, including color-coded scatterplots to track correlations between modifications through the cell cycle and temporal cross-correlation analysis to dissect modification dynamics. Using these methods, we find significant correlations that span cell generations, with a relatively strong correlation between H3K9ac and Ser5ph that appears to peak a few hours before mitosis and may reflect the bookmarking of genes for efficient re-initiation following mitosis. The techniques we have developed are broadly applicable and should help clarify how histone modifications dynamically contribute to gene regulation. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    Science.gov (United States)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the vulnerability deletion rates are zero, whereas the vulnerability arrival rates are always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
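
    A minimal sketch of this kind of distribution fitting with SciPy, on synthetic stand-ins for the arrival and deletion rates (the parameter values are assumptions for illustration, not values from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic daily counts standing in for vulnerability arrivals / deletions.
arrivals = rng.lognormal(mean=2.0, sigma=0.5, size=1000)
deletions = rng.exponential(scale=3.0, size=1000)

# Fit the candidate distributions (location fixed at 0, as for rate-like data).
shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
loc_e, scale_e = stats.expon.fit(deletions, floc=0)

# For scipy's lognorm, "shape" is sigma of the underlying normal and
# "scale" is exp(mu); recover mu for comparison with the generator.
sigma_hat, mu_hat = shape, np.log(scale)
```

The fitted parameters could then serve as priors in a subsequent Bayesian analysis, as the abstract suggests.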

  2. “compositions”: A unified R package to analyze compositional data

    Science.gov (United States)

    van den Boogaart, K. Gerald; Tolosana-Delgado, R.

    2008-04-01

    This contribution presents a new R package called "compositions". It provides tools to analyze amount or compositional data sets in four different geometries, each one associated with an R class: rplus (for amounts, or open compositions, in a real, classical geometry), aplus (for amounts in a logarithmic geometry), rcomp (for closed compositions in a real geometry) and acomp (for closed compositions in a logistic geometry, following a log-ratio approach). The package makes it possible to compare results obtained with these four approaches, since an analogous analysis can be performed according to each geometry, with minimal and straightforward modifications of the instructions. Besides these core classes, the package also includes basic features such as data transformations (e.g. the logarithm, or the additive logistic transform), basic statistics (both the classical ones and those developed in the log-ratio framework of compositional analysis), high-level graphics (such as ternary diagram matrices and scatterplots) and high-level analyses (e.g. principal component or cluster analysis). Results of these functions and analyses are also provided in a consistent way across the four geometries, to ease their comparison.
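
    The package itself is in R; purely to illustrate the log-ratio idea behind the acomp geometry, here is a small Python sketch of the closure operation and the centred log-ratio (clr) transform with its inverse:

```python
import numpy as np

def closure(x):
    """Close a composition so its parts sum to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum(axis=-1, keepdims=True)

def clr(x):
    """Centred log-ratio transform (the log-ratio geometry behind acomp)."""
    lx = np.log(closure(x))
    return lx - lx.mean(axis=-1, keepdims=True)

def clr_inv(z):
    """Back-transform from clr coordinates to a closed composition."""
    return closure(np.exp(z))

comp = closure([10.0, 30.0, 60.0])  # a 3-part composition
z = clr(comp)                       # coordinates where classical stats apply
back = clr_inv(z)                   # round-trips to the original composition
```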

  3. Longitudinal data on cortical thickness before and after working memory training

    Directory of Open Access Journals (Sweden)

    Claudia Metzler-Baddeley

    2016-06-01

    Full Text Available The data and supplementary information provided in this article relate to our research article “Task complexity and location specific changes of cortical thickness in executive and salience networks after working memory training” (Metzler-Baddeley et al., 2016) [1]. We provide cortical thickness and subcortical volume data derived from parieto-frontal cortical regions and the basal ganglia with the FreeSurfer longitudinal analyses stream (http://surfer.nmr.mgh.harvard.edu [2]) before and after Cogmed working memory training (Cogmed and Cogmed Working Memory Training, 2012) [3]. This article also provides supplementary information to the research article, i.e., within-group comparisons between baseline and outcome cortical thickness and subcortical volume measures, between-group tests of performance changes in cognitive benchmark tests (www.cambridgebrainsciences.com [4]), correlation analyses between performance changes in benchmark tests and training-related structural changes, correlation analyses between the time spent training and structural changes, a scatterplot of the relationship between cortical thickness measures derived from the occipital lobe as control region and the chronological order of the MRI sessions to assess potential scanner drift effects, and a post-hoc vertex-wise whole brain analysis with FreeSurfer Qdec (https://surfer.nmr.mgh.harvard.edu/fswiki/Qdec [5]).

  4. Sequential simulation approach to modeling of multi-seam coal deposits with an application to the assessment of a Louisiana lignite

    Science.gov (United States)

    Olea, Ricardo A.; Luppens, James A.

    2012-01-01

    There are multiple ways to characterize uncertainty in the assessment of coal resources, but not all of them are equally satisfactory. Increasingly, the tendency is toward borrowing from the statistical tools developed in the last 50 years for the quantitative assessment of other mineral commodities. Here, we briefly review the most recent of such methods and formulate a procedure for the systematic assessment of multi-seam coal deposits taking into account several geological factors, such as fluctuations in thickness, erosion, oxidation, and bed boundaries. A lignite deposit explored in three stages is used for validating models based on comparing a first set of drill holes against data from infill and development drilling. Results were fully consistent with reality, providing a variety of maps, histograms, and scatterplots characterizing the deposit and associated uncertainty in the assessments. The geostatistical approach was particularly informative in providing a probability distribution modeling deposit wide uncertainty about total resources and a cumulative distribution of coal tonnage as a function of local uncertainty.

  5. Making the most out of a hydrological model data set: Sensitivity analyses to open the model black-box

    Science.gov (United States)

    Borgonovo, E.; Lu, X.; Plischke, E.; Rakovec, O.; Hill, M. C.

    2017-09-01

    In this work, we investigate methods for gaining greater insight from hydrological model runs conducted for uncertainty quantification and model differentiation. We frame the sensitivity analysis questions in terms of the main purposes of sensitivity analysis: parameter prioritization, trend identification, and interaction quantification. For parameter prioritization, we consider variance-based sensitivity measures, sensitivity indices based on the L1-norm, the Kuiper metric, and the sensitivity indices of the DELSA methods. For trend identification, we investigate insights derived from graphing the one-way ANOVA sensitivity functions, the recently introduced CUSUNORO plots, and derivative scatterplots. For interaction quantification, we consider information delivered by variance-based sensitivity indices. We rely on the so-called given-data principle, in which results from a set of model runs are used to perform a defined set of analyses. One avoids using specific designs for each insight, thus controlling the computational burden. The methodology is applied to a hydrological model of a river in Belgium simulated using the well-established Framework for Understanding Structural Errors (FUSE) on five alternative configurations. The findings show that the integration of the chosen methods provides insights unavailable in most other analyses.
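
    The given-data principle can be illustrated with a simple binning estimator of the first-order variance-based sensitivity index, computed from a single Monte Carlo sample rather than a dedicated design. The test model and sample size below are assumptions for illustration, not the FUSE configurations of the paper.

```python
import numpy as np

def first_order_si(x, y, bins=20):
    """Given-data estimate of the first-order variance-based index
    S_i = Var(E[Y | X_i]) / Var(Y), using equal-count bins of x."""
    order = np.argsort(x)
    chunks = np.array_split(y[order], bins)
    cond_means = np.array([c.mean() for c in chunks])
    return cond_means.var() / y.var()

rng = np.random.default_rng(3)
n = 20000
x1, x2 = rng.uniform(size=n), rng.uniform(size=n)
y = 4.0 * x1 + x2  # analytic indices: S1 = 16/17 ~ 0.94, S2 = 1/17 ~ 0.06

s1 = first_order_si(x1, y)  # parameter prioritization: x1 dominates
s2 = first_order_si(x2, y)
```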

  6. The validity of using an electrocutaneous device for pain assessment in patients with cervical radiculopathy.

    Science.gov (United States)

    Abbott, Allan; Ghasemi-Kafash, Elaheh; Dedering, Åsa

    2014-10-01

    The purpose of this study was to evaluate the validity and preference for assessing pain magnitude with electrocutaneous testing (ECT) compared to the visual analogue scale (VAS) and Borg CR10 scale in men and women with cervical radiculopathy of varying sensory phenotypes. An additional purpose was to investigate ECT sensory and pain thresholds in men and women with cervical radiculopathy of varying sensory phenotypes. This is a cross-sectional study of 34 patients with cervical radiculopathy. Scatterplots and linear regression were used to investigate bivariate relationships between ECT, VAS and Borg CR10 methods of pain magnitude measurement as well as ECT sensory and pain thresholds. The use of the ECT pain magnitude matching paradigm for patients with cervical radiculopathy with normal sensory phenotype shows good linear association with arm pain VAS (R(2) = 0.39), neck pain VAS (R(2) = 0.38), arm pain Borg CR10 scale (R(2) = 0.50) and neck pain Borg CR10 scale (R(2) = 0.49) suggesting acceptable validity of the procedure. For patients with hypoesthesia and hyperesthesia sensory phenotypes, the ECT pain magnitude matching paradigm does not show adequate linear association with rating scale methods rendering the validity of the procedure as doubtful. ECT for sensory and pain threshold investigation, however, provides a method to objectively assess global sensory function in conjunction with sensory receptor specific bedside examination measures.
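
    The linear-association check reported here (an R² from a scatterplot-based bivariate regression) can be sketched as follows; the paired ratings below are simulated stand-ins, not patient data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)

# Paired pain-magnitude ratings standing in for ECT vs. VAS (0-100 scale).
ect = rng.uniform(0, 100, 34)             # hypothetical ECT-matched magnitudes
vas = 0.8 * ect + rng.normal(0, 12, 34)   # correlated VAS ratings

res = stats.linregress(ect, vas)
r_squared = res.rvalue ** 2  # the R^2 reported for each bivariate relationship
```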

  7. Road Map Inference: A Segmentation and Grouping Framework

    Directory of Open Access Journals (Sweden)

    Jia Qiu

    2016-07-01

    Full Text Available We propose a new segmentation and grouping framework for road map inference from GPS traces. We first present a progressive Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm with an orientation constraint to partition the whole point set of the traces into clusters that represent road segments. A new point cluster grouping algorithm, which recovers the road network according to the topological relationship and spatial proximity of the point clusters, is then developed. After generating the point clusters, the robust Locally-Weighted Scatterplot Smooth (LOWESS) method is used to extract their centerlines. We then propose to build the topological relationship of the centerlines with a Hidden Markov Model (HMM)-based map matching algorithm, and to assess the spatial proximity between point clusters by assuming that the distances from the points to the centerline follow a Gaussian distribution. Finally, the point clusters are grouped according to their topological relationship and spatial proximity to form strokes for recovering the road map. Experimental results show that our algorithm is robust to noise and varied sampling rates. The generated road maps show high geometric accuracy.
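
    A condensed sketch of the first two stages (clustering trace points into segments, then extracting LOWESS centerlines), using plain DBSCAN from scikit-learn and LOWESS from statsmodels on synthetic "road" points; the progressive variant, the orientation constraint and the grouping stage are omitted.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(5)

# Two noisy "roads": y ~ 0 and y ~ 10, standing in for GPS trace points.
x = rng.uniform(0, 100, 1000)
road = rng.integers(0, 2, 1000)
y = 10.0 * road + rng.normal(0, 0.4, 1000)
pts = np.column_stack([x, y])

# Step 1: partition points into road-segment clusters (label -1 = noise).
labels = DBSCAN(eps=2.5, min_samples=5).fit_predict(pts)

# Step 2: extract each cluster's centerline with a LOWESS smooth.
centerlines = {}
for k in set(labels) - {-1}:
    m = labels == k
    centerlines[k] = lowess(pts[m, 1], pts[m, 0], frac=0.3)  # columns: (x, smoothed y)
```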

  8. Investigation of coherent structures in a superheated jet using decomposition methods

    Science.gov (United States)

    Sinha, Avick; Gopalakrishnan, Shivasubramanian; Balasubramanian, Sridhar

    2016-11-01

    A superheated turbulent jet, commonly encountered in many engineering flows, is a complex two-phase mixture of liquid and vapor. The superposition of temporally and spatially evolving coherent vortical motions, known as coherent structures (CS), governs the dynamics of such a jet. Both POD and DMD are employed to analyze these vortical motions. PIV data are used in conjunction with the decomposition methods to analyze the CS in the flow. The experiments were conducted using water emanating into a tank containing homogeneous fluid at ambient conditions. Three inlet pressures were employed in the study, all at a fixed inlet temperature. 90% of the total kinetic energy in the mean flow is contained within the first five modes. The scatterplot for any two POD coefficients predominantly showed a circular distribution, representing a strong connection between the two modes. We speculate that the velocity and vorticity contours of the spatial POD basis functions show the presence of K-H instability in the flow. From DMD, eigenvalues away from the origin are observed for all the cases, indicating the presence of a non-oscillatory structure. Spatial structures are also obtained from DMD. The authors are grateful to Confederation of Indian Industry and General Electric India Pvt. Ltd. for partial funding of this project.
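
    Snapshot POD reduces to an SVD of the mean-subtracted data matrix; the sketch below uses a synthetic two-mode "flow" in place of PIV data and recovers the energy fraction captured by the leading modes. All numbers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic snapshot matrix: each column is one velocity snapshot,
# built from two coherent spatial modes plus weak noise.
nx, nt = 200, 80
xgrid = np.linspace(0, 1, nx)
mode1 = np.sin(np.pi * xgrid)
mode2 = np.sin(2 * np.pi * xgrid)
t = np.linspace(0, 10, nt)
snapshots = (np.outer(mode1, 3.0 * np.cos(t)) +
             np.outer(mode2, 1.0 * np.sin(t)) +
             0.05 * rng.normal(size=(nx, nt)))

# POD = SVD of the mean-subtracted snapshot matrix.
fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)
energy = s**2 / np.sum(s**2)  # kinetic-energy fraction per POD mode

captured = energy[:5].sum()  # energy captured by the first five modes
```

Scatterplots of pairs of temporal coefficients (rows of `Vt` scaled by `s`) are what the abstract inspects for circular distributions.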

  9. Prevalence and Losses in Quality-Adjusted Life Years of Child Health Conditions: A Burden of Disease Analysis.

    Science.gov (United States)

    Craig, Benjamin M; Hartman, John D; Owens, Michelle A; Brown, Derek S

    2016-04-01

    To estimate the prevalence and losses in quality-adjusted life years (QALYs) associated with 20 child health conditions. Using data from the 2009-2010 National Survey of Children with Special Health Care Needs, preference weights were applied to 14 functional difficulties to summarize the quality of life burden of 20 health conditions. Among the 14 functional difficulties, "a little trouble with breathing" had the highest prevalence (37.1 %), but amounted to a loss of just 0.16 QALYs from the perspective of US adults. Though less prevalent, "a lot of behavioral problems" and "chronic pain" were associated with the greatest losses (1.86 and 3.43 QALYs). Among the 20 conditions, allergies and asthma were the most prevalent but were associated with the least burden. Muscular dystrophy and cerebral palsy were among the least prevalent and most burdensome. Furthermore, a scatterplot shows the association between condition prevalence and burden. In child health, condition prevalence is negatively associated with quality of life burden from the perspective of US adults. Both should be considered carefully when evaluating the appropriate role for public health prevention and interventions.

  10. Monitoring gender remuneration inequalities in academia using biplots

    Directory of Open Access Journals (Sweden)

    IS Walters

    2008-06-01

    Full Text Available Gender remuneration inequalities at universities have been studied in various parts of the world. In South Africa, the responsibility largely rests with individual higher education institutions to establish levels of pay for male and female academic staff members. The multidimensional character of the gender wage gap includes gender differentials in research output, age, academic rank and qualifications. The aim in this paper is to demonstrate the use of modern biplot methodology for describing and monitoring changes in the gender remuneration gap over time. A biplot is considered as a multivariate extension of an ordinary scatterplot. Our case study includes the permanent fulltime academic staff at Stellenbosch University for the period 2002 to 2005. We constructed canonical variate analysis (CVA biplots with 90% alpha bags for the five-dimensional data collected for males and females in 2002 and 2005 aggregated over faculties as well as for each faculty separately. The biplots illustrate, for our case study, that rank, age, research output and qualifications are related to remuneration. The CVA biplots show narrowing, widening and constant gender remuneration gaps in different faculties.

  11. Is nitrogen loading in wastewater more important than phosphorus? A historic review of the relationship between algae and macrophyte biomass and wastewater nutrient loading in the Bow River

    Science.gov (United States)

    Taube, Nadine; He, Jianxun; Ryan, Cathy; Valeo, Caterina

    2014-05-01

    The role of nutrient loading in biomass growth in wastewater-impacted rivers is important in understanding how to most effectively optimize wastewater treatment to avoid excessive biomass growth in the receiving water body. Nutrient loading is also affected by the nature of the effluent mixing in the river. This paper relates ammonium (NH4), nitrate (NO3) and total phosphorus (TP) from a wastewater treatment plant (WWTP) to epilithic algae and macrophyte biomass to determine the impacts of the WWTP on the Bow River ecosystem in Calgary, Alberta. Annual macrophyte biomass data and WWTP effluent nutrient data were analyzed for the years 1981 to 2011. Locally Weighted Scatterplot Smoothing (LOWESS) was used to remove the influence of river discharge from the biomass. The LOWESS method indicates that macrophytes do not grow beyond a maximum annual discharge of 300 m3 s-1. Algae biomass was most significantly correlated with daily mean discharge on the sampling date, and the LOWESS method indicates that algae do not grow well beyond a daily mean discharge of 100 m3 s-1. Correlation analysis suggests that biomass in the Bow River is nitrogen limited. Epilithic algae are significantly correlated (p
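
    The discharge-correction step can be sketched with a LOWESS fit whose residuals represent discharge-corrected biomass; the exponential relationship and all numbers below are assumed for illustration, not the Bow River data.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(7)

# Hypothetical biomass vs. river discharge: biomass falls off with discharge.
discharge = rng.uniform(50, 400, 150)  # m3 s-1
biomass = 200.0 * np.exp(-discharge / 150.0) + rng.normal(0, 5, 150)

# LOWESS captures the discharge effect; the residuals are the
# "discharge-corrected" biomass used for subsequent nutrient correlations.
smooth = lowess(biomass, discharge, frac=0.3, return_sorted=False)
residual = biomass - smooth
```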

  12. Longitudinal data on cortical thickness before and after working memory training.

    Science.gov (United States)

    Metzler-Baddeley, Claudia; Caeyenberghs, Karen; Foley, Sonya; Jones, Derek K

    2016-06-01

    The data and supplementary information provided in this article relate to our research article "Task complexity and location specific changes of cortical thickness in executive and salience networks after working memory training" (Metzler-Baddeley et al., 2016) [1]. We provide cortical thickness and subcortical volume data derived from parieto-frontal cortical regions and the basal ganglia with the FreeSurfer longitudinal analyses stream (http://surfer.nmr.mgh.harvard.edu [2]) before and after Cogmed working memory training (Cogmed and Cogmed Working Memory Training, 2012) [3]. This article also provides supplementary information to the research article, i.e., within-group comparisons between baseline and outcome cortical thickness and subcortical volume measures, between-group tests of performance changes in cognitive benchmark tests (www.cambridgebrainsciences.com [4]), correlation analyses between performance changes in benchmark tests and training-related structural changes, correlation analyses between the time spent training and structural changes, a scatterplot of the relationship between cortical thickness measures derived from the occipital lobe as control region and the chronological order of the MRI sessions to assess potential scanner drift effects and a post-hoc vertex-wise whole brain analysis with FreeSurfer Qdec (https://surfer.nmr.mgh.harvard.edu/fswiki/Qdec [5]).

  13. Dental color matching: A comparison between visual and instrumental methods.

    Science.gov (United States)

    Igiel, Christopher; Weyhrauch, Michael; Wentaschek, Stefan; Scheller, Herbert; Lehmann, Karl Martin

    2016-01-01

    The aim of this study was to compare the agreement rate (%) and color difference (ΔE*ab) of three dental color-measuring devices with visual shade identification. Tooth color was determined by two operators, who were advised to agree on a single VITA classic shade tab. The Shadepilot (SP), CrystalEye (CE) and ShadeVision (SV) were used to measure tooth color. Statistical analyses included the agreement rate (%), color difference (ΔE*ab), the McNemar test (p=0.05), Student's t-test (p=0.05) and Bland-Altman scatterplots. The SP had an agreement of 56.3% with the visual shade determination, the CE 49.0% and the SV 51.3%. The ΔE*ab values between the visually and instrumentally selected shade tabs and the natural teeth were frequently above the threshold for acceptability. When the two methods were compared, the SP ΔE*ab values differed within the range of clinical acceptability.
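
    The ΔE*ab metric used here is the Euclidean distance between two colours in CIELAB space (CIE76). A minimal sketch, with hypothetical L*a*b* readings and an assumed acceptability threshold of 3.7:

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference dE*ab between two CIELAB colours."""
    return math.dist(lab1, lab2)

# Hypothetical L*, a*, b* readings for a shade tab vs. a natural tooth.
tab = (72.0, 1.5, 18.0)
tooth = (69.5, 2.0, 21.0)

de = delta_e_ab(tab, tooth)
acceptable = de <= 3.7  # an assumed clinical acceptability threshold
```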

  14. Non-Linear Characterisation of Cerebral Pressure-Flow Dynamics in Humans.

    Directory of Open Access Journals (Sweden)

    Saqib Saleem

    Full Text Available Cerebral metabolism is critically dependent on the regulation of cerebral blood flow (CBF), so it would be expected that vascular mechanisms that play a critical role in CBF regulation would be tightly conserved across individuals. However, the relationships between blood pressure (BP) and cerebral blood velocity fluctuations exhibit inter-individual variations consistent with heterogeneity in the integrity of CBF-regulating systems. Here we sought to determine the nature and consistency of dynamic cerebral autoregulation (dCA) during the application of oscillatory lower body negative pressure (OLBNP). In 18 volunteers we recorded BP and middle cerebral artery blood flow velocity (MCAv) and examined the relationships between BP and MCAv fluctuations during 0.03, 0.05 and 0.07 Hz OLBNP. dCA was characterised using projection pursuit regression (PPR) and locally weighted scatterplot smoother (LOWESS) plots. Additionally, we proposed a piecewise regression method to statistically determine the presence of a dCA curve, which was defined as the presence of a restricted autoregulatory plateau shouldered by pressure-passive regions. Results show that LOWESS has similar explanatory power to that of PPR. However, we observed heterogeneous patterns of dynamic BP-MCAv relations, with few individuals demonstrating clear evidence of a dCA central plateau. Thus, although BP explains a significant proportion of variance, dCA does not manifest as any single characteristic BP-MCAv function.

  15. Titan's surface from Cassini RADAR SAR and high resolution radiometry data of the first five flybys

    Science.gov (United States)

    Paganelli, F.; Janssen, M.A.; Stiles, B.; West, R.; Lorenz, R.D.; Lunine, J.I.; Wall, S.D.; Callahan, P.; Lopes, R.M.; Stofan, E.; Kirk, R.L.; Johnson, W.T.K.; Roth, L.; Elachi, C.; ,

    2007-01-01

    The first five Titan flybys with Cassini's Synthetic Aperture RADAR (SAR) and radiometer are examined with emphasis on the calibration and interpretation of the high-resolution radiometry data acquired during the SAR mode (SAR-radiometry). Maps of the 2-cm wavelength brightness temperature are obtained coincident with the SAR swath imaging, with spatial resolution approaching 6 km. A preliminary calibration shows that brightness temperature in these maps varies from 64 to 89 K. Surface features and physical properties derived from the SAR-radiometry maps and SAR imaging are strongly correlated; in general, we find that surface features with high radar reflectivity are associated with radiometrically cold regions, while surface features with low radar reflectivity correlate with radiometrically warm regions. We examined scatterplots of the normalized radar cross-section σ0 versus brightness temperature, finding differing signatures that characterize various terrains and surface features. Implications for the physical and compositional properties of these features are discussed. The results indicate that volume scattering is important in many areas of Titan's surface, particularly Xanadu, while other areas exhibit complex brightness temperature variations consistent with variable slopes or surface material and compositional properties. © 2007 Elsevier Inc.

  16. Titan's surface from the Cassini RADAR radiometry data during SAR mode

    Science.gov (United States)

    Paganelli, F.; Janssen, M.A.; Lopes, R.M.; Stofan, E.; Wall, S.D.; Lorenz, R.D.; Lunine, J.I.; Kirk, R.L.; Roth, L.; Elachi, C.

    2008-01-01

    We present initial results on the calibration and interpretation of the high-resolution radiometry data acquired during the Synthetic Aperture Radar (SAR) mode (SAR-radiometry) of the Cassini Radar Mapper during its first five flybys of Saturn's moon Titan. We construct maps of the brightness temperature at the 2-cm wavelength coincident with SAR swath imaging. A preliminary radiometry calibration shows that brightness temperature in these maps varies from 64 to 89 K. Surface features and physical properties derived from the SAR-radiometry maps and SAR imaging are strongly correlated; in general, we find that surface features with high radar reflectivity are associated with radiometrically cold regions, while surface features with low radar reflectivity correlate with radiometrically warm regions. We examined scatterplots of the normalized radar cross-section σ0 versus brightness temperature, outlining signatures that characterize various terrains and surface features. The results indicate that volume scattering is important in many areas of Titan's surface, particularly Xanadu, while other areas exhibit complex brightness temperature variations consistent with variable slopes or surface material and compositional properties. © 2007.

  17. Assessment of bone age in prepubertal healthy Korean children: Comparison among the Korean standard bone age chart, Greulich-Pyle method, and Tanner-Whitehouse method

    Energy Technology Data Exchange (ETDEWEB)

    Kim Jeong Rye; Lee, Young Seok; Yu, Jee Suk [Dankook University Hospital, Cheonan(Korea, Republic of)

    2015-02-15

    To compare the reliability of the Greulich-Pyle (GP) method, Tanner-Whitehouse 3 (TW3) method and Korean standard bone age chart (KS) in the evaluation of bone age of prepubertal healthy Korean children. Left hand-wrist radiographs of 212 prepubertal healthy Korean children aged 7 to 12 years, obtained for the evaluation of traumatic injury in the emergency department, were analyzed by two observers. Bone age was estimated using the GP method, TW3 method and KS, and was calculated in months. The correlation between the bone age measured by each method and the chronological age of each child was analyzed using the Pearson correlation coefficient and scatterplots. The three methods were compared using one-way analysis of variance. Significant correlations were found between chronological age and bone age estimated by all three methods, in the whole group and in each gender (R2 ranged from 0.87 to 0.9, p < 0.01). Although bone age estimated by KS was slightly closer to chronological age than those estimated by the GP and TW3 methods, the difference between the three methods was not statistically significant (p > 0.01). The KS, GP, and TW3 methods all show good reliability in the evaluation of bone age of prepubertal healthy Korean children, without significant differences between them. Any of the three is useful for evaluating bone age in prepubertal healthy Korean children.
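
    The statistics reported (Pearson correlation of each method with chronological age, and a one-way ANOVA across methods) can be sketched with SciPy; the simulated ages below are stand-ins, not the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)

# Hypothetical chronological ages (months) and bone-age estimates by 3 methods.
chron = rng.uniform(84, 156, 212)  # 7-12 years, in months
gp = chron + rng.normal(0, 8, 212)
tw3 = chron + rng.normal(2, 8, 212)
ks = chron + rng.normal(1, 8, 212)

# Pearson correlation of one method's estimates with chronological age.
r_gp = stats.pearsonr(chron, gp)[0]

# One-way ANOVA comparing the three methods' estimates.
f_stat, p_value = stats.f_oneway(gp, tw3, ks)
```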

  18. Evaluating the Effect of Style in Information Visualization.

    Science.gov (United States)

    Vande Moere, A; Tomitsch, M; Wimmer, C; Christoph, B; Grechenig, T

    2012-12-01

    This paper reports on a between-subject, comparative online study of three information visualization demonstrators that each displayed the same dataset by way of an identical scatterplot technique, yet were different in style in terms of visual and interactive embellishment. We validated stylistic adherence and integrity through a separate experiment in which a small cohort of participants assigned our three demonstrators to predefined groups of stylistic examples, after which they described the styles in their own words. From the online study, we discovered significant differences in how participants execute specific interaction operations, and in the types of insights that followed from them. However, in spite of significant differences in apparent usability, enjoyability and usefulness between the style demonstrators, no variation was found in the self-reported depth, expert-rated depth, confidence or difficulty of the resulting insights. Three different methods of insight analysis were applied, revealing how style impacts the creation of insights, ranging from higher-level pattern seeking to a more reflective and interpretative engagement with content, which is what underlies the patterns. As this study only forms the first step in determining how the impact of style in information visualization is best evaluated, we propose several guidelines and tips on how to gather, compare and categorize insights through an online evaluation study, particularly in terms of analyzing the concise yet wide variety of insights and observations in a trustworthy and reproducible manner.

  19. Dimension projection matrix/tree: interactive subspace visual exploration and analysis of high dimensional data.

    Science.gov (United States)

    Yuan, Xiaoru; Ren, Donghao; Wang, Zuchao; Guo, Cong

    2013-12-01

    For high-dimensional data, this work proposes two novel visual exploration methods to gain insights into the data aspect and the dimension aspect of the data. The first is a Dimension Projection Matrix, an extension of a scatterplot matrix. In the matrix, each row or column represents a group of dimensions, and each cell shows a dimension projection (such as MDS) of the data with the corresponding dimensions. The second is a Dimension Projection Tree, where every node is either a dimension projection plot or a Dimension Projection Matrix. Nodes are connected with links, and each child node in the tree covers a subset of the parent node's dimensions or a subset of the parent node's data items. While the tree nodes visualize the subspaces of dimensions or subsets of the data items under exploration, the matrix nodes enable cross-comparison between different combinations of subspaces. Both the Dimension Projection Matrix and the Dimension Projection Tree can be constructed algorithmically through automation, or manually through user interaction. Our implementation enables interactions such as drilling down to explore different levels of the data, merging or splitting the subspaces to adjust the matrix, and applying brushing to select data clusters. Our method enables simultaneously exploring data correlation and dimension correlation for data with high dimensions.
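
    A single cell of such a Dimension Projection Matrix can be sketched as an MDS embedding of the data restricted to a pair of dimension groups; scikit-learn's MDS stands in for whatever projection is used, and the data and groupings below are toy assumptions.

```python
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(9)

# Toy high-dimensional data: 60 samples, 8 dimensions in two groups of 4.
X = rng.normal(size=(60, 8))
dim_groups = [list(range(0, 4)), list(range(4, 8))]

def project_cell(X, dims, seed=0):
    """2-D MDS embedding of the data using only the given dimension subset."""
    return MDS(n_components=2, random_state=seed).fit_transform(X[:, dims])

# Build the matrix: diagonal cells project one group, off-diagonal cells
# project the union of two groups for cross-comparison.
cells = {}
for i, gi in enumerate(dim_groups):
    for j, gj in enumerate(dim_groups):
        dims = gi if i == j else gi + gj
        cells[(i, j)] = project_cell(X, dims)
```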

  20. Dependence and independence: Structure and inference.

    Science.gov (United States)

    Vexler, Albert; Chen, Xiwei; Hutson, Alan D

    2017-10-01

    Evaluations of relationships between pairs of variables, including testing for independence, are increasingly important. Erich Leo Lehmann noted that "the study of the power and efficiency of tests of independence is complicated by the difficulty of defining natural classes of alternatives to the hypothesis of independence." This paper presents a general review, discussion and comparison of classical and novel tests of independence. We investigate a broad spectrum of dependence structures with and without random effects, including those that are well addressed in both the applied and the theoretical scientific literatures, as well as scenarios in which the classical tests of independence may break down completely. Motivated by practical considerations, the impact of random effects in dependence structures is studied in the additive and multiplicative forms. A novel index of dependence is proposed based on the area under the Kendall plot. In conjunction with the scatterplot and the Kendall plot, the proposed method provides a comprehensive presentation of the data in terms of graphing and conceptualizing the dependence. We also present a graphical methodology based on heat maps to effectively compare the powers of various tests. Practical examples illustrate the use of various tests of independence and the graphical representations of dependence structures.
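
    The Kendall plot underlying the proposed index is built from rank-based concordance information. As a minimal, hedged sketch (not the authors' code), the classical Kendall tau statistic on which such plots rest can be computed by counting concordant and discordant pairs:

    ```python
    from itertools import combinations

    def kendall_tau(x, y):
        """Kendall's tau-a: (concordant - discordant) / total number of pairs."""
        n = len(x)
        concordant = discordant = 0
        for i, j in combinations(range(n), 2):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1   # the pair is ordered the same way in x and y
            elif s < 0:
                discordant += 1   # the pair is ordered oppositely
        return (concordant - discordant) / (n * (n - 1) / 2)

    # Perfectly monotone data -> tau = 1; perfectly reversed -> tau = -1.
    x = [1, 2, 3, 4, 5]
    print(kendall_tau(x, [2, 4, 6, 8, 10]))   # 1.0
    print(kendall_tau(x, [10, 8, 6, 4, 2]))   # -1.0
    ```

    Independent data would give tau near zero; the Kendall plot refines this single number into a full curve against the independence benchmark.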

  1. New Approaches for Calculating Moran’s Index of Spatial Autocorrelation

    Science.gov (United States)

    Chen, Yanguang

    2013-01-01

    Spatial autocorrelation plays an important role in geographical analysis; however, there is still room for improvement of this method. The formula for Moran’s index is complicated, and several basic problems remain to be solved. Therefore, I will reconstruct its mathematical framework using mathematical derivation based on linear algebra and present four simple approaches to calculating Moran’s index. Moran’s scatterplot will be ameliorated, and new test methods will be proposed. The relationship between the global Moran’s index and Geary’s coefficient will be discussed from two different vantage points: spatial population and spatial sample. The sphere of applications for both Moran’s index and Geary’s coefficient will be clarified and defined. One of the theoretical findings is that Moran’s index is a characteristic parameter of spatial weight matrices, so the selection of weight functions is very significant for autocorrelation analysis of geographical systems. A case study of 29 Chinese cities in 2000 will be employed to validate the innovative models and methods. This work is a methodological study, which will simplify the process of autocorrelation analysis. The results of this study will lay the foundation for the scaling analysis of spatial autocorrelation. PMID:23874592
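
    The quadratic-form view of Moran's index can be sketched in a few lines. This is the generic textbook formulation, I = (n / S0) · z'Wz / z'z with mean-centred values z and weight matrix W, not the paper's specific reconstruction:

    ```python
    def morans_i(values, weights):
        """Global Moran's I as a quadratic form: I = (n / S0) * z'Wz / z'z,
        where z holds mean-centred values and S0 is the sum of all weights."""
        n = len(values)
        mean = sum(values) / n
        z = [v - mean for v in values]
        s0 = sum(sum(row) for row in weights)
        num = sum(weights[i][j] * z[i] * z[j] for i in range(n) for j in range(n))
        den = sum(zi * zi for zi in z)
        return (n / s0) * num / den

    # Five locations on a line with binary contiguity weights (i adjacent to i + 1):
    n = 5
    W = [[1 if abs(i - j) == 1 else 0 for j in range(n)] for i in range(n)]
    print(morans_i([1, 2, 3, 4, 5], W))  # 0.5: smoothly rising values autocorrelate
    ```

    Values near +1 indicate positive spatial autocorrelation (similar neighbours), values below the expectation -1/(n-1) indicate negative autocorrelation.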

  3. Matisse: A Visual Analytics System for Exploring Emotion Trends in Social Media Text Streams

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A. [ORNL]; Drouhard, Margaret G. [ORNL]; Beaver, Justin M. [ORNL]; Pyle, Joshua M. [ORNL]; Bogen II, Paul L. [Google Inc.]

    2015-01-01

    Dynamically mining textual information streams to gain real-time situational awareness is especially challenging with social media systems, where throughput and velocity properties push the limits of a static analytical approach. In this paper, we describe an interactive visual analytics system, called Matisse, that aids with the discovery and investigation of trends in streaming text. Matisse addresses the challenges inherent to text stream mining through the following technical contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) interactive coordinated visualizations, and (4) a flexible drill-down interaction scheme that accesses multiple levels of detail. In addition to positive/negative sentiment prediction, Matisse provides fine-grained emotion classification based on Valence, Arousal, and Dominance dimensions and a novel machine learning process. Information from the sentiment/emotion analytics is fused with raw data and summary information to feed temporal, geospatial, term frequency, and scatterplot visualizations using a multi-scale, coordinated interaction model. After describing these techniques, we conclude with a practical case study focused on analyzing the Twitter sample stream during the week of the 2013 Boston Marathon bombings. The case study demonstrates the effectiveness of Matisse at providing guided situational awareness of significant trends in social media streams by orchestrating computational power and human cognition.

  4. Grassroots Numeracy

    Directory of Open Access Journals (Sweden)

    H.L. Vacher

    2016-07-01

    The readers and authors of papers in Numeracy compose a multidisciplinary grassroots interest group that is defining and illustrating the meaning, content, and scope of quantitative literacy (QL) and how it intersects with educational goals and practice. The 161 Numeracy papers produced by this QL community were downloaded 42,085 times in a total of 178 countries, including all 34 OECD countries, during 2015 and the first quarter of 2016. A scatterplot of normalized downloads per month vs. normalized total downloads over the eight years of Numeracy’s life allows identification of the 24 “most popular” of the 161 papers. These papers, which range over a wide landscape of subjects, were produced by a total of 41 authors, only nine of whom are mathematicians. The data clearly show that the QL community is not just a bunch of mathematicians talking amongst themselves; rather, the community is a vibrant mix of mathematicians and users and friends of mathematics. The heterogeneity of this grassroots community, and Numeracy’s commitment to serve it, dictates our mode of publication and the nature of our peer review. The journal is assertively open access for readers and free of page charges and processing fees for authors. The peer-review process is designed to provide constructive feedback to promote effective communication of the diverse activities and interests of a community that brings with it a multitude of publication cultures and experiences.

  5. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120

  6. Sampling and sensitivity analyses tools (SaSAT for computational modelling

    Directory of Open Access Journals (Sweden)

    Wilson David P

    2008-02-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT, as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel, but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling, and their usefulness in this context is demonstrated.
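
    "Efficient and equitable sampling of parameter space" is typically done with Latin hypercube sampling. A minimal sketch of the idea (not SaSAT's Matlab® implementation) draws one sample per equal-probability stratum in each dimension and shuffles the strata independently:

    ```python
    import random

    def latin_hypercube(n_samples, n_params, rng=random.Random(0)):
        """One sample per equal-probability stratum in each dimension, with
        strata randomly permuted so dimensions are paired independently."""
        columns = []
        for _ in range(n_params):
            strata = [(k + rng.random()) / n_samples for k in range(n_samples)]
            rng.shuffle(strata)
            columns.append(strata)
        return list(zip(*columns))  # rows are parameter vectors in [0, 1)^n_params

    pts = latin_hypercube(10, 2)
    # Each dimension has exactly one point in each interval [k/10, (k+1)/10):
    for d in range(2):
        print(sorted(int(p[d] * 10) for p in pts))  # [0, 1, 2, ..., 9]
    ```

    In practice each unit-interval coordinate is then mapped through the inverse CDF of that parameter's uncertainty distribution before running the model.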

  7. StreamMap: Smooth Dynamic Visualization of High-Density Streaming Points.

    Science.gov (United States)

    Li, Chenhui; Baciu, George; Yu, Han

    2017-02-13

    Interactive visualization of streaming points for real-time scatterplots and linear blending of correlation patterns is increasingly becoming the dominant mode of visual analytics for both big data and streaming data from active sensors and broadcasting media. To better visualize and interact with inter-stream patterns, it is generally necessary to smooth out gaps or distortions in the streaming data. Previous approaches either animate the points directly or present a sampled static heatmap. We propose a new approach, called StreamMap, to smoothly blend high-density streaming points and create a visual flow that emphasizes the density pattern distributions. In essence, we present three new contributions for the visualization of high-density streaming points. The first contribution is a density-based method called super kernel density estimation that aggregates streaming points using an adaptive kernel to solve the overlapping problem. The second contribution is a robust density morphing algorithm that generates several smooth intermediate frames for a given pair of frames. The third contribution is a trend representation design that can help convey the flow directions of the streaming points. The experimental results on three datasets demonstrate the effectiveness of StreamMap when dynamic visualization and visual analysis of trend patterns on streaming points are required.
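
    The paper's "super kernel density estimation" is an adaptive method of its own; as a hedged baseline illustrating the general idea of aggregating overlapping streaming points into a density field, a fixed-bandwidth Gaussian kernel accumulated onto a grid looks like this:

    ```python
    import math

    def kde_grid(points, grid_size=10, bandwidth=0.1):
        """Accumulate a Gaussian kernel from each point onto a [0,1]^2 grid."""
        grid = [[0.0] * grid_size for _ in range(grid_size)]
        for px, py in points:
            for gy in range(grid_size):
                for gx in range(grid_size):
                    cx = (gx + 0.5) / grid_size  # cell centre
                    cy = (gy + 0.5) / grid_size
                    d2 = (px - cx) ** 2 + (py - cy) ** 2
                    grid[gy][gx] += math.exp(-d2 / (2 * bandwidth ** 2))
        return grid

    # A cluster of streaming points near (0.25, 0.25): density peaks there.
    cluster = [(0.24, 0.26), (0.26, 0.24), (0.25, 0.25)]
    grid = kde_grid(cluster)
    peak = max((v, gx, gy) for gy, row in enumerate(grid) for gx, v in enumerate(row))
    print(peak[1], peak[2])  # 2 2: the cell centred at (0.25, 0.25)
    ```

    Rendering successive grids, with interpolated (morphed) frames in between, is the essence of turning raw point streams into a smooth density animation.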

  8. Uncertainty and sensitivity analyses for gas and brine migration at the Waste Isolation Pilot Plant, May 1992

    Energy Technology Data Exchange (ETDEWEB)

    Helton, J.C. [Arizona State Univ., Tempe, AZ (United States); Bean, J.E. [New Mexico Engineering Research Inst., Albuquerque, NM (United States); Butcher, B.M. [Sandia National Labs., Albuquerque, NM (United States); Garner, J.W.; Vaughn, P. [Applied Physics, Inc., Albuquerque, NM (United States); Schreiber, J.D. [Science Applications International Corp., Albuquerque, NM (United States); Swift, P.N. [Tech Reps, Inc., Albuquerque, NM (United States)

    1993-08-01

    Uncertainty and sensitivity analysis techniques based on Latin hypercube sampling, partial correlation analysis, stepwise regression analysis and examination of scatterplots are used in conjunction with the BRAGFLO model to examine two-phase flow (i.e., gas and brine) at the Waste Isolation Pilot Plant (WIPP), which is being developed by the US Department of Energy as a disposal facility for transuranic waste. The analyses consider either a single waste panel or the entire repository in conjunction with the following cases: (1) fully consolidated shaft, (2) system of shaft seals with panel seals, and (3) single shaft seal without panel seals. The purpose of this analysis is to develop insights on factors that are potentially important in showing compliance with applicable regulations of the US Environmental Protection Agency (i.e., 40 CFR 191, Subpart B; 40 CFR 268). The primary topics investigated are (1) gas production due to corrosion of steel, (2) gas production due to microbial degradation of cellulosics, (3) gas migration into anhydrite marker beds in the Salado Formation, (4) gas migration through a system of shaft seals to overlying strata, and (5) gas migration through a single shaft seal to overlying strata. Important variables identified in the analyses include initial brine saturation of the waste, stoichiometric terms for corrosion of steel and microbial degradation of cellulosics, gas barrier pressure in the anhydrite marker beds, shaft seal permeability, and panel seal permeability.

  9. Psychosocial wellbeing and physical health among Tamil schoolchildren in northern Sri Lanka.

    Science.gov (United States)

    Hamilton, Alexander; Foster, Charlie; Richards, Justin; Surenthirakumaran, Rajendra

    2016-01-01

    Mental disorders contribute to the global disease burden and have an increased prevalence among children in emergency settings. Good physical health is crucial for mental well-being, although physical health is multifactorial and the nature of this relationship is not fully understood. Using Sri Lanka as a case study, we assessed the baseline levels of, and the association between, mental health and physical health in Tamil schoolchildren. We conducted a cross-sectional study of mental and physical health in 10 schools in Kilinochchi town in northern Sri Lanka. All Grade 8 children attending the selected schools were eligible to participate in the study. Mental health was assessed using the Sri Lankan Index for Psychosocial Stress - Child Version. Physical health was assessed using Body Mass Index for age, height-for-age Z scores and the Multi-stage Fitness Test. The association between physical and mental health variables was assessed using scatterplots, and correlation was assessed using Pearson's r. There were 461 participants included in the study. Girls significantly outperformed boys in the mental health testing (t(459) = 2.201, p < 0.05). However, we identified a considerable physical health deficit in Tamil schoolchildren in Sri Lanka.

  10. Mann-Whitney Type Tests for Microarray Experiments: The R Package gMWT

    Directory of Open Access Journals (Sweden)

    Daniel Fischer

    2015-06-01

    We present the R package gMWT, which is designed for the comparison of several treatments (or groups) across a large number of variables. The comparisons are made using certain probabilistic indices (PIs). The PIs computed here tell how often pairs or triples of observations coming from different groups appear in a specific order of magnitude. Classical two-sample and several-sample rank test statistics, such as the Mann-Whitney-Wilcoxon, Kruskal-Wallis, or Jonckheere-Terpstra test statistics, are simple functions of these PIs. New test statistics for directional alternatives are also provided. The package gMWT can be used to calculate the variable-wise PI estimates, to illustrate their multivariate distribution and mutual dependence with joint scatterplot matrices, and to construct several classical and new rank tests based on the PIs. The aim of the paper is first to briefly explain the theory that is necessary to understand the behavior of the estimated PIs and the rank tests based on them. Second, the use of the package is described and illustrated with simulated and real data examples. It is stressed that the package provides a new flexible toolbox to analyze large gene or microRNA expression data sets, collected on microarrays or by other high-throughput technologies. The testing procedures can be used in an eQTL analysis, for example, as implemented in the package GeneticTools.
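
    The pairwise probabilistic index behind these tests is P(X < Y) + ½P(X = Y), which equals the Mann-Whitney U statistic rescaled to [0, 1]. A minimal sketch of its plug-in estimate (gMWT itself computes these variable-wise over thousands of genes):

    ```python
    def probabilistic_index(x, y):
        """Estimate PI = P(X < Y) + 0.5 * P(X = Y) over all (x, y) pairs;
        this is the Mann-Whitney U statistic divided by len(x) * len(y)."""
        scores = [0.5 if xi == yj else float(xi < yj) for xi in x for yj in y]
        return sum(scores) / len(scores)

    # Treated observations tend to exceed controls -> PI well above 0.5.
    control = [1.0, 2.0, 3.0]
    treated = [2.0, 3.0, 4.0]
    print(probabilistic_index(control, treated))  # 0.7777... (= 7/9)
    ```

    Under no group effect the PI is 0.5; the triple-wise indices for directional alternatives generalize this counting idea to ordered triples drawn from three groups.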

  11. Validation of aerosol optical depth and total ozone column in the ultraviolet retrieved from multifilter rotating shadowband radiometer

    Science.gov (United States)

    Liu, Chaoshun; Chen, Maosi; Gao, Wei

    2013-09-01

    Aerosol optical depth (AOD), aerosol single scattering albedo (SSA), and asymmetry factor (g) at seven ultraviolet wavelengths along with total column ozone (TOC) were retrieved based on Bayesian optimal estimation (OE) from the measurements of the UltraViolet Multifilter Rotating Shadowband Radiometer (UV-MFRSR) deployed at the Southern Great Plains (SGP) site during March to November in 2009. To assess the accuracy of the OE technique, the AOD retrievals are compared to both the Beer's law derived ones and the AErosol RObotic Network (AERONET) AOD product; and the TOC retrievals are compared to both the TOC product of the U.S. Department of Agriculture UV-B Monitoring and Research Program (USDA UVMRP) and the Ozone Monitoring Instrument (OMI) satellite data. The scatterplots of the AOD estimated by the OE method with the Beer's law derived ones and the collocated AERONET AOD product both show a very good agreement: the correlation coefficients vary between 0.98 and 0.99; the slopes range from 0.95 to 1.0; and the offsets are less than 0.02 at 368 nm. The comparison of TOC also shows a promising accuracy of the OE method: the standard deviations of the difference between the OE derived TOC and other TOC products are about 5 to 6 Dobson Units (DU). The validation of the OE retrievals on the selected dates suggests the OE technique has its merits and is a supplemental tool in analyzing UVMRP data.

  12. The validity of a structured interactive 24-hour recall in estimating energy and nutrient intakes in 15-month-old rural Malawian children.

    Science.gov (United States)

    Thakwalakwa, Chrissie M; Kuusipalo, Heli M; Maleta, Kenneth M; Phuka, John C; Ashorn, Per; Cheung, Yin Bun

    2012-07-01

    This study aimed to compare the nutritional intake values among 15-month-old rural Malawian children obtained by weighed food record (WFR) with those obtained by a modified 24-hour recall (mod 24-HR), and to develop an algorithm for adjusting mod 24-HR values so as to predict mean intake based on WFRs. The study participants were 169 15-month-old children who participated in a clinical trial. Food consumption on one day was observed and weighed (established criterion) by a research assistant to provide the estimates of energy and nutrient intakes. On the following day, another research assistant, blinded to the direct observation, conducted the structured interactive 24-hour recall (mod 24-HR) interview (test method). Paired t-tests and scatterplots were used to compare intake values of the two methods. The structured interactive 24-HR method tended to overestimate energy and nutrient intakes (each P < 0.001). The regression-through-the-origin method was used to develop adjustment algorithms. Results showed that multiplying the mean energy, protein, fat, iron, zinc and vitamin A intake estimates based on the test method by 0.86, 0.80, 0.68, 0.69, 0.72 and 0.76, respectively, provides an approximation of the mean values based on WFRs. © 2011 Blackwell Publishing Ltd.
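
    The regression-through-the-origin idea can be sketched directly: fitting y = b·x with no intercept gives b = Σxy / Σx², and b is the factor by which recall-based estimates are rescaled toward the weighed-record values. The numbers below are hypothetical, chosen only to echo the reported energy factor of 0.86; this is not the study's data or code:

    ```python
    def slope_through_origin(x, y):
        """Least-squares slope b for the no-intercept model y = b * x."""
        return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

    # Hypothetical paired energy intakes (kcal): recall (x) vs weighed record (y).
    recall  = [500.0, 600.0, 700.0]
    weighed = [430.0, 516.0, 602.0]
    b = slope_through_origin(recall, weighed)
    adjusted = [b * xi for xi in recall]  # recall values rescaled toward WFR scale
    print(round(b, 2))  # 0.86
    ```

    Forcing the line through the origin encodes the assumption that a zero recalled intake corresponds to a zero true intake, so the fit yields a single multiplicative correction factor per nutrient.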

  13. OpinionSeer: interactive visualization of hotel customer feedback.

    Science.gov (United States)

    Wu, Yingcai; Wei, Furu; Liu, Shixia; Au, Norman; Cui, Weiwei; Zhou, Hong; Qu, Huamin

    2010-01-01

    The rapid development of Web technology has resulted in an increasing number of hotel customers sharing their opinions on the hotel services. Effective visual analysis of online customer opinions is needed, as it has a significant impact on building a successful business. In this paper, we present OpinionSeer, an interactive visualization system that could visually analyze a large collection of online hotel customer reviews. The system is built on a new visualization-centric opinion mining technique that considers uncertainty for faithfully modeling and analyzing customer opinions. A new visual representation is developed to convey customer opinions by augmenting well-established scatterplots and radial visualization. To provide multiple-level exploration, we introduce subjective logic to handle and organize subjective opinions with degrees of uncertainty. Several case studies illustrate the effectiveness and usefulness of OpinionSeer on analyzing relationships among multiple data dimensions and comparing opinions of different groups. Aside from data on hotel customer feedback, OpinionSeer could also be applied to visually analyze customer opinions on other products or services.

  14. Open Plot Project: an open-source toolkit for 3-D structural data analysis

    Directory of Open Access Journals (Sweden)

    S. Tavani

    2011-05-01

    In this work we present the Open Plot Project, an open-source software package for structural data analysis, including a 3-D environment. The software includes many classical functionalities of structural data analysis tools, such as stereoplots, contouring, tensorial regression, scatterplots, histograms and transect analysis. In addition, efficient filtering tools are present, allowing the selection of data according to their attributes, including spatial distribution and orientation. This first alpha release represents a stand-alone toolkit for structural data analysis.

    The presence of a 3-D environment with digitising tools allows the integration of structural data with information extracted from georeferenced images to produce structurally validated dip domains. This, coupled with many import/export facilities, allows easy incorporation of structural analyses in workflows for 3-D geological modelling. Accordingly, the Open Plot Project is also a candidate structural add-on for 3-D geological modelling software.

    The software (for both the Windows and Linux operating systems), the User Manual, a set of example movies (complementary to the User Manual), and the source code are provided as a Supplement. We intend the publication of the source code to set the foundation for free, public software that, hopefully, the structural geologists' community will use, modify, and implement. The creation of additional public controls/tools is strongly encouraged.

  15. Morphological analysis of Trichomycterus areolatus Valenciennes, 1846 from southern Chilean rivers using a truss-based system (Siluriformes, Trichomycteridae).

    Science.gov (United States)

    Colihueque, Nelson; Corrales, Olga; Yáñez, Miguel

    2017-01-01

    Trichomycterus areolatus Valenciennes, 1846 is a small endemic catfish inhabiting the Andean river basins of Chile. In this study, the morphological variability of three T. areolatus populations, collected in two river basins from southern Chile, was assessed with multivariate analyses, including principal component analysis (PCA) and discriminant function analysis (DFA). It is hypothesized that populations must segregate morphologically from each other based on the river basin from which they were sampled, since each basin presents relatively distinct hydrological characteristics. Significant morphological differences among the three populations were found with PCA (ANOSIM test, r = 0.552, p < 0.0001) and DFA (Wilks's λ = 0.036, p < 0.01). PCA accounted for a total variation of 56.16% by the first two principal components. The first principal component (PC1) and PC2 explained 34.72 and 21.44% of the total variation, respectively. The scatterplot of the first two discriminant functions (DF1 on DF2) also validated the existence of three different populations. In group classification using DFA, 93.3% of the specimens were correctly classified into their original populations. Of the total of 22 transformed truss measurements, 17 exhibited highly significant (p < 0.01) differences among populations. The data support the existence of T. areolatus morphological variation across different rivers in southern Chile, likely reflecting the geographic isolation underlying population structure of the species.
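
    The PCA step reduces the truss measurements to a few principal components via an eigendecomposition of the covariance matrix. For two variables the eigenproblem has a closed form, which makes a compact, hedged illustration (the study itself used 22 measurements, not this toy data):

    ```python
    import math

    def pca_2d(xs, ys):
        """First principal axis of 2-D data from the sample covariance matrix
        (closed form for the 2x2 symmetric eigenproblem)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
        syy = sum((y - my) ** 2 for y in ys) / (n - 1)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
        theta = 0.5 * math.atan2(2 * sxy, sxx - syy)              # angle of PC1
        lam1 = (sxx + syy) / 2 + math.hypot((sxx - syy) / 2, sxy)  # largest eigenvalue
        explained = lam1 / (sxx + syy)  # fraction of total variance on PC1
        return theta, explained

    # Perfectly collinear measurements: PC1 lies at 45 degrees, explaining all variance.
    xs = [1.0, 2.0, 3.0, 4.0]
    ys = [1.0, 2.0, 3.0, 4.0]
    theta, explained = pca_2d(xs, ys)
    print(round(math.degrees(theta)), round(explained, 6))  # 45 1.0
    ```

    With real truss data, the proportion explained by the first two components (56.16% here) quantifies how much shape variation the 2-D scatterplot of scores preserves.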

  16. Experimental and theoretical high energy physics research. Annual grant progress report (FDP), January 15, 1993--January 14, 1993

    Energy Technology Data Exchange (ETDEWEB)

    Cline, D.B.

    1993-10-01

    Progress on seven tasks is reported. (I) UCLA hadronization model, antiproton decay, PEP4/9 e{sup +}e{sup {minus}} analysis: in addition to these topics, work on CP and CPT phenomenology at a {phi} factory and letters of support on the hadronization project are included. (II) ICARUS detector and rare B decays with hadron beams and colliders: developments are summarized and some typical events are shown; in addition, the RD5 collaboration at CERN and the asymmetric {phi} factory project are sketched. (III) Theoretical physics: Feynman diagram calculations in gauge theory; supersymmetric standard model; effects of quantum gravity in breaking of global symmetries; models of quark and lepton substructure; renormalized field theory; large-scale structure in the universe and particle-astrophysics/early-universe cosmology. (IV) H dibaryon search at BNL, kaon experiments (E799/KTeV) at Fermilab: project design and some scatterplots are given. (V) UCLA participation in the CDF experiment at Fermilab. (VI) Detectors for hadron physics at ultrahigh-energy colliders: scintillating fiber and visible light photon counter research. (VII) Administrative support and conference organization.

  17. P-Splines Using Derivative Information

    KAUST Repository

    Calderon, Christopher P.

    2010-01-01

    Time series associated with single-molecule experiments and/or simulations contain a wealth of multiscale information about complex biomolecular systems. We demonstrate how a collection of Penalized-splines (P-splines) can be useful in quantitatively summarizing such data. In this work, functions estimated using P-splines are associated with stochastic differential equations (SDEs). It is shown how quantities estimated in a single SDE summarize fast-scale phenomena, whereas variation between curves associated with different SDEs partially reflects noise induced by motion evolving on a slower time scale. P-splines assist in "semiparametrically" estimating nonlinear SDEs in situations where a time-dependent external force is applied to a single-molecule system. The P-splines introduced simultaneously use function and derivative scatterplot information to refine curve estimates. We refer to the approach as the PuDI (P-splines using Derivative Information) method. It is shown how generalized least squares ideas fit seamlessly into the PuDI method. Applications demonstrating how utilizing uncertainty information/approximations along with generalized least squares techniques improve PuDI fits are presented. Although the primary application here is in estimating nonlinear SDEs, the PuDI method is applicable to situations where both unbiased function and derivative estimates are available.

  18. Assessment of bone age in prepubertal healthy Korean children: comparison among the Korean standard bone age chart, Greulich-Pyle method, and Tanner-Whitehouse method.

    Science.gov (United States)

    Kim, Jeong Rye; Lee, Young Seok; Yu, Jeesuk

    2015-01-01

    To compare the reliability of the Greulich-Pyle (GP) method, Tanner-Whitehouse 3 (TW3) method and Korean standard bone age chart (KS) in the evaluation of bone age of prepubertal healthy Korean children. Left hand-wrist radiographs of 212 prepubertal healthy Korean children aged 7 to 12 years, obtained for the evaluation of traumatic injury in the emergency department, were analyzed by two observers. Bone age was estimated using the GP method, TW3 method and KS, and was calculated in months. The correlation between bone age measured by each method and the chronological age of each child was analyzed using Pearson correlation coefficients and scatterplots. The three methods were compared using one-way analysis of variance. Significant correlations were found between chronological age and bone age estimated by all three methods, in the whole group and in each gender (R(2) ranged from 0.87 to 0.9, p < 0.01). The KS, GP, and TW3 methods show good reliability in the evaluation of bone age of prepubertal healthy Korean children without significant difference between them. All are useful for evaluation of bone age in prepubertal healthy Korean children.
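
    The agreement statistics reported here rest on the Pearson correlation between estimated bone age and chronological age. A small sketch with hypothetical paired ages in months (the actual study had n = 212 per method):

    ```python
    import math

    def pearson_r(x, y):
        """Pearson product-moment correlation of two paired samples."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        ssx = sum((a - mx) ** 2 for a in x)
        ssy = sum((b - my) ** 2 for b in y)
        return cov / math.sqrt(ssx * ssy)

    # Hypothetical chronological vs estimated bone ages (months):
    chronological = [84, 96, 108, 120, 132, 144]
    bone_age = [80, 95, 110, 118, 135, 142]
    r = pearson_r(chronological, bone_age)
    print(round(r, 2))  # 0.99: near-linear agreement between the two ages
    ```

    Squaring r gives the R(2) values of 0.87 to 0.9 quoted in the abstract; a scatterplot of the same pairs shows whether the agreement holds uniformly across the age range.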

  19. Graphics for relatedness research.

    Science.gov (United States)

    Galván-Femenía, Iván; Graffelman, Jan; Barceló-I-Vidal, Carles

    2017-11-01

    Studies of relatedness have been crucial in molecular ecology over the last decades. Good evidence of this is the fact that studies of population structure, evolution of social behaviours, genetic diversity and quantitative genetics all involve relatedness research. The main aim of this article was to review the most common graphical methods used in allele sharing studies for detecting and identifying family relationships. Both IBS- and IBD-based allele sharing studies are considered. Furthermore, we propose two additional graphical methods from the field of compositional data analysis: the ternary diagram and scatterplots of isometric log-ratios of IBS and IBD probabilities. We illustrate all graphical tools with genetic data from the HGDP-CEPH diversity panel, using mainly 377 microsatellites genotyped for 25 individuals from the Maya population of this panel. We enhance all graphics with convex hulls obtained by simulation and use these to confirm the documented relationships. The proposed compositional graphics are shown to be useful in relatedness research, as they also single out the most prominent related pairs. The ternary diagram is advocated for its ability to display all three allele sharing probabilities simultaneously. The log-ratio plots are advocated as an attempt to overcome the problems with the Euclidean distance interpretation in the classical graphics. © 2017 The Authors. Molecular Ecology Resources Published by John Wiley & Sons Ltd.
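
    The proposed isometric log-ratio scatterplots map a 3-part composition (such as the IBD-sharing probabilities k0, k1, k2) from the ternary simplex to ordinary 2-D Euclidean coordinates. A hedged sketch using one standard orthonormal basis (conventions vary, so this need not match the authors' exact choice):

    ```python
    import math

    def ilr_3part(p1, p2, p3):
        """Isometric log-ratio coordinates of a 3-part composition
        (one standard orthonormal basis; all parts must be positive)."""
        z1 = math.log(p1 / p2) / math.sqrt(2)
        z2 = math.log(p1 * p2 / p3 ** 2) / math.sqrt(6)
        return z1, z2

    # The uniform composition (the barycentre of the ternary diagram) maps to the origin.
    print(ilr_3part(1 / 3, 1 / 3, 1 / 3))  # (0.0, 0.0)
    ```

    Because the ilr map is an isometry, Euclidean distances between transformed points faithfully reflect compositional distances, which addresses the distance-interpretation problem the authors note for the classical allele-sharing graphics.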

  20. Data visualization, bar naked: A free tool for creating interactive graphics.

    Science.gov (United States)

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  1. HiPiler: Visual Exploration of Large Genome Interaction Matrices with Interactive Small Multiples.

    Science.gov (United States)

    Lekschas, Fritz; Bach, Benjamin; Kerpedjiev, Peter; Gehlenborg, Nils; Pfister, Hanspeter

    2017-08-29

    This paper presents an interactive visualization interface, HiPiler, for the exploration and visualization of regions of interest in large genome interaction matrices. Genome interaction matrices approximate the physical distance of pairs of regions on the genome to each other and can contain up to 3 million rows and columns with many sparse regions. Regions of interest (ROIs) can be defined, e.g., by sets of adjacent rows and columns, or by specific visual patterns in the matrix. However, traditional matrix aggregation or pan-and-zoom interfaces fail to support search, inspection, and comparison of ROIs in such large matrices. In HiPiler, ROIs are first-class objects, represented as thumbnail-like "snippets". Snippets can be explored interactively and grouped, or laid out automatically in scatterplots or through dimension reduction methods. Snippets are linked to the entire navigable genome interaction matrix through brushing and linking. The design of HiPiler is based on a series of semi-structured interviews with 10 domain experts involved in the analysis and interpretation of genome interaction matrices. We describe six exploration tasks that are crucial for analysis of interaction matrices and demonstrate how HiPiler supports these tasks. We report on a user study with a series of data exploration sessions with domain experts to assess the usability of HiPiler as well as to demonstrate respective findings in the data.

  2. A study of bite force, part 1: Relationship to various physical characteristics.

    Science.gov (United States)

    Braun, S; Bantleon, H P; Hnat, W P; Freudenthaler, J W; Marcotte, M R; Johnson, B E

    1995-01-01

    A new device for measuring and recording bilateral bite force in the molar/premolar region has been developed. Because this new device is elastic and conforms to the occlusal surfaces of the teeth, and because the sensing element is relatively comfortable, it is believed that experimental subjects are less reluctant to register true maximal forces than in earlier studies. Potential correlations of maximum bite force to gender, age, weight, body type, stature, previous history of orthodontic treatment, presence of TMJ symptoms (jaw motion limitation, clicking with pain, or joint pain), or missing teeth were studied in a sample of 142 dental students. The mean maximum bite force of the sample was found to be 738 N, with a standard deviation of 209 N. The difference in mean maximum bite force between genders was found to be statistically significant, while the correlation coefficients for age, weight, stature, and body type were found to be low. Even so, all data scatterplots exhibited relatively positive relationships. Correlations of maximum bite force to an earlier history of orthodontic treatment or to the absence of teeth were not found. Subjects reporting TMJ symptoms did not exhibit a significantly different maximum bite force than subjects without symptoms.

  3. Modelling bivariate astronomical data with multiple components and non-linear relationships

    Science.gov (United States)

    Koen, C.; Bere, A.

    2017-11-01

    A common approach towards modelling bivariate scatterplots is decomposition into Gaussian components, i.e. Gaussian mixture modelling. This implicitly assumes linear relationships between the variables within each of the components in the mixture. An alternative, namely dependence modelling by mixtures of copulas, is advocated in this paper. This approach allows separate modelling of the univariate marginal distributions and the dependence, which can possibly be non-linear and/or asymmetric. It also accommodates the use of a variety of parametric families for modelling each component and each variable. The variety of dependence structures can be extended by introducing rotated versions of the copulas. Gaussian mixture modelling on the one hand, and separate modelling of univariate marginal distributions and dependence on the other, are illustrated by application to pulsar period versus period-derivative observations. Parameter estimation for mixtures of copulas is performed using the method of maximum likelihood, and selected copula models are subjected to non-parametric goodness-of-fit testing.
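
    As a concrete illustration of the copula approach contrasted with Gaussian mixtures above, the sketch below samples from a Clayton copula (one standard Archimedean family; not necessarily the family the authors select) by conditional inversion and checks the implied Kendall's tau. This is a minimal stdlib-only sketch, not the paper's code:

```python
import random

def clayton_pair(theta, rng):
    """Draw one (u, v) pair from a Clayton copula by conditional inversion."""
    u = rng.random()
    w = rng.random()  # auxiliary uniform for the conditional quantile
    v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

def kendall_tau(pairs):
    """Sample Kendall's tau: (concordant - discordant) / number of pairs."""
    n = len(pairs)
    s = 0
    for i in range(n):
        for j in range(i + 1, n):
            d = (pairs[i][0] - pairs[j][0]) * (pairs[i][1] - pairs[j][1])
            s += (d > 0) - (d < 0)
    return 2.0 * s / (n * (n - 1))

rng = random.Random(1)
theta = 2.0  # for a Clayton copula, tau = theta / (theta + 2) = 0.5
sample = [clayton_pair(theta, rng) for _ in range(500)]
tau_hat = kendall_tau(sample)
```

    The margins of (u, v) are uniform by construction; in practice arbitrary marginal distributions are layered on top via their quantile functions, which is exactly the separation of margins and dependence the abstract advocates.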

  4. Temporal Dynamics of Phlebotomine Sand Flies Population in Response to Ambient Temperature Variation, Bam, Kerman Province of Iran.

    Science.gov (United States)

    Halimi, Mansour; Cheghabaleki, Zahra Zarei; Modrek, Mohammad Jafari; Delavari, Mahdi

    Variations in climate conditions may have changed the dynamics of zoonotic cutaneous leishmaniasis (ZCL) and its agents, such as sand flies and reservoirs, in Bam, Kerman, a dry region of Iran. In this study we examine the seasonal and interannual dynamics of phlebotomine sand flies as a function of ambient temperature in Bam, Kerman, one of the main leishmaniasis prevalence areas in Iran. The MODIS land surface temperature product (LST; MODIS/Terra LST/E Monthly L3 Global 0.05Deg CMG [MOD11C3]) and land-based climatic data were used as explanatory variables. Monthly caught sand flies in Bam, Kerman, were used as the dependent variable. The temporal associations were first investigated by inspection of scatterplots and single-variable regression analysis. A multivariate linear regression model was then developed to reveal the association between ambient temperature and monthly sand fly abundance at a 95% confidence level (P < 0.05); the model explained more than 0.80 of the temporal dynamics of phlebotomine sand flies in Bam. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  5. Hardiness as a predictor of mental health and well-being of Australian army reservists on and after stability operations.

    Science.gov (United States)

    Orme, Geoffrey J; Kehoe, E James

    2014-04-01

    This study tested whether cognitive hardiness moderates the adverse effects of deployment-related stressors on the health and well-being of soldiers on short-tour (4-7 months) peacekeeping operations. Australian Army reservists (N = 448) were surveyed at the start, end, and up to 24 months after serving as peacekeepers in Timor-Leste or the Solomon Islands. They retained sound mental health throughout (Kessler 10, Post-Traumatic Checklist-Civilian, Depression Anxiety Stress Scale 42). Ratings of either traumatic or nontraumatic stress were low. Despite range restrictions, scores on the Cognitive Hardiness Scale moderated the relationship between deployment stressors and a composite measure of psychological distress. Scatterplots revealed an asymmetric pattern for hardiness scores and measures of psychological distress. When hardiness scores were low, psychological distress scores were widely dispersed. However, when hardiness scores were higher, psychological distress scores became concentrated at a uniformly low level. Reprint & Copyright © 2014 Association of Military Surgeons of the U.S.

  6. Morphological analysis of Trichomycterus areolatus Valenciennes, 1846 from southern Chilean rivers using a truss-based system (Siluriformes, Trichomycteridae)

    Directory of Open Access Journals (Sweden)

    Nelson Colihueque

    2017-09-01

    Trichomycterus areolatus Valenciennes, 1846 is a small endemic catfish inhabiting the Andean river basins of Chile. In this study, the morphological variability of three T. areolatus populations, collected in two river basins from southern Chile, was assessed with multivariate analyses, including principal component analysis (PCA) and discriminant function analysis (DFA). It is hypothesized that the populations segregate morphologically from each other based on the river basin from which they were sampled, since each basin presents relatively particular hydrological characteristics. Significant morphological differences among the three populations were found with PCA (ANOSIM test, r = 0.552, p < 0.0001) and DFA (Wilks's λ = 0.036, p < 0.01). The first two principal components accounted for 56.16% of the total variation: PC1 and PC2 explained 34.72% and 21.44%, respectively. The scatterplot of the first two discriminant functions (DF1 on DF2) also validated the existence of three different populations. In group classification using DFA, 93.3% of the specimens were correctly classified into their original populations. Of the total of 22 transformed truss measurements, 17 exhibited highly significant (p < 0.01) differences among populations. The data support the existence of T. areolatus morphological variation across different rivers in southern Chile, likely reflecting the geographic isolation underlying the population structure of the species.
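
    The PCA step can be illustrated in miniature for two truss measurements, where the eigenvalues of the 2x2 covariance matrix have a closed form. A minimal sketch with hypothetical measurements (not the study data):

```python
import math

def pca_2d(xs, ys):
    """PCA for two variables: eigenvalues of the 2x2 sample covariance matrix
    and the proportion of variance carried by the first principal component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    root = math.sqrt(max(tr * tr / 4.0 - det, 0.0))
    l1, l2 = tr / 2.0 + root, tr / 2.0 - root
    return l1, l2, l1 / (l1 + l2)

# Hypothetical truss distances (mm) for five specimens
xs = [10.2, 11.5, 9.8, 12.1, 10.9]
ys = [5.1, 5.9, 4.9, 6.3, 5.5]
l1, l2, pc1_share = pca_2d(xs, ys)
```

    With the full set of 22 truss measurements the same idea applies, only with a 22x22 covariance matrix and numerical eigendecomposition instead of the closed form.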

  7. Simultaneous mapping of multiple gene loci with pooled segregants.

    Directory of Open Access Journals (Sweden)

    Jürgen Claesen

    The analysis of polygenic, phenotypic characteristics such as quantitative traits or inheritable diseases remains an important challenge. It requires reliable scoring of many genetic markers covering the entire genome. The advent of high-throughput sequencing technologies provides a new way to evaluate large numbers of single nucleotide polymorphisms (SNPs) as genetic markers. Combining these technologies with pooling of segregants, as performed in bulked segregant analysis (BSA), should, in principle, allow the simultaneous mapping of multiple genetic loci present throughout the genome. The gene mapping process applied here consists of three steps: first, a controlled crossing of parents with and without a trait; second, selection based on phenotypic screening of the offspring, followed by mapping of short offspring sequences against the parental reference; and finally, detection of genetic markers such as SNPs, insertions and deletions with next-generation sequencing (NGS). Markers in close proximity to genomic loci that are associated with the trait have a higher probability of being inherited together. Hence, these markers are very useful for discovering the loci and the genetic mechanism underlying the characteristic of interest. Within this context, NGS produces binomial counts along the genome, i.e., the number of sequenced reads that match the SNP of the parental reference strain, which is a proxy for the number of individuals in the offspring that share the SNP with the parent. Genomic loci associated with the trait can thus be discovered by analyzing trends in the counts along the genome. We exploit the link between smoothing splines and generalized mixed models for estimating the underlying structure present in the SNP scatterplots.

  8. Telephone interview for cognitive status (TICS) screening for clinical trials of physical activity and cognitive training: the seniors health and activity research program pilot (SHARP-P) study.

    Science.gov (United States)

    Espeland, Mark A; Rapp, Stephen R; Katula, Jeff A; Andrews, Lee Ann; Felton, Deborah; Gaussoin, Sarah A; Dagenbach, Dale; Legault, Claudine; Jennings, Janine M; Sink, Kaycee M

    2011-02-01

    To examine the performance of the Telephone Interview for Cognitive Status (TICS) for identifying participants appropriate for trials of physical activity and cognitive training interventions. Volunteers (N=343), ages 70-85 years, who were being recruited for a pilot clinical trial on approaches to prevent cognitive decline, were administered TICS and required to score ≥ 31 prior to an invitation to attend clinic-based assessments. The frequencies of contraindications for physical activity and cognitive training interventions were tallied for individuals grouped by TICS scores. Relationships between TICS scores and other measures of cognitive function were described by scatterplots and correlation coefficients. Eligibility criteria to identify appropriate candidates for the trial interventions excluded 51.7% of the volunteers with eligible TICS scores. TICS scores above the eligibility cutoff were not strongly related to cognition or attendance at screening visits; however, overall enrollment yields were approximately half for participants with TICS=31 versus TICS=41, and increased in a graded fashion throughout the range of scores. Use of TICS to define eligibility criteria in trials of physical activity and cognitive training interventions may not be worthwhile in that many individuals with low scores would already be eliminated by intervention-specific criteria, and the relationship of TICS with clinic-based tests of cognitive function among appropriate candidates for these interventions may be weak. TICS may be most useful in these trials to identify candidates for oversampling in order to obtain a balanced cohort of participants at risk for cognitive decline. Copyright © 2010 John Wiley & Sons, Ltd.

  9. PIIKA 2: an expanded, web-based platform for analysis of kinome microarray data.

    Science.gov (United States)

    Trost, Brett; Kindrachuk, Jason; Määttänen, Pekka; Napper, Scott; Kusalik, Anthony

    2013-01-01

    Kinome microarrays are comprised of peptides that act as phosphorylation targets for protein kinases. This platform is growing in popularity due to its ability to measure phosphorylation-mediated cellular signaling in a high-throughput manner. While software for analyzing data from DNA microarrays has also been used for kinome arrays, differences between the two technologies and associated biologies previously led us to develop Platform for Intelligent, Integrated Kinome Analysis (PIIKA), a software tool customized for the analysis of data from kinome arrays. Here, we report the development of PIIKA 2, a significantly improved version with new features and improvements in the areas of clustering, statistical analysis, and data visualization. Among other additions to the original PIIKA, PIIKA 2 now allows the user to: evaluate statistically how well groups of samples cluster together; identify sets of peptides that have consistent phosphorylation patterns among groups of samples; perform hierarchical clustering analysis with bootstrapping; view false negative probabilities and positive and negative predictive values for t-tests between pairs of samples; easily assess experimental reproducibility; and visualize the data using volcano plots, scatterplots, and interactive three-dimensional principal component analyses. Also new in PIIKA 2 is a web-based interface, which allows users unfamiliar with command-line tools to easily provide input and download the results. Collectively, the additions and improvements described here enhance both the breadth and depth of analyses available, simplify the user interface, and make the software an even more valuable tool for the analysis of kinome microarray data. Both the web-based and stand-alone versions of PIIKA 2 can be accessed via http://saphire.usask.ca.

  10. Study of risk factors for gastric cancer by populational databases analysis.

    Science.gov (United States)

    Ferrari, Fangio; Reis, Marco Antonio Moura

    2013-12-28

    To study the association between the incidence of gastric cancer and populational exposure to risk/protective factors through an analysis of international databases. Open-access global databases concerning the incidence of gastric cancer and its risk/protective factors were identified through an extensive search on the Web. As its distribution was neither normal nor symmetric, the cancer incidence of each country was categorized according to ranges of percentile distribution. The association of each risk/protective factor with exposure was measured between the extreme ranges of the incidence of gastric cancer (under the 25th percentile and above the 75th percentile) by the use of the Mann-Whitney test, considering a significance level of 0.05. A variable amount of data omission was observed among all of the factors under study. A weak or nonexistent correlation between the incidence of gastric cancer and the study variables was shown by a visual analysis of scatterplot dispersion. In contrast, an analysis of categorized incidence revealed that the countries with the highest human development index (HDI) values had the highest rates of obesity in males and the highest consumption of alcohol, tobacco, fruits, vegetables and meat, which were associated with higher incidences of gastric cancer. There was no significant difference for the risk factors of obesity in females and fish consumption. Higher HDI values, coupled with a higher prevalence of male obesity and a higher per capita consumption of alcohol, tobacco, fruits, vegetables and meat, are associated with a higher incidence of gastric cancer based on an analysis of populational global data.

  11. Joint modelling of flood peaks and volumes: A copula application for the Danube River

    Directory of Open Access Journals (Sweden)

    Papaioannou George

    2016-12-01

    Flood frequency analysis is usually performed as a univariate analysis of flood peaks using a suitable theoretical probability distribution of the annual maximum flood peaks or peak-over-threshold values. However, other flood attributes, such as flood volume and duration, are also necessary for the design of hydrotechnical projects. In this study, the suitability of various copula families for a bivariate analysis of peak discharges and flood volumes has been tested. Streamflow data from selected gauging stations along the whole Danube River have been used. Kendall's rank correlation coefficient (tau) quantifies the dependence between flood peak discharge and flood volume settings. The methodology is applied to two different data samples: 1) annual maximum flood (AMF) peaks combined with annual maximum flow volumes of fixed durations of 5, 10, 15, 20, 25, 30 and 60 days, respectively (which can be regarded as a regime analysis of the dependence between the extremes of both variables in a given year), and 2) annual maximum flood (AMF) peaks with corresponding flood volumes (which is a typical choice for engineering studies). The bivariate modelling of the extracted peak discharge-flood volume couples is achieved with the use of the Ali-Mikhail-Haq (AMH), Clayton, Frank, Joe, Gumbel, Hüsler-Reiss, Galambos, Tawn, Normal, Plackett and FGM copula families. Scatterplots of the observed and simulated peak discharge-flood volume pairs and goodness-of-fit tests have been used to assess the overall applicability of the copulas, as well as to observe any changes in suitable models along the Danube River. The results indicate that for the second data sampling method, almost all of the considered Archimedean-class copula families perform better than the other copula families selected for this study, and that for the first method, only the upper-tail-flat copulas excel (except for the AMH copula, due to its inability to model stronger relationships).
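
    Copula fitting of peak-volume pairs typically begins by stripping the marginal distributions through rank-based pseudo-observations. A minimal stdlib sketch of that preprocessing step (the discharge and volume numbers are hypothetical, and ties are not handled):

```python
def pseudo_observations(sample):
    """Rank-transform a sample to (0, 1) pseudo-observations, the usual
    input for copula parameter estimation. Assumes no tied values."""
    n = len(sample)
    order = sorted(range(n), key=lambda i: sample[i])
    ranks = [0] * n
    for r, i in enumerate(order, start=1):
        ranks[i] = r
    return [r / (n + 1.0) for r in ranks]

peaks = [3100.0, 5400.0, 2900.0, 6100.0, 4000.0]   # hypothetical AMF peaks (m^3/s)
volumes = [410.0, 690.0, 380.0, 720.0, 500.0]      # hypothetical flood volumes (10^6 m^3)
u = pseudo_observations(peaks)
v = pseudo_observations(volumes)
```

    The (u, v) pairs are what a scatterplot of a fitted copula is compared against, since the copula describes dependence free of the marginal scales.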

  12. Season of sampling and season of birth influence serotonin metabolite levels in human cerebrospinal fluid.

    Directory of Open Access Journals (Sweden)

    Jurjen J Luykx

    BACKGROUND: Animal studies have revealed seasonal patterns in cerebrospinal fluid (CSF) monoamine (MA) turnover. In humans, no study had systematically assessed seasonal patterns in CSF MA turnover in a large set of healthy adults. METHODOLOGY/PRINCIPAL FINDINGS: Standardized amounts of CSF were prospectively collected from 223 healthy individuals undergoing spinal anesthesia for minor surgical procedures. The metabolites of serotonin (5-hydroxyindoleacetic acid, 5-HIAA), dopamine (homovanillic acid, HVA) and norepinephrine (3-methoxy-4-hydroxyphenylglycol, MHPG) were measured using high-performance liquid chromatography (HPLC). Concentration measurements by sampling and birth dates were modeled using a non-linear quantile cosine function and locally weighted scatterplot smoothing (LOESS, span = 0.75). The cosine model showed a unimodal season-of-sampling 5-HIAA zenith in April and a nadir in October (p-value of the amplitude of the cosine = 0.00050), with predicted maximum (PCmax) and minimum (PCmin) concentrations of 173 and 108 nmol/L, respectively, implying a 60% increase from trough to peak. Season of birth showed a unimodal 5-HIAA zenith in May and a nadir in November (p = 0.00339; PCmax = 172 and PCmin = 126). The non-parametric LOESS showed a similar pattern to the cosine in both the season-of-sampling and season-of-birth models, validating the cosine model. A final model including both sampling and birth months demonstrated that both sampling and birth seasons were independent predictors of 5-HIAA concentrations. CONCLUSION: In subjects without mental illness, 5-HT turnover shows circannual variation by season of sampling as well as season of birth, with peaks in spring and troughs in fall.
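
    A first-harmonic cosine model of the kind used for the season-of-sampling analysis can be sketched with closed-form Fourier coefficients for twelve evenly spaced monthly means. This is a simplified stand-in for the authors' non-linear quantile cosine fit, and the numbers below are synthetic:

```python
import math

def fit_annual_cosine(monthly):
    """Fit y ~ mean + A*cos(2*pi*t/n - phi) to n evenly spaced values covering
    one full year, using the closed-form first Fourier harmonic."""
    n = len(monthly)
    mean = sum(monthly) / n
    a = 2.0 / n * sum(y * math.cos(2 * math.pi * t / n) for t, y in enumerate(monthly))
    b = 2.0 / n * sum(y * math.sin(2 * math.pi * t / n) for t, y in enumerate(monthly))
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)  # month of the zenith = phase * n / (2*pi)
    return mean, amplitude, phase

# Synthetic seasonal signal: mean 140 nmol/L, amplitude 33, zenith in month 3
true = [140 + 33 * math.cos(2 * math.pi * (t - 3) / 12) for t in range(12)]
m, amp, ph = fit_annual_cosine(true)
```

    Recovering the zenith month from the fitted phase is what locates the April peak and October nadir in such a model.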

  13. Recall and recognition memory in amnesia: patients with hippocampal, medial temporal, temporal lobe or frontal pathology.

    Science.gov (United States)

    Kopelman, Michael D; Bright, Peter; Buckman, Joseph; Fradera, Alex; Yoshimasu, Haruo; Jacobson, Clare; Colchester, Alan C F

    2007-03-25

    The relationship between recall and recognition memory impairments was examined in memory-disordered patients with either hippocampal, medial temporal, more widespread temporal lobe or frontal pathology. The Hirst [Hirst, W., Johnson, M. K., Phelps, E. A., & Volpe, B. T. (1988). More on recognition and recall in amnesics. Journal of Experimental Psychology: Learning, Memory, & Cognition, 14, 758-762] technique for titrating exposure times was used to match recognition memory performance as closely as possible before comparing recall memory scores. Data were available from two different control groups given differing exposure times. Each of the patient groups showed poorer recall memory performance than recognition scores, proportionate to the difference seen in healthy participants. When patients' scores were converted to Z-scores, there was no significant difference between mean Z-recall and Z-recognition scores. When plotted on a scatterplot, the majority of the data-points indicating disproportionately low recall memory scores came from healthy controls or patients with pathology extending into the lateral temporal lobes, rather than from patients with pathology confined to the medial temporal lobes. Patients with atrophy extending into the parahippocampal gyrus (H+) performed worse than patients with atrophy confined to the hippocampi (H-); but, when H- patients were given a shorter exposure time (5s) and compared with H+ at a longer exposure (10s), their performance was virtually identical and did not indicate any disproportionate recall memory impairment in the H- group. Parahippocampal volumes on MRI correlated significantly with both recall and recognition memory. The possibility that findings were confounded by inter-stimulus artefacts was examined and rejected. These findings argue against the view that hippocampal amnesia or memory disorders in general are typically characterised by a disproportionate impairment in recall memory. 

  14. Stock identification of neon flying squid (Ommastrephes bartramii in the North Pacific Ocean on the basis of beak and statolith morphology

    Directory of Open Access Journals (Sweden)

    Zhou Fang

    2014-06-01

    Cephalopods are becoming increasingly important in global fisheries as a result of increased landings and are playing an important ecological role in the trophic dynamics of marine ecosystems. Ommastrephes bartramii is a pelagic cephalopod species with two widely distributed spawning stocks in the North Pacific Ocean. It is also a major fishing target for the Chinese squid-jigging fleets. Successful separation of these two spawning stocks is critical to fisheries management, but tends to be challenging because of their similar morphology. In this study we attempted to identify the stocks based on discriminant analyses of 9 morphological variables of the statolith and 12 variables of the beak measured for O. bartramii samples in the North Pacific. A significant difference was revealed in the standardized beak and statolith variables between sexes in the northeast (NE) stock (P < 0.05), whereas the northwest (NW) stock showed no significant difference in either sex for the statolith variables (P > 0.05). The same sex also revealed different patterns with different hard structures between the two stocks. In t-tests, females showed significant differences between stocks in statolith morphology (P < 0.05), but showed no difference between cohorts (P > 0.05) in beak morphometric variables. With the combination of the two standardized hard parts, the correct classification rate of stepwise discriminant analysis (SDA) was raised by nearly 20% compared with using only one structure, although overlaps of the NW stock were still found in the scatterplots. It is concluded that adding more appropriate hard-structure variables will effectively increase the success of separating geographic stocks by the SDA method.

  15. Alcohol as a risk factor for sudden infant death syndrome (SIDS).

    Science.gov (United States)

    Phillips, David P; Brewer, Kimberly M; Wadensweiler, Paul

    2011-03-01

    To test whether alcohol is a risk factor for sudden infant death syndrome (SIDS). US epidemiological study using computerized death certificates, linked birth and infant death dataset, and Fatality Analysis Reporting System. All SIDS cases (n = 129,090) and other infant deaths (n = 295,151) from 1973-2006; all persons involved in late-night alcohol-related crashes (n = 135,946) from 1994-2008. Three measures were used: the expected number of deaths on New Year versus the observed number (expected values were determined using a locally weighted scatterplot smoothing polynomial), the average number of weekend deaths versus the average number of weekday deaths, and the SIDS death rate for children of alcohol-consuming versus non-alcohol-consuming mothers. These measures indicate that the largest spikes in alcohol consumption and in SIDS (33%) occur on New Year, alcohol consumption and SIDS increase significantly on weekends, and children of alcohol-consuming mothers are much more likely to die from SIDS than are children of non-alcohol-consuming mothers. Alcohol consumption appears to be a risk factor for sudden infant death syndrome, although it is unclear whether alcohol is an independent risk factor, a risk factor only in conjunction with other known risk factors (like co-sleeping), or a proxy for other risk factors associated with occasions when alcohol consumption increases (like smoking). Our findings suggest that caretakers and authorities should be informed that alcohol impairs parental capacity and might be a risk factor for sudden infant death syndrome; in addition, future research should further explore possible connections between sudden infant death syndrome and alcohol. © 2010 The Authors, Addiction © 2010 Society for the Study of Addiction.
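
    A locally weighted scatterplot smoother of the kind used to form the expected number of deaths can be sketched as a tricube-weighted local average. The full LOWESS fits a local polynomial with robustness iterations; this stdlib sketch keeps only the weighting idea, and the daily counts below are synthetic:

```python
def tricube(d):
    """Tricube weight on a normalized distance d in [0, 1)."""
    return (1.0 - d ** 3) ** 3 if 0.0 <= d < 1.0 else 0.0

def lowess_point(xs, ys, x0, span=0.5):
    """Simplified LOWESS estimate at x0: tricube-weighted average over the
    span*n nearest points (assumes at least one point strictly inside the
    bandwidth so the weights do not all vanish)."""
    n = len(xs)
    k = max(2, int(span * n))
    h = sorted(abs(x - x0) for x in xs)[k - 1] or 1.0
    w = [tricube(abs(x - x0) / h) for x in xs]
    return sum(wi * yi for wi, yi in zip(w, ys)) / sum(w)

# Synthetic daily death counts with a weekend bump
days = [float(d) for d in range(30)]
deaths = [50.0 + (5.0 if d % 7 in (5, 6) else 0.0) for d in range(30)]
baseline = lowess_point(days, deaths, 15.0, span=0.5)
```

    Comparing an observed count against such a smoothed baseline is what makes a holiday spike (like the New Year excess) stand out.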

  16. Relative importance of P and N in macrophyte and epilithic algae biomass in a wastewater-impacted oligotrophic river.

    Science.gov (United States)

    Taube, Nadine; He, Jianxun; Ryan, M Cathryn; Valeo, Caterina

    2016-08-01

    The role of nutrient loading on biomass growth in wastewater-impacted rivers is important in order to effectively optimize wastewater treatment to avoid excessive biomass growth in the receiving water body. This paper directly relates wastewater treatment plant (WWTP) effluent nutrients (including ammonia (NH3-N), nitrate (NO3-N) and total phosphorus (TP)) to the temporal and spatial distribution of epilithic algae and macrophyte biomass in an oligotrophic river. Annual macrophyte biomass, epilithic algae data and WWTP effluent nutrient data from 1980 to 2012 were statistically analysed. Because discharge can affect aquatic biomass growth, locally weighted scatterplot smoothing (LOWESS) was used to remove the influence of river discharge from the aquatic biomass (macrophytes and algae) data before further analysis was conducted. The results from LOWESS indicated that aquatic biomass did not increase beyond site-specific threshold discharge values in the river. The LOWESS-estimated biomass residuals showed a variable response to different nutrients. Macrophyte biomass residuals showed a decreasing trend concurrent with enhanced nutrient removal at the WWTP and decreased effluent P loading, whereas epilithic algae biomass residuals showed greater response to enhanced N removal. Correlation analysis between effluent nutrient concentrations and the biomass residuals (both epilithic algae and macrophytes) suggested that aquatic biomass is nitrogen limited, especially by NH3-N, at most sampling sites. The response of aquatic biomass residuals to effluent nutrient concentrations did not change with increasing distance to the WWTP but was different for P and N, allowing for additional conclusions about nutrient limitation in specific river reaches. The data further showed that the mixing process between the effluent and the river has an influence on the spatial distribution of biomass growth.

  17. Phonation Quotient in Women: A Measure of Vocal Efficiency Using Three Aerodynamic Instruments.

    Science.gov (United States)

    Joshi, Ashwini; Watts, Christopher R

    2017-03-01

    The purpose of this study was to examine measures of vital capacity and phonation quotient across three age groups in women using three different aerodynamic instruments representing low-tech and high-tech options. This study has a prospective, repeated measures design. Fifteen women in each age group of 25-39 years, 40-59 years, and 60-79 years were assessed using maximum phonation time and vital capacity obtained from three aerodynamic instruments: a handheld analog windmill type spirometer, a handheld digital spirometer, and the Phonatory Aerodynamic System (PAS), Model 6600. Phonation quotient was calculated using vital capacity from each instrument. Analyses of variance were performed to test for main effects of the instruments and age on vital capacity and derived phonation quotient. Pearson product moment correlation was performed to assess measurement reliability (parallel forms) between the instruments. Regression equations, scatterplots, and coefficients of determination were also calculated. Statistically significant differences were found in vital capacity measures for the digital spirometer compared with the windmill-type spirometer and PAS across age groups. Strong positive correlations were present between all three instruments for both vital capacity and derived phonation quotient measurements. Measurement precision for the digital spirometer was lower than the windmill spirometer compared with the PAS. However, all three instruments had strong measurement reliability. Additionally, age did not have an effect on the measurement across instruments. These results are consistent with previous literature reporting data from male speakers and support the use of low-tech options for measurement of basic aerodynamic variables associated with voice production. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.
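
    The derived quantity here is simple: phonation quotient is vital capacity divided by maximum phonation time, and agreement between instruments can be checked with a Pearson correlation. A stdlib sketch with hypothetical values (not the study's measurements):

```python
import math

def phonation_quotient(vital_capacity_ml, max_phonation_time_s):
    """Phonation quotient (mL/s) = vital capacity / maximum phonation time."""
    return vital_capacity_ml / max_phonation_time_s

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical vital capacities (mL) from two instruments on the same speakers
windmill = [3200.0, 2800.0, 3500.0, 2600.0, 3000.0]
digital = [3150.0, 2750.0, 3400.0, 2700.0, 2950.0]
r = pearson_r(windmill, digital)
pq = phonation_quotient(3200.0, 18.0)
```

    A high r between instruments is the parallel-forms reliability the abstract reports; scatterplots of one instrument against another visualize the same comparison.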

  18. Vitamin D status and community-acquired pneumonia: results from the third National Health and Nutrition Examination Survey.

    Directory of Open Access Journals (Sweden)

    Sadeq A Quraishi

    Full Text Available To investigate the association between serum 25-hydroxyvitamin D [25(OH)D] level and history of community-acquired pneumonia (CAP). We identified 16,975 individuals (≥17 years) from the third National Health and Nutrition Examination Survey (NHANES III) with documented 25(OH)D levels. To investigate the association of 25(OH)D with history of CAP in these participants, we developed a multivariable logistic regression model, adjusting for demographic factors (age, sex, race, poverty-to-income ratio, and geographic location), clinical data (body mass index, smoking status, asthma, chronic obstructive pulmonary disease, congestive heart failure, diabetes mellitus, stroke, chronic kidney disease, neutropenia, and alcohol consumption), and season. Locally weighted scatterplot smoothing (LOWESS) was used to depict the relationship between increasing 25(OH)D levels and the cumulative frequency of CAP in the study cohort. The median serum 25(OH)D level was 24 ng/mL [interquartile range (IQR) 18-32]. 2.1% [95% confidence interval (CI): 1.9-2.3] of participants reported experiencing a CAP within one year of their participation in the national survey. After adjusting for demographic factors, clinical data, and season, 25(OH)D levels <30 ng/mL were associated with 56% higher odds of CAP [odds ratio 1.56; 95% confidence interval: 1.17-2.07] compared to levels ≥30 ng/mL. LOWESS analysis revealed a near-linear relationship between vitamin D status and the cumulative frequency of CAP up to 25(OH)D levels around 30 ng/mL. Among 16,975 participants in NHANES III, 25(OH)D levels were inversely associated with history of CAP. Randomized controlled trials are warranted to determine the effect of optimizing vitamin D status on the risk of CAP.
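
The LOWESS step used in this record can be sketched generically; this is a minimal smoother (tricube weights, local linear fits, no robustness iterations), not the study's implementation:

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Minimal locally weighted scatterplot smoother: for each point, fit a
    weighted linear regression to its nearest neighbours (tricube weights)
    and evaluate the local line at that point."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))      # neighbourhood size
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]             # k nearest neighbours
        h = d[idx].max()
        w = (1.0 - np.clip(d[idx] / (h if h > 0 else 1.0), 0, 1) ** 3) ** 3
        sw = np.sqrt(w)                     # weighted least squares via sqrt-weights
        X = np.column_stack([np.ones(k), x[idx]])
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y[idx] * sw, rcond=None)
        fitted[i] = beta[0] + beta[1] * x[i]
    return fitted
```

With a binary CAP indicator as `y`, the smoothed curve approximates how event frequency varies with 25(OH)D, which is how a near-linear segment up to ~30 ng/mL would show up.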

  19. Smoking trajectories among Koreans in Seoul and California: exemplifying a common error in age parameterization.

    Science.gov (United States)

    Allem, Jon-Patrick; Ayers, John W; Unger, Jennifer B; Irvin, Veronica L; Hofstetter, C Richard; Hovell, Melbourne F

    2012-01-01

    Immigration to a nation with a stronger anti-smoking environment has been hypothesized to make smoking less common. However, little is known about how environments influence risk of smoking across the lifecourse. Prior research suggested a linear decline in smoking over the lifecourse, but these associations may in fact be nonlinear. This study assessed possible nonlinear associations between age and smoking and examined how these associations differed by environment by comparing Koreans in Seoul, South Korea and Korean Americans in California, United States. Data were drawn from population-based telephone surveys of Korean adults in Seoul (N=500) and California (N=2,830) from 2001-2002. Locally weighted scatterplot smoothing (lowess) was used to approximate the association between age and smoking, followed by multivariable spline logistic regressions with adjustment for confounders to draw population inferences. Smoking differed across the lifecourse between Korean and Korean American men. The association between age and smoking peaked around 35 years among Korean and Korean American men. From 18 to 35, the probability of smoking was 57% higher (95%CI, 40 to 71) among Korean men versus 8% (95%CI, 3 to 19) higher among Korean American men. A similar difference in age after 35, from 40 to 57 years of age, was associated with a 2% (95%CI, 0 to 10) and 20% (95%CI, 16 to 25) lower probability of smoking among Korean and Korean American men, respectively. A nonlinear pattern was also observed among Korean American women. Social role transitions provide plausible explanations for the decline in smoking after 35. Investigators should be mindful of nonlinearities in age when attempting to understand tobacco use.
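
One common way to let the age slope change at a knot (here, age 35, where the record reports a peak) is a piecewise-linear spline basis entered into the logistic model. This is a generic sketch with a hypothetical function name, not the authors' exact parameterization:

```python
import numpy as np

def linear_spline_basis(age, knot=35.0):
    """Design matrix for a broken-stick age effect: intercept, age, and the
    hinge term max(age - knot, 0).  Including all three columns in a logistic
    regression lets the slope of age differ before and after the knot."""
    age = np.asarray(age, float)
    return np.column_stack([np.ones_like(age), age, np.maximum(age - knot, 0.0)])
```

Fitting the same model with and without the hinge column is a direct test of the "common error" the title refers to: forcing a single linear age term.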

  20. Is radiological shortening of the ramus a reliable guide to operative management of unilateral fractures of the mandibular condyle?

    Science.gov (United States)

    Kommers, Sofie; Moghimi, Meshkan; van de Ven, Lisanne; Forouzanfar, Tymour

    2014-07-01

    Several studies have published measurements of the height of the ramus on orthopantomographic (OPT) images of patients with unilateral fractures of the mandibular condyle as a possible quantitative measure for making decisions about treatment. However, we know of no studies that have described the accuracy and validity of such measurements. The aim of the present study was to assess shortening of the ramus in patients with such fractures and compare it with the differences found in a control group. Seventy-four patients and 74 controls were studied. The height of the ramus on the fractured side was less than that on the uninjured side, although the difference was not statistically significant (p=0.25). In the control group, 50 subjects (68%) had a difference in ramal height of more than 2 mm. Of 74 patients, 25 (34%) had a shorter, uninjured ramus on the opposite side. A Bland-Altman scatterplot showed 23 outliers (31%) among the patients, which exceeded the mean ± 1.96 SD of the control group. The interobserver and intraobserver reliability both showed excellent agreement for all measurements made. Shortening of the ramus can be measured on OPT images. However, in the control group there was a large mean difference in height. Among the patients, 25/74 (34%) also had an uninjured ramus on the opposite side that was shorter than that on the fractured side. Measurement of the difference in height on an OPT image cannot be relied on as an absolute indication for intervention. Copyright © 2014 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
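
The Bland-Altman quantities behind such a scatterplot are easy to compute; a minimal sketch with a hypothetical function name and hypothetical paired measurements:

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland-Altman agreement statistics for paired measurements: the mean
    difference (bias) and the mean +/- 1.96 SD limits of agreement that
    bound the expected scatter; points outside them are flagged as outliers."""
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Applied to control-group side differences, the limits define the "normal" asymmetry range against which the patients' 23 outliers were counted.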

  1. A new method for correlation analysis of compositional (environmental) data - a worked example.

    Science.gov (United States)

    Reimann, C; Filzmoser, P; Hron, K; Kynčlová, P; Garrett, R G

    2017-12-31

    Most data in environmental sciences and geochemistry are compositional. The very units used to report the data (e.g., μg/l, mg/kg, wt%) imply that the analytical results for each element are not free to vary independently of the other measured variables. This is often neglected in statistical analysis, where a simple log-transformation of the single variables is insufficient to put the data into an acceptable geometry. This also matters for bivariate data analysis and for correlation analysis, for which the data need to be appropriately log-ratio transformed. A new approach based on the isometric log-ratio (ilr) transformation, leading to so-called symmetric coordinates, is presented here. Summarizing the correlations in a heat-map gives a powerful tool for bivariate data analysis. The new method is demonstrated using a data set from a regional geochemical mapping project based on soil O- and C-horizon samples. Differences from 'classical' correlation analysis based on log-transformed data are highlighted. The fact that some expected strong positive correlations appear and remain unchanged even following a log-ratio transformation has probably led to the misconception that the special nature of compositional data can be ignored when working with trace elements. The example dataset is employed to demonstrate that using 'classical' correlation analysis and plotting XY diagrams (scatterplots) based on the original or simply log-transformed data can easily lead to severe misinterpretations of the relationships between elements. Copyright © 2017 Elsevier B.V. All rights reserved.
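
The closure effect that motivates log-ratio methods can be shown with simulated data; this is a generic illustration (simple pairwise log-ratio, not the paper's ilr/symmetric-coordinates construction):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw concentrations of three independent components
raw = rng.lognormal(mean=0.0, sigma=0.5, size=(500, 3))

# Closure: re-express each sample as proportions summing to 1
comp = raw / raw.sum(axis=1, keepdims=True)

r_raw = np.corrcoef(raw[:, 0], raw[:, 1])[0, 1]       # near zero (independent)
r_closed = np.corrcoef(comp[:, 0], comp[:, 1])[0, 1]  # pushed negative by closure

# A pairwise log-ratio removes the constant-sum constraint
logratio = np.log(comp[:, 0] / comp[:, 2])
```

Even though the raw components are independent, the closed parts correlate negatively purely because they must sum to one, which is exactly the misinterpretation risk the record describes for scatterplots of untransformed compositional data.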

  2. Prevalence of hyperuricemia and relation of serum uric acid with cardiovascular risk factors in a developing country

    Directory of Open Access Journals (Sweden)

    Shamlaye C

    2004-03-01

    Full Text Available Abstract Background The prevalence of hyperuricemia has rarely been investigated in developing countries. The purpose of the present study was to investigate the prevalence of hyperuricemia and the association between uric acid levels and the various cardiovascular risk factors in a developing country with high average blood pressures (the Seychelles, Indian Ocean; population mainly of African origin). Methods This cross-sectional health examination survey was based on a population random sample from the Seychelles. It included 1011 subjects aged 25 to 64 years. Blood pressure (BP), body mass index (BMI), waist circumference, waist-to-hip ratio, total and HDL cholesterol, serum triglycerides and serum uric acid were measured. Data were analyzed using scatterplot smoothing techniques and gender-specific linear regression models. Results The prevalence of a serum uric acid level >420 μmol/L in men was 35.2% and the prevalence of a serum uric acid level >360 μmol/L in women was 8.7%. Serum uric acid was strongly related to serum triglycerides in men as well as in women (r = 0.73 in men and r = 0.59 in women, p Conclusions This study shows that the prevalence of hyperuricemia can be high in a developing country such as the Seychelles. Besides alcohol consumption and the use of antihypertensive therapy, mainly diuretics, serum uric acid is markedly associated with parameters of the metabolic syndrome, in particular serum triglycerides. Considering the growing incidence of obesity and metabolic syndrome worldwide and the potential link between hyperuricemia and cardiovascular complications, more emphasis should be put on the evolving prevalence of hyperuricemia in developing countries.

  3. Enhanced Laws textures: A potential MRI surrogate marker of hepatic fibrosis in a murine model.

    Science.gov (United States)

    Li, Baojun; Jara, Hernan; Yu, Heishun; O'Brien, Michael; Soto, Jorge; Anderson, Stephan W

    2017-04-01

    To compare enhanced Laws textures derived from parametric proton density (PD) maps to other MRI surrogate markers (T2, PD, apparent diffusion coefficient (ADC)) in assessing degrees of liver fibrosis in an ex vivo murine model of hepatic fibrosis imaged using 11.7T MRI. This animal study was IACUC approved. Fourteen male C57BL/6 mice were divided into control and experimental groups. The latter were fed a 3,5-dicarbethoxy-1,4-dihydrocollidine (DDC) supplemented diet to induce hepatic fibrosis. Ex vivo liver specimens were imaged using an 11.7T scanner, from which the parametric PD, T2, and ADC maps were generated from spin-echo pulsed field gradient and multi-echo spin-echo acquisitions. A sequential enhanced Laws texture analysis was applied to the PD maps: automated dual-clustering algorithm, optimal thresholding algorithm, global grayscale correction, and Laws texture feature extraction. Degrees of fibrosis were independently assessed by digital image analysis (a.k.a. %Area Fibrosis). Scatterplot graphs comparing enhanced Laws texture features, T2, PD, and ADC values to degrees of fibrosis were generated and correlation coefficients were calculated. Hepatic fibrosis and the enhanced Laws texture features were strongly correlated, with higher %Area Fibrosis associated with higher Laws textures (r=0.89). Without the proposed enhancements, only a moderate correlation was detected between %Area Fibrosis and unenhanced Laws texture features (r=0.70). Correlation also existed between %Area Fibrosis and ADC (r=0.86), PD (r=0.65), and T2 (r=0.66). Higher degrees of hepatic fibrosis are associated with increased Laws textures. The proposed enhancements could significantly improve the accuracy of Laws texture features. Copyright © 2016 Elsevier Inc. All rights reserved.
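
Laws texture features are built from small separable 1-D kernels whose outer products form 2-D masks. The following is a generic sketch of one texture-energy measure (standard Laws kernels; the function name is hypothetical and this is not the authors' enhanced pipeline):

```python
import numpy as np

# Classic Laws 1-D kernels: level and edge (spot/ripple/wave are analogous)
L5 = np.array([1, 4, 6, 4, 1], float)
E5 = np.array([-1, -2, 0, 2, 1], float)

def laws_energy(img, k_row, k_col):
    """Filter img with the separable Laws mask outer(k_row, k_col) in
    'valid' mode and return the mean absolute response as one
    texture-energy value for the image."""
    mask = np.outer(k_row, k_col)
    win = np.lib.stride_tricks.sliding_window_view(img, mask.shape)
    response = (win * mask).sum(axis=(-2, -1))
    return np.abs(response).sum() / response.size
```

Because E5 sums to zero, masks containing an edge kernel respond only to intensity variation, which is what makes the resulting energies usable as texture (rather than brightness) features on the PD maps.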

  4. Cerebral blood flow SPET in transient global amnesia with automated ROI analysis by 3DSRT

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Ryo [Division of Nuclear Medicine, Nishi-Kobe Medical Center, Kohjidai 5-7-1, 651-2273, Nishi-ku, Kobe-City, Hyogo (Japan); Matsuda, Hiroshi [Department of Radiology, National Center Hospital for Mental, Nervous and Muscular Disorders, National Center of Neurology and Psychiatry, Tokyo (Japan); Yoshioka, Katsunori [Daiichi Radioisotope Laboratories, Ltd., Tokyo (Japan); Yonekura, Yoshiharu [Biomedical Imaging Research Center, University of Fukui, Fukui (Japan)

    2004-04-01

    The aim of this study was to determine the areas involved in episodes of transient global amnesia (TGA) by calculation of cerebral blood flow (CBF) using 3DSRT, a fully automated ROI analysis software package that we recently developed. Technetium-99m l,l-ethyl cysteinate dimer single-photon emission tomography (99mTc-ECD SPET) was performed during and after TGA attacks on eight patients (four men and four women; mean study interval, 34 days). The SPET images were anatomically standardized using SPM99 followed by quantification of 318 constant ROIs, grouped into 12 segments (callosomarginal, precentral, central, parietal, angular, temporal, posterior cerebral, pericallosal, lenticular nucleus, thalamus, hippocampus and cerebellum), in each hemisphere to calculate segmental CBF (sCBF) as the area-weighted mean value for each of the respective 12 segments based on the regional CBF in each ROI. Correlation of the intra- and post-episodic sCBF of each of the 12 segments of the eight patients was estimated by scatterplot graphical analysis and Pearson's correlation test with Fisher's Z-transformation. For the control, 99mTc-ECD SPET was performed on eight subjects (three men and five women) and repeated within 1 month; the correlation between the first and second sCBF values of each of the 12 segments was evaluated in the same way as for patients with TGA. Excellent reproducibility between the two sCBF values was found in all 12 segments of the control subjects. However, a significant correlation between intra- and post-episodic sCBF was not shown in the thalamus or angular segments of TGA patients. The present study was preliminary, but at least suggests that the thalamus and angular regions are closely involved in the symptoms of TGA. (orig.)

  5. PIIKA 2: an expanded, web-based platform for analysis of kinome microarray data.

    Directory of Open Access Journals (Sweden)

    Brett Trost

    Full Text Available Kinome microarrays are comprised of peptides that act as phosphorylation targets for protein kinases. This platform is growing in popularity due to its ability to measure phosphorylation-mediated cellular signaling in a high-throughput manner. While software for analyzing data from DNA microarrays has also been used for kinome arrays, differences between the two technologies and associated biologies previously led us to develop Platform for Intelligent, Integrated Kinome Analysis (PIIKA, a software tool customized for the analysis of data from kinome arrays. Here, we report the development of PIIKA 2, a significantly improved version with new features and improvements in the areas of clustering, statistical analysis, and data visualization. Among other additions to the original PIIKA, PIIKA 2 now allows the user to: evaluate statistically how well groups of samples cluster together; identify sets of peptides that have consistent phosphorylation patterns among groups of samples; perform hierarchical clustering analysis with bootstrapping; view false negative probabilities and positive and negative predictive values for t-tests between pairs of samples; easily assess experimental reproducibility; and visualize the data using volcano plots, scatterplots, and interactive three-dimensional principal component analyses. Also new in PIIKA 2 is a web-based interface, which allows users unfamiliar with command-line tools to easily provide input and download the results. Collectively, the additions and improvements described here enhance both the breadth and depth of analyses available, simplify the user interface, and make the software an even more valuable tool for the analysis of kinome microarray data. Both the web-based and stand-alone versions of PIIKA 2 can be accessed via http://saphire.usask.ca.

  6. Brightness of Solar Magnetic Elements As a Function of Magnetic Flux at High Spatial Resolution

    Science.gov (United States)

    Kahil, F.; Riethmüller, T. L.; Solanki, S. K.

    2017-03-01

    We investigate the relationship between the photospheric magnetic field of small-scale magnetic elements in the quiet Sun (QS) at disk center and the brightness at 214, 300, 313, 388, 397, and 525.02 nm. To this end, we analyzed spectropolarimetric and imaging time series acquired simultaneously by the Imaging Magnetograph eXperiment (IMaX) magnetograph and the SuFI filter imager on board the balloon-borne observatory SUNRISE during its first science flight in 2009, with high spatial and temporal resolution. We find a clear dependence of the contrast in the near ultraviolet and the visible on the line-of-sight component of the magnetic field, B_LOS, which is best described by a logarithmic model. This function effectively represents the relationship between the Ca II H-line emission and B_LOS and works better than the power-law fit adopted by previous studies. This, along with the high contrast reached at these wavelengths, will help with determining the contribution of small-scale elements in the QS to the irradiance changes for wavelengths below 388 nm. At all wavelengths, including the continuum at 525.40 nm, the intensity contrast does not decrease with increasing B_LOS. This result also strongly supports the fact that SUNRISE has resolved small strong magnetic field elements in the internetwork, resulting in constant contrasts for large magnetic fields in our continuum contrast at 525.40 nm versus B_LOS scatterplot, unlike the turnover obtained in previous observational studies. This turnover is due to the intermixing of the bright magnetic features with the dark intergranular lanes surrounding them.

  7. Temperature extremes in Western Europe and associated atmospheric anomalies

    Science.gov (United States)

    Carvalho, V. A.; Santos, J. A.

    2009-09-01

    This work's focal point is the analysis of temperature extremes over Western Europe in the period 1957-2007 and their relationship to large-scale anomalies in the atmospheric circulation patterns. The study is based on daily temperature time series recorded at a set of meteorological stations covering the target area. The large-scale anomalies are analyzed using data from the National Centers for Environmental Prediction reanalysis project. Firstly, a preliminary statistical analysis was undertaken in order to identify data gaps and erroneous values and to check the homogeneity of the time series, using not only elementary statistical approaches (e.g., chronograms, box-plots, scatter-plots), but also a set of non-parametric statistical tests particularly suitable for the analysis of monthly and seasonal mean temperature time series (e.g., Wald-Wolfowitz serial correlation test, Spearman and Mann-Kendall trend tests). Secondly, based on these results, a selection of the highest-quality time series was carried out. Aiming at identifying temperature extremes, we then proceeded to isolate months with temperature values above or below pre-selected thresholds based on the empirical distribution of each time series. In particular, thresholds are based on percentiles specifically computed for each individual temperature record (data adaptive) and not on fixed values. As a result, a calendar of extremely high and extremely low monthly mean temperatures is obtained and the large-scale atmospheric conditions during each extreme are analyzed. Several atmospheric fields are considered in this study (e.g., 2-m maximum and minimum air temperature, sea level pressure, geopotential height, zonal and meridional wind components, vorticity, relative humidity) at different isobaric levels. Results show remarkably different synoptic conditions for temperature extremes in different parts of Western Europe, highlighting the different dynamical mechanisms underlying their occurrence.
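
The data-adaptive thresholding described above can be sketched in a few lines (hypothetical function name and percentile choices; the record does not state which percentiles were used):

```python
import numpy as np

def extreme_months(monthly_temps, lower_pct=10, upper_pct=90):
    """Flag months whose mean temperature falls below/above thresholds taken
    from the series' own empirical distribution (data-adaptive percentiles),
    rather than from fixed absolute values."""
    t = np.asarray(monthly_temps, float)
    lo, hi = np.percentile(t, [lower_pct, upper_pct])
    return t < lo, t > hi
```

Computing the thresholds per station record, as here, is what makes the resulting calendar of extremes comparable across stations with very different climatologies.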

  8. Visualization and exploratory analysis of epidemiologic data using a novel space time information system

    Directory of Open Access Journals (Sweden)

    Kaufmann Andrew M

    2004-11-01

    Full Text Available Abstract Background Recent years have seen an expansion in the use of Geographic Information Systems (GIS) in environmental health research. In this field GIS can be used to detect disease clustering, to analyze access to hospital emergency care, to predict environmental outbreaks, and to estimate exposure to toxic compounds. Despite these advances, the inability of GIS to properly handle temporal information is increasingly recognised as a significant constraint. The effective representation and visualization of both spatial and temporal dimensions is therefore expected to significantly enhance our ability to undertake environmental health research using time-referenced geospatial data. Especially for diseases with long latency periods (such as cancer), the ability to represent, quantify and model individual exposure through time is a critical component of risk estimation. In response to this need a STIS (Space Time Information System) has been developed to visualize and analyze objects simultaneously through space and time. Results In this paper we present a "first use" of a STIS in a case-control study of the relationship between arsenic exposure and bladder cancer in southeastern Michigan. Individual arsenic exposure is reconstructed by incorporating spatiotemporal data including residential mobility and drinking water habits. The unique contribution of the STIS is its ability to visualize and analyze residential histories over different temporal scales. Participant information is viewed and statistically analyzed using dynamic views in which values of an attribute change through time. These views include tables, graphs (such as histograms and scatterplots), and maps. In addition, these views can be linked and synchronized for complex data exploration using cartographic brushing, statistical brushing, and animation. Conclusion The STIS provides new and powerful ways to visualize and analyze how individual exposure and associated

  9. Boundaries of schizoaffective disorder: revisiting Kraepelin.

    Science.gov (United States)

    Kotov, Roman; Leong, Shirley H; Mojtabai, Ramin; Erlanger, Ann C Eckardt; Fochtmann, Laura J; Constantino, Eduardo; Carlson, Gabrielle A; Bromet, Evelyn J

    2013-12-01

    Established nosology identifies schizoaffective disorder as a distinct category with boundaries separating it from mood disorders with psychosis and from schizophrenia. Alternative models argue for a single boundary distinguishing mood disorders with psychosis from schizophrenia (kraepelinian dichotomy) or for a continuous spectrum from affective to nonaffective psychosis. The aim was to identify natural boundaries within psychotic disorders by evaluating associations between symptom course and long-term outcome. The Suffolk County Mental Health Project cohort consists of first-admission patients with psychosis recruited from all inpatient units of Suffolk County, New York (72% response rate). In an inception cohort design, participants were monitored closely for 4 years after admission, and their symptom course was charted for 526 individuals; 10-year outcome was obtained for 413. Outcomes were the Global Assessment of Functioning (GAF) and other consensus ratings of study psychiatrists. We used nonlinear modeling (locally weighted scatterplot smoothing and spline regression) to examine links between 4-year symptom variables (ratio of nonaffective psychosis to mood disturbance, duration of mania/hypomania, depression, and psychosis) and 10-year outcomes. The nonaffective psychosis ratio exhibited a sharp discontinuity: 10 days or more of psychosis outside mood episodes predicted an 11-point decrement in GAF, consistent with the kraepelinian dichotomy. Duration of mania/hypomania showed 2 discontinuities demarcating 3 groups: mania absent, episodic mania, and chronic mania (manic/hypomanic >1 year). The episodic group had a better outcome compared with the mania absent and chronic mania groups (12-point and 8-point differences on GAF). Duration of depression and psychosis had linear associations with worse outcome. Our data support the kraepelinian dichotomy, although the study requires replication. A boundary between schizoaffective disorder and schizophrenia was not observed, which casts further doubt on the validity of the schizoaffective diagnosis.

  10. Correlation in Severity Between Glaucoma and Erectile Dysfunction.

    Science.gov (United States)

    Law, Geoffrey; Nathoo, Nawaaz A; Reiner, Ethan; Berkowitz, Jonathan; Warner, Simon J; Mikelberg, Frederick S

    2016-09-01

    To examine the association between open-angle glaucoma and erectile dysfunction (ED), and investigate the correlation in severity between these 2 conditions. Cross-sectional study with patient questionnaire and retrospective chart review. A total of 167 male patients over 40 years of age who attended ophthalmology clinic visits in Vancouver, British Columbia, Canada, participated in the study by providing written consent and responding to the survey. Patients with previous radiation or surgical prostate treatment were excluded, leaving final sample sizes of 61 glaucoma patients and 67 control patients. Presence and severity of ED was determined using a validated patient questionnaire (the International Index of Erectile Function questionnaire). Presence of glaucoma was based on previous clinical diagnosis, and severity was graded based on visual field index using a 30-2 visual field test with the SITA Standard protocol. Bivariate analysis examined the presence of ED in glaucoma patients versus controls. Risk factors including dyslipidemia, diabetes, hypertension, and smoking were adjusted for using multiple logistic regression. The association between glaucoma and ED severity was assessed with correlation and scatterplot analysis. Glaucoma was found to be a significant risk factor for ED in our population, with an odds ratio of 2.58 (95% confidence interval, 1.15-5.83). Severity of glaucoma and ED were significantly correlated (r=0.365, P=0.007). Our results demonstrate that there is a positive association between the presence of ED and the diagnosis of glaucoma and a positive association between the severity of ED and the severity of glaucoma.

  11. Assessment of the Second Mesiobuccal Root Canal in Maxillary First Molars: A Cone-beam Computed Tomographic Study.

    Science.gov (United States)

    Zhang, Yuerong; Xu, Hai; Wang, Dongmiao; Gu, Yongchun; Wang, Juan; Tu, Shuzhen; Qiu, Xiaohui; Zhang, Fuyu; Luo, Yao; Xu, Shi; Bai, Jianling; Simone, Grandini; Zhang, Guangdong

    2017-12-01

    The purpose of this study was to investigate the incidence and location of the second mesiobuccal (MB2) root canal of the maxillary first molar and the relationship between the presence of an MB2 canal and the distribution of canal orifices on the pulpal floor with the aid of cone-beam computed tomographic (CBCT) technology. A total of 1008 maxillary first molars (548 patients) were randomly selected and analyzed through CBCT imaging. The association between the incidence of MB2 canals and potential impacting factors including sex, side, age, and the distribution of the main root canal orifices on the pulpal floor was explored. The interorifice distances (ie, the length of a line between the center point of any 2 orifices) at the pulpal floor level were measured using Mimics 10.01 software (ImageWorks, Materialise, Belgium). The majority of 3-rooted maxillary first molars showed 2 root canals (85.4%) in the mesiobuccal root. The incidence of MB2 canals had no statistically significant difference between the left and right sides (P > .05) but had a significant association with the patients' sex and age (P 1.26) indicated a highly probable existence of an MB2 canal. In this study, no molar presented an MB2 canal with a distance ratio of less than 1.16, whereas all molars with a ratio greater than 1.37 presented an MB2 canal without exception. A Bland-Altman scatterplot showed great agreement between the distances of the main mesiobuccal and the distobuccal canal orifices and the second mesiobuccal and the distobuccal canal orifices. Understanding the incidence of MB2 canals and the distribution pattern of canal orifices on the pulpal floor may help clinicians to quickly identify and locate MB2 canals. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  12. Semiautomated and automated algorithms for analysis of the carotid artery wall on computed tomography and sonography: a correlation study.

    Science.gov (United States)

    Saba, Luca; Tallapally, Niranjan; Gao, Hao; Molinari, Filippo; Anzidei, Michele; Piga, Mario; Sanfilippo, Roberto; Suri, Jasjit S

    2013-04-01

    The purpose of this study was to compare automated and semiautomated algorithms for analysis of carotid artery wall thickness and intima-media thickness on multidetector row computed tomographic (CT) angiography and sonography, respectively, and to study the correlation between them. Twenty consecutive patients underwent multidetector row CT angiographic and sonographic analysis of carotid arteries (mean age, 66 years; age range, 59-79 years). The intima-media thickness of the 40 carotid arteries was measured with novel, dedicated automated software and by 4 observers who manually calculated the intima-media thickness. The carotid artery wall thickness was automatically estimated by using a specific algorithm and was also semiautomatically quantified. The correlation between groups was calculated by using the Pearson ρ statistic, and scatterplots were calculated. We evaluated intermethod agreement using Bland-Altman analysis. By comparing automated carotid artery wall thickness, automated intima-media thickness, semiautomated carotid artery wall thickness, and semiautomated intima-media thickness analyses, a statistically significant association was found, with the highest values obtained for the association between semiautomated and automated intima-media thickness analyses (Pearson ρ = 0.9; 95% confidence interval, 0.82-0.95; P = 0.0001). The lowest values were obtained for the association between semiautomated intima-media thickness and automated carotid artery wall thickness analyses (Pearson ρ = 0.44; 95% confidence interval, 0.15-0.66; P = 0.0047). In the Bland-Altman analysis, the best results were obtained by comparing the semiautomated and automated algorithms for the study of intima-media thickness, with an interval of -16.1% to +43.6%. The results of this preliminary study showed that carotid artery wall thickness and intima-media thickness can be studied with automated software, although the CT analysis needs to be further improved.

  13. Validating hospital antibiotic purchasing data as a metric of inpatient antibiotic use.

    Science.gov (United States)

    Tan, Charlie; Ritchie, Michael; Alldred, Jason; Daneman, Nick

    2016-02-01

    Antibiotic purchasing data are a widely used, but unsubstantiated, measure of antibiotic consumption. To validate this source, we compared purchasing data from hospitals and external medical databases with patient-level dispensing data. Antibiotic purchasing and dispensing data from internal hospital records and purchasing data from IMS Health were obtained for two hospitals between May 2013 and April 2015. Internal purchasing data were validated against dispensing data, and IMS data were compared with both internal metrics. Scatterplots of individual antimicrobial data points were generated; Pearson's correlation and linear regression coefficients were computed. A secondary analysis re-examined these correlations over shorter calendar periods. Internal purchasing data were strongly correlated with dispensing data, with correlation coefficients of 0.90 (95% CI = 0.83-0.95) and 0.98 (95% CI = 0.95-0.99) at hospitals A and B, respectively. Although dispensing data were consistently lower than purchasing data, this was attributed to a single antibiotic at both hospitals. IMS data were favourably correlated with, but underestimated, internal purchasing and dispensing data. This difference was accounted for by eight antibiotics for which direct sales from some manufacturers were not included in the IMS database. The correlation between purchasing and dispensing data was consistent across periods as short as 3 months, but not at monthly intervals. Both internal and external antibiotic purchasing data are strongly correlated with dispensing data. If outliers are accounted for appropriately, internal purchasing data could be used for cost-effective evaluation of antimicrobial stewardship programmes, and external data sets could be used for surveillance and research across geographical regions. © The Author 2015. Published by Oxford University Press on behalf of the British Society for Antimicrobial Chemotherapy. All rights reserved.
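
The correlation and regression step applied to each scatterplot of paired volumes can be sketched generically (hypothetical helper and inputs, one value per antimicrobial agent, not the study's code):

```python
import numpy as np

def pearson_and_slope(purchased, dispensed):
    """Pearson r and least-squares slope for paired purchasing vs dispensing
    volumes; r measures agreement in ranking/scaling, while a slope away
    from 1 indicates systematic over- or under-estimation."""
    x, y = np.asarray(purchased, float), np.asarray(dispensed, float)
    r = np.corrcoef(x, y)[0, 1]
    slope = np.polyfit(x, y, 1)[0]
    return r, slope
```

Reporting both quantities matters here: the record notes IMS data were well correlated with internal data yet still underestimated them, which shows up as a high r with a slope below 1.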

  14. Nutrient, organic carbon, and chloride concentrations and loads in selected Long Island Sound tributaries—Four decades of change following the passage of the Federal Clean Water Act

    Science.gov (United States)

    Mullaney, John R.

    2016-03-10

    Trends in long-term water-quality and streamflow data from 14 water-quality monitoring sites in Connecticut were evaluated for water years 1974–2013 and 2001–13, coinciding with implementation of the Clean Water Act of 1972 and the Connecticut Nitrogen Credit Exchange program, as part of an assessment of nutrient and chloride concentrations and loads discharged to Long Island Sound. In this study, conducted by the U.S. Geological Survey in cooperation with the Connecticut Department of Energy and Environmental Protection, data were evaluated using a recently developed methodology of weighted regressions with time, streamflow, and season. Trends in streamflow were evaluated using a locally weighted scatterplot smoothing method. Annual mean streamflow increased at 12 of the 14 sites, by an average of 8 percent during the entire study period, primarily in the summer months, and increased by an average of 9 percent in water years 2001–13, primarily during summer and fall months. Downward trends in flow-normalized nutrient concentrations and loads were observed during both periods for most sites for total nitrogen, total Kjeldahl nitrogen, nitrite plus nitrate nitrogen, total phosphorus, and total organic carbon. Average flow-normalized loads of total nitrogen decreased by 23.9 percent for the entire period and 10.9 percent for the period of water years 2001–13. Major factors contributing to decreases in flow-normalized loads and concentrations of these nutrients include improvements in wastewater treatment practices, declining atmospheric wet deposition of nitrogen, and changes in land management and land use.
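
The streamflow trends above are evaluated with locally weighted scatterplot smoothing (LOWESS). A bare-bones, pure-Python version of the idea (tricube weights, local linear fits, no robustness iterations) might look like the sketch below; the streamflow series is synthetic, not USGS data.

```python
# Minimal LOWESS sketch: for each x, fit a weighted least-squares line over
# its k nearest neighbours with tricube weights, and evaluate it at x.
import math

def lowess(x, y, frac=0.5):
    n = len(x)
    k = max(2, int(frac * n))
    fitted = []
    for xi in x:
        # the k-th nearest neighbour sets the local bandwidth h
        dists = sorted(abs(xj - xi) for xj in x)
        h = dists[k - 1] or 1e-12
        w = [max(0.0, 1 - (abs(xj - xi) / h) ** 3) ** 3 for xj in x]
        # weighted least-squares line through the window
        sw = sum(w)
        swx = sum(wi * xj for wi, xj in zip(w, x))
        swy = sum(wi * yj for wi, yj in zip(w, y))
        swxx = sum(wi * xj * xj for wi, xj in zip(w, x))
        swxy = sum(wi * xj * yj for wi, xj, yj in zip(w, x, y))
        denom = sw * swxx - swx * swx
        if abs(denom) < 1e-12:
            fitted.append(swy / sw)        # degenerate window: weighted mean
        else:
            b = (sw * swxy - swx * swy) / denom
            a = (swy - b * swx) / sw
            fitted.append(a + b * xi)
    return fitted

years = list(range(1974, 2014))
# synthetic annual mean streamflow: upward trend plus alternating noise
flow = [100 + 0.8 * (t - 1974) + ((-1) ** t) * 5 for t in years]
trend = lowess(years, flow, frac=0.4)
print(trend[0], trend[-1])
```

Production analyses would add the robustness iterations that downweight outliers; library implementations (e.g. statsmodels' `lowess`) handle that.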

  15. Environmental variations drive polyploid evolution in neotropical Eugenia species (Myrtaceae).

    Science.gov (United States)

    Silveira, R M; Machado, R M; Forni-Martins, E R; Verola, C F; Costa, I R

    2016-10-24

    Polyploidy is one of the most important mechanisms of speciation and diversification in plant evolution. Polyploidy results in genetic variation among individuals of the same species and even between populations, and may be responsible for differences in environmental tolerance between populations of the same species. This study determined chromosome numbers of Eugenia L. (Myrtaceae, x = 11) for 26 populations of 14 species by conventional cytogenetic techniques. Nine species (13 populations) were diploid (2n = 2x = 22), but diploid and/or polyploid cytotypes were found in the other five species (13 populations), with 2n = 33, 2n = 44, and 2n = 55. Data on chromosome number/ploidy level for other Eugenia species/populations were collected from the literature and included in this cytogeographic analysis. For each collection point (32 species and 62 populations), environmental variables were recorded using georeferencing techniques through the DIVA-GIS v.7.5 program. Environmental variables such as temperature, altitude, rainfall, solar radiation, soil type, and vegetation were analyzed with the R program, using Mann-Whitney and chi-square tests, principal component analysis, and graphic analyses, such as scatterplots, boxplots, and barplots. Polyploid and diploid populations had different spatial distribution patterns and were found in areas subjected to different environmental conditions. Polyploid individuals were collected from locations with more adverse environmental conditions, usually at higher elevations than the diploid individuals. Polyploidy allows species to occur at locations with varying environmental conditions. As diploidy and polyploidy occur under different environmental conditions, species with multiple cytotypes exhibit wide environmental tolerance.
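
Of the tests the authors list, the Mann-Whitney comparison is easy to sketch: rank, say, collection-site elevations of diploid versus polyploid populations together and compute the U statistic. The elevations below are invented, chosen so the two groups do not overlap at all.

```python
# Mann-Whitney U sketch comparing two small samples (average ranks for ties).
def mann_whitney_u(a, b):
    combined = sorted((v, g) for g, vals in enumerate((a, b)) for v in vals)
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        avg = (i + j + 1) / 2.0            # average of ranks i+1 .. j
        for k in range(i, j):
            ranks.setdefault(combined[k][0], avg)
        i = j
    r_a = sum(ranks[v] for v in a)         # rank sum of the first group
    return r_a - len(a) * (len(a) + 1) / 2.0

diploid_elev   = [350, 420, 300, 510, 390, 450]    # m a.s.l., hypothetical
polyploid_elev = [900, 1100, 850, 1200, 980, 1050]

u = mann_whitney_u(diploid_elev, polyploid_elev)
print("U =", u)   # U = 0: every polyploid site lies above every diploid site
```

U = 0 is the extreme case of complete separation; a U near `len(a) * len(b) / 2` would indicate no elevational difference between cytotypes. A p-value would come from the exact distribution or a normal approximation of U.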

  16. Hippocampal volumes are important predictors for memory function in elderly women

    Directory of Open Access Journals (Sweden)

    Adolfsdottir Steinunn

    2009-08-01

    Background: Normal aging involves a decline in cognitive function that has been shown to correlate with volumetric change in the hippocampus, and with genetic variability in the APOE gene. In the present study we utilize 3D MR imaging, genetic analysis and assessment of verbal memory function to investigate relationships between these factors in a sample of 170 healthy volunteers (age range 46–77 years). Methods: Brain morphometric analysis was performed with the automated segmentation workflow implemented in FreeSurfer. APOE genotype was determined with polymerase chain reaction (PCR) on DNA from whole blood. All individuals underwent extensive neuropsychological testing, including the California Verbal Learning Test-II (CVLT-II). To obtain robust and easily interpretable relationships between explanatory variables and verbal memory function, we applied the recent method of conditional inference trees in addition to scatterplot matrices and simple pairwise linear least-squares regression analysis. Results: APOE genotype had no significant impact on the CVLT results (scores on long-delay free recall, CVLT-LD) or the ICV-normalized hippocampal volumes. Hippocampal volumes were found to decrease with age, and a right-larger-than-left hippocampal asymmetry was also found. These findings are in accordance with previous studies. CVLT-LD score was shown to correlate with hippocampal volume. Multivariate conditional inference analysis showed that gender and left hippocampal volume largely dominated predictive values for CVLT-LD scores in our sample. Left hippocampal volume dominated predictive values for females but not for males. APOE genotype did not alter the model significantly, and age only partly influenced the results. Conclusion: Gender and left hippocampal volume are the main predictors of verbal memory function in normal aging. APOE genotype did not affect the results in any part of our analysis.

  17. Exploring the seasonality of reported treated malaria cases in Mpumalanga, South Africa.

    Directory of Open Access Journals (Sweden)

    Sheetal Prakash Silal

    South Africa, having met the World Health Organisation's pre-elimination criteria, has set a goal to achieve malaria elimination by 2018. Mpumalanga, one of three provinces where malaria transmission still occurs, has a malaria season subject to unstable transmission that is prone to sporadic outbreaks. As South Africa prepares to intensify efforts towards malaria elimination, there is a need to understand patterns in malaria transmission so that efforts may be targeted appropriately. This paper describes the seasonality of transmission by exploring the relationship between malaria cases and three potential drivers: rainfall, geography (physical location) and the source of infection (local/imported). Seasonal decomposition of the time series by locally estimated scatterplot smoothing (LOESS) is applied to the case data for the geographical and source-of-infection sub-groups. The relationship between cases and rainfall is assessed using a cross-correlation analysis. The malaria season was found to have a short period of no/low level of reported cases and a triple peak in reported cases between September and May; the three peaks occurring in October, January and May. The seasonal pattern of locally-sourced infection mimics the triple-peak characteristic of the total series while imported infections contribute mostly to the second and third peak of the season (Christmas and Easter, respectively). Geographically, Bushbuckridge municipality, which exhibits a different pattern of cases, contributed mostly to the first and second peaks in cases while Maputo province (Mozambique) experienced a similar pattern in transmission to the imported cases. Though rainfall lagged at 4 weeks was significantly correlated with malaria cases, this effect was dampened due to the growing proportion of imported cases since 2006. These findings may be useful as they enhance the understanding of the current incidence pattern and may inform mathematical models that enable one to
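
The abstract's seasonal decomposition uses LOESS (as in STL). As a hedged, far simpler stand-in, the sketch below splits a monthly case series into an overall level, a centred month-of-year seasonal component, and a remainder; the case counts are synthetic, with an exaggerated multi-peak season.

```python
# Classical-decomposition sketch: seasonal component = month-of-year means,
# centred so it sums to zero over a year. Data are synthetic monthly counts.
seasonal_shape = [0, 5, 2, 1, 0, 0, 0, 0, 3, 8, 4, 6]   # multi-peak year
years = 4
cases = [20 + s for _ in range(years) for s in seasonal_shape]

months = 12
seasonal = [sum(cases[m::months]) / years for m in range(months)]
overall = sum(cases) / len(cases)
seasonal = [s - overall for s in seasonal]                 # centre the component
remainder = [c - overall - seasonal[i % months] for i, c in enumerate(cases)]

print("overall level:", round(overall, 2))
print("seasonal peaks at months:",
      [m + 1 for m, s in enumerate(seasonal) if s > 2])
```

STL improves on this by letting the seasonal component evolve over the years (each month's subseries is LOESS-smoothed rather than averaged) and by separating a slowly varying trend, which matters for a series whose imported-case fraction grows over time.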

  18. Tweeting PP: an analysis of the 2015-2016 Planned Parenthood controversy on Twitter.

    Science.gov (United States)

    Han, Leo; Han, Lisa; Darney, Blair; Rodriguez, Maria I

    2017-12-01

    We analyzed Twitter tweets and Twitter-provided user data to give geographical, temporal and content insight into the use of social media in the Planned Parenthood video controversy. We randomly sampled the full Twitter repository (also known as the Firehose) (n=30,000) for tweets containing the phrase "planned parenthood" as well as the group-defining hashtags "#defundpp" and "#standwithpp." We used demographic content provided by the user and word analysis to generate charts, maps and timeline visualizations. Chi-square and t tests were used to compare differences in content, statistical references and dissemination strategies. From July 14, 2015, to January 30, 2016, 1,364,131 and 795,791 tweets contained "#defundpp" and "#standwithpp," respectively. Geographically, #defundpp and #standwithpp were disproportionately distributed to the US South and West, respectively. Word analysis found that early tweets predominantly used "sensational" words and that the proportion of "political" and "call to action" words increased over time. Scatterplots revealed that #standwithpp tweets were clustered and episodic compared to #defundpp. #standwithpp users were more likely to be female [odds ratio (OR) 2.2, confidence interval (CI) 2.0-2.4] and have fewer followers (median 544 vs. 1578, p < …). #defundpp users were more likely to link to websites (OR 1.8, CI 1.7-1.9) and to other online dialogs (mean 3.3 vs. 2.0, p < …). This research may inform proabortion efforts in terms of how information can be more effectively conveyed to the public. This study has implications for how the medical community interfaces with the public with regards to abortion. It highlights how social media are actively exploited instruments for information and message dissemination. Researchers, providers and advocates should be monitoring social media and addressing the public through these modern channels. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Climatology of aerosol optical depth in North-Central Oklahoma: 1992-2008

    Energy Technology Data Exchange (ETDEWEB)

    Michalsky, J.; Denn, F.; Flynn, C.; Hodges, G.; Kiedron, P.; Koontz, A.; Schlemmer, J.; Schwartz, S. E.

    2010-04-01

    Aerosol optical depth (AOD) has been measured at the Atmospheric Radiation Measurement Program central facility near Lamont, Oklahoma, since the fall of 1992. Most of the data presented are from the multifilter rotating shadowband radiometer, a narrow-band, interference-filter Sun radiometer with five aerosol bands in the visible and near infrared; however, AOD measurements have been made simultaneously and routinely at the site by as many as three different types of instruments, including two pointing Sun radiometers. Scatterplots indicate high correlations and small biases consistent with earlier comparisons. The early part of this 16 year record had a disturbed stratosphere with residual Mt. Pinatubo aerosols, followed by the cleanest stratosphere in decades. As such, the last 13 years of the record reflect changes that have occurred predominantly in the troposphere. The field calibration technique is briefly described and compared to Langley calibrations from Mauna Loa Observatory. A modified cloud-screening technique is introduced that increases the number of daily averaged AODs retrieved annually to about 250 days compared with 175 days when a more conservative method was employed in earlier studies. AODs are calculated when the air mass is less than six; that is, when the Sun's elevation is greater than 9.25{sup o}. The more inclusive cloud screen and the use of most of the daylight hours yield a data set that can be used to more faithfully represent the true aerosol climate for this site. The diurnal aerosol cycle is examined month-by-month to assess the effects of an aerosol climatology on the basis of infrequent sampling such as that from satellites.

  20. Climatology of aerosol optical depth in north-central Oklahoma: 1992–2008

    Energy Technology Data Exchange (ETDEWEB)

    Michalsky, Joseph; Denn, Frederick; Flynn, Connor; Hodges, Gary; Kiedron, Piotr; Koontz, Annette; Schlemmer, James; Schwartz, Stephen E.

    2010-04-13

    Aerosol optical depth (AOD) has been measured at the Atmospheric Radiation Measurement Program central facility near Lamont, Oklahoma, since the fall of 1992. Most of the data presented are from the multifilter rotating shadowband radiometer, a narrow-band, interference-filter Sun radiometer with five aerosol bands in the visible and near infrared; however, AOD measurements have been made simultaneously and routinely at the site by as many as three different types of instruments, including two pointing Sun radiometers. Scatterplots indicate high correlations and small biases consistent with earlier comparisons. The early part of this 16 year record had a disturbed stratosphere with residual Mt. Pinatubo aerosols, followed by the cleanest stratosphere in decades. As such, the last 13 years of the record reflect changes that have occurred predominantly in the troposphere. The field calibration technique is briefly described and compared to Langley calibrations from Mauna Loa Observatory. A modified cloud-screening technique is introduced that increases the number of daily averaged AODs retrieved annually to about 250 days compared with 175 days when a more conservative method was employed in earlier studies. AODs are calculated when the air mass is less than six; that is, when the Sun’s elevation is greater than 9.25°. The more inclusive cloud screen and the use of most of the daylight hours yield a data set that can be used to more faithfully represent the true aerosol climate for this site. The diurnal aerosol cycle is examined month-by-month to assess the effects of an aerosol climatology on the basis of infrequent sampling such as that from satellites.

  1. The Topology ToolKit.

    Science.gov (United States)

    Tierny, Julien; Favelier, Guillaume; Levine, Joshua A; Gueunet, Charles; Michaux, Michael

    2017-08-29

    This system paper presents the Topology ToolKit (TTK), a software platform designed for the topological analysis of scalar data in scientific visualization. While topological data analysis has gained in popularity over the last two decades, it has not yet been widely adopted as a standard data analysis tool for end users or developers. TTK aims at addressing this problem by providing a unified, generic, efficient, and robust implementation of key algorithms for the topological analysis of scalar data, including: critical points, integral lines, persistence diagrams, persistence curves, merge trees, contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots, Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due to a tight integration with ParaView. It is also easily accessible to developers through a variety of bindings (Python, VTK/C++) for fast prototyping or through direct, dependency-free C++, to ease integration into pre-existing complex systems. While developing TTK, we faced several algorithmic and software engineering challenges, which we document in this paper. In particular, we present an algorithm for the construction of a discrete gradient that complies with the critical points extracted in the piecewise-linear setting. This algorithm guarantees a combinatorial consistency across the topological abstractions supported by TTK, and importantly, a unified implementation of topological data simplification for multi-scale exploration and analysis. We also present a cached triangulation data structure that supports time-efficient and generic traversals, which self-adjusts its memory usage on demand for input simplicial meshes and which implicitly emulates a triangulation for regular grids with no memory overhead. Finally, we describe an original software architecture, which guarantees memory-efficient and direct accesses to TTK features, while still allowing researchers powerful and easy bindings and extensions.

  2. Acute Renal Failure Secondary to Rhabdomyolysis as a Complication of Major Urological Surgery: The Experience of a High-Volume Urological Center.

    Science.gov (United States)

    De Gracia-Nieto, Armando E; Angerri, Oriol; Bover, Jordi; Salas, Daniel; Villamizar, Juan Manuel; Villavicencio, Humberto

    The aim of this study was to determine the incidence of acute renal failure secondary to rhabdomyolysis (ARFSR) as a complication of major urological surgery (MUS), as well as to describe the clinical characteristics and identify possible risk and protective factors. Cases of ARFSR due to MUS between January 1997 and August 2011 were identified using the institutional database. The incidence was estimated and the clinical characteristics were analyzed using simple scatterplot graphs to identify possible risk and protective factors. In this period, 14,337 MUS procedures were performed, in which 4 cases suffered from ARFSR (the incidence rate was 0.03%). The incidence rates after radical cystectomy and urethroplasty were 0.26% (3/1,175 cases) and 0.15% (1/651 cases), respectively. No case of rhabdomyolysis was reported among the patients who underwent other major surgical procedures. Two patients required dialysis, and all 4 patients recovered to their baseline renal function at an average of 11 days (7-17) with the appropriate treatment. Male gender, younger age, lower ASA score, prolonged operative time, high body mass index, elevated preoperative serum creatinine and estimated blood loss were possible risk factors for developing ARFSR due to MUS. We found that a higher intraoperative administered volume was a possible protective factor. The operative position and type of surgery seemed to play minor roles. Early diagnosis and treatment possibly leads to an improved outcome. In our study, ARFSR due to MUS was a rare entity and had a good prognosis. It was more frequent as a complication of radical cystectomy. Further studies are required to confirm our findings. © 2016 S. Karger AG, Basel.

  3. The mitochondrial DNA makeup of Romanians: A forensic mtDNA control region database and phylogenetic characterization.

    Science.gov (United States)

    Turchi, Chiara; Stanciu, Florin; Paselli, Giorgia; Buscemi, Loredana; Parson, Walther; Tagliabracci, Adriano

    2016-09-01

    To evaluate the pattern of the Romanian population from a mitochondrial perspective and to establish an appropriate mtDNA forensic database, we generated a high-quality mtDNA control region dataset from 407 Romanian subjects belonging to four major historical regions: Moldavia, Transylvania, Wallachia and Dobruja. The entire control region (CR) was analyzed by Sanger-type sequencing assays and the resulting 306 different haplotypes were classified into haplogroups according to the most updated mtDNA phylogeny. The Romanian gene pool is mainly composed of West Eurasian lineages H (31.7%), U (12.8%), J (10.8%), R (10.1%), T (9.1%), N (8.1%), HV (5.4%), K (3.7%), HV0 (4.2%), with the exceptions of East Asian haplogroup M (3.4%) and African haplogroup L (0.7%). The pattern of mtDNA variation observed in this study indicates that the mitochondrial DNA pool is geographically homogeneous across Romania and that the haplogroup composition reveals signals of admixture of populations of different origin. The PCA scatterplot supported this scenario, with Romania located in the southeastern Europe area, close to Bulgaria and Hungary, and as a borderland with respect to the east Mediterranean and other eastern European countries. High haplotype diversity (0.993) and nucleotide diversity indices (0.00838±0.00426), together with low random match probability (0.0087), suggest the usefulness of this control region dataset as a forensic database in routine forensic mtDNA analysis and in the investigation of maternal genetic lineages in the Romanian population. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
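
The two forensic summary statistics quoted above have simple closed forms: haplotype diversity D = (n/(n-1)) (1 - Σ p_i²) and random match probability Σ p_i², where p_i is the frequency of haplotype i. The sketch below computes both from toy haplotype counts, not the Romanian dataset.

```python
# Haplotype diversity and random match probability from haplotype counts.
# Counts are invented toy values.
counts = [5, 3, 2, 2, 1, 1, 1, 1, 1, 1, 1, 1]   # observations per haplotype
n = sum(counts)

p2 = sum((c / n) ** 2 for c in counts)          # sum of squared frequencies
rmp = p2                                        # random match probability
D = (n / (n - 1)) * (1 - p2)                    # haplotype diversity

print(f"n = {n}, haplotype diversity = {D:.3f}, random match probability = {rmp:.3f}")
```

Many distinct low-count haplotypes drive D toward 1 and the match probability toward 1/n, which is why the study's D = 0.993 and RMP = 0.0087 indicate a highly discriminating database.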

  4. Effect of organ enhancement and habitus on estimation of unenhanced attenuation at contrast-enhanced dual-energy MDCT: concepts for individualized and organ-specific spectral iodine subtraction strategies.

    Science.gov (United States)

    Miller, Chad M; Gupta, Rajan T; Paulson, Erik K; Neville, Amy M; Bashir, Mustafa R; Merkle, Elmar M; Boll, Daniel T

    2011-05-01

    The purpose of this study was to assess whether habitus and organ enhancement influence iodine subtraction and should be incorporated into spectral subtraction algorithms. This study included 171 patients. In the unenhanced phase, MDCT was performed with single-energy acquisition (120 kVp, 250 mAs) and in the parenchymal phase with dual-energy acquisitions (80 kVp, 499 mAs; 140 kVp, 126 mAs). Habitus was determined by measuring trunk diameters and calculating circumference. Iodine subtraction was performed with input parameters individualized to muscle, fat, and blood ratio. Attenuation of the liver, pancreas, spleen, kidneys, and aorta was assessed in truly and virtually unenhanced image series. Pearson analysis was performed to correlate habitus with the input parameters. Analysis of truly unenhanced and virtually unenhanced images was performed with the Student t test; magnitude of variation was evaluated with Bland-Altman plots. Correction strategies were derived from organ-specific regression analysis of scatterplots of truly unenhanced and virtually unenhanced attenuation and implemented in a pixel-by-pixel approach. Analysis of individual organ correction and truly unenhanced attenuation was performed with the Student t test. The correlations between habitus and blood ratio (r = 0.694) and attenuation variation of fat at 80 kVp (r = -0.468) and 140 kVp (r = -0.454) were confirmed. Although overall mean attenuation differed by no more than 10 HU between truly and virtually unenhanced scans, these differences varied by organ and were large in individual patients. Paired comparisons of truly and virtually unenhanced measurements differed significantly for liver, spleen, pancreas, kidneys, and aortic blood pool (p < …) until habitus-based correction strategies were applied (p > 0.38 for all comparisons). Habitus and organ enhancement influence virtually unenhanced imaging and should be incorporated into spectral subtraction algorithms.

  5. Retrievals of aerosol optical depth and total column ozone from Ultraviolet Multifilter Rotating Shadowband Radiometer measurements based on an optimal estimation technique

    Science.gov (United States)

    Liu, Chaoshun; Chen, Maosi; Shi, Runhe; Gao, Wei

    2014-12-01

    A Bayesian optimal estimation (OE) retrieval technique was used to retrieve aerosol optical depth (AOD), aerosol single scattering albedo (SSA), and an asymmetry factor (g) at seven ultraviolet wavelengths, along with total column ozone (TOC), from the measurements of the UltraViolet Multifilter Rotating Shadowband Radiometer (UV-MFRSR) deployed at the Southern Great Plains (SGP) site during March through November in 2009. The OE technique specifies appropriate error covariance matrices and optimizes a forward model (Tropospheric ultraviolet radiative transfer model, TUV), and thus provides a supplemental method for use across the network of the Department of Agriculture UV-B Monitoring and Research Program (USDA UVMRP) for the retrieval of aerosol properties and TOC with reasonable accuracy in the UV spectral range under various atmospheric conditions. In order to assess the accuracy of the OE technique, we compared the AOD retrievals from this method with those from Beer's law and the AErosol RObotic Network (AERONET) AOD product. We also examined the OE-retrieved TOC in comparison with the TOC from the USDA UVMRP and the Ozone Monitoring Instrument (OMI) satellite data. The scatterplots of the estimated AOD from the OE method agreed well with those derived from Beer's law and the collocated AERONET AOD product, showing high values of correlation coefficients, generally 0.98 and 0.99, and large slopes, ranging from 0.95 to 1.0, as well as small offsets, less than 0.02, especially at 368 nm. The comparison of TOC retrievals also indicates the promising accuracy of the OE method in that the standard deviations of the difference between the OE-derived TOC and other TOC products are about 5 to 6 Dobson Units (DU). Validation of the OE retrievals on these selected dates suggested that the OE technique has its merits and can serve as a supplemental tool in further analyzing UVMRP data.

  6. Climate and streamflow characteristics for selected streamgages in eastern South Dakota, water years 1945–2013

    Science.gov (United States)

    Hoogestraat, Galen K.; Stamm, John F.

    2015-11-02

    Upward trends in precipitation and streamflow have been observed in the northeastern Missouri River Basin during the past century, including the area of eastern South Dakota. Some of the identified upward trends were anomalously large relative to surrounding parts of the northern Great Plains. Forcing factors for streamflow trends in eastern South Dakota are not well understood, and it is not known whether streamflow trends are driven primarily by climatic changes or various land-use changes. Understanding the effects that climate (specifically precipitation and temperature) has on streamflow characteristics within a region will help to better understand additional factors such as land-use alterations that may affect the hydrology of the region. To aid in this understanding, a study was completed by the U.S. Geological Survey, in cooperation with the East Dakota Water Development District and James River Water Development District, to assess trends in climate and streamflow characteristics at 10 selected streamgages in eastern South Dakota for water years (WYs) 1945–2013 (69 years) and WYs 1980–2013 (34 years). A WY is the 12-month period, October 1 through September 30, and is designated by the calendar year in which it ends. One streamgage is on the Whetstone River, a tributary to the Minnesota River, and the other streamgages are in the James, Big Sioux, and Vermillion River Basins. The watersheds for two of the James River streamgages extend into North Dakota, and parts of the watersheds for two of the Big Sioux River streamgages extend into Minnesota and Iowa. The objectives of this study were to document trends in streamflow and precipitation in these watersheds, and characterize the residual streamflow variability that might be attributed to factors other than precipitation. Residuals were computed as the departure from a locally-weighted scatterplot smoothing (LOWESS) model. Significance of trends was based on the Mann-Kendall nonparametric test at a 0

  7. Increased plasma donepezil concentration improves cognitive function in patients with dementia with Lewy bodies: An exploratory pharmacokinetic/pharmacodynamic analysis in a phase 3 randomized controlled trial.

    Science.gov (United States)

    Mori, Etsuro; Ikeda, Manabu; Nakai, Kenya; Miyagishi, Hideaki; Nakagawa, Masaki; Kosaka, Kenji

    2016-07-15

    To investigate whether increasing plasma donepezil concentration further improves cognitive function and neuropsychiatric symptoms without compromising safety in patients with dementia with Lewy bodies (DLB). We analyzed data from a 12-week phase 3 trial of donepezil (5 and 10 mg/day) in patients with DLB. The contribution of factors affecting plasma donepezil concentration was evaluated using multivariate regression analysis. The relationships between plasma donepezil concentration and efficacy (cognitive function as measured by the Mini-Mental State Examination [MMSE], hallucinations and cognitive fluctuation), or safety (blood pressure, pulse rate, body weight, and parkinsonism as measured by the Unified Parkinson's Disease Rating Scale part III) were assessed by scatterplots and Pearson correlation. The data of 87 patients were used in the analyses. Plasma donepezil concentration increased proportionally with increasing dose from 5 to 10 mg/day. The dose (contribution rate: 0.39, p<0.0001) and age (contribution rate: 0.12, p=0.0003) were statistically significant contributing factors affecting plasma donepezil concentration. Plasma donepezil concentration correlated significantly with improvement of MMSE score (p=0.040), but no significant correlations were found with the change in other tested parameters. Plasma donepezil concentration correlated positively with change in cognitive function without affecting safety, and was affected mainly by dose and to a lesser extent by age. Therefore, for patients in whom safety concerns are not found at donepezil 5 mg/day, increasing the dose to 10 mg/day to increase plasma concentration is worthwhile to further improve cognitive function. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  8. Small-area analysis of social inequalities in residential exposure to road traffic noise in Marseilles, France.

    Science.gov (United States)

    Bocquier, Aurélie; Cortaredona, Sébastien; Boutin, Céline; David, Aude; Bigot, Alexis; Chaix, Basile; Gaudart, Jean; Verger, Pierre

    2013-08-01

    Few studies have focused on the social inequalities associated with environmental noise despite its significant potential health effects. This study analysed the associations between area socio-economic status (SES) and potential residential exposure to road traffic noise at a small-area level in Marseilles, the second-largest city in France. We calculated two potential road noise exposure indicators (PNEI) at the census block level (for 24-h and night periods), with the noise propagation prediction model CadnaA. We built a deprivation index from census data to estimate SES at the census block level. Locally estimated scatterplot smoothing diagrams described the associations between this index and PNEIs. Since the extent to which coefficient values vary between standard regression models and spatial methods is sensitive to the specific spatial model, we analysed these associations further with various regression models controlling for spatial autocorrelation and conducted sensitivity analyses with different spatial weight matrices. We observed a non-linear relation between the PNEIs and the deprivation index: exposure levels were highest in the intermediate categories. All the spatial models led to a better fit and more or less pronounced reductions of the regression coefficients; the shape of the relations nonetheless remained the same. Finding the highest noise exposure in midlevel deprivation areas was unexpected, given the general literature on environmental inequalities. It highlights the need to study the diversity of the patterns of environmental inequalities across various economic, social and cultural contexts. Comparative studies of environmental inequalities are needed, between regions and countries, for noise and other pollutants.

  9. Tests of Sunspot Number Sequences: 3. Effects of Regression Procedures on the Calibration of Historic Sunspot Data

    Science.gov (United States)

    Lockwood, M.; Owens, M. J.; Barnard, L.; Usoskin, I. G.

    2016-11-01

    We use sunspot-group observations from the Royal Greenwich Observatory (RGO) to investigate the effects of intercalibrating data from observers with different visual acuities. The tests are made by counting the number of groups [RB] above a variable cut-off threshold of observed total whole spot area (uncorrected for foreshortening) to simulate what a lower-acuity observer would have seen. The synthesised annual means of RB are then re-scaled to the full observed RGO group number [RA] using a variety of regression techniques. It is found that a very high correlation between RA and RB (r_{AB} > 0.98) does not prevent large errors in the intercalibration (for example sunspot-maximum values can be over 30% too large even for such levels of r_{AB}). In generating the backbone sunspot number [R_{BB}], Svalgaard and Schatten (Solar Phys., 2016) force regression fits to pass through the scatter-plot origin, which generates unreliable fits (the residuals do not form a normal distribution) and causes sunspot-cycle amplitudes to be exaggerated in the intercalibrated data. It is demonstrated that the use of Quantile-Quantile ("Q-Q") plots to test for a normal distribution is a useful indicator of erroneous and misleading regression fits. Ordinary least-squares linear fits, not forced to pass through the origin, are sometimes reliable (although the optimum method used is shown to be different when matching peak and average sunspot-group numbers). However, other fits are only reliable if non-linear regression is used. From these results it is entirely possible that the inflation of solar-cycle amplitudes in the backbone group sunspot number as one goes back in time, relative to related solar-terrestrial parameters, is entirely caused by the use of inappropriate and non-robust regression techniques to calibrate the sunspot data.
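
The abstract's central point can be demonstrated in a few lines: when paired observer data have a non-zero intercept, forcing the regression through the origin biases the slope upward even though the correlation is perfect. The synthetic data below stand in for the RA/RB counts; they are not RGO values.

```python
# Ordinary least squares with and without an intercept on data whose true
# relation is y = 2 + 0.8 x. Through-origin slope: b = Σxy / Σx².
xs = [float(i) for i in range(1, 21)]
ys = [2.0 + 0.8 * x for x in xs]           # exact line, intercept 2, slope 0.8

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n

# free fit (intercept estimated)
b_free = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
a_free = my - b_free * mx

# fit forced through the origin
b_origin = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

print(f"free fit: slope = {b_free:.3f}, intercept = {a_free:.3f}")
print(f"through-origin slope = {b_origin:.3f}  (inflated)")
```

The forced fit absorbs the intercept into the slope, so re-scaled values near the solar maximum are exaggerated; the residuals of such a fit are systematically signed rather than normal, which is exactly what a Q-Q plot exposes.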

  10. Considering body mass differences, who are the world's strongest women?

    Science.gov (United States)

    Vanderburgh, P M; Dooman, C

    2000-01-01

    Allometric modeling (AM) has been used to determine the world's strongest body mass-adjusted man. Recently, however, AM was shown to demonstrate body mass bias in elite Olympic weightlifting performance. A second order polynomial (2OP) provided a better fit than AM with no body mass bias for men and women. The purpose of this study was to apply both AM and 2OP models to women's world powerlifting records (more a function of pure strength and less power than Olympic lifts) to determine the optimal model approach as well as the strongest body mass-adjusted woman in each event. Subjects were the 36 (9 per event) current women world record holders (as of Nov., 1997) for bench press (BP), deadlift (DL), squat (SQ), and total (TOT) lift (BP + DL + SQ) according to the International Powerlifting Federation (IPF). The 2OP model demonstrated the superior fit and no body mass bias as indicated by the coefficient of variation and residuals scatterplot inspection, respectively, for DL, SQ, and TOT. The AM for these three lifts, however, showed favorable bias toward the middle weight classes. The 2OP and AM yielded an essentially identical fit for BP. Although body mass-adjusted world records were dependent on the model used, Carrie Boudreau (U.S., 56-kg weight class), who received top scores in TOT and DL with both models, is arguably the world's strongest woman overall. Furthermore, although the 2OP model provides a better fit than AM for this elite population, a case can still be made for AM use, particularly in light of theoretical superiority.
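
    The two adjustment schemes compared above can be sketched as follows. The body-mass and lift values below are invented stand-ins, not the actual IPF records; the AM exponent and 2OP coefficients are fitted to these illustrative numbers only.

```python
import numpy as np

# Invented record-like data: body mass (kg) for nine weight classes
# and the corresponding total lift (kg).
mass = np.array([44, 48, 52, 56, 60, 67.5, 75, 82.5, 90], float)
lift = np.array([352, 390, 427, 480, 502, 540, 562, 577, 593], float)

# Allometric model (AM): lift = a * mass^b, fitted by log-log regression
b, log_a = np.polyfit(np.log(mass), np.log(lift), 1)
am_pred = np.exp(log_a) * mass ** b
am_score = lift / am_pred              # body mass-adjusted score

# Second-order polynomial (2OP) model: lift = c0 + c1*m + c2*m^2
c2, c1, c0 = np.polyfit(mass, lift, 2)
pol_pred = c2 * mass ** 2 + c1 * mass + c0
pol_score = lift / pol_pred

# Residual/ratio inspection across weight classes is how body mass bias
# (e.g. AM favouring middle classes) would be detected.
print(b, np.argmax(am_score), np.argmax(pol_score))
```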

  11. Metrics for Identifying Food Security Status and the Population with Potential to Benefit from Nutrition Interventions in the Lives Saved Tool (LiST).

    Science.gov (United States)

    Jackson, Bianca D; Walker, Neff; Heidkamp, Rebecca

    2017-11-01

    Background: The Lives Saved Tool (LiST) uses the poverty head-count ratio at $1.90/d as a proxy for food security to identify the percentage of the population with the potential to benefit from balanced energy supplementation and complementary feeding (CF) interventions, following the approach used for the Lancet's 2008 series on Maternal and Child Undernutrition. Because much work has been done in the development of food security indicators, a re-evaluation of the use of this indicator was warranted. Objective: The aim was to re-evaluate the use of the poverty head-count ratio at $1.90/d as the food security proxy indicator in LiST. Methods: We carried out a desk review to identify available indicators of food security. We identified 3 indicators and compared them by using scatterplots, Spearman's correlations, and Bland-Altman plot analysis. We generated LiST projections to compare the modeled impact results with the use of the different indicators. Results: There are many food security indicators available, but only 3 additional indicators were identified with the data availability requirements to be used as the food security indicator in LiST. As expected, the analyzed food security indicators were significantly positively correlated (P food-insecure contexts. Conclusions: There was no single indicator identified that is ideal for measuring the percentage of the population that is food insecure for LiST. Thus, LiST will use the food security indicators that were used in the meta-analyses that produced the effect estimates. These are the poverty head-count ratio at $1.90/d for CF interventions and the prevalence of a low body mass index in women of reproductive age for balanced energy supplementation interventions. © 2017 American Society for Nutrition.
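
    The indicator-comparison machinery mentioned above (scatterplots, Spearman's correlation, Bland-Altman analysis) can be sketched on invented country-level values. The indicator names, the latent-variable generating process and all numbers below are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)

def spearman(x, y):
    """Spearman's rank correlation (continuous data, no ties assumed)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Invented country-level values for two candidate food-security
# indicators, both driven by a shared latent level of food insecurity.
latent = rng.uniform(0, 50, 80)
poverty_190 = latent + rng.normal(0, 4, 80)            # $1.90/d head count
undernourished = 0.9 * latent + 5 + rng.normal(0, 4, 80)

rho = spearman(poverty_190, undernourished)

# Bland-Altman analysis: bias and 95% limits of agreement of the differences
diff = poverty_190 - undernourished
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)
print(rho, bias, (loa_low, loa_high))
```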

  12. Preprocessing differential methylation hybridization microarray data

    Directory of Open Access Journals (Sweden)

    Sun Shuying

    2011-05-01

    Background: DNA methylation plays a very important role in the silencing of tumor suppressor genes in various tumor types. In order to gain a genome-wide understanding of how changes in methylation affect tumor growth, the differential methylation hybridization (DMH) protocol has been developed and large amounts of DMH microarray data have been generated. However, it is still unclear how to preprocess this type of microarray data and how different background correction and normalization methods used for two-color gene expression arrays perform for the methylation microarray data. In this paper, we demonstrate our discovery of a set of internal control probes that have log ratios (M) theoretically equal to zero according to this DMH protocol. With the aid of this set of control probes, we propose two LOESS (or LOWESS, locally weighted scatter-plot smoothing) normalization methods that are novel and unique for DMH microarray data. Combining with other normalization methods (global LOESS and no normalization), we compare four normalization methods. In addition, we compare five different background correction methods. Results: We study 20 different preprocessing methods, which are the combination of five background correction methods and four normalization methods. In order to compare these 20 methods, we evaluate their performance of identifying known methylated and un-methylated housekeeping genes based on two statistics. Comparison details are illustrated using breast cancer cell line and ovarian cancer patient methylation microarray data. Our comparison results show that different background correction methods perform similarly; however, four normalization methods perform very differently. In particular, all three different LOESS normalization methods perform better than the one without any normalization. Conclusions: It is necessary to do within-array normalization, and the two LOESS normalization methods based on specific DMH internal
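
    A minimal one-pass LOWESS smoother, applied to simulated control probes whose true log ratio M is zero, illustrates the kind of intensity-dependent normalization described above. The bias curve, noise level and probe counts are invented for illustration; production code would use a tested implementation rather than this sketch.

```python
import numpy as np

def lowess(x, y, x_eval, frac=0.3):
    """Minimal LOWESS: one-pass tricube-weighted local linear fits."""
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))
    out = np.empty(len(x_eval))
    for i, x0 in enumerate(x_eval):
        d = np.abs(x - x0)
        h = np.sort(d)[k - 1]                      # local bandwidth
        w = np.clip(1 - (d / h) ** 3, 0, 1) ** 3   # tricube weights
        X = np.column_stack([np.ones(n), x - x0])
        beta = np.linalg.solve((X * w[:, None]).T @ X, X.T @ (w * y))
        out[i] = beta[0]                           # local fit at x0
    return out

# Simulated control probes: log ratio M should be zero, but an
# intensity-dependent dye bias distorts it (values are invented).
rng = np.random.default_rng(1)
A = rng.uniform(6, 14, 500)                  # average log intensity
dye_bias = 0.4 * np.sin(A / 2) + 0.1 * (A - 10)
M = dye_bias + rng.normal(0, 0.1, 500)

# Normalize by subtracting the LOWESS trend fitted on the controls
M_norm = M - lowess(A, M, A, frac=0.3)
print(M.std(), M_norm.std())
```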

  13. Evaluating concentration estimation errors in ELISA microarray experiments

    Directory of Open Access Journals (Sweden)

    Anderson Kevin K

    2005-01-01

    Background: Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to estimate a protein's concentration in a sample. Deploying ELISA in a microarray format permits simultaneous estimation of the concentrations of numerous proteins in a small sample. These estimates, however, are uncertain due to processing error and biological variability. Evaluating estimation error is critical to interpreting biological significance and improving the ELISA microarray process. Estimation error evaluation must be automated to realize a reliable high-throughput ELISA microarray system. In this paper, we present a statistical method based on propagation of error to evaluate concentration estimation errors in the ELISA microarray process. Although propagation of error is central to this method and the focus of this paper, it is most effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization, and statistical diagnostics when evaluating ELISA microarray concentration estimation errors. Results: We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of concentration estimation errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error. We summarize the results with a simple, three-panel diagnostic visualization featuring a scatterplot of the standard data with logistic standard curve and 95% confidence intervals, an annotated histogram of sample measurements, and a plot of the 95% concentration coefficient of variation, or relative error, as a function of concentration. Conclusions: This statistical method should be of value in the rapid evaluation and quality control of high
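
    The propagation-of-error idea can be sketched with the delta method applied to an invented four-parameter logistic standard curve: the variance of the measured signal is carried through the inverse of the curve to give the concentration's relative error. All parameter values and the signal noise level below are illustrative assumptions, not the study's fitted values.

```python
import numpy as np

# Hypothetical fitted 4-parameter logistic standard curve:
# y = d + (a - d) / (1 + (x / c)**b)   (parameters are invented)
a, b, c, d = 0.05, 1.2, 10.0, 2.0

def curve(x):
    return d + (a - d) / (1 + (x / c) ** b)

def inverse(y):
    # Analytic inverse of the 4PL curve: signal -> concentration
    return c * ((a - d) / (y - d) - 1) ** (1 / b)

def concentration_cv(y, sd_y):
    """Delta method: var(x_hat) ~ (dx/dy)^2 * var(y); returns (x_hat, CV)."""
    x = inverse(y)
    eps = 1e-6
    dxdy = (inverse(y + eps) - inverse(y - eps)) / (2 * eps)
    sd_x = abs(dxdy) * sd_y
    return x, sd_x / x    # relative error (coefficient of variation)

x_true = 5.0
y_obs = curve(x_true)
x_hat, cv = concentration_cv(y_obs, sd_y=0.02)
print(x_hat, cv)
```

Plotting `cv` over a grid of concentrations gives exactly the kind of relative-error-vs-concentration panel described in the abstract's diagnostic visualization.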

  14. Seasonality, water quality trends and biological responses in four streams in the Cairngorm Mountains, Scotland

    Directory of Open Access Journals (Sweden)

    C. Soulsby

    2001-01-01

    The chemical composition and invertebrate communities found in four streams in the Cairngorms, Scotland, were monitored between 1985 and 1997. Stream waters were mildly acidic (mean pH ca. 6.5), with low alkalinity (mean acid neutralising capacity varying from 35 to 117 μeq l-1) and low ionic strength. Subtle differences in the chemistry of each stream were reflected in their invertebrate faunas. Strong seasonality in water chemistry occurred, with the most acid, low-alkalinity waters observed during the winter and early spring. This was particularly marked during snowmelt between January and April. In contrast, summer flows were usually groundwater dominated and characterised by higher alkalinity and higher concentrations of most other weathering-derived solutes. Seasonality was also clear in the invertebrate data, with Canonical Correspondence Analysis (CCA) separating seasonal samples along axes related to water temperature and discharge characteristics. Inter-annual hydrological and chemical differences were marked, particularly with respect to the winter period. Invertebrate communities found in each of the streams also varied from year to year, with spring communities significantly more variable. Hydrochemical trends over the study period were analysed using a seasonal Kendall test, LOcally WEighted Scatterplot Smoothing (LOWESS) and graphical techniques. These indicated that a reduction in sulphate concentrations in stream water is occurring, consistent with declining levels of atmospheric deposition. This may be matched by increases in pH and declining calcium concentrations, though available evidence is inconclusive. Other parameters, such as chloride, total organic carbon and zinc, reveal somewhat random patterns, probably reflecting irregular variations in climatic factors and/or atmospheric deposition. Previous studies have shown that the stream invertebrate communities have remained stable over this period (i.e. no significant linear trends).
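
    A seasonal Kendall test of the kind used for the trend analysis can be sketched as follows, on an invented monthly sulphate-like series with a downward trend and a seasonal cycle. This is the basic statistic (Mann-Kendall S summed over seasons, no tie or serial-correlation correction), not the exact procedure used in the study.

```python
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S: sum of signs of all pairwise later-minus-earlier differences."""
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

def seasonal_kendall(values, seasons):
    """Seasonal Kendall: S and its variance summed over seasons (no ties)."""
    S, var = 0.0, 0.0
    for season in np.unique(seasons):
        x = values[seasons == season]
        n = len(x)
        S += mann_kendall_s(x)
        var += n * (n - 1) * (2 * n + 5) / 18
    return S / np.sqrt(var)   # approximate Z score

# Invented monthly series: 13 years, downward trend plus seasonal cycle
rng = np.random.default_rng(7)
years = np.arange(13)
seasons = np.tile(np.arange(12), 13)
values = (np.repeat(-0.15 * years, 12)                 # declining trend
          + 1.5 * np.sin(2 * np.pi * seasons / 12)     # seasonality
          + rng.normal(0, 0.3, 13 * 12))

z = seasonal_kendall(values, seasons)
print(z)   # strongly negative, i.e. a declining trend
```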

  15. Water-quality characteristics and trends for selected sites in or near the Earth Resources Observation Systems (EROS) Data Center, South Dakota, 1973-2000

    Science.gov (United States)

    Neitzert, Kathleen M.

    2004-01-01

    lines for selected constituents are presented for selected surface-water and ground-water sites. Regression analyses using a Lowess (Locally Weighted Scatterplot Smoothing) smoothing line for Split Rock Creek, EROS Lake, the lagoon sites, and the ground-water sites gave variable results: some constituents showed an increasing or decreasing trend, others gave mixed results, and still others indicated no change during the sampling period.

  16. Vcs.js - Visualization Control System for the Web

    Science.gov (United States)

    Chaudhary, A.; Lipsa, D.; Doutriaux, C.; Beezley, J. D.; Williams, D. N.; Fries, S.; Harris, M. B.

    2016-12-01

    VCS is a general purpose visualization library, optimized for climate data, which is part of the UV-CDAT system. It provides a Python API for drawing 2D plots such as lineplots, scatter plots, Taylor diagrams, data colored by scalar values, vector glyphs, isocontours and map projections. VCS is based on the VTK library. Vcs.js is the corresponding JavaScript API, designed to be as close as possible to the original VCS Python API and to provide similar functionality for the Web. Vcs.js includes additional functionality when compared with VCS. This additional API is used to introspect data files available on the server and variables available in a data file. Vcs.js can display plots in the browser window. It always works with a server that reads a data file, extracts variables from the file and subsets the data. From this point, two alternate paths are possible. First, the system can render the data on the server using VCS, producing an image which is sent to the browser to be displayed. This path works for all plot types and produces a reference image identical to the images produced by VCS. This path uses the VTK-Web library. As an optimization, usable in certain conditions, a second path is possible. Data is packed and sent to the browser, which uses a JavaScript plotting library, such as plotly, to display the data. Plots that work well in the browser are line-plots and scatter-plots for any data, and many other plot types for small data and supported grid types. As web technology matures, more plots could be supported for rendering in the browser. Rendering can be done either on the client or on the server, and we expect that the best place to render will change depending on the available web technology, data transfer costs, server management costs and value provided to users. We intend to provide a flexible solution that allows for both client and server side rendering and a meaningful way to choose between the two.
We provide a web-based user interface called v

  17. Psychosocial family factors and glycemic control among children aged 1-15 years with type 1 diabetes: a population-based survey.

    Science.gov (United States)

    Haugstvedt, Anne; Wentzel-Larsen, Tore; Rokne, Berit; Graue, Marit

    2011-12-20

    Being the parents of children with diabetes is demanding. Jay Belsky's determinants of parenting model emphasizes both the personal psychological resources, the characteristics of the child and contextual sources such as parents' work, marital relations and social network support as important determinants for parenting. To better understand the factors influencing parental functioning among parents of children with type 1 diabetes, we aimed to investigate associations between the children's glycated hemoglobin (HbA1c) and 1) variables related to the parents' psychological and contextual resources, and 2) frequency of blood glucose measurement as a marker for diabetes-related parenting behavior. Mothers (n = 103) and fathers (n = 97) of 115 children younger than 16 years old participated in a population-based survey. The questionnaire comprised the Life Orientation Test, the Oslo 3-item Social Support Scale, a single question regarding perceived social limitation because of the child's diabetes, the Relationship Satisfaction Scale and demographic and clinical variables. We investigated associations by using regression analysis. Related to the second aim, hypoglycemic events, child age, diabetes duration, insulin regimen and comorbid diseases were included as covariates. The mean HbA1c was 8.1%, and 29% had HbA1c ≤ 7.5%. In multiple regression analysis, lower HbA1c was associated with higher education and stronger perceptions of social limitation among the mothers. A higher frequency of blood glucose measurement was significantly associated with lower HbA1c in bivariate analysis. Higher child age was significantly associated with higher HbA1c both in bivariate and multivariate analysis. A scatterplot indicated this association to be linear. Most families do not reach recommended treatment goals for their child with type 1 diabetes. Concerning contextual sources of stress and support, the families who successfully reached the treatment goals had mothers with higher

  18. AN OPEN SOURCE GEOVISUAL ANALYTICS TOOLBOX FOR MULTIVARIATE SPATIO-TEMPORAL DATA IN ENVIRONMENTAL CHANGE MODELLING

    Directory of Open Access Journals (Sweden)

    M. Bernasocchi

    2012-07-01

    In environmental change studies, often multiple variables are measured or modelled, and temporal information is essential for the task. These multivariate geographic time-series datasets are often big and difficult to analyse. While many established methods such as PCP (parallel coordinate plots), STC (space-time cubes), scatter-plots and multiple (linked) visualisations help provide more information, we observe that most of the common geovisual analytics suites do not include three-dimensional (3D) visualisations. However, in many environmental studies, we hypothesize that the addition of 3D terrain visualisations along with appropriate data plots and two-dimensional views can help improve the analysts’ ability to interpret the spatial relevance better. To test our ideas, we conceptualize, develop, implement and evaluate a geovisual analytics toolbox in a user-centred manner. The conceptualization of the tool is based on concrete user needs that have been identified and collected during informal brainstorming sessions and in a structured focus group session prior to the development. The design process, therefore, is based on a combination of user-centred design with a requirement analysis and agile development. Based on the findings from this phase, the toolbox was designed to have a modular structure and was built on the open source geographic information systems (GIS) program Quantum GIS (QGIS), thus benefiting from existing GIS functionality. The modules include a globe view for 3D terrain visualisation (OSGEarth), a scattergram, a time vs. value plot, and a 3D helix visualisation as well as the possibility to view the raw data. The visualisation frame allows real-time linking of these representations. After the design and development stage, a case study was created featuring data from Zermatt valley and the toolbox was evaluated based on expert interviews. Analysts performed multiple spatial and temporal tasks with the case study using the toolbox

  19. Examining Spectral Reflectance Saturation in Landsat Imagery and Corresponding Solutions to Improve Forest Aboveground Biomass Estimation

    Directory of Open Access Journals (Sweden)

    Panpan Zhao

    2016-06-01

    The data saturation problem in Landsat imagery is well recognized and is regarded as an important factor resulting in inaccurate forest aboveground biomass (AGB) estimation. However, no study has examined the saturation values for different vegetation types such as coniferous and broadleaf forests. The objective of this study is to estimate the saturation values in Landsat imagery for different vegetation types in a subtropical region and to explore approaches to improving forest AGB estimation. Landsat Thematic Mapper imagery, digital elevation model data, and field measurements in Zhejiang province of Eastern China were used. Correlation analysis and scatterplots were first used to examine specific spectral bands and their relationships with AGB. A spherical model was then used to quantitatively estimate the saturation value of AGB for each vegetation type. A stratification of vegetation types and/or slope aspects was used to determine the potential to improve AGB estimation performance by developing a specific AGB estimation model for each category. Stepwise regression analysis based on Landsat spectral signatures and textures using the grey-level co-occurrence matrix (GLCM) was used to develop AGB estimation models for different scenarios: non-stratification, stratification based on either vegetation types, slope aspects, or the combination of vegetation types and slope aspects. The results indicate that pine forest and mixed forest have the highest AGB saturation values (159 and 152 Mg/ha, respectively), Chinese fir and broadleaf forest have lower saturation values (143 and 123 Mg/ha, respectively), and bamboo forest and shrub have the lowest saturation values (75 and 55 Mg/ha, respectively). The stratification based on either vegetation types or slope aspects provided smaller root mean squared errors (RMSEs) than non-stratification. The AGB estimation models based on stratification of both vegetation types and slope aspects provided the most
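
    A spherical-model fit of the kind used to estimate saturation values can be sketched as follows. Reflectance and AGB values are simulated (all numbers invented), and the saturation point is recovered by a simple grid search, exploiting the fact that for a fixed saturation value the model is linear in its other two coefficients; the paper's actual fitting procedure may differ.

```python
import numpy as np

def spherical(x, a, b, s):
    """Spherical model: rises from a at x=0 to a sill a+b at the saturation s."""
    r = np.clip(x / s, 0, 1)
    return a + b * (1.5 * r - 0.5 * r ** 3)

# Invented plot data: band reflectance vs AGB, saturating near 150 Mg/ha
rng = np.random.default_rng(3)
agb = rng.uniform(5, 250, 120)
refl = spherical(agb, 0.30, -0.12, 150.0) + rng.normal(0, 0.004, 120)

# For each candidate saturation value, a and b follow by linear least
# squares; keep the candidate with the smallest residual sum of squares.
best = None
for s in np.arange(50, 251, 1.0):
    r = np.clip(agb / s, 0, 1)
    g = 1.5 * r - 0.5 * r ** 3
    X = np.column_stack([np.ones_like(g), g])
    coef, *_ = np.linalg.lstsq(X, refl, rcond=None)
    rss = np.sum((refl - X @ coef) ** 2)
    if best is None or rss < best[0]:
        best = (rss, s, coef)

rss, s_hat, (a_hat, b_hat) = best
print("estimated AGB saturation (Mg/ha):", s_hat)
```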

  20. A statistical learning framework for groundwater nitrate models of the Central Valley, California, USA

    Science.gov (United States)

    Nolan, Bernard T.; Fienen, Michael N.; Lorenz, David L.

    2015-12-01

    We used a statistical learning framework to evaluate the ability of three machine-learning methods to predict nitrate concentration in shallow groundwater of the Central Valley, California: boosted regression trees (BRT), artificial neural networks (ANN), and Bayesian networks (BN). Machine learning methods can learn complex patterns in the data but because of overfitting may not generalize well to new data. The statistical learning framework involves cross-validation (CV) training and testing data and a separate hold-out data set for model evaluation, with the goal of optimizing predictive performance by controlling for model overfit. The order of prediction performance according to both CV testing R2 and that for the hold-out data set was BRT > BN > ANN. For each method we identified two models based on CV testing results: that with maximum testing R2 and a version with R2 within one standard error of the maximum (the 1SE model). The former yielded CV training R2 values of 0.94-1.0. Cross-validation testing R2 values indicate predictive performance, and these were 0.22-0.39 for the maximum R2 models and 0.19-0.36 for the 1SE models. Evaluation with hold-out data suggested that the 1SE BRT and ANN models predicted better for an independent data set compared with the maximum R2 versions, which is relevant to extrapolation by mapping. Scatterplots of predicted vs. observed hold-out data obtained for final models helped identify prediction bias, which was fairly pronounced for ANN and BN. Lastly, the models were compared with multiple linear regression (MLR) and a previous random forest regression (RFR) model. Whereas BRT results were comparable to RFR, MLR had low hold-out R2 (0.07) and explained less than half the variation in the training data. Spatial patterns of predictions by the final, 1SE BRT model agreed reasonably well with previously observed patterns of nitrate occurrence in groundwater of the Central Valley.
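
    The one-standard-error (1SE) selection rule described above can be sketched on invented cross-validation results: pick the model with the best mean testing score, then prefer the simplest model whose mean score is within one standard error of that maximum. The fold values below are made up for illustration.

```python
import numpy as np

# Invented CV results: testing R^2 per fold for models of increasing
# complexity (keys could be, e.g., numbers of boosted trees).
cv_r2 = {
    50:   np.array([0.18, 0.22, 0.20, 0.16, 0.21]),
    200:  np.array([0.26, 0.30, 0.28, 0.24, 0.29]),
    800:  np.array([0.30, 0.36, 0.33, 0.28, 0.35]),
    3200: np.array([0.31, 0.37, 0.33, 0.27, 0.36]),
}

means = {k: v.mean() for k, v in cv_r2.items()}
best_k = max(means, key=means.get)
se_best = cv_r2[best_k].std(ddof=1) / np.sqrt(len(cv_r2[best_k]))

# 1SE rule: simplest model whose mean testing R^2 is within one
# standard error of the maximum
threshold = means[best_k] - se_best
one_se_k = min(k for k, m in means.items() if m >= threshold)
print(best_k, one_se_k)
```

The 1SE model trades a tiny amount of apparent CV performance for less complexity, which is why it generalized better to the hold-out data in the study.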

  1. Characterizing China's energy consumption with selective economic factors and energy-resource endowment: a spatial econometric approach

    Science.gov (United States)

    Jiang, Lei; Ji, Minhe; Bai, Ling

    2015-06-01

    Coupled with intricate regional interactions, the provincial disparity of energy-resource endowment and other economic conditions in China has created spatially complex energy consumption patterns that require analyses beyond the traditional ones. To distill the spatial effect out of the resource and economic factors on China's energy consumption, this study recast the traditional econometric model in a spatial context. Several analytic steps were taken to reveal different aspects of the issue. Per capita energy consumption (AVEC) at the provincial level was first mapped to reveal spatial clusters of high energy consumption being located in either well developed or energy-resourceful regions. This visual spatial autocorrelation pattern of AVEC was quantitatively tested to confirm its existence among Chinese provinces. A Moran scatterplot was employed to further display a relatively centralized trend occurring in those provinces that had parallel AVEC, revealing a spatial structure with attraction among high-high or low-low regions and repellency among high-low or low-high regions. By a comparison between the ordinary least squares (OLS) model and its spatial econometric counterparts, a spatial error model (SEM) was selected to analyze the impact of major economic determinants on AVEC. While the analytic results revealed a significant positive correlation between AVEC and economic development, other determinants showed some intricate influential patterns. The provinces endowed with rich energy reserves were inclined to consume much more energy than those without such endowments, whereas changing the economic structure by increasing the proportion of secondary and tertiary industries also tended to consume more energy. Both situations seem to underpin the fact that these provinces were largely trapped in economies supported by technologies of low energy efficiency during the period, while other parts of the country were rapidly modernized by adopting advanced
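
    The Moran scatterplot plots each region's mean-centred value against its spatial lag (the weighted mean of its neighbours); with a row-standardized weight matrix, the slope of that scatter is Moran's I. A minimal sketch on an invented grid of "provinces" with a smooth spatial gradient (strong positive autocorrelation):

```python
import numpy as np

def morans_i(x, W):
    """Moran's I with a row-standardized spatial weight matrix W."""
    z = x - x.mean()
    Wr = W / W.sum(axis=1, keepdims=True)
    lag = Wr @ z                        # spatial lag of each region
    return len(x) / Wr.sum() * (z @ lag) / (z @ z)

# Invented study area: a 5x5 grid of regions with rook contiguity
n = 5
coords = [(i, j) for i in range(n) for j in range(n)]
W = np.zeros((n * n, n * n))
for p, (i, j) in enumerate(coords):
    for q, (k, l) in enumerate(coords):
        if abs(i - k) + abs(j - l) == 1:
            W[p, q] = 1.0

# Smooth east-west gradient in the attribute (e.g. per capita consumption):
# high-high and low-low regions cluster, so Moran's I is strongly positive.
x = np.array([float(j) for i, j in coords])
I = morans_i(x, W)
print(I)
```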

  2. Psychosocial family factors and glycemic control among children aged 1-15 years with type 1 diabetes: a population-based survey

    Directory of Open Access Journals (Sweden)

    Haugstvedt Anne

    2011-12-01

    Background: Being the parents of children with diabetes is demanding. Jay Belsky's determinants of parenting model emphasizes both the personal psychological resources, the characteristics of the child and contextual sources such as parents' work, marital relations and social network support as important determinants for parenting. To better understand the factors influencing parental functioning among parents of children with type 1 diabetes, we aimed to investigate associations between the children's glycated hemoglobin (HbA1c) and 1) variables related to the parents' psychological and contextual resources, and 2) frequency of blood glucose measurement as a marker for diabetes-related parenting behavior. Methods: Mothers (n = 103) and fathers (n = 97) of 115 children younger than 16 years old participated in a population-based survey. The questionnaire comprised the Life Orientation Test, the Oslo 3-item Social Support Scale, a single question regarding perceived social limitation because of the child's diabetes, the Relationship Satisfaction Scale and demographic and clinical variables. We investigated associations by using regression analysis. Related to the second aim, hypoglycemic events, child age, diabetes duration, insulin regimen and comorbid diseases were included as covariates. Results: The mean HbA1c was 8.1%, and 29% had HbA1c ≤ 7.5%. In multiple regression analysis, lower HbA1c was associated with higher education and stronger perceptions of social limitation among the mothers. A higher frequency of blood glucose measurement was significantly associated with lower HbA1c in bivariate analysis. Higher child age was significantly associated with higher HbA1c both in bivariate and multivariate analysis. A scatterplot indicated this association to be linear. Conclusions: Most families do not reach recommended treatment goals for their child with type 1 diabetes. Concerning contextual sources of stress and support, the families who

  3. Optical space weathering on Vesta: Radiative-transfer models and Dawn observations

    Science.gov (United States)

    Blewett, David T.; Denevi, Brett W.; Le Corre, Lucille; Reddy, Vishnu; Schröder, Stefan E.; Pieters, Carle M.; Tosi, Federico; Zambon, Francesca; De Sanctis, Maria Cristina; Ammannito, Eleonora; Roatsch, Thomas; Raymond, Carol A.; Russell, Christopher T.

    2016-02-01

    Exposure to ion and micrometeoroid bombardment in the space environment causes physical and chemical changes in the surface of an airless planetary body. These changes, called space weathering, can strongly influence a surface's optical characteristics, and hence complicate interpretation of composition from reflectance spectroscopy. Prior work using data from the Dawn spacecraft (Pieters, C.M. et al. [2012]. Nature 491, 79-82) found that accumulation of nanophase metallic iron (npFe0), which is a key space-weathering product on the Moon, does not appear to be important on Vesta, and instead regolith evolution is dominated by mixing with carbonaceous chondrite (CC) material delivered by impacts. In order to gain further insight into the nature of space weathering on Vesta, we constructed model reflectance spectra using Hapke's radiative-transfer theory and used them as an aid to understanding multispectral observations obtained by Dawn's Framing Cameras (FC). The model spectra, for a howardite mineral assemblage, include both the effects of npFe0 and that of a mixed CC component. We found that a plot of the 438-nm/555-nm ratio vs. the 555-nm reflectance for the model spectra helps to separate the effects of lunar-style space weathering (LSSW) from those of CC-mixing. We then constructed ratio-reflectance pixel scatterplots using FC images for four areas of contrasting composition: a eucritic area at Vibidia crater, a diogenitic area near Antonia crater, olivine-bearing material within Bellicia crater, and a light mantle unit (referred to as an "orange patch" in some previous studies, based on steep spectral slope in the visible) northeast of Oppia crater. In these four cases the observed spectral trends are those expected from CC-mixing, with no evidence for weathering dominated by production of npFe0.
In order to survey a wider range of surfaces, we also defined a spectral parameter that is a function of the change in 438-nm/555-nm ratio and the 555-nm reflectance

  4. Short-term association between environmental factors and hospital admissions due to dementia in Madrid.

    Science.gov (United States)

    Linares, C; Culqui, D; Carmona, R; Ortiz, C; Díaz, J

    2017-01-01

    Spain has one of the highest proportions of dementia in the world among the population aged 60 years or over. Recent studies link various environmental factors to neurocognitive-type diseases. This study sought to analyse whether urban risk factors such as traffic noise, pollutants and heat waves might have a short-term impact on exacerbation of symptoms of dementia, leading to emergency hospital admission. We conducted a longitudinal ecological time-series study, with the dependent variable being the number of daily dementia-related emergency (DDE) hospital admissions to Madrid municipal hospitals (ICD-10 codes 290.0-290.2, 290.4-290.9, 294.1-294) from 01-01-2001 to 31-12-2009, as obtained from the Hospital Morbidity Survey (National Statistics Institute). The measures used were as follows: for noise pollution, Leqd, equivalent diurnal noise level (from 8 to 22h), and Leqn, equivalent nocturnal noise level (from 22 to 8h), in dB(A); for chemical pollution, mean daily NO2, PM2.5, PM1 as provided by the Madrid Municipal Air Quality Monitoring Grid; and lastly, maximum daily temperature (°C), as supplied by the State Meteorological Agency. Scatterplot diagrams were plotted to assess the type of functional relationship existing between the main variable of analysis and the environmental variables. The lags of the environmental variables were calculated to analyse the timing of the effect. Poisson regression models were fitted, controlling for trends and seasonalities, to quantify relative risk (RR). During the study period, there were 1175 DDE hospital admissions. These admissions displayed a linear functional relationship without a threshold in the case of Leqd. The RR of DDE admissions was 1.15 (1.11-1.20) for an increase of 1 dB in Leqd, with impact at lag 0. In the case of maximum daily temperature, there was a threshold temperature of 34°C, with an increase of 1°C over this threshold posing an RR of 1.19 (1.09-1.30) at lag 1. The only pollutant to show an
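
    The relative-risk arithmetic of a log-link Poisson model (RR per +1 dB equals exp of the noise coefficient) can be sketched on simulated data; the sketch below fits the model by iteratively reweighted least squares. The daily series, the true RR of 1.15 and all parameter values are invented, not the study's data, and no trend or seasonality controls are included.

```python
import numpy as np

rng = np.random.default_rng(11)

# Invented daily series: diurnal noise level Leqd (dB) and daily admission
# counts, generated so each extra dB multiplies the admission rate by 1.15.
leqd = rng.uniform(60, 70, 3000)
y = rng.poisson(np.exp(-8.0 + np.log(1.15) * leqd))

# Poisson regression (log link) fitted by iteratively reweighted least squares
X = np.column_stack([np.ones_like(leqd), leqd])
beta = np.array([np.log(y.mean()), 0.0])
for _ in range(50):
    mu = np.exp(X @ beta)
    z = X @ beta + (y - mu) / mu           # working response
    beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))

# Relative risk per +1 dB with an approximate 95% confidence interval
mu = np.exp(X @ beta)
se = np.sqrt(np.linalg.inv(X.T @ (mu[:, None] * X))[1, 1])
rr = np.exp(beta[1])
ci = (np.exp(beta[1] - 1.96 * se), np.exp(beta[1] + 1.96 * se))
print(rr, ci)
```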

  5. Local indicators of geocoding accuracy (LIGA): theory and application

    Directory of Open Access Journals (Sweden)

    Jacquez Geoffrey M

    2009-10-01

    Background: Although sources of positional error in geographic locations (e.g. geocoding error) used for describing and modeling spatial patterns are widely acknowledged, research on how such error impacts the statistical results has been limited. In this paper we explore techniques for quantifying the perturbability of spatial weights to different specifications of positional error. Results: We find that a family of curves describes the relationship between perturbability and positional error, and use these curves to evaluate sensitivity of alternative spatial weight specifications to positional error both globally (when all locations are considered simultaneously) and locally (to identify those locations that would benefit most from increased geocoding accuracy). We evaluate the approach in simulation studies, and demonstrate it using a case-control study of bladder cancer in south-eastern Michigan. Conclusion: Three results are significant. First, the shape of the probability distributions of positional error (e.g. circular, elliptical, cross) has little impact on the perturbability of spatial weights, which instead depends on the mean positional error. Second, our methodology allows researchers to evaluate the sensitivity of spatial statistics to positional accuracy for specific geographies. This has substantial practical implications since it makes possible routine sensitivity analysis of spatial statistics to positional error arising in geocoded street addresses, global positioning systems, LIDAR and other geographic data. Third, those locations with high perturbability (most sensitive to positional error) and high leverage (that contribute the most to the spatial weight being considered) will benefit the most from increased positional accuracy. These are rapidly identified using a new visualization tool we call the LIGA scatterplot.
Herein lies a paradox for spatial analysis: For a given level of positional error increasing sample density
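The core idea of the record above, quantifying how sensitive spatial weights are to positional (geocoding) error, can be illustrated with a small Monte Carlo sketch. This is a minimal illustration under assumed conventions, not the paper's actual method: it uses k-nearest-neighbour weights and circular Gaussian error, and `perturbability` here simply means the mean fraction of neighbour sets that change after jittering.

```python
import math, random

def knn_weights(points, k=2):
    """k-nearest-neighbour weight sets: for each point, the indices of its k neighbours."""
    sets = []
    for i, p in enumerate(points):
        d = sorted((math.dist(p, q), j) for j, q in enumerate(points) if j != i)
        sets.append(frozenset(j for _, j in d[:k]))
    return sets

def perturbability(points, error_sd, k=2, trials=200, seed=1):
    """Mean fraction of k-NN neighbour sets that change when each location is
    jittered by circular Gaussian positional error with standard deviation error_sd."""
    rng = random.Random(seed)
    base = knn_weights(points, k)
    changed = 0.0
    for _ in range(trials):
        jittered = [(x + rng.gauss(0, error_sd), y + rng.gauss(0, error_sd))
                    for x, y in points]
        new = knn_weights(jittered, k)
        changed += sum(b != n for b, n in zip(base, new)) / len(points)
    return changed / trials

random.seed(0)
pts = [(random.uniform(0, 10), random.uniform(0, 10)) for _ in range(30)]
low, high = perturbability(pts, 0.01), perturbability(pts, 2.0)
print(low, high)  # larger positional error disrupts more neighbour sets
```

Plotting each location's perturbability against its leverage would give a toy analogue of the LIGA scatterplot described above.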

  6. FlooDSuM - a decision support methodology for assisting local authorities in flood situations

    Science.gov (United States)

    Schwanbeck, Jan; Weingartner, Rolf

    2014-05-01

    Decision making in flood situations is a difficult task, especially in small to medium-sized mountain catchments (30 - 500 km2), which are usually characterized by complex topography, high drainage density and quick runoff response to rainfall events. Operating hydrological models driven by numerical weather prediction systems, which have a lead-time of several hours up to even a few days, would be beneficial in this case, as time for prevention could be gained. However, the spatial and quantitative accuracy of such meteorological forecasts usually decreases with increasing lead-time. In addition, the sensitivity of rainfall-runoff models to inaccuracies in estimates of areal rainfall increases with decreasing catchment size. Accordingly, decisions on flood alerts should ideally be based on areal rainfall from high-resolution, short-term numerical weather prediction, nowcasts or even real-time measurements, which is transformed into runoff by a hydrological model. In order to benefit from the best possible rainfall data while retaining enough time for alerting and prevention, the hydrological model should be fast and easily applicable by decision makers within local authorities themselves. The proposed decision support methodology FlooDSuM (Flood Decision Support Methodology) aims to meet those requirements. Applying FlooDSuM, a few successive binary decisions of increasing complexity are processed following a flow-chart-like structure. Prepared data and straightforwardly applicable tools are provided for each of these decisions. Maps showing the current flood disposition are used for the first step. As long as the danger of flooding cannot be excluded, increasingly complex and time-consuming methods are applied. For the final decision, a set of scatter-plots relating areal precipitation to peak flow is provided. These plots also take further decisive parameters into account, such as storm duration, the distribution of rainfall intensity in time, as well as the
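The flow-chart-like chain of binary decisions described above can be sketched in code. All thresholds and the rain-to-peak-flow relation below are hypothetical placeholders (a crude rational-method-style estimate), not FlooDSuM's actual values or tools:

```python
def estimate_peak_flow(areal_rain_mm, storm_hours, catchment_km2, runoff_coeff=0.5):
    """Very rough peak-flow proxy (hypothetical rational-method-style estimate):
    mean effective rain intensity times catchment area."""
    intensity_ms = (areal_rain_mm / 1000.0) / (storm_hours * 3600.0)  # m/s
    return runoff_coeff * intensity_ms * catchment_km2 * 1e6          # m^3/s

def floodsum_decision(disposition_elevated, forecast_rain_mm, storm_hours,
                      catchment_km2, alert_flow_m3s):
    """Successive binary decisions of increasing complexity, flow-chart style."""
    if not disposition_elevated:            # step 1: flood-disposition map
        return "no alert"
    if forecast_rain_mm < 20:               # step 2: coarse rain screen (hypothetical threshold)
        return "no alert"
    # step 3: precipitation-to-peak-flow relation (scatterplot stand-in)
    peak = estimate_peak_flow(forecast_rain_mm, storm_hours, catchment_km2)
    return "alert" if peak >= alert_flow_m3s else "monitor"

print(floodsum_decision(True, 60, 6, 100, 150))   # 60 mm over 6 h on 100 km^2
```

In the real methodology the final step consults prepared scatterplots rather than a formula, but the sequencing, cheap checks first, expensive ones only while danger cannot be excluded, is the same.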

  7. A critical review of the ESCAPE project for estimating long-term health effects of air pollution.

    Science.gov (United States)

    Lipfert, Frederick W

    2017-02-01

    The European Study of Cohorts for Air Pollution Effects (ESCAPE) is a 13-nation study of long-term health effects of air pollution based on subjects pooled from up to 22 cohorts that were intended for other purposes. Twenty-five papers have been published on associations of various health endpoints with long-term exposures to NOx, NO2, traffic indicators, PM10, PM2.5 and PM constituents including absorbance (elemental carbon). Seven additional ESCAPE papers found moderate correlations (R2 = 0.3-0.8) between measured air quality and the land-use regression estimates that were used; personal exposures were not considered. I found no project summaries or comparisons across papers; here I conflate the 25 ESCAPE findings in the context of other recent European epidemiology studies. Because one ESCAPE cohort contributed about half of the subjects, I consider it and the other 18 cohorts separately to compare their contributions to the combined risk estimates. I emphasize PM2.5 and confirm the published hazard ratio of 1.14 (1.04-1.26) per 10 μg/m3 for all-cause mortality. The ESCAPE papers found 16 statistically significant pollutant-endpoint combinations: 4 each for PM2.5 and PM10, 1 for PM absorbance, 5 for NO2, and 2 for traffic. No PM constituent was consistently significant. No significant associations were reported for cardiovascular mortality; low birth weight was significant for all pollutants except PM absorbance. Based on associations with PM2.5, I find large differences between all-cause death estimates and the sum of specific-cause death estimates. Scatterplots of PM2.5 mortality risks by cause show no consistency across the 18 cohorts, ostensibly because of the relatively few subjects. Overall, I find the ESCAPE project inconclusive and I question whether the efforts required to estimate exposures for small cohorts were worthwhile. I suggest that detailed studies of the large cohort using historical exposures and additional cardiovascular risk factors might

  8. Methyl chloride in the UT/LS observed by CARIBIC: global distribution, Asian summer monsoon outflow, and use as a tracer for tropical air

    Science.gov (United States)

    Baker, A. K.; Umezawa, T.; Oram, D.; Sauvage, C.; Rauthe-Schoech, A.; Montzka, S. A.; Zahn, A.; Brenninkmeijer, C. A. M.

    2014-12-01

    We present spatiotemporal variations of methyl chloride (CH3Cl) in the UT/LS observed mainly by the CARIBIC passenger aircraft for the years 2005-2011. The CH3Cl mixing ratio in the UT over Europe was higher than that observed at a European surface baseline station year-round, indicative of a persistent positive vertical gradient at NH mid latitudes. A series of flights over Africa and South Asia show that CH3Cl mixing ratios increase toward tropical latitudes, and the observed UT CH3Cl level over these two regions and the Atlantic was higher than that measured at remote surface sites. Strong emissions of CH3Cl in the tropics combined with meridional transport through the UT may explain such vertical and latitudinal gradients. Comparisons with CO data indicate that non-combustion sources in the tropics dominantly contribute to forming the latitudinal gradient of CH3Cl in the UT. We also observed elevated CH3Cl and CO in air influenced by biomass burning in South America and Africa, and the enhancement ratios derived for CH3Cl to CO in those regions agree with previous observations. In contrast, correlations indicate a high CH3Cl to CO ratio of 2.9±0.5 ppt ppb-1 in the Asian summer monsoon anticyclone and domestic biofuel emissions in South Asia are inferred to be responsible. We estimated CH3Cl emissions from South Asia to be 134±23 Gg Cl yr-1, which is higher than a previous estimate due to the higher CH3Cl to CO ratio observed in this study. We also examine the use of CH3Cl as a tracer of tropical tropospheric air in the LMS, where we identified air masses with elevated CH3Cl that were however stratospheric in terms of N2O. Back trajectories suggest recent low-latitude origins of such air masses in early summer. In this season, high CH3Cl LMS air shows a clear branch connecting stratospheric and tropical tropospheric air on N2O-CH3Cl scatterplots. This distinct feature vanishes in late summer when the LMS is ventilated by tropospheric air.
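The CH3Cl-to-CO enhancement ratio discussed above (e.g. 2.9 ± 0.5 ppt ppb-1 in the Asian summer monsoon anticyclone) is typically derived as the slope of the correlation between the two tracers. A minimal sketch, with entirely synthetic plume data assumed for illustration:

```python
def ols_slope(x, y):
    """Least-squares slope of y on x (e.g. ppt CH3Cl per ppb CO)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    return sxy / sxx

# synthetic plume samples: CO (ppb) and CH3Cl (ppt) built with a 2.9 ppt/ppb enhancement
co    = [80, 100, 120, 150, 180]
ch3cl = [550 + 2.9 * c for c in co]
print(round(ols_slope(co, ch3cl), 2))  # -> 2.9
```

In practice the slope would be fitted to co-located in-plume measurements (often with a regression method that allows for errors in both variables), and its uncertainty propagated into the emission estimate.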

  9. Mathematical formulae to estimate chronic subdural haematoma volume. Flawed assumption regarding ellipsoid morphology.

    Science.gov (United States)

    Manickam, Appukutty; Marshman, Laurence A G; Johnston, Ross; Thomas, Piers A W

    2017-06-01

    Mathematical formulae are commonly used to estimate intra-cranial haematoma volume. Such formulae tacitly assume an ellipsoid geometrical morphology. Recently, the 'XYZ/2' formula has been validated and recommended for chronic subdural haematoma (CSDH) volumetric estimation. We aimed to assess the precision and accuracy of mathematical formulae specifically in estimating CSDH volume, and to determine typical CSDH 3-D morphology. Three extant formulae ('XYZ/2', 'π/6·XYZ' and '2/3S·h') were compared against computer-assisted 3-D volumetric analysis as the gold standard on CT scans in which the CSDH contrasted sufficiently with brain. Scatter-plots (n=45) indicated that, in contrast to prior reports, all formulae most commonly over-estimated CSDH volume against the 3-D gold standard ('2/3S·h': 44.4%, 'XYZ/2': 48.84% and 'π/6·XYZ': 55.6%). With all formulae, imprecision increased with increasing CSDH volume: in particular, with clinically relevant CSDH volumes (i.e. >50 ml). Deviations >10% of equivalence were observed in 60% of estimates for '2/3S·h', 77.8% for 'XYZ/2' and 84.4% for 'π/6·XYZ'. The maximum error for 'XYZ/2' was 142.3% of a clinically relevant volume. 3-D simulations revealed that only 4/45 (9%) CSDH remotely conformed to ellipsoid geometrical morphology. Most (41/45, 91%) demonstrated highly irregular morphology, recognisable neither as an ellipsoid nor as any other regular or non-regular geometric solid. Mathematical formulae, including 'XYZ/2', most commonly proved inaccurate and imprecise when applied to CSDH. In contrast to prior studies, all most commonly over-estimated CSDH volume. Imprecision increased with CSDH volume, and was maximal with clinically relevant CSDH volumes. Errors most commonly related to a flawed assumption regarding ellipsoid 3-D CSDH morphology. The validity of mean comparisons, or correlation analyses, used in prior studies is questioned.
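The three formulae compared above are simple products of the haematoma's maximal diameters (X, Y, Z) or of a slice area S and height h. A small sketch computing them against a gold-standard volume; the example diameters and gold-standard value are hypothetical, chosen only to show how percent error would be tabulated:

```python
import math

def vol_xyz_half(x, y, z):       # 'XYZ/2' (ABC/2-style ellipsoid approximation)
    return x * y * z / 2.0

def vol_pi6_xyz(x, y, z):        # 'pi/6*XYZ' (exact ellipsoid with diameters x, y, z)
    return math.pi / 6.0 * x * y * z

def vol_two_thirds_sh(s, h):     # '2/3*S*h' (s = largest slice area, h = height)
    return 2.0 / 3.0 * s * h

def pct_error(estimate, gold):
    """Signed percent deviation of a formula estimate from the gold standard."""
    return 100.0 * (estimate - gold) / gold

# hypothetical CSDH: maximal diameters 10 x 8 x 2 cm, gold-standard volume 75 ml
gold = 75.0
print(round(pct_error(vol_xyz_half(10, 8, 2), gold), 1))  # XYZ/2 = 80 ml -> +6.7%
```

Since π/6 ≈ 0.524 > 1/2, 'π/6·XYZ' always exceeds 'XYZ/2' for the same diameters, which is consistent with its larger over-estimation rate reported above.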

  10. Analysis of N-H···O hydrogen bonds in new C(O)-NH-P(O)-based phosphoric triamides and analogous structures deposited in the Cambridge Structural Database.

    Science.gov (United States)

    Pourayoubi, Mehrdad; Toghraee, Maryam; Divjakovic, Vladimir; van der Lee, Arie; Mancilla Percino, Teresa; Leyva Ramírez, Marco A; Saneei, Anahid

    2013-04-01

    Five new compounds belonging to the phosphoric triamide family have been synthesized: two of them with the formula XC(O)NHP(O)Y [X = CF3 (1) and CClF2 (2), Y = NHCH2C(CH3)2CH2NH] involving a 1,3-diazaphosphorinane ring part, and three 2,6-Cl2C6H3C(O)NHP(O)Z2 phosphoric triamides [Z = NHC(CH3)3 (3), N(CH3)(C6H11) (4) and N(CH3)(CH2C6H5) (5)]. The characterization was performed by (31)P{(1)H}, (1)H, (13)C NMR and IR spectroscopy, besides (19)F NMR for the fluorine-containing compounds (1) and (2), and X-ray single-crystal structure analysis for (1), (3), (4) and (5). In each molecule the P atom has a distorted tetrahedral environment. The N atoms bonded to the P atom have mainly sp(2) character, with a very slight tendency to a pyramidal coordination for some amido groups. Different types of N-H···O hydrogen bonds have been analyzed for (1), (3), (4) and (5) and 118 other structures (including 194 hydrogen bonds) deposited in the Cambridge Structural Database, containing either C(O)-NH-P(O)[N(C)(C)]2 or C(O)-NH-P(O)[NH(C)]2. The participation of N(CP)-H···O=P [N(CP) = the nitrogen atom of the C(O)-NH-P(O) fragment], N-H···O=P, N-H···O=C and N(CP)-H···O=C hydrogen bonds in different hydrogen-bonded motifs is discussed. Moreover, the involvement of the O atoms of C=O or P=O in the [N(CP)-H][N-H]···O=P, [N-H]2···O=P, [N-H]2···O=C and [N-H]3···O=C groups is considered. A histogram of N···O distances, the distribution of N-H···O angles and the scatterplot of N-H···O angles versus N···O distances are studied.

  11. Evaluating concentration estimation errors in ELISA microarray experiments.

    Science.gov (United States)

    Daly, Don Simone; White, Amanda M; Varnum, Susan M; Anderson, Kevin K; Zangar, Richard C

    2005-01-26

    Enzyme-linked immunosorbent assay (ELISA) is a standard immunoassay to estimate a protein's concentration in a sample. Deploying ELISA in a microarray format permits simultaneous estimation of the concentrations of numerous proteins in a small sample. These estimates, however, are uncertain due to processing error and biological variability. Evaluating estimation error is critical to interpreting biological significance and improving the ELISA microarray process. Estimation error evaluation must be automated to realize a reliable high-throughput ELISA microarray system. In this paper, we present a statistical method based on propagation of error to evaluate concentration estimation errors in the ELISA microarray process. Although propagation of error is central to this method and the focus of this paper, it is most effective only when comparable data are available. Therefore, we briefly discuss the roles of experimental design, data screening, normalization, and statistical diagnostics when evaluating ELISA microarray concentration estimation errors. We use an ELISA microarray investigation of breast cancer biomarkers to illustrate the evaluation of concentration estimation errors. The illustration begins with a description of the design and resulting data, followed by a brief discussion of data screening and normalization. In our illustration, we fit a standard curve to the screened and normalized data, review the modeling diagnostics, and apply propagation of error. We summarize the results with a simple, three-panel diagnostic visualization featuring a scatterplot of the standard data with logistic standard curve and 95% confidence intervals, an annotated histogram of sample measurements, and a plot of the 95% concentration coefficient of variation, or relative error, as a function of concentration. This statistical method should be of value in the rapid evaluation and quality control of high-throughput ELISA microarray analyses. 
Applying propagation of error to
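The propagation-of-error step described above can be sketched with the delta method: invert the fitted standard curve to get a concentration, then approximate var(x) ≈ (dx/dy)² · var(y). The four-parameter logistic form and all parameter values below are illustrative assumptions, not the paper's fitted curve:

```python
def fourpl(x, a, b, c, d):
    """Four-parameter logistic standard curve: a = zero-dose asymptote,
    d = infinite-dose asymptote, c = inflection concentration, b = slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def inv_fourpl(y, a, b, c, d):
    """Concentration estimate: analytic inverse of the 4PL curve."""
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

def concentration_cv(y, y_sd, params):
    """Delta method: var(x) ~ (dx/dy)^2 * var(y); CV = sd(x)/x.
    dx/dy is taken by central finite difference."""
    a, b, c, d = params
    x = inv_fourpl(y, a, b, c, d)
    h = 1e-6 * y
    dxdy = (inv_fourpl(y + h, a, b, c, d) - inv_fourpl(y - h, a, b, c, d)) / (2 * h)
    return abs(dxdy) * y_sd / x

params = (2.0, 1.0, 10.0, 0.05)   # hypothetical fitted 4PL parameters
y = fourpl(5.0, *params)          # signal at 5 concentration units
cv = concentration_cv(y, 0.02, params)
print(round(cv, 3))               # -> 0.046
```

Plotting this CV as a function of concentration reproduces the third panel of the diagnostic visualization described in the abstract: relative error blows up near the curve's asymptotes, where dx/dy is large.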

  12. Is BMI a valid measure of obesity in postmenopausal women?

    Science.gov (United States)

    Banack, Hailey R; Wactawski-Wende, Jean; Hovey, Kathleen M; Stokes, Andrew

    2017-11-13

    Body mass index (BMI) is a widely used indicator of obesity status in clinical settings and population health research. However, there are concerns about the validity of BMI as a measure of obesity in postmenopausal women. Unlike BMI, which is an indirect measure of obesity and does not distinguish lean from fat mass, dual-energy x-ray absorptiometry (DXA) provides a direct measure of body fat and is considered a gold standard of adiposity measurement. The goal of this study is to examine the validity of using BMI to identify obesity in postmenopausal women relative to total body fat percent measured by DXA scan. Data from 1,329 postmenopausal women participating in the Buffalo OsteoPerio Study were used in this analysis. At baseline, women ranged in age from 53 to 85 years. Obesity was defined as BMI ≥ 30 kg/m2 and body fat percent (BF%) greater than 35%, 38%, or 40%. We calculated sensitivity, specificity, positive predictive value, and negative predictive value to evaluate the validity of BMI-defined obesity relative to BF%. We further explored the validity of BMI relative to BF% using graphical tools, such as scatterplots and receiver-operating characteristic curves. Youden's J index was used to determine the empirical optimal BMI cut-point for each level of BF%-defined obesity. The sensitivity of BMI-defined obesity was 32.4% for 35% body fat, 44.6% for 38% body fat, and 55.2% for 40% body fat. Corresponding specificity values were 99.3%, 97.1%, and 94.6%, respectively. The empirical optimal BMI cut-point to define obesity is 24.9 kg/m2 for 35% BF, 26.49 kg/m2 for 38% BF, and 27.05 kg/m2 for 40% BF according to Youden's index. Results demonstrate that a BMI cut-point of 30 kg/m2 does not appear to be an appropriate indicator of true obesity status in postmenopausal women. Empirical estimates of the validity of BMI from this study may be used by other investigators to account for BMI-related misclassification in older women.
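The validity metrics used above (sensitivity and specificity of a BMI cut-point against BF%-defined obesity, and the Youden-optimal cut-point) are straightforward to compute. The ten-subject dataset below is invented purely to exercise the code, not drawn from the study:

```python
def classify_metrics(bmi, bf_pct, bmi_cut, bf_cut):
    """Sensitivity and specificity of BMI >= bmi_cut against BF% > bf_cut."""
    tp = sum(b >= bmi_cut and f > bf_cut for b, f in zip(bmi, bf_pct))
    fn = sum(b < bmi_cut and f > bf_cut for b, f in zip(bmi, bf_pct))
    tn = sum(b < bmi_cut and f <= bf_cut for b, f in zip(bmi, bf_pct))
    fp = sum(b >= bmi_cut and f <= bf_cut for b, f in zip(bmi, bf_pct))
    return tp / (tp + fn), tn / (tn + fp)

def optimal_cutpoint(bmi, bf_pct, bf_cut):
    """Empirical Youden-optimal BMI cut-point: maximises sensitivity + specificity - 1."""
    return max(set(bmi), key=lambda c: sum(classify_metrics(bmi, bf_pct, c, bf_cut)))

bmi    = [22, 24, 25, 27, 28, 29, 31, 33, 35, 26]   # hypothetical subjects
bf_pct = [28, 33, 36, 39, 37, 41, 42, 44, 46, 30]
sens, spec = classify_metrics(bmi, bf_pct, 30, 35)
print(sens, spec, optimal_cutpoint(bmi, bf_pct, 35))
```

Note the pattern this toy example shares with the study's finding: the conventional cut-point of 30 gives high specificity but poor sensitivity, and the Youden-optimal cut-point lands well below 30.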

  13. Descriptive Statistics: Reporting the Answers to the 5 Basic Questions of Who, What, Why, When, Where, and a Sixth, So What?

    Science.gov (United States)

    Vetter, Thomas R

    2017-11-01

    of the association between the exposure and the outcome (eg, the risk ratio or odds ratio) in the population likely resides. There are many possible ways to graphically display or illustrate different types of data. While there is often latitude as to the choice of format, ultimately, the simplest and most comprehensible format is preferred. Common examples include a histogram, bar chart, line chart or line graph, pie chart, scatterplot, and box-and-whisker plot. Valid and reliable descriptive statistics can answer basic yet important questions about a research data set, namely: "Who, What, Why, When, Where, How, How Much?"

  14. [An Improved DDV Method to Retrieve AOT for HJ CCD Image in Typical Mountainous Areas].

    Science.gov (United States)

    Zhao, Zhi-qiang; Li, Ai-nong; Bian, Jin-hu; Huang, Cheng-quan

    2015-06-01

    Domestic HJ CCD imagery has great potential for applications in environment and disaster monitoring and prediction. However, because the HJ CCD sensor lacks a mid-infrared band, Aerosol Optical Thickness (AOT) cannot be retrieved directly by the traditional Dark Dense Vegetation (DDV) method; moreover, mountain AOT varies strongly in space and time under the influence of the mountain environment, which reduces the accuracy of atmospheric correction. Based on the wide distribution of dark dense forest in mountainous areas, the red-band histogram threshold method was introduced to identify mountainous DDV pixels. Subsequently, the AOT of DDV pixels was retrieved from a lookup table constructed with the 6S radiative transfer model, under the assumption of a constant ratio between surface reflectance in the red and blue bands, and then interpolated to the whole image. The MODIS aerosol product and the AOT retrieved by the proposed algorithm show very good consistency in spatial distribution, and the HJ CCD image, with its higher spatial resolution, is more suitable for remote sensing monitoring of aerosol in mountainous areas. The fitted scatterplot regression was y = 0.8286x - 0.01 with R2 = 0.9843, indicating that the improved DDV method can effectively retrieve AOT, with a precision sufficient for atmospheric correction and terrain radiation correction of HJ CCD images in mountainous areas. The improvement of the traditional DDV method effectively solves the problem of insufficient information when solving the radiative transfer equation for HJ CCD images, which have only visible and near-infrared bands. Meanwhile, the improved method fully considers the influence of the mountainous terrain environment. It lays a solid foundation for HJ CCD image atmospheric correction in mountainous areas, and offers the possibility of automated processing. In addition, the red-band histogram threshold method was better than the NDVI method at identifying mountain DDV pixels. And, the lookup table and ratio between surface reflectance
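The two key steps of the improved DDV scheme, a red-band histogram threshold to pick DDV pixels, and inversion of a lookup table under a fixed red/blue surface-reflectance ratio, can be sketched as below. The dark fraction, the red/blue ratio, and the toy lookup table are all hypothetical stand-ins for the paper's 6S-derived values:

```python
def ddv_pixels(red_band, dark_fraction=0.05):
    """Red-band histogram threshold: take the darkest `dark_fraction` of pixels
    as dark dense vegetation (DDV) candidates."""
    cut = sorted(red_band)[max(0, int(len(red_band) * dark_fraction) - 1)]
    return [i for i, r in enumerate(red_band) if r <= cut]

def retrieve_aot(toa_blue, surf_red, lut, red_blue_ratio=2.0):
    """Invert a 6S-style lookup table (toy values here) mapping
    (surface blue reflectance, AOT) -> TOA blue reflectance.
    Surface blue reflectance is assumed proportional to the red band."""
    surf_blue = surf_red / red_blue_ratio
    return min(lut, key=lambda aot: abs(lut[aot](surf_blue) - toa_blue))

# toy LUT: TOA reflectance = surface reflectance + a path term growing with AOT
lut = {aot: (lambda s, a=aot: s + 0.05 * a) for aot in (0.1, 0.3, 0.5, 0.8)}

red = [0.02, 0.03, 0.025, 0.15, 0.20, 0.18, 0.022, 0.19, 0.21, 0.17,
       0.16, 0.23, 0.024, 0.22, 0.19, 0.18, 0.21, 0.20, 0.17, 0.19]
dark = ddv_pixels(red)
aot = retrieve_aot(toa_blue=0.035, surf_red=0.04, lut=lut)
print(dark, aot)
```

In the full algorithm the per-pixel AOT values retrieved over DDV pixels would then be spatially interpolated to the whole scene, as the abstract describes.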

  15. An exploratory factor analysis of the spontaneous reporting of severe cutaneous adverse reactions.

    Science.gov (United States)

    Hauben, Manfred; Hung, Eric; Hsieh, Wen-Yaw

    2017-01-01

    Severe cutaneous adverse reactions (SCARs) are prominent in pharmacovigilance (PhV). They have some commonalities such as nonimmediate nature and T-cell mediation, and rare overlap syndromes have been documented, most commonly involving acute generalized exanthematous pustulosis (AGEP) and drug rash with eosinophilia and systemic symptoms (DRESS), and DRESS and toxic epidermal necrolysis (TEN). However, they display diverse clinical phenotypes and variations in specific T-cell immune response profiles, plus some specific genotype-phenotype associations. A question is whether causation of a given SCAR by a given drug supports causality of the same drug for other SCARs. If so, we might expect significant intercorrelations between SCARs with respect to overall drug-reporting patterns. SCARs with significant intercorrelations may reflect a unified underlying concept. We used exploratory factor analysis (EFA) on data from the United States Food and Drug Administration Adverse Event Reporting System (FAERS) to assess reporting intercorrelations between six SCARs [AGEP, DRESS, erythema multiforme (EM), Stevens-Johnson syndrome (SJS), TEN, exfoliative dermatitis (ExfolDerm)]. We screened the data using visual inspection of scatterplot matrices for problematic data patterns. We assessed factorability via Bartlett's test of sphericity, the Kaiser-Meyer-Olkin (KMO) statistic, initial estimates of communality and the anti-image correlation matrix. We extracted factors via principal axis factoring (PAF). The number of factors was determined by scree plot/Kaiser's rule. We also examined solutions with an additional factor. We applied various oblique rotations. We assessed the strength of the solution by percentage of variance explained, minimum number of factors loading per major factor, the magnitude of the communalities, loadings and crossloadings, and reproduced- and residual correlations. The data were generally adequate for factor analysis but the amount of variance explained
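Of the factorability checks listed above, Bartlett's test of sphericity is the simplest to show: it tests whether a correlation matrix R differs from the identity via χ² = -(n - 1 - (2p + 5)/6) · ln|R| with p(p-1)/2 degrees of freedom. The sketch below uses an invented 3×3 correlation matrix (not FAERS data) to keep the determinant explicit:

```python
import math

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def bartlett_sphericity(R, n):
    """Bartlett's test of sphericity for a p x p correlation matrix R estimated
    from n observations. A large chi2 rejects 'R = identity', i.e. the
    variables are intercorrelated enough for factor analysis."""
    p = len(R)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * math.log(det3(R))
    df = p * (p - 1) // 2
    return chi2, df

# hypothetical reporting intercorrelations among three SCAR terms
R = [[1.0, 0.6, 0.4],
     [0.6, 1.0, 0.5],
     [0.4, 0.5, 1.0]]
chi2, df = bartlett_sphericity(R, n=500)
print(round(chi2, 1), df)
```

The KMO statistic adds a complementary check based on partial correlations (via the anti-image matrix); both must look adequate before extraction by principal axis factoring makes sense.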

  16. Clinic Blood Pressure Underestimates Ambulatory Blood Pressure in an Untreated Employer-Based US Population: Results From the Masked Hypertension Study.

    Science.gov (United States)

    Schwartz, Joseph E; Burg, Matthew M; Shimbo, Daichi; Broderick, Joan E; Stone, Arthur A; Ishikawa, Joji; Sloan, Richard; Yurgel, Tyla; Grossman, Steven; Pickering, Thomas G

    2016-12-06

    Ambulatory blood pressure (ABP) is consistently superior to clinic blood pressure (CBP) as a predictor of cardiovascular morbidity and mortality risk. A common perception is that ABP is usually lower than CBP. The relationship of the CBP minus ABP difference to age has not been examined in the United States. Between 2005 and 2012, 888 healthy, employed, middle-aged (mean±SD age, 45±10.4 years) individuals (59% female, 7.4% black, 12% Hispanic) whose screening BP met study entry criteria completed ABP recording for the Masked Hypertension Study. The distributions of CBP, mean awake ABP (aABP), and the CBP-aABP difference in the full sample and by demographic characteristics were compared. Locally weighted scatterplot smoothing was used to model the relationship of the BP measures to age and body mass index. The prevalence of discrepancies in ABP- versus CBP-defined hypertension status (white-coat hypertension and masked hypertension) was also examined. Average systolic/diastolic aABP (123.0/77.4±10.3/7.4 mm Hg) was significantly higher than the average of 9 CBP readings over 3 visits (116.0/75.4±11.6/7.7 mm Hg). aABP exceeded CBP by >10 mm Hg much more frequently than CBP exceeded aABP. The difference (aABP>CBP) was most pronounced in young adults and those with normal body mass index. The systolic difference progressively diminished, but did not disappear, at older ages and higher body mass indexes. The diastolic difference vanished around age 65 and reversed (CBP>aABP) for body mass index >32.5 kg/m2. Whereas 5.3% of participants were hypertensive by CBP, 19.2% were hypertensive by aABP; 15.7% of those with nonelevated CBP had masked hypertension. Contrary to a widely held belief, based primarily on cohort studies of patients with elevated CBP, ABP is not usually lower than CBP, at least not among healthy, employed individuals. Furthermore, a substantial proportion of otherwise healthy individuals with nonelevated CBP have masked hypertension. Demonstrated CBP-aABP gradients, if confirmed

  17. Observing Mean Annual Mediterranean Maquis Ecosystem Respiration

    Science.gov (United States)

    Marras, S.; Bellucco, V.; Mereu, S.; Sirca, C.; Spano, D.

    2014-12-01

    In semi-arid ecosystems, extremely low Soil Water Content (SWC) values may limit ecosystem respiration (Reco) to the point of hiding the typical exponential response of respiration to temperature. This work aims to understand and model the Reco of an evergreen Mediterranean maquis ecosystem and to estimate the contribution of soil CO2 efflux to Reco. The selected site is located in the center of the Mediterranean Sea, in Sardinia (Italy). Mean annual precipitation is 588 mm and mean annual temperature is 15.9 °C. Vegetation cover is heterogeneous: 70% is covered by shrubs and 30% is bare soil. Net Ecosystem Exchange (NEE) has been monitored with an Eddy Covariance (EC) tower since April 2004. Soil collars were placed underneath the dominant species (Juniperus phoenicea and Pistacia lentiscus) and over the bare soil. Soil CO2 efflux has been measured once a month since April 2012. Soil temperature and SWC were monitored continuously at 5 cm depth in 4 different positions close to the soil collars. Six years of EC measurements (2005-2010) and two years of soil CO2 efflux measurements (2012-2013) were analysed. Reco was estimated from the measured EC fluxes at night after filtering for adequate turbulence (u* > 1.5). Reco measurements were then binned into 1 °C intervals and median values were first fitted using the Locally Estimated Scatterplot Smoothing (LOESS) method to determine the dominant trend of the experimental curve. Reco shows an exponential increase with air and soil temperature as long as SWC measured at 0.2 m depth remains above 19% vol. Secondly, the coefficients of the selected Lloyd and Taylor (1994) model were estimated through the nonlinear least squares (nls) method: Rref (the ecosystem respiration rate at a reference temperature of 10 °C) was 1.65 μmol m-2 s-1 and E0 (the activation energy parameter that determines the temperature sensitivity) was 322.46. In addition, bare and drier soils show a reduced response of measured CO2 efflux to increasing
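The Lloyd and Taylor (1994) model fitted above, Reco = Rref · exp(E0 · (1/(Tref − T0) − 1/(T − T0))) with Tref = 283.15 K and T0 = 227.13 K, is linear in ln(Rref) and E0 after taking logarithms, so a simple least-squares sketch can recover the parameters. The study used nonlinear least squares; the log-linearisation below is an assumed simplification for illustration, and the synthetic data are generated from the study's reported coefficients:

```python
import math

TREF, T0 = 283.15, 227.13  # K: reference temperature (10 C) and Lloyd-Taylor constant

def lloyd_taylor(t_k, rref, e0):
    """Ecosystem respiration as a function of temperature (K)."""
    return rref * math.exp(e0 * (1.0 / (TREF - T0) - 1.0 / (t_k - T0)))

def fit_lloyd_taylor(t_k, reco):
    """Log-linearised least squares: ln(Reco) = ln(Rref) + E0 * x,
    with x = 1/(TREF - T0) - 1/(T - T0)."""
    xs = [1.0 / (TREF - T0) - 1.0 / (t - T0) for t in t_k]
    ys = [math.log(r) for r in reco]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    e0 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    rref = math.exp(my - e0 * mx)
    return rref, e0

# synthetic binned nighttime Reco built from the reported Rref = 1.65, E0 = 322.46
temps = [278.15, 283.15, 288.15, 293.15, 298.15]
reco = [lloyd_taylor(t, 1.65, 322.46) for t in temps]
rref, e0 = fit_lloyd_taylor(temps, reco)
print(round(rref, 2), round(e0, 1))  # recovers ~1.65 and ~322.5
```

With real, noisy binned medians a nonlinear fit (as used in the study) weights the residuals on the original rather than the log scale, which generally matters once measurement error is non-negligible.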

  18. MRI-based flow measurements in the main pulmonary artery to detect pulmonary arterial hypertension in patients with cystic fibrosis; MRT-basierte Flussmessungen im Truncus pulmonalis zur Detektion einer pulmonal-arteriellen Hypertonie in Patienten mit zystischer Fibrose

    Energy Technology Data Exchange (ETDEWEB)

    Wolf, T.; Anjorin, A.; Abolmaali, N. [TU Dresden (Germany). OncoRay, Biologisches und Molekulares Imaging; Posselt, H. [Frankfurt Univ. (Germany). Klinik fuer Paediatrie I, Muskoviszidoseambulanz; Smaczny, C. [Frankfurt Univ. (Germany). Medizinische Klinik I, Pneumologie und Allergologie; Vogl, T.J. [Frankfurt Univ. (Germany). Inst. fuer Diagnostische und Interventionelle Radiologie

    2009-02-15

    Development of pulmonary arterial hypertension (PH) is a common problem in the course of disease in patients suffering from cystic fibrosis (CF). This study was performed to evaluate MRI-based flow measurements (MR-venc; velocity encoding) for detecting signs of an evolving PH in patients suffering from CF. 48 patients (median age: 16 years, range: 10 - 40 years, 25 female) suffering from CF of different severity (mean FEV1: 74% ± 23, mean Shwachman score: 63 ± 10) were examined using MRI-based flow measurements of the main pulmonary artery (MPA). Phase-contrast FLASH sequences (TR: 9.6 ms, TE: 2.5 ms, bandwidth: 1395 Hertz/pixel) were utilized. Results were compared to an age- and sex-matched group of 48 healthy subjects. Analyzed flow data were: heart frequency (HF), cardiac output (HZV), acceleration time (AT), proportional acceleration time related to heart rate (ATr), mean systolic blood velocity (MFG), peak velocity (Peak), maximum flow (Fluss-max), mean flow (Fluss-mitt) and distensibility (Dist). The comparison of means revealed significant differences only for MFG, Fluss-max and Dist, but overlap was marked. However, using a scatter-plot of AT versus MFG, it was possible to identify five CF patients demonstrating definite signs of PH: AT = 81 ms ± 14, MFG = 46 ± 11 cm/s, Dist = 41% ± 7. These CF patients were the most severely affected in the investigated group; two of them were listed for combined heart and lung transplantation. The comparison of this subgroup and the remaining CF patients revealed a highly significant difference for the AT (p = 0.000001) without overlap. Screening of CF patients for the development of PH using MR-venc of the MPA is not possible. In later stages of disease, the quantification of AT, MFG and Dist in the MPA may be useful for the detection, follow-up and control of therapy of PH. MR-venc of the MPA completes the MRI-based follow-up of lung parenchyma damage in patients suffering from CF

  19. Association between progression-free survival and health-related quality of life in oncology: a systematic review protocol.

    Science.gov (United States)

    Kovic, Bruno; Guyatt, Gordon; Brundage, Michael; Thabane, Lehana; Bhatnagar, Neera; Xie, Feng

    2016-09-02

    There is an increasing number of new oncology drugs being studied, approved and put into clinical practice based on improvement in progression-free survival, when no overall survival benefits exist. In oncology, the association between progression-free survival and health-related quality of life is currently unknown, despite its importance for patients with cancer, and the unverified assumption that longer progression-free survival indicates improved health-related quality of life. Thus far, only 1 study has investigated this association, providing insufficient evidence and inconclusive results. The objective of this study protocol is to provide increased transparency in supporting a systematic summary of the evidence bearing on this association in oncology. Using the OVID platform in MEDLINE, Embase and Cochrane databases, we will conduct a systematic review of randomised controlled human trials addressing oncology issues published starting in 2000. A team of reviewers will, in pairs, independently screen and abstract data using standardised, pilot-tested forms. We will employ numerical integration to calculate mean incremental area under the curve between treatment groups in studies for health-related quality of life, along with total related error estimates, and a 95% CI around incremental area. To describe the progression-free survival to health-related quality of life association, we will construct a scatterplot for incremental health-related quality of life versus incremental progression-free survival. To estimate the association, we will use a weighted simple regression approach, comparing mean incremental health-related quality of life with either median incremental progression-free survival time or the progression-free survival HR, in the absence of overall survival benefit. Identifying direction and magnitude of association between progression-free survival and health-related quality of life is critically important in interpreting results of oncology
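The protocol above computes the mean incremental area under the curve between treatment groups by numerical integration. A minimal sketch using the trapezoidal rule, with hypothetical HRQoL trajectories invented for illustration:

```python
def auc_trapezoid(times, scores):
    """Area under a HRQoL trajectory by the trapezoidal rule."""
    return sum((t2 - t1) * (s1 + s2) / 2.0
               for (t1, s1), (t2, s2) in zip(zip(times, scores),
                                             zip(times[1:], scores[1:])))

def incremental_auc(times, qol_treat, qol_control):
    """Incremental area under the curve between treatment and control arms,
    assessed at common time points."""
    return auc_trapezoid(times, qol_treat) - auc_trapezoid(times, qol_control)

months  = [0, 3, 6, 9, 12]
treat   = [70, 68, 66, 64, 60]   # hypothetical mean HRQoL scores (treatment arm)
control = [70, 65, 60, 55, 50]   # hypothetical mean HRQoL scores (control arm)
print(incremental_auc(months, treat, control))  # -> 69.0 score-months
```

Each trial's incremental AUC would then be paired with its incremental progression-free survival (median difference or hazard ratio) to build the scatterplot and weighted regression the protocol describes.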

  20. Uncertainty and Sensitivity Analysis Results Obtained in the 1996 Performance Assessment for the Waste Isolation Pilot Plant

    Energy Technology Data Exchange (ETDEWEB)

    Bean, J.E.; Berglund, J.W.; Davis, F.J.; Economy, K.; Garner, J.W.; Helton, J.C.; Johnson, J.D.; MacKinnon, R.J.; Miller, J.; O'Brien, D.G.; Ramsey, J.L.; Schreiber, J.D.; Shinta, A.; Smith, L.N.; Stockman, C.; Stoelzel, D.M.; Vaughn, P.

    1998-09-01

    The Waste Isolation Pilot Plant (WIPP) is located in southeastern New Mexico and is being developed by the U.S. Department of Energy (DOE) for the geologic (deep underground) disposal of transuranic (TRU) waste. A detailed performance assessment (PA) for the WIPP was carried out in 1996 and supports an application by the DOE to the U.S. Environmental Protection Agency (EPA) for the certification of the WIPP for the disposal of TRU waste. The 1996 WIPP PA uses a computational structure that maintains a separation between stochastic (i.e., aleatory) and subjective (i.e., epistemic) uncertainty, with stochastic uncertainty arising from the many possible disruptions that could occur over the 10,000 yr regulatory period that applies to the WIPP and subjective uncertainty arising from the imprecision with which many of the quantities required in the PA are known. Important parts of this structure are (1) the use of Latin hypercube sampling to incorporate the effects of subjective uncertainty, (2) the use of Monte Carlo (i.e., random) sampling to incorporate the effects of stochastic uncertainty, and (3) the efficient use of the necessarily limited number of mechanistic calculations that can be performed to support the analysis. The use of Latin hypercube sampling generates a mapping from imprecisely known analysis inputs to analysis outcomes of interest that provides both a display of the uncertainty in analysis outcomes (i.e., uncertainty analysis) and a basis for investigating the effects of individual inputs on these outcomes (i.e., sensitivity analysis). The sensitivity analysis procedures used in the PA include examination of scatterplots, stepwise regression analysis, and partial correlation analysis. Uncertainty and sensitivity analysis results obtained as part of the 1996 WIPP PA are presented and discussed. Specific topics considered include two phase flow in the vicinity of the repository, radionuclide release from the repository, fluid flow and radionuclide
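The Latin hypercube sampling plus scatterplot/correlation workflow described above can be sketched compactly. This is a generic illustration (a two-input toy model and Spearman rank correlation as the sensitivity measure), not the WIPP PA codes:

```python
import random

def latin_hypercube(n, dims, rng):
    """One LHS design on [0,1)^dims: each variable's range is split into n
    strata, each stratum is used exactly once, and strata are paired by
    independent random permutations."""
    cols = []
    for _ in range(dims):
        perm = list(range(n))
        rng.shuffle(perm)
        cols.append([(p + rng.random()) / n for p in perm])
    return list(zip(*cols))

def rank(v):
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(x, y):
    """Spearman rank correlation (no ties assumed, as inputs are continuous)."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    m = (n - 1) / 2.0
    num = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    den = sum((a - m) ** 2 for a in rx)
    return num / den

rng = random.Random(42)
sample = latin_hypercube(100, 2, rng)
x1 = [p[0] for p in sample]
x2 = [p[1] for p in sample]
y = [10 * a + 0.5 * b for a, b in zip(x1, x2)]   # toy model: x1 dominates the output
print(round(spearman(x1, y), 2), round(spearman(x2, y), 2))
```

Plotting y against each input (the scatterplot examination step) and ranking inputs by |correlation| identifies x1 as the dominant driver, the same logic the PA applies to its mechanistic outputs.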

  1. Alteration in the liver metabolome of rats with metabolic syndrome after treatment with hydroxytyrosol: a mass spectrometry and nuclear magnetic resonance-based metabolomics study.

    Science.gov (United States)

    Dagla, Ioanna; Benaki, Dimitra; Baira, Eirini; Lemonakis, Nikolaos; Poudyal, Hemant; Brown, Lindsay; Tsarbopoulos, Anthony; Skaltsounis, Alexios-Leandros; Mikros, Emmanouel; Gikas, Evagelos

    2018-02-01

    Metabolic syndrome (MetS) represents a group of abnormalities that increases the risk of cardiovascular disease, diabetes and stroke. The Mediterranean diet appears to be an important dietary pattern that reduces the incidence of MetS. Hydroxytyrosol (HT), a simple phenol found in olive oil, has received increased attention for its antioxidant activity. Recently, the European Food Safety Authority (EFSA) claimed that dietary consumption of HT exhibits a protective role against cardiovascular disease. In this study, an experimental protocol was set up, including isolated HT administration in a diet-induced model of MetS in young Wistar rats, in order to determine whether HT has a protective effect against MetS. Rats were randomly divided into two groups fed either a high-carbohydrate high-fat diet (H; the MetS-inducing diet) or a high-carbohydrate high-fat diet plus HT (HHT). HT (20 mg/kg/day, oral gavage, water vehicle) was administered for 8 weeks on the basal diet. Previous pharmacological evaluation of HT showed that hepatic steatosis and the infiltration of inflammatory cells into the liver were reduced. These findings indicate that HT shows bioactivity against metabolic syndrome. Therefore, a metabolomics evaluation of liver extracts could indicate the putative biochemical mechanisms of HT activity. Thus, extracts of liver tissue were analyzed using Ultra Performance Liquid Chromatography - High Resolution Mass Spectrometry (UPLC-HRMS, Orbitrap Discovery) and Nuclear Magnetic Resonance (NMR) spectroscopy (Bruker Avance III 600 MHz). Multivariate analysis was performed in order to gain insight into the metabolic effects of HT administration on the liver metabolome. Normalization employing multiple internal standards and the Quality Control-based Robust LOESS (LOcally Estimated Scatterplot Smoothing) Signal Correction algorithm (QC-RLSC) was included in the processing pipeline to enhance the reliability of the metabolomic analysis by reducing unwanted variation. 
Experimentally, HHT rats were
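
    The QC-RLSC idea mentioned above can be sketched as: fit a smooth drift curve through the repeated quality-control (QC) injections for one metabolite, then divide every injection's intensity by the fitted drift. The sketch below uses a simplified, non-robust local-linear (tricube-weighted) smoother and invented run orders and intensities; the published algorithm adds iterative robust reweighting.

    ```python
    # Simplified QC-based LOESS signal correction for one metabolite feature.
    # qc_order / qc_intensity are invented QC injections showing upward drift.

    def loess_fit(xs, ys, x0, frac=0.67):
        """Tricube-weighted local linear fit, evaluated at x0."""
        k = max(2, int(frac * len(xs)))
        d = sorted(abs(x - x0) for x in xs)[k - 1] or 1.0   # span radius
        w = [(1 - min(abs(x - x0) / d, 1.0) ** 3) ** 3 for x in xs]
        sw = sum(w)
        sx = sum(wi * x for wi, x in zip(w, xs))
        sy = sum(wi * y for wi, y in zip(w, ys))
        sxx = sum(wi * x * x for wi, x in zip(w, xs))
        sxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
        denom = sw * sxx - sx * sx
        if abs(denom) < 1e-12:
            return sy / sw                                   # fall back to mean
        b = (sw * sxy - sx * sy) / denom
        return (sy - b * sx) / sw + b * x0

    qc_order = [1, 5, 9, 13, 17]                 # run order of QC injections
    qc_intensity = [100.0, 104.0, 109.0, 113.0, 118.0]
    median_qc = sorted(qc_intensity)[len(qc_intensity) // 2]

    def correct(order, intensity):
        """Divide out the fitted drift, rescaling to the median QC level."""
        return intensity * median_qc / loess_fit(qc_order, qc_intensity, order)
    ```

    After correction the QC injections themselves should sit on a flat line, which is the usual visual check that the drift model is adequate.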

  2. Comparison of spatiotemporal prediction models of daily exposure of individuals to ambient nitrogen dioxide and ozone in Montreal, Canada.

    Science.gov (United States)

    Buteau, Stephane; Hatzopoulou, Marianne; Crouse, Dan L; Smargiassi, Audrey; Burnett, Richard T; Logan, Travis; Cavellin, Laure Deville; Goldberg, Mark S

    2017-07-01

    In previous studies investigating the short-term health effects of ambient air pollution, the exposure metric that is often used is the daily average across monitors, thus assuming that all individuals have the same daily exposure. Studies that incorporate space-time exposures of individuals are essential to further our understanding of the short-term health effects of ambient air pollution. As part of a longitudinal cohort study of the acute effects of air pollution that incorporated subject-specific information and medical histories of subjects throughout the follow-up, the purpose of this study was to develop and compare different prediction models using data from fixed-site monitors and other monitoring campaigns to estimate daily, spatially-resolved concentrations of ozone (O3) and nitrogen dioxide (NO2) at participants' residences in Montreal, 1991-2002. We used the following methods to predict spatially-resolved daily concentrations of O3 and NO2 for each geographic region in Montreal (defined by three-character postal code areas): (1) assigning concentrations from the nearest monitor; (2) spatial interpolation using inverse-distance weighting; (3) back-extrapolation from a land-use regression model from a dense monitoring survey; and (4) a combination of a land-use and Bayesian maximum entropy model. We used a variety of indices of agreement to compare estimates of exposure assigned from the different methods, notably scatterplots of pairwise predictions, distributions of differences, and computation of the absolute-agreement intraclass correlation (ICC). For each pairwise prediction, we also produced maps of the ICCs by these regions indicating the spatial variability in the degree of agreement. We found some substantial differences in agreement across pairs of methods in daily mean predicted concentrations of O3 and NO2. On a given day and postal code area the difference in the concentration assigned could be as high as 131 ppb for O3 and 108 ppb
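
    Method (2) in the list above, inverse-distance weighting, has a compact generic form. The monitor coordinates, concentrations, and power parameter below are invented for illustration; they are not the study's actual network.

    ```python
    # Inverse-distance-weighted (IDW) interpolation of daily monitor
    # concentrations to a receptor location (e.g., a postal code centroid).
    import math

    def idw(monitors, target, power=2.0):
        """monitors: list of ((x, y), value); target: (x, y); returns estimate."""
        num = den = 0.0
        for (x, y), value in monitors:
            d = math.hypot(x - target[0], y - target[1])
            if d == 0.0:                  # receptor coincides with a monitor
                return value
            w = d ** -power               # closer monitors get more weight
            num += w * value
            den += w
        return num / den

    # Three hypothetical NO2 monitors (km coordinates, ppb) on one day:
    daily_no2 = [((0.0, 0.0), 20.0), ((10.0, 0.0), 30.0), ((0.0, 10.0), 25.0)]
    estimate = idw(daily_no2, (2.0, 2.0))
    ```

    The estimate is always bounded by the monitor values and pulled toward the nearest monitor, which is why IDW and nearest-monitor assignment tend to agree most where the network is dense.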

  3. Correlating tephras and cryptotephras using glass compositional analyses and numerical and statistical methods: Review and evaluation

    Science.gov (United States)

    Lowe, David J.; Pearce, Nicholas J. G.; Jorgensen, Murray A.; Kuehn, Stephen C.; Tryon, Christian A.; Hayward, Chris L.

    2017-11-01

    . Adding stratigraphic, chronological, spatial, or palaeoenvironmental data (i.e. multiple criteria) is usually necessary and allows for more robust correlations to be made. A two-stage approach is useful, the first focussed on differences in the mean composition of samples, or their range, which can be visualised graphically via scatterplot matrices or bivariate plots coupled with the use of statistical tools such as distance measures, similarity coefficients, hierarchical cluster analysis (informed by distance measures or similarity or cophenetic coefficients), and principal components analysis (PCA). Some statistical methods (cluster analysis, discriminant analysis) are referred to as 'machine learning' in the computing literature. The second stage examines sample variance and the degree of compositional similarity so that sample equivalence or otherwise can be established on a statistical basis. This stage may involve discriminant function analysis (DFA), support vector machines (SVMs), canonical variates analysis (CVA), and ANOVA or MANOVA (or its two-sample special case, the Hotelling two-sample T2 test). Randomization tests can be used where distributional assumptions such as multivariate normality underlying parametric tests are doubtful. Compositional data may be transformed and scaled before being subjected to multivariate statistical procedures including calculation of distance matrices, hierarchical cluster analysis, and PCA. Such transformations may make the assumption of multivariate normality more appropriate. A sequential procedure using Mahalanobis distance and the Hotelling two-sample T2 test is illustrated using glass major element data from trachytic to phonolitic Kenyan tephras. 
All these methods require a broad range of high-quality compositional data which can be used to compare 'unknowns' with reference (training) sets that are sufficiently complete to account for all possible correlatives, including tephras with heterogeneous glasses that contain
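
    The distance measure named in the second-stage procedure above can be sketched for the simplest bivariate case. The code computes a squared Mahalanobis distance between the mean compositions of two samples using a pooled 2x2 covariance; the oxide values are invented, and the full procedure in the review also applies a Hotelling two-sample T2 test to that distance.

    ```python
    # Squared Mahalanobis distance between two tephra samples' mean glass
    # compositions (two oxides only), with a pooled covariance matrix.

    def mean(v):
        return sum(v) / len(v)

    def pooled_cov(a, b):
        """Pooled 2x2 covariance (sxx, syy, sxy) of two samples of (x, y) pairs."""
        def ss(s):
            mx, my = mean([p[0] for p in s]), mean([p[1] for p in s])
            return (sum((p[0] - mx) ** 2 for p in s),
                    sum((p[1] - my) ** 2 for p in s),
                    sum((p[0] - mx) * (p[1] - my) for p in s))
        axx, ayy, axy = ss(a)
        bxx, byy, bxy = ss(b)
        n = len(a) + len(b) - 2
        return (axx + bxx) / n, (ayy + byy) / n, (axy + bxy) / n

    def mahalanobis_sq(a, b):
        sxx, syy, sxy = pooled_cov(a, b)
        det = sxx * syy - sxy * sxy
        dx = mean([p[0] for p in a]) - mean([p[0] for p in b])
        dy = mean([p[1] for p in a]) - mean([p[1] for p in b])
        # d' S^-1 d using the closed-form inverse of a 2x2 matrix
        return (syy * dx * dx - 2 * sxy * dx * dy + sxx * dy * dy) / det

    # Invented SiO2 and K2O (wt%) shard analyses for two correlative candidates:
    t1 = [(62.1, 4.9), (62.4, 5.0), (61.9, 4.8), (62.2, 5.1)]
    t2 = [(62.3, 5.0), (62.0, 4.9), (62.5, 5.1), (62.1, 4.8)]
    ```

    Unlike Euclidean distance, this metric discounts separation along directions in which the glass compositions are naturally variable or correlated.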

  4. HJ-Biplot como herramienta de inspección de matrices de datos bibliométricos

    Directory of Open Access Journals (Sweden)

    Díaz-Faes, Adrián A.

    2013-03-01

    The aim of this paper is to demonstrate the usefulness of the HJ-Biplot in bibliometric studies. It is a simple and intuitive display, similar to a scatterplot, but capturing the multivariate covariance structures between bibliometric indicators. Its interpretation does not require specialized statistical knowledge, but merely to know how to interpret the length of a vector, the angle between two vectors and the distance between two points. With this aim, an analysis has been performed of the scientific output of CSIC's own centres as well as of joint centres during the period 2006-2009, in relation to a series of indicators based on impact and collaboration. Biplot methods are graphical representations of multivariate data. Using the HJ-Biplot it is possible to interpret simultaneously the position of the centres, represented by dots; the indicators, represented by vectors; and the relationships between them. The position of the centres in the context of their area as well as within the overall CSIC is analysed and those centres with a unique behaviour are identified. We conclude that the Humanities and Social Sciences, and Food Science and Technology are the areas with a more homogeneous pattern in the performance of their centres, while Physics and Agriculture are more heterogeneous.

    The aim of this study is to demonstrate the usefulness of the HJ-Biplot in bibliometric studies. The HJ-Biplot is an intuitive and simple representation, similar to a scatterplot, but one that captures the multivariate covariance structures among bibliometric indicators. Its interpretation does not require specialized statistical knowledge; it is enough to know how to interpret the length of a vector, the angle between two vectors, and the distance between two points. To this end, the scientific activity of CSIC's own and joint centres during the period 2006-2009 is analysed by means of a series of indicators of
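
    The construction behind the display described above can be sketched from the SVD of the column-centred data matrix X = U S Vt: in an HJ-Biplot both row markers (US) and column markers (VS) are plotted in the same plane at optimal quality of representation. The 4x3 matrix below is invented, not CSIC output data.

    ```python
    # HJ-Biplot markers from the SVD of a centred centres-by-indicators matrix.
    import numpy as np

    X = np.array([[10.0, 2.0, 0.5],
                  [12.0, 3.0, 0.7],
                  [ 7.0, 1.0, 0.2],
                  [ 9.0, 2.5, 0.6]])
    Xc = X - X.mean(axis=0)              # centre each indicator (column)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    rows = U[:, :2] * s[:2]              # centre (row) markers, first plane
    cols = Vt.T[:, :2] * s[:2]           # indicator (column) markers
    ```

    Relationships are then read geometrically: the angle between two column vectors reflects the correlation between indicators, and projecting a row point onto a column vector orders the centres on that indicator.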

  5. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    Science.gov (United States)

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the characteristics of robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
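
    The decision rule described above can be sketched generically: build a normal Q-Q plot and a pointwise acceptance band around the theoretical line, and accept Gaussianity only when every empirical point falls inside the band. The band below uses the standard order-statistic standard-error approximation, which is an assumption on my part, not GTest's exact construction.

    ```python
    # Q-Q-plot normality check with a pointwise acceptance band.
    import math
    from statistics import NormalDist, mean, stdev

    def qq_band_test(sample, alpha=0.05):
        """Accept Gaussianity iff every Q-Q point lies inside the band."""
        n = len(sample)
        nd = NormalDist()
        zc = nd.inv_cdf(1 - alpha / 2)
        mu, sd = mean(sample), stdev(sample)
        for i, x in enumerate(sorted(sample), start=1):
            p = (i - 0.5) / n                  # plotting position
            q = nd.inv_cdf(p)                  # theoretical standard quantile
            # Asymptotic standard error of the i-th order statistic:
            se = sd * math.sqrt(p * (1 - p) / n) / nd.pdf(q)
            if not (mu + sd * q - zc * se <= x <= mu + sd * q + zc * se):
                return False                   # a point escapes the band
        return True

    # Deterministic examples: an exactly-normal grid and a lognormal (skewed) one.
    normalish = [NormalDist(5, 2).inv_cdf((i - 0.5) / 40) for i in range(1, 41)]
    skewed = [math.exp(NormalDist().inv_cdf((i - 0.5) / 40)) for i in range(1, 41)]
    ```

    A markedly skewed sample leaves the band in the tails, where the mismatch between empirical and theoretical quantiles is largest.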

  6. Application of the data quality diagnostic procedure in a wooden packaging manufacturing company

    Directory of Open Access Journals (Sweden)

    Darian Pérez Rodríguez

    2010-11-01

    pesos because of poor data quality. Consistency is the least affected dimension; overall, the quality level is 78% and data security is fair. For the sealer and clear products, approximately half of the errors are serious or moderately serious. The erroneous values are produced in the economic area; the causes are grouped into data entry, misinterpretation of requests and poor database design, and solutions such as a double data-entry system were proposed.

    The most frequently used techniques and analysis methods were: surveys, interviews, bibliographic consultation, document review, the process approach, boxplots, scatterplots, and Pareto and Ishikawa diagrams.


  7. Accounting for the relationship between per diem cost and LOS when estimating hospitalization costs

    Directory of Open Access Journals (Sweden)

    Ishak K

    2012-12-01

    Background: Hospitalization costs in clinical trials are typically derived by multiplying the length of stay (LOS) by an average per-diem (PD) cost from external sources. This assumes that PD costs are independent of LOS. Resource utilization in early days of the stay is usually more intense, however, and thus, the PD cost for a short hospitalization may be higher than for longer stays. The shape of this relationship is unlikely to be linear, as PD costs would be expected to gradually plateau. This paper describes how to model the relationship between PD cost and LOS using flexible statistical modelling techniques. Methods: An example based on a clinical study of clevidipine for the treatment of peri-operative hypertension during hospitalizations for cardiac surgery is used to illustrate how inferences about cost-savings associated with good blood pressure (BP) control during the stay can be affected by the approach used to derive hospitalization costs. Data on the cost and LOS of hospitalizations for coronary artery bypass grafting (CABG) from the Massachusetts Acute Hospital Case Mix Database (the MA Case Mix Database) were analyzed to link LOS to PD cost, factoring in complications that may have occurred during the hospitalization or post-discharge. The shape of the relationship between LOS and PD costs in the MA Case Mix was explored graphically in a regression framework. A series of statistical models including those based on simple logarithmic transformation of LOS to more flexible models using LOcally wEighted Scatterplot Smoothing (LOESS) techniques were considered. A final model was selected, using simplicity and parsimony as guiding principles in addition to traditional fit statistics (like Akaike’s Information Criterion, or AIC). This mapping was applied in ECLIPSE to predict an LOS-specific PD cost, and then a total cost of hospitalization. These were then compared for patients who had good vs. poor peri-operative blood

  8. Accounting for the relationship between per diem cost and LOS when estimating hospitalization costs.

    Science.gov (United States)

    Ishak, K Jack; Stolar, Marilyn; Hu, Ming-yi; Alvarez, Piedad; Wang, Yamei; Getsios, Denis; Williams, Gregory C

    2012-12-01

    Hospitalization costs in clinical trials are typically derived by multiplying the length of stay (LOS) by an average per-diem (PD) cost from external sources. This assumes that PD costs are independent of LOS. Resource utilization in early days of the stay is usually more intense, however, and thus, the PD cost for a short hospitalization may be higher than for longer stays. The shape of this relationship is unlikely to be linear, as PD costs would be expected to gradually plateau. This paper describes how to model the relationship between PD cost and LOS using flexible statistical modelling techniques. An example based on a clinical study of clevidipine for the treatment of peri-operative hypertension during hospitalizations for cardiac surgery is used to illustrate how inferences about cost-savings associated with good blood pressure (BP) control during the stay can be affected by the approach used to derive hospitalization costs. Data on the cost and LOS of hospitalizations for coronary artery bypass grafting (CABG) from the Massachusetts Acute Hospital Case Mix Database (the MA Case Mix Database) were analyzed to link LOS to PD cost, factoring in complications that may have occurred during the hospitalization or post-discharge. The shape of the relationship between LOS and PD costs in the MA Case Mix was explored graphically in a regression framework. A series of statistical models including those based on simple logarithmic transformation of LOS to more flexible models using LOcally wEighted Scatterplot Smoothing (LOESS) techniques were considered. A final model was selected, using simplicity and parsimony as guiding principles in addition to traditional fit statistics (like Akaike's Information Criterion, or AIC). This mapping was applied in ECLIPSE to predict an LOS-specific PD cost, and then a total cost of hospitalization. These were then compared for patients who had good vs. poor peri-operative blood-pressure control. The MA Case Mix dataset included data
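
    The simplest model family the paper considers, PD cost as a linear function of log(LOS), can be sketched with an ordinary least-squares fit. The cost figures below are invented; the study itself fit richer LOESS models to the MA Case Mix data.

    ```python
    # Fit PD cost = a + b * log(LOS) by OLS, then map LOS to a total cost.
    import math

    # (LOS in days, observed mean PD cost in $): short stays cost more per day.
    data = [(1, 5200.0), (2, 4100.0), (4, 3300.0), (8, 2700.0), (16, 2200.0)]

    xs = [math.log(los) for los, _ in data]
    ys = [pd for _, pd in data]
    n = len(data)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))        # slope: negative, PD declines
    a = my - b * mx                               # intercept: PD cost at LOS = 1

    def total_cost(los):
        """LOS-specific PD cost times LOS, as in the paper's final mapping."""
        return (a + b * math.log(los)) * los
    ```

    The point of the mapping is visible in the shape: total cost still rises with LOS, but each additional day adds less than a flat per-diem would assume.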

  9. Regional Distribution of Metals and C and N Stable Isotopes in the Epiphytic Ball Moss (Tillandsia Recurvata) at the Mezquital Valley, Hidalgo State

    Science.gov (United States)

    Zambrano-Garcia, A.; López-Veneroni, D.; Rojas, A.; Torres, A.; Sosa, G.

    2007-05-01

    the local oil refinery and the oil-fueled power plant. Two distinct Ni:V scatterplot trends suggest that there are two main petrogenic emission sources in the region. Calcium and, to some extent, Mg were higher near the mining areas and a calcium carbonate factory. Lead had a diffuse distribution, probably related to former gasoline vehicle exhaust emissions, rather than to current emissions. Antimony was more abundant at sites far from agriculture and industrial areas, which suggests a natural origin (rocks or soils). The spatial distribution of stable isotopes also showed distinct patterns near the industrial sources, with relatively 13C-depleted and 15N-enriched values near the oil refinery and the electrical power plant. Although it is not yet possible to provide quantitative estimates of emission contributions per source type, biomonitoring with T. recurvata provided for the first time a clear picture of the relative deposition patterns of several airborne metals in the Mezquital Valley.

  10. Estimated Probability of a Cervical Spine Injury During an ISS Mission

    Science.gov (United States)

    Brooker, John E.; Weaver, Aaron S.; Myers, Jerry G.

    2013-01-01

    Introduction: The Integrated Medical Model (IMM) utilizes historical data, cohort data, and external simulations as input factors to provide estimates of crew health, resource utilization and mission outcomes. The Cervical Spine Injury Module (CSIM) is an external simulation designed to provide the IMM with parameter estimates for 1) a probability distribution function (PDF) of the incidence rate, 2) the mean incidence rate, and 3) the standard deviation associated with the mean resulting from injury/trauma of the neck. Methods: An injury mechanism based on an idealized low-velocity blunt impact to the superior posterior thorax of an ISS crewmember was used as the simulated mission environment. As a result of this impact, the cervical spine is inertially loaded from the mass of the head, producing an extension-flexion motion deforming the soft tissues of the neck. A multibody biomechanical model was developed to estimate the kinematic and dynamic response of the head-neck system from a prescribed acceleration profile. Logistic regression was performed on a dataset containing AIS1 soft tissue neck injuries from rear-end automobile collisions with published Neck Injury Criterion values, producing an injury transfer function (ITF). An injury event scenario (IES) was constructed such that crewmember 1, moving through a primary or standard translation path while transferring large-volume equipment, impacts stationary crewmember 2. The incidence rate for this IES was estimated from in-flight data and used to calculate the probability of occurrence. The uncertainties in the model input factors were estimated from representative datasets and expressed in terms of probability distributions. A Monte Carlo method utilizing simple random sampling was employed to propagate both aleatory and epistemic uncertainty. Scatterplots and partial correlation coefficients (PCC) were generated to determine input factor sensitivity. CSIM was developed in the SimMechanics/Simulink environment with a
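
    The PCC sensitivity step mentioned above has a standard residual-based form: regress both the input of interest and the output on the remaining inputs, then correlate the residuals. The toy model below (one dominant input, one weak input, uniform noise) is invented, not a CSIM quantity.

    ```python
    # Partial correlation coefficient (PCC) via residual correlation.
    import random

    rng = random.Random(1)
    x1 = [rng.random() for _ in range(200)]       # dominant input factor
    x2 = [rng.random() for _ in range(200)]       # weak input factor
    y = [3.0 * a + 0.2 * b + 0.5 * rng.random() for a, b in zip(x1, x2)]

    def corr(u, v):
        n = len(u)
        mu, mv = sum(u) / n, sum(v) / n
        su = sum((a - mu) ** 2 for a in u) ** 0.5
        sv = sum((a - mv) ** 2 for a in v) ** 0.5
        return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (su * sv)

    def residuals(v, z):
        """Residuals of v after a simple linear regression on z."""
        n = len(v)
        mv, mz = sum(v) / n, sum(z) / n
        b = (sum((a - mz) * (c - mv) for a, c in zip(z, v))
             / sum((a - mz) ** 2 for a in z))
        return [c - (mv + b * (a - mz)) for a, c in zip(z, v)]

    def pcc(x, y, control):
        return corr(residuals(x, control), residuals(y, control))

    pcc_x1 = pcc(x1, y, x2)    # large: x1 drives the response
    pcc_x2 = pcc(x2, y, x1)    # much smaller once x1 is controlled for
    ```

    Ranking inputs by |PCC| reproduces, numerically, what the scatterplots show visually: which sampled factors the output actually responds to.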

  11. Felyx : A Free Open Software Solution for the Analysis of Large Earth Observation Datasets

    Science.gov (United States)

    Piolle, Jean-Francois; Shutler, Jamie; Poulter, David; Guidetti, Veronica; Donlon, Craig

    2014-05-01

    miniProds using for instance a set of usual statistical operators (mean, median, rms, ...), fully extensible and applicable to any variable of a dataset. These metrics are stored in a fast search engine, queryable by humans and automated applications. * reporting or alerting, based on user-defined inference rules, through various media (emails, Twitter feeds, ...) and devices (phones, tablets). * analysing miniProds and metrics through a web interface allowing users to dig into this base of information and extract useful knowledge through multidimensional interactive display functions (time series, scatterplots, histograms, maps). The services provided by felyx will be generic, deployable at users' own premises and adaptable enough to integrate any kind of parameter. Users will be able to operate their own felyx instance at any location, on datasets and parameters of their own interest, and the various instances will be able to interact with each other, creating a web of felyx systems enabling aggregation and cross-comparison of miniProds and metrics from multiple sources. Initially, two instances will be operated simultaneously during a 6-month demonstration phase, at IFREMER - on sea surface temperature (for the GHRSST community) and ocean waves datasets - and PML - on ocean colour. We will present results from the felyx project, demonstrate how the GHRSST community can exploit felyx, and demonstrate how the wider community can make use of the GHRSST data within felyx.

  12. Estudo espacial da mortalidade por acidentes de motocicleta em Pernambuco Estudio espacial de la mortalidad de accidentes de motocicleta en Pernambuco, Noreste de Brasil Spatial study of mortality in motorcycle accidents in the State of Pernambuco, Northeastern Brazil

    Directory of Open Access Journals (Sweden)

    Paul Hindenburg Nobre de Vasconcelos Silva

    2011-04-01

    in the Mortality Information System and, as the denominator, the mid-period population. Spatial analysis techniques, smoothing of the rate by the local empirical Bayesian method, and the Moran scatterplot were used, applied to the state's digital cartographic base. RESULTS: The average mortality rate from motorcycle accidents in Pernambuco was 3.47/100,000 inhabitants. Of the 185 municipalities, 16 formed part of five identified clusters with mortality rates ranging from 5.66 to 11.66/100,000 inhabitants, considered critical areas. Three of these areas are located in the sertão development region and two in the agreste. CONCLUSIONS: The risk of dying in a motorcycle accident is higher in the cluster areas of regions outside the metropolitan axis, suggesting intervention measures that take into account the context of economic, social and cultural development. OBJECTIVE: To analyze the spatial distribution of mortality due to motorcycle accidents in the state of Pernambuco, Northeastern Brazil. METHODS: A population-based ecological study using data on mortality in motorcycle accidents from 01/01/2000 to 31/12/2005. The analysis units were the municipalities. For the spatial distribution analysis, an average mortality rate was calculated, using deaths from motorcycle accidents recorded in the Mortality Information System as the numerator, and as the denominator the population of the mid-period. Spatial analysis techniques, mortality smoothing coefficient estimate by the local empirical Bayesian method and Moran scatterplot, applied to the digital cartographic base of Pernambuco were used. RESULTS: The average mortality rate for motorcycle accidents in Pernambuco was 3.47 per 100 thousand inhabitants. Of the 185 municipalities, 16 were part of five clusters identified with average mortality rates ranging from 5.66 to 11.66 per 100 thousand inhabitants, and were
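
    The Moran scatterplot used above to flag critical areas can be sketched directly: each area's standardized rate is plotted against its spatial lag (the weighted mean of its neighbours' standardized rates), and points in the high-high quadrant mark clusters. The 5-area rate list and neighbour graph below are invented, not the 185 Pernambuco municipalities.

    ```python
    # Moran-scatterplot quadrant classification from rates and a neighbour graph.

    rates = [11.2, 9.8, 10.5, 2.1, 2.9]                    # deaths / 100k
    neighbours = {0: [1, 2], 1: [0, 2], 2: [0, 1], 3: [4], 4: [3]}

    m = sum(rates) / len(rates)
    sd = (sum((r - m) ** 2 for r in rates) / len(rates)) ** 0.5
    z = [(r - m) / sd for r in rates]                      # standardized rates

    def spatial_lag(i):
        """Row-standardized mean of neighbours' standardized rates."""
        nb = neighbours[i]
        return sum(z[j] for j in nb) / len(nb)

    quadrants = ["high-high" if z[i] > 0 and spatial_lag(i) > 0
                 else "low-low" if z[i] < 0 and spatial_lag(i) < 0
                 else "mixed"
                 for i in range(len(rates))]
    ```

    High-high areas (high rate, high-rate neighbours) are exactly the critical clusters the study maps; the slope of the lag-versus-rate scatter is Moran's I itself.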

  13. Trends in surface-water quality at selected National Stream Quality Accounting Network (NASQAN) stations, in Michigan

    Science.gov (United States)

    Syed, Atiq U.; Fogarty, Lisa R.

    2005-01-01

    -treatment processes, and effective regulations. Phosphorus data for most of the study stations could not be analyzed because of the data limitations for trend tests. The only station with a significant negative trend in total phosphorus concentration is the Clinton River at Mount Clemens. However, scatterplot analyses of phosphorus data indicate decreasing concentrations with time for most of the study stations. Positive trends in concentration of nitrogen compounds were detected at the Kalamazoo River near Saugatuck and Muskegon River near Bridgeton. Positive trends in both fecal coliform and total fecal coliform were detected at the Tahquamenon River near Paradise. Various point and nonpoint sources could produce such positive trends, but most commonly increases in concentrations of nitrogen compounds and fecal coliform bacteria are associated with agricultural practices and sewage-plant discharges. The constituent with the most numerous and geographically widespread significant trend is pH. The pH levels increased at six out of nine stations on all the major rivers in Michigan, with no negative trend at any station. The cause of pH increase is difficult to determine, as it could be related to a combination of anthropogenic activities and natural processes occurring simultaneously in the environment. Trends in concentration of major ions, such as calcium, sodium, magnesium, sulfate, fluoride, chloride, and potassium, were detected at eight out of nine stations. A negative trend was detected only in sulfate and fluoride concentrations; a positive trend was detected only in calcium concentration. The major ions with the most widespread significant trends are sodium and chloride; three positive and two negative trends were detected for sodium, and three negative and two positive trends were detected for chloride. The negative trends in chloride concentrations outnumbered the positive trends. 
This result indicates a slight improvement in surface-water quality because
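
    Trend tests of the kind referred to above are commonly based on the Mann-Kendall statistic in water-quality work; the report's exact test is not named in this excerpt, so the sketch below is a generic illustration with invented annual values.

    ```python
    # Mann-Kendall S statistic: count concordant minus discordant pairs in a
    # time series.  S > 0 suggests an upward trend, S < 0 a downward trend.
    def mann_kendall_s(series):
        s = 0
        for i in range(len(series) - 1):
            for j in range(i + 1, len(series)):
                diff = series[j] - series[i]
                s += (diff > 0) - (diff < 0)   # +1, 0, or -1 per pair
        return s

    rising_ph = [7.1, 7.2, 7.2, 7.4, 7.5, 7.6]     # invented annual medians
    falling_cl = [30.0, 28.0, 27.5, 26.0, 24.0]    # invented chloride values
    ```

    Because only the signs of pairwise differences enter S, the test is insensitive to outliers and to the units of measurement, which is why it suits heterogeneous monitoring records.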