WorldWideScience

Sample records for variational objective analysis

  1. Some new mathematical methods for variational objective analysis

    Science.gov (United States)

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  2. Analysis of Optical Variations of BL Lac Object AO 0235+164 Wang ...

    Indian Academy of Sciences (India)

    obtain statistically meaningful values for the cross-correlation time lags ... deviation, the fifth represents the largest variations, the sixth represents the fractional ..... 6. Conclusions. The multi-band optical data are collected on the object of AO 0235 + 164. The time lags among the B, V, R and I bands have been analysed.

  3. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction it the field of statistics which we will call functional object analysis. This subfields considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particularly emphasis on mixed......-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  4. Numerical Analysis Objects

    Science.gov (United States)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.

  5. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses as pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simply way, but several other ways that were tried did not succeed. The procedure ouptut has been tested against...

  6. Finite time exergy analysis and multi-objective ecological optimization of a regenerative Brayton cycle considering the impact of flow rate variations

    International Nuclear Information System (INIS)

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2015-01-01

    Highlights: • Defining a dimensionless parameter includes the finite-time and size concepts. • Inserting the concept of exergy of fluid streams into finite-time thermodynamics. • Defining, drawing and modifying of maximum ecological function curve. • Suggesting the appropriate performance zone, according to maximum ecological curve. - Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power and then ecological function maximization using finite-time thermodynamic concept and finite-size components. Multi-objective optimization is used for maximizing the ecological function. Optimizations are performed using genetic algorithm. In order to take into account the finite-time and finite-size concepts in current problem, a dimensionless mass-flow parameter is introduced deploying time variations. The variations of output power, total exergy destruction of the system, and decision variables for the optimum state (maximum ecological function state) are compared to the maximum power state using the dimensionless parameter. The modified ecological function in optimum state is obtained and plotted relating to the dimensionless mass-flow parameter. One can see that the modified ecological function study results in a better performance than that obtained with the maximum power state. Finally, the appropriate performance zone of the heat engine will be obtained

  7. Objective - oriented financial analysis introduction

    Directory of Open Access Journals (Sweden)

    Dessislava Kostova – Pickett

    2018-02-01

    Full Text Available The practice of financial analysis has been immeasurably strengthened in recent years thanks to the ongoing evolution of computerized approaches in the form of spreadsheets and computer-based financial models of different types. These devices not only relieved the analyst's computing task, but also opened up a wide range of analyzes and research into alternative sensitivity, which so far has not been possible. The main potential for object-oriented financial analysis consists in enormously expanding the analyst's capabilities through an online knowledge and information interface that has not yet been achieved through existing methods and software packages.

  8. Learning-based stochastic object models for characterizing anatomical variations

    Science.gov (United States)

    Dolly, Steven R.; Lou, Yang; Anastasio, Mark A.; Li, Hua

    2018-03-01

    It is widely known that the optimization of imaging systems based on objective, task-based measures of image quality via computer-simulation requires the use of a stochastic object model (SOM). However, the development of computationally tractable SOMs that can accurately model the statistical variations in human anatomy within a specified ensemble of patients remains a challenging task. Previously reported numerical anatomic models lack the ability to accurately model inter-patient and inter-organ variations in human anatomy among a broad patient population, mainly because they are established on image data corresponding to a few of patients and individual anatomic organs. This may introduce phantom-specific bias into computer-simulation studies, where the study result is heavily dependent on which phantom is used. In certain applications, however, databases of high-quality volumetric images and organ contours are available that can facilitate this SOM development. In this work, a novel and tractable methodology for learning a SOM and generating numerical phantoms from a set of volumetric training images is developed. The proposed methodology learns geometric attribute distributions (GAD) of human anatomic organs from a broad patient population, which characterize both centroid relationships between neighboring organs and anatomic shape similarity of individual organs among patients. By randomly sampling the learned centroid and shape GADs with the constraints of the respective principal attribute variations learned from the training data, an ensemble of stochastic objects can be created. The randomness in organ shape and position reflects the learned variability of human anatomy. To demonstrate the methodology, a SOM of an adult male pelvis is computed and examples of corresponding numerical phantoms are created.

  9. Cross-cultural variation of memory colors of familiar objects.

    Science.gov (United States)

    Smet, Kevin A G; Lin, Yandan; Nagy, Balázs V; Németh, Zoltan; Duque-Chica, Gloria L; Quintero, Jesús M; Chen, Hung-Shing; Luo, Ronnier M; Safi, Mahdi; Hanselaer, Peter

    2014-12-29

    The effect of cross-regional or cross-cultural differences on color appearance ratings and memory colors of familiar objects was investigated in seven different countries/regions - Belgium, Hungary, Brazil, Colombia, Taiwan, China and Iran. In each region the familiar objects were presented on a calibrated monitor in over 100 different colors to a test panel of observers that were asked to rate the similarity of the presented object color with respect to what they thought the object looks like in reality (memory color). For each object and region the mean observer ratings were modeled by a bivariate Gaussian function. A statistical analysis showed significant (p culture was found to be small. In fact, the differences between the region average observers and the global average observer were found to of the same magnitude or smaller than the typical within region inter-observer variability. Thus, although statistical differences in color appearance ratings and memory between regions were found, regional impact is not likely to be of practical importance.

  10. Variation in expert source selection according to different objectivity standards

    DEFF Research Database (Denmark)

    Albæk, Erik

    2011-01-01

    Several scholars have tried to clarify how journalists handle and implement the abstract objectivity norm in daily practice. Less research attention has been paid to how common abstract professional norms and values, in casu the objectivity norm, may systematically vary when interpreted and imple......Several scholars have tried to clarify how journalists handle and implement the abstract objectivity norm in daily practice. Less research attention has been paid to how common abstract professional norms and values, in casu the objectivity norm, may systematically vary when interpreted...

  11. Variation in Expert Source Selection According to Different Objectivity Standards

    Science.gov (United States)

    Albaek, Erik

    2011-01-01

    Several scholars have tried to clarify how journalists handle and implement the abstract objectivity norm in daily practice. Less research attention has been paid to how common abstract professional norms and values, "in casu" the objectivity norm, may systematically vary when interpreted and implemented in daily journalistic practice. Allgaier's…

  12. Size variation and flow experience of physical game support objects

    NARCIS (Netherlands)

    Feijs, L.M.G.; Peters, P.J.F.; Eggen, J.H.

    2004-01-01

    This paper is about designing and evaluating an innovative type of computer game. Game support objects are used to enrich the gaming experience [7]. The added objects are active but are simpler than real robots. In the study reported here they are four helper ghosts connected to a traditional Pacman

  13. Object-sensitive Type Analysis of PHP

    NARCIS (Netherlands)

    Van der Hoek, Henk Erik; Hage, J

    2015-01-01

    In this paper we develop an object-sensitive type analysis for PHP, based on an extension of the notion of monotone frameworks to deal with the dynamic aspects of PHP, and following the framework of Smaragdakis et al. for object-sensitive analysis. We consider a number of instantiations of the

  14. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Directory of Open Access Journals (Sweden)

    Anyela Camargo

    Full Text Available Computer-vision based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should be possible therefore to use such approaches to select robust genotypes. However, plants are morphologically complex and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment and the computer routines for image processing and data analysis have been implemented using open source software. Source code for data analysis is written using the R package. The equations to calculate image descriptors have been also provided.

  15. Object-oriented analysis and design

    CERN Document Server

    Deacon, John

    2005-01-01

    John Deacon’s in-depth, highly pragmatic approach to object-oriented analysis and design, demonstrates how to lay the foundations for developing the best possible software. Students will learn how to ensure that analysis and design remain focused and productive. By working through the book, they will gain a solid working knowledge of best practices in software development.

  16. Neutron activation analysis of limestone objects

    International Nuclear Information System (INIS)

    Meyers, P.; Van Zelst, L.

    1977-01-01

    The elemental composition of samples from limestone objects were determined by neutron activation analysis to investigate whether this technique can be used to distinguish between objects made of limestone from different sources. Samples weighing between 0.2-2 grams were obtained by drilling from a series of ancient Egyptian and medieval Spanish objects. Analysis was performed on aliquots varying in weight from 40-100 milligrams. The following elements were determined quantitatively: Na, K, Rb, Cs, Ba, Sc, La, Ce, Sm, Eu, Hf, Th, Ta, Cr, Mn, Fe, Co and Zn. The data on Egyptian limestones indicate that, because of the inhomogeneous nature of the stone, 0.2-2 gram samples may not be representative of an entire object. Nevertheless, multivariate statistical methods produced a clear distinction between objects originating from the Luxor area (ancient Thebes) and objects found north of Luxor. The Spanish limestone studied appeared to be more homogeneous. Samples from stylistically related objects have similar elemental compositions while relative large differences were observed between objects having no relationship other than the common provenance of medieval Spain. (orig.) [de

  17. Feedforward Object-Vision Models Only Tolerate Small Image Variations Compared to Human

    Directory of Open Access Journals (Sweden)

    Masoud eGhodrati

    2014-07-01

    Full Text Available Invariant object recognition is a remarkable ability of primates' visual system that its underlying mechanism has constantly been under intense investigations. Computational modelling is a valuable tool toward understanding the processes involved in invariant object recognition. Although recent computational models have shown outstanding performances on challenging image databases, they fail to perform well when images with more complex variations of the same object are applied to them. Studies have shown that making sparse representation of objects by extracting more informative visual features through a feedforward sweep can lead to higher recognition performances. Here, however, we show that when the complexity of image variations is high, even this approach results in poor performance compared to humans. To assess the performance of models and humans in invariant object recognition tasks, we built a parametrically controlled image database consisting of several object categories varied in different dimensions and levels, rendered from 3D planes. Comparing the performance of several object recognition models with human observers shows that only in low-level image variations the models perform similar to humans in categorization tasks. Furthermore, the results of our behavioral experiments demonstrate that, even under difficult experimental conditions (i.e. briefly presented masked stimuli with complex image variations, human observers performed outstandingly well, suggesting that the models are still far from resembling humans in invariant object recognition. Taken together, we suggest that learning sparse informative visual features, although desirable, is not a complete solution for future progresses in object-vision modelling. We show that this approach is not of significant help in solving the computational crux of object recognition (that is invariant object recognition when the identity-preserving image variations become more complex.

  18. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (msub(J)approx. 20) candidates is identified. (author)

  19. Fixed point theory, variational analysis, and optimization

    CERN Document Server

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    ""There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics.""-Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  20. Objective analysis of toolmarks in forensics

    Energy Technology Data Exchange (ETDEWEB)

    Grieve, Taylor N. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc. the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profile of two different marks are characterized and the marks’ cross-sections are run through a comparative statistical algorithm to acquire a value that is intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm’s application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge’s primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges are evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.

  1. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    NARCIS (Netherlands)

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  2. Humans and Deep Networks Largely Agree on Which Kinds of Variation Make Object Recognition Harder.

    Science.gov (United States)

    Kheradpisheh, Saeed R; Ghodrati, Masoud; Ganjtabesh, Mohammad; Masquelier, Timothée

    2016-01-01

    View-invariant object recognition is a challenging problem that has attracted much attention among the psychology, neuroscience, and computer vision communities. Humans are notoriously good at it, even if some variations are presumably more difficult to handle than others (e.g., 3D rotations). Humans are thought to solve the problem through hierarchical processing along the ventral stream, which progressively extracts more and more invariant visual features. This feed-forward architecture has inspired a new generation of bio-inspired computer vision systems called deep convolutional neural networks (DCNN), which are currently the best models for object recognition in natural images. Here, for the first time, we systematically compared human feed-forward vision and DCNNs at view-invariant object recognition task using the same set of images and controlling the kinds of transformation (position, scale, rotation in plane, and rotation in depth) as well as their magnitude, which we call "variation level." We used four object categories: car, ship, motorcycle, and animal. In total, 89 human subjects participated in 10 experiments in which they had to discriminate between two or four categories after rapid presentation with backward masking. We also tested two recent DCNNs (proposed respectively by Hinton's group and Zisserman's group) on the same tasks. We found that humans and DCNNs largely agreed on the relative difficulties of each kind of variation: rotation in depth is by far the hardest transformation to handle, followed by scale, then rotation in plane, and finally position (much easier). This suggests that DCNNs would be reasonable models of human feed-forward vision. In addition, our results show that the variation levels in rotation in depth and scale strongly modulate both humans' and DCNNs' recognition performances. We thus argue that these variations should be controlled in the image datasets used in vision research.

  3. Humans and deep networks largely agree on which kinds of variation make object recognition harder

    Directory of Open Access Journals (Sweden)

    Saeed Reza Kheradpisheh

    2016-08-01

    Full Text Available View-invariant object recognition is a challenging problem that has attracted much attention among the psychology, neuroscience, and computer vision communities. Humans are notoriously good at it, even if some variations are presumably more difficult to handle than others (e.g. 3D rotations. Humans are thought to solve the problem through hierarchical processing along the ventral stream, which progressively extracts more and more invariant visual features. This feed-forward architecture has inspired a new generation of bio-inspired computer vision systems called deep convolutional neural networks (DCNN, which are currently the best models for object recognition in natural images. Here, for the first time, we systematically compared human feed-forward vision and DCNNs at view-invariant object recognition task using the same set of images and controlling the kinds of transformation (position, scale, rotation in plane, and rotation in depth as well as their magnitude, which we call variation level. We used four object categories: car, ship, motorcycle, and animal. In total, 89 human subjects participated in 10 experiments in which they had to discriminate between two or four categories after rapid presentation with backward masking. We also tested two recent DCNNs (proposed respectively by Hinton's group and Zisserman's group on the same tasks. We found that humans and DCNNs largely agreed on the relative difficulties of each kind of variation: rotation in depth is by far the hardest transformation to handle, followed by scale, then rotation in plane, and finally position (much easier. This suggests that DCNNs would be reasonable models of human feed-forward vision. In addition, our results show that the variation levels in rotation in depth and scale strongly modulate both humans' and DCNNs' recognition performances. We thus argue that these variations should be controlled in the image datasets used in vision research.

  4. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student\\'s t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student\\'s t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  5. Salient Point Detection in Protrusion Parts of 3D Object Robust to Isometric Variations

    Science.gov (United States)

    Mirloo, Mahsa; Ebrahimnezhad, Hosein

    2018-03-01

    In this paper, a novel method is proposed to detect 3D object salient points robust to isometric variations and stable against scaling and noise. Salient points can be used as the representative points from object protrusion parts in order to improve the object matching and retrieval algorithms. The proposed algorithm is started by determining the first salient point of the model based on the average geodesic distance of several random points. Then, according to the previous salient point, a new point is added to this set of points in each iteration. By adding every salient point, decision function is updated. Hence, a condition is created for selecting the next point in which the iterative point is not extracted from the same protrusion part so that drawing out of a representative point from every protrusion part is guaranteed. This method is stable against model variations with isometric transformations, scaling, and noise with different levels of strength due to using a feature robust to isometric variations and considering the relation between the salient points. In addition, the number of points used in averaging process is decreased in this method, which leads to lower computational complexity in comparison with the other salient point detection algorithms.

  6. Variational method for objective analysis of scalar variable and its ...

    Indian Academy of Sciences (India)

    e-mail: sinha@tropmet.res.in. In this study real time data have been used to compare the standard and triangle method by ... The work presented in this paper is about a vari- ... But when the balance is needed ..... tred at 17:30h IST of 11 June within half a degree of ..... Ogura Y and Chen Y L 1977 A life history of an intense.

  7. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

    Since the human genome draft sequence was in public for the first time in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples for large-scale studies of human genome variations: 1) HapMap Data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) Data (940 individuals) (http://www.hagsc.org/hgdp/files.html), 3) 1000 genomes Data (2,504 individuals) http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/ If we can integrate all three data into a single volume of data, we should be able to conduct a more detailed analysis of human genome variations for a total number of 4,861 individuals (= 1,417+940+2,504 individuals). In fact, we successfully integrated these three data sets by use of information on the reference human genome sequence, and we conducted the big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that were not possible by an analysis of each of the three data sets. Here, we report the outcome of this kind of big data analyses and discuss evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.

  8. THE HUBBLE WIDE FIELD CAMERA 3 TEST OF SURFACES IN THE OUTER SOLAR SYSTEM: SPECTRAL VARIATION ON KUIPER BELT OBJECTS

    International Nuclear Information System (INIS)

    Fraser, Wesley C.; Brown, Michael E.; Glass, Florian

    2015-01-01

    Here, we present additional photometry of targets observed as part of the Hubble Wide Field Camera 3 (WFC3) Test of Surfaces in the Outer Solar System. Twelve targets were re-observed with the WFC3 in the optical and NIR wavebands designed to complement those used during the first visit. Additionally, all of the observations originally presented by Fraser and Brown were reanalyzed through the same updated photometry pipeline. A re-analysis of the optical and NIR color distribution reveals a bifurcated optical color distribution and only two identifiable spectral classes, each of which occupies a broad range of colors and has correlated optical and NIR colors, in agreement with our previous findings. We report the detection of significant spectral variations on five targets which cannot be attributed to photometry errors, cosmic rays, point-spread function or sensitivity variations, or other image artifacts capable of explaining the magnitude of the variation. The spectrally variable objects are found to have a broad range of dynamical classes and absolute magnitudes, exhibit a broad range of apparent magnitude variations, and are found in both compositional classes. The spectrally variable objects with sufficiently accurate colors for spectral classification maintain their membership, belonging to the same class at both epochs. 2005 TV189 exhibits a sufficiently broad difference in color at the two epochs that span the full range of colors of the neutral class. This strongly argues that the neutral class is one single class with a broad range of colors, rather than the combination of multiple overlapping classes

  9. Objectivity

    CERN Document Server

    Daston, Lorraine

    2010-01-01

    Objectivity has a history, and it is full of surprises. In Objectivity, Lorraine Daston and Peter Galison chart the emergence of objectivity in the mid-nineteenth-century sciences--and show how the concept differs from its alternatives, truth-to-nature and trained judgment. This is a story of lofty epistemic ideals fused with workaday practices in the making of scientific images. From the eighteenth through the early twenty-first centuries, the images that reveal the deepest commitments of the empirical sciences--from anatomy to crystallography--are those featured in scientific atlases, the compendia that teach practitioners what is worth looking at and how to look at it. Galison and Daston use atlas images to uncover a hidden history of scientific objectivity and its rivals. Whether an atlas maker idealizes an image to capture the essentials in the name of truth-to-nature or refuses to erase even the most incidental detail in the name of objectivity or highlights patterns in the name of trained judgment is a...

  10. SVAMP: Sequence variation analysis, maps and phylogeny

    KAUST Repository

    Naeem, Raeece

    2014-04-03

    Summary: SVAMP is a stand-alone desktop application to visualize genomic variants (in variant call format) in the context of geographical metadata. Users of SVAMP are able to generate phylogenetic trees and perform principal coordinate analysis in real time from variant call format (VCF) and associated metadata files. Allele frequency map, geographical map of isolates, Tajima\\'s D metric, single nucleotide polymorphism density, GC and variation density are also available for visualization in real time. We demonstrate the utility of SVAMP in tracking a methicillin-resistant Staphylococcus aureus outbreak from published next-generation sequencing data across 15 countries. We also demonstrate the scalability and accuracy of our software on 245 Plasmodium falciparum malaria isolates from three continents. Availability and implementation: The Qt/C++ software code, binaries, user manual and example datasets are available at http://cbrc.kaust.edu.sa/svamp. © The Author 2014.

  11. A strategic analysis of Business Objects' portal application

    OpenAIRE

    Kristinsson, Olafur Oskar

    2007-01-01

    Business Objects is the leading software firm producing business intelligence software. Business intelligence is a growing market. Small to medium businesses are increasingly looking at business intelligence. Business Objects' flagship product in the enterprise market is Business Objects XI and for medium-size companies it has Crystal Decisions. Portals are the front end for the two products. InfoView, Business Objects portal application, lacks a long-term strategy. This analysis evaluates...

  12. Electrical Resistance Tomography for Visualization of Moving Objects Using a Spatiotemporal Total Variation Regularization Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Chen

    2018-05-01

    Full Text Available Electrical resistance tomography (ERT has been considered as a data collection and image reconstruction method in many multi-phase flow application areas due to its advantages of high speed, low cost and being non-invasive. In order to improve the quality of the reconstructed images, the Total Variation algorithm attracts abundant attention due to its ability to solve large piecewise and discontinuous conductivity distributions. In industrial processing tomography (IPT, techniques such as ERT have been used to extract important flow measurement information. For a moving object inside a pipe, a velocity profile can be calculated from the cross correlation between signals generated from ERT sensors. Many previous studies have used two sets of 2D ERT measurements based on pixel-pixel cross correlation, which requires two ERT systems. In this paper, a method for carrying out flow velocity measurement using a single ERT system is proposed. A novel spatiotemporal total variation regularization approach is utilised to exploit sparsity both in space and time in 4D, and a voxel-voxel cross correlation method is adopted for measurement of flow profile. Result shows that the velocity profile can be calculated with a single ERT system and that the volume fraction and movement can be monitored using the proposed method. Both semi-dynamic experimental and static simulation studies verify the suitability of the proposed method. For in plane velocity profile, a 3D image based on temporal 2D images produces velocity profile with accuracy of less than 1% error and a 4D image for 3D velocity profiling shows an error of 4%.

  13. Objective high Resolution Analysis over Complex Terrain with VERA

    Science.gov (United States)

    Mayer, D.; Steinacker, R.; Steiner, A.

    2012-04-01

    VERA (Vienna Enhanced Resolution Analysis) is a model independent, high resolution objective analysis of meteorological fields over complex terrain. This system consists of a special developed quality control procedure and a combination of an interpolation and a downscaling technique. Whereas the so called VERA-QC is presented at this conference in the contribution titled "VERA-QC, an approved Data Quality Control based on Self-Consistency" by Andrea Steiner, this presentation will focus on the method and the characteristics of the VERA interpolation scheme which enables one to compute grid point values of a meteorological field based on irregularly distributed observations and topography related aprior knowledge. Over a complex topography meteorological fields are not smooth in general. The roughness which is induced by the topography can be explained physically. The knowledge about this behavior is used to define the so called Fingerprints (e.g. a thermal Fingerprint reproducing heating or cooling over mountainous terrain or a dynamical Fingerprint reproducing positive pressure perturbation on the windward side of a ridge) under idealized conditions. If the VERA algorithm recognizes patterns of one or more Fingerprints at a few observation points, the corresponding patterns are used to downscale the meteorological information in a greater surrounding. This technique allows to achieve an analysis with a resolution much higher than the one of the observational network. The interpolation of irregularly distributed stations to a regular grid (in space and time) is based on a variational principle applied to first and second order spatial and temporal derivatives. Mathematically, this can be formulated as a cost function that is equivalent to the penalty function of a thin plate smoothing spline. After the analysis field has been divided into the Fingerprint components and the unexplained part respectively, the requirement of a smooth distribution is applied to the

  14. Use of objective analysis to estimate winter temperature and ...

    Indian Academy of Sciences (India)

    In the complex terrain of Himalaya, nonavailability of snow and meteorological data of the remote locations ... Precipitation intensity; spatial interpolation; objective analysis. J. Earth Syst. ... This technique needs historical database and unable ...

  15. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  16. An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method

    Science.gov (United States)

    Tang, J.

    2012-01-01

    Multiple signal classification (MUSIC) algorithms are introduced to the estimation of the period of variation of BL Lac objects.The principle of MUSIC spectral analysis method and theoretical analysis of the resolution of frequency spectrum using analog signals are included. From a lot of literatures, we have collected a lot of effective observation data of BL Lac object S5 0716 + 714 in V, R, I bands from 1994 to 2008. The light variation periods of S5 0716 +714 are obtained by means of the MUSIC spectral analysis method and periodogram spectral analysis method. There exist two major periods: (3.33±0.08) years and (1.24±0.01) years for all bands. The estimation of the period of variation of the algorithm based on the MUSIC spectral analysis method is compared with that of the algorithm based on the periodogram spectral analysis method. It is a super-resolution algorithm with small data length, and could be used to detect the period of variation of weak signals.

  17. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.|info:eu-repo/dai/nl/224281216; Queiroz Feitosa, R.; van der Meer, F.D.|info:eu-repo/dai/nl/138940908; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  18. From Pixels to Geographic Objects in Remote Sensing Image Analysis

    NARCIS (Netherlands)

    Addink, E.A.; Van Coillie, Frieke M.B.; Jong, Steven M. de

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received

  19. Analysis on Precipitation Variation in Anyang and Nanyang in Recent 57 Years

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim was to analyze precipitation variation in Anyang and Nanyang in recent 57 years. [Method] Based on the data of annual precipitation in Anyang and Nanyang from 1953 to 2009, the changes of precipitation in Anyang and Nanyang were compared by means of mathematical statistics, regression analysis and wavelet analysis. [Result] In recent 57 years, annual precipitation in Anyang and Nanyang showed decrease trend, especially Anyang with obvious decrease trend; from seasonal variation, average ...

  20. The Effect of Geographic Units of Analysis on Measuring Geographic Variation in Medical Services Utilization

    Directory of Open Access Journals (Sweden)

    Agnus M. Kim

    2016-07-01

    Full Text Available Objectives: We aimed to evaluate the effect of geographic units of analysis on measuring geographic variation in medical services utilization. For this purpose, we compared geographic variations in the rates of eight major procedures in administrative units (districts and new areal units organized based on the actual health care use of the population in Korea. Methods: To compare geographic variation in geographic units of analysis, we calculated the age–sex standardized rates of eight major procedures (coronary artery bypass graft surgery, percutaneous transluminal coronary angioplasty, surgery after hip fracture, knee-replacement surgery, caesarean section, hysterectomy, computed tomography scan, and magnetic resonance imaging scan from the National Health Insurance database in Korea for the 2013 period. Using the coefficient of variation, the extremal quotient, and the systematic component of variation, we measured geographic variation for these eight procedures in districts and new areal units. Results: Compared with districts, new areal units showed a reduction in geographic variation. Extremal quotients and inter-decile ratios for the eight procedures were lower in new areal units. While the coefficient of variation was lower for most procedures in new areal units, the pattern of change of the systematic component of variation between districts and new areal units differed among procedures. Conclusions: Geographic variation in medical service utilization could vary according to the geographic unit of analysis. To determine how geographic characteristics such as population size and number of geographic units affect geographic variation, further studies are needed.

  1. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millenium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study

  2. Ten years of Object-Oriented analysis on H1

    International Nuclear Information System (INIS)

    Laycock, Paul

    2012-01-01

    Over a decade ago, the H1 Collaboration decided to embrace the object-oriented paradigm and completely redesign its data analysis model and data storage format. The event data model, based on the ROOT framework, consists of three layers - tracks and calorimeter clusters, identified particles and finally event summary data - with a singleton class providing unified access. This original solution was then augmented with a fourth layer containing user-defined objects. This contribution will summarise the history of the solutions used, from modifications to the original design, to the evolution of the high-level end-user analysis object framework which is used by H1 today. Several important issues are addressed - the portability of expert knowledge to increase the efficiency of data analysis, the flexibility of the framework to incorporate new analyses, the performance and ease of use, and lessons learned for future projects.

  3. Convergence analysis of variational and non-variational multigrid algorithms for the Laplace-Beltrami operator

    KAUST Repository

    Bonito, Andrea; Pasciak, Joseph E.

    2012-01-01

    is captured well enough by the coarsest grid. The main argument hinges on a perturbation analysis from an auxiliary variational algorithm defined directly on the smooth surface. In addition, the vanishing mean value constraint is imposed on each level, thereby

  4. Comparative analysis of imaging configurations and objectives for Fourier microscopy.

    Science.gov (United States)

    Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid

    2015-11-01

    Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.

  5. Bi-variate statistical attribute filtering : A tool for robust detection of faint objects

    NARCIS (Netherlands)

    Teeninga, Paul; Moschini, Ugo; Trager, Scott C.; Wilkinson, M.H.F.

    2013-01-01

    We present a new method for morphological connected attribute filtering for object detection in astronomical images. In this approach, a threshold is set on one attribute (power), based on its distribution due to noise, as a function of object area. The results show an order of magnitude higher

  6. Exergoeconomic multi objective optimization and sensitivity analysis of a regenerative Brayton cycle

    International Nuclear Information System (INIS)

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2016-01-01

    Highlights: • Finite time exergoeconomic multi objective optimization of a Brayton cycle. • Comparing the exergoeconomic and the ecological function optimization results. • Inserting the cost of fluid streams concept into finite-time thermodynamics. • Exergoeconomic sensitivity analysis of a regenerative Brayton cycle. • Suggesting the cycle performance curve drawing and utilization. - Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power maximization and then exergoeconomic optimization using finite-time thermodynamic concept and finite-size components. Optimizations are performed using genetic algorithm. In order to take into account the finite-time and finite-size concepts in current problem, a dimensionless mass-flow parameter is used deploying time variations. The decision variables for the optimum state (of multi objective exergoeconomic optimization) are compared to the maximum power state. One can see that the multi objective exergoeconomic optimization results in a better performance than that obtained with the maximum power state. The results demonstrate that system performance at optimum point of multi objective optimization yields 71% of the maximum power, but only with exergy destruction as 24% of the amount that is produced at the maximum power state and 67% lower total cost rate than that of the maximum power state. In order to assess the impact of the variation of the decision variables on the objective functions, sensitivity analysis is conducted. Finally, the cycle performance curve drawing according to exergoeconomic multi objective optimization results and its utilization, are suggested.

  7. Frame sequences analysis technique of linear objects movement

    Science.gov (United States)

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods are often needed in many fields of science and engineering. This is achieved through video recording in various frame rate and light spectra. In doing so quantitative analysis of movement of the objects being studied becomes an important component of the research. This work discusses analysis of motion of linear objects on the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at the resolution of 62 × 62 pixels and frame rate of 2 Hz. It was required to determine the average velocity of objects motion. This velocity was found as an average velocity for 8-12 objects with the error of 15%. After processing dependencies of the average velocity vs. control parameters were found. The processing was performed in the software environment GMimPro with the subsequent approximation of the data obtained using the Hill equation.

  8. Data analysis in an object request broker environment

    International Nuclear Information System (INIS)

    Malon, David M.; May, Edward N.; Grossman, Robert L.; Day, Christopher T.; Quarrie, David R.

    1996-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanism for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function is such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study. (author)

  9. Forest Rent as an Object of Economic Analysis

    Directory of Open Access Journals (Sweden)

    Lisichko Andriyana M.

    2018-01-01

    Full Text Available The article is aimed at researching the concept of forest rent as an object of economic analysis. The essence of the concept of «forest rent» has been researched. It has been defined that the forest rent is the object of management of the forest complex of Ukraine as a whole and forest enterprises in particular. Rent for special use of forest resources is the object of interest om the part of both the State and the corporate sector, because its value depends on the cost of timber for industry and households. Works of scholars on classification of rents were studied. It has been determined that the rent for specialized use of forest resources is a special kind of natural rent. The structure of constituents in the system of rent relations in the forest sector has been defined in accordance with provisions of the tax code of Ukraine.

  10. SVAMP: Sequence variation analysis, maps and phylogeny

    KAUST Repository

    Naeem, Raeece; Hidayah, Lailatul; Preston, Mark D.; Clark, Taane G.; Pain, Arnab

    2014-01-01

    Summary: SVAMP is a stand-alone desktop application to visualize genomic variants (in variant call format) in the context of geographical metadata. Users of SVAMP are able to generate phylogenetic trees and perform principal coordinate analysis

  11. Fast grasping of unknown objects using principal component analysis

    Science.gov (United States)

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

    Fast grasping of unknown objects has crucial impact on the efficiency of robot manipulation especially subjected to unfamiliar environments. In order to accelerate grasping speed of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be more quickly grasped provided that the component analysis principle axis is determined using single-view partial point cloud. To cope with the grasp uncertainty, robot motion is assisted to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast gasping algorithm. Both simulation and experimental tests demonstrated excellent performances based on the results of grasping a series of unknown objects. To minimize the grasping uncertainty, the merits of the robot hardware with two 3D cameras can be utilized to suffice the partial point cloud. As a result of utilizing the robot hardware, the grasping reliance is highly enhanced. Therefore, this research demonstrates practical significance for increasing grasping speed and thus increasing robot efficiency under unpredictable environments.

  12. Individual variation in human spatial ability: differences between men and women in object location memory.

    NARCIS (Netherlands)

    Goede, M. de; Kessels, R.P.C.; Postma, A.

    2006-01-01

    One of the most consistent findings in the area of cognitive sex differences is that males outperform females on many spatial tasks. One exception seems to be object location memory. On this task, females tend to perform better than males. However, the existing studies have provided quite mixed

  13. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T- test result showed that Prosopis africana seed trade is profitable and ...

  14. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize followup efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially

  15. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in analysis of job shop type of manufacturing. But certain facilities make it suitable for FMS as well as a production line manufacturing. This type of simulation is very useful in analysis of any types of changes that occur in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on for instance the throughput, the amount of VIP, the costs or the net profit, can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine, has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down to events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.

  16. Head First Object-Oriented Analysis and Design

    CERN Document Server

    McLaughlin, Brett D; West, David

    2006-01-01

    "Head First Object Oriented Analysis and Design is a refreshing look at subject of OOAD. What sets this book apart is its focus on learning. The authors have made the content of OOAD accessible, usable for the practitioner." Ivar Jacobson, Ivar Jacobson Consulting "I just finished reading HF OOA&D and I loved it! The thing I liked most about this book was its focus on why we do OOA&D-to write great software!" Kyle Brown, Distinguished Engineer, IBM "Hidden behind the funny pictures and crazy fonts is a serious, intelligent, extremely well-crafted presentation of OO Analysis and Design

  17. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these development and its implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ' per-pixel paradigm ' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  18. Background Noises Versus Intraseasonal Variation Signals: Small vs. Large Convective Cloud Objects From CERES Aqua Observations

    Science.gov (United States)

    Xu, Kuan-Man

    2015-01-01

    During inactive phases of Madden-Julian Oscillation (MJO), there are plenty of deep but small convective systems and far fewer deep and large ones. During active phases of MJO, a manifestation of an increase in the occurrence of large and deep cloud clusters results from an amplification of large-scale motions by stronger convective heating. This study is designed to quantitatively examine the roles of small and large cloud clusters during the MJO life cycle. We analyze the cloud object data from Aqua CERES (Clouds and the Earth's Radiant Energy System) observations between July 2006 and June 2010 for tropical deep convective (DC) and cirrostratus (CS) cloud object types according to the real-time multivariate MJO index, which assigns the tropics to one of the eight MJO phases each day. The cloud object is a contiguous region of the earth with a single dominant cloud-system type. The criteria for defining these cloud types are overcast footprints and cloud top pressures less than 400 hPa, but DC has higher cloud optical depths (=10) than those of CS (background noises resulting from various types of the tropical waves with different wavenumbers and propagation speeds/directions.

  19. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    Directory of Open Access Journals (Sweden)

    Adam W Green

    Full Text Available Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA and available information to inform a formal decision process to determine optimal and timely management policies.

  20. Mediman: Object oriented programming approach for medical image analysis

    International Nuclear Information System (INIS)

    Coppens, A.; Sibomana, M.; Bol, A.; Michel, C.

    1993-01-01

    Mediman is a new image analysis package which has been developed to analyze quantitatively Positron Emission Tomography (PET) data. It is object-oriented, written in C++ and its user interface is based on InterViews on top of which new classes have been added. Mediman accesses data using external data representation or import/export mechanism which avoids data duplication. Multimodality studies are organized in a simple database which includes images, headers, color tables, lists and objects of interest (OOI's) and history files. Stored color table parameters allow to focus directly on the interesting portion of the dynamic range. Lists allow to organize the study according to modality, acquisition protocol, time and spatial properties. OOI's (points, lines and regions) are stored in absolute 3-D coordinates allowing correlation with other co-registered imaging modalities such as MRI or SPECT. OOI's have visualization properties and are organized into groups. Quantitative ROI analysis of anatomic images consists of position, distance, volume calculation on selected OOI's. An image calculator is connected to mediman. Quantitation of metabolic images is performed via profiles, sectorization, time activity curves and kinetic modeling. Mediman is menu and mouse driven, macro-commands can be registered and replayed. Its interface is customizable through a configuration file. The benefit of the object-oriented approach are discussed from a development point of view

  1. Convergence analysis of variational and non-variational multigrid algorithms for the Laplace-Beltrami operator

    KAUST Repository

    Bonito, Andrea

    2012-09-01

    We design and analyze variational and non-variational multigrid algorithms for the Laplace-Beltrami operator on a smooth and closed surface. In both cases, a uniform convergence for the V -cycle algorithm is obtained provided the surface geometry is captured well enough by the coarsest grid. The main argument hinges on a perturbation analysis from an auxiliary variational algorithm defined directly on the smooth surface. In addition, the vanishing mean value constraint is imposed on each level, thereby avoiding singular quadratic forms without adding additional computational cost. Numerical results supporting our analysis are reported. In particular, the algorithms perform well even when applied to surfaces with a large aspect ratio. © 2011 American Mathematical Society.

  2. Economic emission dispatching with variations of wind power and loads using multi-objective optimization by learning automata

    International Nuclear Information System (INIS)

    Liao, H.L.; Wu, Q.H.; Li, Y.Z.; Jiang, L.

    2014-01-01

    Highlights: • Apply multi-objective optimization by learning automata to power system. • Sequentially dimensional search and state memory are incorporated. • Track dispatch under significant variations of wind power and load demand. • Good performance in terms of accuracy, distribution and computation time. - Abstract: This paper is concerned with using multi-objective optimization by learning automata (MOLA) for economic emission dispatching in the environment where wind power and loads vary. With its capabilities of sequentially dimensional search and state memory, MOLA is able to find accurate solutions while satisfying two objectives: fuel cost coupled with environmental emission and voltage stability. Its searching quality and efficiency are measured using the hypervolume indicator for investigating the quality of Pareto front, and demonstrated by tracking the dispatch solutions under significant variations of wind power and load demand. The simulation studies are carried out on the modified midwestern American electric power system and the IEEE 118-bus test system, in which wind power penetration and load variations present. Evaluated on these two power systems, MOLA is fully compared with multi-objective evolutionary algorithm based on decomposition (MOEA/D) and non-dominated sorting genetic algorithm II (NSGA-II). The simulation results have shown the superiority of MOLA over NAGA-II and MOEA/D, as it is able to obtain more accurate and widely distributed Pareto fronts. In the dynamic environment where the operation condition of both wind speed and load demand varies, MOLA outperforms the other two algorithms, with respect to the tracking ability and accuracy of the solutions

  3. EGYPTIAN MUTUAL FUNDS ANALYSIS: HISTORY, PERFORMANCE, OBJECTIVES, RISK AND RETURN

    Directory of Open Access Journals (Sweden)

    Petru STEFEA

    2013-10-01

    Full Text Available The present research aims to overview the mutual fund in Egypt. The establishment of the first mutual funds was achieved in 1994. Nowadays, the total mutual funds reached 90 funds , approximately. The income funds represent the largest share of the Egyptian mutual funds (40%, growth funds (25% and the private equity funds is at least (1%. The total population of the Egyptian mutual funds reached 22. Finally, the study proved that the Egyptian mutual funds have an impact on fund return , total risk and systemic; when analysis relationship between risk and return. The study found influencing for mutual fund's objectives on Sharpe and Terynor ratios.

  4. Object-Based Image Analysis in Wetland Research: A Review

    Directory of Open Access Journals (Sweden)

    Iryna Dronova

    2015-05-01

    Full Text Available The applications of object-based image analysis (OBIA in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of the local noise, incorporating meaningful non-spectral features for class separation and accounting for landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scale and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications and the main research needs and directions to expand the OBIA capacity in the future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  5. Objective image analysis of the meibomian gland area.

    Science.gov (United States)

    Arita, Reiko; Suehiro, Jun; Haraguchi, Tsuyoshi; Shirakawa, Rika; Tokoro, Hideaki; Amano, Shiro

    2014-06-01

    To evaluate objectively the meibomian gland area using newly developed software for non-invasive meibography. Eighty eyelids of 42 patients without meibomian gland loss (meiboscore=0), 105 eyelids of 57 patients with loss of less than one-third total meibomian gland area (meiboscore=1), 13 eyelids of 11 patients with between one-third and two-thirds loss of meibomian gland area (meiboscore=2) and 20 eyelids of 14 patients with two-thirds loss of meibomian gland area (meiboscore=3) were studied. Lid borders were automatically determined. The software evaluated the distribution of the luminance and, by enhancing the contrast and reducing image noise, the meibomian gland area was automatically discriminated. The software calculated the ratio of the total meibomian gland area relative to the total analysis area in all subjects. Repeatability of the software was also evaluated. The mean ratio of the meibomian gland area to the total analysis area in the upper/lower eyelids was 51.9±5.7%/54.7±5.4% in subjects with a meiboscore of 0, 47.7±6.0%/51.5±5.4% in those with a meiboscore of 1, 32.0±4.4%/37.2±3.5% in those with a meiboscore of 2 and 16.7±6.4%/19.5±5.8% in subjects with a meiboscore of 3. The meibomian gland area was objectively evaluated using the developed software. This system could be useful for objectively evaluating the effect of treatment on meibomian gland dysfunction. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  6. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several and different types of virtual learning objects in a common pedagogical context. Analisar o conceito de objeto e de ambiente virtual de aprendizagem na perspectiva evolucionária de Rodgers. Estudo descritivo, de abordagem mista, realizado a partir das etapas propostas por Rodgers em seu modelo de análise conceitual. A coleta de dados ocorreu em agosto de 2015 com a busca de dissertações e teses no Banco de Teses e Dissertações da Coordenação de Aperfeiçoamento de Pessoal de Nível Superior. Os dados quantitativos foram analisados a partir de estatística descritiva simples e os conceitos pela análise lexicográfica com suporte do IRAMUTEQ. A amostra é constituída de 161 estudos. O conceito de "ambiente virtual de aprendizagem" foi apresentado em 99 (61,5%) estudos, enquanto o de "objeto virtual de aprendizagem" em apenas 15 (9,3%). Concluiu-se que um ambiente virtual de aprendizagem reúne vários e diferentes tipos de objetos virtuais de aprendizagem em um contexto pedagógico comum.

  7. On the analysis of line profile variations: A statistical approach

    International Nuclear Information System (INIS)

    McCandliss, S.R.

    1988-01-01

    This study is concerned with the empirical characterization of the line profile variations (LPV), which occur in many of and Wolf-Rayet stars. The goal of the analysis is to gain insight into the physical mechanisms producing the variations. The analytic approach uses a statistical method to quantify the significance of the LPV and to identify those regions in the line profile which are undergoing statistically significant variations. Line positions and flux variations are then measured and subject to temporal and correlative analysis. Previous studies of LPV have for the most part been restricted to observations of a single line. Important information concerning the range and amplitude of the physical mechanisms involved can be obtained by simultaneously observing spectral features formed over a range of depths in the extended mass losing atmospheres of massive, luminous stars. Time series of a Wolf-Rayet and two of stars with nearly complete spectral coverage from 3940 angstrom to 6610 angstrom and with spectral resolution of R = 10,000 are analyzed here. These three stars exhibit a wide range of both spectral and temporal line profile variations. The HeII Pickering lines of HD 191765 show a monotonic increase in the peak rms variation amplitude with lines formed at progressively larger radii in the Wolf-Rayet star wind. Two times scales of variation have been identified in this star: a less than one day variation associated with small scale flickering in the peaks of the line profiles and a greater than one day variation associated with large scale asymmetric changes in the overall line profile shapes. However, no convincing period phenomena are evident at those periods which are well sampled in this time series

  8. Error analysis of motion correction method for laser scanning of moving objects

    Science.gov (United States)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned should be static. The need of scanning moving objects has resulted in the development of new methods capable of generating correct 3D geometry of moving objects. Limited literature is available showing development of very few methods capable of catering to the problem of object motion during scanning. All the existing methods utilize their own models or sensors. Any studies on error modelling or analysis of any of the motion correction methods are found to be lacking in literature. In this paper, we develop the error budget and present the analysis of one such `motion correction' method. This method assumes availability of position and orientation information of the moving object which in general can be obtained by installing a POS system on board or by use of some tracking devices. It then uses this information along with laser scanner data to apply correction to laser data, thus resulting in correct geometry despite the object being mobile during scanning. The major application of this method lie in the shipping industry to scan ships either moving or parked in the sea and to scan other objects like hot air balloons or aerostats. It is to be noted that the other methods of "motion correction" explained in literature can not be applied to scan the objects mentioned here making the chosen method quite unique. This paper presents some interesting insights in to the functioning of "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components alone and in combination with each other. The analysis can be used to obtain insights in to optimal utilization of available components for achieving the best results.

  9. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  10. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  11. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in the face recognition. To deal with this problem, this paper presents comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized image. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA based methods gives a satisfactory result under controlled lighting condition. But its performance under large illumination variation is not satisfactory. So, the performance of LDA is checked on normalized image. Experimental results on the Yale B and ORL database show that the proposed approach of application of PCA and LDA on normalized dataset improves the performance significantly for the face images with large illumination variations.

  12. Joint Tensor Feature Analysis For Visual Object Recognition.

    Science.gov (United States)

    Wong, Wai Keung; Lai, Zhihui; Xu, Yong; Wen, Jiajun; Ho, Chu Po

    2015-11-01

    Tensor-based object recognition has been widely studied in the past several years. This paper focuses on the issue of joint feature selection from the tensor data and proposes a novel method called joint tensor feature analysis (JTFA) for tensor feature extraction and recognition. In order to obtain a set of jointly sparse projections for tensor feature extraction, we define the modified within-class tensor scatter value and the modified between-class tensor scatter value for regression. The k-mode optimization technique and the L(2,1)-norm jointly sparse regression are combined together to compute the optimal solutions. The convergent analysis, computational complexity analysis and the essence of the proposed method/model are also presented. It is interesting to show that the proposed method is very similar to singular value decomposition on the scatter matrix but with sparsity constraint on the right singular value matrix or eigen-decomposition on the scatter matrix with sparse manner. Experimental results on some tensor datasets indicate that JTFA outperforms some well-known tensor feature extraction and selection algorithms.

  13. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

    Full Text Available The purpose of this work is research and modification of the reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of formal assessments that more closely corresponds to the subjective expert estimates (MOS.In considering the formal reference objective methods for image quality assessment we used the results of other authors, which offer results and comparative analyzes of the most effective algorithms. Based on these investigations we have chosen two of the most successful algorithm for which was made a further analysis in the MATLAB 7.8 R 2009 a (PQS and MSSSIM. The publication focuses on the features of the algorithms, which have great importance in practical implementation, but are insufficiently covered in the publications by other authors.In the implemented modification of the algorithm PQS boundary detector Kirsch was replaced by the boundary detector Canny. Further experiments were carried out according to the method of the ITU-R VT.500-13 (01/2012 using monochrome images treated with different types of filters (should be emphasized that an objective assessment of image quality PQS is applicable only to monochrome images. Images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification.In the specialized literature in the field of formal to evaluation methods pictures, this type of modification was not mentioned.The method described in the publication can be applied to various practical implementations of digital image processing.Advisability and effectiveness of using the modified method of PQS to assess the structural differences between the images are shown in the article and this will be used in solving the problems of identification and automatic control.

  14. Analysis of Camera Parameters Value in Various Object Distances Calibration

    International Nuclear Information System (INIS)

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

    In photogrammetric applications, good camera parameters are needed for mapping purpose such as an Unmanned Aerial Vehicle (UAV) that encompassed with non-metric camera devices. Simple camera calibration was being a common application in many laboratory works in order to get the camera parameter's value. In aerial mapping, interior camera parameters' value from close-range camera calibration is used to correct the image error. However, the causes and effects of the calibration steps used to get accurate mapping need to be analyze. Therefore, this research aims to contribute an analysis of camera parameters from portable calibration frame of 1.5 × 1 meter dimension size. Object distances of two, three, four, five, and six meters are the research focus. Results are analyzed to find out the changes in image and camera parameters' value. Hence, camera calibration parameter's of a camera is consider different depend on type of calibration parameters and object distances

  15. Variational analysis of regular mappings theory and applications

    CERN Document Server

    Ioffe, Alexander D

    2017-01-01

    This monograph offers the first systematic account of (metric) regularity theory in variational analysis. It presents new developments alongside classical results and demonstrates the power of the theory through applications to various problems in analysis and optimization theory. The origins of metric regularity theory can be traced back to a series of fundamental ideas and results of nonlinear functional analysis and global analysis centered around problems of existence and stability of solutions of nonlinear equations. In variational analysis, regularity theory goes far beyond the classical setting and is also concerned with non-differentiable and multi-valued operators. The present volume explores all basic aspects of the theory, from the most general problems for mappings between metric spaces to those connected with fairly concrete and important classes of operators acting in Banach and finite dimensional spaces. Written by a leading expert in the field, the book covers new and powerful techniques, whic...

  16. Hospital Variation in Cesarean Delivery: A Multilevel Analysis.

    Science.gov (United States)

    Vecino-Ortiz, Andres I; Bardey, David; Castano-Yepes, Ramon

    2015-12-01

    To assess the issue of hospital variations in Colombia and to contribute to the methodology on health care variations by using a model that clusters the variance between hospitals while accounting for individual-level reimbursement rates and objective health-status variables. We used data on all births (N = 11,954) taking place in a contributory-regimen insurer network in Colombia during 2007. A multilevel logistic regression model was used to account for the share of unexplained variance between hospitals. In addition, an alternative variance decomposition specification was further carried out to measure the proportion of such unexplained variance due to the region effect. Hospitals account for 20% of the variation in performing cesarean sections, whereas region explains only one-third of such variance. Variables accounting for preferences on the demand side as well as reimbursement rates are found to predict the probability of performing cesarean sections. Hospital variations explain large variances within a single-payer's network. Because this insurer company is highly regarded in terms of performance and finance, these results might provide a lower bound for the scale of hospital variation in the Colombian health care market. Such lower bound provides guidance on the relevance of this issue for Colombia. Some factors such as demand-side preferences and physician reimbursement rates increase variations in health care even within a single-payer network. This is a source of inefficiencies, threatening the quality of health care and financial sustainability. The proposed methodology should be considered in further research on health care variations. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. Introduction and application of the multiscale coefficient of variation analysis.

    Science.gov (United States)

    Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh

    2017-10-01

    Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.

  18. Variational analysis and generalized differentiation I basic theory

    CERN Document Server

    Mordukhovich, Boris S

    2006-01-01

    Contains a study of the basic concepts and principles of variational analysis and generalized differentiation in both finite-dimensional and infinite-dimensional spaces. This title presents many applications to problems in optimization, equilibria, stability and sensitivity, control theory, economics, mechanics, and more.

  19. Poka Yoke system based on image analysis and object recognition

    Science.gov (United States)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a method of quality management which is related to prevent faults from arising during production processes. It deals with “fail-sating” or “mistake-proofing”. The Poka-yoke concept was generated and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a higher cost than necessary cost of disposal. Usually, poke yoke solutions are based on multiple sensors that identify some nonconformities. This means the presence of different equipment (mechanical, electronic) on production line. As a consequence, coupled with the fact that the method itself is an invasive, affecting the production process, would increase its price diagnostics. The bulky machines are the means by which a Poka Yoke system can be implemented become more sophisticated. In this paper we propose a solution for the Poka Yoke system based on image analysis and identification of faults. The solution consists of a module for image acquisition, mid-level processing and an object recognition module using associative memory (Hopfield network type). All are integrated into an embedded system with AD (Analog to Digital) converter and Zync 7000 (22 nm technology).

  20. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  1. Theory of planned behaviour variables and objective walking behaviour do not show seasonal variation in a randomised controlled trial.

    Science.gov (United States)

    Williams, Stefanie L; French, David P

    2014-02-05

    Longitudinal studies have shown that objectively measured walking behaviour is subject to seasonal variation, with people walking more in summer compared to winter. Seasonality therefore may have the potential to bias the results of randomised controlled trials if there are not adequate statistical or design controls. Despite this there are no studies that assess the impact of seasonality on walking behaviour in a randomised controlled trial, to quantify the extent of such bias. Further there have been no studies assessing how season impacts on the psychological predictors of walking behaviour to date. The aim of the present study was to assess seasonal differences in a) objective walking behaviour and b) Theory of Planned Behaviour (TPB) variables during a randomised controlled trial of an intervention to promote walking. 315 patients were recruited to a two-arm cluster randomised controlled trial of an intervention to promote walking in primary care. A series of repeated measures ANCOVAs were conducted to examine the effect of season on pedometer measures of walking behaviour and TPB measures, assessed immediately post-intervention and six months later. Hierarchical regression analyses were conducted to assess whether season moderated the prediction of intention and behaviour by TPB measures. There were no significant differences in time spent walking in spring/summer compared to autumn/winter. There was no significant seasonal variation in most TPB variables, although the belief that there will be good weather was significantly higher in spring/summer (F = 19.46, p behaviour, or moderate the effects of TPB variables on intention or behaviour. Seasonality does not influence objectively measured walking behaviour or psychological variables during a randomised controlled trial. Consequently physical activity behaviour outcomes in trials will not be biased by the season in which they are measured. Previous studies may have overestimated the extent of

  2. Static analysis of unbounded structures in object-oriented programs

    NARCIS (Netherlands)

    Grabe, Immo

    2012-01-01

    In this thesis we investigate different techniques and formalisms to address complexity introduced by unbounded structures in object-oriented programs. We give a representation of a weakest precondition calculus for abstract object creation in dynamic logic. Based on this calculus we define symbolic

  3. A Comparative Analysis of Structured and Object-Oriented ...

    African Journals Online (AJOL)

    The concepts of structured and object-oriented programming methods are not relatively new but these approaches are still very much useful and relevant in today's programming paradigm. In this paper, we distinguish the features of structured programs from that of object oriented programs. Structured programming is a ...

  4. Optimized variational analysis scheme of single Doppler radar wind data

    Science.gov (United States)

    Sasaki, Yoshi K.; Allen, Steve; Mizuno, Koki; Whitehead, Victor; Wilk, Kenneth E.

    1989-01-01

    A computer scheme for extracting singularities has been developed and applied to single Doppler radar wind data. The scheme is planned for use in real-time wind and singularity analysis and forecasting. The method, known as Doppler Operational Variational Extraction of Singularities is outlined, focusing on the principle of local symmetry. Results are presented from the application of the scheme to a storm-generated gust front in Oklahoma on May 28, 1987.

  5. Towards a syntactic analysis of European Portuguese cognate objects

    Directory of Open Access Journals (Sweden)

    Celda Morgado Choupina

    2013-01-01

    Full Text Available The present paper aims at discussing selected syntactic aspects of cognate objects in European Portuguese, along the lines of Distributed Morphology (Haugen, 2009. Cognate objects may be readily discovered in numerous human languages, including European Portuguese (Chovia uma chuva miudinha. It is assumed in papers devoted to their English counterparts that they belong to various subclasses. Indeed, some of them are genuine cognates (to sleep a sleep... or hyponyms (to dance a jig; Hale & Keyser, 2002. It turns out that in European Portuguese, they can be split into four different categories: (i genuine cognate objects (chorar um choro..., (ii similar cognate objects (dançar uma dança (iii objects hyponyms (dançar um tango and (iv prepositional cognate objects (morrer de uma morte .... There are, then, significant differences between various classes of cognate objects: whereas the genuine ones call imperatively for a restrictive modifier and a definite article, the remaining ones admit it only optionally. It might be concluded, then, that a lexicalist theory set up along the lines of Hale and Keyser is unable to deal successfully with distributional facts proper to various classes of cognate constructions in European Portuguese. That is why the present study is conducted more in accordance with syntactic principles of Distributed Morphology, with a strong impact of hypotheses put forward by Haugen (2009.

  6. FEM analysis of impact of external objects to pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Gracie, Robert; Konuk, Ibrahim [Geological Survey of Canada, Ottawa, ON (Canada)]. E-mail: ikonuk@NRCan.gc.ca; Fredj, Abdelfettah [BMT Fleet Technology Limited, Ottawa, ON (Canada)

    2003-07-01

    One of the most common hazards to pipelines is impact of external objects. Earth moving machinery, farm equipment or bullets can dent or fail land pipelines. External objects such as anchors, fishing gear, ice can damage offshore pipelines. This paper develops an FEM model to simulate the impact process and presents investigations using the FEM model to determine the influence of the geometry and velocity of the impacting object and also will study the influence of the pipe diameter, wall thickness, and concrete thickness along with internal pressure. The FEM model is developed by using LS-DYNA explicit FEM software utilizing shell and solid elements. The model allows damage and removal of the concrete and corrosion coating elements during impact. Parametric studies will be presented relating the dent size to pipe diameter, wall thickness and concrete thickness, internal pipe pressure, and impacting object geometry. The primary objective of this paper is to develop and present the FEM model. The model can be applied to both offshore and land pipeline problems. Some examples are used to illustrate how the model can be applied to real life problems. A future paper will present more detailed parametric studies. (author)

  7. Voice analysis as an objective state marker in bipolar disorder

    DEFF Research Database (Denmark)

    Faurholt-Jepsen, M.; Busk, Jonas; Frost, M.

    2016-01-01

    Changes in speech have been suggested as sensitive and valid measures of depression and mania in bipolar disorder. The present study aimed at investigating (1) voice features collected during phone calls as objective markers of affective states in bipolar disorder and (2) if combining voice...... features, automatically generated objective smartphone data on behavioral activities and electronic self-monitored data were collected from 28 outpatients with bipolar disorder in naturalistic settings on a daily basis during a period of 12 weeks. Depressive and manic symptoms were assessed using...... and electronic self-monitored data increased the accuracy, sensitivity and specificity of classification of affective states slightly. Voice features collected in naturalistic settings using smartphones may be used as objective state markers in patients with bipolar disorder....

  8. Software Analysis of Mining Images for Objects Detection

    Directory of Open Access Journals (Sweden)

    Jan Tomecek

    2013-11-01

    Full Text Available The contribution is dealing with the development of the new module of robust FOTOMNG system for editing images from a video or miningimage from measurements for subsequent improvement of detection of required objects in the 2D image. The generated module allows create a finalhigh-quality picture by combination of multiple images with the search objects. We can combine input data according to the parameters or basedon reference frames. Correction of detected 2D objects is also part of this module. The solution is implemented intoFOTOMNG system and finishedwork has been tested in appropriate frames, which were validated core functionality and usability. Tests confirmed the function of each part of themodule, its accuracy and implications of integration.

  9. Voice analysis as an objective state marker in bipolar disorder

    DEFF Research Database (Denmark)

    Faurholt-Jepsen, M.; Busk, Jonas; Frost, M.

    2016-01-01

    features with automatically generated objective smartphone data on behavioral activities (for example, number of text messages and phone calls per day) and electronic self-monitored data (mood) on illness activity would increase the accuracy as a marker of affective states. Using smartphones, voice...... features, automatically generated objective smartphone data on behavioral activities and electronic self-monitored data were collected from 28 outpatients with bipolar disorder in naturalistic settings on a daily basis during a period of 12 weeks. Depressive and manic symptoms were assessed using...... to be more accurate, sensitive and specific in the classification of manic or mixed states with an area under the curve (AUC)=0.89 compared with an AUC=0.78 for the classification of depressive states. Combining voice features with automatically generated objective smartphone data on behavioral activities...

  10. Analysis for the high-level waste disposal cost object

    International Nuclear Information System (INIS)

    Kim, S. K.; Lee, J. R.; Choi, J. W.; Han, P. S.

    2003-01-01

    The purpose of this study is to analyse the ratio of cost object in terms of the disposal cost estimation. According to the result, the ratio of operating cost is the most significant object in total cost. There are a lot of differences between the disposal costs and product costs in view of their constituents. While the product costs may be classified by the direct materials cost, direct manufacturing labor cost, and factory overhead the disposal cost factors should be constituted by the technical factors and the non-technical factors

  11. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  12. Three dimensional analysis of cosmic ray intensity variation

    International Nuclear Information System (INIS)

    Yasue, Shin-ichi; Mori, Satoru; Nagashima, Kazuo.

    1974-01-01

    Three dimensional analysis of cosmic ray anisotropy and its time variation was performed. This paper describes the analysis of the Forbush decrease in Jan. 1968 to investigate by comparing the direction of the magnetic field in interplanetary space and the direction of the reference axis for cosmic ray anisotropy. New anisotropy becomes dominant at the time of Forbush decrease because the anisotropy of cosmic ray in calm state is wiped out. Such anisotropy produces intensity variation in neutron monitors on the ground. The characteristic parameters of three dimensional anisotropy can be determined from theoretical value and observed intensity. Analyzed data were taken for 6 days from Jan. 25 to Jan. 30, 1968, at Deep River. The decrease of intensity at Deep River was seen for several hours from 11 o'clock (UT), Jan. 26, just before The Forbush decrease. This may be due to the loss cone. The Forbush decrease began at 19 o'clock, Jan. 26, and the main phase continued to 5 o'clock in the next morning. The spectrum of variation was Psup(-0.5). The time variations of the magnetic field in interplanetary space and the reference axis of cosmic ray anisotropy are shown for 15 hours. The average directions of both are almost in coincidence. The spatial distribution of cosmic ray near the earth may be expressed by the superposition of axial symmetrical distribution along a reference axis and its push-out to the direction of 12 o'clock. It is considered that the direction of magnetic force line and the velocity of solar wind correspond to the direction of the reference axis and the magnitude of anisotropy in the direction of 12 o'clock, respectively. (Kato, T.)

  13. Insurer’s activity as object of economic analysis

    Directory of Open Access Journals (Sweden)

    O.O. Poplavskiy

    2015-12-01

    Full Text Available The article is devoted to the substantiation of theoretical fundamentals of insurer’s analysis and peculiarities of its implementation. The attention has been focused on the important role of economic analysis in economic science which is confirmed by its active use in research and practical orientation. The author summarizes the classification and principles of insurer’s activity analysis, supplements it with specific principles for insurer’s environment, publicity and risk-orientation which enable increasingly to take into account the peculiarities of insurance relations. The paper pays attention to the specification of elements of analysis and its key directions including the analysis of insurer’s financing, the analysis of insurance operations and the analysis of investment activity which will allow the effective functioning of risk management system.

  14. GuidosToolbox: universal digital image object analysis

    Science.gov (United States)

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, userfriendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  15. Contextual object understanding through geospatial analysis and reasoning (COUGAR)

    Science.gov (United States)

    Douglas, Joel; Antone, Matthew; Coggins, James; Rhodes, Bradley J.; Sobel, Erik; Stolle, Frank; Vinciguerra, Lori; Zandipour, Majid; Zhong, Yu

    2009-05-01

    Military operations in urban areas often require detailed knowledge of the location and identity of commonly occurring objects and spatial features. The ability to rapidly acquire and reason over urban scenes is critically important to such tasks as mission and route planning, visibility prediction, communications simulation, target recognition, and inference of higher-level form and function. Under DARPA's Urban Reasoning and Geospatial ExploitatioN Technology (URGENT) Program, the BAE Systems team has developed a system that combines a suite of complementary feature extraction and matching algorithms with higher-level inference and contextual reasoning to detect, segment, and classify urban entities of interest in a fully automated fashion. Our system operates solely on colored 3D point clouds, and considers object categories with a wide range of specificity (fire hydrants, windows, parking lots), scale (street lights, roads, buildings, forests), and shape (compact shapes, extended regions, terrain). As no single method can recognize the diverse set of categories under consideration, we have integrated multiple state-of-the-art technologies that couple hierarchical associative reasoning with robust computer vision and machine learning techniques. Our solution leverages contextual cues and evidence propagation from features to objects to scenes in order to exploit the combined descriptive power of 3D shape, appearance, and learned inter-object spatial relationships. The result is a set of tools designed to significantly enhance the productivity of analysts in exploiting emerging 3D data sources.

  16. Multi-element analysis of unidentified fallen objects from Tatale in ...

    African Journals Online (AJOL)

    A multi-element analysis has been carried out on two fallen objects, # 01 and # 02, using instrumental neutron activation analysis technique. A total of 17 elements were identified in object # 01 while 21 elements were found in object # 02. The two major elements in object # 01 were Fe and Mg, which together constitute ...

  17. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    Science.gov (United States)

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  18. X-ray analysis of objects of art and archaeology

    International Nuclear Information System (INIS)

    Mantler, M.; Schreiner, M.

    2001-01-01

    Some theoretical aspects and limitations of XRF are discussed, including information depths in layered materials, characterization of inhomogeneous specimens, light element analysis, and radiation damage. Worked examples of applications of XRF and XRD are pigment analysis in delicate Chinese Paper, corrosion of glass, and leaching effects in soil-buried medieval coins. (author)

  19. The MUSIC algorithm for sparse objects: a compressed sensing analysis

    International Nuclear Information System (INIS)

    Fannjiang, Albert C

    2011-01-01

    The multiple signal classification (MUSIC) algorithm, and its extension for imaging sparse extended objects, with noisy data is analyzed by compressed sensing (CS) techniques. A thresholding rule is developed to augment the standard MUSIC algorithm. The notion of restricted isometry property (RIP) and an upper bound on the restricted isometry constant (RIC) are employed to establish sufficient conditions for the exact localization by MUSIC with or without noise. In the noiseless case, the sufficient condition gives an upper bound on the numbers of random sampling and incident directions necessary for exact localization. In the noisy case, the sufficient condition assumes additionally an upper bound for the noise-to-object ratio in terms of the RIC and the dynamic range of objects. This bound points to the super-resolution capability of the MUSIC algorithm. Rigorous comparison of performance between MUSIC and the CS minimization principle, basis pursuit denoising (BPDN), is given. In general, the MUSIC algorithm guarantees to recover, with high probability, s scatterers with n=O(s 2 ) random sampling and incident directions and sufficiently high frequency. For the favorable imaging geometry where the scatterers are distributed on a transverse plane MUSIC guarantees to recover, with high probability, s scatterers with a median frequency and n=O(s) random sampling/incident directions. Moreover, for the problems of spectral estimation and source localizations both BPDN and MUSIC guarantee, with high probability, to identify exactly the frequencies of random signals with the number n=O(s) of sampling times. However, in the absence of abundant realizations of signals, BPDN is the preferred method for spectral estimation. Indeed, BPDN can identify the frequencies approximately with just one realization of signals with the recovery error at worst linearly proportional to the noise level. Numerical results confirm that BPDN outperforms MUSIC in the well-resolved case while

  20. Object-oriented data analysis framework for neutron scattering experiments

    International Nuclear Information System (INIS)

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    Materials and Life Science Facility (MLF) of Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities that provided the highest intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and instruments. It is important that the computing environment is provided by the facility side, because meta-data formats, the analysis functions and also data analysis strategy should be shared among many instruments in MLF. The C++ class library, named Manyo-lib, is a framework software for developing data reduction and analysis softwares. The framework is composed of the class library for data reduction and analysis operators, network distributed data processing modules and data containers. The class library is wrapped by the Python interface created by SWIG. All classes of the framework can be called from Python language, and Manyo-lib will be cooperated with the data acquisition and data-visualization components through the MLF-platform, a user interface unified in MLF, which is working on Python language. Raw data in the event-data format obtained by data acquisition systems will be converted into histogram format data on Manyo-lib in high performance, and data reductions and analysis are performed with user-application software developed based on Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers in Manyo-lib have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus file. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data-analysis softwares for each instrument are being developed by the instrument teams.

  1. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Directory of Open Access Journals (Sweden)

    Andrea Cavallaro

    2004-06-01

    Full Text Available The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme based on region segmentation and semantic segmentation is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas the high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one “sees” in a scene depends on the scene itself (region segmentation as well as on the cognitive task (semantic segmentation at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties and the definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to the sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to

  2. Multispectral image analysis for object recognition and classification

    Science.gov (United States)

    Viau, C. R.; Payeur, P.; Cretu, A.-M.

    2016-05-01

    Computer and machine vision applications are used in numerous fields to analyze static and dynamic imagery in order to assist or automate decision-making processes. Advancements in sensor technologies now make it possible to capture and visualize imagery at various wavelengths (or bands) of the electromagnetic spectrum. Multispectral imaging has countless applications in various fields including (but not limited to) security, defense, space, medical, manufacturing and archeology. The development of advanced algorithms to process and extract salient information from the imagery is a critical component of the overall system performance. The fundamental objective of this research project was to investigate the benefits of combining imagery from the visual and thermal bands of the electromagnetic spectrum to improve the recognition rates and accuracy of commonly found objects in an office setting. A multispectral dataset (visual and thermal) was captured and features from the visual and thermal images were extracted and used to train support vector machine (SVM) classifiers. The SVM's class prediction ability was evaluated separately on the visual, thermal and multispectral testing datasets.

  3. Analysis of process parameters in surface grinding using single objective Taguchi and multi-objective grey relational grade

    Directory of Open Access Journals (Sweden)

    Prashant J. Patil

    2016-09-01

    Full Text Available Close tolerance and good surface finish are achieved by means of grinding process. This study was carried out for multi-objective optimization of MQL grinding process parameters. Water based Al2O3 and CuO nanofluids of various concentrations are used as lubricant for MQL system. Grinding experiments were carried out on instrumented surface grinding machine. For experimentation purpose Taguchi's method was used. Important process parameters that affect the G ratio and surface finish in MQL grinding are depth of cut, type of lubricant, feed rate, grinding wheel speed, coolant flow rate, and nanoparticle size. Grinding performance was calculated by the measurement G ratio and surface finish. For improvement of grinding process a multi-objective process parameter optimization is performed by use of Taguchi based grey relational analysis. To identify most significant factor of process analysis of variance (ANOVA has been used.

  4. Object permanence in cats: Analysis in locomotor space.

    Science.gov (United States)

    Thinus-Blanc, C; Poucet, B; Chapuis, N

    1982-04-01

    Stages IV and V object permanence were studied with 38-40-week-old cats. A constraining apparatus preventing animals from pursuing the bowl containing meat before it was concealed was used. Either the bowl was seen moving and disappeared from view behind a screen (stage IV trials), or after this sequence, it reappeared from behind the first screen and disappeared behind a second screen (stage V trials). In both situations cats performed significantly above chance but the paths taken to reach the food were different according to the stage. In stage V trials, cats expressed a preference for the path leading to the end of the second screen where the food was last seen disappearing. Copyright © 1982. Published by Elsevier B.V.

  5. Which diabetic patients should receive podiatry care? An objective analysis.

    Science.gov (United States)

    McGill, M; Molyneaux, L; Yue, D K

    2005-08-01

    Diabetes is the leading cause of lower limb amputation in Australia. However, due to limited resources, it is not feasible for everyone with diabetes to access podiatry care, and some objective guidelines of who should receive podiatry is required. A total of 250 patients with neuropathy (Biothesiometer; Biomedical Instruments, Newbury, Ohio, USA) ( > 30, age podiatry care (mean of estimates from 10 reports), the NNT to prevent one foot ulcer per year was: no neuropathy (vibration perception threshold (VPT) 30) alone, NNT = 45; +cannot feel monofilament, NNT = 18; +previous ulcer/amputation, NNT = 7. Provision of podiatry care to diabetic patients should not be only economically based, but should also be directed to those with reduced sensation, especially where there is a previous history of ulceration or amputation.

  6. Heating Development Analysis in Long HTS Objects - Updated Results

    Energy Technology Data Exchange (ETDEWEB)

    Vysotsky, V S; Repnikov, V V; Lobanov, E A; Karapetyan, G H; Sytnikov, V E [All-Russian Scientific R and D Cable Institute, 5, Shosse Entuziastov, 111024, Moscow (Russian Federation)

    2006-06-01

    During fault in a grid large overload current, up to 30-times fold, forcibly will go to an HTS superconducting cable installed in a grid causing its quench and heating. The upgraded model has been used to analyse the heating development in long HTS objects during overloads. The model better presents real properties of materials used. New calculations coincide well with experiments and permit to determine the cooling coefficients. The stability limit (thermal runaway current) was determined for different cooling and index n. The overload currents, at which the superconductor will be heated up to 100 K during 250 ms can be determined also. The model may be used for practical evaluations of operational parameters.

  7. Introductory Psychology Textbooks: An Objective Analysis and Update.

    Science.gov (United States)

    Griggs, Richard A.; Jackson, Sherri L.; Christopher, Andrew N.; Marek, Pam

    1999-01-01

    Explores changes in the introductory psychology textbook market through an analysis of edition, author, length, and content coverage of the volumes that comprise the current market. Finds a higher edition average, a decrease in the number of authors, an increase in text pages, and a focus on developmental psychology and sensation/perception. (CMK)

  8. Analysis of Δ14C variations in atmosphere

    International Nuclear Information System (INIS)

    Simon, J.; Sivo, A.; Richtarikova, M.; Holy, K.; Polaskova, A.; Bulko, M.; Hola, O.

    2005-01-01

    The Δ 14 C in the atmosphere have been measured and studied in two localities of Slovakia. The accomplished analysis proved the existence of the annual variations of the Δ 14 C with the attenuating amplitude and decreasing mean value. It seems to be logical and physically correct to describe the Δ 14 C time-dependence by the equation: y = Ae -at + Be -bt cos(ω 1 t + (φ)). The coefficients A, a, B, b, (φ) are listed in the table for both the localities. The observed variations of the Δ 14 C have a maximum in summer and minimum in winter .Probably it is caused by the higher requirement of the heat supply in winter season which is connected directly with the fossil CO 2 emissions and more intensive Suess effect. Summer maximum could be explained by the combination of the lower CO 2 emission rate and higher turbulent transport of the stratospheric 14 C to the troposphere. Using the Fourier harmonic analysis the amplitude spectra of the average annual variations were plotted. The obtained result shows that the variations have the high degree of symmetry. Furthermore, the obtained basic frequency ω 1 = 2π/12 [month -1 ] proves that the cyclic processes with the period of T = 12 [month] have a major influence on the 14 C amount in the troposphere. The presence of some higher-order harmonics is significant, but a physical interpretation has not yet been clear. In addition to the main frequency there are presented also 2ω 1 and 3ω 1 in Bratislava and 4ω 1 in Zlkovce data-set. The long-time average of the Δ 14 C in Zlkovce during years 1995-2004 is higher of about 6.6 o / oo than in Bratislava. It represents an unique evidence that the local CO 2 pollution affects the 14 C activity . The correlation on the level R 2 = 0,43 was found between Bratislava and Zlkovce atmospheric Δ 14 C data. (authors)

  9. The Army Communications Objectives Measurement System (ACOMS): Survey Analysis Plan

    Science.gov (United States)

    1988-05-01

    Analysis Plan 12. PERSONAL AUTHOR(S) Gregory H. Gaertner (Westat) and Timothy W. Elig (ARI), editors 13a. TYPE OF REPORT 13b. TIME COVERED 14. DATE OF...such as those of Lavidge and Steiner (1961), McGuire (1969), and Fishbein and Azjen (1975). Fishbein and Azjen (1975) and Aaker (1975) present...for college, challenge and personal development, or patriotic service). Corresponding to these beliefs are evaluations of the importance of these

  10. Robustness of Multiple Objective Decision Analysis Preference Functions

    Science.gov (United States)

    2002-06-01

    Bayesian Decision Theory and Utilitarian Ethics ,” American Economic Review Papers and Proceedings, 68: 223-228 (May 1978). Hartsough, Bruce R. “A...1983). Morrell, Darryl and Eric Driver. “ Bayesian Network Implementation of Levi’s Epistemic Utility Decision Theory ,” International Journal Of...elicitation efficiency for the decision maker. Subject Terms Decision Analysis, Utility Theory , Elicitation Error, Operations Research, Decision

  11. Objective Bayesian analysis of neutrino masses and hierarchy

    Science.gov (United States)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still impact noticeably the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data is hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.

  12. Local Analysis Approach for Short Wavelength Geopotential Variations

    Science.gov (United States)

    Bender, P. L.

    2009-12-01

    The value of global spherical harmonic analyses for determining 15 day to 30 day changes in the Earth's gravity field has been demonstrated extensively using data from the GRACE mission and previous missions. However, additional useful information appears to be obtainable from local analyses of the data. A number of such analyses have been carried out by various groups. In the energy approximation, the changes in the height of the satellite altitude geopotential can be determined from the post-fit changes in the satellite separation during individual one-revolution arcs of data from a GRACE-type pair of satellites in a given orbit. For a particular region, it is assumed that short wavelength spatial variations for the arcs crossing that region during a time T of interest would be used to determine corrections to the spherical harmonic results. The main issue in considering higher measurement accuracy in future missions is how much improvement in spatial resolution can be achieved. For this, the shortest wavelengths that can be determined are the most important. And, while the longer wavelength variations are affected by mass distribution changes over much of the globe, the shorter wavelength ones hopefully will be determined mainly by more local changes in the mass distribution. Future missions are expected to have much higher accuracy for measuring changes in the satellite separation than GRACE. However, how large an improvement in the derived results in hydrology will be achieved is still very much a matter of study, particularly because of the effects of uncertainty in the time variations in the atmospheric and oceanic mass distributions. To be specific, it will be assumed that improving the spatial resolution in continental regions away from the coastlines is the objective, and that the satellite altitude is in the range of roughly 290 to 360 km made possible for long missions by drag-free operation. The advantages of putting together the short wavelength

  13. Hadronic Triggers and trigger-object level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program, and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors to more deeply probing for new physics, such as storage and computing requirements f...

  14. Hadronic triggers and trigger object-level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program at the Large Hadron Collider (LHC), and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous event rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors, such as storage and computing requirements...

  15. Novel variational approach for analysis of photonic crystal slabs

    International Nuclear Information System (INIS)

    Aram, Mohammad Hasan; Khorasani, Sina

    2015-01-01

    We propose a new method, based on variational principle, for the analysis of photonic crystal (PC) slabs. Most of the methods used today treat PC slabs as three-dimensional (3D) crystal, and this makes these methods very time and/or memory consuming. In our proposed method, we use the Bloch theorem to expand the field on infinite plane waves, whose amplitudes depend on the component perpendicular to the slab surface. By approximating these amplitudes with appropriate functions, we can find modes of PC slabs almost as fast as we can find modes of two-dimensional crystals. In addition to this advantage, we can also calculate radiation modes with this method, which is not feasible with the 3D plane wave expansion method. (paper)

  16. Batch variation between branchial cell cultures: An analysis of variance

    DEFF Research Database (Denmark)

    Hansen, Heinz Johs. Max; Grosell, M.; Kristensen, L.

    2003-01-01

    We present in detail how a statistical analysis of variance (ANOVA) is used to sort out the effect of an unexpected batch-to-batch variation between cell cultures. Two separate cultures of rainbow trout branchial cells were grown on permeable filtersupports ("inserts"). They were supposed...... and introducing the observed difference between batches as one of the factors in an expanded three-dimensional ANOVA, we were able to overcome an otherwisecrucial lack of sufficiently reproducible duplicate values. We could thereby show that the effect of changing the apical medium was much more marked when...... the radioactive lipid precursors were added on the apical, rather than on the basolateral, side. Theinsert cell cultures were obviously polarized. We argue that it is not reasonable to reject troublesome experimental results, when we do not know a priori that something went wrong. The ANOVA is a very useful...

  17. Sequence length variation, indel costs, and congruence in sensitivity analysis

    DEFF Research Database (Denmark)

    Aagesen, Lone; Petersen, Gitte; Seberg, Ole

    2005-01-01

    The behavior of two topological and four character-based congruence measures was explored using different indel treatments in three empirical data sets, each with different alignment difficulties. The analyses were done using direct optimization within a sensitivity analysis framework in which...... the cost of indels was varied. Indels were treated either as a fifth character state, or strings of contiguous gaps were considered single events by using linear affine gap cost. Congruence consistently improved when indels were treated as single events, but no congruence measure appeared as the obviously...... preferable one. However, when combining enough data, all congruence measures clearly tended to select the same alignment cost set as the optimal one. Disagreement among congruence measures was mostly caused by a dominant fragment or a data partition that included all or most of the length variation...

  18. Interpretation of engine cycle-to-cycle variation by chaotic time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Daw, C.S.; Kahl, W.K.

    1990-01-01

    In this paper we summarize preliminary results from applying a new mathematical technique -- chaotic time series analysis (CTSA) -- to cylinder pressure data from a spark-ignition (SI) four-stroke engine fueled with both methanol and iso-octane. Our objective is to look for the presence of deterministic chaos'' dynamics in peak pressure variations and to investigate the potential usefulness of CTSA as a diagnostic tool. Our results suggest that sequential peak cylinder pressures exhibit some characteristic features of deterministic chaos and that CTSA can extract previously unrecognized information from such data. 18 refs., 11 figs., 2 tabs.

  19. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.

  20. Objective analysis of image quality of video image capture systems

    Science.gov (United States)

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images have been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast' image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images have been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moire pattern. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using them. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  1. SOCIAL EXCLUSION AS AN OBJECT OF ECONOMIC ANALYSIS

    Directory of Open Access Journals (Sweden)

    Z. Halushka

    2014-06-01

    Full Text Available In the article essence and forms of display of social exception of separate citizens and certain layers of population are certain as the socioeconomic phenomenon. Theoretical principles and methodology of estimation of the phenomenon of social exception are analyzed. Certain characteristic lines of social exception: subzero even consumptions and profit of individuals or groups; a limit access is to the public mechanisms of increase of welfare; a mainly passive type of cooperating is with society. Attention is accented on a defect for the individuals of row of rights, limit nature of access to the institutes that distribute resources, to the labor-market. Poverty is certain the main category of social exception. A concept "circles of poverty" and mechanisms of its existence are reasonable. Other displays of social exception-direct violation of base human rights are examined on quality education, on medical services and kind health, on the acceptable standard of living, on access to cultural acquisition, on defense of the interests and on the whole on participating in economic, social, in a civilized manner, political life of country. Cited data about part of torn away housekeeping of Ukraine on separate signs. The analysis of distribution of housekeeping after the amount of the accumulated signs of the social tearing away gave an opportunity to set a limit after that the social tearing away begins brightly to show up, at the level of 5 signs. It is certain the limit of the sharp tearing away. The second degree of tearing away – critical – answers a presence 7thsigns. At this level in Ukraine there are 37,7. That's far more than those, who are considered poor on a relative national criterion (24,0. It is set that conception of social exception shows the "horizontal cut" of the system of social relations and place of individual, layer, group and others like that in this system, certain on certain signs. The necessity of the use of

  2. Diachronic and Synchronic Analysis - the Case of the Indirect Object in Spanish

    DEFF Research Database (Denmark)

    Dam, Lotte; Dam-Jensen, Helle

    2007-01-01

    The article deals with a monograph on the indirect object in Spanish. The book offers a many-faceted analysis of the indrect object, as it, on the one hand, gives a detailed diachronic analysis of what is known as clitic-doubled constructions and, on the other, a synchronic analysis of both...

  3. Does a variation in self-reported physical activity reflect variation in objectively measured physical activity, resting heart rate, and physical fitness? Results from the Tromso study

    DEFF Research Database (Denmark)

    Emaus, Aina; Degerstrøm, Jorid; Wilsgaard, Tom

    2010-01-01

    AIMS: To study the association between self-reported physical activity (PA) and objectively measured PA, resting heart rate, and physical fitness. METHODS: During 2007-08, 5017 men and 5607 women aged 30-69 years attended the sixth survey of the Tromsø study. Self-reported PA during leisure......-time and work were assessed and resting heart rate was measured. In a sub-study, the activity study, PA (Actigraph LLC) and physical fitness (VO₂(max)) were objectively measured among 313 healthy men and women aged 40-44 years. RESULTS: Self-reported leisure PA was significantly correlated with VO₂(max) (ml...... women than men met the international recommendations of 10,000 step counts/day (27% vs. 22%) and the recommendation of at least 30 minutes/day of moderate-to-vigorous intensities (30% vs. 22 %). CONCLUSIONS: The Tromsø physical activity questionnaire has acceptable validity and provides valid estimates...

  4. The Influence of Weather Variation, Urban Design and Built Environment on Objectively Measured Sedentary Behaviour in Children.

    Science.gov (United States)

    Katapally, Tarun Reddy; Rainham, Daniel; Muhajarine, Nazeem

    2016-01-01

    With emerging evidence indicating that independent of physical activity, sedentary behaviour (SB) can be detrimental to health, researchers are increasingly aiming to understand the influence of multiple contexts such as urban design and built environment on SB. However, weather variation, a factor that continuously interacts with all other environmental variables, has been consistently underexplored. This study investigated the influence of diverse environmental exposures (including weather variation, urban design and built environment) on SB in children. This cross-sectional observational study is part of an active living research initiative set in the Canadian prairie city of Saskatoon. Saskatoon's neighbourhoods were classified based on urban street design into grid-pattern, fractured grid-pattern and curvilinear types of neighbourhoods. Diverse environmental exposures were measured including, neighbourhood built environment, and neighbourhood and household socioeconomic environment. Actical accelerometers were deployed between April and June 2010 (spring-summer) to derive SB of 331 10-14 year old children in 25 one week cycles. Each cycle of accelerometry was conducted on a different cohort of children within the total sample. Accelerometer data were matched with localized weather patterns derived from Environment Canada weather data. Multilevel modeling using Hierarchical Linear and Non-linear Modeling software was conducted by factoring in weather variation to depict the influence of diverse environmental exposures on SB. Both weather variation and urban design played a significant role in SB. After factoring in weather variation, it was observed that children living in grid-pattern neighbourhoods closer to the city centre (with higher diversity of destinations) were less likely to be sedentary. This study demonstrates a methodology that could be replicated to integrate geography-specific weather patterns with existing cross-sectional accelerometry data to

  5. Intermediary object for participative design processes based on the ergonomic work analysis

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Duarte, F.; Broberg, Ole

    2012-01-01

    The objective of this paper is to present and discuss the use of an intermediary object, built from the ergonomic work analysis, in a participative design process. The object was a zoning pattern, developed as a visual representation ‘mapping’ of the interrelations among the functional units of t...

  6. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  7. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  8. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    Science.gov (United States)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  9. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    Science.gov (United States)

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. The past approach was either formal or functional. Both of them did not pay much attention to cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  10. Multidimensional analysis of Drosophila wing variation in Evolution ...

    Indian Academy of Sciences (India)

    2008-12-23

    Dec 23, 2008 ... the different components of phenotypic variation of a complex trait: the wing. ... of Drosophila wing variation in. Evolution Canyon. J. Genet. 87, 407–419]. Introduction ..... identify the effect of slope on wing shape (figure 2,c). All.

  11. Variational formulation based analysis on growth of yield front in ...

    African Journals Online (AJOL)

    The present study investigates the growth of elastic-plastic front in rotating solid disks of non-uniform thickness having exponential and parabolic geometry variation. The problem is solved through an extension of a variational method in elastoplastic regime. The formulation is based on von-Mises yield criterion and linear ...

  12. Variational Bayesian Learning for Wavelet Independent Component Analysis

    Science.gov (United States)

    Roussos, E.; Roberts, S.; Daubechies, I.

    2005-11-01

    In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuro-scientific goal of extracting relevant "maps" from the data. This can be stated as a `blind' source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.

  13. A cluster analysis of patterns of objectively measured physical activity in Hong Kong.

    Science.gov (United States)

    Lee, Paul H; Yu, Ying-Ying; McDowell, Ian; Leung, Gabriel M; Lam, T H

    2013-08-01

    The health benefits of exercise are clear. In targeting interventions it would be valuable to know whether characteristic patterns of physical activity (PA) are associated with particular population subgroups. The present study used cluster analysis to identify characteristic hourly PA patterns measured by accelerometer. Cross-sectional design. Objectively measured PA in Hong Kong adults. Four-day accelerometer data were collected during 2009 to 2011 for 1714 participants in Hong Kong (mean age 44?2 years, 45?9% male). Two clusters were identified, one more active than the other. The ‘active cluster’ (n 480) was characterized by a routine PA pattern on weekdays and a more active and varied pattern on weekends; the other, the ‘less active cluster’ (n 1234), by a consistently low PA pattern on both weekdays and weekends with little variation from day to day. Demographic, lifestyle, PA level and health characteristics of the two clusters were compared. They differed in age, sex, smoking, income and level of PA required at work. The odds of having any chronic health conditions was lower for the active group (adjusted OR50?62, 95% CI 0?46, 0?84) but the two groups did not differ in terms of specific chronic health conditions or obesity. Implications are drawn for targeting exercise promotion programmes at the population level.

  14. Calculating potential error in sodium MRI with respect to the analysis of small objects.

    Science.gov (United States)

    Stobbe, Robert W; Beaulieu, Christian

    2018-06-01

    To facilitate correct interpretation of sodium MRI measurements, calculation of error with respect to rapid signal decay is introduced and combined with that of spatially correlated noise to assess volume-of-interest (VOI) 23 Na signal measurement inaccuracies, particularly for small objects. Noise and signal decay-related error calculations were verified using twisted projection imaging and a specially designed phantom with different sized spheres of constant elevated sodium concentration. As a demonstration, lesion signal measurement variation (5 multiple sclerosis participants) was compared with that predicted from calculation. Both theory and phantom experiment showed that VOI signal measurement in a large 10-mL, 314-voxel sphere was 20% less than expected on account of point-spread-function smearing when the VOI was drawn to include the full sphere. Volume-of-interest contraction reduced this error but increased noise-related error. Errors were even greater for smaller spheres (40-60% less than expected for a 0.35-mL, 11-voxel sphere). Image-intensity VOI measurements varied and increased with multiple sclerosis lesion size in a manner similar to that predicted from theory. Correlation suggests large underestimation of 23 Na signal in small lesions. Acquisition-specific measurement error calculation aids 23 Na MRI data analysis and highlights the limitations of current low-resolution methodologies. Magn Reson Med 79:2968-2977, 2018. © 2017 International Society for Magnetic Resonance in Medicine. © 2017 International Society for Magnetic Resonance in Medicine.

  15. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassier clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs.Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel to belong to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting.Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields and an A z value of 0.949 is achieved. Free response operator characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores.Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis

  16. Design analysis of vertical wind turbine with airfoil variation

    Science.gov (United States)

    Maulana, Muhammad Ilham; Qaedy, T. Masykur Al; Nawawi, Muhammad

    2016-03-01

    With an ever increasing electrical energy crisis occurring in the Banda Aceh City, it will be important to investigate alternative methods of generating power in ways different than fossil fuels. In fact, one of the biggest sources of energy in Aceh is wind energy. It can be harnessed not only by big corporations but also by individuals using Vertical Axis Wind Turbines (VAWT). This paper presents a three-dimensional CFD analysis of the influence of airfoil design on performance of a Darrieus-type vertical-axis wind turbine (VAWT). The main objective of this paper is to develop an airfoil design for NACA 63-series vertical axis wind turbine, for average wind velocity 2,5 m/s. To utilize both lift and drag force, some of designs of airfoil are analyzed using a commercial computational fluid dynamics solver such us Fluent. Simulation is performed for this airfoil at different angles of attach rearranging from -12°, -8°, -4°, 0°, 4°, 8°, and 12°. The analysis showed that the significant enhancement in value of lift coefficient for airfoil NACA 63-series is occurred for NACA 63-412.

  17. EFFECTS OF PARAMETRIC VARIATIONS ON SEISMIC ANALYSIS METHODS FOR NON-CLASSICALLY DAMPED COUPLED SYSTEMS

    International Nuclear Information System (INIS)

    XU, J.; DEGRASSI, G.

    2000-01-01

    A comprehensive benchmark program was developed by Brookhaven National Laboratory (BNL) to perform an evaluation of state-of-the-art methods and computer programs for performing seismic analyses of coupled systems with non-classical damping. The program, which was sponsored by the US Nuclear Regulatory Commission (NRC), was designed to address various aspects of application and limitations of these state-of-the-art analysis methods to typical coupled nuclear power plant (NPP) structures with non-classical damping, and was carried out through analyses of a set of representative benchmark problems. One objective was to examine the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled systems. The examination was performed using parametric variations for three simple benchmark models. This paper presents the comparisons and evaluation of the program participants' results to the BNL exact solutions for the applicable ranges of modeling dynamic characteristic parameters

  18. RAPD analysis of colchicine induced variation of the Dendrobium ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-04-20

    Apr 20, 2009 ... species of the Dendrobium genera, and 13 orchids across generas. ... to detect variations at species level and among somaclonal variants in this study. ..... alternative for colchicines in in vitro choromosome doubling of Lilium.

  19. Empirical analysis of skin friction under variations of temperature

    International Nuclear Information System (INIS)

    Parra Alvarez, A. R. de la; Groot Viana, M. de

    2014-01-01

    In soil geotechnical characterization, strength parameters, cohesion (c) and internal friction angle (Φ) has been traditional measured without taking into account temperature, been a very important issue in energy geostructures. The present document analyzes the variation of these parameters in soil-concrete interface at different temperatures. A traditional shear strength case with a forced plane of failure was used. Several tests were carried out to determine the variation of skin friction in granular and cohesive oils with temperature. (Author)

  20. Vector optimization set-valued and variational analysis

    CERN Document Server

    Chen, Guang-ya; Yang, Xiaogi

    2005-01-01

    This book is devoted to vector or multiple criteria approaches in optimization. Topics covered include: vector optimization, vector variational inequalities, vector variational principles, vector minmax inequalities and vector equilibrium problems. In particular, problems with variable ordering relations and set-valued mappings are treated. The nonlinear scalarization method is extensively used throughout the book to deal with various vector-related problems. The results presented are original and should be interesting to researchers and graduates in applied mathematics and operations research

  1. Comparative analysis of proteome and transcriptome variation in mouse.

    Directory of Open Access Journals (Sweden)

    Anatole Ghazalpour

    2011-06-01

    Full Text Available The relationships between the levels of transcripts and the levels of the proteins they encode have not been examined comprehensively in mammals, although previous work in plants and yeast suggest a surprisingly modest correlation. We have examined this issue using a genetic approach in which natural variations were used to perturb both transcript levels and protein levels among inbred strains of mice. We quantified over 5,000 peptides and over 22,000 transcripts in livers of 97 inbred and recombinant inbred strains and focused on the 7,185 most heritable transcripts and 486 most reliable proteins. The transcript levels were quantified by microarray analysis in three replicates and the proteins were quantified by Liquid Chromatography-Mass Spectrometry using O(18-reference-based isotope labeling approach. We show that the levels of transcripts and proteins correlate significantly for only about half of the genes tested, with an average correlation of 0.27, and the correlations of transcripts and proteins varied depending on the cellular location and biological function of the gene. We examined technical and biological factors that could contribute to the modest correlation. For example, differential splicing clearly affects the analyses for certain genes; but, based on deep sequencing, this does not substantially contribute to the overall estimate of the correlation. We also employed genome-wide association analyses to map loci controlling both transcript and protein levels. Surprisingly, little overlap was observed between the protein- and transcript-mapped loci. We have typed numerous clinically relevant traits among the strains, including adiposity, lipoprotein levels, and tissue parameters. Using correlation analysis, we found that a low number of clinical trait relationships are preserved between the protein and mRNA gene products and that the majority of such relationships are specific to either the protein levels or transcript levels

  2. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    Science.gov (United States)

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  3. Assessing temporal variations in connectivity through suspended sediment hysteresis analysis

    Science.gov (United States)

    Sherriff, Sophie; Rowan, John; Fenton, Owen; Jordan, Phil; Melland, Alice; Mellander, Per-Erik; hUallacháin, Daire Ó.

    2016-04-01

    Connectivity provides a valuable concept for understanding catchment-scale sediment dynamics. In intensive agricultural catchments, land management through tillage, high livestock densities and extensive land drainage practices significantly change hydromorphological behaviour and alter sediment supply and downstream delivery. Analysis of suspended sediment-discharge hysteresis has offered insights into sediment dynamics but typically on a limited selection of events. Greater availability of continuous high-resolution discharge and turbidity data and qualitative hysteresis metrics enables assessment of sediment dynamics during more events and over time. This paper assesses the utility of this approach to explore seasonal variations in connectivity. Data were collected from three small (c. 10 km2) intensive agricultural catchments in Ireland with contrasting morphologies, soil types, land use patterns and management practices, and are broadly defined as low-permeability supporting grassland, moderate-permeability supporting arable and high-permeability supporting arable. Suspended sediment concentration (using calibrated turbidity measurements) and discharge data were collected at 10-min resolution from each catchment outlet and precipitation data were collected from a weather station within each catchment. Event databases (67-90 events per catchment) collated information on sediment export metrics, hysteresis category (e.g., clockwise, anti-clockwise, no hysteresis), numeric hysteresis index, and potential hydro-meteorological controls on sediment transport including precipitation amount, duration, intensity, stream flow and antecedent soil moisture and rainfall. Statistical analysis of potential controls on sediment export was undertaken using Pearson's correlation coefficient on separate hysteresis categories in each catchment. Sediment hysteresis fluctuations through time were subsequently assessed using the hysteresis index. Results showed the numeric

  4. X-ray fluorescence analysis of archaeological finds and art objects: Recognizing gold and gilding

    International Nuclear Information System (INIS)

    Trojek, Tomáš; Hložek, Martin

    2012-01-01

    Many cultural heritage objects were gilded in the past, and nowadays they can be found in archeological excavations or in historical buildings dating back to the Middle Ages, or from the modern period. Old gilded artifacts have been studied using X-ray fluorescence analysis and 2D microanalysis. Several techniques that enable the user to distinguish gold and gilded objects are described and then applied to investigate artifacts. These techniques differ in instrumentation, data analysis and numbers of measurements. The application of Monte Carlo calculation to a quantitative analysis of gilded objects is also introduced. - Highlights: ► Three techniques of gilding identification with XRF analysis are proposed. ► These techniques are applied to gold and gilded art and archeological objects. ► Composition of a substrate material is determined by a Monte Carlo simulation.

  5. Multi-band morpho-Spectral Component Analysis Deblending Tool (MuSCADeT): Deblending colourful objects

    Science.gov (United States)

    Joseph, R.; Courbin, F.; Starck, J.-L.

    2016-05-01

    We introduce a new algorithm for colour separation and deblending of multi-band astronomical images called MuSCADeT which is based on Morpho-spectral Component Analysis of multi-band images. The MuSCADeT algorithm takes advantage of the sparsity of astronomical objects in morphological dictionaries such as wavelets and their differences in spectral energy distribution (SED) across multi-band observations. This allows us to devise a model independent and automated approach to separate objects with different colours. We show with simulations that we are able to separate highly blended objects and that our algorithm is robust against SED variations of objects across the field of view. To confront our algorithm with real data, we use HST images of the strong lensing galaxy cluster MACS J1149+2223 and we show that MuSCADeT performs better than traditional profile-fitting techniques in deblending the foreground lensing galaxies from background lensed galaxies. Although the main driver for our work is the deblending of strong gravitational lenses, our method is fit to be used for any purpose related to deblending of objects in astronomical images. An example of such an application is the separation of the red and blue stellar populations of a spiral galaxy in the galaxy cluster Abell 2744. We provide a python package along with all simulations and routines used in this paper to contribute to reproducible research efforts. Codes can be found at http://lastro.epfl.ch/page-126973.html

  6. Principal component analysis to evaluate the spatial variation of ...

    African Journals Online (AJOL)

    Discretisation of the particle sizes is highlighted as both a challenge and an opportunity and it is recommended that it be used as a tuning parameter in gauging kaolin variations across samples and in validating new predictive modeling applications. Successful applications will depend on how clay and data scientists keep ...

  7. Multidimensional analysis of Drosophila wing variation in Evolution ...

    Indian Academy of Sciences (India)

    In this study, using Drosophila melanogaster isofemale lines derived from wild flies collected on both slopes of the canyon, we investigated the effect of developmental temperature upon the different components of phenotypic variation of a complex trait: the wing. Combining geometric and traditional morphometrics, we find ...

  8. Analysis of spin and gauge models with variational methods

    International Nuclear Information System (INIS)

    Dagotto, E.; Masperi, L.; Moreo, A.; Della Selva, A.; Fiore, R.

    1985-01-01

    Since independent-site (link) or independent-link (plaquette) variational states enhance the order or the disorder, respectively, in the treatment of spin (gauge) models, we prove that mixed states are able to improve the critical coupling while giving the qualitatively correct behavior of the relevant parameters

  9. Validity of covariance models for the analysis of geographical variation

    DEFF Research Database (Denmark)

    Guillot, Gilles; Schilling, Rene L.; Porcu, Emilio

    2014-01-01

    1. Due to the availability of large molecular data-sets, covariance models are increasingly used to describe the structure of genetic variation as an alternative to more heavily parametrised biological models. 2. We focus here on a class of parametric covariance models that received sustained att...

  10. A variational analysis for large deflection of skew plates under ...

    African Journals Online (AJOL)

    In the present paper, the static behaviour of thin isotropic skew plates under uniformly distributed load is analyzed with the geometric nonlinearity of the model properly handled. A variational method based on total potential energy has been implemented through assumed displacement field. The computational work has ...

  11. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based in research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understand of how to expand a use case.

  12. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    Science.gov (United States)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    The attitude information acquisition of unresolved space objects, such as micro-nano satellites and GEO objects under the way of ground-based optical observations, is a challenge to space surveillance. In this paper, a useful method is proposed to estimate the SO attitude state according to the simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined, then the space object photometric model was established. Furthermore, the photometric data of space objects in different states are analyzed by simulation and the regular characteristics of the photometric curves are summarized. The simulation results show that the photometric characteristics are useful for attitude inversion in a unique way. Thus, a new idea is provided for space object identification in this paper.

  13. Sensitivity analysis and parameter estimation for distributed hydrological modeling: potential of variational methods

    Directory of Open Access Journals (Sweden)

    W. Castaings

    2009-04-01

    Full Text Available Variational methods are widely used for the analysis and control of computationally intensive spatially distributed systems. In particular, the adjoint state method enables a very efficient calculation of the derivatives of an objective function (response function to be analysed or cost function to be optimised with respect to model inputs.

    In this contribution, it is shown that the potential of variational methods for distributed catchment scale hydrology should be considered. A distributed flash flood model, coupling kinematic wave overland flow and Green Ampt infiltration, is applied to a small catchment of the Thoré basin and used as a relatively simple (synthetic observations but didactic application case.

    It is shown that forward and adjoint sensitivity analysis provide a local but extensive insight on the relation between the assigned model parameters and the simulated hydrological response. Spatially distributed parameter sensitivities can be obtained for a very modest calculation effort (~6 times the computing time of a single model run and the singular value decomposition (SVD of the Jacobian matrix provides an interesting perspective for the analysis of the rainfall-runoff relation.

    For the estimation of model parameters, adjoint-based derivatives were found exceedingly efficient in driving a bound-constrained quasi-Newton algorithm. The reference parameter set is retrieved independently from the optimization initial condition when the very common dimension reduction strategy (i.e. scalar multipliers is adopted.

    Furthermore, the sensitivity analysis results suggest that most of the variability in this high-dimensional parameter space can be captured with a few orthogonal directions. A parametrization based on the SVD leading singular vectors was found very promising but should be combined with another regularization strategy in order to prevent overfitting.

  14. Genetic analysis of variation in human meiotic recombination.

    Directory of Open Access Journals (Sweden)

    Reshmi Chowdhury

    2009-09-01

    Full Text Available The number of recombination events per meiosis varies extensively among individuals. This recombination phenotype differs between female and male, and also among individuals of each gender. In this study, we used high-density SNP genotypes of over 2,300 individuals and their offspring in two datasets to characterize recombination landscape and to map the genetic variants that contribute to variation in recombination phenotypes. We found six genetic loci that are associated with recombination phenotypes. Two of these (RNF212 and an inversion on chromosome 17q21.31 were previously reported in the Icelandic population, and this is the first replication in any other population. Of the four newly identified loci (KIAA1462, PDZK1, UGCG, NUB1, results from expression studies provide support for their roles in meiosis. Each of the variants that we identified explains only a small fraction of the individual variation in recombination. Notably, we found different sequence variants associated with female and male recombination phenotypes, suggesting that they are regulated by different genes. Characterization of genetic variants that influence natural variation in meiotic recombination will lead to a better understanding of normal meiotic events as well as of non-disjunction, the primary cause of pregnancy loss.

  15. Extending Track Analysis from Animals in the Lab to Moving Objects Anywhere

    NARCIS (Netherlands)

    Dommelen, W. van; Laar, P.J.L.J. van de; Noldus, L.P.J.J.

    2013-01-01

    In this chapter we compare two application domains in which the tracking of objects and the analysis of their movements are core activities, viz. animal tracking and vessel tracking. More specifically, we investigate whether EthoVision XT, a research tool for video tracking and analysis of the

  16. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  17. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classi?er is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classi?cation accuracy as the number of dimensions is increased.

  18. SEGMENT OF FINANCIAL CORPORATIONS AS AN OBJECT OF FINANCIAL AND STATISTICAL ANALYSIS

    OpenAIRE

    Marat F. Mazitov

    2013-01-01

    The article is devoted to the study specific features of the formation and change of economic assets of financial corporations as an object of management and financial analysis. He author identifies the features and gives the classification of institutional units belonging to the sector of financial corporations from the viewpoint of assessment and financial analysis of the flows, reflecting change of their assets.

  19. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1987-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results are compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are also presented. (author) 3 refs.; 4 tabs

  20. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1986-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are presented. (author)

  1. Determining characteristics of artificial near-Earth objects using observability analysis

    Science.gov (United States)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.

  2. A functional analysis of photo-object matching skills of severely retarded adolescents.

    Science.gov (United States)

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photos and the objects. Only one student demonstrated photo-object matching. The results of the four students who failed to demonstrate photo-object matching suggested that physical properties of photos (flat, rectangular) and depth dimensions of objects may exert more control over matching than the similarities of the objects and images within the photos. An analysis of figure-ground variables was conducted to provide an empirical basis for program development in the use of pictures. In one series of tests, rectangular shape and background were removed by cutting out the figures in the photos. The edge shape of the photo and the edge shape of the image were then identical. The results suggest that photo-object matching may be facilitated by using cut-out figures rather than the complete rectangular photo.

  3. Geographical variation in dementia: systematic review with meta-analysis

    Science.gov (United States)

    Russ, Tom C; Batty, G David; Hearnshaw, Gena F; Fenton, Candida; Starr, John M

    2012-01-01

    Background Geographical variation in dementia prevalence and incidence may indicate important socio-environmental contributions to dementia aetiology. However, previous comparisons have been hampered by combining studies with different methodologies. This review systematically collates and synthesizes studies examining geographical variation in the prevalence and incidence of dementia based on comparisons of studies using identical methodologies. Methods Papers were identified by a comprehensive electronic search of relevant databases, scrutinising the reference sections of identified publications, contacting experts in the field and re-examining papers already known to us. Identified articles were independently reviewed against inclusion/exclusion criteria and considered according to geographical scale. Rural/urban comparisons were meta-analysed. Results Twelve thousand five hundred and eighty records were reviewed and 51 articles were included. Dementia prevalence and incidence varies at a number of scales from the national down to small areas, including some evidence of an effect of rural living [prevalence odds ratio (OR) = 1.11, 90% confidence interval (CI) 0.79–1.57; incidence OR = 1.20, 90% CI 0.84–1.71]. However, this association of rurality was stronger for Alzheimer disease, particularly when early life rural living was captured (prevalence OR = 2.22, 90% CI 1.19–4.16; incidence OR = 1.64, 90% CI 1.08–2.50). Conclusions There is evidence of geographical variation in rates of dementia in affluent countries at a variety of geographical scales. Rural living is associated with an increased risk of Alzheimer disease, and there is a suggestion that early life rural living further increases this risk. However, the fact that few studies have been conducted in resource-poor countries limits conclusions. PMID:22798662

  4. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    Science.gov (United States)

    2017-02-01

    SUPPLEMENTARY NOTES 14. ABSTRACT Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed...prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks. The framework integrates multiple image processing...Requirements 2 3. Installing the Software for IOAIDE 2 3.1 Load ARL Software 2 3.2 Load ARL Applications 4 3.3 Load the DSPro Software 7 3.4 Update Java

  5. The Making of Paranormal Belief: History, Discourse Analysis and the Object of Belief

    OpenAIRE

    White, Lewis

    2013-01-01

    The present study comprises a discursive analysis of a cognitive phenomenon, paranormal beliefs. A discursive psychological approach to belief highlights that an important component of the cognitivist work has been how the object of paranormal belief has been defined in formal study. Using discourse analysis, as developed as a method in the history of psychology, this problem is explored through analysis of published scales. The findings highlight three rhetorical themes that are deployed in ...

  6. Systems genetics analysis of pharmacogenomics variation during antidepressant treatment

    DEFF Research Database (Denmark)

    Madsen, Majbritt Busk; Kogelman, L J A; Kadarmideen, H N

    2016-01-01

    Selective serotonin reuptake inhibitors (SSRIs) are the most widely used antidepressants, but the efficacy of the treatment varies significantly among individuals. It is believed that complex genetic mechanisms play a part in this variation. We have used a network based approach to unravel the in...... genes involved in calcium homeostasis. In conclusion, we suggest a difference in genetic interaction networks between initial and subsequent SSRI response.The Pharmacogenomics Journal advance online publication, 18 October 2016; doi:10.1038/tpj.2016.68....

  7. Variational methods for crystalline microstructure analysis and computation

    CERN Document Server

    Dolzmann, Georg

    2003-01-01

    Phase transformations in solids typically lead to surprising mechanical behaviour with far reaching technological applications. The mathematical modeling of these transformations in the late 80s initiated a new field of research in applied mathematics, often referred to as mathematical materials science, with deep connections to the calculus of variations and the theory of partial differential equations. This volume gives a brief introduction to the essential physical background, in particular for shape memory alloys and a special class of polymers (nematic elastomers). Then the underlying mathematical concepts are presented with a strong emphasis on the importance of quasiconvex hulls of sets for experiments, analytical approaches, and numerical simulations.

  8. Using Epistemic Network Analysis to understand core topics as planned learning objectives

    DEFF Research Database (Denmark)

    Allsopp, Benjamin Brink; Dreyøe, Jonas; Misfeldt, Morten

    Epistemic Network Analysis is a tool developed by the epistemic games group at the University of Wisconsin Madison for tracking the relations between concepts in students discourse (Shaffer 2017). In our current work we are applying this tool to learning objectives in teachers digital preparation....... The danish mathematics curriculum is organised in six competencies and three topics. In the recently implemented learning platforms teacher choose which of the mathematical competencies that serves as objective for a specific lesson or teaching sequence. Hence learning objectives for lessons and teaching...... sequences are defining a network of competencies, where two competencies are closely related of they often are part of the same learning objective or teaching sequence. We are currently using Epistemic Network Analysis to study these networks. In the poster we will include examples of different networks...

  9. An Analysis on Usage Preferences of Learning Objects and Learning Object Repositories among Pre-Service Teachers

    Science.gov (United States)

    Yeni, Sabiha; Ozdener, Nesrin

    2014-01-01

    The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…

  10. Variation in plasma calcium analysis in primary care in Sweden - a multilevel analysis

    Directory of Open Access Journals (Sweden)

    Eggertsen Robert

    2010-05-01

    Full Text Available Abstract Background Primary hyperparathyroidism (pHPT is a common disease that often remains undetected and causes severe disturbance especially in postmenopausal women. Therefore, national recommendations promoting early pHPT detection by plasma calcium (P-Ca have been issued in Sweden. In this study we aimed to investigate variation of P-Ca analysis between physicians and health care centres (HCCs in primary care in county of Skaraborg, Sweden. Methods In this cross sectional study of patients' records during 2005 we analysed records from 154 629 patients attending 457 physicians at 24 HCCs. We used multilevel logistic regression analysis (MLRA and adjusted for patient, physician and HCC characteristics. Differences were expressed as median odds ratio (MOR. Results There was a substantial variation in number of P-Ca analyses between both HCCs (MORHCC 1.65 [1.44-2.07] and physicians (MORphysician 1.95 [1.85-2.08]. The odds for a P-Ca analysis were lower for male patients (OR 0.80 [0.77-0.83] and increased with the number of diagnoses (OR 25.8 [23.5-28.5]. Sex of the physician had no influence on P-Ca test ordering (OR 0.93 [0.78-1.09]. Physicians under education ordered most P-Ca analyses (OR 1.69 [1.35-2.24] and locum least (OR 0.73 [0.57-0.94]. More of the variance was attributed to the physician level than the HCC level. Different mix of patients did not explain this variance between physicians. Theoretically, if a patient were able to change both GP and HCC, the odds of a P-Ca analysis would in median increase by 2.45. Including characteristics of the patients, physicians and HCCs in the MLRA model did not explain the variance. Conclusions The physician level was more important than the HCC level for the variation in P-Ca analysis, but further exploration of unidentified contextual factors is crucial for future monitoring of practice variation.

  11. GPR Detection of Buried Symmetrically Shaped Mine-like Objects using Selective Independent Component Analysis

    DEFF Research Database (Denmark)

    Karlsen, Brian; Sørensen, Helge Bjarup Dissing; Larsen, Jan

    2003-01-01

    from small-scale anti-personal (AP) mines to large-scale anti-tank (AT) mines were designed. Large-scale SF-GPR measurements on this series of mine-like objects buried in soil were performed. The SF-GPR data was acquired using a wideband monostatic bow-tie antenna operating in the frequency range 750......This paper addresses the detection of mine-like objects in stepped-frequency ground penetrating radar (SF-GPR) data as a function of object size, object content, and burial depth. The detection approach is based on a Selective Independent Component Analysis (SICA). SICA provides an automatic...... ranking of components, which enables the suppression of clutter, hence extraction of components carrying mine information. The goal of the investigation is to evaluate various time and frequency domain ICA approaches based on SICA. Performance comparison is based on a series of mine-like objects ranging...

  12. 3D object-oriented image analysis in 3D geophysical modelling

    DEFF Research Database (Denmark)

    Fadel, I.; van der Meijde, M.; Kerle, N.

    2015-01-01

    Non-uniqueness of satellite gravity interpretation has traditionally been reduced by using a priori information from seismic tomography models. This reduction in the non-uniqueness has been based on velocity-density conversion formulas or user interpretation of the 3D subsurface structures (objects......) based on the seismic tomography models and then forward modelling these objects. However, this form of object-based approach has been done without a standardized methodology on how to extract the subsurface structures from the 3D models. In this research, a 3D object-oriented image analysis (3D OOA......) approach was implemented to extract the 3D subsurface structures from geophysical data. The approach was applied on a 3D shear wave seismic tomography model of the central part of the East African Rift System. Subsequently, the extracted 3D objects from the tomography model were reconstructed in the 3D...

  13. ANALYSIS THE DIURNAL VARIATIONS ON SELECTED PHYSICAL AND PHYSIOLOGICAL PARAMETERS

    Directory of Open Access Journals (Sweden)

    A. MAHABOOBJAN

    2010-12-01

    Full Text Available The purpose of the study was to analyze the diurnal variations on selected physical and physiological parameters such as speed, explosive power, resting heart rate and breath holding time among college students. To achieve the purpose of this study, a total of twenty players (n=20 from Government Arts College, Salem were selected as subjects To study the diurnal variation of the players on selected physiological and performance variables, the data were collected 4 times a day with every four hours in between the times it from 6.00 to 18.00 hours were selected as another categorical variable. One way repeated measures (ANOVA was used to analyze the data. If the obtained F-ratio was significant, Seheffe’s post-hoc test was used to find out the significant difference if anyamong the paired means. The level of significance was fixed at.05 level. It has concluded that both physical and physiological parameters were significantly deferred with reference to change of temperature in a day

  14. Statistical analysis of geomagnetic field variations during solar eclipses

    Science.gov (United States)

    Kim, Jung-Hee; Chang, Heon-Young

    2018-04-01

    We investigate the geomagnetic field variations recorded by INTERMAGNET geomagnetic observatories, which are observed while the Moon's umbra or penumbra passed over them during a solar eclipse event. Though it is generally considered that the geomagnetic field can be modulated during solar eclipses, the effect of the solar eclipse on the observed geomagnetic field has proved subtle to be detected. Instead of exploring the geomagnetic field as a case study, we analyze 207 geomagnetic manifestations acquired by 100 geomagnetic observatories during 39 solar eclipses occurring from 1991 to 2016. As a result of examining a pattern of the geomagnetic field variation on average, we confirm that the effect can be seen over an interval of 180 min centered at the time of maximum eclipse on a site of a geomagnetic observatory. That is, demonstrate an increase in the Y component of the geomagnetic field and decreases in the X component and the total strength of the geomagnetic field. We also find that the effect can be overwhelmed, depending more sensitively on the level of daily geomagnetic events than on the level of solar activity and/or the phase of solar cycle. We have demonstrated it by dividing the whole data set into subsets based on parameters of the geomagnetic field, solar activity, and solar eclipses. It is suggested, therefore, that an evidence of the solar eclipse effect can be revealed even at the solar maximum, as long as the day of the solar eclipse is magnetically quiet.

  15. Feasibility analysis of CNP 1000 computerized I and C system design objectives

    International Nuclear Information System (INIS)

    Zhang Mingguang; Xu Jijun; Zhang Qinshen

    2000-01-01

    The author states the design objectives of the computerized I and C (CIC) system and advanced main control room (AMCR), which could and should be achieved in CNP 1000, based on the national 1E computer production technology including software and hardware, and current instrumentation and control design technique of nuclear power plant. The feasibility analysis on the design objectives and the reasons or necessity to do the design research projects have been described. The objectives of design research on CIC and AMCR as well as the self-design proficiency after the design research have been given

  16. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

    Full Text Available Based on improved multi-objective particle swarm optimization (MOPSO algorithm with principal component analysis (PCA methodology, an efficient high-dimension multi-objective optimization method is proposed, which, as the purpose of this paper, aims to improve the convergence of Pareto front in multi-objective optimization design. The mathematical efficiency, the physical reasonableness and the reliability in dealing with redundant objectives of PCA are verified by typical DTLZ5 test function and multi-objective correlation analysis of supercritical airfoil, and the proposed method is integrated into aircraft multi-disciplinary design (AMDEsign platform, which contains aerodynamics, stealth and structure weight analysis and optimization module. Then the proposed method is used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed to optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and multi-point design requirements of the passenger aircraft are reached. The visualization level of non-dominant Pareto set is improved by effectively reducing the dimension without losing the primary feature of the problem.

  17. Object position and image magnification in dental panoramic radiography: a theoretical analysis.

    Science.gov (United States)

    Devlin, H; Yuan, J

    2013-01-01

    The purpose of our study was to investigate how image magnification and distortion in dental panoramic radiography are influenced by object size and position for a small round object such as a ball bearing used for calibration. Two ball bearings (2.5 mm and 6 mm in diameter) were placed at approximately the same position between the teeth of a plastic skull and radiographed 21 times. The skull was replaced each time. Their images were measured by software using edge detection and ellipse-fitting algorithms. Using a standard definition of magnification, equations were derived to enable an object's magnification to be determined from its position and vice versa knowing the diameter and machine parameters. The average magnification of the 2.5 mm ball bearing was 1.292 (0.0445) horizontally and 1.257 (0.0067) vertically with a mean ratio of 1.028 (0.0322); standard deviations are in parentheses. The figures for the 6 mm ball bearing were 1.286 (0.0068), 1.255 (0.0018) and 1.025 (0.0061), respectively. Derived positions of each ball bearing from magnification were more consistent horizontally than vertically. There was less variation in either direction for the 6 mm ball bearing than the 2.5 mm one. Automatic measurement of image size resulted in less variation in vertical magnification values than horizontal. There are only certain positions in the focal trough that achieve zero distortion. Object location can be determined from its diameter, measured magnification and machine parameters. The 6 mm diameter ball bearing is preferable to the 2.5 mm one for more reliable magnification measurement and position determination.

  18. Integrating population variation and protein structural analysis to improve clinical interpretation of missense variation: application to the WD40 domain.

    Science.gov (United States)

    Laskowski, Roman A; Tyagi, Nidhi; Johnson, Diana; Joss, Shelagh; Kinning, Esther; McWilliam, Catherine; Splitt, Miranda; Thornton, Janet M; Firth, Helen V; Wright, Caroline F

    2016-03-01

    We present a generic, multidisciplinary approach for improving our understanding of novel missense variants in recently discovered disease genes exhibiting genetic heterogeneity, by combining clinical and population genetics with protein structural analysis. Using six new de novo missense diagnoses in TBL1XR1 from the Deciphering Developmental Disorders study, together with population variation data, we show that the β-propeller structure of the ubiquitous WD40 domain provides a convincing way to discriminate between pathogenic and benign variation. Children with likely pathogenic mutations in this gene have severely delayed language development, often accompanied by intellectual disability, autism, dysmorphology and gastrointestinal problems. Amino acids affected by likely pathogenic missense mutations are either crucial for the stability of the fold, forming part of a highly conserved symmetrically repeating hydrogen-bonded tetrad, or located at the top face of the β-propeller, where 'hotspot' residues affect the binding of β-catenin to the TBLR1 protein. In contrast, those altered by population variation are significantly less likely to be spatially clustered towards the top face or to be at buried or highly conserved residues. This result is useful not only for interpreting benign and pathogenic missense variants in this gene, but also in other WD40 domains, many of which are associated with disease. © The Author 2016. Published by Oxford University Press.

  19. Variational Bayesian Causal Connectivity Analysis for fMRI

    Directory of Open Access Journals (Sweden)

    Martin eLuessi

    2014-05-01

    Full Text Available The ability to accurately estimate effective connectivity among brain regions from neuroimaging data could help answering many open questions in neuroscience. We propose a method which uses causality to obtain a measure of effective connectivity from fMRI data. The method uses a vector autoregressive model for the latent variables describing neuronal activity in combination with a linear observation model based on a convolution with a hemodynamic response function. Due to the employed modeling, it is possible to efficiently estimate all latent variables of the model using a variational Bayesian inference algorithm. The computational efficiency of the method enables us to apply it to large scale problems with high sampling rates and several hundred regions of interest. We use a comprehensive empirical evaluation with synthetic and real fMRI data to evaluate the performance of our method under various conditions.

  20. High resolution analysis of temporal variation of airborne radionuclides

    International Nuclear Information System (INIS)

    Komura, K.; Yamaguchi, Y.; Manikandan, M.; Murata, Y.; Iida, T.; Moriizumi, J.

    2004-01-01

    One of the application of ultra low-background gamma spectrometry, we tried to measure temporal variation of airborne radionuclides at intervals of 1 to few hours in extreme case. Airborne radionuclides were collected on a filter paper made of quartz fiber at the Low Level Radioactivity Laboratory (LLRL), Kanazawa Univ. in Tatsunokuchi (since Nov. 2002), Hegra Island located 50 km from Noto peninsula (since Apr. 2003) to investigate influence of Asian continent and Shishiku plateau at 640 m above sea to know vertical difference (since Sep., 2003). Pb-210, Pb-212 and Be-7 were measured nondestructively by ultra low background Ge detectors in Ogoya Underground Laboratory (270 meter water Concentration of Rn-222 was monitored 1 hour intervals and wind direction and speed were recorded 10 min or 2 min intervals (Hegra Is.) as support data in data analyses. In the regular monitoring, sampling was made at 1-2 day (LLRL and Shishiku) or 1 week intervals (Hegra) to know daily and seasonal variations and similarity or difference between sampling locations. When drastic meteorological change, such as passage of front or typhoon, occurrence of inversion layer and snow fall etc., short sampling at 1-2 hours of intervals was conducted to find the corrlation with meteorological factors at single point or 2 points simultaneously. As a results, it was found that concentrations of Pb-210, Po-210, Pb-212 and Be-7 were found to vary very quickly in a short time (see Figure below) due mainly to horizontal or vertical mixing of air-masses. (authors)

  1. Analysis of DNA methylation variation in sibling tobacco ( Nicotiana ...

    African Journals Online (AJOL)

    Amplified fragment length polymorphism (AFLP) and methylation-sensitive amplification polymorphism (MSAP) analysis were used to investigate the genome of two sibling tobacco cultivars, Yunyan85 and Yunyan87, their parent K326 and the other tobacco cultivar NC89. AFLP analysis indicated that, the genome primary ...

  2. Topological situational analysis and synthesis of strategies of object management in the conditions of conflict, uncertainty of behaviour and varible amount of the observed objects

    Directory of Open Access Journals (Sweden)

    Віктор Володимирович Семко

    2016-09-01

    Full Text Available The conflict of cooperation of objects is considered in observation space as integral phenomenon with the certain variety of types of connections between its elements, objects, systems and environment that erected in a single theoretical conception and comprehensively and deeply determine the real features of object of researches. Methodology of system-structural analysis of conflict is used as research of the phenomenon in the whole and system-functional analysis as research with the aim of determination of all basic intercommunications with an environment

  3. Visual Field Preferences of Object Analysis for Grasping with One Hand

    Directory of Open Access Journals (Sweden)

    Ada eLe

    2014-10-01

    Full Text Available When we grasp an object using one hand, the opposite hemisphere predominantly guides the motor control of grasp movements (Davare et al. 2007; Rice et al. 2007. However, it is unclear whether visual object analysis for grasp control relies more on inputs (a from the contralateral than the ipsilateral visual field, (b from one dominant visual field regardless of the grasping hand, or (c from both visual fields equally. For bimanual grasping of a single object we have recently demonstrated a visual field preference for the left visual field (Le and Niemeier 2013a, 2013b, consistent with a general right-hemisphere dominance for sensorimotor control of bimanual grasps (Le et al., 2013. But visual field differences have never been tested for unimanual grasping. Therefore, here we asked right-handed participants to fixate to the left or right of an object and then grasp the object either with their right or left hand using a precision grip. We found that participants grasping with their right hand performed better with objects in the right visual field: maximum grip apertures (MGAs were more closely matched to the object width and were smaller than for objects in the left visual field. In contrast, when people grasped with their left hand, preferences switched to the left visual field. What is more, MGA scaling showed greater visual field differences compared to right-hand grasping. Our data suggest that, visual object analysis for unimanual grasping shows a preference for visual information from the ipsilateral visual field, and that the left hemisphere is better equipped to control grasps in both visual fields.

  4. Variational formulation based analysis on growth of yield front in ...

    African Journals Online (AJOL)

    user

    The analysis of rotating disk behavior has been of great interest to many ... strain hardening using Tresca's yield condition and its associated flow rule ...... Determination of Stresses in Gas-Turbine Disks Subjected to Plastic Flow and Creep.

  5. Analysis of conformational variations of the cricoid cartilages in Thoroughbred horses using computed tomography.

    Science.gov (United States)

    Dahlberg, J A; Valdes-Martinez, A; Boston, R C; Parente, E J

    2011-03-01

    Loss of arytenoid abduction is a common post operative complication of laryngoplasty without a definitive cause. It has been a clinical impression during laryngoplasty surgery that there is great conformational variability along the caudal edge of the Thoroughbred cricoid cartilage that could impact post operative retention of suture position. A change in suture position would probably lead to some loss of abduction. Defining any structural variability of the cricoid would be an initial step in determining whether this variability could impact on the retention of suture position. Anatomical variations in the larynx of Thoroughbred horses may be detected and measured using objective analysis and computed tomography. Larynges were harvested from 15 mature Thoroughbred horses. Helical CT scans were performed on each specimen. Three independent observers performed a series of measurements on 2D and 3D reconstruction images using digital software. Measurements included the lateral cricoid angle, the caudal cricoid prominences, the distance to the cricoid slope, the angle of the cricoarytenoid joints (CAJ), the cricoid thickness and the suture angle. Mean, standard deviation, coefficient of variation and linear regression analysis were performed among all observers and all measurements. Notable conformational differences were evident on the 3D reconstructions. The highest degree of variability was found in 3 measurements: the distance to the lateral cricoid slope, the lateral cricoid angle and the cricoid thickness. A larger left CAJ angle directly and significantly correlated with a larger suture angle. There are notable conformational differences among cricoid specimens in the Thoroughbred larynx. The morphometric differences identified may impact on optimal prosthesis placement and long-term retention. Since a larger lateral cricoid angle may facilitate abduction loss secondary to a displaced and loosened suture, alternative techniques for suture placement may be of

  6. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    result in a WCET analysis‐friendly design. Aiming for a time‐predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field......Hard real‐time systems need a time‐predictable computing platform to enable static worst‐case execution time (WCET) analysis. All performance‐enhancing features need to be WCET analyzable. However, standard data caches containing heap‐allocated data are very hard to analyze statically....... In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily...

  7. Analysis of Daily Setup Variation With Tomotherapy Megavoltage Computed Tomography

    International Nuclear Information System (INIS)

    Zhou Jining; Uhl, Barry; Dewit, Kelly; Young, Mark; Taylor, Brian; Fei Dingyu; Lo, Y-C

    2010-01-01

    The purpose of this study was to evaluate different setup uncertainties for various anatomic sites with TomoTherapy (registered) pretreatment megavoltage computed tomography (MVCT) and to provide optimal margin guidelines for these anatomic sites. Ninety-two patients with tumors in head and neck (HN), brain, lung, abdominal, or prostate regions were included in the study. MVCT was used to verify patient position and tumor target localization before each treatment. With the anatomy registration tool, MVCT provided real-time tumor shift coordinates relative to the positions where the simulation CT was performed. Thermoplastic facemasks were used for HN and brain treatments. Vac-Lok TM cushions were used to immobilize the lower extremities up to the thighs for prostate patients. No respiration suppression was administered for lung and abdomen patients. The interfractional setup variations were recorded and corrected before treatment. The mean interfractional setup error was the smallest for HN among the 5 sites analyzed. The average 3D displacement in lateral, longitudinal, and vertical directions for the 5 sites ranged from 2.2-7.7 mm for HN and lung, respectively. The largest movement in the lung was 2.0 cm in the longitudinal direction, with a mean error of 6.0 mm and standard deviation of 4.8 mm. The mean interfractional rotation variation was small and ranged from 0.2-0.5 deg., with the standard deviation ranging from 0.7-0.9 deg. Internal organ displacement was also investigated with a posttreatment MVCT scan for HN, lung, abdomen, and prostate patients. The maximum 3D intrafractional displacement across all sites was less than 4.5 mm. The interfractional systematic errors and random errors were analyzed and the suggested margins for HN, brain, prostate, abdomen, and lung in the lateral, longitudinal, and vertical directions were between 4.2 and 8.2 mm, 5.0 mm and 12.0 mm, and 1.5 mm and 6.8 mm, respectively. We suggest that TomoTherapy (registered) pretreatment

  8. Analysis of daily setup variation with tomotherapy megavoltage computed tomography.

    Science.gov (United States)

    Zhou, Jining; Uhl, Barry; Dewit, Kelly; Young, Mark; Taylor, Brian; Fei, Ding-Yu; Lo, Yeh-Chi

    2010-01-01

    The purpose of this study was to evaluate different setup uncertainties for various anatomic sites with TomoTherapy pretreatment megavoltage computed tomography (MVCT) and to provide optimal margin guidelines for these anatomic sites. Ninety-two patients with tumors in head and neck (HN), brain, lung, abdominal, or prostate regions were included in the study. MVCT was used to verify patient position and tumor target localization before each treatment. With the anatomy registration tool, MVCT provided real-time tumor shift coordinates relative to the positions where the simulation CT was performed. Thermoplastic facemasks were used for HN and brain treatments. Vac-Lok cushions were used to immobilize the lower extremities up to the thighs for prostate patients. No respiration suppression was administered for lung and abdomen patients. The interfractional setup variations were recorded and corrected before treatment. The mean interfractional setup error was the smallest for HN among the 5 sites analyzed. The average 3D displacement in lateral, longitudinal, and vertical directions for the 5 sites ranged from 2.2-7.7 mm for HN and lung, respectively. The largest movement in the lung was 2.0 cm in the longitudinal direction, with a mean error of 6.0 mm and standard deviation of 4.8 mm. The mean interfractional rotation variation was small and ranged from 0.2-0.5 degrees, with the standard deviation ranging from 0.7-0.9 degrees. Internal organ displacement was also investigated with a posttreatment MVCT scan for HN, lung, abdomen, and prostate patients. The maximum 3D intrafractional displacement across all sites was less than 4.5 mm. The interfractional systematic errors and random errors were analyzed and the suggested margins for HN, brain, prostate, abdomen, and lung in the lateral, longitudinal, and vertical directions were between 4.2 and 8.2 mm, 5.0 mm and 12.0 mm, and 1.5 mm and 6.8 mm, respectively. We suggest that TomoTherapy pretreatment MVCT can be used to

  9. ART OF METALLOGRAPHY: POSSIBILITIES OF DARK-FIELD MICROSCOPY APPLICATION FOR COLORED OBJECTS STRUCTURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    A. G. Anisovich

    2015-01-01

    Full Text Available The application of the method of dark field microscopy for the study of colored objects of material technology was researched. The capability of corrosive damage analysis and determination of the thickness of the metal coating were demonstrated. The performance capability of analysis of «reflection» in the dark field during the study of non-metallic materials – orthopedic implants and fireclay refractory were tested. An example of defect detection of carbon coating was displayed.

  10. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program...... understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure...... encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature centric analysis along three dimensions: perspective...

  11. Context-based object-of-interest detection for a generic traffic surveillance analysis system

    NARCIS (Netherlands)

    Bao, X.; Javanbakhti, S.; Zinger, S.; Wijnhoven, R.G.J.; With, de P.H.N.

    2014-01-01

    We present a new traffic surveillance video analysis system, focusing on building a framework with robust and generic techniques, based on both scene understanding and moving object-of-interest detection. Since traffic surveillance is widely applied, we want to design a single system that can be

  12. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    CERN Document Server

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared considering their performance and distribution of solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For our purposes, a complex testing and analysis software was developed. Also, many conclusions and hypothesis have been done for the further research.

  13. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E. J. L.; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W. T. Cohen; Overdiek, Hans W. P. M.; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M. P. M.; Wolfs, Tom F. W.; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes: clinical outcomes,

  14. Current evidence on hospital antimicrobial stewardship objectives : A systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E J L; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W T Cohen; Overdiek, Hans W P M; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M P M; Wolfs, Tom F W; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  15. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, E.C.; Hulscher, M.E.J.L.; Mouton, J.W.; Verduin, C.M.; Stuart, J.W.; Overdiek, H.W.; Linden, P.D. van der; Natsch, S.S.; Hertogh, C.M.; Wolfs, T.F.; Schouten, J.A.; Kullberg, B.J.; Prins, J.M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  16. OBJECTIVE EVALUATION OF HYPERACTIVATED MOTILITY IN RAT SPERMATOZA USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Objective evaluation of hyperactivated motility in rat spermatozoa using computer-assisted sperm analysis.Cancel AM, Lobdell D, Mendola P, Perreault SD.Toxicology Program, University of North Carolina, Chapel Hill, NC 27599, USA.The aim of this study was t...

  17. Analysis of micro computed tomography images; a look inside historic enamelled metal objects

    Science.gov (United States)

    van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen

    2010-02-01

    In this study the usefulness of micro-Computed Tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). Obtaining virtual slices and information about the internal structure of these objects was made possible by CT analysis. With this technique the underlying metal work was studied without removing the decorative enamel layer. Moreover visible defects such as cracks were measured in both width and depth and as of yet invisible defects and weaker areas are visualised. All these features are of great interest to restorers and conservators as they allow a view inside these objects without so much as touching them.

  18. Analysis of longitudinal variations in North Pacific alkalinity

    Science.gov (United States)

    Fry, C.; Tyrrell, T.; Achterberg, E. P.

    2016-02-01

    Carbon measurements in the ocean lack the coverage of physical measurements, so approximate alkalinity is predicted where data is unavailable. Surface alkalinity in the North Pacific is poorly characterised by predictive algorithms. Understanding the processes affecting alkalinity in this area can improve the equations. We investigated the causes of regional variations in alkalinity using GLODAPv2. We tested different hypotheses for the causes of three longitudinal phenomena in surface ocean values of Alk*, a tracer of calcium carbonate cycling. These phenomena are: (a) an increase in Alk* from east to west at 50°N, (b) an increase in Alk* from west to east at 30°N, and (c) a lack of a strong increase in Alk* from west to east in the equatorial upwelling area. We found that the most likely cause of higher Alk* on the western side of the subpolar North Pacific (at 50°N) is that denser isopycnals with higher Alk* lie at shallower depths on the western side than the eastern side. At 30°N, the main cause of higher Alk* on the eastern side of the basin is upwelling along the continental shelf of southwestern North America. Along the equator, our analyses suggest that the absence of a strong east-west trend is because the more intense upwelling on the eastern side of the basin does not, under normal conditions, lead to strong elevation of Alk*. However, surface Alk* is more strongly elevated in the eastern Equatorial Pacific during negative phases of the El-Nino-Southern Oscillation, probably because the upwelled water comes from greater depth at these times.

  19. Analysis of the intersexual variation in Thalassophryne maculosa fish venoms.

    Science.gov (United States)

    Lopes-Ferreira, Mônica; Sosa-Rosales, Ines; Bruni, Fernanda M; Ramos, Anderson D; Vieira Portaro, Fernanda Calheta; Conceição, Katia; Lima, Carla

    2016-06-01

    Gender related variation in the molecular composition of venoms and secretions have been described for some animal species, and there are some evidences that the difference in the toxin (s) profile among males and females may be related to different physiopathological effects caused by the envenomation by either gender. In order to investigate whether this same phenomenon occurs to the toadfish Thalassophryne maculosa, we have compared some biological and biochemical properties of female and male venoms. Twenty females and males were collected in deep waters of the La Restinga lagoon (Venezuela) and, after protein concentration assessed, the induction of toxic activities in mice and the biochemical properties were analyzed. Protein content is higher in males than in females, which may be associated to a higher size and weight of the male body. In vivo studies showed that mice injected with male venoms presented higher nociception when compared to those injected with female venoms, and both venoms induced migration of macrophages into the paw of mice. On the other hand, mice injected with female venoms had more paw edema and extravasation of Evans blue in peritoneal cavity than mice injected with male venoms. We observed that the female venoms had more capacity for necrosis induction when compared with male venoms. The female samples present a higher proteolytic activity then the male venom when gelatin, casein and FRETs were used as substrates. Evaluation of the venoms of females and males by SDS-PAGE and chromatographic profile showed that, at least three components (present in two peaks) are only present in males. Although the severity of the lesion, characterized by necrosis development, is related with the poisoning by female specimens, the presence of exclusive toxins in the male venoms could be associated with the largest capacity of nociception induction by this sample. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Copy number variations in affective disorders and meta-analysis

    DEFF Research Database (Denmark)

    Olsen, Line; Hansen, Thomas; Djurovic, Srdjan

    2011-01-01

    in a combined analysis of three case-control samples from Denmark, Norway and Iceland. A total of 1897 cases (n=1223 unipolar and n=463 bipolar) and 11 231 controls were analyzed for CNVs at the 10 genomic loci, but we found no combined association between these CNVs and affective disorders....

  1. analysis of pressure variation of fluid in bounded circular reservoirs

    African Journals Online (AJOL)

    user

    analysis of the analysed finite element, imposing the boundary conditions and finally, getting the results that ... in reservoir engineering applications [2–7]. ... THEORY. The law of conservation of mass, Darcy's law and the equation of state has been combined to obtain the ..... fields in laser-two-layer solids weak interactions.

  2. Genetic variation and DNA markers in forensic analysis

    African Journals Online (AJOL)

    SAM

    2014-07-30

    Jul 30, 2014 ... Author(s) agree that this article remain permanently open access under the terms of the Creative Commons Attribution License. 4.0 International ... (mtDNA) is today a routine method of analysis of biological ... A promising approach in this context seems to be .... 1985; Armour et al., 1996). ...... management.

  3. Analysis of genetic variation in Erianthus arundinaceum by random ...

    African Journals Online (AJOL)

    STORAGESEVER

    2008-10-06

    Oct 6, 2008 ... MATERIALS AND METHODS. Fifty-one E. arundinaceum accessions were used in the RAPD analysis. Figure 1. Plant materials planted in the sugarcane germ- plasm garden of Yunnan Agricultural University (YAU). Name and origin of the accessions are shown in Table 1. DNA was extracted from leaves ...

  4. Pedestrian-Vehicle Accidents Reconstruction with PC-Crash®: Sensibility Analysis of Factors Variation

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Gala, F.

    2016-07-01

    This paper describes the main findings of a study performed by INSIA-UPM about the improvement of the reconstruction process of real world vehicle-pedestrian accidents using PC-Crash® software, aimed to develop a software tool for the estimation of the variability of the collision speed due to the lack of real values of some parameters required during the reconstruction task. The methodology has been based on a sensibility analysis of the factors variation. A total of 9 factors have been analyzed with the objective of identifying which ones were significant. Four of them (pedestrian height, collision angle, hood height and pedestrian-road friction coefficient) were significant and were included in a full factorial experiment with the collision speed as an additional factor in order to obtain a regression model with up to third level interactions. Two different factorial experiments with the same structure have been performed because of pedestrian gender differences. The tool has been created as a collision speed predictor based on the regression models obtained, using the 4 significant factors and the projection distance measured or estimated in the accident site. The tool has been used on the analysis of real-world reconstructed accidents occurred in the city of Madrid (Spain). The results have been adequate in most cases with less than 10% of deviation between the predicted speed and the one estimated in the reconstructions. (Author)

  5. Feasibility study for objective oriented design of system thermal hydraulic analysis program

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    The system safety analysis code, such as RELAP5, TRAC, CATHARE etc. have been developed based on Fortran language during the past few decades. Refactoring of conventional codes has been also performed to improve code readability and maintenance. However the programming paradigm in software technology has been changed to use objects oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. In this work, objective oriented program for system safety analysis code has been tried utilizing modernized C language. The analysis, design, implementation and verification steps for OOP system code development are described with some implementation examples. The system code SYSTF based on three-fluid thermal hydraulic solver has been developed by OOP design. The verifications of feasibility are performed with simple fundamental problems and plant models. (author)

  6. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training data. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.

  7. Elliptic Fourier Analysis of body shape variation of Hippocampus spp. (seahorse in Danajon Bank, Philippines

    Directory of Open Access Journals (Sweden)

    S. R. M. Tabugo-Rico

    2017-12-01

    Full Text Available Seahorses inhabit various ecosystems hence, had become a flagship species of the marine environment. The Philippines as a hot spot of biodiversity in Asia holds a number of species of seahorses. This serve as an exploratory study to describe body shape variation of selected common seahorse species: Hippocampus comes, Hippocampus histrix, Hippocampus spinosissimus and Hippocampus kuda from Danajon bank using Elliptic Fourier Analysis. The method was done to test whether significant yet subtle differences in body shape variation can be species-specific, habitat-influenced and provide evidence of sexual dimorphism. It is hypothesized that phenotypic divergence may provide evidence for genetic differentiation or mere adaptations to habitat variation. Results show significant considerable differences in the body shapes of the five populations based on the canonical variate analysis (CVA and multivariate analysis of variance (MANOVA with significant p values. Populations were found to be distinct from each other suggesting that body shape variation is species-specific, habitat-influenced and provided evidence for sexual dimorphism. Results of discriminant analysis show further support for species specific traits and sexual dimorphism. This study shows the application of the method of geometric morphometrics specifically elliptic fourier analysis in describing subtle body shape variation of selected Hippocampus species.

  8. Art, historical and cultural heritage objects studied with different non-destructive analysis

    International Nuclear Information System (INIS)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M.

    2012-01-01

    Full text: Since 2003, the analysis of art, historical and cultural heritage objects has being performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated to the better characterization of the objects and the examinations were expanded to other non-destructive analytical techniques such as portable X-Ray fluorescence (XRF), digitalized radiography, high resolution photography with visible, UV (ultraviolet) light and reflectography in the infrared region. These non-destructive analytical techniques systematically applied to the objects are helping the better understanding of these objects and allow studying them by examining their main components; their conservation status and also the creative process of the artist, particularly in easel paintings allow making new discoveries. The setup of the external beam in the LAMFI laboratory is configured to allow different simultaneous analysis by PIXE / PIGE (Particle Induced X-ray emission / Particle Induced gamma rays emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence) and to expand the archaeometric results using ion beams. PIXE and XRF analysis are important to characterize the elements presents in the objects, pigments and others materials. The digitized radiography has provided important information about the internal structure of the objects, the manufacturing process, the internal particles existing and in case of easel paintings it can reveal features of the artist's creative process showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  9. Art, historical and cultural heritage objects studied with different non-destructive analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M. [Universidade de Sao Paulo (IF/USP), SP (Brazil). Inst. de Fisica

    2012-07-01

    Full text: Since 2003, the analysis of art, historical and cultural heritage objects has being performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated to the better characterization of the objects and the examinations were expanded to other non-destructive analytical techniques such as portable X-Ray fluorescence (XRF), digitalized radiography, high resolution photography with visible, UV (ultraviolet) light and reflectography in the infrared region. These non-destructive analytical techniques systematically applied to the objects are helping the better understanding of these objects and allow studying them by examining their main components; their conservation status and also the creative process of the artist, particularly in easel paintings allow making new discoveries. The setup of the external beam in the LAMFI laboratory is configured to allow different simultaneous analysis by PIXE / PIGE (Particle Induced X-ray emission / Particle Induced gamma rays emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence) and to expand the archaeometric results using ion beams. PIXE and XRF analysis are important to characterize the elements presents in the objects, pigments and others materials. The digitized radiography has provided important information about the internal structure of the objects, the manufacturing process, the internal particles existing and in case of easel paintings it can reveal features of the artist's creative process showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  10. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    Science.gov (United States)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.

  11. Context based Coding of Binary Shapes by Object Boundary Straightness Analysis

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2004-01-01

    A new lossless compression scheme for bilevel images targeted at binary shapes of image and video objects is presented. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used in the context definition for arithmetic encoding....... Tested on individual images of binary shapes and binary layers of digital maps the algorithm outperforms PWC, JBIG and MPEG-4 CAE. On the binary shapes the code lengths are reduced by 21%, 25%, and 42%, respectively. On the maps the reductions are 34%, 32%, and 59%, respectively. The algorithm is also...

  12. Variations analysis of the Society's preference structure regarding environmental issues

    International Nuclear Information System (INIS)

    Angel S, Enrique; Zambrano B, Ana Maria

    2005-01-01

    Society's preference structure regarding environmental issues is understood as the relative importance the society gives to various topics that collectively conform the environmental issues. Based on the hypothesis that this structure behavior and its definition vary with time, proposals are presented related to the concepts and a working plan allowing performing the structure's dynamic analysis. A method is described to gather information based on the systematic reading of a nation wide newspaper during a period time. A comparison is done between the resulting structure and several aspects as the environmental legislation, government plans and summits and environmental milestones

  13. Implicit functions and solution mappings a view from variational analysis

    CERN Document Server

    Dontchev, Asen L

    2014-01-01

    The implicit function theorem is one of the most important theorems in analysis and its many variants are basic tools in partial differential equations and numerical analysis. This second edition of Implicit Functions and Solution Mappings presents an updated and more complete picture of the field by including solutions of problems that have been solved since the first edition was published, and places old and new results in a broader perspective. The purpose of this self-contained work is to provide a reference on the topic and to provide a unified collection of a number of results which are currently scattered throughout the literature. Updates to this edition include new sections in almost all chapters, new exercises and examples, updated commentaries to chapters and an enlarged index and references section. From reviews of the first edition: “The book commences with a helpful context-setting preface followed by six chapters. Each chapter starts with a useful preamble and concludes with a careful and ins...

  14. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during...... software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses...... a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising of three complementary dimensions of comprehension: perspective, abstraction...

  15. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    Science.gov (United States)

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  16. Explicit area-based accuracy assessment for mangrove tree crown delineation using Geographic Object-Based Image Analysis (GEOBIA)

    Science.gov (United States)

    Kamal, Muhammad; Johansen, Kasper

    2017-10-01

    Effective mangrove management requires spatially explicit information of mangrove tree crown map as a basis for ecosystem diversity study and health assessment. Accuracy assessment is an integral part of any mapping activities to measure the effectiveness of the classification approach. In geographic object-based image analysis (GEOBIA) the assessment of the geometric accuracy (shape, symmetry and location) of the created image objects from image segmentation is required. In this study we used an explicit area-based accuracy assessment to measure the degree of similarity between the results of the classification and reference data from different aspects, including overall quality (OQ), user's accuracy (UA), producer's accuracy (PA) and overall accuracy (OA). We developed a rule set to delineate the mangrove tree crown using WorldView-2 pan-sharpened image. The reference map was obtained by visual delineation of the mangrove tree crowns boundaries form a very high-spatial resolution aerial photograph (7.5cm pixel size). Ten random points with a 10 m radius circular buffer were created to calculate the area-based accuracy assessment. The resulting circular polygons were used to clip both the classified image objects and reference map for area comparisons. In this case, the area-based accuracy assessment resulted 64% and 68% for the OQ and OA, respectively. The overall quality of the calculation results shows the class-related area accuracy; which is the area of correctly classified as tree crowns was 64% out of the total area of tree crowns. On the other hand, the overall accuracy of 68% was calculated as the percentage of all correctly classified classes (tree crowns and canopy gaps) in comparison to the total class area (an entire image). Overall, the area-based accuracy assessment was simple to implement and easy to interpret. It also shows explicitly the omission and commission error variations of object boundary delineation with colour coded polygons.

  17. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT ® ). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT ® software formula has been modified and the breast symmetry index (BSI) from 129 women after breast surgery has been calculated by the first author with this new BAT ® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated from five observers different from the first author to calculate inter-rater reliability. In a second phase, the new BAT ® formula was validated and correlated with subjective scores of additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT ® from five individuals showed an ICC of 0.992 with almost no difference between different observers. All subjective scores of 50 patients correlated with the modified BSI score with a high Pearson correlation coefficient of 0.909 (p BAT ® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  18. Nurse-surgeon object transfer: video analysis of communication and situation awareness in the operating theatre.

    Science.gov (United States)

    Korkiakangas, Terhi; Weldon, Sharon-Marie; Bezemer, Jeff; Kneebone, Roger

    2014-09-01

    One of the most central collaborative tasks during surgical operations is the passing of objects, including instruments. Little is known about how nurses and surgeons achieve this. The aim of the present study was to explore what factors affect this routine-like task, resulting in fast or slow transfer of objects. A qualitative video study, informed by an observational ethnographic approach, was conducted in a major teaching hospital in the UK. A total of 20 general surgical operations were observed. In total, approximately 68 h of video data have been reviewed. A subsample of 225 min has been analysed in detail using interactional video-analysis developed within the social sciences. Two factors affecting object transfer were observed: (1) relative instrument trolley position and (2) alignment. The scrub nurse's instrument trolley position (close to vs. further back from the surgeon) and alignment (gaze direction) impacts on the communication with the surgeon, and consequently, on the speed of object transfer. When the scrub nurse was standing close to the surgeon, and "converged" to follow the surgeon's movements, the transfer occurred more seamlessly and faster (1.0 s). The smoothness of object transfer can be improved by adjusting the scrub nurse's instrument trolley position, enabling a better monitoring of surgeon's bodily conduct and affording early orientation (awareness) to an upcoming request (changing situation). Object transfer is facilitated by the surgeon's embodied practices, which can elicit the nurse's attention to the request and, as a response, maximise a faster object transfer. A simple intervention to highlight the significance of these factors could improve communication in the operating theatre. Copyright © 2014 Elsevier Ltd. All rights reserved.

  19. Analysis of interspecies physicochemical variation of grain legume seeds

    Science.gov (United States)

    Rybiński, Wojciech; Rusinek, Robert; Szot, Bogusław; Bocianowski, Jan; Starzycki, Michał

    2014-10-01

    The paper presents an attempt to assess the reaction of seeds to mechanical loads taking into account their geometry expressed as seed thickness and 1000 seed weight. The initial material comprised 33 genotypes of grain legume plants and included cultivars registered in the country and breeding lines that are subject to pre-registration trials. The analysis of variance revealed significant diversity of the cultivars and lines of the species studied in terms of each of the analysed trait. The highest weight of 1000 seeds were obtained for white lupine seeds and peas, the lowest for andean lupine seeds. The maximum deformation and energy were obtained for white lupine seeds, the lowest for pea seeds, the maximum force and module the lowest values were determined for narrow-leafed lupine and pea. The highest values of protein were obtained for andean and yellow lupine, a fat content for andean and white lupine. The fatty acid profile as much as 70% or more were linoleic and oleic acids. Against the background of all the species are distinguished by white lupine seeds with a high content of oleic acid and the lowest of linoleic acid, for yellow lupine were obtained the inverse ratio of the two acids.

  20. Genetic variation analysis of the Bali street dog using microsatellites

    Directory of Open Access Journals (Sweden)

    Wilton Alan N

    2005-02-01

    Full Text Available Abstract Background Approximately 800,000 primarily feral dogs live on the small island of Bali. To analyze the genetic diversity in this population, forty samples were collected at random from dogs in the Denpasar, Bali region and tested using 31 polymorphic microsatellites. Australian dingoes and 28 American Kennel Club breeds were compared to the Bali Street Dog (BSD for allelic diversity, heterozygosities, F-statistics, GST estimates, Nei's DA distance and phylogenetic relationships. Results The BSD proved to be the most heterogeneous, exhibiting 239 of the 366 total alleles observed across all groups and breeds and had an observed heterozygosity of 0.692. Thirteen private alleles were observed in the BSD with an additional three alleles observed only in the BSD and the Australian dingo. The BSD was related most closely to the Chow Chow with a FST of 0.088 and also with high bootstrap support to the Australian dingo and Akita in the phylogenetic analysis. Conclusions This preliminary study into the diversity and relationship of the BSD to other domestic and feral dog populations shows the BSD to be highly heterogeneous and related to populations of East Asian origin. These results indicate that a viable and diverse population of dogs existed on the island of Bali prior to its geographic isolation approximately 12,000 years ago and has been little influenced by domesticated European dogs since that time.

  1. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been made on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm, which is based on an auditory model that simulates human auditory system. The auditory model is based on spectro-temporal modulation analysis of spectrogram, which has been proven to be ...

  2. Comprehensive analysis of NuMA variation in breast cancer

    Directory of Open Access Journals (Sweden)

    Aittomäki Kristiina

    2008-03-01

    Full Text Available Abstract Background A recent genome wide case-control association study identified NuMA region on 11q13 as a candidate locus for breast cancer susceptibility. Specifically, the variant Ala794Gly was suggested to be associated with increased risk of breast cancer. Methods In order to evaluate the NuMa gene for breast cancer susceptibility, we have here screened the entire coding region and exon-intron boundaries of NuMa in 92 familial breast cancer patients and constructed haplotypes of the identified variants. Five missense variants were further screened in 341 breast cancer cases with a positive family history and 368 controls. We examined the frequency of Ala794Gly in an extensive series of familial (n = 910 and unselected (n = 884 breast cancer cases and controls (n = 906, with a high power to detect the suggested breast cancer risk. We also tested if the variant is associated with histopathologic features of breast tumors. Results Screening of NuMA resulted in identification of 11 exonic variants and 12 variants in introns or untranslated regions. Five missense variants that were further screened in breast cancer cases with a positive family history and controls, were each carried on a unique haplotype. None of the variants, or the haplotypes represented by them, was associated with breast cancer risk although due to low power in this analysis, very low risk alleles may go unrecognized. The NuMA Ala794Gly showed no difference in frequency in the unselected breast cancer case series or familial case series compared to control cases. Furthermore, Ala794Gly did not show any significant association with histopathologic characteristics of the tumors, though Ala794Gly was slightly more frequent among unselected cases with lymph node involvement. Conclusion Our results do not support the role of NuMA variants as breast cancer susceptibility alleles.

  3. Comprehensive analysis of NuMA variation in breast cancer

    International Nuclear Information System (INIS)

    Kilpivaara, Outi; Rantanen, Matias; Tamminen, Anitta; Aittomäki, Kristiina; Blomqvist, Carl; Nevanlinna, Heli

    2008-01-01

    A recent genome wide case-control association study identified NuMA region on 11q13 as a candidate locus for breast cancer susceptibility. Specifically, the variant Ala794Gly was suggested to be associated with increased risk of breast cancer. In order to evaluate the NuMa gene for breast cancer susceptibility, we have here screened the entire coding region and exon-intron boundaries of NuMa in 92 familial breast cancer patients and constructed haplotypes of the identified variants. Five missense variants were further screened in 341 breast cancer cases with a positive family history and 368 controls. We examined the frequency of Ala794Gly in an extensive series of familial (n = 910) and unselected (n = 884) breast cancer cases and controls (n = 906), with a high power to detect the suggested breast cancer risk. We also tested if the variant is associated with histopathologic features of breast tumors. Screening of NuMA resulted in identification of 11 exonic variants and 12 variants in introns or untranslated regions. Five missense variants that were further screened in breast cancer cases with a positive family history and controls, were each carried on a unique haplotype. None of the variants, or the haplotypes represented by them, was associated with breast cancer risk although due to low power in this analysis, very low risk alleles may go unrecognized. The NuMA Ala794Gly showed no difference in frequency in the unselected breast cancer case series or familial case series compared to control cases. Furthermore, Ala794Gly did not show any significant association with histopathologic characteristics of the tumors, though Ala794Gly was slightly more frequent among unselected cases with lymph node involvement. Our results do not support the role of NuMA variants as breast cancer susceptibility alleles

  4. Parametric analysis of energy quality management for district in China using multi-objective optimization approach

    International Nuclear Information System (INIS)

    Lu, Hai; Yu, Zitao; Alanne, Kari; Xu, Xu; Fan, Liwu; Yu, Han; Zhang, Liang; Martinac, Ivo

    2014-01-01

    Highlights: • A time-effective multi-objective design optimization scheme is proposed. • The scheme aims at exploring suitable 3E energy system for the specific case. • A realistic case located in China is used for the analysis. • Parametric study is investigated to test the effects of different parameters. - Abstract: Due to the increasing energy demands and global warming, energy quality management (EQM) for districts has been getting importance over the last few decades. The evaluation of the optimum energy systems for specific districts is an essential part of EQM. This paper presents a deep analysis of the optimum energy systems for a district sited in China. A multi-objective optimization approach based on Genetic Algorithm (GA) is proposed for the analysis. The optimization process aims to search for the suitable 3E (minimum economic cost and environmental burden as well as maximum efficiency) energy systems. Here, life cycle CO 2 equivalent (LCCO 2 ), life cycle cost (LCC) and exergy efficiency (EE) are set as optimization objectives. Then, the optimum energy systems for the Chinese case are presented. The final work is to investigate the effects of different energy parameters. The results show the optimum energy systems might vary significantly depending on some parameters

  5. Geographic Object-Based Image Analysis – Towards a new paradigm

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these development and its implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  6. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    Science.gov (United States)

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly (P 0.8) for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), total bite-size-to-travel ratio (P algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.

  7. Partial differential equations with variable exponents variational methods and qualitative analysis

    CERN Document Server

    Radulescu, Vicentiu D

    2015-01-01

    Partial Differential Equations with Variable Exponents: Variational Methods and Qualitative Analysis provides researchers and graduate students with a thorough introduction to the theory of nonlinear partial differential equations (PDEs) with a variable exponent, particularly those of elliptic type. The book presents the most important variational methods for elliptic PDEs described by nonhomogeneous differential operators and containing one or more power-type nonlinearities with a variable exponent. The authors give a systematic treatment of the basic mathematical theory and constructive meth

  8. Analysis of indel variations in the human disease-associated genes ...

    Indian Academy of Sciences (India)

    Keywords. insertion–deletion variations; haematological disease; tumours; human genetics. Journal of Genetics ... domly selected healthy Korean individuals using a blood genomic DNA ... Bioinformatics annotation and 3-D protein structure analysis. In this study ..... 2009 A genome-wide meta-analysis identifies. Journal of ...

  9. Fourier analysis of intracranial aneurysms: towards an objective and quantitative evaluation of the shape of aneurysms

    International Nuclear Information System (INIS)

    Rohde, Stefan; Lahmann, Katharina; Nafe, Reinhold; Yan, Bernard; Berkefeld, Joachim; Beck, Juergen; Raabe, Andreas

    2005-01-01

    Shape irregularities of intracranial aneurysms may indicate an increased risk of rupture. To quantify morphological differences, Fourier analysis of the shape of intracranial aneurysms was introduced. We compared the morphology of 45 unruptured (UIA) and 46 ruptured intracranial aneurysms (RIA) in 70 consecutive patients on the basis of 3D-rotational angiography. Fourier analysis, coefficient of roundness and qualitative shape assessment were determined for each aneurysm. Morphometric analysis revealed significantly smaller coefficient of roundness (P<0.02) and higher values for Fourier amplitudes numbers 2, 3 and 7 (P<0.01) in the RIA group, indicating more complex and irregular morphology in RIA. Qualitative assessment from 3D-reconstructions showed surface irregularities in 78% of RIA and 42% of UIA (P<0.05). Our data have shown significant differences in shape between RIA and UIA, and further developments of Fourier analysis may provide an objective factor for the assessment of the risk of rupture. (orig.)

  10. A decision analysis approach for risk management of near-earth objects

    Science.gov (United States)

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in

  11. Neural regions supporting lexical processing of objects and actions: A case series analysis

    Directory of Open Access Journals (Sweden)

    Bonnie L Breining

    2014-04-01

    Full Text Available Introduction. Linking semantic representations to lexical items is an important cognitive process for both producing and comprehending language. Past research has suggested that the bilateral anterior temporal lobes are critical for this process (e.g. Patterson, Nestor, & Rogers, 2007. However, the majority of studies focused on object concepts alone, ignoring actions. The few that considered actions suggest that the temporal poles are not critical for their processing (e.g. Kemmerer et al., 2010. In this case series, we investigated the neural substrates of linking object and action concepts to lexical labels by correlating the volume of defined regions of interest with behavioral performance on picture-word verification and picture naming tasks of individuals with primary progressive aphasia (PPA. PPA is a neurodegenerative condition with heterogeneous neuropathological causes, characterized by increasing language deficits for at least two years in the face of relatively intact cognitive function in other domains (Gorno-Tempini et al., 2011. This population displays appropriate heterogeneity of performance and focal atrophy for investigating the neural substrates involved in lexical semantic processing of objects and actions. Method. Twenty-one individuals with PPA participated in behavioral assessment within six months of high resolution anatomical MRI scans. Behavioral assessments consisted of four tasks: picture-word verification and picture naming of objects and actions. Performance on these assessments was correlated with brain volume measured using atlas-based analysis in twenty regions of interest that are commonly atrophied in PPA and implicated in language processing. Results. Impaired performance for all four tasks significantly correlated with atrophy in the right superior temporal pole, left anterior middle temporal gyrus, and left fusiform gyrus. No regions were identified in which volume correlated with performance for both

  12. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    Science.gov (United States)

    Bainbridge, Matthew B.; Webb, John K.

    2017-06-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one `artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models and show that this procedure is more robust than a human picking a single preferred model since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process and we show using both real and simulated spectra that the unified automated fitting procedure out-performs a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the zabs = 1.8389 absorber towards the zem = 2.145 quasar J110325-264515. The derived constraint of Δα/α = 3.3 ± 2.9 × 10-6 is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.

  13. Intellectual capital: approaches to analysis as an object of the internal environment of an economic entity

    Directory of Open Access Journals (Sweden)

    O. E. Ustinova

    2017-01-01

    Full Text Available Intellectual capital is of strategic importance for a modern company. At the same time, its effective management, including a stimulating and creative approach to solving problems, will help to increase the competitiveness and development of economic entities. The article considers intellectual capital as an object of analysis of the internal environment. In the context of the proposed approaches to its study, its impact on the development of the company is also considered. The intellectual capital has a special significance and influence on internal processes, since on each of them the intellectual component allows to achieve a positive synergetic effect from the interaction of different objects. In more detail, it is proposed to consider it in terms of the position of the company it occupies on the market, the principles of its activities, the formation of marketing policies, the use of resources, methods and means of making managerial decisions, and the organizational culture formed. For the analysis of the state of the internal environment, the main approaches are proposed, in which the intellectual capital is considered, among them: methods for analyzing cash flows, economic efficiency and financial feasibility of the project, analysis of the consolidated financial flow by group of objects, assessment of the potential of the business entity, technology of choice of investment policy, technology Selection of incentive mechanisms. In this regard, it is advisable to analyze the company's internal environment from the position of influencing its state of intellectual capital. The scheme of interaction of intellectual capital and objects of an estimation of an internal environment of the managing subject is offered. The results of this study should be considered as initial data for the further development of the economic evaluation of the influence of intellectual capital on the competitiveness of companies.

  14. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    Science.gov (United States)

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

    Technical development and practical evaluation of a laboratory built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, like fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct, numerical fitting or Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has a considerable potential to facilitate and render more precise the condition surveys of works of art.

  15. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Science.gov (United States)

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, where the separation was shown to correlate strongly with Pfirrmann grade (P histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may be a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals needs to be compared.

  16. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data to improve efficiency and assist, rather than replace the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of and further detection of structures e.g. as limited to specific orientations.

  17. Multi-objective optimization of a cascade refrigeration system: Exergetic, economic, environmental, and inherent safety analysis

    International Nuclear Information System (INIS)

    Eini, Saeed; Shahhosseini, Hamidreza; Delgarm, Navid; Lee, Moonyong; Bahadori, Alireza

    2016-01-01

    Highlights: • A multi-objective optimization is performed for a cascade refrigeration cycle. • The optimization problem considers inherently safe design as well as 3E analysis. • As a measure of inherent safety level a quantitative risk analysis is utilized. • A CO 2 /NH 3 cascade refrigeration system is compared with a CO 2 /C 3 H 8 system. - Abstract: Inherently safer design is the new approach to maximize the overall safety of a process plant. This approach suggests some risk reduction strategies to be implemented in the early stages of design. In this paper a multi-objective optimization was performed considering economic, exergetic, and environmental aspects besides evaluation of the inherent safety level of a cascade refrigeration system. The capital costs, the processing costs, and the social cost due to CO 2 emission were considered to be included in the economic objective function. Exergetic efficiency of the plant was considered as the second objective function. As a measure of inherent safety level, Quantitative Risk Assessment (QRA) was performed to calculate total risk level of the cascade as the third objective function. Two cases (ammonia and propane) were considered to be compared as the refrigerant of the high temperature circuit. The achieved optimum solutions from the multi–objective optimization process were given as Pareto frontier. The ultimate optimal solution from available solutions on the Pareto optimal curve was selected using Decision-Makings approaches. NSGA-II algorithm was used to obtain Pareto optimal frontiers. Also, three decision-making approaches (TOPSIS, LINMAP, and Shannon’s entropy methods) were utilized to select the final optimum point. Considering continuous material release from the major equipment in the plant, flash and jet fire scenarios were considered for the CO 2 /C 3 H 8 cycle and toxic hazards were considered for the CO 2 /NH 3 cycle. The results showed no significant differences between CO 2 /NH 3 and

  18. Objective voice and speech analysis of persons with chronic hoarseness by prosodic analysis of speech samples.

    Science.gov (United States)

    Haderlein, Tino; Döllinger, Michael; Matoušek, Václav; Nöth, Elmar

    2016-10-01

    Automatic voice assessment is often performed using sustained vowels. In contrast, speech analysis of read-out texts can be applied to voice and speech assessment. Automatic speech recognition and prosodic analysis were used to find regression formulae between automatic and perceptual assessment of four voice and four speech criteria. The regression was trained with 21 men and 62 women (average age 49.2 years) and tested with another set of 24 men and 49 women (48.3 years), all suffering from chronic hoarseness. They read the text 'Der Nordwind und die Sonne' ('The North Wind and the Sun'). Five voice and speech therapists evaluated the data on 5-point Likert scales. Ten prosodic and recognition accuracy measures (features) were identified which describe all the examined criteria. Inter-rater correlation within the expert group was between r = 0.63 for the criterion 'match of breath and sense units' and r = 0.87 for the overall voice quality. Human-machine correlation was between r = 0.40 for the match of breath and sense units and r = 0.82 for intelligibility. The perceptual ratings of different criteria were highly correlated with each other. Likewise, the feature sets modeling the criteria were very similar. The automatic method is suitable for assessing chronic hoarseness in general and for subgroups of functional and organic dysphonia. In its current version, it is almost as reliable as a randomly picked rater from a group of voice and speech therapists.

  19. Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy.

    Science.gov (United States)

    Vandenabeele, Peter; Tate, Jim; Moens, Luc

    2007-02-01

    Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. Experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.

  20. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Sachit Rajbhandari

    2017-11-01

    Full Text Available In Geographic Object-based Image Analysis (GEOBIA, identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in Semantic Web Rule Language (SWRL and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method providing an alternative approach for image classification in the case study of landslides.

  1. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    Science.gov (United States)

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands are raising importance of just-in-time and agile manufacturing much more than before. Accordingly, introduction of mixed-model assembly lines becomes popular to realize the small-lot-multi-kinds production. Since it produces various kinds on the same assembly line, a rational management is of special importance. With this point of view, this study focuses on a sequencing problem of mixed-model assembly line including a paint line as its preceding process. By taking into account the paint line together, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem besides improving production efficiency. Finally, we have formulated the sequencing problem as a bi-objective optimization problem to prevent various line stoppages, and to reduce the volume of WIP inventory simultaneously. Then we have proposed a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front. Actually, the resulting problem is solved by a meta-heuristic method like SA (Simulated Annealing). Through numerical experiments, we verified the validity of the proposed approach, and discussed the significance of trade-off analysis between the conflicting objectives.

  2. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    Science.gov (United States)

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been subject to a corpus of literatures, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a metric to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disk with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S(1) can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines the multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  3. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.

  4. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Müller Meinard

    2007-01-01

    Full Text Available One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music, where the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis, which allows to identify musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical features to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.

  5. Quantitative analysis of structural variations in corpus callosum in adults with multiple system atrophy (MSA)

    Science.gov (United States)

    Bhattacharya, Debanjali; Sinha, Neelam; Saini, Jitender

    2017-03-01

    Multiple system atrophy (MSA) is a rare, non-curable, progressive neurodegenerative disorder that affects nervous system and movement, poses a considerable diagnostic challenge to medical researchers. Corpus callosum (CC) being the largest white matter structure in brain, enabling inter-hemispheric communication, quantification of callosal atrophy may provide vital information at the earliest possible stages. The main objective is to identify the differences in CC structure for this disease, based on quantitative analysis on the pattern of callosal atrophy. We report results of quantification of structural changes in regional anatomical thickness, area and length of CC between patient-groups with MSA with respect to healthy controls. The method utilizes isolating and parcellating the mid-sagittal CC into 100 segments along the length - measuring the width of each segment. It also measures areas within geometrically defined five callosal compartments of the well-known Witelson, and Hofer-Frahma schemes. For quantification, statistical tests are performed on these different callosal measurements. From the statistical analysis, it is concluded that compared to healthy controls, width is reduced drastically throughout CC for MSA group and as well as changes in area and length are also significant for MSA. The study is further extended to check if any significant difference in thickness is found between the two variations of MSA, Parkinsonian MSA and Cerebellar MSA group, using the same methodology. However area and length of this two sub-MSA group, no substantial difference is obtained. The study is performed on twenty subjects for each control and MSA group, who had T1-weighted MRI.

  6. Molecular Karyotyping and Exome Analysis of Salt-Tolerant Rice Mutant from Somaclonal Variation

    Directory of Open Access Journals (Sweden)

    Thanikarn Udomchalothorn

    2014-11-01

    Full Text Available LPT123-TC171 is a salt-tolerant (ST and drought-tolerant (DT rice line that was selected from somaclonal variation of the original Leuang Pratew 123 (LPT123 rice cultivar. The objective of this study was to identify the changes in the rice genome that possibly lead to ST and/or DT characteristics. The genomes of LPT123 and LPT123-TC171 were comparatively studied at the four levels of whole chromosomes (chromosome structure including telomeres, transposable elements, and DNA sequence changes by using next-generation sequencing analysis. Compared with LPT123, the LPT123-TC171 line displayed no changes in the ploidy level, but had a significant deficiency of chromosome ends (telomeres. The functional genome analysis revealed new aspects of the genome response to the in vitro cultivation condition, where exome sequencing revealed the molecular spectrum and pattern of changes in the somaclonal variant compared with the parental LPT123 cultivar. Mutation detection was performed, and the degree of mutations was evaluated to estimate the impact of mutagenesis on the protein functions. Mutations within the known genes responding to both drought and salt stress were detected in 493 positions, while mutations within the genes responding to only salt stress were found in 100 positions. The possible functions of the mutated genes contributing to salt or drought tolerance were discussed. It was concluded that the ST and DT characteristics in the somaclonal variegated line resulted from the base changes in the salt- and drought-responsive genes rather than the changes in chromosome structure or the large duplication or deletion in the specific region of the genome.

  7. Objective and quantitative analysis of daytime sleepiness in physicians after night duties.

    Science.gov (United States)

    Wilhelm, Barbara J; Widmann, Anja; Durst, Wilhelm; Heine, Christian; Otto, Gerhard

    2009-06-01

    Work place studies often have the disadvantage of lacking objective data less prone to subject bias. The aim of this study was to contribute objective data to the discussion about safety aspects of night shifts in physicians. For this purpose we applied the Pupillographic Sleepiness Test (PST). The PST allows recording and analyses of pupillary sleepiness-related oscillations in darkness for 11 min in the sitting subject. The parameter of evaluation is the Pupillary Unrest Index (PUI; mm/min). For statistical analysis the natural logarithm of this parameter is used (lnPUI). Thirty-four physicians were examined by the PST and subjective scales during the first half of the day. Data taken during a day work period (D) were compared to those taken directly after night duty (N) by a Wilcoxon signed rank test. Night duty caused a mean sleep reduction of 3 h (Difference N-D: median 3 h, minimum 0 h, maximum 7 h, p home.

  8. Approaches to defining «financial potential» concept as of economic analysis object

    Directory of Open Access Journals (Sweden)

    O.M. Dzyubenkо

    2017-12-01

    Full Text Available The research analyzes the works of scientists who studied the issues of financial potential as an economic category. Due to analyzing the approaches of the scientists to the concept of "financial potential" the author identifies six approaches to the interpretation of its essence, they are: the totality of the enterprise financial resources, the sources of the enterprise economic activity financing, the enterprise economic activity development, the enterprise financial indicators, the system of enterprise financial management, the enterprise efficiency characteristics. It is established that the financial potential is the multifaceted category that characterizes the financial and economic activity of enterprises. The author's definition of the financial potential in the context of its place in the objects of economic analysis is proposed. It is established that the financial potential is the object of the enterprise economic activity management and is the subject to analytical assessments for establishing its state and directions of development.

  9. Quantitative measurement of phase variation amplitude of ultrasonic diffraction grating based on diffraction spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Meiyan, E-mail: yphantomohive@gmail.com; Zeng, Yingzhi; Huang, Zuohua, E-mail: zuohuah@163.com [Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering, South China Normal University, Guangzhou, Guangdong 510006 (China)

    2014-09-15

    A new method based on diffraction spectral analysis is proposed for the quantitative measurement of the phase variation amplitude of an ultrasonic diffraction grating. For a traveling wave, the phase variation amplitude of the grating depends on the intensity of the zeroth- and first-order diffraction waves. By contrast, for a standing wave, this amplitude depends on the intensity of the zeroth-, first-, and second-order diffraction waves. The proposed method is verified experimentally. The measured phase variation amplitude ranges from 0 to 2π, with a relative error of approximately 5%. A nearly linear relation exists between the phase variation amplitude and driving voltage. Our proposed method can also be applied to ordinary sinusoidal phase grating.

  10. Application of LC-MS to the analysis of dyes in objects of historical interest

    Science.gov (United States)

    Zhang, Xian; Laursen, Richard

    2009-07-01

    High-performance liquid chromatography (HPLC) with photodiode array and mass spectrometric detection permits dyes extracted from objects of historical interest or from natural plant or animal dyestuffs to be characterized on the basis of three orthogonal properties: HPLC retention time, UV-visible spectrum and molecular mass. In the present study, we have focused primarily on yellow dyes, the bulk of which are flavonoid glycosides that would be almost impossible to characterize without mass spectrometric detection. Also critical for this analysis is a method for mild extraction of the dyes from objects (e.g., textiles) without hydrolyzing the glycosidic linkages. This was accomplished using 5% formic acid in methanol, rather than the more traditional 6 M HCl. Mass spectroscopy, besides providing the molecular mass of the dye molecule, sometimes yields additional structural data based on fragmentation patterns. In addition, coeluting compounds can often be detected using extracted ion chromatography. The utility of mass spectrometry is illustrated by the analysis of historical specimens of silk that had been dyed yellow with flavonoid glycosides from Sophora japonica (pagoda tree) and curcumins from Curcuma longa (turmeric). In addition, we have used these techniques to identify the dye type, and sometimes the specific dyestuff, in a variety of objects, including a yellow varnish from a 19th century Tibetan altar and a 3000-year-old wool mortuary textiles, from Xinjiang, China. We are using HPLC with diode array and mass spectrometric detection to create a library of analyzed dyestuffs (>200 so far; mostly plants) to serve as references for identification of dyes in objects of historical interest.

  11. Reduced object related negativity response indicates impaired auditory scene analysis in adults with autistic spectrum disorder

    Directory of Open Access Journals (Sweden)

    Veema Lodhia

    2014-02-01

    Full Text Available Auditory Scene Analysis provides a useful framework for understanding atypical auditory perception in autism. Specifically, a failure to segregate the incoming acoustic energy into distinct auditory objects might explain the aversive reaction autistic individuals have to certain auditory stimuli or environments. Previous research with non-autistic participants has demonstrated the presence of an Object Related Negativity (ORN in the auditory event related potential that indexes pre-attentive processes associated with auditory scene analysis. Also evident is a later P400 component that is attention dependent and thought to be related to decision-making about auditory objects. We sought to determine whether there are differences between individuals with and without autism in the levels of processing indexed by these components. Electroencephalography (EEG was used to measure brain responses from a group of 16 autistic adults, and 16 age- and verbal-IQ-matched typically-developing adults. Auditory responses were elicited using lateralized dichotic pitch stimuli in which inter-aural timing differences create the illusory perception of a pitch that is spatially separated from a carrier noise stimulus. As in previous studies, control participants produced an ORN in response to the pitch stimuli. However, this component was significantly reduced in the participants with autism. In contrast, processing differences were not observed between the groups at the attention-dependent level (P400. These findings suggest that autistic individuals have difficulty segregating auditory stimuli into distinct auditory objects, and that this difficulty arises at an early pre-attentive level of processing.

  12. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  13. Using Item Analysis to Assess Objectively the Quality of the Calgary-Cambridge OSCE Checklist

    Directory of Open Access Journals (Sweden)

    Tyrone Donnon

    2011-06-01

    Full Text Available Background:  The purpose of this study was to investigate the use of item analysis to assess objectively the quality of items on the Calgary-Cambridge Communications OSCE checklist. Methods:  A total of 150 first year medical students were provided with extensive teaching on the use of the Calgary-Cambridge Guidelines for interviewing patients and participated in a final year end 20 minute communication OSCE station.  Grouped into either the upper half (50% or lower half (50% communication skills performance groups, discrimination, difficulty and point biserial values were calculated for each checklist item. Results:  The mean score on the 33 item communication checklist was 24.09 (SD = 4.46 and the internal reliability coefficient was ? = 0.77. Although most of the items were found to have moderate (k = 12, 36% or excellent (k = 10, 30% discrimination values, there were 6 (18% identified as ‘fair’ and 3 (9% as ‘poor’. A post-examination review focused on item analysis findings resulted in an increase in checklist reliability (? = 0.80. Conclusions:  Item analysis has been used with MCQ exams extensively. In this study, it was also found to be an objective and practical approach to use in evaluating the quality of a standardized OSCE checklist.

  14. Crude oil price analysis and forecasting based on variational mode decomposition and independent component analysis

    Science.gov (United States)

    E, Jianwei; Bao, Yanling; Ye, Jimin

    2017-10-01

    As one of the most vital energy resources in the world, crude oil plays a significant role in international economic market. The fluctuation of crude oil price has attracted academic and commercial attention. There exist many methods in forecasting the trend of crude oil price. However, traditional models failed in predicting accurately. Based on this, a hybrid method will be proposed in this paper, which combines variational mode decomposition (VMD), independent component analysis (ICA) and autoregressive integrated moving average (ARIMA), called VMD-ICA-ARIMA. The purpose of this study is to analyze the influence factors of crude oil price and predict the future crude oil price. Major steps can be concluded as follows: Firstly, applying the VMD model on the original signal (crude oil price), the modes function can be decomposed adaptively. Secondly, independent components are separated by the ICA, and how the independent components affect the crude oil price is analyzed. Finally, forecasting the price of crude oil price by the ARIMA model, the forecasting trend demonstrates that crude oil price declines periodically. Comparing with benchmark ARIMA and EEMD-ICA-ARIMA, VMD-ICA-ARIMA can forecast the crude oil price more accurately.

  15. Stability Analysis and Variational Integrator for Real-Time Formation Based on Potential Field

    Directory of Open Access Journals (Sweden)

    Shengqing Yang

    2014-01-01

    Full Text Available This paper investigates a framework of real-time formation of autonomous vehicles by using potential field and variational integrator. Real-time formation requires vehicles to have coordinated motion and efficient computation. Interactions described by potential field can meet the former requirement which results in a nonlinear system. Stability analysis of such nonlinear system is difficult. Our methodology of stability analysis is discussed in error dynamic system. Transformation of coordinates from inertial frame to body frame can help the stability analysis focus on the structure instead of particular coordinates. Then, the Jacobian of reduced system can be calculated. It can be proved that the formation is stable at the equilibrium point of error dynamic system with the effect of damping force. For consideration of calculation, variational integrator is introduced. It is equivalent to solving algebraic equations. Forced Euler-Lagrange equation in discrete expression is used to construct a forced variational integrator for vehicles in potential field and obstacle environment. By applying forced variational integrator on computation of vehicles' motion, real-time formation of vehicles in obstacle environment can be implemented. Algorithm based on forced variational integrator is designed for a leader-follower formation.

  16. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    Full Text Available In this research, the analysis of statistical variations in subthreshold MOSFET's high frequency characteristics defined in terms of gate capacitance and transition frequency, have been shown and the resulting comprehensive analytical models of such variations in terms of their variances have been proposed. Major imperfection in the physical level properties including random dopant fluctuation and effects of variations in MOSFET's manufacturing process, have been taken into account in the proposed analysis and modeling. The up to dated comprehensive analytical model of statistical variation in MOSFET's parameter has been used as the basis of analysis and modeling. The resulting models have been found to be both analytic and comprehensive as they are the precise mathematical expressions in terms of physical level variables of MOSFET. Furthermore, they have been verified at the nanometer level by using 65~nm level BSIM4 based benchmarks and have been found to be very accurate with smaller than 5 % average percentages of errors. Hence, the performed analysis gives the resulting models which have been found to be the potential mathematical tool for the statistical and variability aware analysis and design of subthreshold MOSFET based VHF circuits, systems and applications.

  17. Mapping of crop calendar events by object-based analysis of MODIS and ASTER images

    Directory of Open Access Journals (Sweden)

    A.I. De Castro

    2014-06-01

    Full Text Available A method to generate crop calendar and phenology-related maps at a parcel level of four major irrigated crops (rice, maize, sunflower and tomato is shown. The method combines images from the ASTER and MODIS sensors in an object-based image analysis framework, as well as testing of three different fitting curves by using the TIMESAT software. Averaged estimation of calendar dates were 85%, from 92% in the estimation of emergence and harvest dates in rice to 69% in the case of harvest date in tomato.

  18. Analysis of Scattering by Inhomogeneous Dielectric Objects Using Higher-Order Hierarchical MoM

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Jørgensen, Erik; Meincke, Peter

    2003-01-01

    An efficient technique for the analysis of electromagnetic scattering by arbitrary shaped inhomogeneous dielectric objects is presented. The technique is based on a higher-order method of moments (MoM) solution of the volume integral equation. This higher-order MoM solution comprises recently...... that the condition number of the resulting MoM matrix is reduced by several orders of magnitude in comparison to existing higher-order hierarchical basis functions and, consequently, an iterative solver can be applied even for high expansion orders. Numerical results demonstrate excellent agreement...

  19. Object-Oriented Programming in the Development of Containment Analysis Code

    International Nuclear Information System (INIS)

    Han, Tae Young; Hong, Soon Joon; Hwang, Su Hyun; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    After the mid 1980s, the new programming concept, Object-Oriented Programming (OOP), was introduced and designed, which has the features such as the information hiding, encapsulation, modularity and inheritance. These offered much more convenient programming paradigm to code developers. The OOP concept was readily developed into the programming language as like C++ in the 1990s and is being widely used in the modern software industry. In this paper, we show that the OOP concept is successfully applicable to the development of safety analysis code for containment and propose the more explicit and easy OOP design for developers

  20. Introduction to global variational geometry

    CERN Document Server

    Krupka, Demeter

    2015-01-01

    The book is devoted to recent research in the global variational theory on smooth manifolds. Its main objective is an extension of the classical variational calculus on Euclidean spaces to (topologically nontrivial) finite-dimensional smooth manifolds; to this purpose the methods of global analysis of differential forms are used. Emphasis is placed on the foundations of the theory of variational functionals on fibered manifolds - relevant geometric structures for variational principles in geometry, physical field theory and higher-order fibered mechanics. The book chapters include: - foundations of jet bundles and analysis of differential forms and vector fields on jet bundles, - the theory of higher-order integral variational functionals for sections of a fibred space, the (global) first variational formula in infinitesimal and integral forms- extremal conditions and the discussion of Noether symmetries and generalizations,- the inverse problems of the calculus of variations of Helmholtz type- variational se...

  1. Integrative analysis of RNA, translation, and protein levels reveals distinct regulatory variation across humans.

    Science.gov (United States)

    Cenik, Can; Cenik, Elif Sarinay; Byeon, Gun W; Grubert, Fabian; Candille, Sophie I; Spacek, Damek; Alsallakh, Bilal; Tilgner, Hagen; Araya, Carlos L; Tang, Hua; Ricci, Emiliano; Snyder, Michael P

    2015-11-01

    Elucidating the consequences of genetic differences between humans is essential for understanding phenotypic diversity and personalized medicine. Although variation in RNA levels, transcription factor binding, and chromatin have been explored, little is known about global variation in translation and its genetic determinants. We used ribosome profiling, RNA sequencing, and mass spectrometry to perform an integrated analysis in lymphoblastoid cell lines from a diverse group of individuals. We find significant differences in RNA, translation, and protein levels suggesting diverse mechanisms of personalized gene expression control. Combined analysis of RNA expression and ribosome occupancy improves the identification of individual protein level differences. Finally, we identify genetic differences that specifically modulate ribosome occupancy--many of these differences lie close to start codons and upstream ORFs. Our results reveal a new level of gene expression variation among humans and indicate that genetic variants can cause changes in protein levels through effects on translation. © 2015 Cenik et al.; Published by Cold Spring Harbor Laboratory Press.

  2. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    Science.gov (United States)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    The hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area through a careful model calibration. However, a large number of parameters are required to fit in the model due to their unavailability of the field measurement. Therefore, it is difficult to calibrate the model for a large number of potential uncertain model parameters. This even becomes more challenging if the model is for a large watershed with multiple land uses and various geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify most sensitive model parameters which affect the calibrated model performance. There are many different calibration and uncertainty analysis algorithms which can be performed with different objective functions. By incorporating sensitive parameters in streamflow simulation, effects of the suitable algorithm in improving model performance can be demonstrated by the Soil and Water Assessment Tool (SWAT) modeling. In this study, the SWAT was applied in the San Joaquin Watershed in California covering 19704 km2 to calibrate the daily streamflow. Recently, sever water stress escalating due to intensified climate variability, prolonged drought and depleting groundwater for agricultural irrigation in this watershed. Therefore it is important to perform a proper uncertainty analysis given the uncertainties inherent in hydrologic modeling to predict the spatial and temporal variation of the hydrologic process to evaluate the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow. To evaluate the sensitivity of the calibrated parameters three different optimization algorithms (Sequential Uncertainty Fitting- SUFI-2, Generalized Likelihood Uncertainty Estimation- GLUE and Parameter Solution- ParaSol) were used with four different objective functions (coefficient of determination

  3. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-09-01

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high energy neutrons suggests potentially unique applications. In addition, fusion R and D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other, are the two primary criteria for setting long range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R and D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long range objective of commercial fusion applications.

  4. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-03-01

    Fusion is an essentially inexhaustible source of energy that has the potential for economically attractive commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion-energy development program is the generation of centralstation electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high-energy neutrons suggests potentially unique applications. These include breeding of fissile fuels, production of hydrogen and other chemical products, transmutation or “burning” of various nuclear or chemical wastes, radiation processing of materials, production of radioisotopes, food preservation, medical diagnosis and medical treatment, and space power and space propulsion. In addition, fusion R&D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other hand, are the two primary criteria for setting long-range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R&D program toward practical applications. The transfer of fusion technology and skills from the national laboratories and universities to industry is the key to achieving the long-range objective of commercial fusion applications.

  5. Statistical motion vector analysis for object tracking in compressed video streams

    Science.gov (United States)

    Leny, Marc; Prêteux, Françoise; Nicholson, Didier

    2008-02-01

    Compressed video is the digital raw material provided by video-surveillance systems and used for archiving and indexing purposes. Multimedia standards have therefore a direct impact on such systems. If MPEG-2 used to be the coding standard, MPEG-4 (part 2) has now replaced it in most installations, and MPEG-4 AVC/H.264 solutions are now being released. Finely analysing the complex and rich MPEG-4 streams is a challenging issue addressed in that paper. The system we designed is based on five modules: low-resolution decoder, motion estimation generator, object motion filtering, low-resolution object segmentation, and cooperative decision. Our contributions refer to as the statistical analysis of the spatial distribution of the motion vectors, the computation of DCT-based confidence maps, the automatic motion activity detection in the compressed file and a rough indexation by dedicated descriptors. The robustness and accuracy of the system are evaluated on a large corpus (hundreds of hours of in-and outdoor videos with pedestrians and vehicles). The objective benchmarking of the performances is achieved with respect to five metrics allowing to estimate the error part due to each module and for different implementations. This evaluation establishes that our system analyses up to 200 frames (720x288) per second (2.66 GHz CPU).

  6. Approach to proliferation risk assessment based on multiple objective analysis framework

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the taking into account of the specifics features of the national nuclear infrastructure, and possible proliferation strategies (motivations, intentions, and capabilities). 3 examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (high enriched uranium)production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  7. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2013-01-01

    Full Text Available Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish the feasible solutions (involving different criteria obtained for block layout to identify s solution’s suitability, according to set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layout (during block layout phase. The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to resolve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices and that a minimum number of indices covering overall quality criteria may be used when selecting alternative solutions.

  8. Approach to proliferation risk assessment based on multiple objective analysis framework

    International Nuclear Information System (INIS)

    Andrianov, A.; Kuptsov, I.

    2013-01-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the taking into account of the specifics features of the national nuclear infrastructure, and possible proliferation strategies (motivations, intentions, and capabilities). 3 examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (high enriched uranium)production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk

  9. Prioritization of buffer areas with multi objective analysis: application in the Basin Creek St. Helena

    International Nuclear Information System (INIS)

    Zuluaga, Julian; Carvajal, Luis Fernando

    2006-01-01

    This paper shows a Multi objective Analysis (AMO-ELECTRE 111) with Geographical Information System (GIS) to establish priorities of buffer zones on the drainage network of the Santa Elena Creek, Medellin middle-east zone. 38 alternatives (small catchment) are evaluated with seven criteria, from field work, and maps. The criteria are: susceptibility to mass sliding, surface and lineal erosion, conflict by land use, and state of the waterways network in respect to hydrology, geology and human impact. The ELECTERE III method allows establishing priorities of buffer zones for each catchment; the indifference, acceptance, veto, and credibility threshold values, as well as those for criteria weighting factors are very important. The results show that the north zone of the catchment, commune 8, in particular La Castro creek, is most affected. The sensibility analysis shows that the obtained solution is robust, and that the anthropic and geologic criteria are paramount

  10. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Science.gov (United States)

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and non-linear approach (S-criterion). Both approaches generate almost the same input mixture, however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position one labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $ 120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective

  11. Robust object tracking techniques for vision-based 3D motion analysis applications

    Science.gov (United States)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capturing of an object is necessary for a wide variety of applications including industry and science, virtual reality and movie, medicine and sports. For the most part of applications a reliability and an accuracy of the data obtained as well as convenience for a user are the main characteristics defining the quality of the motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high speed of acquisition, potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture accurate and robust object features detecting and tracking through the video sequence are the key elements along with a level of automation of capturing process. So for providing high accuracy of obtained spatial data the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurements and supports high speed image acquisition in synchronized mode. It includes from 2 to 4 technical vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms as for detecting, identifying and tracking of similar targets, so for marker-less object motion capture is developed and tested. The results of algorithms' evaluation show high robustness and high reliability for various motion analysis tasks in technical and biomechanics applications.

  12. Object selection costs in visual working memory: A diffusion model analysis of the focus of attention.

    Science.gov (United States)

    Sewell, David K; Lilburn, Simon D; Smith, Philip L

    2016-11-01

    A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can occur. The need to orient the focus of attention implies that single-object accounts typically predict response time costs associated with object selection even when working memory is not full (i.e., memory load is less than 4 items). For other theories that assume storage of multiple items in the focus of attention, predictions depend on specific assumptions about the way resources are allocated among items held in the focus, and how this affects the time course of retrieval of items from the focus. These broad theoretical accounts have been difficult to distinguish because conventional analyses fail to separate components of empirical response times related to decision-making from components related to selection and retrieval processes associated with accessing information in working memory. To better distinguish these response time components from one another, we analyze data from a probed visual working memory task using extensions of the diffusion decision model. Analysis of model parameters revealed that increases in memory load resulted in (a) reductions in the quality of the underlying stimulus representations in a manner consistent with a sample size model of visual working memory capacity and (b) systematic increases in the time needed to selectively access a probed representation in memory. The results are consistent with single-object theories of the focus of attention. The results are also consistent with a subset of theories that assume a multiobject focus of attention in which resource allocation diminishes both the quality and accessibility of the underlying representations. (PsycINFO Database Record (c) 2016

  13. Simulation of multicomponent light source for optical-electronic system of color analysis objects

    Science.gov (United States)

    Peretiagin, Vladimir S.; Alekhin, Artem A.; Korotaev, Valery V.

    2016-04-01

    Development of lighting technology has led to possibility of using LEDs in the specialized devices for outdoor, industrial (decorative and accent) and domestic lighting. In addition, LEDs and devices based on them are widely used for solving particular problems. For example, the LED devices are widely used for lighting of vegetables and fruit (for their sorting or growing), textile products (for the control of its quality), minerals (for their sorting), etc. Causes of active introduction LED technology in different systems, including optical-electronic devices and systems, are a large choice of emission color and LED structure, that defines the spatial, power, thermal and other parameters. Furthermore, multi-element and color devices of lighting with adjustable illumination properties can be designed and implemented by using LEDs. However, devices based on LEDs require more attention if you want to provide a certain nature of the energy or color distribution at all the work area (area of analysis or observation) or surface of the object. This paper is proposed a method of theoretical modeling of the lighting devices. The authors present the models of RGB multicomponent light source applied to optical-electronic system for the color analysis of mineral objects. The possibility of formation the uniform and homogeneous on energy and color illumination of the work area for this system is presented. Also authors showed how parameters and characteristics of optical radiation receiver (by optical-electronic system) affect on the energy, spatial, spectral and colorimetric properties of a multicomponent light source.

  14. GRAIN-SIZE MEASUREMENTS OF FLUVIAL GRAVEL BARS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pedro Castro

    2018-01-01

    Full Text Available Traditional techniques for classifying the average grain size in gravel bars require manual measurements of each grain diameter. Aiming productivity, more efficient methods have been developed by applying remote sensing techniques and digital image processing. This research proposes an Object-Based Image Analysis methodology to classify gravel bars in fluvial channels. First, the study evaluates the performance of multiresolution segmentation algorithm (available at the software eCognition Developer in performing shape recognition. The linear regression model was applied to assess the correlation between the gravels’ reference delineation and the gravels recognized by the segmentation algorithm. Furthermore, the supervised classification was validated by comparing the results with field data using the t-statistic test and the kappa index. Afterwards, the grain size distribution in gravel bars along the upper Bananeiras River, Brazil was mapped. The multiresolution segmentation results did not prove to be consistent with all the samples. Nonetheless, the P01 sample showed an R2 =0.82 for the diameter estimation and R2=0.45 the recognition of the eliptical ft. The t-statistic showed no significant difference in the efficiencies of the grain size classifications by the field survey data and the Object-based supervised classification (t = 2.133 for a significance level of 0.05. However, the kappa index was 0.54. The analysis of the both segmentation and classification results did not prove to be replicable.

  15. Transferability of Object-Oriented Image Analysis Methods for Slum Identification

    Directory of Open Access Journals (Sweden)

    Alfred Stein

    2013-08-01

    Full Text Available Updated spatial information on the dynamics of slums can be helpful to measure and evaluate progress of policies. Earlier studies have shown that semi-automatic detection of slums using remote sensing can be challenging considering the large variability in definition and appearance. In this study, we explored the potential of an object-oriented image analysis (OOA method to detect slums, using very high resolution (VHR imagery. This method integrated expert knowledge in the form of a local slum ontology. A set of image-based parameters was identified that was used for differentiating slums from non-slum areas in an OOA environment. The method was implemented on three subsets of the city of Ahmedabad, India. Results show that textural features such as entropy and contrast derived from a grey level co-occurrence matrix (GLCM and the size of image segments are stable parameters for classification of built-up areas and the identification of slums. Relation with classified slum objects, in terms of enclosed by slums and relative border with slums was used to refine classification. The analysis on three different subsets showed final accuracies ranging from 47% to 68%. We conclude that our method produces useful results as it allows including location specific adaptation, whereas generically applicable rulesets for slums are still to be developed.

  16. Multi-objective optimization of GPU3 Stirling engine using third order analysis

    International Nuclear Information System (INIS)

    Toghyani, Somayeh; Kasaeian, Alibakhsh; Hashemabadi, Seyyed Hasan; Salimi, Morteza

    2014-01-01

    Highlights: • A third-order analysis is carried out for optimization of Stirling engine. • The triple-optimization is done on a GPU3 Stirling engine. • A multi-objective optimization is carried out for a Stirling engine. • The results are compared with an experimental previous work for checking the model improvement. • The methods of TOPSIS, Fuzzy, and LINMAP are compared with each other in aspect of optimization. - Abstract: Stirling engine is an external combustion engine that uses any external heat source to generate mechanical power which operates at closed cycles. These engines are good choices for using in power generation systems; because these engines present a reasonable theoretical efficiency which can be closer to the Carnot efficiency, comparing with other reciprocating thermal engines. Hence, many studies have been conducted on Stirling engines and the third order thermodynamic analysis is one of them. In this study, multi-objective optimization with four decision variables including the temperature of heat source, stroke, mean effective pressure, and the engine frequency were applied in order to increase the efficiency and output power and reduce the pressure drop. Three decision-making procedures were applied to optimize the answers from the results. At last, the applied methods were compared with the results obtained of one experimental work and a good agreement was observed

  17. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    The system safety analysis code, such as RELAP5, TRAC, CATHARE etc. have been developed based on Fortran language during the past few decades. Refactoring of conventional codes has been also performed to improve code readability and maintenance. TRACE, RELAP5-3D and MARS codes are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However the programming paradigm in software technology has been changed to use objects oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although the recent Fortran language also support the OOP, it is considered to have limited functions compared to the modern software features. In this work, objective oriented program for system safety analysis code has been tried utilizing modern C language feature. The advantage of OOP has been discussed after verification of design feasibility

  18. Cultural Variations across Academic Genres: A Generic Analysis of Intertextuality in Master's Theses Introductions

    Science.gov (United States)

    Ketabi, Saeed; Rahavard, Shaahin

    2013-01-01

    Genre analysis of texts has always been significant. The current study aimed at investigating intertextuality considering cultural variations and differences in students' discourse communities. Social studies, philosophy, and biology were chosen as the representatives of social sciences, humanities and sciences. Tehran University, one of the most…

  19. ANALYSIS AND PARTICULARITIES OF EXTERNAL FACTORS IMPACT ON ECONOMICAL RESULTS OF STRATEGIC OBJECTS PLANNING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    V. V. Gromov

    2015-01-01

    Full Text Available Summary. The relevance of the scientific problem described in the article are: to determine changes in economic performance, the effectiveness of the sectoral components of the service sector from the effects of environmental factors, which allows them to reach the planned long-term economic performance; management decision-making about structural and organizational changes, implementation of investment projects in the renovation and modernization of fixed capital, the creation of technology, process and product innovations directly connected with the impact analysis of such external factors as economic, socio-cultural, legal, political, innovative. The structure of the article is formed on the basis of presentation of the impact of specific groups of environmental factors on the competitiveness and economic performance of industry components of services based on the technology of strategic planning; complience of logical sequence of presentation of materials, establishing a causal relationship, the interaction of factors and elements of studied problems and objects. Features of external factors impact on the effectiveness of macro-economic entities, sectoral components of services are to the adequacy of the measures and strategies to counter the negative impact on the economic development of the objects of strategic development. Features of status changes and influence of internal factors on local and sectoral socio-economic systems dictate the need for a part of the available resources, the level of efficiency of the use of labor resources, fixed and current assets. The contribution of the author in a scientific perspective of this topic is to carry out a comprehensive analysis of the impact of the main groups of external factors on economic activities of the service sector development; identifying features of internal factors impact on the economic and innovative development of strategic planning objects.

  20. Advances in Neutron Activation Analysis of Large Objects with Emphasis on Archaeological Examples. Results of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2018-03-01

    This publication is a compilation of the main results and findings of an IAEA coordinated research project (CRP). In particular, it discusses an innovative variation of neutron activation analysis (NAA) known as large sample NAA (LSNAA). There is no other way to measure the bulk mass fractions of the elements present in a large sample (up to kilograms in mass) non-destructively. Examples amenable to LSNAA include irregularly shaped archaeological artefacts, excavated rock samples, large samples of assorted ore, and finished products, such as nuclear reactor components. The CRP focused primarily on the application of LSNAA in the areas of archaeology and geology; however it was also open for further exploration in other areas such as industry and life sciences as well as in basic research. The CRP contributed to establish the validation of the methodology, and, in particular, it provided an opportunity for developing trained manpower. The specific objectives of this CRP were to: i) Validate and optimize the experimental procedures for LSNAA applications in archaeology and geology; ii) Identify the needs for development or upgrade of the neutron irradiation facility for irradiation of large samples; iii) Develop and standardize data acquisition and data analysis systems; iv) Harmonize and standardize data collection from facilities with similar kind of instrumentation for further analysis and benchmarking. Advantages of LSNAA applications, limitations and scientific and technological requirements are described in this publication, which serves as a reference of interest not only to the NAA experts, research reactor personnel, and those considering this technique, but also to various stakeholders and users such as researchers, industrialists, environmental and legal experts, and administrators.

  1. Anatomical variations of the celiac trunk and hepatic arterial system: an analysis using multidetector computed tomography angiography

    Energy Technology Data Exchange (ETDEWEB)

    Araujo Neto, Severino Aires; Franca, Henrique Almeida; Mello Junior, Carlos Fernando de; Silva Neto, Eulampio Jose; Negromonte, Gustavo Ramalho Pessoa; Duarte, Claudia Martina Araujo; Cavalcanti Neto, Bartolomeu Fragoso; Farias, Rebeca Danielly da Fonseca, E-mail: severinoaires@hotmail.com [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil)

    2015-11-15

    Objective: To analyze the prevalence of anatomical variations of celiac arterial trunk (CAT) branches and hepatic arterial system (HAS), as well as the CAT diameter, length and distance to the superior mesenteric artery. Materials And Methods: Retrospective, cross-sectional and predominantly descriptive study based on the analysis of multidetector computed tomography images of 60 patients. Results: The celiac trunk anatomy was normal in 90% of cases. Hepatosplenic trunk was found in 8.3% of patients, and hepatogastric trunk in 1.7%. Variation of the HAS was observed in 21.7% of cases, including anomalous location of the right hepatic artery in 8.3% of cases, and of the left hepatic artery, in 5%. Also, cases of joint relocation of right and left hepatic arteries, and trifurcation of the proper hepatic artery were observed, respectively, in 3 (5%) and 2 (3.3%) patients. Mean length and caliber of the CAT were 2.3 cm and 0.8 cm, respectively. Mean distance between CAT and superior mesenteric artery was 1.2 cm (standard deviation = 4.08). A significant correlation was observed between CAT diameter and length, and CAT diameter and distance to superior mesenteric artery. Conclusion: The pattern of CAT variations and diameter corroborate the majority of the literature data. However, this does not happen in relation to the HAS. (author)

  2. Anatomical variations of the celiac trunk and hepatic arterial system: an analysis using multidetector computed tomography angiography

    International Nuclear Information System (INIS)

    Araujo Neto, Severino Aires; Franca, Henrique Almeida; Mello Junior, Carlos Fernando de; Silva Neto, Eulampio Jose; Negromonte, Gustavo Ramalho Pessoa; Duarte, Claudia Martina Araujo; Cavalcanti Neto, Bartolomeu Fragoso; Farias, Rebeca Danielly da Fonseca

    2015-01-01

    Objective: To analyze the prevalence of anatomical variations of celiac arterial trunk (CAT) branches and hepatic arterial system (HAS), as well as the CAT diameter, length and distance to the superior mesenteric artery. Materials And Methods: Retrospective, cross-sectional and predominantly descriptive study based on the analysis of multidetector computed tomography images of 60 patients. Results: The celiac trunk anatomy was normal in 90% of cases. Hepatosplenic trunk was found in 8.3% of patients, and hepatogastric trunk in 1.7%. Variation of the HAS was observed in 21.7% of cases, including anomalous location of the right hepatic artery in 8.3% of cases, and of the left hepatic artery, in 5%. Also, cases of joint relocation of right and left hepatic arteries, and trifurcation of the proper hepatic artery were observed, respectively, in 3 (5%) and 2 (3.3%) patients. Mean length and caliber of the CAT were 2.3 cm and 0.8 cm, respectively. Mean distance between CAT and superior mesenteric artery was 1.2 cm (standard deviation = 4.08). A significant correlation was observed between CAT diameter and length, and CAT diameter and distance to superior mesenteric artery. Conclusion: The pattern of CAT variations and diameter corroborate the majority of the literature data. However, this does not happen in relation to the HAS. (author)

  3. Variation in lumbar punctures for early onset neonatal sepsis: a nationally representative serial cross-sectional analysis, 2003-2009

    Directory of Open Access Journals (Sweden)

    Patrick Stephen W

    2012-08-01

    Full Text Available Abstract Background Whether lumbar punctures (LPs should be performed routinely for term newborns suspected of having early onset neonatal sepsis (EONS is subject to debate. It is unclear whether variations in performance of LPs for EONS may be associated with patient, hospital, insurance or regional factors. Our objective was to identify characteristics associated with the practice of performing LPs for suspected EONS in a nationally representative sample. Methods Utilizing data from the 2003, 2006 and 2009 Kids’ Inpatient Database (KID compiled by the Agency for Healthcare Research and Quality, we examined the frequency and characteristics of term, normal-birth weight newborns receiving an LP for EONS. Survey-weighting was applied for national estimates and used in chi squared and multivariable regression analysis. Results In 2009, there were 13,694 discharges for term newborns that underwent LPs for apparent EONS. Newborns having LPs performed were more likely to be covered by Medicaid vs. private insurance (51.9 vs. 45.1 percent; p Conclusions We found pronounced variation in LPs performed for EONS, even when adjusting for clinical conditions that would prompt LPs. These findings indicate practice variations in newborn care that merit further examination and explanation.

  4. Variational analysis and aerospace engineering mathematical challenges for the aerospace of the future

    CERN Document Server

    Mohammadi, Bijan; Pironneau, Olivier; Cipolla, Vittorio

    2016-01-01

    This book presents papers surrounding the extensive discussions that took place from the ‘Variational Analysis and Aerospace Engineering’ workshop held at the Ettore Majorana Foundation and Centre for Scientific Culture in 2015. Contributions to this volume focus on advanced mathematical methods in aerospace engineering and industrial engineering such as computational fluid dynamics methods, optimization methods in aerodynamics, optimum controls, dynamic systems, the theory of structures, space missions, flight mechanics, control theory, algebraic geometry for CAD applications, and variational methods and applications. Advanced graduate students, researchers, and professionals in mathematics and engineering will find this volume useful as it illustrates current collaborative research projects in applied mathematics and aerospace engineering.

  5. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  6. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    Science.gov (United States)

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  7. Multi-objective optimization and exergoeconomic analysis of a combined cooling, heating and power based compressed air energy storage system

    International Nuclear Information System (INIS)

    Yao, Erren; Wang, Huanran; Wang, Ligang; Xi, Guang; Maréchal, François

    2017-01-01

    Highlights: • A novel tri-generation based compressed air energy storage system. • Trade-off between efficiency and cost to highlight the best compromise solution. • Components with largest irreversibility and potential improvements highlighted. - Abstract: Compressed air energy storage technologies can improve the supply capacity and stability of the electricity grid, particularly when fluctuating renewable energies are massively connected. While incorporating the combined cooling, heating and power systems into compressed air energy storage could achieve stable operation as well as efficient energy utilization. In this paper, a novel combined cooling, heating and power based compressed air energy storage system is proposed. The system combines a gas engine, supplemental heat exchangers and an ammonia-water absorption refrigeration system. The design trade-off between the thermodynamic and economic objectives, i.e., the overall exergy efficiency and the total specific cost of product, is investigated by an evolutionary multi-objective algorithm for the proposed combined system. It is found that, with an increase in the exergy efficiency, the total product unit cost is less affected in the beginning, while rises substantially afterwards. The best trade-off solution is selected with an overall exergy efficiency of 53.04% and a total product unit cost of 20.54 cent/kWh, respectively. The variation of decision variables with the exergy efficiency indicates that the compressor, turbine and heat exchanger preheating the inlet air of turbine are the key equipment to cost-effectively pursuit a higher exergy efficiency. It is also revealed by an exergoeconomic analysis that, for the best trade-off solution, the investment costs of the compressor and the two heat exchangers recovering compression heat and heating up compressed air for expansion should be reduced (particularly the latter), while the thermodynamic performance of the gas engine need to be improved

  8. Investigation, sensitivity analysis, and multi-objective optimization of effective parameters on temperature and force in robotic drilling cortical bone.

    Science.gov (United States)

    Tahmasbi, Vahid; Ghoreishi, Majid; Zolfaghari, Mojtaba

    2017-11-01

    The bone drilling process is very prominent in orthopedic surgeries and in the repair of bone fractures. It is also very common in dentistry and bone sampling operations. Due to the complexity of bone and the sensitivity of the process, bone drilling is one of the most important and sensitive processes in biomedical engineering. Orthopedic surgeries can be improved using robotic systems and mechatronic tools. The most crucial problem during drilling is an unwanted increase in process temperature (higher than 47 °C), which causes thermal osteonecrosis or cell death and local burning of the bone tissue. Moreover, imposing higher forces to the bone may lead to breaking or cracking and consequently cause serious damage. In this study, a mathematical second-order linear regression model as a function of tool drilling speed, feed rate, tool diameter, and their effective interactions is introduced to predict temperature and force during the bone drilling process. This model can determine the maximum speed of surgery that remains within an acceptable temperature range. Moreover, for the first time, using designed experiments, the bone drilling process was modeled, and the drilling speed, feed rate, and tool diameter were optimized. Then, using response surface methodology and applying a multi-objective optimization, drilling force was minimized to sustain an acceptable temperature range without damaging the bone or the surrounding tissue. In addition, for the first time, Sobol statistical sensitivity analysis is used to ascertain the effect of process input parameters on process temperature and force. The results show that among all effective input parameters, tool rotational speed, feed rate, and tool diameter have the highest influence on process temperature and force, respectively. The behavior of each output parameters with variation in each input parameter is further investigated. Finally, a multi-objective optimization has been performed considering all the

  9. Towards Uniform Accelerometry Analysis: A Standardization Methodology to Minimize Measurement Bias Due to Systematic Accelerometer Wear-Time Variation

    Directory of Open Access Journals (Sweden)

    Tarun R. Katapally, Nazeem Muhajarine

    2014-06-01

    Full Text Available Accelerometers are predominantly used to objectively measure the entire range of activity intensities – sedentary behaviour (SED, light physical activity (LPA and moderate to vigorous physical activity (MVPA. However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants, jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within ‘valid’ data, there could be systematic wear-time variation. To explore this variation, accelerometer data of Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA. Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time’s influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and

  10. Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects

    Science.gov (United States)

    Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.

    2013-04-01

    This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended for working with four-dimensional objects stored in comma separated values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images. New version program summaryProgram title: Hyper-Fractal Analysis (Fractal Analysis v03) Catalogue identifier: AEEG_v3_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 745761 No. of bytes in distributed program, including test data, etc.: 12544491 Distribution format: tar.gz Programming language: MS Visual Basic 6.0 Computer: PC Operating system: MS Windows 98 or later RAM: 100M Classification: 14 Catalogue identifier of previous version: AEEG_v2_0 Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832 Does the new version supersede the previous version? Yes Nature of problem: Estimating the fractal dimension of 4D images. Solution method: Optimized implementation of the 4D box-counting algorithm. Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1, 2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended in order to support 4D objects, stored in comma separated values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df=ln(5)/ln(2) (Fig. 1). The algorithm could be extended, with minimum effort, to

  11. Restructuring of burnup sensitivity analysis code system by using an object-oriented design approach

    International Nuclear Information System (INIS)

    Kenji, Yokoyama; Makoto, Ishikawa; Masahiro, Tatsumi; Hideaki, Hyoudou

    2005-01-01

    A new burnup sensitivity analysis code system was developed with help from the object-oriented technique and written in Python language. It was confirmed that they are powerful to support complex numerical calculation procedure such as reactor burnup sensitivity analysis. The new burnup sensitivity analysis code system PSAGEP was restructured from a complicated old code system and reborn as a user-friendly code system which can calculate the sensitivity coefficients of the nuclear characteristics considering multicycle burnup effect based on the generalized perturbation theory (GPT). A new encapsulation framework for conventional codes written in Fortran was developed. This framework supported to restructure the software architecture of the old code system by hiding implementation details and allowed users of the new code system to easily calculate the burnup sensitivity coefficients. The framework can be applied to the other development projects since it is carefully designed to be independent from PSAGEP. Numerical results of the burnup sensitivity coefficient of a typical fast breeder reactor were given with components based on GPT and the multicycle burnup effects on the sensitivity coefficient were discussed. (authors)

  12. Infrared spectroscopy with multivariate analysis to interrogate endometrial tissue: a novel and objective diagnostic approach.

    Science.gov (United States)

    Taylor, S E; Cheung, K T; Patel, I I; Trevisan, J; Stringfellow, H F; Ashton, K M; Wood, N J; Keating, P J; Martin-Hirsch, P L; Martin, F L

    2011-03-01

    Endometrial cancer is the most common gynaecological malignancy in the United Kingdom. Diagnosis currently involves subjective expert interpretation of highly processed tissue, primarily using microscopy. Previous work has shown that infrared (IR) spectroscopy can be used to distinguish between benign and malignant cells in a variety of tissue types. Tissue was obtained from 76 patients undergoing hysterectomy, 36 had endometrial cancer. Slivers of endometrial tissue (tumour and tumour-adjacent tissue if present) were dissected and placed in fixative solution. Before analysis, tissues were thinly sliced, washed, mounted on low-E slides and desiccated; 10 IR spectra were obtained per slice by attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy. Derived data was subjected to principal component analysis followed by linear discriminant analysis. Post-spectroscopy analyses, tissue sections were haematoxylin and eosin-stained to provide histological verification. Using this approach, it is possible to distinguish benign from malignant endometrial tissue, and various subtypes of both. Cluster vector plots of benign (verified post-spectroscopy to be free of identifiable pathology) vs malignant tissue indicate the importance of the lipid and secondary protein structure (Amide I and Amide II) regions of the spectrum. These findings point towards the possibility of a simple objective test for endometrial cancer using ATR-FTIR spectroscopy. This would facilitate earlier diagnosis and so reduce the morbidity and mortality associated with this disease.

  13. Qualitative content analysis experiences with objective structured clinical examination among Korean nursing students.

    Science.gov (United States)

    Jo, Kae-Hwa; An, Gyeong-Ju

    2014-04-01

    The aim of this study was to explore the experiences of Korean nursing students with an objective structured clinical examination (OSCE) assessment regarding the 12 cranial nerves using qualitative content analysis. Qualitative content analysis was used to explore the subjective experiences of nursing baccalaureate students after taking the OSCE. Convenience sampling was used to select 64 4th year nursing students who were interested in taking the OSCE. The participants learned content about the 12 cranial nerve assessment by lectures, demonstrations, and videos before the OSCE. The OSCE consisted of examinations in each of three stations for 2 days. The participants wrote information about their experiences on sheets of paper immediately after the OSCE anonymously in an adjacent room. The submitted materials were analyzed via qualitative content analysis. The collected materials were classified into two themes and seven categories. One theme was "awareness of inner capabilities", which included three categories: "inner motivation", "inner confidence", and "creativity". The other theme was "barriers to nursing performance", which included four categories: "deficiency of knowledge", "deficiency of communication skill", "deficiency of attitude toward comfort", and "deficiency of repetitive practice". This study revealed that the participants simultaneously experienced the potential and deficiency of their nursing competency after an OSCE session on cranial nerves. OSCE also provided the opportunity for nursing students to realize nursing care in a holistic manner unlike concern that OSCE undermines holism. © 2013 The Authors. Japan Journal of Nursing Science © 2013 Japan Academy of Nursing Science.

  14. Multi-objective optimization and grey relational analysis on configurations of organic Rankine cycle

    International Nuclear Information System (INIS)

    Wang, Y.Z.; Zhao, J.; Wang, Y.; An, Q.S.

    2017-01-01

    Highlights: • Pareto frontier is an effective way to make comprehensive comparison of ORC. • Comprehensive performance from energy and economics of basic ORC is the best. • R141b shows the best comprehensive performance from energy and economics. - Abstract: Concerning the comprehensive performance of organic Rankine cycle (ORC), comparisons and optimizations on 3 different configurations of ORC (basic, regenerative and extractive ORCs) are investigated in this paper. Medium-temperature geothermal water is used for comparing the influence of configurations, working fluids and operating parameters on different evaluation criteria. Different evaluation and optimization methods are adopted in evaluation of ORCs to obtain the one with the best comprehensive performance, such as exergoeconomic analysis, bi-objective optimization and grey relational analysis. The results reveal that the basic ORC performs the best among these 3 ORCs in terms of comprehensive thermodynamic and economic performances when using R245fa and driven by geothermal water at 150 °C. Furthermore, R141b shows the best comprehensive performance among 14 working fluids based on the Pareto frontier solutions without considering safe factors. Meanwhile, R141b is the best among all 14 working fluids with the optimal comprehensive performance when regarding all the evaluation criteria as equal by using grey relational analysis.

  15. Gaming Change: A Many-objective Analysis of Water Supply Portfolios under Uncertainty

    Science.gov (United States)

    Reed, P. M.; Kasprzyk, J.; Characklis, G.; Kirsch, B.

    2008-12-01

    This study explores the uncertainty and tradeoffs associated with up to six conflicting water supply portfolio planning objectives. A ten-year Monte Carlo simulation model is used to evaluate water supply portfolios blending permanent rights, adaptive options contracts, and spot leases for a single city in the Lower Rio Grande Valley. Historical records of reservoir mass balance, lease pricing, and demand serve as the source data for the Monte Carlo simulation. Portfolio planning decisions include the initial volume and annual increases of permanent rights, thresholds for an adaptive options contract, and anticipatory decision rules for purchasing leases and exercising options. Our work distinguishes three cases: (1) permanent rights as the sole source of supply, (2) permanent rights and adaptive options, and (3) a combination of permanent rights, adaptive options, and leases. The problems have been formulated such that cases 1 and 2 are sub-spaces of the six objective formulation used for case 3. Our solution sets provide the tradeoff surfaces between portfolios' expected values for cost, cost variability, reliability, frequency of purchasing permanent rights increases, frequency of using leases, and dropped (or unused) transfers of water. The tradeoff surfaces for the three cases show that options and leases have a dramatic impact on the marginal costs associated with improving the efficiency and reliability of urban water supplies. Moreover, our many-objective analysis permits the discovery of a broad range of high quality portfolio strategies. We differentiate the value of adaptive options versus leases by testing a representative subset of optimal portfolios' abilities to effectively address regional increases in demand during drought periods. These results provide insights into the tradeoffs inherent to a more flexible, portfolio-style approach to urban water resources management, an approach that should become increasingly attractive in an environment of

  16. A REGION-BASED MULTI-SCALE APPROACH FOR OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    T. Kavzoglu

    2016-06-01

    Full Text Available Within the last two decades, object-based image analysis (OBIA considering objects (i.e. groups of pixels instead of pixels has gained popularity and attracted increasing interest. The most important stage of the OBIA is image segmentation that groups spectrally similar adjacent pixels considering not only the spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights to be set by the analyst, scale parameter stands out the most important parameter in segmentation process. Estimating optimal scale parameter is crucially important to increase the classification accuracy that depends on image resolution, image object size and characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using pan-sharped Qickbird-2 image. The first strategy estimates optimal scale parameters for the eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse determined from the LV-RoC graph calculated for whole image. The nearest neighbour classifier was applied in all segmentation experiments and equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient. Comparison of region-based and image-based segmentation was carried out on the classified images and found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached to 10% in terms of overall accuracy.

  17. Analysis of genetic variation and potential applications in genome-scale metabolic modeling

    DEFF Research Database (Denmark)

    Cardoso, Joao; Andersen, Mikael Rørdam; Herrgard, Markus

    2015-01-01

    scale and resolution by re-sequencing thousands of strains systematically. In this article, we review challenges in the integration and analysis of large-scale re-sequencing data, present an extensive overview of bioinformatics methods for predicting the effects of genetic variants on protein function......Genetic variation is the motor of evolution and allows organisms to overcome the environmental challenges they encounter. It can be both beneficial and harmful in the process of engineering cell factories for the production of proteins and chemicals. Throughout the history of biotechnology......, there have been efforts to exploit genetic variation in our favor to create strains with favorable phenotypes. Genetic variation can either be present in natural populations or it can be artificially created by mutagenesis and selection or adaptive laboratory evolution. On the other hand, unintended genetic...

  18. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Full Text Available Purpose. The development of complicated techniques of production and management processes, information systems, computer science, applied objects of systems theory and others requires improvement of mathematical methods, new approaches for researches of application systems. And the variety and diversity of subject systems makes necessary the development of a model that generalizes the classical sets and their development – sets of sets. Multiple objects unlike sets are constructed by multiple structures and represented by the structure and content. The aim of the work is the analysis of multiple structures, generating multiple objects, the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the researches, the structure of multiple objects represents as constructive trio, consisting of media, signatures and axiomatic. Multiple object is determined by the structure and content, as well as represented by hybrid superposition, composed of sets, multi-sets, ordered sets (lists and heterogeneous sets (sequences, corteges. Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, proposed assessments of their complexity, shown the rules of internal and external operations on objects of implementation. We introduce the relation of arbitrary order over multiple objects, we define the description of functions and display on objects of multiple structures. Originality.In this paper we consider the development of multiple structures, generating multiple objects.Practical value. The transition from the abstract to the subject of multiple structures requires the transformation of the system and multiple objects. Transformation involves three successive stages: specification (binding to the domain, interpretation (multiple sites and particularization (goals. The proposed describe systems approach based on hybrid sets

  19. Systematic analysis of the heat exchanger arrangement problem using multi-objective genetic optimization

    International Nuclear Information System (INIS)

    Daróczy, László; Janiga, Gábor; Thévenin, Dominique

    2014-01-01

    A two-dimensional cross-flow tube bank heat exchanger arrangement problem with internal laminar flow is considered in this work. The objective is to optimize the arrangement of tubes and find the most favorable geometries, in order to simultaneously maximize the rate of heat exchange while obtaining a minimum pressure loss. A systematic study was performed involving a large number of simulations. The global optimization method NSGA-II was retained. A fully automatized in-house optimization environment was used to solve the problem, including mesh generation and CFD (computational fluid dynamics) simulations. The optimization was performed in parallel on a Linux cluster with a very good speed-up. The main purpose of this article is to illustrate and analyze a heat exchanger arrangement problem in its most general form and to provide a fundamental understanding of the structure of the Pareto front and optimal geometries. The considered conditions are particularly suited for low-power applications, as found in a growing number of practical systems in an effort toward increasing energy efficiency. For such a detailed analysis with more than 140 000 CFD-based evaluations, a design-of-experiment study involving a response surface would not be sufficient. Instead, all evaluations rely on a direct solution using a CFD solver. - Highlights: • Cross-flow tube bank heat exchanger arrangement problem. • A fully automatized multi-objective optimization based on genetic algorithm. • A systematic study involving a large number of CFD (computational fluid dynamics) simulations

  20. Energy regulation in China: Objective selection, potential assessment and responsibility sharing by partial frontier analysis

    International Nuclear Information System (INIS)

    Xia, X.H.; Chen, Y.B.; Li, J.S.; Tasawar, H.; Alsaedi, A.; Chen, G.Q.

    2014-01-01

    To cope with the excessive growth of energy consumption, the Chinese government has been trying to strengthen the energy regulation system by introducing new initiatives that aim at controlling the total amount of energy consumption. A partial frontier analysis is performed in this paper to make a comparative assessment of the combinations of possible energy conservation objectives, new constraints and regulation strategies. According to the characteristics of the coordination of existing regulation structure and the optimality of regulation strategy, four scenarios are constructed and regional responsibilities are reasonably divided by fully considering the production technology in the economy. The relative importance of output objectives and the total amount controlling is compared and the impacts on the regional economy caused by the changes of regulation strategy are also evaluated for updating regulation policy. - Highlights: • New initiatives to control the total amount of energy consumption are evaluated. • Twenty-four regulation strategies and four scenarios are designed and compared. • Crucial regions for each sector and regional potential are identified. • The national goals of energy abatement are decomposed into regional responsibilities. • The changes of regulation strategy are evaluated for updating regulation policy

  1. Testing and injury potential analysis of rollovers with narrow object impacts.

    Science.gov (United States)

    Meyer, Steven E; Forrest, Stephen; Herbst, Brian; Hayden, Joshua; Orton, Tia; Sances, Anthony; Kumaresan, Srirangam

    2004-01-01

    Recent statistics highlight the significant risk of serious and fatal injuries to occupants involved in rollover collisions due to excessive roof crush. The government has reported that in 2002. Sports Utility Vehicle rollover related fatalities increased by 14% to more than 2400 annually. 61% of all SUV fatalities included rollovers [1]. Rollover crashes rely primarily upon the roof structures to maintain occupant survival space. Frequently these crashes occur off the travel lanes of the roadway and, therefore, can include impacts with various types of narrow objects such as light poles, utility poles and/or trees. A test device and methodology is presented which facilitates dynamic, repeatable rollover impact evaluation of complete vehicle roof structures with such narrow objects. These tests allow for the incorporation of Anthropomorphic Test Dummies (ATDs) which can be instrumented to measure accelerations, forces and moments to evaluate injury potential. High-speed video permits for detailed analysis of occupant kinematics and evaluation of injury causation. Criteria such as restraint performance, injury potential, survival space and the effect of roof crush associated with various types of design alternatives, countermeasures and impact circumstances can also be evaluated. In addition to presentation of the methodology, two representative vehicle crash tests are also reported. Results indicated that the reinforced roof structure significantly reduced the roof deformation compared to the production roof structure.

  2. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of the geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs a fast feature extraction and matching by combining the local difference binary descriptor and the local sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts by the definition of an initial partition obtained by an over-segmentation algorithm, i.e., the simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to some criterions, such as the uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  3. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

    Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of the geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA. More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs a fast feature extraction and matching by combining the local difference binary descriptor and the local sensitive hashing. We then use a Binary Partition Tree (BPT representation for the large mosaicked panoramic image, which starts by the definition of an initial partition obtained by an over-segmentation algorithm, i.e., the simple linear iterative clustering (SLIC. Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to some criterions, such as the uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya’an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  4. Accelerometry-based gait analysis, an additional objective approach to screen subjects at risk for falling.

    Science.gov (United States)

    Senden, R; Savelberg, H H C M; Grimm, B; Heyligers, I C; Meijer, K

    2012-06-01

    This study investigated whether the Tinetti scale, as a subjective measure for fall risk, is associated with objectively measured gait characteristics. It is studied whether gait parameters are different for groups that are stratified for fall risk using the Tinetti scale. Moreover, the discriminative power of gait parameters to classify elderly according to the Tinetti scale is investigated. Gait of 50 elderly with a Tinneti>24 and 50 elderly with a Tinetti≤24 was analyzed using acceleration-based gait analysis. Validated algorithms were used to derive spatio-temporal gait parameters, harmonic ratio, inter-stride amplitude variability and root mean square (RMS) from the accelerometer data. Clear differences in gait were found between the groups. All gait parameters correlated with the Tinetti scale (r-range: 0.20-0.73). Only walking speed, step length and RMS showed moderate to strong correlations and high discriminative power to classify elderly according to the Tinetti scale. It is concluded that subtle gait changes that have previously been related to fall risk are not captured by the subjective assessment. It is therefore worthwhile to include objective gait assessment in fall risk screening. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Scanpath-based analysis of objects conspicuity in context of human vision physiology.

    Science.gov (United States)

    Augustyniak, Piotr

    2007-01-01

    This paper discusses principal aspects of objects conspicuity investigated with use of an eye tracker and interpreted on the background of human vision physiology. Proper management of objects conspicuity is fundamental in several leading edge applications in the information society like advertisement, web design, man-machine interfacing and ergonomics. Although some common rules of human perception are applied since centuries in the art, the interest of human perception process is motivated today by the need of gather and maintain the recipient attention by putting selected messages in front of the others. Our research uses the visual tasks methodology and series of progressively modified natural images. The modifying details were attributed by their size, color and position while the scanpath-derived gaze points confirmed or not the act of perception. The statistical analysis yielded the probability of detail perception and correlations with the attributes. This probability conforms to the knowledge about the retina anatomy and perception physiology, although we use noninvasive methods only.

  6. Objective and subjective analysis of women's voice with idiopathic Parkinson's disease

    Directory of Open Access Journals (Sweden)

    Riviana Rodrigues das Graças

    2012-07-01

    Full Text Available OBJECTIVE: To compare the voice quality of women with idiopathic Parkinson's disease and those without it. METHODS: An evaluation was performed including 19 female patients diagnosed with idiopathic Parkinson's disease, with an average age of 66 years, and 27 women with an average of 67 years-old in the Control Group. The assessment was performed by computed acoustic analysis and perceptual evaluation. RESULTS: Parkinson's disease patients presented moderate rough and unstable voice quality. The parameters of grade, roughness, and instability had higher scores in Parkinson's disease patients with statistically significant differences. Acoustic measures of Jitter and period perturbation quotient (PPQ significantly differ between groups. CONCLUSIONS: Parkinson's disease female individuals showed more vocal alterations compared to the Control Group, when both perceptual and acoustic evaluations were analyzed.

  7. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    Science.gov (United States)

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

    The purpose of the water resources comprehensive benefits analysis is to maximize the comprehensive benefits on the aspects of social, economic and ecological environment. Aiming at the defects of the traditional analytic hierarchy process in the evaluation of water resources, it proposed a comprehensive benefit evaluation of social, economic and environmental benefits index from the perspective of water resources comprehensive benefit in the social system, economic system and environmental system; determined the index weight by the improved fuzzy analytic hierarchy process (AHP), calculated the relative index of water resources comprehensive benefit and analyzed the comprehensive benefit of water resources in Xiangshui County by the multi-objective evaluation model. Based on the water resources data in Xiangshui County, 20 main comprehensive benefit assessment factors of 5 districts belonged to Xiangshui County were evaluated. The results showed that the comprehensive benefit of Xiangshui County was 0.7317, meanwhile the social economy has a further development space in the current situation of water resources.

  8. Analysis of art objects by means of ion beam induced luminescence

    International Nuclear Information System (INIS)

    Quaranta, A; Dran, J C; Salomon, J; Pivin, J C; Vomiero, A; Tonezzer, M; Maggioni, G; Carturan, S; Mea, G Della

    2006-01-01

    The impact of energetic ions on solid samples gives rise to the emission of visible light owing to the electronic excitation of intrinsic defects or extrinsic impurities. The intensity and position of the emission features provide information on the nature of the luminescence centers and on their chemical environments. This makes ion beam induced luminescence (IBIL) a useful complement to other ion beam analyses, like PIXE, in the cultural heritage field in characterizing the composition and the provenience of art objects. In the present paper, IBIL measurements have been performed on inorganic pigments for underlying the complementary role played by IBIL in the analysis of artistic works. Some blue and red pigment has been presented as case study

  9. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  10. Analysis of interfraction and intrafraction variation during tangential breast irradiation with an electronic portal imaging device

    International Nuclear Information System (INIS)

    Smith, Ryan P.; Bloch, Peter; Harris, Eleanor E.; McDonough, James; Sarkar, Abhirup; Kassaee, Alireza; Avery, Steven; Solin, Lawrence J.

    2005-01-01

    Purpose: To evaluate the daily setup variation and the anatomic movement of the heart and lungs during breast irradiation with tangential photon beams, as measured with an electronic portal imaging device. Methods and materials: Analysis of 1,709 portal images determined changes in the radiation field during a treatment course in 8 patients. Values obtained for every image included central lung distance (CLD) and area of lung and heart within the irradiated field. The data from these measurements were used to evaluate variation from setup between treatment days and motion due to respiration and/or patient movement during treatment delivery. Results: The effect of respiratory motion and movement during treatment was minimal: the maximum range in CLD for any patient on any day was 0.25 cm. The variation caused by day-to-day setup variation was greater, with CLD values for patients ranging from 0.59 cm to 2.94 cm. Similar findings were found for heart and lung areas. Conclusions: There is very little change in CLD and corresponding lung and heart area during individual radiation treatment fractions in breast tangential fields, compared with a relatively greater amount of variation that occurs between days

  11. AFLP analysis of Cynodon dactylon (L.) Pers. var. dactylon genetic variation.

    Science.gov (United States)

    Wu, Y Q; Taliaferro, C M; Bai, G H; Anderson, M P

    2004-08-01

    Cynodon dactylon (L.) Pers. var. dactylon (common bermudagrass) is geographically widely distributed between about lat 45 degrees N and lat 45 degrees S, penetrating to about lat 53 degrees N in Europe. The extensive variation of morphological and adaptive characteristics of the taxon is substantially documented, but information is lacking on DNA molecular variation in geographically disparate forms. Accordingly, this study was conducted to assess molecular genetic variation and genetic relatedness among 28 C. dactylon var. dactylon accessions originating from 11 countries on 4 continents (Africa, Asia, Australia, and Europe). A fluorescence-labeled amplified fragment length polymorphism (AFLP) DNA profiling method was used to detect the genetic diversity and relatedness. On the basis of 443 polymorphic AFLP fragments from 8 primer combinations, the accessions were grouped into clusters and subclusters associating with their geographic origins. Genetic similarity coefficients (SC) for the 28 accessions ranged from 0.53 to 0.98. Accessions originating from Africa, Australia, Asia, and Europe formed major groupings as indicated by cluster and principal coordinate analysis. Accessions from Australia and Asia, though separately clustered, were relatively closely related and most distantly related to accessions of European origin. African accessions formed two distant clusters and had the greatest variation in genetic relatedness relative to accessions from other geographic regions. Sampling the full extent of genetic variation in C. dactylon var. dactylon would require extensive germplasm collection in the major geographic regions of its distributional range.

  12. Analysis of Geomagnetic Field Variations during Total Solar Eclipses Using INTERMAGNET Data

    Science.gov (United States)

    KIM, J. H.; Chang, H. Y.

    2017-12-01

    We investigate variations of the geomagnetic field observed by INTERMAGNET geomagnetic observatories over which the totality path passed during a solar eclipse. We compare results acquired by 6 geomagnetic observatories during the 4 total solar eclipses (11 August 1999, 1 August 2008, 11 July 2010, and 20 March 2015) in terms of geomagnetic and solar ecliptic parameters. These total solar eclipses are the only total solar eclipse during which the umbra of the moon swept an INTERMAGNET geomagnetic observatory and simultaneously variations of the geomagnetic field are recorded. We have confirmed previous studies that increase BY and decreases of BX, BZ and F are conspicuous. Interestingly, we have noted that variations of geomagnetic field components observed during the total solar eclipse at Isla de Pascua Mataveri (Easter Island) in Chile (IPM) in the southern hemisphere show distinct decrease of BY and increases of BX and BZ on the contrary. We have found, however, that variations of BX, BY, BZ and F observed at Hornsund in Norway (HRN) seem to be dominated by other geomagnetic occurrence. In addition, we have attempted to obtain any signatures of influence on the temporal behavior of the variation in the geomagnetic field signal during the solar eclipse by employing the wavelet analysis technique. Finally, we conclude by pointing out that despite apparent success a more sophisticate and reliable algorithm is required before implementing to make quantitative comparisons.

  13. Infant search and object permanence: a meta-analysis of the A-not-B error.

    Science.gov (United States)

    Wellman, H M; Cross, D; Bartsch, K

    1987-01-01

    Research on Piaget's stage 4 object concept has failed to reveal a clear or consistent pattern of results. Piaget found that 8-12-month-old infants would make perserverative errors; his explanation for this phenomenon was that the infant's concept of the object was contextually dependent on his or her actions. Some studies designed to test Piaget's explanation have replicated Piaget's basic finding, yet many have found no preference for the A location or the B location or an actual preference for the B location. More recently, researchers have attempted to uncover the causes for these results concerning the A-not-B error. Again, however, different studies have yielded different results, and qualitative reviews have failed to yield a consistent explanation for the results of the individual studies. This state of affairs suggests that the phenomenon may simply be too complex to be captured by individual studies varying 1 factor at a time and by reviews based on similar qualitative considerations. Therefore, the current investigation undertook a meta-analysis, a synthesis capturing the quantitative information across the now sizable number of studies. We entered several important factors into the meta-analysis, including the effects of age, the number of A trials, the length of delay between hiding and search, the number of locations, the distances between locations, and the distinctive visual properties of the hiding arrays. Of these, the analysis consistently indicated that age, delay, and number of hiding locations strongly influence infants' search. The pattern of specific findings also yielded new information about infant search. A general characterization of the results is that, at every age, both above-chance and below-chance performance was observed. That is, at each age at least 1 combination of delay and number of locations yielded above-chance A-not-B errors or significant perseverative search. At the same time, at each age at least 1 alternative

  14. Variation compensation and analysis on diaphragm curvature analysis for emphysema quantification on whole lung CT scans

    Science.gov (United States)

    Keller, Brad M.; Reeves, Anthony P.; Barr, R. Graham; Yankelevitz, David F.; Henschke, Claudia I.

    2010-03-01

    CT scans allow for the quantitative evaluation of the anatomical bases of emphysema. Recently, a non-density based geometric measurement of lung diagphragm curvature has been proposed as a method for the quantification of emphysema from CT. This work analyzes variability of diaphragm curvature and evaluates the effectiveness of a compensation methodology for the reduction of this variability as compared to emphysema index. Using a dataset of 43 scan-pairs with less than a 100 day time-interval between scans, we find that the diaphragm curvature had a trend towards lower overall variability over emphysema index (95% CI:-9.7 to + 14.7 vs. -15.8 to +12.0), and that the variation of both measures was reduced after compensation. We conclude that the variation of the new measure can be considered comparable to the established measure and the compensation can reduce the apparent variation of quantitative measures successfully.

  15. Quantification and Analysis of Icebergs in a Tidewater Glacier Fjord Using an Object-Based Approach.

    Directory of Open Access Journals (Sweden)

    Robert W McNabb

    Full Text Available Tidewater glaciers are glaciers that terminate in, and calve icebergs into, the ocean. In addition to the influence that tidewater glaciers have on physical and chemical oceanography, floating icebergs serve as habitat for marine animals such as harbor seals (Phoca vitulina richardii. The availability and spatial distribution of glacier ice in the fjords is likely a key environmental variable that influences the abundance and distribution of selected marine mammals; however, the amount of ice and the fine-scale characteristics of ice in fjords have not been systematically quantified. Given the predicted changes in glacier habitat, there is a need for the development of methods that could be broadly applied to quantify changes in available ice habitat in tidewater glacier fjords. We present a case study to describe a novel method that uses object-based image analysis (OBIA to classify floating glacier ice in a tidewater glacier fjord from high-resolution aerial digital imagery. Our objectives were to (i develop workflows and rule sets to classify high spatial resolution airborne imagery of floating glacier ice; (ii quantify the amount and fine-scale characteristics of floating glacier ice; (iii and develop processes for automating the object-based analysis of floating glacier ice for large number of images from a representative survey day during June 2007 in Johns Hopkins Inlet (JHI, a tidewater glacier fjord in Glacier Bay National Park, southeastern Alaska. On 18 June 2007, JHI was comprised of brash ice ([Formula: see text] = 45.2%, SD = 41.5%, water ([Formula: see text] = 52.7%, SD = 42.3%, and icebergs ([Formula: see text] = 2.1%, SD = 1.4%. Average iceberg size per scene was 5.7 m2 (SD = 2.6 m2. We estimate the total area (± uncertainty of iceberg habitat in the fjord to be 455,400 ± 123,000 m2. The method works well for classifying icebergs across scenes (classification accuracy of 75.6%; the largest classification errors occur in areas

  16. Exploring the impact of learning objects in middle school mathematics and science classrooms: A formative analysis

    Directory of Open Access Journals (Sweden)

    Robin H. Kay

    2008-12-01

    Full Text Available The current study offers a formative analysis of the impact of learning objects in middle school mathematics and science classrooms. Five reliable and valid measure of effectiveness were used to examine the impact of learning objects from the perspective of 262 students and 8 teachers (14 classrooms in science or mathematics. The results indicate that teachers typically spend 1-2 hours finding and preparing for learning-object based lesson plans that focus on the review of previous concepts. Both teachers and students are positive about the learning benefits, quality, and engagement value of learning objects, although teachers are more positive than students. Student performance increased significantly, over 40%, when learning objects were used in conjunction with a variety of teaching strategies. It is reasonable to conclude that learning objects have potential as a teaching tool in a middle school environment. L’impacte des objets d’apprentissage dans les classes de mathématique et de sciences à l’école intermédiaire : une analyse formative Résumé : Cette étude présente une analyse formative de l’impacte des objets d’apprentissage dans les classes de mathématique et de sciences à l’école intermédiaire. Cinq mesures de rendement fiables et valides ont été exploitées pour examiner l’effet des objets d’apprentissage selon 262 élèves et 8 enseignants (414 classes en science ou mathématiques. Les résultats indiquent que les enseignants passent typiquement 1-2 heures pour trouver des objets d’apprentissage et préparer les leçons associées qui seraient centrées sur la revue de concepts déjà vus en classe. Quoique les enseignants aient répondu de façon plus positive que les élèves, les deux groupes ont répondu positivement quant aux avantages au niveau de l’apprentissage, à la qualité ainsi qu’à la valeur motivationnelle des objets d’apprentissage. Le rendement des élèves aurait aussi augment

  17. A systematic review and meta-analysis of variations in branching patterns of the adult aortic arch.

    Science.gov (United States)

    Popieluszko, Patrick; Henry, Brandon Michael; Sanna, Beatrice; Hsieh, Wan Chin; Saganiak, Karolina; Pękala, Przemysław A; Walocha, Jerzy A; Tomaszewski, Krzysztof A

    2018-07-01

    The aortic arch (AA) is the main conduit of the left side of the heart, providing a blood supply to the head, neck, and upper limbs. As it travels through the thorax, the pattern in which it gives off the branches to supply these structures can vary. Variations of these branching patterns have been studied; however, a study providing a comprehensive incidence of these variations has not yet been conducted. The objective of this study was to perform a meta-analysis of all the studies that report prevalence data on AA variants and to provide incidence data on the most common variants. A systematic search of online databases including PubMed, Embase, Scopus, ScienceDirect, Web of Science, SciELO, BIOSIS, and CNKI was performed for literature describing incidence of AA variations in adults. Studies including prevalence data on adult patients or cadavers were collected and their data analyzed. A total of 51 articles were included (N = 23,882 arches). Seven of the most common variants were analyzed. The most common variants found included the classic branching pattern, defined as a brachiocephalic trunk, a left common carotid, and a left subclavian artery (80.9%); the bovine arch variant (13.6%); and the left vertebral artery variant (2.8%). Compared by geographic data, bovine arch variants were noted to have a prevalence as high as 26.8% in African populations. Although patients who have an AA variant are often asymptomatic, they compose a significant portion of the population of patients and pose a greater risk of hemorrhage and ischemia during surgery in the thorax. Because of the possibility of encountering such variants, it is prudent for surgeons to consider potential variations in planning procedures, especially of an endovascular nature, in the thorax. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  18. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong

    2008-03-15

    In this work, objective oriented design of generic system analysis code has been tried based on the previous works in KAERI for two phase three field Pilot code. It has been performed to implement of input and output design, TH solver, component model, special TH models, heat structure solver, general table, trip and control, and on-line graphics. All essential features for system analysis has been designed and implemented in the final product SYSTF code. The computer language C was used for implementation in the Visual studio 2008 IDE (Integrated Development Environment) since it has easier and lighter than C++ feature. The code has simple and essential features of models and correlation, special component, special TH model and heat structure model. However the input features is able to simulate the various scenarios, such as steady state, non LOCA transient and LOCA accident. The structure validity has been tested through the various verification tests and it has been shown that the developed code can treat the non LOCA and LOCA simulation. However more detailed design and implementation of models are required to get the physical validity of SYSTF code simulation.

  19. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    International Nuclear Information System (INIS)

    Chung, Bub Dong

    2008-03-01

    In this work, objective oriented design of generic system analysis code has been tried based on the previous works in KAERI for two phase three field Pilot code. It has been performed to implement of input and output design, TH solver, component model, special TH models, heat structure solver, general table, trip and control, and on-line graphics. All essential features for system analysis has been designed and implemented in the final product SYSTF code. The computer language C was used for implementation in the Visual studio 2008 IDE (Integrated Development Environment) since it has easier and lighter than C++ feature. The code has simple and essential features of models and correlation, special component, special TH model and heat structure model. However the input features is able to simulate the various scenarios, such as steady state, non LOCA transient and LOCA accident. The structure validity has been tested through the various verification tests and it has been shown that the developed code can treat the non LOCA and LOCA simulation. However more detailed design and implementation of models are required to get the physical validity of SYSTF code simulation

  20. Ethical objections against including life-extension costs in cost-effectiveness analysis: a consistent approach.

    Science.gov (United States)

    Gandjour, Afschin; Müller, Dirk

    2014-10-01

    One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require-for reasons of consistency-a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require-again for reasons of consistency-the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness.

  1. Objective Analysis of Performance of Activities of Daily Living in People With Central Field Loss.

    Science.gov (United States)

    Pardhan, Shahina; Latham, Keziah; Tabrett, Daryl; Timmis, Matthew A

    2015-11-01

    People with central visual field loss (CFL) adopt various strategies to complete activities of daily living (ADL). Using objective movement analysis, we compared how three ADLs were completed by people with CFL compared with age-matched, visually healthy individuals. Fourteen participants with CFL (age 81 ± 10 years) and 10 age-matched, visually healthy (age 75 ± 5 years) participated. Three ADLs were assessed: pick up food from a plate, pour liquid from a bottle, and insert a key in a lock. Participants with CFL completed each ADL habitually (as they would in their home). Data were compared with visually healthy participants who were asked to complete the tasks as they would normally, but under specified experimental conditions. Movement kinematics were compared using three-dimension motion analysis (Vicon). Visual functions (distance and near acuities, contrast sensitivity, visual fields) were recorded. All CFL participants were able to complete each ADL. However, participants with CFL demonstrated significantly (P approach. Various kinematic indices correlated significantly to visual function parameters including visual acuity and midperipheral visual field loss.

  2. Aligning experimental design with bioinformatics analysis to meet discovery research objectives.

    Science.gov (United States)

    Kane, Michael D

    2002-01-01

    The utility of genomic technology and bioinformatic analytical support to provide new and needed insight into the molecular basis of disease, development, and diversity continues to grow as more research model systems and populations are investigated. Yet deriving results that meet a specific set of research objectives requires aligning or coordinating the design of the experiment, the laboratory techniques, and the data analysis. The following paragraphs describe several important interdependent factors that need to be considered to generate high quality data from the microarray platform. These factors include aligning oligonucleotide probe design with the sample labeling strategy if oligonucleotide probes are employed, recognizing that compromises are inherent in different sample procurement methods, normalizing 2-color microarray raw data, and distinguishing the difference between gene clustering and sample clustering. These factors do not represent an exhaustive list of technical variables in microarray-based research, but this list highlights those variables that span both experimental execution and data analysis. Copyright 2001 Wiley-Liss, Inc.

  3. Variation and diversity in Homo erectus: a 3D geometric morphometric analysis of the temporal bone.

    Science.gov (United States)

    Terhune, Claire E; Kimbel, William H; Lockwood, Charles A

    2007-07-01

    Although the level of taxonomic diversity within the fossil hominin species Homo erectus (sensu lato) is continually debated, there have been relatively few studies aiming to quantify the morphology of this species. Instead, most researchers have relied on qualitative descriptions or the evaluation of nonmetric characters, which in many cases display continuous variation. Also, only a few studies have used quantitative data to formally test hypotheses regarding the taxonomic composition of the "erectus" hypodigm. Despite these previous analyses, however, and perhaps in part due to these varied approaches for assessing variation within specimens typically referred to H. erectus (sensu lato) and the general lack of rigorous statistical testing of how variation within this taxon is partitioned, there is currently little consensus regarding whether this group is a single species, or whether it should instead be split into separate temporal or geographically delimited taxa. In order to evaluate possible explanations for variation within H. erectus, we tested the general hypothesis that variation within the temporal bone morphology of H. erectus is consistent with that of a single species, using great apes and humans as comparative taxa. Eighteen three-dimensional (3D) landmarks of the temporal bone were digitized on a total of 520 extant and fossil hominid crania. Landmarks were registered by Generalized Procrustes Analysis, and Procrustes distances were calculated for comparisons of individuals within and between the extant taxa. Distances between fossil specimens and between a priori groupings of fossils were then compared to the distances calculated within the extant taxa to assess the variation within the H. erectus sample relative to that of known species, subspecies, and populations. Results of these analyses indicate that shape variation within the entire H. erectus sample is generally higher than extant hominid intraspecific variation, and putative H. ergaster

  4. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    Science.gov (United States)

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most

  5. Advances in variational and hemivariational inequalities theory, numerical analysis, and applications

    CERN Document Server

    Migórski, Stanisław; Sofonea, Mircea

    2015-01-01

    Highlighting recent advances in variational and hemivariational inequalities with an emphasis on theory, numerical analysis and applications, this volume serves as an indispensable resource to graduate students and researchers interested in the latest results from recognized scholars in this relatively young and rapidly-growing field. Particularly, readers will find that the volume’s results and analysis present valuable insights into the fields of pure and applied mathematics, as well as civil, aeronautical, and mechanical engineering. Researchers and students will find new results on well posedness to stationary and evolutionary inequalities and their rigorous proofs. In addition to results on modeling and abstract problems, the book contains new results on the numerical methods for variational and hemivariational inequalities. Finally, the applications presented illustrate the use of these results in the study of miscellaneous mathematical models which describe the contact between deformable bodies and a...

  6. Functional analysis and applied optimization in Banach spaces applications to non-convex variational models

    CERN Document Server

    Botelho, Fabio

    2014-01-01

    This book introduces the basic concepts of real and functional analysis. It presents the fundamentals of the calculus of variations, convex analysis, duality, and optimization that are necessary to develop applications to physics and engineering problems. The book includes introductory and advanced concepts in measure and integration, as well as an introduction to Sobolev spaces. The problems presented are nonlinear, with non-convex variational formulation. Notably, the primal global minima may not be attained in some situations, in which cases the solution of the dual problem corresponds to an appropriate weak cluster point of minimizing sequences for the primal one. Indeed, the dual approach more readily facilitates numerical computations for some of the selected models. While intended primarily for applied mathematicians, the text will also be of interest to engineers, physicists, and other researchers in related fields.

  7. Characterization of analysis activity in the development of object-oriented software. Application to a examination system in nuclear medicine

    International Nuclear Information System (INIS)

    Bayas, Marcos Raul Cordova.

    1995-01-01

    The object-oriented approach, formerly proposed as an alternative to conventional software coding techniques, has expanded its scope to other phases in software development, including the analysis phase. This work discusses basic concepts and major object oriented analysis methods, drawing comparisons with structured analysis, which has been the dominant paradigm in systems analysis. The comparison is based on three interdependent system aspects, that must be specified during the analysis phase: data, control and functionality. The specification of a radioisotope examination archive system is presented as a case study. (author). 45 refs., 87 figs., 1 tab

  8. Meningococcal genetic variation mechanisms viewed through comparative analysis of serogroup C strain FAM18.

    Directory of Open Access Journals (Sweden)

    Stephen D Bentley

    2007-02-01

    Full Text Available The bacterium Neisseria meningitidis is commonly found harmlessly colonising the mucosal surfaces of the human nasopharynx. Occasionally strains can invade host tissues causing septicaemia and meningitis, making the bacterium a major cause of morbidity and mortality in both the developed and developing world. The species is known to be diverse in many ways, as a product of its natural transformability and of a range of recombination and mutation-based systems. Previous work on pathogenic Neisseria has identified several mechanisms for the generation of diversity of surface structures, including phase variation based on slippage-like mechanisms and sequence conversion of expressed genes using information from silent loci. Comparison of the genome sequences of two N. meningitidis strains, serogroup B MC58 and serogroup A Z2491, suggested further mechanisms of variation, including C-terminal exchange in specific genes and enhanced localised recombination and variation related to repeat arrays. We have sequenced the genome of N. meningitidis strain FAM18, a representative of the ST-11/ET-37 complex, providing the first genome sequence for the disease-causing serogroup C meningococci; it has 1,976 predicted genes, of which 60 do not have orthologues in the previously sequenced serogroup A or B strains. Through genome comparison with Z2491 and MC58 we have further characterised specific mechanisms of genetic variation in N. meningitidis, describing specialised loci for generation of cell surface protein variants and measuring the association between noncoding repeat arrays and sequence variation in flanking genes. Here we provide a detailed view of novel genetic diversification mechanisms in N. meningitidis. Our analysis provides evidence for the hypothesis that the noncoding repeat arrays in neisserial genomes (neisserial intergenic mosaic elements provide a crucial mechanism for the generation of surface antigen variants. Such variation will have an

  9. MAPPING ERODED AREAS ON MOUNTAIN GRASSLAND WITH TERRESTRIAL PHOTOGRAMMETRY AND OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    A. Mayr

    2016-06-01

    Full Text Available In the Alps as well as in other mountain regions steep grassland is frequently affected by shallow erosion. Often small landslides or snow movements displace the vegetation together with soil and/or unconsolidated material. This results in bare earth surface patches within the grass covered slope. Close-range and remote sensing techniques are promising for both mapping and monitoring these eroded areas. This is essential for a better geomorphological process understanding, to assess past and recent developments, and to plan mitigation measures. Recent developments in image matching techniques make it feasible to produce high resolution orthophotos and digital elevation models from terrestrial oblique images. In this paper we propose to delineate the boundary of eroded areas for selected scenes of a study area, using close-range photogrammetric data. Striving for an efficient, objective and reproducible workflow for this task, we developed an approach for automated classification of the scenes into the classes grass and eroded. We propose an object-based image analysis (OBIA workflow which consists of image segmentation and automated threshold selection for classification using the Excess Green Vegetation Index (ExG. The automated workflow is tested with ten different scenes. Compared to a manual classification, grass and eroded areas are classified with an overall accuracy between 90.7% and 95.5%, depending on the scene. The methods proved to be insensitive to differences in illumination of the scenes and greenness of the grass. The proposed workflow reduces user interaction and is transferable to other study areas. We conclude that close-range photogrammetry is a valuable low-cost tool for mapping this type of eroded areas in the field with a high level of detail and quality. In future, the output will be used as ground truth for an area-wide mapping of eroded areas in coarser resolution aerial orthophotos acquired at the same time.

  10. Object-based image analysis and data mining for building ontology of informal urban settlements

    Science.gov (United States)

    Khelifa, Dejrriri; Mimoun, Malki

    2012-11-01

    During recent decades, unplanned settlements have been appeared around the big cities in most developing countries and as consequence, numerous problems have emerged. Thus the identification of different kinds of settlements is a major concern and challenge for authorities of many countries. Very High Resolution (VHR) Remotely Sensed imagery has proved to be a very promising way to detect different kinds of settlements, especially through the using of new objectbased image analysis (OBIA). The most important key is in understanding what characteristics make unplanned settlements differ from planned ones, where most experts characterize unplanned urban areas by small building sizes at high densities, no orderly road arrangement and Lack of green spaces. Knowledge about different kinds of settlements can be captured as a domain ontology that has the potential to organize knowledge in a formal, understandable and sharable way. In this work we focus on extracting knowledge from VHR images and expert's knowledge. We used an object based strategy by segmenting a VHR image taken over urban area into regions of homogenous pixels at adequate scale level and then computing spectral, spatial and textural attributes for each region to create objects. A genetic-based data mining was applied to generate high predictive and comprehensible classification rules based on selected samples from the OBIA result. Optimized intervals of relevant attributes are found, linked with land use types for forming classification rules. The unplanned areas were separated from the planned ones, through analyzing of the line segments detected from the input image. Finally a simple ontology was built based on the previous processing steps. The approach has been tested to VHR images of one of the biggest Algerian cities, that has grown considerably in recent decades.

  11. AFLP and MS-AFLP analysis of the variation within saffron crocus (Crocus sativus L. germplasm.

    Directory of Open Access Journals (Sweden)

    Matteo Busconi

    Full Text Available The presence and extent of genetic variation in saffron crocus are still debated, as testified by several contradictory articles providing contrasting results about the monomorphism or less of the species. Remarkably, phenotypic variations have been frequently observed in the field, such variations are usually unstable and can change from one growing season to another. Considering that gene expression can be influenced both by genetic and epigenetic changes, epigenetics could be a plausible cause of the alternative phenotypes. In order to obtain new insights into this issue, we carried out a molecular marker analysis of 112 accessions from the World Saffron and Crocus Collection. The accessions were grown for at least three years in the same open field conditions. The same samples were analysed using Amplified Fragment Length Polymorphism (AFLP and Methyl Sensitive AFLP in order to search for variation at the genetic (DNA sequence and epigenetic (cytosine methylation level. While the genetic variability was low (4.23% polymorphic peaks and twelve (12 effective different genotypes, the methyl sensitive analysis showed the presence of high epigenetic variability (33.57% polymorphic peaks and twenty eight (28 different effective epigenotypes. The pattern obtained by Factorial Correspondence Analysis of AFLP and, in particular, of MS-AFLP data was consistent with the geographical provenance of the accessions. Very interestingly, by focusing on Spanish accessions, it was observed that the distribution of the accessions in the Factorial Correspondence Analysis is not random but tends to reflect the geographical origin. Two clearly defined clusters grouping accessions from the West (Toledo and Ciudad Real and accessions from the East (Cuenca and Teruel were clearly recognised.

  12. Statistical analysis of activation and reaction energies with quasi-variational coupled-cluster theory

    Science.gov (United States)

    Black, Joshua A.; Knowles, Peter J.

    2018-06-01

    The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.

  13. Sensitivity analysis in oxidation ditch modelling: the effect of variations in stoichiometric, kinetic and operating parameters on the performance indices

    NARCIS (Netherlands)

    Abusam, A.A.A.; Keesman, K.J.; Straten, van G.; Spanjers, H.; Meinema, K.

    2001-01-01

    This paper demonstrates the application of the factorial sensitivity analysis methodology in studying the influence of variations in stoichiometric, kinetic and operating parameters on the performance indices of an oxidation ditch simulation model (benchmark). Factorial sensitivity analysis

  14. An experimental analysis of design choices of multi-objective ant colony optimization algorithms

    OpenAIRE

    Lopez-Ibanez, Manuel; Stutzle, Thomas

    2012-01-01

    There have been several proposals on how to apply the ant colony optimization (ACO) metaheuristic to multi-objective combinatorial optimization problems (MOCOPs). This paper proposes a new formulation of these multi-objective ant colony optimization (MOACO) algorithms. This formulation is based on adding specific algorithm components for tackling multiple objectives to the basic ACO metaheuristic. Examples of these components are how to represent multiple objectives using pheromone and heuris...

  15. Factors Associated with Variations in Population HIV Prevalence across West Africa: Findings from an Ecological Analysis

    Science.gov (United States)

    Prudden, Holly J.; Beattie, Tara S.; Bobrova, Natalia; Panovska-Griffiths, Jasmina; Mukandavire, Zindoga; Gorgens, Marelize; Wilson, David; Watts, Charlotte H.

    2015-01-01

    Background Population HIV prevalence across West Africa varies substantially. We assess the national epidemiological and behavioural factors associated with this. Methods National, urban and rural data on HIV prevalence, the percentage of younger (15–24) and older (25–49) women and men reporting multiple (2+) partners in the past year, HIV prevalence among female sex workers (FSWs), men who have bought sex in the past year (clients), and ART coverage, were compiled for 13 countries. An Ecological analysis using linear regression assessed which factors are associated with national variations in population female and male HIV prevalence, and with each other. Findings National population HIV prevalence varies between 0 4–2 9% for men and 0 4–5.6% for women. ART coverage ranges from 6–23%. National variations in HIV prevalence are not shown to be associated with variations in HIV prevalence among FSWs or clients. Instead they are associated with variations in the percentage of younger and older males and females reporting multiple partners. HIV prevalence is weakly negatively associated with ART coverage, implying it is not increased survival that is the cause of variations in HIV prevalence. FSWs and younger female HIV prevalence are associated with client population sizes, especially older men. Younger female HIV prevalence is strongly associated with older male and female HIV prevalence. Interpretation In West Africa, population HIV prevalence is not significantly higher in countries with high FSW HIV prevalence. Our analysis suggests, higher prevalence occurs where more men buy sex, and where a higher percentage of younger women, and older men and women have multiple partnerships. If a sexual network between clients and young females exists, clients may potentially bridge infection to younger females. HIV prevention should focus both on commercial sex and transmission between clients and younger females with multiple partners. PMID:26698854

  16. Factors Associated with Variations in Population HIV Prevalence across West Africa: Findings from an Ecological Analysis.

    Directory of Open Access Journals (Sweden)

    Holly J Prudden

    Full Text Available Population HIV prevalence across West Africa varies substantially. We assess the national epidemiological and behavioural factors associated with this.National, urban and rural data on HIV prevalence, the percentage of younger (15-24 and older (25-49 women and men reporting multiple (2+ partners in the past year, HIV prevalence among female sex workers (FSWs, men who have bought sex in the past year (clients, and ART coverage, were compiled for 13 countries. An Ecological analysis using linear regression assessed which factors are associated with national variations in population female and male HIV prevalence, and with each other.National population HIV prevalence varies between 0 4-2 9% for men and 0 4-5.6% for women. ART coverage ranges from 6-23%. National variations in HIV prevalence are not shown to be associated with variations in HIV prevalence among FSWs or clients. Instead they are associated with variations in the percentage of younger and older males and females reporting multiple partners. HIV prevalence is weakly negatively associated with ART coverage, implying it is not increased survival that is the cause of variations in HIV prevalence. FSWs and younger female HIV prevalence are associated with client population sizes, especially older men. Younger female HIV prevalence is strongly associated with older male and female HIV prevalence.In West Africa, population HIV prevalence is not significantly higher in countries with high FSW HIV prevalence. Our analysis suggests, higher prevalence occurs where more men buy sex, and where a higher percentage of younger women, and older men and women have multiple partnerships. If a sexual network between clients and young females exists, clients may potentially bridge infection to younger females. HIV prevention should focus both on commercial sex and transmission between clients and younger females with multiple partners.

  17. Genome size variation among and within Camellia species by using flow cytometric analysis.

    Directory of Open Access Journals (Sweden)

    Hui Huang

    Full Text Available BACKGROUND: The genus Camellia, belonging to the family Theaceae, is economically important group in flowering plants. Frequent interspecific hybridization together with polyploidization has made them become taxonomically "difficult taxa". The DNA content is often used to measure genome size variation and has largely advanced our understanding of plant evolution and genome variation. The goals of this study were to investigate patterns of interspecific and intraspecific variation of DNA contents and further explore genome size evolution in a phylogenetic context of the genus. METHODOLOGY/PRINCIPAL FINDINGS: The DNA amount in the genus was determined by using propidium iodide flow cytometry analysis for a total of 139 individual plants representing almost all sections of the two subgenera, Camellia and Thea. An improved WPB buffer was proven to be suitable for the Camellia species, which was able to counteract the negative effects of secondary metabolite and generated high-quality results with low coefficient of variation values (CV <5%. Our results showed trivial effects on different tissues of flowers, leaves and buds as well as cytosolic compounds on the estimation of DNA amount. The DNA content of C. sinensis var. assamica was estimated to be 1C = 3.01 pg by flow cytometric analysis, which is equal to a genome size of about 2940 Mb. CONCLUSION: Intraspecific and interspecific variations were observed in the genus Camellia, and as expected, the latter was larger than the former. Our study suggests a directional trend of increasing genome size in the genus Camellia probably owing to the frequent polyploidization events.

  18. SU-E-T-139: Automated Daily EPID Exit Dose Analysis Uncovers Treatment Variations

    Energy Technology Data Exchange (ETDEWEB)

    Olch, A [University of Southern California, Los Angeles, CA (United States)

    2015-06-15

    Purpose: To evaluate a fully automated EPID exit dose system for its ability to detect daily treatment deviations including patient setup, delivery, and anatomy changes. Methods: PerFRACTION (Sun Nuclear Corporation) software is a system that uses integrated EPID images taken during patient treatment and automatically pulled from the Aria database and analyzed based on user-defined comparisons. This was used to monitor 20 plans consisting of a total of 859 fields for 18 patients, for a total of 251 fractions. Nine VMAT, 5 IMRT, and 6 3D plans were monitored. The Gamma analysis was performed for each field within a plan, comparing the first fraction against each of the other fractions in each treatment course. A 2% dose difference, 1 mm distance-to-agreement, and 10% dose threshold was used. These tight tolerances were chosen to achieve a high sensitivity to treatment variations. The field passed if 93% of the pixels had a Gamma of 1 or less. Results: Twenty-nine percent of the fields failed. The average plan passing rate was 92.5%.The average 3D plan passing rate was less than for VMAT or IMRT, 84%, vs. an average of 96.2%. When fields failed, an investigation revealed changes in patient anatomy or setup variations, often also leading to variations of transmission through immobilization devices. Conclusion: PerFRACTION is a fully automated system for determining daily changes in dose transmission through the patient that requires no effort other than for the imager panel to be deployed during treatment. A surprising number of fields failed the analysis and can be attributed to important treatment variations that would otherwise not be appreciated. Further study of inter-fraction treatment variations is possible and warranted. Sun Nuclear Corporation provided a license to the software described.

  19. Analysis and optimization with ecological objective function of irreversible single resonance energy selective electron heat engines

    International Nuclear Information System (INIS)

    Zhou, Junle; Chen, Lingen; Ding, Zemin; Sun, Fengrui

    2016-01-01

    Ecological performance of a single resonance ESE heat engine with heat leakage is conducted by applying finite time thermodynamics. By introducing Nielsen function and numerical calculations, expressions about power output, efficiency, entropy generation rate and ecological objective function are derived; relationships between ecological objective function and power output, between ecological objective function and efficiency as well as between power output and efficiency are demonstrated; influences of system parameters of heat leakage, boundary energy and resonance width on the optimal performances are investigated in detail; a specific range of boundary energy is given as a compromise to make ESE heat engine system work at optimal operation regions. Comparing performance characteristics with different optimization objective functions, the significance of selecting ecological objective function as the design objective is clarified specifically: when changing the design objective from maximum power output into maximum ecological objective function, the improvement of efficiency is 4.56%, while the power output drop is only 2.68%; when changing the design objective from maximum efficiency to maximum ecological objective function, the improvement of power output is 229.13%, and the efficiency drop is only 13.53%. - Highlights: • An irreversible single resonance energy selective electron heat engine is studied. • Heat leakage between two reservoirs is considered. • Power output, efficiency and ecological objective function are derived. • Optimal performance comparison for three objective functions is carried out.

  20. Object methods of analysis and design: presentation of U R L ...

    African Journals Online (AJOL)

    Objects invaded the world of data processing, and there is no field which did not feel their effects. The object approach originates in the programming object, whose languages Smalltalk and C++ are the most known representatives. Thereafter, its application spread with many fields such as the software genius, the left again ...

  1. Object-oriented analysis and design of a GEANT based detector simulator

    International Nuclear Information System (INIS)

    Amako, K.; Kanzaki, J.; Sasaki, T.; Takaiwa, Y.; Nakagawa, Y.; Yamagata, T.

    1994-01-01

    The authors give a status report of the project to design a detector simulation program by reengineering GEANT with the object-oriented methodology. They followed the Object Modeling Technique. They explain the object model they constructed. Also problems of the technique found during their study are discussed

  2. report on the french objectives of electricity consumption, produced from renewable energies sources and on the analysis of their realization

    International Nuclear Information System (INIS)

    2007-01-01

    This report presents the french objectives of electricity, from renewable energies sources, internal consumption for the next ten years, as the analysis of their realization taking into account the climatic factors likely to change the realization of these objectives. It also discusses the adequacy of the actions to the national engagement in matter of climatic change. (A.L.B.)

  3. Pigments analysis and gold layer thickness evaluation of polychromy on wood objects by PXRF

    International Nuclear Information System (INIS)

    Blonski, M.S.; Appoloni, C.R.

    2014-01-01

    The X-ray fluorescence technique by energy dispersion (EDXRF), being a multi elemental and non-destructive technique, has been widely used in the analysis of artworks and archeometry. An X-ray fluorescence portable equipment from the Laboratory of Applied Nuclear Physics of the State University of Londrina (LFNA/UEL) was used for the measurement of pigments in golden parts of a Gilding Preparation Standard Plaque and also pigments measurement on the Wood Adornment of the High Altar Column of the Side Pulpit of the Immaculate Conception Church Parish Sao Paulo-SP. The portable X-ray fluorescence PXRF-LFNA-02 consists of an X-ray tube with Ag anode, a Si-PIN detector (FWHM=221 eV for Mn line at 5.9 keV), a chain of electronics nuclear standard of X-ray spectrometer, a multichannel 8 K, a notebook and a mechanical system designed for the positioning of detector and X-ray tube, which allows movements with two degrees of freedom from the system of excitation–detection. The excitation–detection time of each measurement was 100 and 500 s, respectively. The presence of elements Ti, Cr, Fe, Cu, Zn and Au was found in the golden area of the Altar Column ornament. On the other hand, analysis of the ratios for the intensities of K α /K β lines measured in the areas made it possible to explore the possibility of measuring the stratigraphies of the layers of pigments and to estimate the thickness of the same. - Highlights: • The X-ray fluorescence technique by energy dispersion (EDXRF) and an X-ray fluorescence portable equipment are used for measurement of pigments. • Analysis of the ratios for the intensities of K α /K β lines measured in the areas made it possible to explore the possibility of measuring the stratigraphies of the layers of pigments and to estimate the thickness of the same. • The result of pigment analysis performed on these objects indicates that they are of the twentieth century

  4. Transcriptome analysis of the sea cucumber (Apostichopus japonicus) with variation in individual growth.

    Science.gov (United States)

    Gao, Lei; He, Chongbo; Bao, Xiangbo; Tian, Meilin; Ma, Zhen

    2017-01-01

    The sea cucumber (Apostichopus japonicus) is an economically important aquaculture species in China. However, the serious individual growth variation often caused financial losses to farmers and the genetic mechanisms are poorly understood. In the present study, the extensively analysis at the transcriptome level for individual growth variation in sea cucumber was carried out. A total of 118946 unigenes were assembled from 255861 transcripts, with N50 of 1700. Of all unigenes, about 23% were identified with at least one significant match to known databases. In all four pair of comparison, 1840 genes were found to be expressed differently. Global hypometabolism was found to be occurred in the slow growing population, based on which the hypothesis was raised that growth retardation in individual growth variation of sea cucumber is one type of dormancy which is used to be against to adverse circumstances. Besides, the pathways such as ECM-receptor interaction and focal adhesion were enriched in the maintenance of cell and tissue structure and communication. Further, 76645 SSRs, 765242 SNPs and 146886 ins-dels were detected in the current study providing an extensive set of data for future studies of genetic mapping and selective breeding. In summary, these results will provides deep insight into the molecular basis of individual growth variation in marine invertebrates, and be valuable for understanding the physiological differences of growth process.

  5. Transcriptome analysis of the sea cucumber (Apostichopus japonicus with variation in individual growth.

    Directory of Open Access Journals (Sweden)

    Lei Gao

    Full Text Available The sea cucumber (Apostichopus japonicus is an economically important aquaculture species in China. However, the serious individual growth variation often caused financial losses to farmers and the genetic mechanisms are poorly understood. In the present study, the extensively analysis at the transcriptome level for individual growth variation in sea cucumber was carried out. A total of 118946 unigenes were assembled from 255861 transcripts, with N50 of 1700. Of all unigenes, about 23% were identified with at least one significant match to known databases. In all four pair of comparison, 1840 genes were found to be expressed differently. Global hypometabolism was found to be occurred in the slow growing population, based on which the hypothesis was raised that growth retardation in individual growth variation of sea cucumber is one type of dormancy which is used to be against to adverse circumstances. Besides, the pathways such as ECM-receptor interaction and focal adhesion were enriched in the maintenance of cell and tissue structure and communication. Further, 76645 SSRs, 765242 SNPs and 146886 ins-dels were detected in the current study providing an extensive set of data for future studies of genetic mapping and selective breeding. In summary, these results will provides deep insight into the molecular basis of individual growth variation in marine invertebrates, and be valuable for understanding the physiological differences of growth process.

  6. Bridge Crack Detection Using Multi-Rotary Uav and Object-Base Image Analysis

    Science.gov (United States)

    Rau, J. Y.; Hsiao, K. W.; Jhan, J. P.; Wang, S. H.; Fang, W. C.; Wang, J. L.

    2017-08-01

    Bridge is an important infrastructure for human life. Thus, the bridge safety monitoring and maintaining is an important issue to the government. Conventionally, bridge inspection were conducted by human in-situ visual examination. This procedure sometimes require under bridge inspection vehicle or climbing under the bridge personally. Thus, its cost and risk is high as well as labor intensive and time consuming. Particularly, its documentation procedure is subjective without 3D spatial information. In order cope with these challenges, this paper propose the use of a multi-rotary UAV that equipped with a SONY A7r2 high resolution digital camera, 50 mm fixed focus length lens, 135 degrees up-down rotating gimbal. The target bridge contains three spans with a total of 60 meters long, 20 meters width and 8 meters height above the water level. In the end, we took about 10,000 images, but some of them were acquired by hand held method taken on the ground using a pole with 2-8 meters long. Those images were processed by Agisoft PhotoscanPro to obtain exterior and interior orientation parameters. A local coordinate system was defined by using 12 ground control points measured by a total station. After triangulation and camera self-calibration, the RMS of control points is less than 3 cm. A 3D CAD model that describe the bridge surface geometry was manually measured by PhotoscanPro. They were composed of planar polygons and will be used for searching related UAV images. Additionally, a photorealistic 3D model can be produced for 3D visualization. In order to detect cracks on the bridge surface, we utilize object-based image analysis (OBIA) technique to segment the image into objects. Later, we derive several object features, such as density, area/bounding box ratio, length/width ratio, length, etc. Then, we can setup a classification rule set to distinguish cracks. Further, we apply semi-global-matching (SGM) to obtain 3D crack information and based on image scale we

  7. BRIDGE CRACK DETECTION USING MULTI-ROTARY UAV AND OBJECT-BASE IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    J. Y. Rau

    2017-08-01

    Full Text Available Bridge is an important infrastructure for human life. Thus, the bridge safety monitoring and maintaining is an important issue to the government. Conventionally, bridge inspection were conducted by human in-situ visual examination. This procedure sometimes require under bridge inspection vehicle or climbing under the bridge personally. Thus, its cost and risk is high as well as labor intensive and time consuming. Particularly, its documentation procedure is subjective without 3D spatial information. In order cope with these challenges, this paper propose the use of a multi-rotary UAV that equipped with a SONY A7r2 high resolution digital camera, 50 mm fixed focus length lens, 135 degrees up-down rotating gimbal. The target bridge contains three spans with a total of 60 meters long, 20 meters width and 8 meters height above the water level. In the end, we took about 10,000 images, but some of them were acquired by hand held method taken on the ground using a pole with 2–8 meters long. Those images were processed by Agisoft PhotoscanPro to obtain exterior and interior orientation parameters. A local coordinate system was defined by using 12 ground control points measured by a total station. After triangulation and camera self-calibration, the RMS of control points is less than 3 cm. A 3D CAD model that describe the bridge surface geometry was manually measured by PhotoscanPro. They were composed of planar polygons and will be used for searching related UAV images. Additionally, a photorealistic 3D model can be produced for 3D visualization. In order to detect cracks on the bridge surface, we utilize object-based image analysis (OBIA technique to segment the image into objects. Later, we derive several object features, such as density, area/bounding box ratio, length/width ratio, length, etc. Then, we can setup a classification rule set to distinguish cracks. Further, we apply semi-global-matching (SGM to obtain 3D crack information and based

  8. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Science.gov (United States)

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess validity and interrater reliability of automated analysis across various levels of FCD severity. Methods Retroillumination photographs of 97 FCD-affected corneas were acquired and total counts of guttae previously summated manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. Noise tolerance level was titrated for each cornea by examining a small region of each image with automated overlay to ensure appropriate coverage of individual guttae. We tested interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results A set of 97 retroillumination photographs were analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the set of analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). Intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870- 0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797- 0.918) among cases for which images was analyzed by an ophthalmologist and two students. Conclusions Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565

  9. Sliding thin slab, minimum intensity projection imaging for objective analysis of emphysema

    International Nuclear Information System (INIS)

    Satoh, Shiro; Ohdama, Shinichi; Shibuya, Hitoshi

    2006-01-01

    The aim of this study was to determine whether sliding thin slab, minimum intensity projection (STS-MinIP) imaging is more advantageous than thin-section computed tomography (CT) for detecting and assessing emphysema. Objective quantification of emphysema by STS-MinIP and thin-section CT was defined as the percentage of area lower than the threshold in the lung section at the level of the aortic arch, tracheal carina, and 5 cm below the carina. Quantitative analysis in 100 subjects was performed and compared with pulmonary function test results. The ratio of the low attenuation area in the lung measured by STS-MinIP was significantly higher than that found by thin-section CT (P<0.01). The difference between STS-MinIP and thin-section CT was statistically evident even for mild emphysema and increased depending on whether the low attenuation in the lung increased. Moreover, STS-MinIP showed a stronger regression relation with pulmonary function results than did thin-section CT (P<0.01). STS-MinIP can be recommended as a new morphometric method for detecting and assessing the severity of emphysema. (author)

  10. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    Science.gov (United States)

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.

  11. Change Analysis and Decision Tree Based Detection Model for Residential Objects across Multiple Scales

    Directory of Open Access Journals (Sweden)

    CHEN Liyan

    2018-03-01

    Full Text Available Change analysis and detection plays important role in the updating of multi-scale databases.When overlap an updated larger-scale dataset and a to-be-updated smaller-scale dataset,people usually focus on temporal changes caused by the evolution of spatial entities.Little attention is paid to the representation changes influenced by map generalization.Using polygonal building data as an example,this study examines the changes from different perspectives,such as the reasons for their occurrence,their performance format.Based on this knowledge,we employ decision tree in field of machine learning to establish a change detection model.The aim of the proposed model is to distinguish temporal changes that need to be applied as updates to the smaller-scale dataset from representation changes.The proposed method is validated through tests using real-world building data from Guangzhou city.The experimental results show the overall precision of change detection is more than 90%,which indicates our method is effective to identify changed objects.

  12. Cognition and objectively measured sleep duration in children: a systematic review and meta-analysis.

    Science.gov (United States)

    Short, Michelle A; Blunden, Sarah; Rigney, Gabrielle; Matricciani, Lisa; Coussens, Scott; M Reynolds, Chelsea; Galland, Barbara

    2018-06-01

    Sleep recommendations are widely used to guide communities on children's sleep needs. Following recent adjustments to guidelines by the National Sleep Foundation and the subsequent consensus statement by the American Academy of Sleep Medicine, we undertook a systematic literature search to evaluate the current evidence regarding relationships between objectively measured sleep duration and cognitive function in children aged 5 to 13 years. Cognitive function included measures of memory, attention, processing speed, and intelligence in children aged 5 to 13 years. Keyword searches of 7 databases to December 2016 found 23 meeting inclusion criteria from 137 full articles reviewed, 19 of which were suitable for meta-analysis. A significant effect (r = .06) was found between sleep duration and cognition, suggesting that longer sleep durations were associated with better cognitive functioning. Analyses of different cognitive domains revealed that full/verbal IQ was significantly associated with sleep loss, but memory, fluid IQ, processing speed and attention were not. Comparison of study sleep durations with current sleep recommendations showed that most children studied had sleep durations that were not within the range of recommended sleep. As such, the true effect of sleep loss on cognitive function may be obscured in these samples, as most children were sleep restricted. Future research using more rigorous experimental methodologies is needed to properly elucidate the relationship between sleep duration and cognition in this age group. Copyright © 2018 National Sleep Foundation. Published by Elsevier Inc. All rights reserved.

  13. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations, that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues

  14. Application of Object Based Classification and High Resolution Satellite Imagery for Savanna Ecosystem Analysis

    Directory of Open Access Journals (Sweden)

    Jane Southworth

    2010-12-01

    Full Text Available Savanna ecosystems are an important component of dryland regions and yet are exceedingly difficult to study using satellite imagery. Savannas are composed are varying amounts of trees, shrubs and grasses and typically traditional classification schemes or vegetation indices cannot differentiate across class type. This research utilizes object based classification (OBC for a region in Namibia, using IKONOS imagery, to help differentiate tree canopies and therefore woodland savanna, from shrub or grasslands. The methodology involved the identification and isolation of tree canopies within the imagery and the creation of tree polygon layers had an overall accuracy of 84%. In addition, the results were scaled up to a corresponding Landsat image of the same region, and the OBC results compared to corresponding pixel values of NDVI. The results were not compelling, indicating once more the problems of these traditional image analysis techniques for savanna ecosystems. Overall, the use of the OBC holds great promise for this ecosystem and could be utilized more frequently in studies of vegetation structure.

  15. Objective classification of ecological status in marine water bodies using ecotoxicological information and multivariate analysis.

    Science.gov (United States)

    Beiras, Ricardo; Durán, Iria

    2014-12-01

    Some relevant shortcomings have been identified in the current approach for the classification of ecological status in marine water bodies, leading to delays in the fulfillment of the Water Framework Directive objectives. Natural variability makes difficult to settle fixed reference values and boundary values for the Ecological Quality Ratios (EQR) for the biological quality elements. Biological responses to environmental degradation are frequently of nonmonotonic nature, hampering the EQR approach. Community structure traits respond only once ecological damage has already been done and do not provide early warning signals. An alternative methodology for the classification of ecological status integrating chemical measurements, ecotoxicological bioassays and community structure traits (species richness and diversity), and using multivariate analyses (multidimensional scaling and cluster analysis), is proposed. This approach does not depend on the arbitrary definition of fixed reference values and EQR boundary values, and it is suitable to integrate nonlinear, sensitive signals of ecological degradation. As a disadvantage, this approach demands the inclusion of sampling sites representing the full range of ecological status in each monitoring campaign. National or international agencies in charge of coastal pollution monitoring have comprehensive data sets available to overcome this limitation.

  16. NDVI-Based analysis on the influence of human activities on vegetation variation on Hainan Island

    Science.gov (United States)

    Luo, Hongxia; Dai, Shengpei; Xie, Zhenghui; Fang, Jihua

    2018-02-01

    Using the Moderate Resolution Imaging Spectroradiometer-normalized difference vegetation index (NDVI) dataset, we analyzed the predicted NDVI values variation and the influence of human activities on vegetation on Hainan Island during 2001-2015. We investigated the roles of human activities in vegetation variation, particularly from 2002 when implemented the Grain-for-Greenprogram on Hainan Island. The trend analysis, linear regression model and residual analysis were used to analyze the data. The results of the study showed that (1) The predicted vegetation on Hainan Island showed an general upward trend with a linear growth rate of 0.0025/10y (phuman activities. (3) In general, human activities had played a positive role in the vegetation increase on Hainan Island, and the residual NDVI trend of this region showed positive outcomes for vegetation variation after implementing ecological engineering projects. However, it indicated a growing risk of vegetation degradation in the coastal region of Hainan Island as a result of rapid urbanization, land reclamation.

  17. Forest anisotropy assessment by means of spatial variations analysis of PolSAR backscattering

    Directory of Open Access Journals (Sweden)

    A. V. Dmitriev

    2017-06-01

    Full Text Available The possibility to synthesize polarization response from earth covers at any desired combination of transmit and receive antenna polarizations is the significant advantage of polarimetric radar. It permits better identification of dominant scattering mechanisms especially when analyzing polarization signatures. These signatures depict more details of physical information from target backscattering in various polarization bases. However, polarization signatures cannot reveal spatial variations of the radar backscattering caused by volume heterogeneity of a target. This paper proposes a new approach for estimating volume target heterogeneity from polarimetric synthetic aperture radar (PolSAR images. The approach is based on the analysis of a novel type of polarization signature, which we call fractal polarization signature (FPS. This signature is a result of polarization synthesis of initial fully polarimetric data and subsequent fractal analysis of synthesized images. It is displayed as a 3D plot and can be produced for each point in an image. It is shown that FPS describes backscattering variations or image roughness at different states of polarization. Fully polarimetric data of SIR-C and ALOS PALSAR at ascending/descending orbits were used for testing the proposed approach. The azimuthal dependence of the radar backscattering variations is discovered when analyzing backscattering from a pine forest. It correlates with the results of a field survey of trees branch distribution.

  18. Measurement of isotope abundance variations in nature by gravimetric spiking isotope dilution analysis (GS-IDA).

    Science.gov (United States)

    Chew, Gina; Walczyk, Thomas

    2013-04-02

    Subtle variations in the isotopic composition of elements carry unique information about physical and chemical processes in nature and are now exploited widely in diverse areas of research. Reliable measurement of natural isotope abundance variations is among the biggest challenges in inorganic mass spectrometry as they are highly sensitive to methodological bias. For decades, double spiking of the sample with a mix of two stable isotopes has been considered the reference technique for measuring such variations both by multicollector-inductively coupled plasma mass spectrometry (MC-ICPMS) and multicollector-thermal ionization mass spectrometry (MC-TIMS). However, this technique can only be applied to elements having at least four stable isotopes. Here we present a novel approach that requires measurement of three isotope signals only and which is more robust than the conventional double spiking technique. This became possible by gravimetric mixing of the sample with an isotopic spike in different proportions and by applying principles of isotope dilution for data analysis (GS-IDA). The potential and principle use of the technique is demonstrated for Mg in human urine using MC-TIMS for isotopic analysis. Mg is an element inaccessible to double spiking methods as it consists of three stable isotopes only and shows great potential for metabolically induced isotope effects waiting to be explored.

  19. [Genetic variation analysis of canine parvovirus VP2 gene in China].

    Science.gov (United States)

    Yi, Li; Cheng, Shi-Peng; Yan, Xi-Jun; Wang, Jian-Ke; Luo, Bin

    2009-11-01

    To recognize the molecular biology character, phylogenetic relationship and the state quo prevalent of Canine parvovirus (CPV), Faecal samnples from pet dogs with acute enteritis in the cities of Beijing, Wuhan, and Nanjing were collected and tested for CPV by PCR and other assay between 2006 and 2008. There was no CPV to FPV (MEV) variation by PCR-RFLP analysis in all samples. The complete ORFs of VP2 genes were obtained by PCR from 15 clinical CPVs and 2 CPV vaccine strains. All amplicons were cloned and sequenced. Analysis of the VP2 sequences showed that clinical CPVs both belong to CPV-2a subtype, and could be classified into a new cluster by amino acids contrasting which contains Tyr-->Ile (324) mutation. Besides the 2 CPV vaccine strains belong to CPV-2 subtype, and both of them have scattered variation in amino acids residues of VP2 protein. Construction of the phylogenetic tree based on CPV VP2 sequence showed these 15 CPV clinical strains were in close relationship with Korea strain K001 than CPV-2a isolates in other countries at early time, It is indicated that the canine parvovirus genetic variation was associated with location and time in some degree. The survey of CPV capsid protein VP2 gene provided the useful information for the identification of CPV types and understanding of their genetic relationship.

  20. Object-oriented analysis and design of a health care management information system.

    Science.gov (United States)

    Krol, M; Reich, D L

    1999-04-01

    We have created a prototype for a universal object-oriented model of a health care system compatible with the object-oriented approach used in version 3.0 of the HL7 standard for communication messages. A set of three models has been developed: (1) the Object Model describes the hierarchical structure of objects in a system--their identity, relationships, attributes, and operations; (2) the Dynamic Model represents the sequence of operations in time as a collection of state diagrams for object classes in the system; and (3) functional Diagram represents the transformation of data within a system by means of data flow diagrams. Within these models, we have defined major object classes of health care participants and their subclasses, associations, attributes and operators, states, and behavioral scenarios. We have also defined the major processes and subprocesses. The top-down design approach allows use, reuse, and cloning of standard components.

  1. "Life history space": a multivariate analysis of life history variation in extant and extinct Malagasy lemurs.

    Science.gov (United States)

    Catlett, Kierstin K; Schwartz, Gary T; Godfrey, Laurie R; Jungers, William L

    2010-07-01

    Studies of primate life history variation are constrained by the fact that all large-bodied extant primates are haplorhines. However, large-bodied strepsirrhines recently existed. If we can extract life history information from their skeletons, these species can contribute to our understanding of primate life history variation. This is particularly important in light of new critiques of the classic "fast-slow continuum" as a descriptor of variation in life history profiles across mammals in general. We use established dental histological methods to estimate gestation length and age at weaning for five extinct lemur species. On the basis of these estimates, we reconstruct minimum interbirth intervals and maximum reproductive rates. We utilize principal components analysis to create a multivariate "life history space" that captures the relationships among reproductive parameters and brain and body size in extinct and extant lemurs. Our data show that, whereas large-bodied extinct lemurs can be described as "slow" in some fashion, they also varied greatly in their life history profiles. Those with relatively large brains also weaned their offspring late and had long interbirth intervals. These were not the largest of extinct lemurs. Thus, we distinguish size-related life history variation from variation that linked more strongly to ecological factors. Because all lemur species larger than 10 kg, regardless of life history profile, succumbed to extinction after humans arrived in Madagascar, we argue that large body size increased the probability of extinction independently of reproductive rate. We also provide some evidence that, among lemurs, brain size predicts reproductive rate better than body size. (c) 2010 Wiley-Liss, Inc.

  2. Systematic documentation and analysis of human genetic variation using the microattribution approach

    Science.gov (United States)

    Giardine, Belinda; Borg, Joseph; Higgs, Douglas R.; Peterson, Kenneth R.; Maglott, Donna; Basak, A. Nazli; Clark, Barnaby; Faustino, Paula; Felice, Alex E.; Francina, Alain; Gallivan, Monica V. E.; Georgitsi, Marianthi; Gibbons, Richard J.; Giordano, Piero C.; Harteveld, Cornelis L.; Joly, Philippe; Kanavakis, Emmanuel; Kollia, Panagoula; Menzel, Stephan; Miller, Webb; Moradkhani, Kamran; Old, John; Papachatzopoulou, Adamantia; Papadakis, Manoussos N.; Papadopoulos, Petros; Pavlovic, Sonja; Philipsen, Sjaak; Radmilovic, Milena; Riemer, Cathy; Schrijver, Iris; Stojiljkovic, Maja; Thein, Swee Lay; Traeger-Synodinos, Jan; Tully, Ray; Wada, Takahito; Waye, John; Wiemann, Claudia; Zukic, Branka; Chui, David H. K.; Wajcman, Henri; Hardison, Ross C.; Patrinos, George P.

    2013-01-01

    We developed a series of interrelated locus-specific databases to store all published and unpublished genetic variation related to these disorders, and then implemented microattribution to encourage submission of unpublished observations of genetic variation to these public repositories 1. A total of 1,941 unique genetic variants in 37 genes, encoding globins (HBA2, HBA1, HBG2, HBG1, HBD, HBB) and other erythroid proteins (ALOX5AP, AQP9, ARG2, ASS1, ATRX, BCL11A, CNTNAP2, CSNK2A1, EPAS1, ERCC2, FLT1, GATA1, GPM6B, HAO2, HBS1L, KDR, KL, KLF1, MAP2K1, MAP3K5, MAP3K7, MYB, NOS1, NOS2, NOS3, NOX3, NUP133, PDE7B, SMAD3, SMAD6, and TOX) are currently documented in these databases with reciprocal attribution of microcitations to data contributors. Our project provides the first example of implementing microattribution to incentivise submission of all known genetic variation in a defined system. It has demonstrably increased the reporting of human variants and now provides a comprehensive online resource for systematically describing human genetic variation in the globin genes and other genes contributing to hemoglobinopathies and thalassemias. The large repository of previously reported data, together with more recent data, acquired by microattribution, demonstrates how the comprehensive documentation of human variation will provide key insights into normal biological processes and how these are perturbed in human genetic disease. Using the microattribution process set out here, datasets which took decades to accumulate for the globin genes could be assembled rapidly for other genes and disease systems. The principles established here for the globin gene system will serve as a model for other systems and the analysis of other common and/or complex human genetic diseases. PMID:21423179

  3. A functional analysis of photo-object matching skills of severely retarded adolescents.

    OpenAIRE

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photo...

  4. A measurement based analysis of the spatial distribution, temporal variation and chemical composition of particulate matter in Munich and Augsburg

    Directory of Open Access Journals (Sweden)

    Klaus Schäfer

    2011-02-01

    Full Text Available The objective of the studies presented in this paper is to present an analysis of spatial distribution and temporal variation of particulate matter in Munich and Augsburg, Germany, and to identify and discuss the factors determining the aerosol pollution in both areas. Surface-based in-situ and remote sensing measurements of particle mass and particle size distribution have been performed in, around, and above the two cities. Two measurement campaigns were conducted in Munich, one in late spring and one in winter 2003. Another campaign has been on-going in Augsburg since 2004. Spatial and temporal variations are analyzed from this data (PM10, PM2.5, and PM1. There are higher particle mass concentrations at the urban site than at the surrounding rural sites, especially in winter. No significant difference in the major ionic composition of the particles between the urban and the rural site was detected. This is considered to be related to the spatial distribution of secondary inorganic aerosol that is more homogeneous than aerosol resulting from other sources like traffic or urban releases in general. During the measurement campaigns mixing layer heights were determined continuously by remote sensing (SODAR, ceilometer, RASS. Significant dependence of particle size distribution and particle mass concentration on mixing layer height was found. This finding paves the way to new applications of satellite remote sensing products.

  5. Evidence of increment of efficiency of the Mexican Stock Market through the analysis of its variations

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Huerta-Quintanilla, R.; Rodríguez-Achach, M.

    2007-07-01

    It is well known that there exist statistical and structural differences between the stock markets of developed and emerging countries. In this work, and in order to find out if the efficiency of the Mexican Stock Market has been changing over time, we have performed and compared several analyses of the variations of the Mexican Stock Market index (IPC) and Dow Jones industrial average index (DJIA) for different periods of their historical daily data. We have analyzed the returns autocorrelation function (ACF) and used detrended fluctuation analysis (DFA) to study returns variations. We also analyze the volatility, mean value and standard deviation of both markets and compare their evolution. We conclude from the overall result of these studies, that they show compelling evidence of the increment of efficiency of the Mexican Stock Market over time. The data samples analyzed here, correspond to daily values of the IPC and DJIA for the period 10/30/1978-02/28/2006.

  6. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-08-01

    Full Text Available Airborne laser scanning (ALS is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (> 20 echoes/m2 and additional classification variables from full-waveform (FWF ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA approach, combining segmentation and classification of the 3D FWF ALS points designed to detect tall vegetation in urban environments. The definition tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure FWF ALS echoes are segmented by a seeded region growing procedure. All echoes sorted descending by their surface roughness are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation are calculated by aggregating echo features such as amplitude and surface roughness. For classification a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data of three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original

  7. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    Full Text Available According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA systems and three-stage iterative geographic object-oriented image analysis (GEOOIA systems, where GEOOIA/GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS image understanding system (RS-IUS, from sub-symbolic statistical model-based (inductive image segmentation to symbolic physical model-based (deductive image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a computational theory (system design; (b information/knowledge representation; (c algorithm design; and (d implementation. As proof-of-concept of symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™ is selected from existing literature. To the best of these authors’ knowledge, this is the first time a

  8. A Nationwide Analysis of Cost Variation for Autologous Free Flap Breast Reconstruction.

    Science.gov (United States)

    Billig, Jessica I; Lu, Yiwen; Momoh, Adeyiza O; Chung, Kevin C

    2017-11-01

    Cost variation among hospitals has been demonstrated for surgical procedures. Uncovering these differences has helped guide measures taken to reduce health care spending. To date, the fiscal consequence of hospital variation for autologous free flap breast reconstruction is unknown. To investigate factors that influence cost variation for autologous free flap breast reconstruction. A secondary cross-sectional analysis was performed using the Healthcare Cost and Utilization Project National Inpatient Sample database from 2008 to 2010. The dates of analysis were September 2016 to February 2017. The setting was a stratified sample of all US community hospitals. Participants were female patients who were diagnosed as having breast cancer or were at high risk for breast cancer and underwent autologous free flap breast reconstruction. Variables of interest included demographic data, hospital characteristics, length of stay, complications (surgical and systemic), and inpatient cost. The study used univariate and generalized linear mixed models to examine associations between patient and hospital characteristics and cost. A total of 3302 patients were included in the study, with a median age of 50 years (interquartile range, 44-57 years). The mean cost for autologous free flap breast reconstruction was $22 677 (interquartile range, $14 907-$33 391). Flap reconstructions performed at high-volume hospitals were significantly more costly than those performed at low-volume hospitals ($24 360 vs $18 918, P Logistic regression demonstrated that hospital volume correlated with increased cost (Exp[β], 1.06; 95% CI, 1.02-1.11; P = .003). Fewer surgical complications (16.4% [169 of 1029] vs 23.7% [278 of 1174], P cost variation among patients undergoing autologous free flap breast reconstruction. Experience, as measured by a hospital's volume, provides quality health care with fewer complications but is more costly. Longer length of stay contributed to regional

  9. Variation in semen parameters derived from computer-aided semen analysis, within donors and between donors

    NARCIS (Netherlands)

    Wijchman, JG; De Wolf, BTHM; Graaff, R; Arts, EGJM

    2001-01-01

    The development of computer-aided semen analysis (CASA) has made it possible to study sperm motility characteristics objectively and longitudinally. In this 2-year study of 8 sperm donors, we used CASA to measure 7 semen parameters (concentration, percentage of motile spermatozoa, curvilinear

  10. Error analysis of marker-based object localization using a single-plane XRII

    International Nuclear Information System (INIS)

    Habets, Damiaan F.; Pollmann, Steven I.; Yuan, Xunhua; Peters, Terry M.; Holdsworth, David W.

    2009-01-01

    The role of imaging and image guidance is increasing in surgery and therapy, including treatment planning and follow-up. Fluoroscopy is used for two-dimensional (2D) guidance or localization; however, many procedures would benefit from three-dimensional (3D) guidance or localization. Three-dimensional computed tomography (CT) using a C-arm mounted x-ray image intensifier (XRII) can provide high-quality 3D images; however, patient dose and the required acquisition time restrict the number of 3D images that can be obtained. C-arm based 3D CT is therefore limited in applications for x-ray based image guidance or dynamic evaluations. 2D-3D model-based registration, using a single-plane 2D digital radiographic system, does allow for rapid 3D localization. It is our goal to investigate - over a clinically practical range - the impact of x-ray exposure on the resulting range of 3D localization precision. In this paper it is assumed that the tracked instrument incorporates a rigidly attached 3D object with a known configuration of markers. A 2D image is obtained by a digital fluoroscopic x-ray system and corrected for XRII distortions (±0.035 mm) and mechanical C-arm shift (±0.080 mm). A least-square projection-Procrustes analysis is then used to calculate the 3D position using the measured 2D marker locations. The effect of x-ray exposure on the precision of 2D marker localization and on 3D object localization was investigated using numerical simulations and x-ray experiments. The results show a nearly linear relationship between 2D marker localization precision and the 3D localization precision. However, a significant amplification of error, nonuniformly distributed among the three major axes, occurs, and that is demonstrated. To obtain a 3D localization error of less than ±1.0 mm for an object with 20 mm marker spacing, the 2D localization precision must be better than ±0.07 mm. This requirement was met for all investigated nominal x-ray exposures at 28 cm FOV, and

  11. Study of Seasonal Variation in Groundwater Quality of Sagar City (India by Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Hemant Pathak

    2011-01-01

    Full Text Available Groundwater is one of the major resources of the drinking water in Sagar city (India.. In this study 15 sampling station were selected for the investigations on 14 chemical parameters. The work was carried out during different months of the pre-monsoon, monsoon and post-monsoon seasons in June 2009 to June 2010. The multivariate statistics such as principal component and cluster analysis were applied to the datasets to investigate seasonal variations in groundwater quality. Principal axis factoring has been used to observe the mode of association of parameters and their interrelationships, for evaluating water quality. Average value of BOD, COD, ammonia and iron was high during entire study period. Elevated values of BOD and ammonia in monsoon, slightly more value of BOD in post-monsoon, BOD, ammonia and iron in pre-monsoon period reflected contribution on temporal effect on groundwater. Results of principal component analysis evinced that all the parameters equally and significantly contribute to groundwater quality variations. Factor 1 and factor 2 analysis revealed the DO value deteriorate due to organic load (BOD/Ammonia in different seasons. Hierarchical cluster analysis grouped 15 stations into four clusters in monsoon, five clusters in post-monsoon and five clusters in pre-monsoon with similar water quality features. Clustered group at monsoon, post-monsoon and pre-monsoon consisted one station exhibiting significant spatial variation in physicochemical composition. The anthropogenic nitrogenous species, as fallout from modernization activities. The study indicated that the groundwater sufficiently well oxygenated and nutrient-rich in study places.

  12. An Exploration and Analysis of the Relationships among Object Oriented Programming, Hypermedia, and Hypertalk.

    Science.gov (United States)

    Milet, Lynn K.; Harvey, Francis A.

    Hypermedia and object oriented programming systems (OOPs) represent examples of "open" computer environments that allow the user access to parts of the code or operating system. Both systems share fundamental intellectual concepts (objects, messages, methods, classes, and inheritance), so that an understanding of hypermedia can help in…

  13. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    Science.gov (United States)

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  14. Forecast skill score assessment of a relocatable ocean prediction system, using a simplified objective analysis method

    Science.gov (United States)

    Onken, Reiner

    2017-11-01

    A relocatable ocean prediction system (ROPS) was employed to an observational data set which was collected in June 2014 in the waters to the west of Sardinia (western Mediterranean) in the framework of the REP14-MED experiment. The observational data, comprising more than 6000 temperature and salinity profiles from a fleet of underwater gliders and shipborne probes, were assimilated in the Regional Ocean Modeling System (ROMS), which is the heart of ROPS, and verified against independent observations from ScanFish tows by means of the forecast skill score as defined by Murphy(1993). A simplified objective analysis (OA) method was utilised for assimilation, taking account of only those profiles which were located within a predetermined time window W. As a result of a sensitivity study, the highest skill score was obtained for a correlation length scale C = 12.5 km, W = 24 h, and r = 1, where r is the ratio between the error of the observations and the background error, both for temperature and salinity. Additional ROPS runs showed that (i) the skill score of assimilation runs was mostly higher than the score of a control run without assimilation, (i) the skill score increased with increasing forecast range, and (iii) the skill score for temperature was higher than the score for salinity in the majority of cases. Further on, it is demonstrated that the vast number of observations can be managed by the applied OA method without data reduction, enabling timely operational forecasts even on a commercially available personal computer or a laptop.

  15. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. T. L. Estomata

    2012-07-01

    Full Text Available Mapping benthic cover in deep waters comprises a very small proportion of studies in the field of research. Majority of benthic cover mapping makes use of satellite images and usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method on underwater photos, but made use of different classification methods such as: Neural Networks, and rapid classification via down sampling. In this study, accurate bathymetric data obtained using a multi-beam echo sounder (MBES was attempted to be used as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU, which applies correction to the data gathered by the MBES, accuracy of the said depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA, which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 to 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05.

  16. Analysis Of Tourism Object Demand In The Pekanbaru City With Travel Cost Method

    Directory of Open Access Journals (Sweden)

    Eriyati

    2017-11-01

    Full Text Available The tourism sector receives attention when world oil prices decrease. It cannot be denied that, until now, the largest contribution to Pekanbaru city revenue from profit-sharing funds has come from the oil and gas sector. Currently, Pekanbaru's revenue from the oil and gas sector is small, as oil prices continue to decline. Because Pekanbaru City lies far from the coast and the mountains, development has focused on artificial attractions such as Alam Mayang, the artificial lake Bandar Kayangan Lembah Sari, the Pekanbaru Mosque, and the tomb of the founder of Pekanbaru city. Many people bring their families to visit these artificial tourist attractions on weekends and holidays. This study aims to determine the factors that affect the demand for, and the economic value of, tourist attractions in Kota Pekanbaru using the Travel Cost Method. Non-probability sampling was used to select 100 respondents from a visitor population of 224,896, with the sample size determined by Slovin's formula; the data were analysed with a descriptive quantitative method. The results state that the factors influencing the demand for tourist attractions in the city of Pekanbaru are income, cost, and distance. The economic value of the tourism objects of Pekanbaru city, estimated with the travel cost method, is Rp42.679.638.400 per year; that is, it reflects the price visitors place on the attractions, measured by the time, goods, or money they are willing to sacrifice to use them.
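
    The sample size quoted above is consistent with Slovin's formula, n = N / (1 + N e^2); a minimal sketch, assuming a 10% margin of error (the margin is not stated in the abstract):

        import math

        def slovin(N, e):
            """Slovin's sample size for population N and margin of error e."""
            return math.ceil(N / (1 + N * e ** 2))

        print(slovin(224_896, 0.10))  # -> 100, the number of respondents used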

  17. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  18. Monitoring Urban Tree Cover Using Object-Based Image Analysis and Public Domain Remotely Sensed Data

    Directory of Open Access Journals (Sweden)

    Meghan Halabisky

    2011-10-01

    Full Text Available Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies, however, such techniques are not generally appropriate for assessing these highly variable landscapes. Landsat imagery has historically been used for per-pixel driven land use/land cover (LULC classifications, but the spatial resolution limits our ability to map small urban features. In such cases, hyperspatial resolution imagery such as aerial or satellite imagery with a resolution of 1 meter or below is preferred. Object-based image analysis (OBIA allows for use of additional variables such as texture, shape, context, and other cognitive information provided by the image analyst to segment and classify image features, and thus, improve classifications. As part of this research we created LULC classifications for a pilot study area in Seattle, WA, USA, using OBIA techniques and freely available public aerial photography. We analyzed the differences in accuracies which can be achieved with OBIA using multispectral and true-color imagery. We also compared our results to a satellite based OBIA LULC and discussed the implications of per-pixel driven vs. OBIA-driven field sampling campaigns. We demonstrated that the OBIA approach can generate good and repeatable LULC classifications suitable for tree cover assessment in urban areas. Another important finding is that spectral content appeared to be more important than spatial detail of hyperspatial data when it comes to an OBIA-driven LULC.

  19. Pricing index-based catastrophe bonds: Part 2: Object-oriented design issues and sensitivity analysis

    Science.gov (United States)

    Unger, André J. A.

    2010-02-01

    This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principle) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Ninõ and non-El Ninõ years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low frequency high magnitude events in a 'high' reinsurance layer relative to high frequency low magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.
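
    A hedged sketch of the discrete Asian running-sum aggregation and a layer-based principal payoff, as simplified illustrations of the augmented-state approach described above; the loss data, quarter grid, and layer bounds are synthetic:

        import numpy as np

        def aggregate_pcs(daily_losses, days_per_quarter=91):
            """Running sum of the PCS index, sampled on a quarterly grid."""
            cumulative = np.cumsum(daily_losses)
            quarter_ends = np.arange(days_per_quarter - 1, len(daily_losses),
                                     days_per_quarter)
            return cumulative[quarter_ends]

        def principal_fraction(aggregate_loss, attach, exhaust):
            """Fraction of CAT bond principal lost once the layer attaches."""
            return float(np.clip((aggregate_loss - attach) / (exhaust - attach),
                                 0.0, 1.0))

        losses = np.random.default_rng(0).exponential(2.0, size=364)  # synthetic
        agg = aggregate_pcs(losses)
        print(agg)                                   # index at each quarter end
        print(principal_fraction(agg[-1], 500.0, 900.0))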

  20. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  1. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis; FINAL

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  2. Feature extraction and selection for objective gait analysis and fall risk assessment by accelerometry

    Directory of Open Access Journals (Sweden)

    Cremer Gerald

    2011-01-01

    Full Text Available Abstract Background Falls in the elderly are nowadays a major concern because of their consequences on general health and morale. Moreover, the aging of the population and increasing life expectancy make the prediction of falls more and more important. The analysis presented in this article makes a first step in this direction by providing a way to analyze gait and classify hospitalized elderly fallers and non-fallers. This tool, based on an accelerometer network and signal processing, gives objective information about gait and does not need any special gait laboratory as optical analysis does. The tool is also simple enough to be used by a non-expert and can therefore be widely applied to a large set of patients. Method A population of 20 hospitalized elderly patients was asked to execute several classical clinical tests evaluating their risk of falling. They were also asked whether they had experienced any fall in the last 12 months. The accelerations of the limbs were recorded during the clinical tests with an accelerometer network distributed on the body. A total of 67 features were extracted from the accelerometric signal recorded during a simple 25 m walking test at comfort speed. A feature selection algorithm was used to select those able to classify subjects at risk and not at risk for several types of classification algorithms. Results The results showed that several classification algorithms were able to discriminate between the two groups of interest: faller and non-faller hospitalized elderly patients. The classification performances of the algorithms were compared. Moreover, a subset of the 67 features was found to be significantly different between the two groups using a t-test. Conclusions This study gives a method to classify a population of hospitalized elderly patients into two groups, at risk of falling or not at risk, based on accelerometric data. This is a first step toward designing a fall-risk assessment system that could be used to provide
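
    A minimal sketch of the univariate screening step reported in the results, assuming Welch t-tests on each of the 67 gait features; the feature matrices and alpha level are synthetic stand-ins:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(1)
        fallers     = rng.normal(0.0, 1.0, size=(10, 67))  # subjects x features
        non_fallers = rng.normal(0.3, 1.0, size=(10, 67))

        # Welch t-test per feature; keep those significant at alpha = 0.05
        _, p_values = stats.ttest_ind(fallers, non_fallers, axis=0, equal_var=False)
        selected = np.flatnonzero(p_values < 0.05)
        print(f"{selected.size} features differ at alpha = 0.05")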

  3. Analysis of disease-associated objects at the Rat Genome Database

    Science.gov (United States)

    Wang, Shur-Jen; Laulederkind, Stanley J. F.; Hayman, G. T.; Smith, Jennifer R.; Petri, Victoria; Lowry, Timothy F.; Nigam, Rajni; Dwinell, Melinda R.; Worthey, Elizabeth A.; Munzenmaier, Diane H.; Shimoyama, Mary; Jacob, Howard J.

    2013-01-01

    The Rat Genome Database (RGD) is the premier resource for genetic, genomic and phenotype data for the laboratory rat, Rattus norvegicus. In addition to organizing biological data from rats, the RGD team focuses on manual curation of gene–disease associations for rat, human and mouse. In this work, we have analyzed disease-associated strains, quantitative trait loci (QTL) and genes from rats. These disease objects form the basis for seven disease portals. Among disease portals, the cardiovascular disease and obesity/metabolic syndrome portals have the highest number of rat strains and QTL. These two portals share 398 rat QTL, and these shared QTL are highly concentrated on rat chromosomes 1 and 2. For disease-associated genes, we performed gene ontology (GO) enrichment analysis across portals using RatMine enrichment widgets. Fifteen GO terms, five from each GO aspect, were selected to profile enrichment patterns of each portal. Of the selected biological process (BP) terms, ‘regulation of programmed cell death’ was the top enriched term across all disease portals except in the obesity/metabolic syndrome portal where ‘lipid metabolic process’ was the most enriched term. ‘Cytosol’ and ‘nucleus’ were common cellular component (CC) annotations for disease genes, but only the cancer portal genes were highly enriched with ‘nucleus’ annotations. Similar enrichment patterns were observed in a parallel analysis using the DAVID functional annotation tool. The relationship between the preselected 15 GO terms and disease terms was examined reciprocally by retrieving rat genes annotated with these preselected terms. The individual GO term–annotated gene list showed enrichment in physiologically related diseases. For example, the ‘regulation of blood pressure’ genes were enriched with cardiovascular disease annotations, and the ‘lipid metabolic process’ genes with obesity annotations. Furthermore, we were able to enhance enrichment of neurological

  4. Concept Maps as Instructional Tools for Improving Learning of Phase Transitions in Object-Oriented Analysis and Design

    Science.gov (United States)

    Shin, Shin-Shing

    2016-01-01

    Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…

  5. Medical Assistance in Dying in Canada: An Ethical Analysis of Conscientious and Religious Objections

    Directory of Open Access Journals (Sweden)

    Christie, Timothy

    2016-08-01

    Full Text Available Background: The Supreme Court of Canada (SCC has ruled that the federal government is required to remove the provisions of the Criminal Code of Canada that prohibit medical assistance in dying (MAID. The SCC has stipulated that individual physicians will not be required to provide MAID should they have a religious or conscientious objection. Therefore, the pending legislative response will have to balance the rights of the patients with the rights of physicians, other health care professionals, and objecting institutions. Objective: The objective of this paper is to critically assess, within the Canadian context, the moral probity of individual or institutional objections to MAID that are for either religious or conscientious reasons. Methods: Deontological ethics and the Doctrine of Double Effect. Results: The religious or conscientious objector has conflicting duties, i.e., a duty to respect the “right to life” (section 7 of the Charter and a duty to respect the tenets of his or her religious or conscientious beliefs (protected by section 2 of the Charter. Conclusion: The discussion of religious or conscientious objections to MAID has not explicitly considered the competing duties of the conscientious objector. It has focussed on the fact that a conscientious objection exists and has ignored the normative question of whether the duty to respect one’s conscience or religion supersedes the duty to respect the patient’s right to life.

  6. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Energy Technology Data Exchange (ETDEWEB)

    Vinai, P

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  7. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    International Nuclear Information System (INIS)

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model, achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  8. A Case Study on Coloured Petri Nets in Object-oriented Analysis and Design

    DEFF Research Database (Denmark)

    Barros, Joao Paulo; Jørgensen, Jens Bæk

    2005-01-01

    In this paper, we first demonstrate how a coloured Petri nets (CPN) model can be used to capture requirements for a considered example system, an elevator controller. Then, we show how this requirements-level CPN model is transformed into a design-level object-oriented CPN model, which is structurally and conceptually closer to class diagrams and object-oriented programming languages. The CPN models reduce the gap between user-level requirements and the respective implementation, thus simplifying the implementation or code generation. Finally, we discuss the code generation from object-oriented...

  9. Relationship between climatic variables and the variation in bulk tank milk composition using canonical correlation analysis.

    Science.gov (United States)

    Stürmer, Morgana; Busanello, Marcos; Velho, João Pedro; Heck, Vanessa Isabel; Haygert-Velho, Ione Maria Pereira

    2018-06-04

    A number of studies have addressed the relations between climatic variables and milk composition, but these works used univariate statistical approaches. In our study, we used a multivariate approach (canonical correlation) to study the impact of climatic variables on milk composition, price, and monthly milk production at a dairy farm using bulk tank milk data. Data on milk composition, price, and monthly milk production were obtained from a dairy company that purchased the milk from the farm, while climatic variable data were obtained from the National Institute of Meteorology (INMET). The data are from January 2014 to December 2016. Univariate correlation analysis and canonical correlation analysis were performed. Few correlations between the climatic variables and milk composition were found using a univariate approach. However, using canonical correlation analysis, we found a strong and significant correlation (rc = 0.95, p-value = 0.0029). Lactose, ambient temperature measures (mean, minimum, and maximum), and temperature-humidity index (THI) were found to be the most important variables for the canonical correlation. Our study indicated that 10.2% of the variation in milk composition, pricing, and monthly milk production can be explained by climatic variables. Ambient temperature variables, together with THI, seem to have the most influence on variation in milk composition.
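
    A minimal sketch of the canonical correlation computation, assuming scikit-learn's CCA and synthetic stand-ins for the climate and milk variable sets (the study's own variables include temperature means and THI):

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(2)
        climate = rng.normal(size=(36, 4))  # months x (Tmean, Tmin, Tmax, THI)
        milk    = rng.normal(size=(36, 5))  # fat, protein, lactose, price, volume

        cca = CCA(n_components=1).fit(climate, milk)
        u, v = cca.transform(climate, milk)
        r_c = np.corrcoef(u[:, 0], v[:, 0])[0, 1]  # first canonical correlation
        print(r_c)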

  10. A genome-wide analysis of putative functional and exonic variation associated with extremely high intelligence.

    Science.gov (United States)

    Spain, S L; Pedroso, I; Kadeva, N; Miller, M B; Iacono, W G; McGue, M; Stergiakouli, E; Davey Smith, G; Putallaz, M; Lubinski, D; Meaburn, E L; Plomin, R; Simpson, M A

    2016-08-01

    Although individual differences in intelligence (general cognitive ability) are highly heritable, molecular genetic analyses to date have had limited success in identifying specific loci responsible for its heritability. This study is the first to investigate exome variation in individuals of extremely high intelligence. Under the quantitative genetic model, sampling from the high extreme of the distribution should provide increased power to detect associations. We therefore performed a case-control association analysis with 1409 individuals drawn from the top 0.0003 (IQ >170) of the population distribution of intelligence and 3253 unselected population-based controls. Our analysis focused on putative functional exonic variants assayed on the Illumina HumanExome BeadChip. We did not observe any individual protein-altering variants that are reproducibly associated with extremely high intelligence and within the entire distribution of intelligence. Moreover, no significant associations were found for multiple rare alleles within individual genes. However, analyses using genome-wide similarity between unrelated individuals (genome-wide complex trait analysis) indicate that the genotyped functional protein-altering variation yields a heritability estimate of 17.4% (s.e. 1.7%) based on a liability model. In addition, investigation of nominally significant associations revealed fewer rare alleles associated with extremely high intelligence than would be expected under the null hypothesis. This observation is consistent with the hypothesis that rare functional alleles are more frequently detrimental than beneficial to intelligence.

  11. Quasi-static Cycle Performance Analysis of Micro Modular Reactor for Heat Sink Temperature Variation

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seong Kuk; Lee, Jekyoung; Ahn, Yoonhan; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of); Cha, Jae Eun [KAERI, Daejeon (Korea, Republic of)

    2015-10-15

    A supercritical CO2 (S-CO2) cycle has potential for high thermal efficiency at moderate turbine inlet temperatures (450-750 °C) and can achieve a compact system size because of the small specific volume of the fluid and simple cycle layouts. Owing to the small specific volume of S-CO2 and developments in heat exchanger technology, complete modularization of the system can be accomplished. Previous works focused on cycle performance analysis for the design point only. However, the heat sink temperature can change with the ambient conditions, e.g., weather and seasonal changes. This can influence the compressor inlet temperature, which alters the overall cycle operating condition. To reflect the heat sink temperature variation, a quasi-static analysis code for a simple recuperated S-CO2 Brayton cycle has been developed by the KAIST research team. Thus, a cycle performance analysis with compressor inlet temperature variation is carried out in this research. In the case of a dry air-cooling system, the ambient temperature of the local surroundings can affect the compressor inlet temperature. As the compressor inlet temperature increases, thermal efficiency and generated electricity decrease. As further work, an experiment with the S-CO2 integral test loop will be performed to validate the in-house codes, such as KAIST_TMD and the quasi-static code.

  12. Analysis of Long-Term Temperature Variations in the Human Body.

    Science.gov (United States)

    Dakappa, Pradeepa Hoskeri; Mahabala, Chakrapani

    2015-01-01

    Body temperature is a continuous physiological variable. In normal healthy adults, oral temperature is estimated to vary between 36.1°C and 37.2°C. Fever is a complex host response to many external and internal agents and is a potential contributor to many clinical conditions. Despite temperature being one of the foremost vital signs, its variations during many pathological conditions have yet to be examined in detail using mathematical techniques. Classical fever patterns have been developed based on recordings obtained every 8-12 h; however, such patterns do not provide meaningful information for diagnosing diseases. Because fever is a host response, it is likely that there could be a unique response to specific etiologies. Continuous long-term temperature monitoring and pattern analysis, using specific analytical methods developed in engineering and physics, could aid in revealing unique fever responses of hosts in different clinical conditions. Furthermore, such analysis can potentially be used as a novel diagnostic tool and to study the effects of pharmaceutical agents and other therapeutic protocols. Thus, the goal of our article is to present a comprehensive review of the recent relevant literature and analyze the current state of research regarding temperature variations in the human body.

  13. Analysis of temporal variation in human masticatory cycles during gum chewing.

    Science.gov (United States)

    Crane, Elizabeth A; Rothman, Edward D; Childers, David; Gerstner, Geoffrey E

    2013-10-01

    The study investigated modulation of the fast and slow opening (FO, SO) and closing (FC, SC) chewing cycle phases using gum-chewing sequences in humans. Twenty-two healthy adult subjects participated by chewing gum for at least 20 s on the right side and at least 20 s on the left side while jaw movements were tracked with a 3D motion analysis system. Jaw movement data were digitized, and chewing cycle phases were identified and analysed for all chewing cycles in a complete sequence. All four chewing cycle phase durations were more variable than total cycle durations, a result found in other non-human primates. Significant negative correlations existed between the opening phases, SO and FO, and between the closing phases, SC and FC; however, there was less consistency in terms of which phases were negatively correlated, both between subjects and between chewing sides within subjects, compared with results reported in other species. The coordination of intra-cycle phases appears to be flexible and to follow complex rules during gum-chewing in humans. Alternatively, the observed intra-cycle phase relationships could simply reflect (1) variation in jaw kinematics due to variation in how gum was handled by the tongue on a chew-by-chew basis in our experimental design, or (2) variation due to data sampling noise and/or how phases were defined and identified.
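
    A minimal sketch of the phase-duration statistics discussed above (a coefficient of variation per phase and the sign of the SO-FO correlation), using synthetic durations with a built-in negative coupling:

        import numpy as np

        rng = np.random.default_rng(3)
        so = rng.normal(180, 40, 50)            # slow-open durations, ms (invented)
        fo = 400 - so + rng.normal(0, 15, 50)   # fast-open durations, ms

        def cv(x):
            """Coefficient of variation: std / mean."""
            return np.std(x) / np.mean(x)

        print(f"CV(SO) = {cv(so):.2f}, CV(FO) = {cv(fo):.2f}")
        print(f"corr(SO, FO) = {np.corrcoef(so, fo)[0, 1]:.2f}")  # negative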

  14. Analysis of substructural variation in families of enzymatic proteins with applications to protein function prediction

    Directory of Open Access Journals (Sweden)

    Fofanov Viacheslav Y

    2010-05-01

    Full Text Available Abstract Background Structural variations caused by a wide range of physico-chemical and biological sources directly influence the function of a protein. For enzymatic proteins, the structure and chemistry of the catalytic binding site residues can be loosely defined as a substructure of the protein. Comparative analysis of drug-receptor substructures across and within species has been used for lead evaluation. Substructure-level similarity between the binding sites of functionally similar proteins has also been used to identify instances of convergent evolution among proteins. In functionally homologous protein families, shared chemistry and geometry at catalytic sites provide a common, local point of comparison among proteins that may differ significantly at the sequence, fold, or domain topology levels. Results This paper describes two key results that can be used separately or in combination for protein function analysis. The Family-wise Analysis of SubStructural Templates (FASST method uses all-against-all substructure comparison to determine Substructural Clusters (SCs. SCs characterize the binding site substructural variation within a protein family. In this paper we focus on examples of automatically determined SCs that can be linked to phylogenetic distance between family members, segregation by conformation, and organization by homology among convergent protein lineages. The Motif Ensemble Statistical Hypothesis (MESH framework constructs a representative motif for each protein cluster among the SCs determined by FASST to build motif ensembles that are shown through a series of function prediction experiments to improve the function prediction power of existing motifs. Conclusions FASST contributes a critical feedback and assessment step to existing binding site substructure identification methods and can be used for the thorough investigation of structure-function relationships. The application of MESH allows for an automated

  15. Analysis of the average daily radon variations in the soil air

    International Nuclear Information System (INIS)

    Holy, K.; Matos, M.; Boehm, R.; Stanys, T.; Polaskova, A.; Hola, O.

    1998-01-01

    In this contribution, the search for a relation between the daily variations of the radon concentration and the regular daily oscillations of the atmospheric pressure is presented. The deviation of the radon activity concentration in the soil air from the average daily value reaches only a few percent. For the dry summer months, the average daily course of the radon activity concentration can be described by the obtained equation. The analysis of the average daily courses could give information concerning the depth of the gas-permeable soil layer, a soil parameter that is difficult to determine by other methods.
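
    The abstract does not reproduce the fitted equation itself, so the sketch below assumes a simple 24-h harmonic for the average daily course and fits it by least squares; all values are synthetic:

        import numpy as np
        from scipy.optimize import curve_fit

        def daily_course(t, a0, a1, t0):
            """Mean level a0 modulated by a 24-h harmonic of relative amplitude a1."""
            return a0 * (1.0 + a1 * np.cos(2.0 * np.pi * (t - t0) / 24.0))

        hours = np.arange(24.0)
        conc = 20.0 * (1 + 0.03 * np.cos(2 * np.pi * (hours - 14) / 24))  # synthetic
        conc += np.random.default_rng(4).normal(0, 0.1, 24)

        params, _ = curve_fit(daily_course, hours, conc, p0=(20.0, 0.05, 12.0))
        print(params)  # mean level, relative amplitude (a few percent), phase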

  16. Detecting peatland drains with Object Based Image Analysis and Geoeye-1 imagery.

    Science.gov (United States)

    Connolly, J; Holden, N M

    2017-12-01

    Peatlands play an important role in the global carbon cycle. They provide important ecosystem services including carbon sequestration and storage. Drainage disturbs peatland ecosystem services. Mapping drains is difficult and expensive and their spatial extent is, in many cases, unknown. An object-based image analysis (OBIA) was performed on a very high resolution satellite image (Geoeye-1) to extract information about drain location and extent on a blanket peatland in Ireland. Two accuracy assessment methods, the error matrix and the completeness, correctness and quality (CCQ) measures, were used to assess the extracted data across the peatland and at several sub sites. The cost of the OBIA method was compared with manual digitisation and field survey. The drain maps were also used to assess the costs relating to blocking drains vs. a business-as-usual scenario and estimating the impact of each on carbon fluxes at the study site. The OBIA method performed well at almost all sites. Almost 500 km of drains were detected within the peatland. In the error matrix method, the overall accuracy (OA) of detecting the drains was 94% and the kappa statistic was 0.66. The OA for all sub-areas, except one, was 95-97%. The CCQ was 85%, 85% and 71%, respectively. The OBIA method was the most cost effective way to map peatland drains and was at least 55% cheaper than either field survey or manual digitisation. The extracted drain maps were used to constrain the study-area CO2 flux, which was 19% smaller than the prescribed Peatland Code value for drained peatlands. The OBIA method used in this study showed that it is possible to accurately extract maps of fine-scale peatland drains over large areas in a cost effective manner. The development of methods to map the spatial extent of drains is important as they play a critical role in peatland carbon dynamics. The objective of this study was to extract data on the spatial extent of drains on a blanket bog in the west of Ireland.
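
    A minimal sketch of the two accuracy assessments named above: overall accuracy with the kappa statistic from a 2x2 error matrix, and the completeness/correctness/quality (CCQ) measures. All counts are invented; TP/FP/FN/TN refer to drain detections against reference data:

        def overall_accuracy_and_kappa(tp, fp, fn, tn):
            n = tp + fp + fn + tn
            oa = (tp + tn) / n
            # chance agreement for a 2x2 error matrix
            pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
            kappa = (oa - pe) / (1 - pe)
            return oa, kappa

        def ccq(tp, fp, fn):
            completeness = tp / (tp + fn)
            correctness = tp / (tp + fp)
            quality = tp / (tp + fp + fn)
            return completeness, correctness, quality

        # With a rare target class, OA can stay high while kappa drops,
        # much like the 94-97% OA vs. 0.66 kappa pattern reported above.
        print(overall_accuracy_and_kappa(tp=60, fp=20, fn=25, tn=895))
        print(ccq(tp=60, fp=20, fn=25))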

  17. Detecting peatland drains with Object Based Image Analysis and Geoeye-1 imagery

    Directory of Open Access Journals (Sweden)

    J. Connolly

    2017-03-01

    Full Text Available Abstract Background Peatlands play an important role in the global carbon cycle. They provide important ecosystem services including carbon sequestration and storage. Drainage disturbs peatland ecosystem services. Mapping drains is difficult and expensive and their spatial extent is, in many cases, unknown. An object-based image analysis (OBIA) was performed on a very high resolution satellite image (Geoeye-1) to extract information about drain location and extent on a blanket peatland in Ireland. Two accuracy assessment methods, the error matrix and the completeness, correctness and quality (CCQ) measures, were used to assess the extracted data across the peatland and at several sub sites. The cost of the OBIA method was compared with manual digitisation and field survey. The drain maps were also used to assess the costs relating to blocking drains vs. a business-as-usual scenario and estimating the impact of each on carbon fluxes at the study site. Results The OBIA method performed well at almost all sites. Almost 500 km of drains were detected within the peatland. In the error matrix method, the overall accuracy (OA) of detecting the drains was 94% and the kappa statistic was 0.66. The OA for all sub-areas, except one, was 95–97%. The CCQ was 85%, 85% and 71%, respectively. The OBIA method was the most cost effective way to map peatland drains and was at least 55% cheaper than either field survey or manual digitisation. The extracted drain maps were used to constrain the study-area CO2 flux, which was 19% smaller than the prescribed Peatland Code value for drained peatlands. Conclusions The OBIA method used in this study showed that it is possible to accurately extract maps of fine-scale peatland drains over large areas in a cost effective manner. The development of methods to map the spatial extent of drains is important as they play a critical role in peatland carbon dynamics. The objective of this study was to extract data on the spatial extent of drains on a blanket bog in the west of Ireland.

  18. Objective Acoustic-Phonetic Speech Analysis in Patients Treated for Oral or Oropharyngeal Cancer

    NARCIS (Netherlands)

    de Bruijn, Marieke J.; ten Bosch, Louis; Kuik, Dirk J.; Quene, Hugo; Langendijk, Johannes A.; Leemans, C. Rene; Verdonck-de Leeuw, Irma M.

    2009-01-01

    Objective: Speech impairment often occurs in patients after treatment for head and neck cancer. New treatment modalities such as surgical reconstruction or (chemo) radiation techniques aim at sparing anatomical structures that are correlated with speech and swallowing. In randomized trials

  19. Learning Objectives and Testing: An Analysis of Six Principles of Economics Textbooks, Using Bloom's Taxonomy.

    Science.gov (United States)

    Karns, James M. L.; And Others

    1983-01-01

    Significant differences were found between the stated objectives of most college level economics textbooks and the instruments included in the instructor's manuals to measure student achievement. (Author/RM)

  20. Analysis of students’ spatial thinking in geometry: 3D object into 2D representation

    Science.gov (United States)

    Fiantika, F. R.; Maknun, C. L.; Budayasa, I. K.; Lukito, A.

    2018-05-01

    The aim of this study is to find out the spatial thinking process of students in transforming a 3-dimensional (3D) object into a 2-dimensional (2D) representation. Spatial thinking is helpful in using maps, planning routes, designing floor plans, and creating art. Students can engage with geometric ideas by using concrete models and drawing. Spatial thinking in this study is identified through geometrical problems of transforming a 3-dimensional object into a 2-dimensional object image. The problem was solved by the subjects and analyzed with reference to predetermined spatial thinking indicators. Two representative elementary school subjects were chosen based on mathematical ability and visual learning style. An explorative description through a qualitative approach was used in this study. The results of this study are: 1) the boy and the girl subject produced different representations of spatial thinking, and 2) each subject had their own way of finding the fastest way to draw a cube net.

  1. Analysis of double support phase of biped robot and multi-objective ...

    Indian Academy of Sciences (India)

    ing objectives, namely power consumption and dynamic balance margin have been ... in detail to arrive at a complete knowledge of the biped walking systems on .... measured in the anti-clockwise sense with respect to the vertical axis.

  2. DGTD Analysis of Electromagnetic Scattering from Penetrable Conductive Objects with IBC

    KAUST Repository

    Li, Ping; Shi, Yifei; Jiang, Li; Bagci, Hakan

    2015-01-01

    To avoid straightforward volumetric discretization, a discontinuous Galerkin time-domain (DGTD) method integrated with the impedance boundary condition (IBC) is presented in this paper to analyze the scattering from objects with finite conductivity

  3. ANALYSIS ON THE VARIATION OF MEDIAL ROTATION VALUES ACCORDING TO THE POSITION OF THE HUMERAL DIAPHYSIS.

    Science.gov (United States)

    Miyazaki, Alberto Naoki; Fregoneze, Marcelo; Santos, Pedro Doneux; da Silva, Luciana Andrade; do Val Sella, Guilherme; Cohen, Carina; Busin Giora, Taís Stedile; Checchia, Sergio Luiz; Raia, Fabio; Pekelman, Hélio; Cymrot, Raquel

    2012-01-01

    To analyze the validity of measurements of medial rotation (MR) of the shoulder using vertebral levels, according to the variation in the position of the humeral diaphysis, and to test the bi-goniometer as a new measuring instrument, 140 shoulders (70 patients) presenting unilateral shoulder MR limitation were prospectively evaluated. The vertebral level was evaluated by means of a visual scale and was correlated with the angle obtained according to the position of the humeral diaphysis, using the bi-goniometer developed with the Department of Mechanical Engineering of Mackenzie University. The maximum vertebral level reached through MR on the unaffected side ranged from T3 to T12, and on the affected side, from T6 to the trochanter. Repositioning the affected limb in MR according to the angular values of the normal side showed that 57.13% of the patients reached lower levels, between the sacrum, gluteus and trochanter. From analysis of the maximum vertebral level attained and the variation between the affected angle x (frontal plane: abduction and MR of the shoulder) and the unaffected angle x in MR, we observed that the greater the angle of the diaphyseal axis, the smaller the variation in the vertebral level attained. From evaluating the linear correlation between the difference in maximum vertebral level reached and the variation between the affected angle y (extension and abduction of the shoulder) and the unaffected angle y in MR, we observed no well-established linear relationship between these variables. Measurement of MR using vertebral levels does not correspond to the real values, since it varies according to the positioning of the humeral diaphysis.

  4. Meta-Analysis of Mitochondrial DNA Variation in the Iberian Peninsula.

    Directory of Open Access Journals (Sweden)

    Ruth Barral-Arca

    Full Text Available The Iberian Peninsula has been the focus of attention of numerous studies dealing with mitochondrial DNA (mtDNA) variation, most of them targeting the control region segment. In the present study we sequenced the control region of 3,024 Spanish individuals from areas where available data were still limited. We also compiled mtDNA haplotypes from the literature involving 4,588 sequences and 28 population groups or small regions. We meta-analyzed all these data in order to shed further light on patterns of geographic variation, taking advantage of the large sample size and geographic coverage, in contrast with the atomized sampling strategy of previous work. The results indicate that the main mtDNA haplogroups show primarily clinal geographic patterns across the Iberian geography, roughly along a North-South axis. Haplogroup HV0 (where haplogroup U is nested) is more prevalent in the Franco Cantabrian region, in good agreement with previous findings that identified this area as a climate refuge during the Last Glacial Maximum (LGM), prior to a subsequent demographic re-expansion towards Central Europe and the Mediterranean. Typical sub-Saharan and North African lineages are slightly more prevalent in South Iberia, although at low frequencies; this pattern has been shaped mainly by the transatlantic slave trade and the Arab invasion of the Iberian Peninsula. The results also indicate that summary statistics that aim to measure molecular variation, or AMOVA, have limited sensitivity to detect population substructure, in contrast to patterns revealed by phylogeographic analysis. Overall, the results suggest that mtDNA variation in Iberia is substantially stratified. These patterns might be relevant in biomedical studies given that stratification is a common cause of false positives in case-control mtDNA association studies, and should be also considered when weighting the DNA evidence in forensic casework, which is strongly dependent on haplotype frequencies.

  5. Meta-Analysis of Mitochondrial DNA Variation in the Iberian Peninsula.

    Science.gov (United States)

    Barral-Arca, Ruth; Pischedda, Sara; Gómez-Carballa, Alberto; Pastoriza, Ana; Mosquera-Miguel, Ana; López-Soto, Manuel; Martinón-Torres, Federico; Álvarez-Iglesias, Vanesa; Salas, Antonio

    2016-01-01

    The Iberian Peninsula has been the focus of attention of numerous studies dealing with mitochondrial DNA (mtDNA) variation, most of them targeting the control region segment. In the present study we sequenced the control region of 3,024 Spanish individuals from areas where available data were still limited. We also compiled mtDNA haplotypes from the literature involving 4,588 sequences and 28 population groups or small regions. We meta-analyzed all these data in order to shed further light on patterns of geographic variation, taking advantage of the large sample size and geographic coverage, in contrast with the atomized sampling strategy of previous work. The results indicate that the main mtDNA haplogroups show primarily clinal geographic patterns across the Iberian geography, roughly along a North-South axis. Haplogroup HV0 (where haplogroup U is nested) is more prevalent in the Franco Cantabrian region, in good agreement with previous findings that identified this area as a climate refuge during the Last Glacial Maximum (LGM), prior to a subsequent demographic re-expansion towards Central Europe and the Mediterranean. Typical sub-Saharan and North African lineages are slightly more prevalent in South Iberia, although at low frequencies; this pattern has been shaped mainly by the transatlantic slave trade and the Arab invasion of the Iberian Peninsula. The results also indicate that summary statistics that aim to measure molecular variation, or AMOVA, have limited sensitivity to detect population substructure, in contrast to patterns revealed by phylogeographic analysis. Overall, the results suggest that mtDNA variation in Iberia is substantially stratified. These patterns might be relevant in biomedical studies given that stratification is a common cause of false positives in case-control mtDNA association studies, and should be also considered when weighting the DNA evidence in forensic casework, which is strongly dependent on haplotype frequencies.

  6. Scientific analysis of a calcified object from a post-medieval burial in Vienna, Austria.

    Science.gov (United States)

    Binder, Michaela; Berner, Margit; Krause, Heike; Kucera, Matthias; Patzak, Beatrix

    2016-09-01

    Calcifications commonly occur in association with soft tissue inflammation. However, they are not often discussed in the palaeopathological literature, frequently because of problems of identification and diagnosis. We present a calcified object (40×27×27 mm) found with a middle-aged male from a post-medieval cemetery in Vienna. It was not recognized during excavation, so its anatomical location within the body remains unknown. The object was subjected to X-ray, SEM and CT scanning and compared to historic pathological objects held in the collection of the Natural History Museum Vienna. The two of closest resemblance, a thyroid adenoma and a goitre, were subjected to similar analytical techniques for comparison. Despite similarities between all objects, the structure of the object most closely conforms to a thyroid tumor. Nevertheless, due to the similar pathophysiological pathways and biochemical composition of calcified soft tissues, a secure identification outside of the anatomical context is not possible. The research further highlights that recognition of such objects during excavation is crucial for a more conclusive diagnosis. Historic medical records indicate that such calcifications were common and might therefore be expected to occur frequently in cemeteries. Consequently, increasing the dataset of calcifications would also help extend knowledge about diseases in past human populations.

  7. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience.

    Science.gov (United States)

    Dunstone, Kimberley; Brennan, Emily; Slater, Michael D; Dixon, Helen G; Durkin, Sarah J; Pettigrew, Simone; Wakefield, Melanie A

    2017-04-11

    Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of different characteristics of alcohol harm reduction ads.

  8. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience

    Directory of Open Access Journals (Sweden)

    Kimberley Dunstone

    2017-04-01

    Full Text Available Abstract Background Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Method Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. Results In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Conclusions Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of different characteristics of alcohol harm reduction ads.

  9. Land Cover/Land Use Classification and Change Detection Analysis with Astronaut Photography and Geographic Object-Based Image Analysis

    Science.gov (United States)

    Hollier, Andi B.; Jagge, Amy M.; Stefanov, William L.; Vanderbloemen, Lisa A.

    2017-01-01

    For over fifty years, NASA astronauts have taken exceptional photographs of the Earth from the unique vantage point of low Earth orbit (as well as from lunar orbit and the surface of the Moon). The Crew Earth Observations (CEO) Facility is the NASA ISS payload supporting astronaut photography of the Earth's surface and atmosphere. From aurora to mountain ranges, deltas, and cities, there are over two million images of the Earth's surface dating back to the Mercury missions in the early 1960s. The Gateway to Astronaut Photography of Earth website (eol.jsc.nasa.gov) provides a publicly accessible platform to query and download these images at a variety of spatial resolutions and perform scientific research at no cost to the end user. As a demonstration for the science, applications, and education user communities, we examine astronaut photography of the Washington D.C. metropolitan area for three time steps between 1998 and 2016, using Geographic Object-Based Image Analysis (GEOBIA) to classify and quantify land cover/land use and provide a template for future change detection studies with astronaut photography.

  10. Challenges from variation across regions in cost effectiveness analysis in multi-regional clinical trials

    Directory of Open Access Journals (Sweden)

    Yunbo Chu

    2016-10-01

    Full Text Available Economic evaluation in the form of cost-effectiveness analysis has become a popular means to inform decisions in healthcare. With multi-regional clinical trials in a global development program becoming a new venue for drug efficacy testing in recent decades, questions in methods for cost-effectiveness analysis in the multi-regional clinical trials setting also emerge. This paper addresses some challenges from variation across regions in cost effectiveness analysis in multi-regional clinical trials. Several discussion points are raised for further attention and a multi-regional clinical trial example is presented to illustrate the implications in industrial application. A general message is delivered to call for a depth discussion by all stakeholders to reach an agreement on a good practice in cost-effectiveness analysis in the multi-regional clinical trials. Meanwhile, we recommend an additional consideration of cost-effectiveness analysis results based on the clinical evidence from a certain homogeneous population as sensitivity or scenario analysis upon data availability.

  11. Anatomical variations of hepatic arterial system, coeliac trunk and renal arteries: an analysis with multidetector CT angiography.

    Science.gov (United States)

    Ugurel, M S; Battal, B; Bozlar, U; Nural, M S; Tasar, M; Ors, F; Saglam, M; Karademir, I

    2010-08-01

    The purpose of our investigation was to determine the anatomical variations in the coeliac trunk-hepatic arterial system and the renal arteries in patients who underwent multidetector CT (MDCT) angiography of the abdominal aorta for various reasons. A total of 100 patients were analysed retrospectively. The coeliac trunk, hepatic arterial system and renal arteries were analysed individually and anatomical variations were recorded. Statistical analysis of the relationship between hepatocoeliac variations and renal artery variations was performed using a chi(2) test. There was a coeliac trunk trifurcation in 89% and bifurcation in 8% of the cases. Coeliac trunk was absent in 1%, a hepatosplenomesenteric trunk was seen in 1% and a splenomesenteric trunk was present in 1%. Hepatic artery variation was present in 48% of patients. Coeliac trunk and/or hepatic arterial variation was present in 23 (39.7%) of the 58 patients with normal renal arteries, and in 27 (64.3%) of the 42 patients with accessory renal arteries. There was a statistically significant correlation between renal artery variations and coeliac trunk-hepatic arterial system variations (p = 0.015). MDCT angiography permits a correct and detailed evaluation of hepatic and renal vascular anatomy. The prevalence of variations in the coeliac trunk and/or hepatic arteries is increased in people with accessory renal arteries. For that reason, when undertaking angiographic examinations directed towards any single organ, the possibility of variations in the vascular structure of other organs should be kept in mind.

  12. Pan-Genome Analysis Links the Hereditary Variation of Leptospirillum ferriphilum With Its Evolutionary Adaptation

    Directory of Open Access Journals (Sweden)

    Xian Zhang

    2018-03-01

    Full Text Available Niche adaptation has long been recognized to drive intra-species differentiation and speciation, yet knowledge about its relatedness with hereditary variation of microbial genomes is relatively limited. Using Leptospirillum ferriphilum species as a case study, we present a detailed analysis of genomic features of five recognized strains. Genome-to-genome distance calculation preliminarily determined the roles of spatial distance and environmental heterogeneity that potentially contribute to intra-species variation within L. ferriphilum species at the genome level. Mathematical models were further constructed to extrapolate the expansion of L. ferriphilum genomes (an ‘open’ pan-genome, indicating the emergence of novel genes with new sequenced genomes. The identification of diverse mobile genetic elements (MGEs (such as transposases, integrases, and phage-associated genes revealed the prevalence of horizontal gene transfer events, which is an important evolutionary mechanism that provides avenues for the recruitment of novel functionalities and further for the genetic divergence of microbial genomes. Comprehensive analysis also demonstrated that the genome reduction by gene loss in a broad sense might contribute to the observed diversification. We thus inferred a plausible explanation to address this observation: the community-dependent adaptation that potentially economizes the limiting resources of the entire community. Now that the introduction of new genes is accompanied by a parallel abandonment of some other ones, our results provide snapshots on the biological fitness cost of environmental adaptation within the L. ferriphilum genomes. In short, our genome-wide analyses bridge the relation between genetic variation of L. ferriphilum with its evolutionary adaptation.

  13. National Variation in Urethroplasty Cost and Predictors of Extreme Cost: A Cost Analysis With Policy Implications.

    Science.gov (United States)

    Harris, Catherine R; Osterberg, E Charles; Sanford, Thomas; Alwaal, Amjad; Gaither, Thomas W; McAninch, Jack W; McCulloch, Charles E; Breyer, Benjamin N

    2016-08-01

    To determine which factors are associated with higher costs of urethroplasty procedure and whether these factors have been increasing over time. Identification of determinants of extreme costs may help reduce cost while maintaining quality. We conducted a retrospective analysis using the 2001-2010 Healthcare Cost and Utilization Project-Nationwide Inpatient Sample (HCUP-NIS). The HCUP-NIS captures hospital charges which we converted to cost using the HCUP cost-to-charge ratio. Log cost linear regression with sensitivity analysis was used to determine variables associated with increased costs. Extreme cost was defined as the top 20th percentile of expenditure, analyzed with logistic regression, and expressed as odds ratios (OR). A total of 2298 urethroplasties were recorded in NIS over the study period. The median (interquartile range) calculated cost was $7321 ($5677-$10,000). Patients with multiple comorbid conditions were associated with extreme costs [OR 1.56, 95% confidence interval (CI) 1.19-2.04, P = .02] compared with patients with no comorbid disease. Inpatient complications raised the odds of extreme costs (OR 3.2, CI 2.14-4.75, P costs (OR 1.78, 95% CI 1.2-2.64, P = .005). Variations in patient age, race, hospital region, bed size, teaching status, payor type, and volume of urethroplasty cases were not associated with extremes of cost. Cost variation for perioperative inpatient urethroplasty procedures is dependent on preoperative patient comorbidities, postoperative complications, and surgical complexity related to graft usage. Procedural cost and cost variation are critical for understanding which aspects of care have the greatest impact on cost. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Analysis of Greedy Decision Making for Geographic Routing for Networks of Randomly Moving Objects

    Directory of Open Access Journals (Sweden)

    Amber Israr

    2016-04-01

    Full Text Available Autonomous and self-organizing wireless ad-hoc communication networks for moving objects consist of nodes, which use no centralized network infrastructure. Examples of moving object networks are networks of flying objects, networks of vehicles, networks of moving people or robots. Moving object networks have to face many critical challenges in terms of routing because of dynamic topological changes and asymmetric networks links. A suitable and effective routing mechanism helps to extend the deployment of moving nodes. In this paper an attempt has been made to analyze the performance of the Greedy Decision method (position aware distance based algorithm for geographic routing for network nodes moving according to the random waypoint mobility model. The widely used GPSR (Greedy Packet Stateless Routing protocol utilizes geographic distance and position based data of nodes to transmit packets towards destination nodes. In this paper different scenarios have been tested to develop a concrete set of recommendations for optimum deployment of distance based Greedy Decision of Geographic Routing in randomly moving objects network

  15. Measuring systems of hard to get objects: problems with analysis of measurement results

    Science.gov (United States)

    Gilewska, Grazyna

    2005-02-01

    The problem accessibility of metrological parameters features of objects appeared in many measurements. Especially if it is biological object which parameters very often determined on the basis of indirect research. Accidental component predominate in forming of measurement results with very limited access to measurement objects. Every measuring process has a lot of conditions limiting its abilities to any way processing (e.g. increase number of measurement repetition to decrease random limiting error). It may be temporal, financial limitations, or in case of biological object, small volume of sample, influence measuring tool and observers on object, or whether fatigue effects e.g. at patient. It's taken listing difficulties into consideration author worked out and checked practical application of methods outlying observation reduction and next innovative methods of elimination measured data with excess variance to decrease of mean standard deviation of measured data, with limited aomunt of data and accepted level of confidence. Elaborated methods wee verified on the basis of measurement results of knee-joint width space got from radiographs. Measurements were carried out by indirectly method on the digital images of radiographs. Results of examination confirmed legitimacy to using of elaborated methodology and measurement procedures. Such methodology has special importance when standard scientific ways didn't bring expectations effects.

  16. Designing personal grief rituals: An analysis of symbolic objects and actions.

    Science.gov (United States)

    Sas, Corina; Coman, Alina

    2016-10-01

    Personal grief rituals are beneficial in dealing with complicated grief, but challenging to design, as they require symbolic objects and actions meeting clients' emotional needs. The authors reported interviews with 10 therapists with expertise in both grief therapy and grief rituals. Findings indicate three types of rituals supporting honoring, letting go, and self transformation, with the latter being particularly complex. Outcomes also point to a taxonomy of ritual objects for framing and remembering ritual experience, and for capturing and processing grief. Besides symbolic possessions, the authors identified other types of ritual objects including transformational and future-oriented ones. Symbolic actions include creative craft of ritual objects, respectful handling, disposal, and symbolic play. They conclude with theoretical implications of these findings, and a reflection on their value for tailored, creative co-design of grief rituals. In particular, several implications for designing grief rituals were identified that include accounting for the client's need, selecting (or creating) the most appropriate objects and actions from the identified types, integrating principles of both grief and art/drama therapy, exploring clients' affinity for the ancient elements as medium of disposal in letting go rituals, and the value of technology for recording and reflecting on ritual experience.

  17. Variation Trend Analysis of Runoff and Sediment Time Series Based on the R/S Analysis of Simulated Loess Tilled Slopes in the Loess Plateau, China

    Directory of Open Access Journals (Sweden)

    Ju Zhang

    2017-12-01

    Full Text Available The objective of this study was to illustrate the temporal variation of runoff and sediment of loess tilled slopes under successive rainfall conditions. Loess tilled slopes with four microtopography types (straight cultivated slope, artificial backhoe, artificial digging, and contour tillage under five slope gradients (5°, 10°, 15°, 20°, 25° were simulated and a rainfall intensity of 60 mm/h was adopted. The temporal trends of runoff and sediment yield were predicted based on the Rescaled Range (R/S analysis method. The results indicate that the Hurst indices of runoff time series and sediment time series are higher than 0.5, and a long-term positive correlation exists between the future and the past. This means that runoff and sediment of loess tilled slopes in the future will have the same trends as in the past. The results obtained by the classical R/S analysis method were the same as those of the modified R/S analysis method. The rationality and reliability of the R/S analysis method were further identified and the method can be used for predicting the trend of runoff and sediment yield. The correlation between the microtopography and the Hurst indices of the runoff and sediment yield time series, as well as between the slopes and the Hurst indices, were tested, and the result was that there was no significant correlation between them. The microtopography and slopes cannot affect the correlation and continuity of runoff and sediment yield time series. This study provides an effective method for predicting variations in the trends of runoff and sediment yield on loess tilled slopes.

  18. [Analysis of genomic copy number variations in two sisters with primary amenorrhea and hyperandrogenism].

    Science.gov (United States)

    Zhang, Yanliang; Xu, Qiuyue; Cai, Xuemei; Li, Yixun; Song, Guibo; Wang, Juan; Zhang, Rongchen; Dai, Yong; Duan, Yong

    2015-12-01

    To analyze genomic copy number variations (CNVs) in two sisters with primary amenorrhea and hyperandrogenism. G-banding was performed for karyotype analysis. The whole genome of the two sisters were scanned and analyzed by array-based comparative genomic hybridization (array-CGH). The results were confirmed with real-time quantitative PCR (RT-qPCR). No abnormality was found by conventional G-banded chromosome analysis. Array-CGH has identified 11 identical CNVs from the sisters which, however, overlapped with CNVs reported by the Database of Genomic Variants (http://projects.tcag.ca/variation/). Therefore, they are likely to be benign. In addition, a -8.44 Mb 9p11.1-p13.1 duplication (38,561,587-47,002,387 bp, hg18) and a -80.9 kb 4q13.2 deletion (70,183,990-70,264,889 bp, hg18) were also detected in the elder and younger sister, respectively. The relationship between such CNVs and primary amenorrhea and hyperandrogenism was however uncertain. RT-qPCR results were in accordance with array-CGH. Two CNVs were detected in two sisters by array-CGH, for which further studies are needed to clarify their correlation with primary amenorrhea and hyperandrogenism.

  19. Some analysis on the diurnal variation of rainfall over the Atlantic Ocean

    Science.gov (United States)

    Gill, T.; Perng, S.; Hughes, A.

    1981-01-01

    Data collected from the GARP Atlantic Tropical Experiment (GATE) was examined. The data were collected from 10,000 grid points arranged as a 100 x 100 array; each grid covered a 4 square km area. The amount of rainfall was measured every 15 minutes during the experiment periods using c-band radars. Two types of analyses were performed on the data: analysis of diurnal variation was done on each of grid points based on the rainfall averages at noon and at midnight, and time series analysis on selected grid points based on the hourly averages of rainfall. Since there are no known distribution model which best describes the rainfall amount, nonparametric methods were used to examine the diurnal variation. Kolmogorov-Smirnov test was used to test if the rainfalls at noon and at midnight have the same statistical distribution. Wilcoxon signed-rank test was used to test if the noon rainfall is heavier than, equal to, or lighter than the midnight rainfall. These tests were done on each of the 10,000 grid points at which the data are available.

  20. Multi-objective game-theory models for conflict analysis in reservoir watershed management.

    Science.gov (United States)

    Lee, Chih-Sheng

    2012-05-01

    This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assistance in decision. Game theory is used as an alternative tool for analyzing strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). Geographic information system is used to concisely illustrate and calculate the areas of various land use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and which can be interpreted easily by decision makers. For comparison, the decision-making process using conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    Science.gov (United States)

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method with the motivation that making the effect of objective experiments close to that of subjective way. We believe that the image regions with different visual salient degree should not have the same weights when designing an assessment metric. Therefore, we firstly use GBVS algorithm to each frame pairs and separate both the left and right viewing images into the regions with strong, general and week saliency. Besides, local feature information like blockiness, zero-crossing and depth are extracted and combined with a mathematical model to calculate a quality assessment score. Regions with different salient degree are assigned with different weights in the mathematical model. Experiment results demonstrate the superiority of our method compared with the existed state-of-the-art no-reference objective Stereoscopic video quality assessment methods.

  2. Indicators analysis and objectives for the development sustainable and sustainability environmental

    Directory of Open Access Journals (Sweden)

    Pedro Noboa-Romero

    2016-09-01

    Full Text Available The present article is product of a research qualitative, descriptive and analytical of the indicators and objectives aimed to the development sustainable. The main objective of this essay is to analyze sustainability indicators: index of human development (IDH, sustainable development goals (SDGS, objectives of the Millennium Goals (MDGS and the index of Multidimensional poverty (IPM; through a review of research and work on these issues, in order to establish progress and results that have been generated during the use of these indicators in the field of health education, technology, and environment. Demonstrate that there is inequality between Nations, the approach is oriented to a development in the short term, benefit exclusively to current generations, exhausting natural resources, regardless of a vision in the long term for the future generations.

  3. Melting temperature and enthalpy variations of phase change materials (PCMs): a differential scanning calorimetry (DSC) analysis

    Science.gov (United States)

    Sun, Xiaoqin; Lee, Kyoung Ok; Medina, Mario A.; Chu, Youhong; Li, Chuanchang

    2018-06-01

    Differential scanning calorimetry (DSC) analysis is a standard thermal analysis technique used to determine the phase transition temperature, enthalpy, heat of fusion, specific heat and activation energy of phase change materials (PCMs). To determine the appropriate heating rate and sample mass, various DSC measurements were carried out using two kinds of PCMs, namely N-octadecane paraffin and calcium chloride hexahydrate. The variations in phase transition temperature, enthalpy, heat of fusion, specific heat and activation energy were observed within applicable heating rates and sample masses. It was found that the phase transition temperature range increased with increasing heating rate and sample mass; while the heat of fusion varied without any established pattern. The specific heat decreased with the increase of heating rate and sample mass. For accuracy purpose, it is recommended that for PCMs with high thermal conductivity (e.g. hydrated salt) the focus will be on heating rate rather than sample mass.

  4. Genetic variation analysis and relationships among environmental strains of Scedosporium apiospermum sensu stricto in Bangkok, Thailand.

    Directory of Open Access Journals (Sweden)

    Thanwa Wongsuk

    Full Text Available The Scedosporium apiospermum species complex is an emerging filamentous fungi that has been isolated from environment. It can cause a wide range of infections in both immunocompetent and immunocompromised individuals. We aimed to study the genetic variation and relationships between 48 strains of S. apiospermum sensu stricto isolated from soil in Bangkok, Thailand. For PCR, sequencing and phylogenetic analysis, we used the following genes: actin; calmodulin exons 3 and 4; the second largest subunit of the RNA polymerase II; ß-tubulin exon 2-4; manganese superoxide dismutase; internal transcribed spacer; transcription elongation factor 1α; and beta-tubulin exons 5 and 6. The present study is the first phylogenetic analysis of relationships among S. apiospermum sensu stricto in Thailand and South-east Asia. This result provides useful information for future epidemiological study and may be correlated to clinical manifestation.

  5. Analysis and design of the SI-simulator software system for the VHTR-SI process by using the object-oriented analysis and object-oriented design methodology

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

    The SI-simulator is an application software system that simulates the dynamic behavior of the VHTR-SI process by the use of mathematical models. Object-oriented analysis (OOA) and object-oriented design (OOD) methodologies were employed for the SI simulator system development. OOA is concerned with developing software engineering requirements and specifications that are expressed as a system's object model (which is composed of a population of interacting objects), as opposed to the traditional data or functional views of systems. OOD techniques are useful for the development of large complex systems. Also, OOA/OOD methodology is usually employed to maximize the reusability and extensibility of a software system. In this paper, we present a design feature for the SI simulator software system by the using methodologies of OOA and OOD

  6. Java programming fundamentals problem solving through object oriented analysis and design

    CERN Document Server

    Nair, Premchand S

    2008-01-01

    While Java texts are plentiful, it's difficult to find one that takes a real-world approach, and encourages novice programmers to build on their Java skills through practical exercise. Written by an expert with 19 experience teaching computer programming, Java Programming Fundamentals presents object-oriented programming by employing examples taken from everyday life. Provides a foundation in object-oriented design principles and UML notation Describes common pitfalls and good programming practicesFurnishes supplemental links, documents, and programs on its companion website, www.premnair.netU

  7. Multi-objective analysis of the conjunctive use of surface water and groundwater in a multisource water supply system

    Science.gov (United States)

    Vieira, João; da Conceição Cunha, Maria

    2017-04-01

    A multi-objective decision model has been developed to identify the Pareto-optimal set of management alternatives for the conjunctive use of surface water and groundwater of a multisource urban water supply system. A multi-objective evolutionary algorithm, Borg MOEA, is used to solve the multi-objective decision model. The multiple solutions can be shown to stakeholders allowing them to choose their own solutions depending on their preferences. The multisource urban water supply system studied here is dependent on surface water and groundwater and located in the Algarve region, southernmost province of Portugal, with a typical warm Mediterranean climate. The rainfall is low, intermittent and concentrated in a short winter, followed by a long and dry period. A base population of 450 000 inhabitants and visits by more than 13 million tourists per year, mostly in summertime, turns water management critical and challenging. Previous studies on single objective optimization after aggregating multiple objectives together have already concluded that only an integrated and interannual water resources management perspective can be efficient for water resource allocation in this drought prone region. A simulation model of the multisource urban water supply system using mathematical functions to represent the water balance in the surface reservoirs, the groundwater flow in the aquifers, and the water transport in the distribution network with explicit representation of water quality is coupled with Borg MOEA. The multi-objective problem formulation includes five objectives. Two objective evaluate separately the water quantity and the water quality supplied for the urban use in a finite time horizon, one objective calculates the operating costs, and two objectives appraise the state of the two water sources - the storage in the surface reservoir and the piezometric levels in aquifer - at the end of the time horizon. The decision variables are the volume of withdrawals from

  8. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis

    Science.gov (United States)

    Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.

    2012-04-01

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does. The latter uses the object's color (spectral information), size, texture, shape and occurrence to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image grouping together pixels into objects and next uses a wide range of object properties to classify the objects or to extract object's properties from the image. Significant advances and improvements in image analysis and interpretation are made thanks to GEOBIA. In June 2010 the third conference on GEOBIA took place at the Ghent University after successful previous meetings in Calgary (2008) and Salzburg (2006). This special issue presents a selection of the 2010 conference papers that are worked out as full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques. The topics range from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping, land cover change, feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine papers were selected and are presented in this special issue. Selection was done on the basis of quality and topic of the studies. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012 where we hope to welcome even more scientists working in the field of GEOBIA.

  9. Prediction of ppm level electrical failure by using physical variation analysis

    Science.gov (United States)

    Hou, Hsin-Ming; Kung, Ji-Fu; Hsu, Y.-B.; Yamazaki, Y.; Maruyama, Kotaro; Toyoshima, Yuya; Chen, Chu-en

    2016-03-01

    their spatial correlation distance. For local variations (LV) there is no correlation, whereas for global variations (GV) the correlation distance is very large [7]-[9]. This is the first time to certificate the validation of spatial distribution from the affordable bias contour big data fundamental infrastructures. And then apply statistical techniques to dig out the variation sources. The GV come from systematic issue, which could be compensated by adaptive LT condition or OPC correction. But LV comes from random issue, which being considered as intrinsic problem such as structure, material, tool capability… etc. In this paper studying, we can find out the advanced technology node SRAM contact CD local variation (LV) dominates in total variation, about 70%. It often plays significant in-line real time catching WP-DPMO role of the product yield loss, especially for wafer edge is the worst loss within wafer distribution and causes serious reliability concern. The major root cause of variations comes from the PR material induced burr defect (LV), the second one comes from GV enhanced wafer edge short opportunity, which being attributed to three factors, first one factor is wafer edge CD deliberated enlargement for yield improvement as shown in Fig. 10. Second factor is overlaps/AA shifts due to tool capability dealing with incoming wafer's war page issue and optical periphery layout dependent working pitch issue as shown in Fig. 9 (1)., the last factor comes from wafer edge burr enhanced by wafer edge larger Photo Resistance (PR) spin centrifugal force. After implementing KPIs such as GV related AA/CD indexes as shown in Fig. 9 (1) and 10, respectively, and LV related burr index as shown in Fig. 11., we can construct the parts per million (PPM) level short probability model via multi-variables regression, canonical correlation analysis and logistic transformation. The model provides prediction of PPM level electrical failure by using in-line real time physical

  10. A Component Analysis of the Impact of Evaluative and Objective Feedback on Performance

    Science.gov (United States)

    Johnson, Douglas A.

    2013-01-01

    Despite the frequency with which performance feedback interventions are used in organizational behavior management, component analyses of such feedback are rare. It has been suggested that evaluation of performance and objective details about performance are two necessary components for performance feedback. The present study was designed to help…

  11. A Case Study on Coloured Petri Nets in Object-oriented Analysis and Design

    DEFF Research Database (Denmark)

    Barros, Joao Paulo; Jørgensen, Jens Bæk

    2005-01-01

    In this paper, we first demonstrate how a coloured Petri nets (CPN) model can be used to capture requirements for a considered example system, an elevator controller. Then, we show how this requirements-level CPN model is transformed into a design-level object-oriented CPN model, which...

  12. An integrated approach for visual analysis of a multisource moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; van Hage, W.R.; de Vries, G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  13. An Integrated Approach for Visual Analysis of a Multi-Source Moving Objects Knowledge Base

    NARCIS (Netherlands)

    Willems, C.M.E.; van Hage, W.R.; de Vries, G.K.D.; Janssens, J.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  14. An integrated approach for visual analysis of a multi-source moving objects knowledge base

    NARCIS (Netherlands)

    Willems, N.; Hage, van W.R.; Vries, de G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from an ongoing research of four different partners from the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  15. Optimum analysis of pavement maintenance using multi-objective genetic algorithms

    Directory of Open Access Journals (Sweden)

    Amr A. Elhadidy

    2015-04-01

    Full Text Available Road network expansion in Egypt is considered as a vital issue for the development of the country. This is done while upgrading current road networks to increase the safety and efficiency. A pavement management system (PMS is a set of tools or methods that assist decision makers in finding optimum strategies for providing and maintaining pavements in a serviceable condition over a given period of time. A multi-objective optimization problem for pavement maintenance and rehabilitation strategies on network level is discussed in this paper. A two-objective optimization model considers minimum action costs and maximum condition for used road network. In the proposed approach, Markov-chain models are used for predicting the performance of road pavement and to calculate the expected decline at different periods of time. A genetic-algorithm-based procedure is developed for solving the multi-objective optimization problem. The model searched for the optimum maintenance actions at adequate time to be implemented on an appropriate pavement. Based on the computing results, the Pareto optimal solutions of the two-objective optimization functions are obtained. From the optimal solutions represented by cost and condition, a decision maker can easily obtain the information of the maintenance and rehabilitation planning with minimum action costs and maximum condition. The developed model has been implemented on a network of roads and showed its ability to derive the optimal solution.

  16. An Analysis of Learning Objectives and Content Coverage in Introductory Psychology Syllabi

    Science.gov (United States)

    Homa, Natalie; Hackathorn, Jana; Brown, Carrie M.; Garczynski, Amy; Solomon, Erin D.; Tennial, Rachel; Sanborn, Ursula A.; Gurung, Regan A. R.

    2013-01-01

    Introductory psychology is one of the most popular undergraduate courses and often serves as the gateway to choosing psychology as an academic major. However, little research has examined the typical structure of introductory psychology courses. The current study examined student learning objectives (SLOs) and course content in introductory…

  17. Object Selection Costs in Visual Working Memory: A Diffusion Model Analysis of the Focus of Attention

    Science.gov (United States)

    Sewell, David K.; Lilburn, Simon D.; Smith, Philip L.

    2016-01-01

    A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can…

  18. Simple proteomics data analysis in the object-oriented PowerShell.

    Science.gov (United States)

    Mohammed, Yassene; Palmblad, Magnus

    2013-01-01

    Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."

  19. Analysis of Buried Dielectric Objects Using Higher-Order MoM for Volume Integral Equations

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Meincke, Peter; Breinbjerg, Olav

    2004-01-01

    A higher-order method of moments (MoM) is applied to solve a volume integral equation for dielectric objects in layered media. In comparison to low-order methods, the higher-order MoM, which is based on higher-order hierarchical Legendre vector basis functions and curvilinear hexahedral elements,...

  20. Analysis of porous media and objects of cultural heritage by mobile NMR

    International Nuclear Information System (INIS)

    Haber, Agnes

    2012-01-01

    Low-field NMR techniques are used to study porous system, from simple to complex structures, and objects of cultural heritage. It is shown that NMR relaxometry can be used to study the fluid dynamics inside a porous system. A simple theoretical model for multi-site relaxation exchange NMR is used to extract exchange kinetic parameters when applied on a model porous systems. It provides a first step towards the study of more complex systems, where continuous relaxation distributions are present, such as soil systems or building materials. Moisture migration is observed in the soil systems with the help of 1D and 2D NMR relaxometry methods. In case of the concrete samples, the difference in composition makes a significant difference in the ability of water uptake. The single-sided NMR sensor proves to be a useful tool for on-site measurements. This is very important also in the case of the cultural heritage objects, as most of the objects can not be moved out of their environment. Mobile NMR turns out to be a simple but reliable and powerful tool to investigate moisture distributions and pore structures in porous media as well as the conservation state and history of objects of cultural heritage.

  1. Development and Factor Analysis of an Instrument to Measure Preservice Teachers' Perceptions of Learning Objects

    Science.gov (United States)

    Sahin, Sami

    2010-01-01

    The purpose of this study was to develop a questionnaire to measure student teachers' perception of digital learning objects. The participants included 308 voluntary senior students attending courses in a college of education of a public university in Turkey. The items were extracted to their related factors by the principal axis factoring method.…

  2. An analysis of nature and mechanisms of the Lira objects territories' radionuclide contamination

    International Nuclear Information System (INIS)

    Kadyrzhanov, K.K; Tuleushev, A.Zh.; Lukashenko, S.N.; Solodukhin, V.P.; Kazachevskij, I.V.; Reznikov, S.V.

    2001-01-01

    In the paper the results of study of radioactive contamination of 'Lira' objects territories are presented. Obtained data are evidencing, that existing radiation situation does not presents a threat for operating personnel of both the occupied on the deposit and its objects furthermore for inhabitants of the closest localities. Therewith a radionuclides concentration in the soils on the examined areas is slightly exceeds the background values characteristic for this region. Two hypothesises for reveled radionuclide contamination have been considered: yield on the surface and distribution by territory immediately after explosion 137 Xe and 90 Kr inert gases - they are genetical predecessors of 137 Cs and 90 Sr, relatively; existence of a constant effluence of these radionuclides on a surface from a 'ditch cavities' of the 'Lira' objects by the zones of dis-consolidation and crack propagations in the earth crust. With purpose for these hypothesis correctness clarification the distribution of radionuclides by soil layer depth in the vicinities of militant wells (TK-2 and TK-5), as well as in the case and riverbed of the Berezovka river. There are not data confirm the hypothesis about possible constant radionuclides influent from a 'ditch cavities'. So, the hypothesis of the 'Lira' objects territories radionuclide contamination due to inert gases yield on the surface is a more rightful

  3. A multi-level object store and its application to HEP data analysis

    International Nuclear Information System (INIS)

    May, E.; Lifka, D.; Malon, D.; Grossman, R.L.; Qin, X.; Valsamis, D.; Xu, W.

    1994-01-01

    We present a design and demonstration of a scientific data manager consisting of a low overhead, high performance object store interfaced to a hierarchical storage system. This was done with the framework of the Mark1 testbeds of the PASS project

  4. An Achievement Degree Analysis Approach to Identifying Learning Problems in Object-Oriented Programming

    Science.gov (United States)

    Allinjawi, Arwa A.; Al-Nuaim, Hana A.; Krause, Paul

    2014-01-01

    Students often face difficulties while learning object-oriented programming (OOP) concepts. Many papers have presented various assessment methods for diagnosing learning problems to improve the teaching of programming in computer science (CS) higher education. The research presented in this article illustrates that although max-min composition is…

  5. Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines

    Directory of Open Access Journals (Sweden)

    Jeffrey Tuck

    2013-12-01

    Full Text Available Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost effective means of locating faults such as leaks, blockages and pipeline wall degradation within a pipeline while the system remains fully operational. The only requirement is that high speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptible of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness will affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall conditions within the laboratory system an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method of characteristics transient model to the experimental time domain pressure responses. The accuracy of the detection technique is evaluated and benefits associated with various simplifying assumptions and simulation run times are investigated. It is found that for the case investigated, changes in the wavespeed and nominal diameter of the

  6. Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines

    Science.gov (United States)

    Tuck, Jeffrey; Lee, Pedro

    2013-01-01

    Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost effective means of locating faults such as leaks, blockages and pipeline wall degradation within a pipeline while the system remains fully operational. The only requirement is that high speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptible of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness will affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall conditions within the laboratory system an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method of characteristics transient model to the experimental time domain pressure responses. The accuracy of the detection technique is evaluated and benefits associated with various simplifying assumptions and simulation run times are investigated. It is found that for the case investigated, changes in the wavespeed and nominal diameter of the pipeline are both important

  7. Object and Objective Lost?

    DEFF Research Database (Denmark)

    Lopdrup-Hjorth, Thomas

    2015-01-01

    This paper explores the erosion and problematization of ‘the organization’ as a demarcated entity. Utilizing Foucault's reflections on ‘state-phobia’ as a source of inspiration, I show how an organization-phobia has gained a hold within Organization Theory (OT). By attending to the history...... of this organization-phobia, the paper argues that OT has become increasingly incapable of speaking about its core object. I show how organizations went from being conceptualized as entities of major importance to becoming theoretically deconstructed and associated with all kinds of ills. Through this history......, organizations as distinct entities have been rendered so problematic that they have gradually come to be removed from the center of OT. The costs of this have been rather significant. Besides undermining the grounds that gave OT intellectual credibility and legitimacy to begin with, the organization-phobia...

  8. Thermohydromechanical stability analysis upon joint characteristics and depth variations in the region of an underground repository for the study of a disposal concept of high level radioactive wastes

    International Nuclear Information System (INIS)

    Kim, Jhin Wung; Bae, Dae Suk; Kang, Chul Hyung; Choi, Jong Won

    2003-02-01

    The objective of the present study is to understand a long term(500 years) thermohydromechanical interaction behavior in the vicinity of a repository cavern upon joint location and repository depth variations, and then, to contribute to the development of a disposal concept. The model includes a saturated rock mass, PWR spent fuels in a disposal canister surrounded by compacted bentonite inside a deposition hole, and mixed bentonite backfilled in the rest of the space within a cavern. It is assumed that two joint sets exist within a model. A joint set 1 includes 56 .deg. dip joints, 20m spaced, and a joint set 2 is in the direction perpendicular to a joint set 1 and includes 34 .deg. dip joints, 20m spaced. In order to understand the behavior change upon joint location variations, 5 different models of 500m depth are analyzed, and additional 3 different models of 1km depth are analyzed to understand the effect of a depth variation. The two dimensional distinct element code, UDEC is used for the analysis. To understand the joint behavior adjacent to a repository cavern, Barton-Bandis joint model is used. Effect of the decay heat for PWR spent fuels on a repository model is analyzed, and a steady state algorithm is used for a hydraulic analysis. According to the thermohydromechanical interaction behavior of a repository model upon variations of joint locations and a repository depth, during the period of 500 years from waste emplacement, the effect of a depth variation on the stress and displacement behavior of a model is comparatively smaller than the effect of decay heat from radioactive materials. From the study of the joint location variation effect, it is advisable not to locate an underground opening in the region very close to the joint crossings

  9. Complexity analysis of dual-channel game model with different managers' business objectives

    Science.gov (United States)

    Li, Ting; Ma, Junhai

    2015-01-01

    This paper considers dual-channel game model with bounded rationality, using the theory of bifurcations of dynamical system. The business objectives of retailers are assumed to be different, which is closer to reality than previous studies. We study the local stable region of Nash equilibrium point and find that business objectives can expand the stable region and play an important role in price strategy. One interesting finding is that a fiercer competition tends to stabilize the Nash equilibrium. Simulation shows the complex behavior of two dimensional dynamic system, we find period doubling bifurcation and chaos phenomenon. We measure performances of the model in different period by using the index of average profit. The results show that unstable behavior in economic system is often an unfavorable outcome. So this paper discusses the application of adaptive adjustment mechanism when the model exhibits chaotic behavior and then allows the retailers to eliminate the negative effects.

  10. Distributed dendritic processing facilitates object detection: a computational analysis on the visual system of the fly.

    Science.gov (United States)

    Hennig, Patrick; Möller, Ralf; Egelhaaf, Martin

    2008-08-28

    Detecting objects is an important task when moving through a natural environment. Flies, for example, may land on salient objects or may avoid collisions with them. The neuronal ensemble of Figure Detection cells (FD-cells) in the visual system of the fly is likely to be involved in controlling these behaviours, as these cells are more sensitive to objects than to extended background structures. Until now the computations in the presynaptic neuronal network of FD-cells and, in particular, the functional significance of the experimentally established distributed dendritic processing of excitatory and inhibitory inputs is not understood. We use model simulations to analyse the neuronal computations responsible for the preference of FD-cells for small objects. We employed a new modelling approach which allowed us to account for the spatial spread of electrical signals in the dendrites while avoiding detailed compartmental modelling. The models are based on available physiological and anatomical data. Three models were tested each implementing an inhibitory neural circuit, but differing by the spatial arrangement of the inhibitory interaction. Parameter optimisation with an evolutionary algorithm revealed that only distributed dendritic processing satisfies the constraints arising from electrophysiological experiments. In contrast to a direct dendro-dendritic inhibition of the FD-cell (Direct Distributed Inhibition model), an inhibition of its presynaptic retinotopic elements (Indirect Distributed Inhibition model) requires smaller changes in input resistance in the inhibited neurons during visual stimulation. Distributed dendritic inhibition of retinotopic elements as implemented in our Indirect Distributed Inhibition model is the most plausible wiring scheme for the neuronal circuit of FD-cells. This microcircuit is computationally similar to lateral inhibition between the retinotopic elements. Hence, distributed inhibition might be an alternative explanation of

  11. DOCUMENTATION OF HISTORICAL UNDERGROUND OBJECT IN SKORKOV VILLAGE WITH SELECTED MEASURING METHODS, DATA ANALYSIS AND VISUALIZATION

    Directory of Open Access Journals (Sweden)

    A. Dlesk

    2016-06-01

    Full Text Available The author analyzes current methods of 3D documentation of historical tunnels in Skorkov village, which lies at the Jizera river, approximately 30 km away from Prague. The area is known as a former military camp from Thirty Years’ War in 17th Century. There is an extensive underground compound with one entrance corridor and two transverse, situated approximately 2 to 5 m under the local development. The object has been partly documented by geodetic polar method, intersection photogrammetry, image-based modelling and laser scanning. Data have been analyzed and methods have been compared. Then the 3D model of object has been created and compound with cadastral data, orthophoto, historical maps and digital surface model which was made by photogrammetric method using remotely piloted aircraft system. Then the measuring has been realized with ground penetrating radar. Data have been analyzed and the result compared with real status. All the data have been combined and visualized into one 3D model. Finally, the discussion about advantages and disadvantages of used measuring methods has been livened up. The tested methodology has been also used for other documentation of historical objects in this area. This project has been created as a part of research at EuroGV. s.r.o. Company lead by Ing. Karel Vach CSc. in cooperation with prof. Dr. Ing. Karel Pavelka from Czech Technical University in Prague and Miloš Gavenda, the renovator.

  12. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned to the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall in the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that are dependent upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements

  13. Documentation of Historical Underground Object in Skorkov Village with Selected Measuring Methods, Data Analysis and Visualization

    Science.gov (United States)

    Dlesk, A.

    2016-06-01

    The author analyzes current methods of 3D documentation of historical tunnels in Skorkov village, which lies at the Jizera river, approximately 30 km away from Prague. The area is known as a former military camp from Thirty Years' War in 17th Century. There is an extensive underground compound with one entrance corridor and two transverse, situated approximately 2 to 5 m under the local development. The object has been partly documented by geodetic polar method, intersection photogrammetry, image-based modelling and laser scanning. Data have been analyzed and methods have been compared. Then the 3D model of object has been created and compound with cadastral data, orthophoto, historical maps and digital surface model which was made by photogrammetric method using remotely piloted aircraft system. Then the measuring has been realized with ground penetrating radar. Data have been analyzed and the result compared with real status. All the data have been combined and visualized into one 3D model. Finally, the discussion about advantages and disadvantages of used measuring methods has been livened up. The tested methodology has been also used for other documentation of historical objects in this area. This project has been created as a part of research at EuroGV. s.r.o. Company lead by Ing. Karel Vach CSc. in cooperation with prof. Dr. Ing. Karel Pavelka from Czech Technical University in Prague and Miloš Gavenda, the renovator.

  14. Data Quality Objectives for Regulatory Requirements for Dangerous Waste Sampling and Analysis; FINAL

    International Nuclear Information System (INIS)

    MULKEY, C.H.

    1999-01-01

    This document describes sampling and analytical requirements needed to meet state and federal regulations for dangerous waste (DW). The River Protection Project (RPP) is assigned to the task of storage and interim treatment of hazardous waste. Any final treatment or disposal operations, as well as requirements under the land disposal restrictions (LDRs), fall in the jurisdiction of another Hanford organization and are not part of this scope. The requirements for this Data Quality Objective (DQO) Process were developed using the RPP Data Quality Objective Procedure (Banning 1996), which is based on the U.S. Environmental Protection Agency's (EPA) Guidance for the Data Quality Objectives Process (EPA 1994). Hereafter, this document is referred to as the DW DQO. Federal and state laws and regulations pertaining to waste contain requirements that are dependent upon the composition of the waste stream. These regulatory drivers require that pertinent information be obtained. For many requirements, documented process knowledge of a waste composition can be used instead of analytical data to characterize or designate a waste. When process knowledge alone is used to characterize a waste, it is a best management practice to validate the information with analytical measurements

  15. Multi-Objective Analysis of a CHP Plant Integrated Microgrid in Pakistan

    Directory of Open Access Journals (Sweden)

    Asad Waqar

    2017-10-01

    Full Text Available In developing countries like Pakistan, the capacity shortage (CS of electricity is a critical problem. The frequent natural gas (NG outages compel consumers to use electricity to fulfill the thermal loads, which ends up as an increase in electrical load. In this scenario, the authors have proposed the concept of a combined heat & power (CHP plant to be a better option for supplying both electrical and thermal loads simultaneously. A CHP plant-based microgrid comprising a PV array, diesel generators and batteries (operating in grid-connected as well as islanded modes has been simulated using the HOMER Pro software. Different configurations of distributed generators (DGs with/without batteries have been evaluated considering multiple objectives. The multiple objectives include the minimization of the total net present cost (TNPC, cost of generated energy (COE and the annual greenhouse gas (GHG emissions, as well as the maximization of annual waste heat recovery (WHR of thermal units and annual grid sales (GS. These objectives are subject to the constraints of power balance, battery operation within state of charge (SOC limits, generator operation within capacity limits and zero capacity shortage. The simulations have been performed on six cities including Islamabad, Lahore, Karachi, Peshawar, Quetta and Gilgit. The simulation results have been analyzed to find the most optimal city for the CHP plant integrated microgrid.

  16. An object-oriented framework for magnetic-fusion modeling and analysis codes

    International Nuclear Information System (INIS)

    Cohen, R H; Yang, T Y Brian.

    1999-01-01

    The magnetic-fusion energy (MFE) program, like many other scientific and engineering activities, has a need to efficiently develop complex modeling codes which combine detailed models of components to make an integrated model of a device, as well as a rich supply of legacy code that could provide the component models. There is also growing recognition in many technical fields of the desirability of steerable software: computer programs whose functionality can be changed by the user as it is run. This project had as its goals the development of two key pieces of infrastructure that are needed to combine existing code modules, written mainly in Fortran, into flexible, steerable, object-oriented integrated modeling codes for magnetic- fusion applications. These two pieces are (1) a set of tools to facilitate the interfacing of Fortran code with a steerable object-oriented framework (which we have chosen to be based on PythonlW3, an object-oriented interpreted language), and (2) a skeleton for the integrated modeling code which defines the relationships between the modules. The first of these activities obviously has immediate applicability to a spectrum of projects; the second is more focussed on the MFE application, but may be of value as an example for other applications

  17. Neutron activation analysis capability of natural objects' estimation for Latvian environment

    International Nuclear Information System (INIS)

    Damburg, N.A.; Mednis, I.V.; Taure, I.Ya.; Virtsavs, M.V.

    1989-01-01

    A review of literature data and the NAA techniques developed by the authors for the analysis of environmental saples (aerosols, fly ash, soil, pine needls, natural and technological waters) are presented. The methods are used for the routine analysis of some samples from the environment of industrial and power plants of Latvia to investigate and control the local pollution with heavy metals, arsenic, halogens

  18. Transit Timing Variation analysis with Kepler light curves of KOI 227 and Kepler 93b

    Science.gov (United States)

    Dulz, Shannon; Reed, Mike

    2017-01-01

    By searching for transit signals in approximately 150,000 stars, NASA’s Kepler Space telescope found thousands of exoplanets over its primary mission from 2009 to 2013 (Tenenbaum et al. 2014, ApJS, 211, 6). Yet, a detailed follow-up examination of Kepler light curves may contribute more evidence on system dynamics and planetary atmospheres of these objects. Kepler’s continuous observing of these systems over the mission duration produced light curves of sufficient duration to allow for the search for transit timing variations. Transit timing variations over the course of many orbits may indicate a precessing orbit or the existence of a non-transiting third body such as another exoplanet. Flux contributions of the planet just prior to secondary eclipse may provide a measurement of bond albedo from the day-side of the transiting planet. Any asymmetries of the transit shape may indicate thermal asymmetries which can measure upper atmosphere motion of the planet. These two factors can constrain atmospheric models of close orbiting exoplanets. We first establish our procedure with the well-documented TTV system, KOI 227 (Nesvorny et al. 2014, ApJ, 790, 31). Using the test case of KOI 227, we analyze Kepler-93b for TTVs and day-side flux contributions. Kepler-93b is likely a rocky planet with R = 1.50 ± 0.03 Earth Radii and M = 2.59 ± 2.0 Earth Masses (Marcy et al. 2014, ApJS, 210, 20). This research is funded by a NASA EPSCoR grant.

  19. Development of three-dimensional shoulder kinematic and electromyographic exposure variation analysis methodology in violin musicians.

    Science.gov (United States)

    Reynolds, Jonathan F; Leduc, Robert E; Kahnert, Emily K; Ludewig, Paula M

    2014-01-01

    A total of 11 male and 19 female violinists performed 30-second random-ordered slow and fast musical repertoire while right shoulder three-dimensional kinematic, and upper trapezius and serratus anterior surface electromyography (EMG) data were summarised using exposure variation analysis (EVA), a bivariate distribution of work time spent at categories of signal amplitude, and duration spent at a fixed category of amplitude. Sixty-two per cent of intraclass correlation coefficients [1,1] for all kinematic and EMG variables exceeded 0.75, and 40% of standard error of the measurement results were below 5%, confirming EVA reliability. When fast repertoire was played, increases in odds ratios in short duration cells were seen in 23 of 24 possible instances, and decreases in longer duration cells were seen in 17 instances in all EVA arrays using multinomial logistic regression with random effects, confirming a shift towards shorter duration. A reliable technique to assess right shoulder kinematic and EMG exposure in violinists was identified. A reliable method of measuring right shoulder motion and muscle activity exposure variation in violinists was developed which can be used to assess ergonomic risk in other occupations. Recently developed statistical methods enabled differentiation between fast and slow musical performance of standardised musical repertoire.

  20. Analysis of Pressure Variations in a Low-Pressure Nickel-Hydrogen Battery - Part 1.

    Science.gov (United States)

    Purushothaman, B K; Wainright, J S

    2012-05-15

    A low pressure nickel-hydrogen battery using either a metal hydride or gaseous hydrogen for H(2) storage has been developed for use in implantable neuroprosthetic devices. In this paper, pressure variations inside the cell for the gaseous hydrogen version are analyzed and correlated with oxygen evolution side reaction at the end of charging, the recombination of oxygen with hydrogen during charging and a subsequent rest period, and the self-discharge of the nickel electrode. About 70% of the recombination occurred simultaneously with oxygen evolution during charging and the remaining oxygen recombined with hydrogen during the 1(st) hour after charging. Self-discharge of the cell varies linearly with hydrogen pressure at a given state of charge and increased with increasing battery charge levels. The coulometric efficiency calculated based on analysis of the pressure-time data agreed well with the efficiency calculated based on the current-time data. Pressure variations in the battery are simulated accurately to predict coulometric efficiency and the state of charge of the cell, factors of extreme importance for a battery intended for implantation within the human body.

  1. Analysis of Pressure Variations in a Low-Pressure Nickel-Hydrogen Battery – Part 1

    Science.gov (United States)

    Purushothaman, B. K.; Wainright, J. S.

    2012-01-01

    A low pressure nickel-hydrogen battery using either a metal hydride or gaseous hydrogen for H2 storage has been developed for use in implantable neuroprosthetic devices. In this paper, pressure variations inside the cell for the gaseous hydrogen version are analyzed and correlated with oxygen evolution side reaction at the end of charging, the recombination of oxygen with hydrogen during charging and a subsequent rest period, and the self-discharge of the nickel electrode. About 70% of the recombination occurred simultaneously with oxygen evolution during charging and the remaining oxygen recombined with hydrogen during the 1st hour after charging. Self-discharge of the cell varies linearly with hydrogen pressure at a given state of charge and increased with increasing battery charge levels. The coulometric efficiency calculated based on analysis of the pressure-time data agreed well with the efficiency calculated based on the current-time data. Pressure variations in the battery are simulated accurately to predict coulometric efficiency and the state of charge of the cell, factors of extreme importance for a battery intended for implantation within the human body. PMID:22423175

  2. Moveout analysis of wide-azimuth data in the presence of lateral velocity variation

    KAUST Repository

    Takanashi, Mamoru

    2012-05-01

    Moveout analysis of wide-azimuth reflection data seldom takes into account lateral velocity variations on the scale of spreadlength. However, velocity lenses (such as channels and reefs) in the overburden can cause significant, laterally varying errors in the moveout parameters and distortions in data interpretation. Here, we present an analytic expression for the normal-moveout (NMO) ellipse in stratified media with lateral velocity variation. The contribution of lateral heterogeneity (LH) is controlled by the second derivatives of the interval vertical traveltime with respect to the horizontal coordinates, along with the depth and thickness of the LH layer. This equation provides a quick estimate of the influence of velocity lenses and can be used to substantially mitigate the lens-induced distortions in the effective and interval NMO ellipses. To account for velocity lenses in nonhyperbolic moveout inversion of wide-azimuth data, we propose a prestack correction algorithm that involves computation of the lens-induced traveltime distortion for each recorded trace. The overburden is assumed to be composed of horizontal layers (one of which contains the lens), but the target interval can be laterally heterogeneous with dipping or curved interfaces. Synthetic tests for horizontally layered models confirm that our algorithm accurately removes lens-related azimuthally varying traveltime shifts and errors in the moveout parameters. The developed methods should increase the robustness of seismic processing of wide-azimuth surveys, especially those acquired for fracture-characterization purposes. © 2012 Society of Exploration Geophysicists.

  3. Social variations in fetal growth in a Russian setting: an analysis of medical records.

    Science.gov (United States)

    Grjibovski, Andrej M; Bygren, Lars O; Svartbo, Boo; Magnus, Per

    2003-10-01

    The study examines variations in fetal growth by maternal social circumstances in a Russian town. All pregnant women registered at the antenatal clinics in 1999 in Severodvinsk (north-west Russia) and their live born infants comprised the study base (n=1399). Multivariate linear regression analysis was applied to quantify the effect of socio-demographic factors on birthweight and the ponderal index (PI). A clear gradient of birthweight in relation to mothers' education was revealed. Babies of the most educated mothers were 207 g (95% CI, 55, 358) heavier than babies of mothers with basic education. The average weight of those born to mothers with secondary and vocational levels of education was 172 g (95% CI, 91, 253) and 83 g (95% CI, 9, 163) lower compared with infants born to mothers with a university level of education after adjustment for age, parity, pre-pregnancy weight, marital status, maternal occupation, length of gestation, and sex of the baby. Maternal education also influenced the PI. Further studies should focus on the mechanisms of the coherence of maternal education and fetal growth. To ensure that all parts of the society benefit equally from economic and social reforms, social variations in pregnancy outcomes should be monitored during the time of transition.

  4. Extrapolating cosmic ray variations and impacts on life: Morlet wavelet analysis

    Science.gov (United States)

    Zarrouk, N.; Bennaceur, R.

    2009-07-01

    Exposure to cosmic rays may have both a direct and indirect effect on Earth's organisms. The radiation may lead to higher rates of genetic mutations in organisms, or interfere with their ability to repair DNA damage, potentially leading to diseases such as cancer. Increased cloud cover, which may cool the planet by blocking out more of the Sun's rays, is also associated with cosmic rays. They also interact with molecules in the atmosphere to create nitrogen oxide, a gas that eats away at our planet's ozone layer, which protects us from the Sun's harmful ultraviolet rays. On the ground, humans are protected from cosmic particles by the planet's atmosphere. In this paper we give estimated results of wavelet analysis from solar modulation and cosmic ray data incorporated in time-dependent cosmic ray variation. Since solar activity can be described as a non-linear chaotic dynamic system, methods such as neural networks and wavelet methods should be very suitable analytical tools. Thus we have computed our results using Morlet wavelets. Many have used wavelet techniques for studying solar activity. Here we have analysed and reconstructed cosmic ray variation, and we have better depicted periods or harmonics other than the 11-year solar modulation cycles.

  5. Moveout analysis of wide-azimuth data in the presence of lateral velocity variation

    KAUST Repository

    Takanashi, Mamoru; Tsvankin, Ilya

    2012-01-01

    Moveout analysis of wide-azimuth reflection data seldom takes into account lateral velocity variations on the scale of spreadlength. However, velocity lenses (such as channels and reefs) in the overburden can cause significant, laterally varying errors in the moveout parameters and distortions in data interpretation. Here, we present an analytic expression for the normal-moveout (NMO) ellipse in stratified media with lateral velocity variation. The contribution of lateral heterogeneity (LH) is controlled by the second derivatives of the interval vertical traveltime with respect to the horizontal coordinates, along with the depth and thickness of the LH layer. This equation provides a quick estimate of the influence of velocity lenses and can be used to substantially mitigate the lens-induced distortions in the effective and interval NMO ellipses. To account for velocity lenses in nonhyperbolic moveout inversion of wide-azimuth data, we propose a prestack correction algorithm that involves computation of the lens-induced traveltime distortion for each recorded trace. The overburden is assumed to be composed of horizontal layers (one of which contains the lens), but the target interval can be laterally heterogeneous with dipping or curved interfaces. Synthetic tests for horizontally layered models confirm that our algorithm accurately removes lens-related azimuthally varying traveltime shifts and errors in the moveout parameters. The developed methods should increase the robustness of seismic processing of wide-azimuth surveys, especially those acquired for fracture-characterization purposes. © 2012 Society of Exploration Geophysicists.

  6. Morphological variation and phylogenetic analysis of the dinoflagellate Gymnodinium aureolum from a tributary of Chesapeake Bay.

    Science.gov (United States)

    Tang, Ying Zhong; Egerton, Todd A; Kong, Lesheng; Marshall, Harold G

    2008-01-01

    Cultures of four strains of the dinoflagellate Gymnodinium aureolum (Hulburt) G. Hansen were established from the Elizabeth River, a tidal tributary of the Chesapeake Bay, USA. Light microscopy, scanning electron microscopy, nuclear-encoded large sub-unit rDNA sequencing, and culturing observations were conducted to further characterize this species. Observations of morphology included: a multiple structured apical groove; a peduncle located between the emerging points of the two flagella; pentagonal and hexagonal vesicles on the amphiesma; production and germination of resting cysts; variation in the location of the nucleus within the center of the cell; a longitudinal ventral concavity; and considerable variation in cell width/length and overall cell size. A fish bioassay using juvenile sheepshead minnows detected no ichthyotoxicity from any of the strains over a 48-h period. Molecular analysis confirmed the dinoflagellate was conspecific with G. aureolum strains from around the world, and formed a cluster along with several other Gymnodinium species. Morphological evidence suggests that further research is necessary to examine the relationship between G. aureolum and a possibly closely related species Gymnodinium maguelonnense.

  7. Multilocus analysis of nucleotide variation and speciation in three closely related Populus (Salicaceae) species.

    Science.gov (United States)

    Du, Shuhui; Wang, Zhaoshan; Ingvarsson, Pär K; Wang, Dongsheng; Wang, Junhui; Wu, Zhiqiang; Tembrock, Luke R; Zhang, Jianguo

    2015-10-01

    Historical tectonism and climate oscillations can isolate and contract the geographical distributions of many plant species, and they are even known to trigger species divergence and ultimately speciation. Here, we estimated the nucleotide variation and speciation in three closely related Populus species, Populus tremuloides, P. tremula and P. davidiana, distributed in North America and Eurasia. We analysed the sequence variation in six single-copy nuclear loci and three chloroplast (cpDNA) fragments in 497 individuals sampled from 33 populations of these three species across their geographic distributions. These three Populus species harboured relatively high levels of nucleotide diversity and showed high levels of nucleotide differentiation. Phylogenetic analysis revealed that P. tremuloides diverged earlier than the other two species. The cpDNA haplotype network result clearly illustrated the dispersal route from North America to eastern Asia and then into Europe. Molecular dating results confirmed that the divergence of these three species coincided with the sundering of the Bering land bridge in the late Miocene and a rapid uplift of the Qinghai-Tibetan Plateau around the Miocene/Pliocene boundary. Vicariance-driven successful allopatric speciation resulting from historical tectonism and climate oscillations most likely played roles in the formation of the disjunct distributions and divergence of these three Populus species. © 2015 John Wiley & Sons Ltd.

  8. Spatio-temporal variation analysis of hydrochemical characteristics in the Luanhe River Basin, China.

    Science.gov (United States)

    Xie, Ying; Li, Xuyong; Wang, Huiliang; Li, Wenzan

    2013-01-01

    The analysis of river pollution and assessment of spatial and temporal variation in hydrochemistry are essential to river water pollution control in the context of rapid economic growth and growing pollution threats in China. In this study, we focused on hydrochemical characteristics of the Luanhe River Basin (China) and evaluation of 12 hydrochemical variables obtained from 32 monitoring stations during 2001-2010. In each study year, the streams were monitored in the three hydrological periods (April, August, and October) to observe differences in the impacts of agricultural activity and rainfall pattern. Multivariate statistical methods were applied to the data set, and the river water hydrochemical characteristics were assessed using the water quality identification index (WQIIM). The results showed that parameters had variable contribution to water quality status in different months except for ammonia nitrogen (NH4-N) and total nitrogen (TN), which were the most important parameters in contributing to water quality variations for all three periods. Results of WQIIM revealed that 18 sites were classified as 'meeting standard' while the other 14 sites were classified as 'not meeting standard', with most of the seriously polluted sites located in urban area, mainly due to discharge of wastewater from domestic and industrial sources. Sites with low pollution level were located primarily in smaller tributaries, whereas sites of medium and high pollution levels were in the main river channel and the larger tributaries. Our findings provide valuable information and guidance for water pollution control and water resource management in the Luanhe River Basin.

  9. SULT1A1 copy number variation: ethnic distribution analysis in an Indian population.

    Science.gov (United States)

    Almal, Suhani; Padh, Harish

    2017-11-01

    Cytosolic sulfotransferases (SULTs) are phase II detoxification enzymes involved in metabolism of numerous xenobiotics, drugs and endogenous compounds. Interindividual variation in sulfonation capacity is important for determining an individual's response to xenobiotics. SNPs in SULTs, mainly SULT1A1 have been associated with cancer risk and also with response to therapeutic agents. Copy number variation (CNVs) in SULT1A1 is found to be correlated with altered enzyme activity. This short report primarily focuses on CNV in SULT1A1 and its distribution among different ethnic populations around the globe. Frequency distribution of SULT1A1 copy number (CN) in 157 healthy Indian individuals was assessed using florescent-based quantitative PCR assay. A range of 1 to >4 copies, with a frequency of SULT1A1 CN =2 (64.9%) the highest, was observed in our (Indian) population. Upon comparative analysis of frequency distribution of SULT1A1 CN among diverse population groups, a statistically significant difference was observed between Indians (our data) and African-American (AA) (p = 0.0001) and South African (Tswana) (p populations. Distribution of CNV in the Indian population was found to be similar to that in European-derived populations of American and Japanese. CNV of SULT1A1 varies significantly among world populations and may be one of the determinants of health and diseases.

  10. Global Warming and Geographically Scalar Climatic Objects Exist: An Ontologically Realist and Object-Oriented Analysis of the Daymet TMAX Climate Summaries for North America

    Science.gov (United States)

    Jackson, C. P.

    2017-12-01

    The scientific materialist worldview, what Peter Unger refers to as the Scientiphical worldview, or Scientiphicalism, has been utterly catastrophic for mesoscale objects in general, but, with its closely associated twentieth-century formal logic, this has been especially true for notoriously vague things like climate change, coastlines, mountains and dust storms. That is, any so-called representations or references ultimately suffer the same ontological demise as their referents, no matter how well-defined their boundaries may in fact be. Against this reductionist metaphysics, climatic objects are discretized within three separate ontologically realist systems, Graham Harman's object-oriented philosophy, or ontology (OOO), Markus Gabriel's ontology of fields of sense (OFS) and Tristan Garcia's two systems and new order of time, so as to make an ontological case for any geographically scalar object, beginning with pixels, as well as any notoriously vague thing they are said to represent. Four-month overlapping TMAX seasonals were first developed from the Oak Ridge National Laboratory (ORNL) Daymet climate temperature maximum (TMAX) monthly summaries (1980-2016) for North America and segmented within Trimble's eCognition Developer using the simple and widely familiar quadtree algorithm with a scale parameter of four, in this example. The regression coefficient was then calculated for the resulting 37-year climatic objects and an equally simple classification was applied. The same segmentation and classification was applied to the Daymet annual summaries, as well, for comparison. As was expected, the mean warming and cooling trends are lowest for the annual summary TMAX climatic objects. However, the Fall (SOND) season has the largest and smallest areas of warming and cooling, respectively, and the highest mean trend for warming objects. Conversely, Spring (MAMJ) has the largest and smallest areas undergoing cooling and warming, respectively. Finally, Summer (JJAS

  11. Paleosecular variation analysis of high-latitude paleomagnetic data from the volcanic island of Jan Mayen

    Science.gov (United States)

    Cromwell, G.; Tauxe, L.; Staudigel, H.; Pedersen, L. R.; Constable, C.; Pedersen, R.; Duncan, R. A.; Staudigel, P.

    2009-12-01

    Recent investigation of high-latitude paleomagnetic data from the Erebus Volcanic Province (EVP), Antarctica shows a departure from magnetic dipole predictions for paleointensity data for the period 0-5 Ma. The average EVP paleointensity (31.5 +/- 2.4 μT) is equivalent to low-latitude measurements (1) or approximately half the strength predicted for a dipole at high-latitude. Also, paleosecular variation models (e.g., 2,3) predict dispersions of directions that are much lower than the high latitude observations. Observed low intensity values may be the result of reduced convective flow inside the tangent cylinder of the Earth’s core or insufficient temporal sampling (1). More high-latitude paleomagnetic data are necessary in order to investigate the cause of the depressed intensity values and to provide better geographic and temporal resolution for future statistical paleosecular variation models. To address this, we carried out two field seasons, one in Spitzbergen (79°N, 14°E) and one on the young volcanic island of Jan Mayen (71°N, 8°W). The latter sampling effort was guided by age analyses of samples obtained by P. Imsland (unpublished and 4). We will present new paleodirectional and paleointensity data from a total of 25 paleomagnetic sites. These data enhance the temporal resolution of global paleomagnetic data and allow for a more complete evaluation of the time-averaged magnetic field from 0-5 Ma. We will present a new analysis of paleosecular variation based on our new data, in combination with other recently published data sets. (1) Lawrence, K.P., L.Tauxe, H. Staudigel, C.G. Constable, A. Koppers, W. MacIntosh, C.L. Johnson, Paleomagnetic field properties at high southern latitude. Geochemistry Geophysics Geosystems 10 (2009). (2) McElhinny, M.W., P.L. McFadden, Paleosecular variation over the past 5 Myr based on a new generalized database. Geophysics Journal International 131 (1997), 240-252. (3) Tauxe, L., Kent, D.V., A simplified statistical

  12. Analysis of rare, exonic variation amongst subjects with autism spectrum disorders and population controls.

    Directory of Open Access Journals (Sweden)

    Li Liu

    2013-04-01

    Full Text Available We report on results from whole-exome sequencing (WES of 1,039 subjects diagnosed with autism spectrum disorders (ASD and 870 controls selected from the NIMH repository to be of similar ancestry to cases. The WES data came from two centers using different methods to produce sequence and to call variants from it. Therefore, an initial goal was to ensure the distribution of rare variation was similar for data from different centers. This proved straightforward by filtering called variants by fraction of missing data, read depth, and balance of alternative to reference reads. Results were evaluated using seven samples sequenced at both centers and by results from the association study. Next we addressed how the data and/or results from the centers should be combined. Gene-based analyses of association was an obvious choice, but should statistics for association be combined across centers (meta-analysis or should data be combined and then analyzed (mega-analysis? Because of the nature of many gene-based tests, we showed by theory and simulations that mega-analysis has better power than meta-analysis. Finally, before analyzing the data for association, we explored the impact of population structure on rare variant analysis in these data. Like other recent studies, we found evidence that population structure can confound case-control studies by the clustering of rare variants in ancestry space; yet, unlike some recent studies, for these data we found that principal component-based analyses were sufficient to control for ancestry and produce test statistics with appropriate distributions. After using a variety of gene-based tests and both meta- and mega-analysis, we found no new risk genes for ASD in this sample. Our results suggest that standard gene-based tests will require much larger samples of cases and controls before being effective for gene discovery, even for a disorder like ASD.

  13. Analysis of Rare, Exonic Variation amongst Subjects with Autism Spectrum Disorders and Population Controls

    Science.gov (United States)

    Liu, Li; Sabo, Aniko; Neale, Benjamin M.; Nagaswamy, Uma; Stevens, Christine; Lim, Elaine; Bodea, Corneliu A.; Muzny, Donna; Reid, Jeffrey G.; Banks, Eric; Coon, Hillary; DePristo, Mark; Dinh, Huyen; Fennel, Tim; Flannick, Jason; Gabriel, Stacey; Garimella, Kiran; Gross, Shannon; Hawes, Alicia; Lewis, Lora; Makarov, Vladimir; Maguire, Jared; Newsham, Irene; Poplin, Ryan; Ripke, Stephan; Shakir, Khalid; Samocha, Kaitlin E.; Wu, Yuanqing; Boerwinkle, Eric; Buxbaum, Joseph D.; Cook, Edwin H.; Devlin, Bernie; Schellenberg, Gerard D.; Sutcliffe, James S.; Daly, Mark J.; Gibbs, Richard A.; Roeder, Kathryn

    2013-01-01

    We report on results from whole-exome sequencing (WES) of 1,039 subjects diagnosed with autism spectrum disorders (ASD) and 870 controls selected from the NIMH repository to be of similar ancestry to cases. The WES data came from two centers using different methods to produce sequence and to call variants from it. Therefore, an initial goal was to ensure the distribution of rare variation was similar for data from different centers. This proved straightforward by filtering called variants by fraction of missing data, read depth, and balance of alternative to reference reads. Results were evaluated using seven samples sequenced at both centers and by results from the association study. Next we addressed how the data and/or results from the centers should be combined. Gene-based analyses of association was an obvious choice, but should statistics for association be combined across centers (meta-analysis) or should data be combined and then analyzed (mega-analysis)? Because of the nature of many gene-based tests, we showed by theory and simulations that mega-analysis has better power than meta-analysis. Finally, before analyzing the data for association, we explored the impact of population structure on rare variant analysis in these data. Like other recent studies, we found evidence that population structure can confound case-control studies by the clustering of rare variants in ancestry space; yet, unlike some recent studies, for these data we found that principal component-based analyses were sufficient to control for ancestry and produce test statistics with appropriate distributions. After using a variety of gene-based tests and both meta- and mega-analysis, we found no new risk genes for ASD in this sample. Our results suggest that standard gene-based tests will require much larger samples of cases and controls before being effective for gene discovery, even for a disorder like ASD. PMID:23593035

  14. SECOND-ORDER VARIATIONAL ANALYSIS IN CONIC PROGRAMMING WITH APPLICATIONS TO OPTIMALITY AND STABILITY

    Czech Academy of Sciences Publication Activity Database

    Mordukhovich, B. S.; Outrata, Jiří; Ramírez, H. C.

    2015-01-01

    Roč. 25, č. 1 (2015), s. 76-101 ISSN 1052-6234 R&D Projects: GA ČR(CZ) GAP201/12/0671 Grant - others:Australian Research Council(AU) DP-110102011; USA National Science Foundation(US) DMS-1007132; Australian Reseach Council(AU) DP-12092508; Portuguese Foundation of Science and Technologies(PT) MAT/11109; FONDECYT Project(CL) 1110888; Universidad de Chile(CL) BASAL Project Centro de Modelamiento Matematico Institutional support: RVO:67985556 Keywords : variational analysis * second-order theory * conic programming * generalized differentiation * optimality conditions * isolated calmness * tilt stability Subject RIV: BA - General Mathematics Impact factor: 2.659, year: 2015 http://library.utia.cas.cz/separaty/2015/MTR/outrata-0439413.pdf

  15. Statistical intensity variation analysis for rapid volumetric imaging of capillary network flux.

    Science.gov (United States)

    Lee, Jonghwan; Jiang, James Y; Wu, Weicheng; Lesage, Frederic; Boas, David A

    2014-04-01

    We present a novel optical coherence tomography (OCT)-based technique for rapid volumetric imaging of red blood cell (RBC) flux in capillary networks. Previously we reported that OCT can capture individual RBC passage within a capillary, where the OCT intensity signal at a voxel fluctuates when an RBC passes the voxel. Based on this finding, we defined a metric of statistical intensity variation (SIV) and validated that the mean SIV is proportional to the RBC flux [RBC/s] through simulations and measurements. From rapidly scanned volume data, we used Hessian matrix analysis to vectorize a segment path of each capillary and estimate its flux from the mean of the SIVs gathered along the path. Repeating this process led to a 3D flux map of the capillary network. The present technique enabled us to trace the RBC flux changes over hundreds of capillaries with a temporal resolution of ~1 s during functional activation.

  16. Numerical Analysis of Through Transmission Pulsed Eddy Current Testing and Effects of Pulse Width Variation

    International Nuclear Information System (INIS)

    Shin, Young Kil; Choi, Dong Myung

    2007-01-01

    By using numerical analysis methods, through transmission type pulsed eddy current (PEC) testing is modeled and PEC signal responses due to varying material conductivity, permeability, thickness, lift-off and pulse width are investigated. Results show that the peak amplitude of PEC signal gets reduced and the time to reach the peak amplitude is increased as the material conductivity, permeability, and specimen thickness increase. Also, they indicate that the pulse width needs to be shorter when evaluating the material conductivity and the plate thickness using the peak amplitude, and when the pulse width is long, the peak time is found to be more useful. Other results related to lift-off variation are reported as well

  17. Variational analysis for simulating free-surface flows in a porous medium

    Directory of Open Access Journals (Sweden)

    Shabbir Ahmed

    2003-01-01

    is used to obtain a discrete form of equations for a two-dimensional domain. The matrix characteristics and the stability criteria have been investigated to develop a stable numerical algorithm for solving the governing equation. A computer programme has been written to solve a symmetric positive definite system obtained from the variational finite element analysis. The system of equations is solved using the conjugate gradient method. The solution generates time-varying hydraulic heads in the subsurface. The interfacing free surface between the unsaturated and saturated zones in the variably saturated domain is located, based on the computed hydraulic heads. Example problems are investigated. The finite element solutions are compared with the exact solutions for the example problems. The numerical characteristics of the finite element solution method are also investigated using the example problems.

  18. Application of Archimedean copulas to the analysis of drought decadal variation in China

    Science.gov (United States)

    Zuo, Dongdong; Feng, Guolin; Zhang, Zengping; Hou, Wei

    2017-12-01

    Based on daily precipitation data collected from 1171 stations in China during 1961-2015, the monthly standardized precipitation index was derived and used to extract two major drought characteristics which are drought duration and severity. Next, a bivariate joint model was established based on the marginal distributions of the two variables and Archimedean copula functions. The joint probability and return period were calculated to analyze the drought characteristics and decadal variation. According to the fit analysis, the Gumbel-Hougaard copula provided the best fit to the observed data. Based on four drought duration classifications and four severity classifications, the drought events were divided into 16 drought types according to the different combinations of duration and severity classifications, and the probability and return period were analyzed for different drought types. The results showed that the occurring probability of six common drought types (0 accounted for 76% of the total probability of all types. Moreover, due to their greater variation, two drought types were particularly notable, i.e., the drought types where D ≥ 6 and S ≥ 2. Analyzing the joint probability in different decades indicated that the location of the drought center had a distinctive stage feature, which cycled from north to northeast to southwest during 1961-2015. However, southwest, north, and northeast China had a higher drought risk. In addition, the drought situation in southwest China should be noted because the joint probability values, return period, and the analysis of trends in the drought duration and severity all indicated a considerable risk in recent years.

  19. Segmental Quantitative MR Imaging analysis of diurnal variation of water content in the lumbar intervertebral discs

    International Nuclear Information System (INIS)

    Zhu, Ting Ting; Ai, Tao; Zhang, Wei; Li, Tao; Li, Xiao Ming

    2015-01-01

    To investigate the changes in water content in the lumbar intervertebral discs by quantitative T2 MR imaging in the morning after bed rest and evening after a diurnal load. Twenty healthy volunteers were separately examined in the morning after bed rest and in the evening after finishing daily work. T2-mapping images were obtained and analyzed. An equally-sized rectangular region of interest (ROI) was manually placed in both, the anterior and the posterior annulus fibrosus (AF), in the outermost 20% of the disc. Three ROIs were placed in the space defined as the nucleus pulposus (NP). Repeated-measures analysis of variance and paired 2-tailed t tests were used for statistical analysis, with p < 0.05 as significantly different. T2 values significantly decreased from morning to evening, in the NP (anterior NP = -13.9 ms; central NP = -17.0 ms; posterior NP = -13.3 ms; all p < 0.001). Meanwhile T2 values significantly increased in the anterior AF (+2.9 ms; p = 0.025) and the posterior AF (+5.9 ms; p < 0.001). T2 values in the posterior AF showed the largest degree of variation among the 5 ROIs, but there was no statistical significance (p = 0.414). Discs with initially low T2 values in the center NP showed a smaller degree of variation in the anterior NP and in the central NP, than in discs with initially high T2 values in the center NP (10.0% vs. 16.1%, p = 0.037; 6.4% vs. 16.1%, p = 0.006, respectively). Segmental quantitative T2 MRI provides valuable insights into physiological aspects of normal discs.

  20. Systematic documentation and analysis of human genetic variation in hemoglobinopathies using the microattribution approach

    NARCIS (Netherlands)

    B. Giardine (Belinda); J. Borg (Joseph); D.R. Higgs (Douglas); K.R. Peterson (Kenneth R.); J.N.J. Philipsen (Sjaak); D. Maglott (Donna); B.K. Singleton (Belinda K.); D.J. Anstee (David J.); A.N. Basak (Nazli); B.H. Clark (Bruce); F.C. Costa (Flavia C.); P. Faustino (Paula); H. Fedosyuk (Halyna); A.E. Felice (Alex); A. Francina (Alain); R. Galanello (Renzo); M.V.E. Gallivan (Monica V. E.); M. Georgitsi (Marianthi); R.J. Gibbons (Richard J.); P.C. Giordano (Piero Carlo); C.L. Harteveld (Cornelis); J.D. Hoyer (James D.); M. Jarvis (Martin); P. Joly (Philippe); E. Kanavakis (Emmanuel); P. Kollia (Panagoula); S. Menzel (Stephan); W.G. Miller (William); K. Moradkhani (Kamran); J. Old (John); A. Papachatzpoulou (Adamantia); M.N. Papadakis (Manoussos); P. Papadopoulos (Petros); S. Pavlovic (Sonja); L. Perseu (Lucia); M. Radmilovic (Milena); C. Riemer (Cathy); S. Satta (Stefania); I.A. Schrijver (Ingrid); M. Stojiljkovic (Maja); S.L. Thein; J. Traeger-Synodinos (Joanne); R. Tully (Ray); T. Wada (Takahito); J.S. Waye (John); C. Wiemann (Claudia); B. Zukic (Branka); D.H.K. Chui (David H. K.); H. Wajcman (Henri); R. Hardison (Ross); G.P. Patrinos (George)

    2011-01-01

    textabstractWe developed a series of interrelated locus-specific databases to store all published and unpublished genetic variation related to hemoglobinopathies and thalassemia and implemented microattribution to encourage submission of unpublished observations of genetic variation to these public

  1. Life table analysis of the United States' Year 2000 mortality objectives.

    Science.gov (United States)

    Rockett, I R; Pollard, J H

    1995-06-01

    The US Year 2000 mortality objectives are model standards cast as targeted changes in age-adjusted cause-specific death rates. This research centred on the projected impact of such changes on life expectancy and the mortality toll for each sex. A computer simulation was conducted using single decrement, multiple decrement and cause-elimination life table techniques, together with a decomposition procedure. Male and female life expectancy at birth was projected to increase by 1.71 and 1.51 years, respectively, between the designated 1987 baseline and 2000. The leading beneficiaries would be those aged 65 and older, followed by those aged 45-64, and infants. Declines in coronary heart disease, stroke and injury death rates would most influence the projected life expectancy changes, irrespective of sex. Approximately 782,000 male deaths and 730,000 female deaths would be averted under Year 2000 assumptions. Life expectancy would be a useful summary measure to incorporate into official evaluations of the Year 2000 mortality objectives. Targeting of excess male mortality in the US and other highly industrialized nations is recommended.

  2. Subjective and objective analysis of three water pump systems carried by forest firefighters.

    Science.gov (United States)

    Moser, Daniel J; Graham, Ryan B; Stevenson, Joan M; Costigan, Patrick A

    2014-01-01

    The Mark 3 (M3) water power pump is an integral piece of wildfire fighting equipment. However, it is provided to fire stations without a carrying harness. The currently-used carrying harness is very uncomfortable, especially when carrying the pumps considerable distance in a forest to reach a water source. The purpose of this study was to advise the Ontario Ministry of Natural Resources on the selection of a new M3 load carriage system. Twenty Fire Rangers wore the three systems (Original, Prototype, and Modified) through a circuit of tasks representative of their working environment. Subjective and objective approaches were combined to assess and rank the M3 carriage systems. Subjective visual analogue scale ratings were obtained for ease of loading/unloading, comfort, system stability, and overall performance. Tri-axial accelerometers were mounted on each pump and at the sternum of each participant to determine relative pump-carrier accelerations. Overall, the Prototype was ranked as the best system; it resulted in the lowest relative pump-carrier accelerations on 10 out of 15 objective measures, and also received a first place ranking on all subjective measures. It was recommended that the Prototype be implemented as the M3 carriage system for fire suppression teams.

  3. Optimal Waste Load Allocation Using Multi-Objective Optimization and Multi-Criteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    L. Saberi

    2016-10-01

    Full Text Available Introduction: Increasing demand for water, depletion of resources of acceptable quality, and excessive water pollution due to agricultural and industrial developments has caused intensive social and environmental problems all over the world. Given the environmental importance of rivers, complexity and extent of pollution factors and physical, chemical and biological processes in these systems, optimal waste-load allocation in river systems has been given considerable attention in the literature in the past decades. The overall objective of planning and quality management of river systems is to develop and implement a coordinated set of strategies and policies to reduce or allocate of pollution entering the rivers so that the water quality matches by proposing environmental standards with an acceptable reliability. In such matters, often there are several different decision makers with different utilities which lead to conflicts. Methods/Materials: In this research, a conflict resolution framework for optimal waste load allocation in river systems is proposed, considering the total treatment cost and the Biological Oxygen Demand (BOD violation characteristics. There are two decision-makers inclusive waste load discharges coalition and environmentalists who have conflicting objectives. This framework consists of an embedded river water quality simulator, which simulates the transport process including reaction kinetics. The trade-off curve between objectives is obtained using the Multi-objective Particle Swarm Optimization Algorithm which these objectives are minimization of the total cost of treatment and penalties that must be paid by discharges and a violation of water quality standards considering BOD parameter which is controlled by environmentalists. Thus, the basic policy of river’s water quality management is formulated in such a way that the decision-makers are ensured their benefits will be provided as far as possible. By using MOPSO

  4. Auditory Scene Analysis and sonified visual images. Does consonance negatively impact on object formation when using complex sonified stimuli?

    Directory of Open Access Journals (Sweden)

    David J Brown

    2015-10-01

    Full Text Available A critical task for the brain is the sensory representation and identification of perceptual objects in the world. When the visual sense is impaired, hearing and touch must take primary roles and in recent times compensatory techniques have been developed that employ the tactile or auditory system as a substitute for the visual system. Visual-to-auditory sonifications provide a complex, feature-based auditory representation that must be decoded and integrated into an object-based representation by the listener. However, we don’t yet know what role the auditory system plays in the object integration stage and whether the principles of auditory scene analysis apply. Here we used coarse sonified images in a two-tone discrimination task to test whether auditory feature-based representations of visual objects would be confounded when their features conflicted with the principles of auditory consonance. We found that listeners (N = 36 performed worse in an object recognition task when the auditory feature-based representation was harmonically consonant. We also found that this conflict was not negated with the provision of congruent audio-visual information. The findings suggest that early auditory processes of harmonic grouping dominate the object formation process and that the complexity of the signal, and additional sensory information have limited effect on this.

  5. Role of regression analysis and variation of rheological data in calculation of pressure drop for sludge pipelines.

    Science.gov (United States)

    Farno, E; Coventry, K; Slatter, P; Eshtiaghi, N

    2018-06-15

    Sludge pumps in wastewater treatment plants are often oversized due to uncertainty in calculation of pressure drop. This issue costs millions of dollars for industry to purchase and operate the oversized pumps. Besides costs, higher electricity consumption is associated with extra CO 2 emission which creates huge environmental impacts. Calculation of pressure drop via current pipe flow theory requires model estimation of flow curve data which depends on regression analysis and also varies with natural variation of rheological data. This study investigates impact of variation of rheological data and regression analysis on variation of pressure drop calculated via current pipe flow theories. Results compare the variation of calculated pressure drop between different models and regression methods and suggest on the suitability of each method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Identifying the factors influencing practice variation in thrombosis medicine: A qualitative content analysis of published practice-pattern surveys.

    Science.gov (United States)

    Skeith, Leslie; Gonsalves, Carol

    2017-11-01

    Practice variation, the differences in clinical management between physicians, is one reason why patient outcomes may differ. Identifying factors that contribute to practice variation in areas of clinical uncertainty or equipoise may have implications for understanding and improving patient care. To discern what factors may influence practice variation, we completed a qualitative content analysis of all practice-pattern surveys in thrombosis medicine in the last 10years. Out of 2117 articles screened using a systematic search strategy, 33 practice-pattern surveys met eligibility criteria. Themes were identified using constant comparative analysis of qualitative data. Practice variation was noted in all 33 practice-pattern surveys. Contributing factors to variation included lack of available evidence, lack of clear and specific guideline recommendations, past experience, patient context, institutional culture and the perceived risk and benefit of a particular treatment. Additional themes highlight the value placed on expertise in challenging clinical scenarios, the complexity of practice variation and the value placed on minimizing practice variation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Analysis of hepatic vein variations in healthy people with 64-slice spiral CT

    International Nuclear Information System (INIS)

    Zhang Rong; Li Yong; Shen Jun; Zeng Weike; Li Jieting; Huang Suiqiao; Liang Biling; Liu Chao

    2007-01-01

    Objective: To analyze variations of hepatic vein in healthy people with 64-slice spiral CT. Methods: Seventy-five healthy subjects underwent multi-slice spiral computed (MSCT) hepatic venography. The anatomy of the junction of the hepatic veins with the inferior vena cava and the intrahepatic drainage territory of the hepatic veins and tributaries were evaluated. The hepatic veins were classified according to three anatomic classification (Nakamura's, Marcos's and Kawasaki's classification) methods respectively. Results: There was a common trunk of the middle and left hepatic veins before joining the IVC in 86.7% (65/75)of the cases. In 13.3% (10/75)of the cases, the three main hepatic veins joined the IVC separately. The ratios of Nakamma's classification type A, B, C of hepatic veins were 49.4% (37/75), 37.3% (28/75), and 13.3% (10/75) respectively. The ratios of Marcos's classification type A, B, C of hepatic veins were 56.0% (42/75), 24.0% (18/75), and 20.0% (15/75) respectively. The ratios of Kawasaki's classification type I, II of hepatic vein were 40.0% (30/75) and 60.0% (45/75). Conclusion: Multi-slice spiral CT hepatic venography can provide visualization of peripheral hepatic venous branches in details. (authors)

  8. Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2009-01-01

    -- an outline which at the same time indicates the need for transformations of the Durkheimian model on decisive points. Thus, thirdly, it returns to Durkheim and undertakes to develop his concepts in a direction suitable for a sociological theory of fashion. Finally, it discusses the theoretical implications......This article attempts to create a framework for understanding modern fashion phenomena on the basis of Durkheim's sociology of religion. It focuses on Durkheim's conception of the relation between the cult and the sacred object, on his notion of 'exteriorisation', and on his theory of the social...... symbol in an attempt to describe the peculiar attraction of the fashion object and its social constitution. However, Durkheim's notions of cult and ritual must undergo profound changes if they are to be used in an analysis of fashion. The article tries to expand the Durkheimian cult, radically enlarging...

  9. Children's exposure to alcohol marketing within supermarkets: An objective analysis using GPS technology and wearable cameras.

    Science.gov (United States)

    Chambers, T; Pearson, A L; Stanley, J; Smith, M; Barr, M; Ni Mhurchu, C; Signal, L

    2017-07-01

    Exposure to alcohol marketing within alcohol retailers has been associated with higher rates of childhood drinking, brand recognition, and marketing recall. This study aimed to objectively measure children's everyday exposure to alcohol marketing within supermarkets. Children aged 11-13 (n = 167) each wore a wearable camera and a GPS device for four consecutive days. Micro-spatial analyses were used to examine exposures within supermarkets. In supermarkets that retailed alcohol (n = 30), children encountered alcohol marketing on 85% of their visits (n = 78). Alcohol marketing was frequently located near everyday goods (bread and milk) or near the entrance/exit. Alcohol sales in supermarkets should be banned in order to protect children from alcohol marketing. Copyright © 2017 Elsevier Ltd. All rights reserved.
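
    The micro-spatial step, matching GPS fixes to store footprints so that the corresponding camera images can be coded, can be illustrated with a simple point-in-polygon test. The sketch below is only a schematic of that idea, not the study's pipeline; the footprint coordinates, GPS fixes, and use of the shapely package are all assumptions for illustration.

```python
# Minimal sketch of a micro-spatial point-in-polygon check (hypothetical data;
# not the study's actual pipeline). Requires the third-party package `shapely`.
from shapely.geometry import Point, Polygon

# Hypothetical supermarket footprint as a lon/lat polygon.
supermarket = Polygon([
    (174.7762, -41.2865), (174.7768, -41.2865),
    (174.7768, -41.2870), (174.7762, -41.2870),
])

# Hypothetical GPS fixes from a child's wearable device (lon, lat).
gps_fixes = [(174.7765, -41.2867), (174.7790, -41.2880), (174.7763, -41.2869)]

# Keep only the fixes that fall inside the store footprint; camera images
# time-matched to these fixes would then be coded for alcohol marketing.
inside = [p for p in gps_fixes if supermarket.contains(Point(p))]
print(f"{len(inside)} of {len(gps_fixes)} fixes fall within the supermarket")
```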

  10. Fuzzy multinomial logistic regression analysis: A multi-objective programming approach

    Science.gov (United States)

    Abdalla, Hesham A.; El-Sayed, Amany A.; Hamed, Ramadan

    2017-05-01

    Parameter estimation for multinomial logistic regression is usually based on maximizing the likelihood function. For large, well-balanced datasets, maximum likelihood (ML) estimation is a satisfactory approach. Unfortunately, ML can fail completely, or at least produce poor results in terms of estimated probabilities and confidence intervals of parameters, especially for small datasets. In this study, a new approach based on fuzzy concepts is proposed to estimate the parameters of multinomial logistic regression. The study assumes that the parameters of the multinomial logistic regression are fuzzy. Based on the extension principle stated by Zadeh and on Bárdossy's proposition, a multi-objective programming approach is suggested to estimate these fuzzy parameters. A simulation study is used to evaluate the performance of the new approach against the ML approach. Results show that the proposed model outperforms ML for small datasets.
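
    For context on the ML baseline that the fuzzy approach is compared against, the sketch below fits a plain maximum-likelihood multinomial logistic regression to a deliberately small simulated dataset; the simulated data and the use of scikit-learn (version 1.2 or later for penalty=None) are illustrative assumptions, and the fuzzy multi-objective estimator itself is not reproduced here.

```python
# Sketch of the ML baseline: multinomial logistic regression fit by maximum
# likelihood on a small simulated dataset (illustrative only; the paper's
# fuzzy multi-objective estimator is not reproduced here).
import numpy as np
from sklearn.linear_model import LogisticRegression  # scikit-learn >= 1.2

rng = np.random.default_rng(0)
n, k = 30, 3                      # small sample, three outcome classes
X = rng.normal(size=(n, 2))       # two covariates
true_B = np.array([[1.5, -1.0], [0.0, 2.0], [-1.5, -1.0]])
logits = X @ true_B.T
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
y = np.array([rng.choice(k, p=p) for p in probs])

# Plain maximum-likelihood fit (no penalty); with n = 30 the estimated
# coefficients can land far from true_B or fail under quasi-separation,
# which is the small-sample weakness the abstract describes.
ml = LogisticRegression(penalty=None, max_iter=1000).fit(X, y)
print("ML coefficient estimates:\n", ml.coef_)
print("True coefficients:\n", true_B)
```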

  11. Instrumental Supporting System for Developing and Analysis of Software-Defined Networks of Mobile Objects

    Directory of Open Access Journals (Sweden)

    V. A. Sokolov

    2015-01-01

    Full Text Available This article describes the organization principles of wireless mesh-networks (software-defined networks of mobile objects). The emphasis is on obtaining effective routing algorithms for such networks. The mathematical model of the system is a standard transportation network. The key parameter of the routing system is the node reachability coefficient, a function depending on several basic and additional parameters (“mesh-factors”) which characterize the route between two network nodes. Each (arc, node) pair is assigned a composite parameter which characterizes the “reachability” of the node via the route that begins with this arc. The best (“shortest”) route between two nodes is the route with the maximum reachability coefficient. The rules by which network nodes build and refresh their routing tables are described. From a neighbor's announcement, a node obtains information about the connection's energy and reliability, the time of receipt of the announcement, the absence of transitional nodes, and the connection capacity. On the basis of this information, the node applies a penalization (decreasing the reachability coefficient) or a reward (increasing the reachability coefficient) to all routes through this neighbor node. The penalization/reward scheme has several separate aspects: (1) penalization for the currency of information; (2) penalization/reward for the reliability of a node; (3) penalization for the connection energy; (4) penalization for the present connection capacity. A simulator of the wireless mesh-network of mobile objects, based on the suggested heuristic algorithms, has been written; its description, characteristics, and the peculiarities of its program realization are presented in the article.
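
    To make the penalization/reward scheme concrete, the following is a minimal sketch of how a node might update the reachability coefficients in its routing table; the multiplicative factors, data layout, and function names are illustrative assumptions, not the article's actual algorithm.

```python
# Illustrative sketch of a reachability-coefficient update with multiplicative
# penalties/rewards (hypothetical factors; not the article's actual algorithm).

# routing_table[(first_arc, destination)] -> reachability coefficient in (0, 1].
routing_table = {("arc_to_B", "node_D"): 0.9, ("arc_to_C", "node_D"): 0.7}

def update_reachability(key, age_s, reliable, energy_ok, capacity_ok):
    """Apply the four penalization/reward aspects to one route entry."""
    r = routing_table[key]
    r *= max(0.1, 1.0 - 0.01 * age_s)   # 1. penalize stale information
    r *= 1.1 if reliable else 0.5       # 2. reward/penalize node reliability
    r *= 1.0 if energy_ok else 0.8      # 3. penalize weak connection energy
    r *= 1.0 if capacity_ok else 0.7    # 4. penalize low present capacity
    routing_table[key] = min(r, 1.0)    # keep the coefficient in (0, 1]

update_reachability(("arc_to_B", "node_D"), age_s=5, reliable=True,
                    energy_ok=True, capacity_ok=False)

# The "shortest" route to node_D starts with the arc whose entry carries
# the maximum reachability coefficient.
best = max((k for k in routing_table if k[1] == "node_D"),
           key=routing_table.get)
print("best first arc toward node_D:", best[0], routing_table[best])
```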

  12. AN ANALYSIS OF THE ENVIRONMENTS OF FU ORIONIS OBJECTS WITH HERSCHEL

    Energy Technology Data Exchange (ETDEWEB)

    Green, Joel D.; Evans, Neal J. II; Merello, Manuel [Department of Astronomy, The University of Texas at Austin, 2515 Speedway, Stop C1400, Austin, TX 78712-1205 (United States); Kospal, Agnes [European Space Agency (ESA/ESTEC), Keplerlaan 1, 2200-AG Noordwijk (Netherlands); Herczeg, Gregory [Kavli Institute for Astronomy and Astrophysics, Peking University, Beijing 100871 (China); Quanz, Sascha P. [Institute for Astronomy, ETH Zurich, Wolfgang-Pauli-Strasse 27, CH-8093 Zurich (Switzerland); Henning, Thomas; Bouwman, Jeroen [Max Planck Institute for Astronomy, Koenigstuhl 17, D-69117 Heidelberg (Germany); Van Kempen, Tim A. [Leiden Observatory, Leiden University, P.O. Box 9513, 2300-RA Leiden (Netherlands); Lee, Jeong-Eun [Department of Astronomy and Space Science, Kyung Hee University, Gyeonggi 446-701 (Korea, Republic of); Dunham, Michael M. [Department of Astronomy, Yale University, New Haven, CT (United States); Meeus, Gwendolyn [Departamento de Fisica Teorica, Universidad Autonoma de Madrid, Campus Cantoblanco (Spain); Chen, Jo-hsin [Jet Propulsion Laboratory, Pasadena, CA (United States); Guedel, Manuel; Liebhart, Armin [Department of Astrophysics, University of Vienna (Austria); Skinner, Stephen L., E-mail: joel@astro.as.utexas.edu [Center for Astrophysics and Space Astronomy (CASA), University of Colorado, Boulder, CO 80309-0389 (United States)

    2013-08-01

    We present Herschel-HIFI, SPIRE, and PACS 50-670 μm imaging and spectroscopy of six FU Orionis-type objects and candidates (FU Orionis, V1735 Cyg, V1515 Cyg, V1057 Cyg, V1331 Cyg, and HBC 722), ranging in outburst date from 1936 to 2010, from the 'FOOSH' (FU Orionis Objects Surveyed with Herschel) program, as well as ancillary results from the Spitzer Infrared Spectrograph and the Caltech Submillimeter Observatory. In their system properties (L_bol, T_bol, and line emission), we find that FUors are in a variety of evolutionary states. Additionally, some FUors have features of both Class I and II sources: warm continuum consistent with Class II sources, but rotational line emission typical of Class I, far higher than in Class II sources of similar mass/luminosity. Combining several classification techniques, we find an evolutionary sequence consistent with previous mid-IR indicators. We detect [O I] in every source at luminosities consistent with Class 0/I protostars, much greater than in Class II disks. We detect transitions of 13CO (J_up of 5-8) around two sources (V1735 Cyg and HBC 722) but attribute them to nearby protostars. Of the remaining sources, three (FU Ori, V1515 Cyg, and V1331 Cyg) exhibit only low-lying CO, but one (V1057 Cyg) shows CO up to J = 23 → 22 and evidence for H2O and OH emission, at strengths typical of protostars rather than T Tauri stars. Rotational temperatures for 'cool' CO components range from 20 to 81 K, for ~10^50 total CO molecules. We detect [C I] and [N II] primarily as diffuse emission.
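
    The rotational temperatures quoted for the 'cool' CO components are conventionally obtained from a rotation (Boltzmann) diagram, in which ln(N_u/g_u) is fit as a linear function of E_u/k and the slope equals -1/T_rot. The sketch below illustrates such a fit on synthetic, optically thin LTE level populations; the numbers are not the FOOSH measurements.

```python
# Illustrative CO rotation-diagram fit (synthetic level data, optically thin
# LTE; not the FOOSH measurements). Slope of ln(N_u/g_u) vs E_u/k is -1/T_rot.
import numpy as np

# Upper-level energies E_u/k [K] (approximate CO values) and statistical
# weights g_u = 2*J_u + 1 for a few mid-J CO lines.
J_u = np.array([14, 16, 18, 20, 22])
E_u_over_k = np.array([580.5, 751.7, 945.0, 1160.2, 1397.4])
g_u = 2 * J_u + 1

# Synthetic level populations drawn from a Boltzmann distribution at T_true.
T_true, N_scale = 60.0, 1e14
N_u = N_scale * g_u * np.exp(-E_u_over_k / T_true)

# Linear least-squares fit in log space: ln(N_u/g_u) = const - E_u/(k*T_rot).
slope, intercept = np.polyfit(E_u_over_k, np.log(N_u / g_u), 1)
T_rot = -1.0 / slope
print(f"fitted rotational temperature: {T_rot:.1f} K (input was {T_true} K)")
```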

  13. A cardiovascular life history: a life course analysis of the original Framingham Heart Study cohort.

    NARCIS (Netherlands)

    Peeters, A.; Mamun, A.A.; Willekens, F.J.; Bonneux, L.

    2002-01-01

    dietary behaviour and to further examine the associations of different dietary compositions with selected characteristics. Design: Latent class analysis was applied to data from the recent cross-sectional National Family Health Survey that collected information on the intake frequency of selected

  14. Nonradioactive Dangerous Waste Landfill sampling and analysis plan and data quality objectives process summary report

    International Nuclear Information System (INIS)

    Smith, R.C.

    1997-08-01

    This sampling and analysis plan (SAP) defines the sampling and analytical activities and associated procedures that will be used to support the Nonradioactive Dangerous Waste Landfill soil-gas investigation. This SAP consists of three sections: this introduction, the field sampling plan, and the quality assurance project plan. The field sampling plan defines the sampling and analytical methodologies to be performed

  15. Mapping of landslides under dense vegetation cover using object - oriented analysis and LiDAR derivatives

    NARCIS (Netherlands)

    Van Den Eeckhout, Miet; Kerle, N.; Hervas, Javier; Supper, Robert; Margottini, C.; Canuti, P.; Sassa, K.

    2013-01-01

    Light Detection and Ranging (LiDAR) and its wide range of derivative products have become a powerful tool in landslide research, particularly for landslide identification and landslide inventory mapping. In contrast to the many studies that use expert-based analysis of LiDAR derivatives to identify

  16. High Recharge Areas in the Choushui River Alluvial Fan (Taiwan) Assessed from Recharge Potential Analysis and Average Storage Variation Indexes

    Directory of Open Access Journals (Sweden)

    Jui-Pin Tsai

    2015-03-01

    Full Text Available High recharge areas significantly influence the groundwater quality and quantity in regional groundwater systems. Many studies have applied recharge potential analysis (RPA) to estimate groundwater recharge potential (GRP) and have delineated high recharge areas based on the estimated GRP. However, most of these studies define the RPA parameter values by supposition, and this represents a major source of uncertainty in applying RPA. To define the RPA parameter values objectively, without supposition, this study proposes a systematic method based on the theory of parameter identification. A surrogate variable, the average storage variation (ASV) index, is developed to calibrate the RPA parameters, because direct GRP observations are lacking. The results show that the correlation between the ASV indexes and the computed GRP values improved from 0.67 before calibration to 0.85 after calibration, indicating that the calibrated RPA parameters represent the recharge characteristics of the study area well and that defining the RPA parameters with ASV indexes helps to improve accuracy. The calibrated RPA parameters were used to estimate the GRP distribution of the study area, and the GRP values were graded into five levels. Areas at the high and excellent levels are defined as high recharge areas and together compose 7.92% of the study area. Overall, this study demonstrates that the developed approach can objectively define the RPA parameters and the high recharge areas of the Choushui River alluvial fan, and the results should serve as valuable references for the Taiwanese government in its efforts to conserve the groundwater quality and quantity of the study area.
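
    The calibration step, choosing RPA parameter values so that the computed GRP correlates as strongly as possible with the ASV surrogate, can be sketched as a small optimization problem. The weighted-sum GRP form, the synthetic data, and the use of scipy below are illustrative assumptions rather than the paper's exact formulation.

```python
# Sketch of calibrating RPA parameters against an ASV surrogate by maximizing
# Pearson correlation (illustrative weighted-sum GRP; not the paper's model).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n_cells = 200
# Hypothetical normalized RPA factor ratings per grid cell (e.g. lithology,
# land use, slope, soil, drainage density).
factors = rng.uniform(0, 1, size=(n_cells, 5))
# Synthetic ASV index correlated with an unknown weighting of the factors.
asv = factors @ np.array([0.3, 0.1, 0.25, 0.2, 0.15]) \
      + rng.normal(0, 0.05, n_cells)

def neg_correlation(w):
    grp = factors @ w                    # weighted-sum GRP per cell
    return -np.corrcoef(grp, asv)[0, 1]  # maximize Pearson r

# Weights constrained to be non-negative and sum to one.
cons = {"type": "eq", "fun": lambda w: w.sum() - 1.0}
res = minimize(neg_correlation, x0=np.full(5, 0.2),
               bounds=[(0, 1)] * 5, constraints=cons)
print("calibrated weights:", np.round(res.x, 3))
print("correlation with ASV index:", round(-res.fun, 3))
```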

  17. A comparative analysis of pixel- and object-based detection of landslides from very high-resolution images

    Science.gov (United States)

    Keyport, Ren N.; Oommen, Thomas; Martha, Tapas R.; Sajinkumar, K. S.; Gierke, John S.

    2018-02-01

    A comparative analysis of landslides detected by pixel-based and object-oriented analysis (OOA) methods was performed using very high-resolution (VHR) remotely sensed aerial images of San Juan La Laguna, Guatemala, which witnessed widespread devastation during Hurricane Stan in 2005. A 3-band orthophoto of 0.5 m spatial resolution, together with a field-based inventory of 115 landslides, was used for the analysis. A binary reference was assigned with a value of zero for landslide pixels and one for non-landslide pixels. The pixel-based analysis was performed using unsupervised classification, which resulted in 11 different trial classes. Detection of landslides using OOA included two-step K-means clustering to eliminate regions based on brightness, followed by elimination of false positives using object properties such as rectangular fit, compactness, length/width ratio, mean difference of objects, and slope angle. Both the overall accuracy and the F-score for the OOA method outperformed the pixel-based unsupervised classification in both the landslide and non-landslide classes. The overall accuracies for OOA and pixel-based unsupervised classification were 96.5% and 94.3%, respectively, whereas the best F-scores for landslide identification were 84.3% and 77.9%, respectively. The results indicate that OOA is able to identify the majority of landslides with few false positives when compared to pixel-based unsupervised classification.
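
    The two figures of merit reported above, overall accuracy and the F-score, are derived from the binary confusion matrix. The sketch below shows the computation on a made-up pair of reference and predicted masks, following the coding above (zero for landslide, one for non-landslide); the arrays are hypothetical, not the study's data.

```python
# Overall accuracy and F-score from binary masks (made-up data; landslide
# pixels are coded 0 and non-landslide pixels 1, as in the study).
import numpy as np

reference = np.array([0, 0, 0, 1, 1, 1, 1, 1, 0, 1])   # ground truth
predicted = np.array([0, 0, 1, 1, 1, 1, 0, 1, 0, 1])   # classifier output

# Treat landslide (0) as the positive class.
tp = np.sum((reference == 0) & (predicted == 0))
fp = np.sum((reference == 1) & (predicted == 0))
fn = np.sum((reference == 0) & (predicted == 1))

overall_accuracy = np.mean(reference == predicted)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f_score = 2 * precision * recall / (precision + recall)
print(f"overall accuracy = {overall_accuracy:.1%}, F-score = {f_score:.1%}")
```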

  18. Exploring advantages of 4He-PIXE analysis for layered objects in cultural heritage

    International Nuclear Information System (INIS)

    Roehrs, S.; Calligaro, T.; Mathis, F.; Ortega-Feliu, I.; Salomon, J.; Walter, P.

    2006-01-01

    In the field of cultural heritage, 4He particle beams are often used to perform RBS analysis. In most cases, the simultaneously produced X-rays are not considered for PIXE analysis. This paper aims to explore the potential of 4He-induced X-ray emission (α-PIXE) using 4, 5, and 6 MeV 4He beams and to compare its performance with that of conventional PIXE with 3 MeV protons. The α-PIXE and α-RBS spectra were collected at the same time in a vacuum chamber. The K-line X-ray yields produced by the 6 MeV 4He beam were found to be superior to those of protons for atomic numbers below 25. An additional advantage of α-PIXE is the lower bremsstrahlung background, which leads to an improved peak-to-noise ratio for certain elements.

  19. DECIPHERING THE FINEST IMPRINT OF GLACIAL EROSION: OBJECTIVE ANALYSIS OF STRIAE PATTERNS ON BEDROCK

    Directory of Open Access Journals (Sweden)

    Piet Stroeven

    2011-05-01

    Full Text Available The aim of this study is to compare the efficiency of different mathematical and statistical geometrical methods for characterising the orientation distribution of striae on bedrock, in order to decipher the finest imprint of glacial erosion. The methods involved include automatic image analysis by means of the Fast Fourier Transform (FFT) and experimental investigation by means of Saltikov's directed secants analysis (rose of intersection densities), applied to digital and analogue images of the striae pattern, respectively. In addition, the experimental data were compared with modelling results obtained on the basis of Underwood's concept of linear systems in a plane. The experimental and modelling approaches in the framework of stereology yield consistent results. These results reveal that stereo
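
    As an illustration of the FFT route to an orientation distribution: the power spectrum of an image of parallel lineations concentrates energy perpendicular to the striae, so binning spectral power by angle yields a rose of orientations from which the dominant striae direction can be read off. The sketch below runs on a synthetic striae image and is only a schematic of the technique, not the study's implementation.

```python
# Schematic FFT-based orientation analysis of a synthetic striae image
# (illustrative only; not the study's implementation).
import numpy as np

# Synthetic image: parallel "striae" with wave vector at 30 degrees
# (so the stripes themselves run at 120 degrees), plus noise.
n = 256
y, x = np.mgrid[0:n, 0:n]
theta = np.deg2rad(30)
image = np.sin(2 * np.pi * (x * np.cos(theta) + y * np.sin(theta)) / 8.0)
image += 0.5 * np.random.default_rng(2).normal(size=(n, n))

# Centered 2-D power spectrum.
power = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2

# Bin spectral power by angle (the spectrum is elongated perpendicular
# to the striae, so subtract 90 degrees to recover their orientation).
fy, fx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
angles = np.degrees(np.arctan2(fy, fx)) % 180
rose, _ = np.histogram(angles, bins=36, range=(0, 180), weights=power)
dominant = (np.argmax(rose) * 5 + 2.5 - 90) % 180
print(f"estimated striae orientation: {dominant:.1f} degrees")
```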