WorldWideScience

Sample records for variational objective analysis

  1. Some new mathematical methods for variational objective analysis

    Science.gov (United States)

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  2. Analysis of Optical Variations of BL Lac Object AO 0235+164 Wang ...

    Indian Academy of Sciences (India)

    obtain statistically meaningful values for the cross-correlation time lags ... deviation, the fifth represents the largest variations, the sixth represents the fractional ... 6. Conclusions. The multi-band optical data are collected for the object AO 0235+164. The time lags among the B, V, R and I bands have been analysed.

  3. Feedforward Object-Vision Models Only Tolerate Small Image Variations Compared to Human

    Directory of Open Access Journals (Sweden)

    Masoud eGhodrati

    2014-07-01

    Full Text Available Invariant object recognition is a remarkable ability of the primate visual system whose underlying mechanism has constantly been under intense investigation. Computational modelling is a valuable tool toward understanding the processes involved in invariant object recognition. Although recent computational models have shown outstanding performance on challenging image databases, they fail to perform well when images with more complex variations of the same object are applied to them. Studies have shown that making sparse representations of objects by extracting more informative visual features through a feedforward sweep can lead to higher recognition performance. Here, however, we show that when the complexity of image variations is high, even this approach results in poor performance compared to humans. To assess the performance of models and humans in invariant object recognition tasks, we built a parametrically controlled image database consisting of several object categories varied in different dimensions and levels, rendered from 3D planes. Comparing the performance of several object recognition models with human observers shows that only for low-level image variations do the models perform similarly to humans in categorization tasks. Furthermore, the results of our behavioral experiments demonstrate that, even under difficult experimental conditions (i.e. briefly presented masked stimuli with complex image variations), human observers performed outstandingly well, suggesting that the models are still far from resembling humans in invariant object recognition. Taken together, we suggest that learning sparse informative visual features, although desirable, is not a complete solution for future progress in object-vision modelling. We show that this approach is not of significant help in solving the computational crux of object recognition (that is, invariant object recognition) when the identity-preserving image variations become more complex.

  4. Learning-based stochastic object models for characterizing anatomical variations

    Science.gov (United States)

    Dolly, Steven R.; Lou, Yang; Anastasio, Mark A.; Li, Hua

    2018-03-01

    It is widely known that the optimization of imaging systems based on objective, task-based measures of image quality via computer-simulation requires the use of a stochastic object model (SOM). However, the development of computationally tractable SOMs that can accurately model the statistical variations in human anatomy within a specified ensemble of patients remains a challenging task. Previously reported numerical anatomic models lack the ability to accurately model inter-patient and inter-organ variations in human anatomy among a broad patient population, mainly because they are established on image data corresponding to only a few patients and individual anatomic organs. This may introduce phantom-specific bias into computer-simulation studies, where the study result is heavily dependent on which phantom is used. In certain applications, however, databases of high-quality volumetric images and organ contours are available that can facilitate this SOM development. In this work, a novel and tractable methodology for learning a SOM and generating numerical phantoms from a set of volumetric training images is developed. The proposed methodology learns geometric attribute distributions (GAD) of human anatomic organs from a broad patient population, which characterize both centroid relationships between neighboring organs and anatomic shape similarity of individual organs among patients. By randomly sampling the learned centroid and shape GADs with the constraints of the respective principal attribute variations learned from the training data, an ensemble of stochastic objects can be created. The randomness in organ shape and position reflects the learned variability of human anatomy. To demonstrate the methodology, a SOM of an adult male pelvis is computed and examples of corresponding numerical phantoms are created.

  5. Objective definition of rosette shape variation using a combined computer vision and data mining approach.

    Directory of Open Access Journals (Sweden)

    Anyela Camargo

    Full Text Available Computer-vision based measurements of phenotypic variation have implications for crop improvement and food security because they are intrinsically objective. It should therefore be possible to use such approaches to select robust genotypes. However, plants are morphologically complex and identification of meaningful traits from automatically acquired image data is not straightforward. Bespoke algorithms can be designed to capture and/or quantitate specific features, but this approach is inflexible and is not generally applicable to a wide range of traits. In this paper, we have used industry-standard computer vision techniques to extract a wide range of features from images of genetically diverse Arabidopsis rosettes growing under non-stimulated conditions, and then used statistical analysis to identify those features that provide good discrimination between ecotypes. This analysis indicates that almost all the observed shape variation can be described by 5 principal components. We describe an easily implemented pipeline including image segmentation, feature extraction and statistical analysis. This pipeline provides a cost-effective and inherently scalable method to parameterise and analyse variation in rosette shape. The acquisition of images does not require any specialised equipment, and the computer routines for image processing and data analysis have been implemented using open source software. Source code for the data analysis is written in R. The equations used to calculate the image descriptors are also provided.
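
    The abstract's central statistical step is a principal component analysis of extracted shape descriptors. The short sketch below (in Python rather than the authors' R code, with a synthetic feature matrix) illustrates how the fraction of variance captured by the leading components would be computed; the function name and data are illustrative assumptions, not the paper's pipeline.

```python
# Hypothetical sketch: PCA on a rosette feature matrix (rows = plants,
# columns = shape descriptors), mirroring the finding that a few principal
# components capture most of the shape variation.
import numpy as np

def pca_explained_variance(features, n_components=5):
    """Return the fraction of total variance captured by the leading components."""
    X = np.asarray(features, dtype=float)
    X = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)   # standardize each descriptor
    _, s, _ = np.linalg.svd(X, full_matrices=False)    # X is centered after standardizing
    var = s**2 / (X.shape[0] - 1)                      # component variances
    return var[:n_components] / var.sum()

# Example with synthetic data: 100 rosettes, 20 shape descriptors
rng = np.random.default_rng(0)
features = rng.normal(size=(100, 20))
print(pca_explained_variance(features))
```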

  6. An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method

    Science.gov (United States)

    Tang, J.

    2012-01-01

    Multiple signal classification (MUSIC) algorithms are introduced for estimating the variation period of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of the frequency-spectrum resolution using analog signals are included. From the literature, we have collected effective observation data of the BL Lac object S5 0716+714 in the V, R, and I bands from 1994 to 2008. The light variation periods of S5 0716+714 are obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. There exist two major periods: (3.33±0.08) years and (1.24±0.01) years for all bands. The period estimate obtained with the MUSIC spectral analysis method is compared with that obtained with the periodogram spectral analysis method. MUSIC is a super-resolution algorithm that works with small data lengths and could be used to detect the variation periods of weak signals.
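
    To make the MUSIC idea concrete, here is a minimal, generic pseudospectrum sketch for an evenly sampled series; it is only illustrative (the paper's light curves are unevenly sampled and its preprocessing is not reproduced), and all function names, snapshot lengths, and test signals are assumptions.

```python
# Minimal MUSIC pseudospectrum sketch for period estimation in a sampled signal.
import numpy as np

def music_pseudospectrum(x, n_sinusoids, m, freqs):
    """x: 1-D signal; n_sinusoids: assumed number of real sinusoids;
    m: snapshot length; freqs: normalized frequencies (cycles/sample)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    # Correlation matrix estimated from overlapping snapshots
    snaps = np.array([x[i:i + m] for i in range(len(x) - m + 1)])
    R = snaps.T @ snaps / snaps.shape[0]
    w, v = np.linalg.eigh(R)                  # eigenvalues in ascending order
    noise = v[:, : m - 2 * n_sinusoids]       # noise subspace (2 eigvecs per real sinusoid)
    n = np.arange(m)
    spec = []
    for f in freqs:
        a = np.exp(2j * np.pi * f * n)        # steering vector at frequency f
        spec.append(1.0 / np.linalg.norm(noise.conj().T @ a) ** 2)
    return np.array(spec)

# Example: recover a 50-sample period from a noisy synthetic series
t = np.arange(600)
sig = np.sin(2 * np.pi * t / 50) + 0.3 * np.random.randn(t.size)
freqs = np.linspace(0.005, 0.1, 500)
spec = music_pseudospectrum(sig, n_sinusoids=1, m=40, freqs=freqs)
print("estimated period (samples):", 1.0 / freqs[np.argmax(spec)])
```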

  7. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  8. Analysis on Precipitation Variation in Anyang and Nanyang in Recent 57 Years

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The aim was to analyze precipitation variation in Anyang and Nanyang over the past 57 years. [Method] Based on the annual precipitation data for Anyang and Nanyang from 1953 to 2009, the changes in precipitation in the two cities were compared by means of mathematical statistics, regression analysis and wavelet analysis. [Result] Over the past 57 years, annual precipitation in Anyang and Nanyang showed a decreasing trend, with the decrease in Anyang being particularly obvious; in terms of seasonal variation, average ...

  9. Exergoeconomic multi objective optimization and sensitivity analysis of a regenerative Brayton cycle

    International Nuclear Information System (INIS)

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2016-01-01

    Highlights: • Finite time exergoeconomic multi objective optimization of a Brayton cycle. • Comparing the exergoeconomic and the ecological function optimization results. • Inserting the cost of fluid streams concept into finite-time thermodynamics. • Exergoeconomic sensitivity analysis of a regenerative Brayton cycle. • Suggesting the cycle performance curve drawing and utilization. - Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power maximization and then exergoeconomic optimization using the finite-time thermodynamic concept and finite-size components. Optimizations are performed using a genetic algorithm. In order to take the finite-time and finite-size concepts into account in the current problem, a dimensionless mass-flow parameter is used deploying time variations. The decision variables for the optimum state (of the multi objective exergoeconomic optimization) are compared to those of the maximum power state. One can see that the multi objective exergoeconomic optimization results in a better performance than that obtained with the maximum power state. The results demonstrate that system performance at the optimum point of the multi objective optimization yields 71% of the maximum power, but with exergy destruction only 24% of the amount produced at the maximum power state and a 67% lower total cost rate than that of the maximum power state. In order to assess the impact of the variation of the decision variables on the objective functions, a sensitivity analysis is conducted. Finally, the drawing and utilization of the cycle performance curve according to the exergoeconomic multi objective optimization results are suggested.

  10. Humans and Deep Networks Largely Agree on Which Kinds of Variation Make Object Recognition Harder.

    Science.gov (United States)

    Kheradpisheh, Saeed R; Ghodrati, Masoud; Ganjtabesh, Mohammad; Masquelier, Timothée

    2016-01-01

    View-invariant object recognition is a challenging problem that has attracted much attention among the psychology, neuroscience, and computer vision communities. Humans are notoriously good at it, even if some variations are presumably more difficult to handle than others (e.g., 3D rotations). Humans are thought to solve the problem through hierarchical processing along the ventral stream, which progressively extracts more and more invariant visual features. This feed-forward architecture has inspired a new generation of bio-inspired computer vision systems called deep convolutional neural networks (DCNN), which are currently the best models for object recognition in natural images. Here, for the first time, we systematically compared human feed-forward vision and DCNNs on a view-invariant object recognition task using the same set of images and controlling the kinds of transformation (position, scale, rotation in plane, and rotation in depth) as well as their magnitude, which we call "variation level." We used four object categories: car, ship, motorcycle, and animal. In total, 89 human subjects participated in 10 experiments in which they had to discriminate between two or four categories after rapid presentation with backward masking. We also tested two recent DCNNs (proposed respectively by Hinton's group and Zisserman's group) on the same tasks. We found that humans and DCNNs largely agreed on the relative difficulties of each kind of variation: rotation in depth is by far the hardest transformation to handle, followed by scale, then rotation in plane, and finally position (much easier). This suggests that DCNNs would be reasonable models of human feed-forward vision. In addition, our results show that the variation levels in rotation in depth and scale strongly modulate both humans' and DCNNs' recognition performances. We thus argue that these variations should be controlled in the image datasets used in vision research.

  11. Humans and deep networks largely agree on which kinds of variation make object recognition harder

    Directory of Open Access Journals (Sweden)

    Saeed Reza Kheradpisheh

    2016-08-01

    Full Text Available View-invariant object recognition is a challenging problem that has attracted much attention among the psychology, neuroscience, and computer vision communities. Humans are notoriously good at it, even if some variations are presumably more difficult to handle than others (e.g., 3D rotations). Humans are thought to solve the problem through hierarchical processing along the ventral stream, which progressively extracts more and more invariant visual features. This feed-forward architecture has inspired a new generation of bio-inspired computer vision systems called deep convolutional neural networks (DCNN), which are currently the best models for object recognition in natural images. Here, for the first time, we systematically compared human feed-forward vision and DCNNs on a view-invariant object recognition task using the same set of images and controlling the kinds of transformation (position, scale, rotation in plane, and rotation in depth) as well as their magnitude, which we call variation level. We used four object categories: car, ship, motorcycle, and animal. In total, 89 human subjects participated in 10 experiments in which they had to discriminate between two or four categories after rapid presentation with backward masking. We also tested two recent DCNNs (proposed respectively by Hinton's group and Zisserman's group) on the same tasks. We found that humans and DCNNs largely agreed on the relative difficulties of each kind of variation: rotation in depth is by far the hardest transformation to handle, followed by scale, then rotation in plane, and finally position (much easier). This suggests that DCNNs would be reasonable models of human feed-forward vision. In addition, our results show that the variation levels in rotation in depth and scale strongly modulate both humans' and DCNNs' recognition performances. We thus argue that these variations should be controlled in the image datasets used in vision research.

  12. Salient Point Detection in Protrusion Parts of 3D Object Robust to Isometric Variations

    Science.gov (United States)

    Mirloo, Mahsa; Ebrahimnezhad, Hosein

    2018-03-01

    In this paper, a novel method is proposed to detect 3D object salient points that is robust to isometric variations and stable against scaling and noise. Salient points can be used as representative points of object protrusion parts in order to improve object matching and retrieval algorithms. The proposed algorithm starts by determining the first salient point of the model based on the average geodesic distance of several random points. Then, according to the previously selected salient points, a new point is added to the set in each iteration. With every added salient point, the decision function is updated; this creates a condition under which the next point is not extracted from the same protrusion part, so that a representative point is drawn from every protrusion part. The method is stable against model variations under isometric transformations, scaling, and noise of different strengths because it uses a feature robust to isometric variations and considers the relation between the salient points. In addition, the number of points used in the averaging process is decreased, which leads to lower computational complexity in comparison with other salient point detection algorithms.
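
    The selection loop described above (first point by average geodesic distance, later points constrained away from already-covered protrusions) can be illustrated with a simple farthest-point-style sketch on a mesh graph. This is a hedged approximation of the idea, not the paper's exact decision function; all names and the toy graph are assumptions.

```python
# Illustrative salient-point selection on a mesh graph using geodesic distances.
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import dijkstra

def select_salient_points(n_vertices, edges, weights, n_points, rng=None):
    rng = np.random.default_rng(rng)
    rows, cols = zip(*edges)
    graph = csr_matrix((weights, (rows, cols)), shape=(n_vertices, n_vertices))
    # First salient point: largest average geodesic distance to random sample vertices
    samples = rng.choice(n_vertices, size=min(50, n_vertices), replace=False)
    d = dijkstra(graph, directed=False, indices=samples)
    selected = [int(np.argmax(d.mean(axis=0)))]
    # Subsequent points: farthest (on average) from the points already selected,
    # which discourages picking another point on the same protrusion
    while len(selected) < n_points:
        d_sel = dijkstra(graph, directed=False, indices=selected)
        selected.append(int(np.argmax(d_sel.mean(axis=0))))
    return selected

# Tiny example: a path graph with 6 vertices and unit edge lengths
edges = [(i, i + 1) for i in range(5)]
print(select_salient_points(6, edges, [1.0] * 5, n_points=3))
```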

  13. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    OpenAIRE

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  14. The Effect of Geographic Units of Analysis on Measuring Geographic Variation in Medical Services Utilization

    Directory of Open Access Journals (Sweden)

    Agnus M. Kim

    2016-07-01

    Full Text Available Objectives: We aimed to evaluate the effect of geographic units of analysis on measuring geographic variation in medical services utilization. For this purpose, we compared geographic variations in the rates of eight major procedures in administrative units (districts) and new areal units organized based on the actual health care use of the population in Korea. Methods: To compare geographic variation in geographic units of analysis, we calculated the age–sex standardized rates of eight major procedures (coronary artery bypass graft surgery, percutaneous transluminal coronary angioplasty, surgery after hip fracture, knee-replacement surgery, caesarean section, hysterectomy, computed tomography scan, and magnetic resonance imaging scan) from the National Health Insurance database in Korea for the 2013 period. Using the coefficient of variation, the extremal quotient, and the systematic component of variation, we measured geographic variation for these eight procedures in districts and new areal units. Results: Compared with districts, new areal units showed a reduction in geographic variation. Extremal quotients and inter-decile ratios for the eight procedures were lower in new areal units. While the coefficient of variation was lower for most procedures in new areal units, the pattern of change of the systematic component of variation between districts and new areal units differed among procedures. Conclusions: Geographic variation in medical service utilization could vary according to the geographic unit of analysis. To determine how geographic characteristics such as population size and number of geographic units affect geographic variation, further studies are needed.
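
    The three small-area variation statistics named in the Methods can be computed directly from observed and expected counts per area. The sketch below is a generic illustration with made-up numbers; the SCV expression follows the common McPherson-style formulation and is an assumption here, not necessarily the exact variant used in the paper.

```python
# Compute coefficient of variation, extremal quotient, and systematic
# component of variation for area-level procedure counts.
import numpy as np

def variation_statistics(observed, expected):
    """observed/expected: procedure counts per geographic unit (same length)."""
    observed = np.asarray(observed, dtype=float)
    expected = np.asarray(expected, dtype=float)
    rates = observed / expected
    cv = rates.std(ddof=1) / rates.mean()                         # coefficient of variation
    eq = rates.max() / rates.min()                                # extremal quotient
    scv = np.mean(((observed - expected) / expected) ** 2 - 1.0 / expected)
    return {"CV": cv, "EQ": eq, "SCV": scv}

# Example: 8 districts with observed vs. expected (age-sex adjusted) counts
obs = [120, 95, 140, 80, 110, 132, 90, 105]
exp = [100] * 8
print(variation_statistics(obs, exp))
```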

  15. Cross-cultural variation of memory colors of familiar objects.

    Science.gov (United States)

    Smet, Kevin A G; Lin, Yandan; Nagy, Balázs V; Németh, Zoltan; Duque-Chica, Gloria L; Quintero, Jesús M; Chen, Hung-Shing; Luo, Ronnier M; Safi, Mahdi; Hanselaer, Peter

    2014-12-29

    The effect of cross-regional or cross-cultural differences on color appearance ratings and memory colors of familiar objects was investigated in seven different countries/regions - Belgium, Hungary, Brazil, Colombia, Taiwan, China and Iran. In each region the familiar objects were presented on a calibrated monitor in over 100 different colors to a test panel of observers that were asked to rate the similarity of the presented object color with respect to what they thought the object looks like in reality (memory color). For each object and region the mean observer ratings were modeled by a bivariate Gaussian function. A statistical analysis showed statistically significant differences between regions, but the effect of culture was found to be small. In fact, the differences between the region average observers and the global average observer were found to be of the same magnitude or smaller than the typical within-region inter-observer variability. Thus, although statistical differences in color appearance ratings and memory colors between regions were found, the regional impact is not likely to be of practical importance.
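
    The modeling step (a bivariate Gaussian fitted to mean similarity ratings over presented colors) can be sketched with a standard nonlinear least-squares fit. The coordinate axes, rating scale, and all names below are illustrative assumptions; the study's actual color space and fitting procedure may differ.

```python
# Fit a bivariate Gaussian to mean similarity ratings over two chromaticity axes.
import numpy as np
from scipy.optimize import curve_fit

def bivariate_gaussian(xy, amp, x0, y0, sx, sy, rho):
    x, y = xy
    u = (x - x0) / sx
    v = (y - y0) / sy
    z = (u**2 - 2 * rho * u * v + v**2) / (2 * (1 - rho**2))
    return amp * np.exp(-z)

# Synthetic ratings on a grid of presented chromaticities
xg, yg = np.meshgrid(np.linspace(-20, 20, 15), np.linspace(-20, 20, 15))
true = bivariate_gaussian((xg.ravel(), yg.ravel()), 5.0, 2.0, -3.0, 6.0, 8.0, 0.3)
ratings = true + 0.2 * np.random.randn(true.size)

bounds = ([0, -np.inf, -np.inf, 0.1, 0.1, -0.99],
          [np.inf, np.inf, np.inf, np.inf, np.inf, 0.99])
popt, _ = curve_fit(bivariate_gaussian, (xg.ravel(), yg.ravel()), ratings,
                    p0=(ratings.max(), 0, 0, 5, 5, 0), bounds=bounds)
print("fitted memory-color center:", popt[1], popt[2])
```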

  16. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  17. Error analysis of motion correction method for laser scanning of moving objects

    Science.gov (United States)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned should be static. The need to scan moving objects has resulted in the development of new methods capable of generating correct 3D geometry of moving objects. Limited literature is available showing development of very few methods capable of catering to the problem of object motion during scanning. All the existing methods utilize their own models or sensors. Studies on error modelling or analysis of any of the motion correction methods are lacking in the literature. In this paper, we develop the error budget and present the analysis of one such 'motion correction' method. This method assumes availability of position and orientation information of the moving object, which in general can be obtained by installing a POS system on board or by use of some tracking devices. It then uses this information along with laser scanner data to apply corrections to the laser data, thus resulting in correct geometry despite the object being mobile during scanning. The major application of this method lies in the shipping industry, to scan ships either moving or parked in the sea, and to scan other objects like hot air balloons or aerostats. It is to be noted that the other methods of "motion correction" described in the literature cannot be applied to scan the objects mentioned here, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components, alone and in combination with each other. The analysis can be used to obtain insights into optimal utilization of the available components for achieving the best results.
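
    The core of the correction described above is geometric: each laser return, measured while the object is at a known pose, is mapped into the object's body frame using the POS trajectory interpolated at that return's timestamp. The sketch below shows that idea only; the function, pose interpolation scheme, and toy trajectory are assumptions, not the paper's implementation.

```python
# Simplified motion correction: transform laser returns into the moving
# object's body frame using its time-stamped position and orientation.
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def correct_moving_object(points, times, pose_times, positions, quats):
    """points: Nx3 returns in the (static) scanner/world frame; times: N timestamps;
    pose_times/positions/quats: sampled POS trajectory of the object."""
    slerp = Slerp(pose_times, Rotation.from_quat(quats))
    rot = slerp(times)                                    # object orientation per return
    pos = np.array([np.interp(times, pose_times, positions[:, k]) for k in range(3)]).T
    # p_body = R^T (p_world - t): undo the object's motion at each timestamp
    return rot.inv().apply(points - pos)

# Example: object translating along x while being scanned (no rotation)
pose_times = np.linspace(0.0, 1.0, 11)
positions = np.column_stack([pose_times * 2.0, np.zeros(11), np.zeros(11)])
quats = np.tile([0.0, 0.0, 0.0, 1.0], (11, 1))
pts = np.array([[1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
print(correct_moving_object(pts, np.array([0.0, 0.5]), pose_times, positions, quats))
# Both returns map to the same body-frame point, recovering consistent geometry.
```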

  18. THE HUBBLE WIDE FIELD CAMERA 3 TEST OF SURFACES IN THE OUTER SOLAR SYSTEM: SPECTRAL VARIATION ON KUIPER BELT OBJECTS

    International Nuclear Information System (INIS)

    Fraser, Wesley C.; Brown, Michael E.; Glass, Florian

    2015-01-01

    Here, we present additional photometry of targets observed as part of the Hubble Wide Field Camera 3 (WFC3) Test of Surfaces in the Outer Solar System. Twelve targets were re-observed with the WFC3 in the optical and NIR wavebands designed to complement those used during the first visit. Additionally, all of the observations originally presented by Fraser and Brown were reanalyzed through the same updated photometry pipeline. A re-analysis of the optical and NIR color distribution reveals a bifurcated optical color distribution and only two identifiable spectral classes, each of which occupies a broad range of colors and has correlated optical and NIR colors, in agreement with our previous findings. We report the detection of significant spectral variations on five targets which cannot be attributed to photometry errors, cosmic rays, point-spread function or sensitivity variations, or other image artifacts capable of explaining the magnitude of the variation. The spectrally variable objects are found to have a broad range of dynamical classes and absolute magnitudes, exhibit a broad range of apparent magnitude variations, and are found in both compositional classes. The spectrally variable objects with sufficiently accurate colors for spectral classification maintain their membership, belonging to the same class at both epochs. 2005 TV189 exhibits a sufficiently broad difference in color at the two epochs that it spans the full range of colors of the neutral class. This strongly argues that the neutral class is one single class with a broad range of colors, rather than the combination of multiple overlapping classes.

  19. Object-sensitive Type Analysis of PHP

    NARCIS (Netherlands)

    Van der Hoek, Henk Erik; Hage, J

    2015-01-01

    In this paper we develop an object-sensitive type analysis for PHP, based on an extension of the notion of monotone frameworks to deal with the dynamic aspects of PHP, and following the framework of Smaragdakis et al. for object-sensitive analysis. We consider a number of instantiations of the

  20. Neutron activation analysis of limestone objects

    International Nuclear Information System (INIS)

    Meyers, P.; Van Zelst, L.

    1977-01-01

    The elemental composition of samples from limestone objects was determined by neutron activation analysis to investigate whether this technique can be used to distinguish between objects made of limestone from different sources. Samples weighing between 0.2-2 grams were obtained by drilling from a series of ancient Egyptian and medieval Spanish objects. Analysis was performed on aliquots varying in weight from 40-100 milligrams. The following elements were determined quantitatively: Na, K, Rb, Cs, Ba, Sc, La, Ce, Sm, Eu, Hf, Th, Ta, Cr, Mn, Fe, Co and Zn. The data on Egyptian limestones indicate that, because of the inhomogeneous nature of the stone, 0.2-2 gram samples may not be representative of an entire object. Nevertheless, multivariate statistical methods produced a clear distinction between objects originating from the Luxor area (ancient Thebes) and objects found north of Luxor. The Spanish limestone studied appeared to be more homogeneous. Samples from stylistically related objects have similar elemental compositions, while relatively large differences were observed between objects having no relationship other than the common provenance of medieval Spain. (orig.) [de]

  1. Introduction to global variational geometry

    CERN Document Server

    Krupka, Demeter

    2015-01-01

    The book is devoted to recent research in the global variational theory on smooth manifolds. Its main objective is an extension of the classical variational calculus on Euclidean spaces to (topologically nontrivial) finite-dimensional smooth manifolds; to this purpose the methods of global analysis of differential forms are used. Emphasis is placed on the foundations of the theory of variational functionals on fibered manifolds - relevant geometric structures for variational principles in geometry, physical field theory and higher-order fibered mechanics. The book chapters include: - foundations of jet bundles and analysis of differential forms and vector fields on jet bundles, - the theory of higher-order integral variational functionals for sections of a fibred space, the (global) first variational formula in infinitesimal and integral forms, - extremal conditions and the discussion of Noether symmetries and generalizations, - the inverse problems of the calculus of variations of Helmholtz type, - variational se...

  2. Economic emission dispatching with variations of wind power and loads using multi-objective optimization by learning automata

    International Nuclear Information System (INIS)

    Liao, H.L.; Wu, Q.H.; Li, Y.Z.; Jiang, L.

    2014-01-01

    Highlights: • Apply multi-objective optimization by learning automata to power systems. • Sequentially dimensional search and state memory are incorporated. • Track dispatch under significant variations of wind power and load demand. • Good performance in terms of accuracy, distribution and computation time. - Abstract: This paper is concerned with using multi-objective optimization by learning automata (MOLA) for economic emission dispatching in an environment where wind power and loads vary. With its capabilities of sequentially dimensional search and state memory, MOLA is able to find accurate solutions while satisfying two objectives: fuel cost coupled with environmental emission, and voltage stability. Its searching quality and efficiency are measured using the hypervolume indicator for investigating the quality of the Pareto front, and demonstrated by tracking the dispatch solutions under significant variations of wind power and load demand. The simulation studies are carried out on the modified midwestern American electric power system and the IEEE 118-bus test system, in which wind power penetration and load variations are present. Evaluated on these two power systems, MOLA is fully compared with the multi-objective evolutionary algorithm based on decomposition (MOEA/D) and the non-dominated sorting genetic algorithm II (NSGA-II). The simulation results have shown the superiority of MOLA over NSGA-II and MOEA/D, as it is able to obtain more accurate and widely distributed Pareto fronts. In the dynamic environment where the operation conditions of both wind speed and load demand vary, MOLA outperforms the other two algorithms with respect to the tracking ability and accuracy of the solutions.
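
    The quality measure used above, the hypervolume indicator, has a simple closed form for a two-objective minimization front: sum the rectangles each non-dominated point adds toward a reference point. The sketch below is a generic illustration with made-up front values and reference point, not data from the paper.

```python
# Hypervolume of a two-objective (minimization) Pareto front w.r.t. a reference point.
import numpy as np

def hypervolume_2d(front, ref):
    """front: iterable of non-dominated (f1, f2) points; ref: reference point."""
    pts = np.asarray(front, dtype=float)
    pts = pts[np.argsort(pts[:, 0])]          # sort by first objective (f2 then decreases)
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        hv += (ref[0] - f1) * (prev_f2 - f2)  # rectangle contributed by this point
        prev_f2 = f2
    return hv

front = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)]
print(hypervolume_2d(front, ref=(6.0, 6.0)))  # larger value = better front coverage
```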

  3. Numerical Analysis Objects

    Science.gov (United States)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.

  4. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    Science.gov (United States)

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  5. Convergence analysis of variational and non-variational multigrid algorithms for the Laplace-Beltrami operator

    KAUST Repository

    Bonito, Andrea

    2012-09-01

    We design and analyze variational and non-variational multigrid algorithms for the Laplace-Beltrami operator on a smooth and closed surface. In both cases, a uniform convergence for the V -cycle algorithm is obtained provided the surface geometry is captured well enough by the coarsest grid. The main argument hinges on a perturbation analysis from an auxiliary variational algorithm defined directly on the smooth surface. In addition, the vanishing mean value constraint is imposed on each level, thereby avoiding singular quadratic forms without adding additional computational cost. Numerical results supporting our analysis are reported. In particular, the algorithms perform well even when applied to surfaces with a large aspect ratio. © 2011 American Mathematical Society.

  6. Object-oriented analysis and design

    CERN Document Server

    Deacon, John

    2005-01-01

    John Deacon’s in-depth, highly pragmatic approach to object-oriented analysis and design, demonstrates how to lay the foundations for developing the best possible software. Students will learn how to ensure that analysis and design remain focused and productive. By working through the book, they will gain a solid working knowledge of best practices in software development.

  7. R-FCN Object Detection Ensemble based on Object Resolution and Image Quality

    DEFF Research Database (Denmark)

    Rasmussen, Christoffer Bøgelund; Nasrollahi, Kamal; Moeslund, Thomas B.

    2017-01-01

    Object detection can be difficult due to challenges such as variations in objects both inter- and intra-class. Additionally, variations can also be present between images. Based on this, research was conducted into creating an ensemble of Region-based Fully Convolutional Networks (R-FCN) object d...

  8. Finite time exergy analysis and multi-objective ecological optimization of a regenerative Brayton cycle considering the impact of flow rate variations

    International Nuclear Information System (INIS)

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2015-01-01

    Highlights: • Defining a dimensionless parameter includes the finite-time and size concepts. • Inserting the concept of exergy of fluid streams into finite-time thermodynamics. • Defining, drawing and modifying of maximum ecological function curve. • Suggesting the appropriate performance zone, according to maximum ecological curve. - Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power and then ecological function maximization using finite-time thermodynamic concept and finite-size components. Multi-objective optimization is used for maximizing the ecological function. Optimizations are performed using genetic algorithm. In order to take into account the finite-time and finite-size concepts in current problem, a dimensionless mass-flow parameter is introduced deploying time variations. The variations of output power, total exergy destruction of the system, and decision variables for the optimum state (maximum ecological function state) are compared to the maximum power state using the dimensionless parameter. The modified ecological function in optimum state is obtained and plotted relating to the dimensionless mass-flow parameter. One can see that the modified ecological function study results in a better performance than that obtained with the maximum power state. Finally, the appropriate performance zone of the heat engine will be obtained

  9. Identification of uncommon objects in containers

    Science.gov (United States)

    Bremer, Peer-Timo; Kim, Hyojin; Thiagarajan, Jayaraman J.

    2017-09-12

    A system for identifying in an image an object that is commonly found in a collection of images and for identifying a portion of an image that represents an object based on a consensus analysis of segmentations of the image. The system collects images of containers that contain objects for generating a collection of common objects within the containers. To process the images, the system generates a segmentation of each image. The image analysis system may also generate multiple segmentations for each image by introducing variations in the selection of voxels to be merged into a segment. The system then generates clusters of the segments based on similarity among the segments. Each cluster represents a common object found in the containers. Once the clustering is complete, the system may be used to identify common objects in images of new containers based on similarity between segments of images and the clusters.

  10. Interpretation of engine cycle-to-cycle variation by chaotic time series analysis

    Energy Technology Data Exchange (ETDEWEB)

    Daw, C.S.; Kahl, W.K.

    1990-01-01

    In this paper we summarize preliminary results from applying a new mathematical technique -- chaotic time series analysis (CTSA) -- to cylinder pressure data from a spark-ignition (SI) four-stroke engine fueled with both methanol and iso-octane. Our objective is to look for the presence of "deterministic chaos" dynamics in peak pressure variations and to investigate the potential usefulness of CTSA as a diagnostic tool. Our results suggest that sequential peak cylinder pressures exhibit some characteristic features of deterministic chaos and that CTSA can extract previously unrecognized information from such data. 18 refs., 11 figs., 2 tabs.
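
    One standard ingredient of chaotic time series analysis is time-delay embedding of the scalar series followed by a correlation-sum estimate. The sketch below shows that generic step on a stand-in signal (a logistic map); it is not the paper's procedure, and all names and parameter choices are illustrative assumptions.

```python
# Time-delay embedding and a crude Grassberger-Procaccia-style correlation sum.
import numpy as np

def delay_embed(x, dim, tau):
    """Return delay vectors [x_i, x_{i+tau}, ..., x_{i+(dim-1)*tau}]."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

def correlation_sum(embedded, r):
    """Fraction of point pairs closer than radius r."""
    d = np.linalg.norm(embedded[:, None, :] - embedded[None, :, :], axis=-1)
    iu = np.triu_indices(len(embedded), k=1)
    return np.mean(d[iu] < r)

# Example with the logistic map standing in for a peak-pressure sequence
x = np.empty(500)
x[0] = 0.4
for i in range(499):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])
emb = delay_embed(x, dim=3, tau=1)
print(correlation_sum(emb, r=0.1))
```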

  11. Comparative analysis of face recognition techniques with illumination variation

    International Nuclear Information System (INIS)

    Jondhale, K C; Waghmare, L M

    2010-01-01

    Illumination variation is one of the major challenges in face recognition. To deal with this problem, this paper presents a comparative analysis of three different techniques. First, the DCT is employed to compensate for illumination variations in the logarithm domain. Since illumination variation lies mainly in the low frequency band, an appropriate number of DCT coefficients are truncated to reduce the variations under different lighting conditions. The nearest neighbor classifier based on Euclidean distance is employed for classification. Second, the performance of PCA is checked on normalized images. PCA is a technique used to reduce multidimensional data sets to a lower dimension for analysis. Third, LDA-based methods give satisfactory results under controlled lighting conditions, but their performance under large illumination variation is not satisfactory. So, the performance of LDA is also checked on normalized images. Experimental results on the Yale B and ORL databases show that applying PCA and LDA to the normalized dataset improves the performance significantly for face images with large illumination variations.
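
    The first technique (discarding low-frequency DCT coefficients in the logarithm domain) can be sketched in a few lines. This is a simplified square-block variant for illustration; the number of coefficients zeroed, the block shape, and all names are assumptions rather than the paper's exact settings.

```python
# DCT-based illumination compensation in the log domain (simplified sketch).
import numpy as np
from scipy.fftpack import dct, idct

def dct2(a):
    return dct(dct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def idct2(a):
    return idct(idct(a, axis=0, norm='ortho'), axis=1, norm='ortho')

def compensate_illumination(img, n_low=3):
    """img: 2-D grayscale array; zero an n_low x n_low low-frequency block,
    keeping the DC term so overall brightness is preserved."""
    log_img = np.log1p(img.astype(float))
    coeffs = dct2(log_img)
    dc = coeffs[0, 0]
    coeffs[:n_low, :n_low] = 0.0
    coeffs[0, 0] = dc
    return np.expm1(idct2(coeffs))

# Example: synthetic image with a horizontal illumination gradient
rng = np.random.default_rng(1)
img = rng.uniform(50, 200, size=(64, 64)) * np.linspace(0.3, 1.0, 64)[None, :]
out = compensate_illumination(img)
print(out.shape)   # same size image, slow lighting gradient suppressed
```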

  12. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    NARCIS (Netherlands)

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  13. Variational analysis of regular mappings theory and applications

    CERN Document Server

    Ioffe, Alexander D

    2017-01-01

    This monograph offers the first systematic account of (metric) regularity theory in variational analysis. It presents new developments alongside classical results and demonstrates the power of the theory through applications to various problems in analysis and optimization theory. The origins of metric regularity theory can be traced back to a series of fundamental ideas and results of nonlinear functional analysis and global analysis centered around problems of existence and stability of solutions of nonlinear equations. In variational analysis, regularity theory goes far beyond the classical setting and is also concerned with non-differentiable and multi-valued operators. The present volume explores all basic aspects of the theory, from the most general problems for mappings between metric spaces to those connected with fairly concrete and important classes of operators acting in Banach and finite dimensional spaces. Written by a leading expert in the field, the book covers new and powerful techniques, whic...

  14. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  15. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  16. Hospital Variation in Cesarean Delivery: A Multilevel Analysis.

    Science.gov (United States)

    Vecino-Ortiz, Andres I; Bardey, David; Castano-Yepes, Ramon

    2015-12-01

    To assess the issue of hospital variations in Colombia and to contribute to the methodology on health care variations by using a model that clusters the variance between hospitals while accounting for individual-level reimbursement rates and objective health-status variables. We used data on all births (N = 11,954) taking place in a contributory-regimen insurer network in Colombia during 2007. A multilevel logistic regression model was used to account for the share of unexplained variance between hospitals. In addition, an alternative variance decomposition specification was further carried out to measure the proportion of such unexplained variance due to the region effect. Hospitals account for 20% of the variation in performing cesarean sections, whereas region explains only one-third of such variance. Variables accounting for preferences on the demand side as well as reimbursement rates are found to predict the probability of performing cesarean sections. Hospital variations explain large variances within a single-payer's network. Because this insurer company is highly regarded in terms of performance and finance, these results might provide a lower bound for the scale of hospital variation in the Colombian health care market. Such lower bound provides guidance on the relevance of this issue for Colombia. Some factors such as demand-side preferences and physician reimbursement rates increase variations in health care even within a single-payer network. This is a source of inefficiencies, threatening the quality of health care and financial sustainability. The proposed methodology should be considered in further research on health care variations. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.

  17. Multi-element analysis of unidentified fallen objects from Tatale in ...

    African Journals Online (AJOL)

    A multi-element analysis has been carried out on two fallen objects, # 01 and # 02, using instrumental neutron activation analysis technique. A total of 17 elements were identified in object # 01 while 21 elements were found in object # 02. The two major elements in object # 01 were Fe and Mg, which together constitute ...

  18. A strategic analysis of Business Objects' portal application

    OpenAIRE

    Kristinsson, Olafur Oskar

    2007-01-01

    Business Objects is the leading software firm producing business intelligence software. Business intelligence is a growing market, and small to medium businesses are increasingly looking at business intelligence. Business Objects' flagship product in the enterprise market is Business Objects XI, and for medium-size companies it has Crystal Decisions. Portals are the front end for the two products. InfoView, Business Objects' portal application, lacks a long-term strategy. This analysis evaluates...

  19. Fixed point theory, variational analysis, and optimization

    CERN Document Server

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    ""There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics.""-Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  20. Objective-oriented financial analysis introduction

    Directory of Open Access Journals (Sweden)

    Dessislava Kostova – Pickett

    2018-02-01

    Full Text Available The practice of financial analysis has been immeasurably strengthened in recent years thanks to the ongoing evolution of computerized approaches in the form of spreadsheets and computer-based financial models of different types. These devices have not only relieved the analyst's computing task, but also opened up a wide range of analyses and research into alternatives and sensitivities, which so far had not been possible. The main potential of object-oriented financial analysis consists in enormously expanding the analyst's capabilities through an online knowledge and information interface that has not yet been achieved with existing methods and software packages.

  1. Frame sequences analysis technique of linear objects movement

    Science.gov (United States)

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and in various light spectra. In doing so, quantitative analysis of the movement of the objects being studied becomes an important component of the research. This work discusses analysis of the motion of linear objects in the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of the objects' motion. This velocity was found as an average over 8-12 objects with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro with subsequent approximation of the data obtained using the Hill equation.
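
    The final step mentioned above, approximating the velocity-versus-parameter dependence with the Hill equation, is a standard nonlinear fit. The sketch below uses generic variable names and synthetic data, since the study's actual control parameters are not specified here.

```python
# Fit the Hill equation v = vmax * x^n / (k^n + x^n) to velocity measurements.
import numpy as np
from scipy.optimize import curve_fit

def hill(x, vmax, k, n):
    return vmax * x**n / (k**n + x**n)

# Synthetic average-velocity measurements at several control-parameter values
x = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
v = hill(x, vmax=3.0, k=1.2, n=2.0) + 0.05 * np.random.randn(x.size)

popt, _ = curve_fit(hill, x, v, p0=(v.max(), 1.0, 1.0), maxfev=5000)
print("vmax, k, n =", popt)
```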

  2. Big Data Analysis of Human Genome Variations

    KAUST Repository

    Gojobori, Takashi

    2016-01-25

    Since the human genome draft sequence was first made public in 2000, genomic analyses have been intensively extended to the population level. The following three international projects are good examples of large-scale studies of human genome variations: 1) HapMap Data (1,417 individuals) (http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/2010-08_phaseII+III/forward/), 2) HGDP (Human Genome Diversity Project) Data (940 individuals) (http://www.hagsc.org/hgdp/files.html), 3) 1000 genomes Data (2,504 individuals) http://ftp.1000genomes.ebi.ac.uk/vol1/ftp/release/20130502/ If we can integrate all three data sets into a single volume of data, we should be able to conduct a more detailed analysis of human genome variations for a total of 4,861 individuals (= 1,417+940+2,504 individuals). In fact, we successfully integrated these three data sets by using information on the reference human genome sequence, and we conducted the big data analysis. In particular, we constructed a phylogenetic tree of about 5,000 human individuals at the genome level. As a result, we were able to identify clusters of ethnic groups, with detectable admixture, that were not identifiable from an analysis of each of the three data sets alone. Here, we report the outcome of this kind of big data analysis and discuss the evolutionary significance of human genomic variations. Note that the present study was conducted in collaboration with Katsuhiko Mineta and Kosuke Goto at KAUST.

  3. Introduction and application of the multiscale coefficient of variation analysis.

    Science.gov (United States)

    Abney, Drew H; Kello, Christopher T; Balasubramaniam, Ramesh

    2017-10-01

    Quantifying how patterns of behavior relate across multiple levels of measurement typically requires long time series for reliable parameter estimation. We describe a novel analysis that estimates patterns of variability across multiple scales of analysis suitable for time series of short duration. The multiscale coefficient of variation (MSCV) measures the distance between local coefficient of variation estimates within particular time windows and the overall coefficient of variation across all time samples. We first describe the MSCV analysis and provide an example analytical protocol with corresponding MATLAB implementation and code. Next, we present a simulation study testing the new analysis using time series generated by ARFIMA models that span white noise, short-term and long-term correlations. The MSCV analysis was observed to be sensitive to specific parameters of ARFIMA models varying in the type of temporal structure and time series length. We then apply the MSCV analysis to short time series of speech phrases and musical themes to show commonalities in multiscale structure. The simulation and application studies provide evidence that the MSCV analysis can discriminate between time series varying in multiscale structure and length.
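
    As described, the MSCV compares local coefficient-of-variation estimates in windows with the overall coefficient of variation. The authors provide a MATLAB implementation; the short Python sketch below only illustrates the idea, and the windowing scheme, distance measure, and data are assumptions.

```python
# Multiscale coefficient of variation (MSCV) sketch: distance between local CVs
# in windows of a given size and the global CV of the whole series.
import numpy as np

def mscv(x, window_sizes):
    x = np.asarray(x, dtype=float)
    cv_global = x.std(ddof=1) / x.mean()
    out = {}
    for w in window_sizes:
        starts = range(0, len(x) - w + 1, w)          # non-overlapping windows
        local_cv = [x[s:s + w].std(ddof=1) / x[s:s + w].mean() for s in starts]
        out[w] = np.mean(np.abs(np.array(local_cv) - cv_global))
    return out

# Example: synthetic inter-onset intervals of a short phrase
rng = np.random.default_rng(2)
intervals = rng.lognormal(mean=-1.5, sigma=0.4, size=60)
print(mscv(intervals, window_sizes=[5, 10, 20]))
```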

  4. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structurings of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation while the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects, where the operations or methods of the objects correspond to processes in the data flow diagrams, and then designing in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the logical models of object-oriented real-time systems analysis through the physical architectural models to the high-level design stages. The methodology is appropriate to the overall specification, including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  5. Determining characteristics of artificial near-Earth objects using observability analysis

    Science.gov (United States)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
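
    For a linearized system, the classical yes/no observability test stacks the measurement matrix against powers of the state transition matrix and checks the rank; the singular values of the same matrix give a quantitative degree of observability of the kind discussed above. The sketch below uses an illustrative three-state system (position, velocity and a solar-radiation-pressure-like parameter), not an actual orbit linearization.

```python
import numpy as np

def observability_matrix(A, H):
    """Stack H, HA, HA^2, ..., HA^(n-1) for a linear(ised) system."""
    n = A.shape[0]
    blocks = [H @ np.linalg.matrix_power(A, k) for k in range(n)]
    return np.vstack(blocks)

# Toy discrete-time system: state = [position, velocity, SRP-like parameter];
# only position is measured.  The matrices are illustrative only.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 0.5],
              [0.0, 0.0, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])

O = observability_matrix(A, H)
print("observability matrix rank:", np.linalg.matrix_rank(O), "of", A.shape[0])
# Singular values give a quantitative (rather than yes/no) measure,
# echoing the degree-of-observability measures mentioned in the abstract.
print("singular values:", np.linalg.svd(O, compute_uv=False))
```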

  6. Convergence analysis of variational and non-variational multigrid algorithms for the Laplace-Beltrami operator

    KAUST Repository

    Bonito, Andrea; Pasciak, Joseph E.

    2012-01-01

    is captured well enough by the coarsest grid. The main argument hinges on a perturbation analysis from an auxiliary variational algorithm defined directly on the smooth surface. In addition, the vanishing mean value constraint is imposed on each level, thereby

  7. Multi-band morpho-Spectral Component Analysis Deblending Tool (MuSCADeT): Deblending colourful objects

    Science.gov (United States)

    Joseph, R.; Courbin, F.; Starck, J.-L.

    2016-05-01

    We introduce a new algorithm for colour separation and deblending of multi-band astronomical images called MuSCADeT, which is based on morpho-spectral component analysis of multi-band images. The MuSCADeT algorithm takes advantage of the sparsity of astronomical objects in morphological dictionaries such as wavelets and of their differences in spectral energy distribution (SED) across multi-band observations. This allows us to devise a model-independent and automated approach to separating objects with different colours. We show with simulations that we are able to separate highly blended objects and that our algorithm is robust against SED variations of objects across the field of view. To confront our algorithm with real data, we use HST images of the strong-lensing galaxy cluster MACS J1149+2223 and show that MuSCADeT performs better than traditional profile-fitting techniques in deblending the foreground lensing galaxies from background lensed galaxies. Although the main driver for our work is the deblending of strong gravitational lenses, our method is suited to any task involving the deblending of objects in astronomical images. An example of such an application is the separation of the red and blue stellar populations of a spiral galaxy in the galaxy cluster Abell 2744. We provide a python package along with all simulations and routines used in this paper to contribute to reproducible research efforts. Codes can be found at http://lastro.epfl.ch/page-126973.html

  8. On the analysis of line profile variations: A statistical approach

    International Nuclear Information System (INIS)

    McCandliss, S.R.

    1988-01-01

    This study is concerned with the empirical characterization of the line profile variations (LPV) which occur in many Of and Wolf-Rayet stars. The goal of the analysis is to gain insight into the physical mechanisms producing the variations. The analytic approach uses a statistical method to quantify the significance of the LPV and to identify those regions in the line profile which are undergoing statistically significant variations. Line positions and flux variations are then measured and subjected to temporal and correlative analysis. Previous studies of LPV have for the most part been restricted to observations of a single line. Important information concerning the range and amplitude of the physical mechanisms involved can be obtained by simultaneously observing spectral features formed over a range of depths in the extended mass-losing atmospheres of massive, luminous stars. Time series of a Wolf-Rayet and two Of stars, with nearly complete spectral coverage from 3940 angstrom to 6610 angstrom and with a spectral resolution of R = 10,000, are analyzed here. These three stars exhibit a wide range of both spectral and temporal line profile variations. The HeII Pickering lines of HD 191765 show a monotonic increase in the peak rms variation amplitude with lines formed at progressively larger radii in the Wolf-Rayet star wind. Two time scales of variation have been identified in this star: a less-than-one-day variation associated with small-scale flickering in the peaks of the line profiles and a greater-than-one-day variation associated with large-scale asymmetric changes in the overall line profile shapes. However, no convincing periodic phenomena are evident at those periods which are well sampled in this time series

  9. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a per-pixel probability estimate of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to that of images with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis
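
    A hedged sketch of the pipeline described above, with a kNN pixel classifier (scikit-learn) and Telea inpainting (OpenCV) standing in for the authors' exact feature set, classifier settings and texture inpainting method; the per-pixel features and parameters here are assumptions.

```python
import numpy as np
import cv2
from sklearn.neighbors import KNeighborsClassifier

def pixel_features(img):
    """Simple per-pixel features: raw intensity and a blurred intensity (assumed)."""
    blur = cv2.GaussianBlur(img, (9, 9), 0)
    return np.stack([img.ravel(), blur.ravel()], axis=1).astype(np.float32)

def remove_foreign_objects(img, train_imgs, train_masks, threshold=0.5):
    """Detect likely foreign-object pixels with kNN and inpaint them.

    img: uint8 grayscale radiograph; train_masks: uint8 masks, 1 = foreign object.
    """
    X = np.vstack([pixel_features(t) for t in train_imgs])
    y = np.concatenate([m.ravel() for m in train_masks])
    clf = KNeighborsClassifier(n_neighbors=15).fit(X, y)

    prob = clf.predict_proba(pixel_features(img))[:, 1].reshape(img.shape)
    mask = (prob > threshold).astype(np.uint8) * 255
    # Texture inpainting as object replacement (Telea's method as a stand-in).
    return cv2.inpaint(img, mask, 5, cv2.INPAINT_TELEA)
```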

  10. Variation in plasma calcium analysis in primary care in Sweden - a multilevel analysis

    Directory of Open Access Journals (Sweden)

    Eggertsen Robert

    2010-05-01

    Background: Primary hyperparathyroidism (pHPT) is a common disease that often remains undetected and causes severe disturbance, especially in postmenopausal women. Therefore, national recommendations promoting early pHPT detection by plasma calcium (P-Ca) have been issued in Sweden. In this study we aimed to investigate variation in P-Ca analysis between physicians and health care centres (HCCs) in primary care in the county of Skaraborg, Sweden. Methods: In this cross-sectional study of patients' records during 2005 we analysed records from 154,629 patients attending 457 physicians at 24 HCCs. We used multilevel logistic regression analysis (MLRA) and adjusted for patient, physician and HCC characteristics. Differences were expressed as the median odds ratio (MOR). Results: There was substantial variation in the number of P-Ca analyses between both HCCs (MOR_HCC 1.65 [1.44-2.07]) and physicians (MOR_physician 1.95 [1.85-2.08]). The odds for a P-Ca analysis were lower for male patients (OR 0.80 [0.77-0.83]) and increased with the number of diagnoses (OR 25.8 [23.5-28.5]). The sex of the physician had no influence on P-Ca test ordering (OR 0.93 [0.78-1.09]). Physicians under education ordered the most P-Ca analyses (OR 1.69 [1.35-2.24]) and locums the least (OR 0.73 [0.57-0.94]). More of the variance was attributed to the physician level than to the HCC level. A different mix of patients did not explain this variance between physicians. Theoretically, if a patient were able to change both GP and HCC, the odds of a P-Ca analysis would in median increase by 2.45. Including characteristics of the patients, physicians and HCCs in the MLRA model did not explain the variance. Conclusions: The physician level was more important than the HCC level for the variation in P-Ca analysis, but further exploration of unidentified contextual factors is crucial for future monitoring of practice variation.
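
    The median odds ratio used above can be computed from the cluster-level (random-intercept) variance of a multilevel logistic model as MOR = exp(sqrt(2·sigma²) · Phi⁻¹(0.75)). The sketch below uses illustrative variance values, not the study's estimates.

```python
from math import exp, sqrt
from scipy.stats import norm

def median_odds_ratio(cluster_variance):
    """MOR = exp( sqrt(2 * var) * Phi^-1(0.75) ) for a multilevel logistic model."""
    return exp(sqrt(2.0 * cluster_variance) * norm.ppf(0.75))

# Illustrative random-intercept variances (not the study's estimates).
print("MOR physician:", round(median_odds_ratio(0.50), 2))
print("MOR HCC:      ", round(median_odds_ratio(0.27), 2))
```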

  11. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
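
    A minimal sketch of the kind of categorical coding and regression modelling described above, using one-hot (dummy) coding and ordinary least squares; the column names, values and model choice are assumptions for illustration.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression

# Illustrative listing data; column names and values are assumed.
listings = pd.DataFrame({
    "area_m2":   [45, 62, 80, 54, 71],
    "district":  ["centre", "north", "centre", "south", "north"],
    "wall_type": ["brick", "panel", "brick", "brick", "panel"],
    "price":     [3.1, 3.6, 5.2, 3.0, 4.1],   # e.g. millions of roubles
})

# One-hot coding of the categorical property descriptors.
X = pd.get_dummies(listings[["area_m2", "district", "wall_type"]],
                   columns=["district", "wall_type"], drop_first=True)
y = listings["price"]

model = LinearRegression().fit(X, y)
print(dict(zip(X.columns, model.coef_.round(3))))
```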

  12. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

    Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimensional multi-objective optimization method is proposed which aims to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, the physical reasonableness and the reliability in dealing with redundant objectives of PCA are verified by the typical DTLZ5 test function and by multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into the aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and that the multi-point design requirements of the passenger aircraft are met. The visualization level of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
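
    The core of the PCA step can be illustrated by decomposing a matrix of objective values sampled from candidate designs; objectives that load almost identically on the leading components carry redundant information and are candidates for conversion into constraints, as described above. The numbers below are illustrative, not aerodynamic data.

```python
import numpy as np

# Rows: candidate designs from the Pareto set; columns: objective values.
F = np.array([[1.00, 0.52, 2.01],
              [0.80, 0.61, 1.63],
              [0.65, 0.70, 1.28],
              [0.55, 0.78, 1.12],
              [0.40, 0.90, 0.82]])

# Centre and scale, then inspect principal components of the objective space.
Z = (F - F.mean(axis=0)) / F.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
explained = s**2 / np.sum(s**2)

print("explained variance ratios:", explained.round(3))
print("loadings of first component:", Vt[0].round(3))
# Objectives whose loadings are (nearly) proportional across the leading
# components are redundant and may be treated as constraints instead.
```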

  13. Ten years of Object-Oriented analysis on H1

    International Nuclear Information System (INIS)

    Laycock, Paul

    2012-01-01

    Over a decade ago, the H1 Collaboration decided to embrace the object-oriented paradigm and completely redesign its data analysis model and data storage format. The event data model, based on the ROOT framework, consists of three layers - tracks and calorimeter clusters, identified particles and finally event summary data - with a singleton class providing unified access. This original solution was then augmented with a fourth layer containing user-defined objects. This contribution will summarise the history of the solutions used, from modifications to the original design, to the evolution of the high-level end-user analysis object framework which is used by H1 today. Several important issues are addressed - the portability of expert knowledge to increase the efficiency of data analysis, the flexibility of the framework to incorporate new analyses, the performance and ease of use, and lessons learned for future projects.

  14. Variational analysis and generalized differentiation I basic theory

    CERN Document Server

    Mordukhovich, Boris S

    2006-01-01

    Contains a study of the basic concepts and principles of variational analysis and generalized differentiation in both finite-dimensional and infinite-dimensional spaces. This title presents many applications to problems in optimization, equilibria, stability and sensitivity, control theory, economics, mechanics, and more.

  15. Comparative analysis of imaging configurations and objectives for Fourier microscopy.

    Science.gov (United States)

    Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid

    2015-11-01

    Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.

  16. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  17. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study

  18. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    Science.gov (United States)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  19. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real-time systems need a time-predictable computing platform to enable static worst-case execution time (WCET) analysis. All performance-enhancing features need to be WCET analyzable. However, standard data caches containing heap-allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis-friendly design. Aiming for a time-predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field...

  20. Analysis of indel variations in the human disease-associated genes ...

    Indian Academy of Sciences (India)

    Keywords. insertion–deletion variations; haematological disease; tumours; human genetics. Journal of Genetics ... domly selected healthy Korean individuals using a blood genomic DNA ... Bioinformatics annotation and 3-D protein structure analysis. In this study ..... 2009 A genome-wide meta-analysis identifies. Journal of ...

  1. Sensitivity analysis and parameter estimation for distributed hydrological modeling: potential of variational methods

    Directory of Open Access Journals (Sweden)

    W. Castaings

    2009-04-01

    Variational methods are widely used for the analysis and control of computationally intensive spatially distributed systems. In particular, the adjoint state method enables a very efficient calculation of the derivatives of an objective function (response function to be analysed or cost function to be optimised) with respect to model inputs.

    In this contribution, it is shown that the potential of variational methods for distributed catchment-scale hydrology should be considered. A distributed flash flood model, coupling kinematic wave overland flow and Green-Ampt infiltration, is applied to a small catchment of the Thoré basin and used as a relatively simple (synthetic observations) but didactic application case.

    It is shown that forward and adjoint sensitivity analysis provide a local but extensive insight into the relation between the assigned model parameters and the simulated hydrological response. Spatially distributed parameter sensitivities can be obtained for a very modest calculation effort (about 6 times the computing time of a single model run), and the singular value decomposition (SVD) of the Jacobian matrix provides an interesting perspective for the analysis of the rainfall-runoff relation.

    For the estimation of model parameters, adjoint-based derivatives were found exceedingly efficient in driving a bound-constrained quasi-Newton algorithm. The reference parameter set is retrieved independently of the optimization initial condition when the very common dimension reduction strategy (i.e. scalar multipliers) is adopted.

    Furthermore, the sensitivity analysis results suggest that most of the variability in this high-dimensional parameter space can be captured with a few orthogonal directions. A parametrization based on the SVD leading singular vectors was found very promising but should be combined with another regularization strategy in order to prevent overfitting.
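
    The role of the Jacobian SVD described above can be illustrated with a toy model: the leading right singular vectors of the sensitivity matrix span the few directions in parameter space that dominate the simulated response. The sketch uses a finite-difference Jacobian of a synthetic model; an adjoint code would provide the same derivatives far more cheaply, and the model here is not the flash flood model of the study.

```python
import numpy as np

def jacobian_fd(model, p0, eps=1e-4):
    """Finite-difference Jacobian of a model's output w.r.t. its parameters."""
    y0 = model(p0)
    J = np.zeros((y0.size, p0.size))
    for j in range(p0.size):
        p = p0.copy()
        p[j] += eps
        J[:, j] = (model(p) - y0) / eps
    return J

# Toy 'rainfall-runoff' stand-in: a smoothed weighted sum of 20 parameters.
rng = np.random.default_rng(1)
W = rng.random((50, 20))
model = lambda p: np.convolve(W @ p, np.ones(5) / 5, mode="same")

J = jacobian_fd(model, p0=np.full(20, 0.5))
U, s, Vt = np.linalg.svd(J, full_matrices=False)
print("leading singular values:", s[:5].round(3))
# The leading right singular vectors (rows of Vt) span the few orthogonal
# directions in parameter space that control most of the simulated response,
# which is the basis of the SVD parametrization discussed above.
```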

  2. EFFECTS OF PARAMETRIC VARIATIONS ON SEISMIC ANALYSIS METHODS FOR NON-CLASSICALLY DAMPED COUPLED SYSTEMS

    International Nuclear Information System (INIS)

    XU, J.; DEGRASSI, G.

    2000-01-01

    A comprehensive benchmark program was developed by Brookhaven National Laboratory (BNL) to perform an evaluation of state-of-the-art methods and computer programs for performing seismic analyses of coupled systems with non-classical damping. The program, which was sponsored by the US Nuclear Regulatory Commission (NRC), was designed to address various aspects of application and limitations of these state-of-the-art analysis methods to typical coupled nuclear power plant (NPP) structures with non-classical damping, and was carried out through analyses of a set of representative benchmark problems. One objective was to examine the applicability of various analysis methods to problems with different dynamic characteristics unique to coupled systems. The examination was performed using parametric variations for three simple benchmark models. This paper presents the comparisons and evaluation of the program participants' results to the BNL exact solutions for the applicable ranges of modeling dynamic characteristic parameters

  3. Object position and image magnification in dental panoramic radiography: a theoretical analysis.

    Science.gov (United States)

    Devlin, H; Yuan, J

    2013-01-01

    The purpose of our study was to investigate how image magnification and distortion in dental panoramic radiography are influenced by object size and position for a small round object such as a ball bearing used for calibration. Two ball bearings (2.5 mm and 6 mm in diameter) were placed at approximately the same position between the teeth of a plastic skull and radiographed 21 times. The skull was replaced each time. Their images were measured by software using edge detection and ellipse-fitting algorithms. Using a standard definition of magnification, equations were derived to enable an object's magnification to be determined from its position and vice versa knowing the diameter and machine parameters. The average magnification of the 2.5 mm ball bearing was 1.292 (0.0445) horizontally and 1.257 (0.0067) vertically with a mean ratio of 1.028 (0.0322); standard deviations are in parentheses. The figures for the 6 mm ball bearing were 1.286 (0.0068), 1.255 (0.0018) and 1.025 (0.0061), respectively. Derived positions of each ball bearing from magnification were more consistent horizontally than vertically. There was less variation in either direction for the 6 mm ball bearing than the 2.5 mm one. Automatic measurement of image size resulted in less variation in vertical magnification values than horizontal. There are only certain positions in the focal trough that achieve zero distortion. Object location can be determined from its diameter, measured magnification and machine parameters. The 6 mm diameter ball bearing is preferable to the 2.5 mm one for more reliable magnification measurement and position determination.

  4. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Directory of Open Access Journals (Sweden)

    Andrea Cavallaro

    2004-06-01

    The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme based on region segmentation and semantic segmentation is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas the high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one “sees” in a scene depends on the scene itself (region segmentation) as well as on the cognitive task (semantic segmentation) at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties and the definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to

  5. Three dimensional analysis of cosmic ray intensity variation

    International Nuclear Information System (INIS)

    Yasue, Shin-ichi; Mori, Satoru; Nagashima, Kazuo.

    1974-01-01

    Three dimensional analysis of cosmic ray anisotropy and its time variation was performed. This paper describes the analysis of the Forbush decrease in January 1968, comparing the direction of the magnetic field in interplanetary space with the direction of the reference axis for cosmic ray anisotropy. A new anisotropy becomes dominant at the time of a Forbush decrease because the anisotropy of cosmic rays in the calm state is wiped out. Such anisotropy produces intensity variations in neutron monitors on the ground. The characteristic parameters of the three dimensional anisotropy can be determined from theoretical values and observed intensities. The analyzed data were taken for 6 days, from Jan. 25 to Jan. 30, 1968, at Deep River. A decrease of intensity at Deep River was seen for several hours from 11 o'clock (UT), Jan. 26, just before the Forbush decrease; this may be due to the loss cone. The Forbush decrease began at 19 o'clock, Jan. 26, and the main phase continued to 5 o'clock the next morning. The spectrum of variation was P^(-0.5). The time variations of the magnetic field in interplanetary space and of the reference axis of cosmic ray anisotropy are shown for 15 hours. The average directions of both are almost in coincidence. The spatial distribution of cosmic rays near the earth may be expressed by the superposition of an axially symmetrical distribution along a reference axis and its push-out in the direction of 12 o'clock. It is considered that the direction of the magnetic force line and the velocity of the solar wind correspond to the direction of the reference axis and the magnitude of anisotropy in the direction of 12 o'clock, respectively. (Kato, T.)

  6. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.|info:eu-repo/dai/nl/224281216; Queiroz Feitosa, R.; van der Meer, F.D.|info:eu-repo/dai/nl/138940908; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches.

  7. Analysis of Price Variation and Market Integration of Prosopis ...

    African Journals Online (AJOL)

    Analysis of Price Variation and Market Integration of Prosopis Africana (guill. ... select five markets based on the presence of traders selling the commodity in the markets ... T-test results showed that Prosopis africana seed trade is profitable and ...

  8. Diachronic and Synchronic Analysis - the Case of the Indirect Object in Spanish

    DEFF Research Database (Denmark)

    Dam, Lotte; Dam-Jensen, Helle

    2007-01-01

    The article deals with a monograph on the indirect object in Spanish. The book offers a many-faceted analysis of the indirect object, as it, on the one hand, gives a detailed diachronic analysis of what is known as clitic-doubled constructions and, on the other, a synchronic analysis of both...

  9. Stability Analysis and Variational Integrator for Real-Time Formation Based on Potential Field

    Directory of Open Access Journals (Sweden)

    Shengqing Yang

    2014-01-01

    This paper investigates a framework for the real-time formation of autonomous vehicles using a potential field and a variational integrator. Real-time formation requires vehicles to have coordinated motion and efficient computation. Interactions described by the potential field can meet the former requirement, which results in a nonlinear system. Stability analysis of such a nonlinear system is difficult. Our methodology for stability analysis is discussed in terms of the error dynamic system. Transformation of coordinates from the inertial frame to the body frame helps the stability analysis focus on the structure instead of particular coordinates. Then, the Jacobian of the reduced system can be calculated. It can be proved that the formation is stable at the equilibrium point of the error dynamic system under the effect of the damping force. For computational efficiency, a variational integrator is introduced; it amounts to solving algebraic equations. The forced Euler-Lagrange equation in discrete form is used to construct a forced variational integrator for vehicles in a potential field and obstacle environment. By applying the forced variational integrator to the computation of the vehicles' motion, real-time formation of vehicles in an obstacle environment can be implemented. An algorithm based on the forced variational integrator is designed for a leader-follower formation.
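
    As a hedged illustration of the discrete-mechanics idea, the sketch below implements a forced variational integrator (a position-Verlet update derived from the discrete Euler-Lagrange equations with a forcing term) for a single point mass in a quadratic potential; the paper's leader-follower formulation, potential field and damping terms are more elaborate.

```python
import numpy as np

def forced_variational_step(q_prev, q_curr, h, m, grad_V, force):
    """One step of a discrete Euler-Lagrange (position Verlet) update with an
    external forcing term; a minimal stand-in for a forced variational integrator."""
    return (2.0 * q_curr - q_prev
            + (h**2 / m) * (-grad_V(q_curr) + force(q_curr)))

# Quadratic attraction to a formation slot plus a constant drift force
# (illustrative potential and forcing, not the paper's potential field).
k = 2.0
grad_V = lambda q: k * (q - np.array([5.0, 5.0]))
force = lambda q: np.array([0.1, 0.0])

h, m = 0.05, 1.0
q_prev = np.array([0.0, 0.0])
q_curr = q_prev + h * np.array([1.0, 0.5])      # initial velocity folded in

trajectory = [q_prev, q_curr]
for _ in range(200):
    q_next = forced_variational_step(trajectory[-2], trajectory[-1], h, m, grad_V, force)
    trajectory.append(q_next)
print("final position:", trajectory[-1].round(3))
```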

  10. Elliptic Fourier Analysis of body shape variation of Hippocampus spp. (seahorse in Danajon Bank, Philippines

    Directory of Open Access Journals (Sweden)

    S. R. M. Tabugo-Rico

    2017-12-01

    Seahorses inhabit various ecosystems and have become a flagship species of the marine environment. The Philippines, a biodiversity hotspot in Asia, holds a number of species of seahorses. This serves as an exploratory study to describe body shape variation of selected common seahorse species, Hippocampus comes, Hippocampus histrix, Hippocampus spinosissimus and Hippocampus kuda, from Danajon Bank using elliptic Fourier analysis. The method was applied to test whether significant yet subtle differences in body shape variation can be species-specific and habitat-influenced, and whether they provide evidence of sexual dimorphism. It is hypothesized that phenotypic divergence may provide evidence for genetic differentiation or mere adaptation to habitat variation. Results show significant differences in the body shapes of the five populations based on canonical variate analysis (CVA) and multivariate analysis of variance (MANOVA) with significant p values. Populations were found to be distinct from each other, suggesting that body shape variation is species-specific, habitat-influenced, and provides evidence for sexual dimorphism. Results of discriminant analysis show further support for species-specific traits and sexual dimorphism. This study shows the application of geometric morphometrics, specifically elliptic Fourier analysis, in describing subtle body shape variation of selected Hippocampus species.

  11. Creating Objects and Object Categories for Studying Perception and Perceptual Learning

    Science.gov (United States)

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-01-01

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties [1]. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties [2]. Many innovative and useful methods currently exist for creating novel objects and object categories [3-6] (also see refs. [7,8]). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter [5,9,10], and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects [11-13]. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis [14]. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection [9,12,13]. Objects and object categories created

  12. Towards Structural Analysis of Audio Recordings in the Presence of Musical Variations

    Directory of Open Access Journals (Sweden)

    Müller Meinard

    2007-01-01

    One major goal of structural analysis of an audio recording is to automatically extract the repetitive structure or, more generally, the musical form of the underlying piece of music. Recent approaches to this problem work well for music where the repetitions largely agree with respect to instrumentation and tempo, as is typically the case for popular music. For other classes of music such as Western classical music, however, musically similar audio segments may exhibit significant variations in parameters such as dynamics, timbre, execution of note groups, modulation, articulation, and tempo progression. In this paper, we propose a robust and efficient algorithm for audio structure analysis, which makes it possible to identify musically similar segments even in the presence of large variations in these parameters. To account for such variations, our main idea is to incorporate invariance at various levels simultaneously: we design a new type of statistical features to absorb microvariations, introduce an enhanced local distance measure to account for local variations, and describe a new strategy for structure extraction that can cope with the global variations. Our experimental results with classical and popular music show that our algorithm performs successfully even in the presence of significant musical variations.

  13. Head First Object-Oriented Analysis and Design

    CERN Document Server

    McLaughlin, Brett D; West, David

    2006-01-01

    "Head First Object Oriented Analysis and Design is a refreshing look at subject of OOAD. What sets this book apart is its focus on learning. The authors have made the content of OOAD accessible, usable for the practitioner." Ivar Jacobson, Ivar Jacobson Consulting "I just finished reading HF OOA&D and I loved it! The thing I liked most about this book was its focus on why we do OOA&D-to write great software!" Kyle Brown, Distinguished Engineer, IBM "Hidden behind the funny pictures and crazy fonts is a serious, intelligent, extremely well-crafted presentation of OO Analysis and Design

  14. Variation in expert source selection according to different objectivity standards

    DEFF Research Database (Denmark)

    Albæk, Erik

    2011-01-01

    Several scholars have tried to clarify how journalists handle and implement the abstract objectivity norm in daily practice. Less research attention has been paid to how common abstract professional norms and values, in casu the objectivity norm, may systematically vary when interpreted and implemented...

  15. Objective high Resolution Analysis over Complex Terrain with VERA

    Science.gov (United States)

    Mayer, D.; Steinacker, R.; Steiner, A.

    2012-04-01

    VERA (Vienna Enhanced Resolution Analysis) is a model-independent, high-resolution objective analysis of meteorological fields over complex terrain. The system consists of a specially developed quality control procedure and a combination of an interpolation and a downscaling technique. Whereas the so-called VERA-QC is presented at this conference in the contribution titled "VERA-QC, an approved Data Quality Control based on Self-Consistency" by Andrea Steiner, this presentation will focus on the method and the characteristics of the VERA interpolation scheme, which enables one to compute grid point values of a meteorological field based on irregularly distributed observations and topography-related a priori knowledge. Over complex topography, meteorological fields are not smooth in general. The roughness induced by the topography can be explained physically. Knowledge about this behavior is used to define the so-called Fingerprints (e.g. a thermal Fingerprint reproducing heating or cooling over mountainous terrain, or a dynamical Fingerprint reproducing a positive pressure perturbation on the windward side of a ridge) under idealized conditions. If the VERA algorithm recognizes patterns of one or more Fingerprints at a few observation points, the corresponding patterns are used to downscale the meteorological information in a greater surrounding area. This technique makes it possible to achieve an analysis with a resolution much higher than that of the observational network. The interpolation of irregularly distributed stations to a regular grid (in space and time) is based on a variational principle applied to first- and second-order spatial and temporal derivatives. Mathematically, this can be formulated as a cost function that is equivalent to the penalty function of a thin plate smoothing spline. After the analysis field has been divided into the Fingerprint components and the unexplained part respectively, the requirement of a smooth distribution is applied to the
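
    The cost function described above is equivalent to the penalty of a thin plate smoothing spline, which can be illustrated with a thin-plate radial basis interpolation of irregularly distributed station values; the Fingerprint decomposition itself is not reproduced here, and the coordinates and values below are synthetic.

```python
import numpy as np
from scipy.interpolate import Rbf

# Irregularly distributed station observations (coordinates in km, values in K);
# purely synthetic numbers for illustration.
x_obs = np.array([0.0, 12.0, 25.0, 40.0, 55.0, 70.0])
y_obs = np.array([5.0, 30.0, 10.0, 45.0, 20.0, 60.0])
t_obs = np.array([271.2, 268.9, 272.4, 266.5, 270.8, 264.9])

# Thin-plate radial basis interpolation: the minimiser of a penalty on second
# derivatives, i.e. the same variational principle mentioned in the abstract.
interp = Rbf(x_obs, y_obs, t_obs, function="thin_plate", smooth=0.1)

xi, yi = np.meshgrid(np.linspace(0, 70, 71), np.linspace(0, 60, 61))
analysis = interp(xi, yi)
print("analysed field shape:", analysis.shape)
```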

  16. Obtaining 'images' from iron objects using a 3-axis fluxgate magnetometer

    International Nuclear Information System (INIS)

    Chilo, Jose; Jabor, Abbas; Lizska, Ludwik; Eide, Age J.; Lindblad, Thomas

    2007-01-01

    Magnetic objects can cause local variations in the Earth's magnetic field that can be measured with a magnetometer. Here we used tri-axial magnetometer measurements and an analysis method employing wavelet techniques to determine the 'signature' or 'fingerprint' of different iron objects. Clear distinctions among the iron samples were observed. The time-dependent changes in the frequency powers were extracted by use of the Morlet wavelet corresponding to frequency bands from 0.1 to 100 Hz
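
    A sketch of the wavelet step, assuming a Morlet continuous wavelet transform over roughly 0.1-100 Hz as mentioned above; the sampling rate, the synthetic signal and the use of PyWavelets are assumptions for illustration.

```python
import numpy as np
import pywt

fs = 400.0                       # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)

# Synthetic stand-in for one magnetometer axis: slow background drift plus
# a transient 20 Hz disturbance from a passing iron object.
signal = 0.2 * np.sin(2 * np.pi * 0.3 * t)
signal[800:1200] += 0.5 * np.sin(2 * np.pi * 20.0 * t[800:1200])

# Continuous wavelet transform with the Morlet wavelet over ~0.1-100 Hz.
frequencies = np.geomspace(0.1, 100.0, 120)
scales = pywt.central_frequency("morl") * fs / frequencies
coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

power = np.abs(coeffs) ** 2      # time-frequency 'fingerprint' of the object
print(power.shape)               # (n_frequencies, n_samples)
```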

  17. PGen: large-scale genomic variations analysis workflow and browser in SoyKB.

    Science.gov (United States)

    Liu, Yang; Khan, Saad M; Wang, Juexin; Rynge, Mats; Zhang, Yuanxun; Zeng, Shuai; Chen, Shiyuan; Maldonado Dos Santos, Joao V; Valliyodan, Babu; Calyam, Prasad P; Merchant, Nirav; Nguyen, Henry T; Xu, Dong; Joshi, Trupti

    2016-10-06

    With the advances in next-generation sequencing (NGS) technology and significant reductions in sequencing costs, it is now possible to sequence large collections of germplasm in crops for detecting genome-scale genetic variations and to apply the knowledge towards improvements in traits. To efficiently facilitate large-scale NGS resequencing data analysis of genomic variations, we have developed "PGen", an integrated and optimized workflow using the Extreme Science and Engineering Discovery Environment (XSEDE) high-performance computing (HPC) virtual system, iPlant cloud data storage resources and Pegasus workflow management system (Pegasus-WMS). The workflow allows users to identify single nucleotide polymorphisms (SNPs) and insertion-deletions (indels), perform SNP annotations and conduct copy number variation analyses on multiple resequencing datasets in a user-friendly and seamless way. We have developed both a Linux version in GitHub ( https://github.com/pegasus-isi/PGen-GenomicVariations-Workflow ) and a web-based implementation of the PGen workflow integrated within the Soybean Knowledge Base (SoyKB), ( http://soykb.org/Pegasus/index.php ). Using PGen, we identified 10,218,140 single-nucleotide polymorphisms (SNPs) and 1,398,982 indels from analysis of 106 soybean lines sequenced at 15X coverage. 297,245 non-synonymous SNPs and 3330 copy number variation (CNV) regions were identified from this analysis. SNPs identified using PGen from additional soybean resequencing projects adding to 500+ soybean germplasm lines in total have been integrated. These SNPs are being utilized for trait improvement using genotype to phenotype prediction approaches developed in-house. In order to browse and access NGS data easily, we have also developed an NGS resequencing data browser ( http://soykb.org/NGS_Resequence/NGS_index.php ) within SoyKB to provide easy access to SNP and downstream analysis results for soybean researchers. PGen workflow has been optimized for the most

  18. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  19. Analysis of Δ14C variations in atmosphere

    International Nuclear Information System (INIS)

    Simon, J.; Sivo, A.; Richtarikova, M.; Holy, K.; Polaskova, A.; Bulko, M.; Hola, O.

    2005-01-01

    The Δ14C in the atmosphere has been measured and studied at two localities in Slovakia. The analysis performed confirmed the existence of annual variations of the Δ14C with attenuating amplitude and decreasing mean value. It seems logical and physically correct to describe the Δ14C time dependence by the equation y = A e^(-at) + B e^(-bt) cos(ω1 t + φ). The coefficients A, a, B, b, φ are listed in the table for both localities. The observed variations of the Δ14C have a maximum in summer and a minimum in winter. This is probably caused by the higher heat demand in the winter season, which is connected directly with fossil CO2 emissions and a more intensive Suess effect. The summer maximum could be explained by the combination of a lower CO2 emission rate and a more intensive turbulent transport of stratospheric 14C to the troposphere. Using Fourier harmonic analysis, the amplitude spectra of the average annual variations were plotted. The obtained result shows that the variations have a high degree of symmetry. Furthermore, the obtained basic frequency ω1 = 2π/12 month^-1 proves that cyclic processes with a period of T = 12 months have a major influence on the 14C amount in the troposphere. The presence of some higher-order harmonics is significant, but a physical interpretation is not yet clear. In addition to the main frequency, 2ω1 and 3ω1 are also present in the Bratislava data set, and 4ω1 in the Zlkovce data set. The long-time average of the Δ14C in Zlkovce during the years 1995-2004 is higher by about 6.6 per mil than in Bratislava. This represents unique evidence that local CO2 pollution affects the 14C activity. A correlation at the level R^2 = 0.43 was found between the Bratislava and Zlkovce atmospheric Δ14C data. (authors)
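
    The quoted model can be fitted to a monthly Δ14C series by nonlinear least squares, with the basic angular frequency fixed at ω1 = 2π/12 per month; the sketch below uses synthetic data in place of the measured values.

```python
import numpy as np
from scipy.optimize import curve_fit

def delta_c14(t, A, a, B, b, phi):
    """Model from the abstract: y = A*exp(-a*t) + B*exp(-b*t)*cos(w1*t + phi),
    with the basic angular frequency fixed at w1 = 2*pi/12 per month."""
    w1 = 2.0 * np.pi / 12.0
    return A * np.exp(-a * t) + B * np.exp(-b * t) * np.cos(w1 * t + phi)

# Synthetic monthly series standing in for the measured Delta-14C values.
t = np.arange(0, 120)                       # months
rng = np.random.default_rng(3)
y = delta_c14(t, 120.0, 0.005, 8.0, 0.002, 0.5) + rng.normal(0, 1.0, t.size)

params, _ = curve_fit(delta_c14, t, y, p0=(100.0, 0.01, 5.0, 0.01, 0.0))
print("A, a, B, b, phi =", np.round(params, 4))
```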

  20. Analysis and Comprehensive Analytical Modeling of Statistical Variations in Subthreshold MOSFET's High Frequency Characteristics

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2014-01-01

    In this research, the analysis of statistical variations in subthreshold MOSFET high-frequency characteristics, defined in terms of gate capacitance and transition frequency, is presented, and the resulting comprehensive analytical models of such variations in terms of their variances are proposed. Major imperfections in the physical-level properties, including random dopant fluctuation and the effects of variations in the MOSFET manufacturing process, have been taken into account in the proposed analysis and modeling. An up-to-date comprehensive analytical model of statistical variation in MOSFET parameters has been used as the basis of the analysis and modeling. The resulting models have been found to be both analytic and comprehensive, as they are precise mathematical expressions in terms of physical-level variables of the MOSFET. Furthermore, they have been verified at the nanometer level by using 65 nm level BSIM4-based benchmarks and have been found to be very accurate, with average percentage errors smaller than 5%. Hence, the performed analysis yields models which are a potential mathematical tool for the statistical and variability-aware analysis and design of subthreshold MOSFET based VHF circuits, systems and applications.

  1. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective...

  2. Data analysis in an object request broker environment

    International Nuclear Information System (INIS)

    Malon, David M.; May, Edward N.; Grossman, Robert L.; Day, Christopher T.; Quarrie, David R.

    1996-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study. (author)

  3. Variation in lumbar punctures for early onset neonatal sepsis: a nationally representative serial cross-sectional analysis, 2003-2009

    Directory of Open Access Journals (Sweden)

    Patrick Stephen W

    2012-08-01

    Background: Whether lumbar punctures (LPs) should be performed routinely for term newborns suspected of having early onset neonatal sepsis (EONS) is subject to debate. It is unclear whether variations in the performance of LPs for EONS may be associated with patient, hospital, insurance or regional factors. Our objective was to identify characteristics associated with the practice of performing LPs for suspected EONS in a nationally representative sample. Methods: Utilizing data from the 2003, 2006 and 2009 Kids’ Inpatient Database (KID) compiled by the Agency for Healthcare Research and Quality, we examined the frequency and characteristics of term, normal-birth-weight newborns receiving an LP for EONS. Survey weighting was applied for national estimates and used in chi-squared and multivariable regression analysis. Results: In 2009, there were 13,694 discharges for term newborns that underwent LPs for apparent EONS. Newborns having LPs performed were more likely to be covered by Medicaid vs. private insurance (51.9 vs. 45.1 percent; p ...). Conclusions: We found pronounced variation in LPs performed for EONS, even when adjusting for clinical conditions that would prompt LPs. These findings indicate practice variations in newborn care that merit further examination and explanation.

  4. From Pixels to Geographic Objects in Remote Sensing Image Analysis

    NARCIS (Netherlands)

    Addink, E.A.; Van Coillie, Frieke M.B.; Jong, Steven M. de

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received

  5. Use of objective analysis to estimate winter temperature and ...

    Indian Academy of Sciences (India)

    In the complex terrain of Himalaya, nonavailability of snow and meteorological data of the remote locations ... Precipitation intensity; spatial interpolation; objective analysis. J. Earth Syst. ... This technique needs historical database and unable ...

  6. General inverse problems for regular variation

    DEFF Research Database (Denmark)

    Damek, Ewa; Mikosch, Thomas Valentin; Rosinski, Jan

    2014-01-01

    Regular variation of distributional tails is known to be preserved by various linear transformations of some random structures. An inverse problem for regular variation aims at understanding whether the regular variation of a transformed random object is caused by regular variation of components ...

  7. Fast grasping of unknown objects using principal component analysis

    Science.gov (United States)

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

    Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined from the single-view partial point cloud. To cope with grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance based on the results of grasping a series of unknown objects. To minimize the grasping uncertainty, the robot hardware with two 3D cameras can be utilized to complete the partial point cloud. As a result of utilizing the robot hardware, grasping reliability is greatly enhanced. Therefore, this research is of practical significance for increasing grasping speed and thus robot efficiency in unpredictable environments.
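
    The principal-axis step can be sketched as a PCA of the partial point cloud: the eigenvector of the covariance matrix with the largest eigenvalue gives the axis along which grasp candidates are allocated. The point cloud below is synthetic; force balance optimization and viewpoint selection are not reproduced.

```python
import numpy as np

def principal_axis(points):
    """Principal axis of a single-view partial point cloud via PCA.

    points: (N, 3) array of 3-D points from the depth camera.
    Returns the centroid and the unit vector of largest variance, along
    which grasp candidates can be allocated.
    """
    centroid = points.mean(axis=0)
    centred = points - centroid
    # Eigen-decomposition of the covariance matrix; the eigenvector with the
    # largest eigenvalue is the principal axis of the object.
    cov = centred.T @ centred / len(points)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return centroid, eigvecs[:, np.argmax(eigvals)]

# Synthetic elongated object (e.g. a bottle-like cloud) for illustration.
rng = np.random.default_rng(0)
cloud = rng.normal(scale=[0.01, 0.01, 0.10], size=(2000, 3))
c, axis = principal_axis(cloud)
print("centroid:", c.round(3), "principal axis:", axis.round(3))
```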

  8. Canonical variate regression.

    Science.gov (United States)

    Luo, Chongliang; Liu, Jin; Dey, Dipak K; Chen, Kun

    2016-07-01

    In many fields, multi-view datasets, measuring multiple distinct but interrelated sets of characteristics on the same set of subjects, together with data on certain outcomes or phenotypes, are routinely collected. The objective in such a problem is often two-fold: both to explore the association structures of multiple sets of measurements and to develop a parsimonious model for predicting the future outcomes. We study a unified canonical variate regression framework to tackle the two problems simultaneously. The proposed criterion integrates multiple canonical correlation analysis with predictive modeling, balancing between the association strength of the canonical variates and their joint predictive power on the outcomes. Moreover, the proposed criterion seeks multiple sets of canonical variates simultaneously to enable the examination of their joint effects on the outcomes, and is able to handle multivariate and non-Gaussian outcomes. An efficient algorithm based on variable splitting and Lagrangian multipliers is proposed. Simulation studies show the superior performance of the proposed approach. We demonstrate the effectiveness of the proposed approach in an [Formula: see text] intercross mice study and an alcohol dependence study.
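
    The association part of such a framework builds on classical canonical correlation analysis. As a rough illustration only (not the authors' CVR criterion, which additionally couples the canonical variates to the outcomes through penalized regression), plain CCA on two synthetic views could look as follows with scikit-learn; all data and variable names are invented for the example.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import CCA

    # Two synthetic views X1, X2 measured on the same 200 subjects,
    # both driven by a shared two-dimensional latent structure.
    rng = np.random.default_rng(1)
    latent = rng.normal(size=(200, 2))
    X1 = latent @ rng.normal(size=(2, 10)) + 0.5 * rng.normal(size=(200, 10))
    X2 = latent @ rng.normal(size=(2, 8)) + 0.5 * rng.normal(size=(200, 8))

    cca = CCA(n_components=2)
    U, V = cca.fit_transform(X1, X2)          # canonical variates for each view
    # Correlation of the first pair of canonical variates.
    print(np.corrcoef(U[:, 0], V[:, 0])[0, 1])
    ```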

  9. Automatic segmentation of colon glands using object-graphs.

    Science.gov (United States)

    Gunduz-Demir, Cigdem; Kandemir, Melih; Tosun, Akif Burak; Sokmensuer, Cenk

    2010-02-01

    Gland segmentation is an important step to automate the analysis of biopsies that contain glandular structures. However, this remains a challenging problem as the variation in staining, fixation, and sectioning procedures leads to a considerable amount of artifacts and variances in tissue sections, which may result in huge variances in gland appearances. In this work, we report a new approach for gland segmentation. This approach decomposes the tissue image into a set of primitive objects and segments glands making use of the organizational properties of these objects, which are quantified with the definition of object-graphs. As opposed to the previous literature, the proposed approach employs the object-based information for the gland segmentation problem, instead of using the pixel-based information alone. Working with the images of colon tissues, our experiments demonstrate that the proposed object-graph approach yields high segmentation accuracies for the training and test sets and significantly improves the segmentation performance of its pixel-based counterparts. The experiments also show that the object-based structure of the proposed approach provides more tolerance to artifacts and variances in tissues.

  10. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  11. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  12. Forest Rent as an Object of Economic Analysis

    Directory of Open Access Journals (Sweden)

    Lisichko Andriyana M.

    2018-01-01

    Full Text Available The article is aimed at researching the concept of forest rent as an object of economic analysis. The essence of the concept of «forest rent» has been researched. It has been defined that the forest rent is the object of management of the forest complex of Ukraine as a whole and forest enterprises in particular. Rent for special use of forest resources is the object of interest on the part of both the State and the corporate sector, because its value depends on the cost of timber for industry and households. Works of scholars on the classification of rents were studied. It has been determined that the rent for specialized use of forest resources is a special kind of natural rent. The structure of constituents in the system of rent relations in the forest sector has been defined in accordance with provisions of the Tax Code of Ukraine.

  13. Partial differential equations with variable exponents variational methods and qualitative analysis

    CERN Document Server

    Radulescu, Vicentiu D

    2015-01-01

    Partial Differential Equations with Variable Exponents: Variational Methods and Qualitative Analysis provides researchers and graduate students with a thorough introduction to the theory of nonlinear partial differential equations (PDEs) with a variable exponent, particularly those of elliptic type. The book presents the most important variational methods for elliptic PDEs described by nonhomogeneous differential operators and containing one or more power-type nonlinearities with a variable exponent. The authors give a systematic treatment of the basic mathematical theory and constructive meth

  14. Variation in home-birth rates between midwifery practices in the Netherlands.

    NARCIS (Netherlands)

    Wiegers, T.A.; Zee, J. van der; Kerssens, J.J.; Keirse, M.J.N.C.

    2000-01-01

    Objective: To examine the reasons for the variation in home-birth rates between midwifery practices. Method: Multi-level analysis of client and midwife associated, case-specific and structural factors in relation to 4420 planned and actual home or hospital births in 42 midwifery practices. Findings:

  15. Objective analysis of toolmarks in forensics

    Energy Technology Data Exchange (ETDEWEB)

    Grieve, Taylor N. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc., the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profiles of two different marks are characterized and the marks’ cross-sections are run through a comparative statistical algorithm to acquire a value that is intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm’s application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge’s primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges were evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.
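
    The comparative statistical step mentioned above can be illustrated as a normalized cross-correlation between two profile cross-sections. The snippet below is a sketch of that general idea, not the validated forensic algorithm; the synthetic profiles and the scoring function are assumptions made for the example.

    ```python
    import numpy as np

    def max_cross_correlation(profile_a, profile_b):
        """Best normalized cross-correlation between two 1-D surface profiles."""
        a = (profile_a - profile_a.mean()) / profile_a.std()
        b = (profile_b - profile_b.mean()) / profile_b.std()
        # Full cross-correlation over all relative shifts of the two profiles.
        xcorr = np.correlate(a, b, mode="full") / min(len(a), len(b))
        return xcorr.max()

    # Example: a mark compared against a shifted, noisy copy of itself.
    rng = np.random.default_rng(2)
    mark = np.sin(np.linspace(0, 20, 400)) + 0.1 * rng.normal(size=400)
    replica = np.roll(mark, 25) + 0.1 * rng.normal(size=400)
    print(max_cross_correlation(mark, replica))   # close to 1 for matching marks
    ```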

  16. Intermediary object for participative design processes based on the ergonomic work analysis

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Duarte, F.; Broberg, Ole

    2012-01-01

    The objective of this paper is to present and discuss the use of an intermediary object, built from the ergonomic work analysis, in a participative design process. The object was a zoning pattern, developed as a visual representation ‘mapping’ of the interrelations among the functional units of t...

  17. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    Science.gov (United States)

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  18. Objective consensus from decision trees.

    Science.gov (United States)

    Putora, Paul Martin; Panje, Cedric M; Papachristofilou, Alexandros; Dal Pra, Alan; Hundsberger, Thomas; Plasswilm, Ludwig

    2014-12-05

    Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information, objective consensus based on recommendations in decision tree format from multiple sources. Based on nine sample recommendations in decision tree format a representative analysis was performed. The most common (mode) recommendations for each eventuality (each permutation of parameters) were determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data was collected from 16 radiation oncology centres, converted into decision tree format and analyzed in order to determine the objective consensus. Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage) resulting in a total of 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties.
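
    The mode-recommendation idea can be sketched in a few lines: each centre's decision tree is treated as a function from a parameter combination to a recommendation, and the most common recommendation is taken for every combination. The centres, cut-offs and treatment labels below are purely illustrative and do not reproduce the study's actual recommendations.

    ```python
    from collections import Counter
    from itertools import product

    # Three hypothetical centres, each reduced to a decision function over
    # binary parameters (high/low Gleason score, PSA, T-stage).
    def centre_a(gleason_high, psa_high, t_high):
        return "radiotherapy+ADT" if gleason_high or t_high else "radiotherapy"

    def centre_b(gleason_high, psa_high, t_high):
        return "radiotherapy+ADT" if gleason_high and psa_high else "radiotherapy"

    def centre_c(gleason_high, psa_high, t_high):
        return "radiotherapy+ADT" if psa_high or t_high else "radiotherapy"

    centres = [centre_a, centre_b, centre_c]

    for combo in product([False, True], repeat=3):          # every eventuality
        votes = Counter(centre(*combo) for centre in centres)
        recommendation, count = votes.most_common(1)[0]      # mode recommendation
        print(combo, "->", recommendation, f"({count}/{len(centres)} centres)")
    ```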

  19. Objective consensus from decision trees

    International Nuclear Information System (INIS)

    Putora, Paul Martin; Panje, Cedric M; Papachristofilou, Alexandros; Pra, Alan Dal; Hundsberger, Thomas; Plasswilm, Ludwig

    2014-01-01

    Consensus-based approaches provide an alternative to evidence-based decision making, especially in situations where high-level evidence is limited. Our aim was to demonstrate a novel source of information, objective consensus based on recommendations in decision tree format from multiple sources. Based on nine sample recommendations in decision tree format a representative analysis was performed. The most common (mode) recommendations for each eventuality (each permutation of parameters) were determined. The same procedure was applied to real clinical recommendations for primary radiotherapy for prostate cancer. Data was collected from 16 radiation oncology centres, converted into decision tree format and analyzed in order to determine the objective consensus. Based on information from multiple sources in decision tree format, treatment recommendations can be assessed for every parameter combination. An objective consensus can be determined by means of mode recommendations without compromise or confrontation among the parties. In the clinical example involving prostate cancer therapy, three parameters were used with two cut-off values each (Gleason score, PSA, T-stage) resulting in a total of 27 possible combinations per decision tree. Despite significant variations among the recommendations, a mode recommendation could be found for specific combinations of parameters. Recommendations represented as decision trees can serve as a basis for objective consensus among multiple parties

  20. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down to events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
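
    The discrete event simulation principle underlying SIMMEK can be illustrated with a minimal event-list loop. The sketch below is not SIMMEK itself; the single-machine layout and the arrival and operation time distributions are assumptions chosen only to show how events drive the simulation clock.

    ```python
    import heapq
    import random

    random.seed(3)
    events = []                                    # min-heap of (time, kind, job_id)
    t = 0.0
    for job_id in range(50):
        t += random.expovariate(1 / 5.0)           # inter-arrival times, mean 5
        heapq.heappush(events, (t, "arrival", job_id))

    machine_free_at, finished, clock = 0.0, 0, 0.0
    while events:
        clock, kind, job_id = heapq.heappop(events)
        if kind == "arrival":
            start = max(clock, machine_free_at)    # wait if the machine is busy
            service = random.uniform(2.0, 6.0)     # stochastic operation time
            machine_free_at = start + service
            heapq.heappush(events, (machine_free_at, "departure", job_id))
        else:
            finished += 1

    print(f"{finished} jobs finished by t = {clock:.1f}; "
          f"throughput = {finished / clock:.3f} jobs per time unit")
    ```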

  1. Total Variation Depth for Functional Data

    KAUST Repository

    Huang, Huang

    2016-11-15

    There has been extensive work on data depth-based methods for robust multivariate data analysis. Recent developments have moved to infinite-dimensional objects such as functional data. In this work, we propose a new notion of depth, the total variation depth, for functional data. As a measure of depth, its properties are studied theoretically, and the associated outlier detection performance is investigated through simulations. Compared to magnitude outliers, shape outliers are often masked among the rest of samples and harder to identify. We show that the proposed total variation depth has many desirable features and is well suited for outlier detection. In particular, we propose to decompose the total variation depth into two components that are associated with shape and magnitude outlyingness, respectively. This decomposition allows us to develop an effective procedure for outlier detection and useful visualization tools, while naturally accounting for the correlation in functional data. Finally, the proposed methodology is demonstrated using real datasets of curves, images, and video frames.

  2. Size variation and flow experience of physical game support objects

    NARCIS (Netherlands)

    Feijs, L.M.G.; Peters, P.J.F.; Eggen, J.H.

    2004-01-01

    This paper is about designing and evaluating an innovative type of computer game. Game support objects are used to enrich the gaming experience [7]. The added objects are active but are simpler than real robots. In the study reported here they are four helper ghosts connected to a traditional Pacman

  3. Variation in Expert Source Selection According to Different Objectivity Standards

    Science.gov (United States)

    Albaek, Erik

    2011-01-01

    Several scholars have tried to clarify how journalists handle and implement the abstract objectivity norm in daily practice. Less research attention has been paid to how common abstract professional norms and values, "in casu" the objectivity norm, may systematically vary when interpreted and implemented in daily journalistic practice. Allgaier's…

  4. Analysis of micro computed tomography images; a look inside historic enamelled metal objects

    Science.gov (United States)

    van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen

    2010-02-01

    In this study the usefulness of micro-Computed Tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). Obtaining virtual slices and information about the internal structure of these objects was made possible by CT analysis. With this technique the underlying metal work was studied without removing the decorative enamel layer. Moreover visible defects such as cracks were measured in both width and depth and as of yet invisible defects and weaker areas are visualised. All these features are of great interest to restorers and conservators as they allow a view inside these objects without so much as touching them.

  5. Geographic Object-Based Image Analysis – Towards a new paradigm

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  6. A decision analysis approach for risk management of near-earth objects

    Science.gov (United States)

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in

  7. Using Epistemic Network Analysis to understand core topics as planned learning objectives

    DEFF Research Database (Denmark)

    Allsopp, Benjamin Brink; Dreyøe, Jonas; Misfeldt, Morten

    Epistemic Network Analysis is a tool developed by the epistemic games group at the University of Wisconsin-Madison for tracking the relations between concepts in students' discourse (Shaffer 2017). In our current work we are applying this tool to learning objectives in teachers' digital preparation. The Danish mathematics curriculum is organised in six competencies and three topics. In the recently implemented learning platforms, teachers choose which of the mathematical competencies serve as objectives for a specific lesson or teaching sequence. Hence learning objectives for lessons and teaching sequences define a network of competencies, where two competencies are closely related if they often are part of the same learning objective or teaching sequence. We are currently using Epistemic Network Analysis to study these networks. In the poster we will include examples of different networks...

  8. Theory of planned behaviour variables and objective walking behaviour do not show seasonal variation in a randomised controlled trial.

    Science.gov (United States)

    Williams, Stefanie L; French, David P

    2014-02-05

    Longitudinal studies have shown that objectively measured walking behaviour is subject to seasonal variation, with people walking more in summer compared to winter. Seasonality therefore may have the potential to bias the results of randomised controlled trials if there are not adequate statistical or design controls. Despite this there are no studies that assess the impact of seasonality on walking behaviour in a randomised controlled trial, to quantify the extent of such bias. Further there have been no studies assessing how season impacts on the psychological predictors of walking behaviour to date. The aim of the present study was to assess seasonal differences in a) objective walking behaviour and b) Theory of Planned Behaviour (TPB) variables during a randomised controlled trial of an intervention to promote walking. 315 patients were recruited to a two-arm cluster randomised controlled trial of an intervention to promote walking in primary care. A series of repeated measures ANCOVAs were conducted to examine the effect of season on pedometer measures of walking behaviour and TPB measures, assessed immediately post-intervention and six months later. Hierarchical regression analyses were conducted to assess whether season moderated the prediction of intention and behaviour by TPB measures. There were no significant differences in time spent walking in spring/summer compared to autumn/winter. There was no significant seasonal variation in most TPB variables, although the belief that there will be good weather was significantly higher in spring/summer (F = 19.46, p behaviour, or moderate the effects of TPB variables on intention or behaviour. Seasonality does not influence objectively measured walking behaviour or psychological variables during a randomised controlled trial. Consequently physical activity behaviour outcomes in trials will not be biased by the season in which they are measured. Previous studies may have overestimated the extent of

  9. Visual Field Preferences of Object Analysis for Grasping with One Hand

    Directory of Open Access Journals (Sweden)

    Ada eLe

    2014-10-01

    Full Text Available When we grasp an object using one hand, the opposite hemisphere predominantly guides the motor control of grasp movements (Davare et al. 2007; Rice et al. 2007). However, it is unclear whether visual object analysis for grasp control relies more on inputs (a) from the contralateral than the ipsilateral visual field, (b) from one dominant visual field regardless of the grasping hand, or (c) from both visual fields equally. For bimanual grasping of a single object we have recently demonstrated a visual field preference for the left visual field (Le and Niemeier 2013a, 2013b), consistent with a general right-hemisphere dominance for sensorimotor control of bimanual grasps (Le et al., 2013). But visual field differences have never been tested for unimanual grasping. Therefore, here we asked right-handed participants to fixate to the left or right of an object and then grasp the object either with their right or left hand using a precision grip. We found that participants grasping with their right hand performed better with objects in the right visual field: maximum grip apertures (MGAs) were more closely matched to the object width and were smaller than for objects in the left visual field. In contrast, when people grasped with their left hand, preferences switched to the left visual field. What is more, MGA scaling showed greater visual field differences compared to right-hand grasping. Our data suggest that visual object analysis for unimanual grasping shows a preference for visual information from the ipsilateral visual field, and that the left hemisphere is better equipped to control grasps in both visual fields.

  10. Difference Discrete Variational Principles, Euler-Lagrange Cohomology and Symplectic, Multisymplectic Structures I: Difference Discrete Variational Principle

    Institute of Scientific and Technical Information of China (English)

    GUO Han-Ying; LI Yu-Qi; WU Ke; WANG Shi-Kun

    2002-01-01

    In this first paper of a series, we study the difference discrete variational principle in the framework of multi-parameter differential approach by regarding the forward difference as an entire geometric object in view of noncommutative differential geometry. Regarding the difference as an entire geometric object, the difference discrete version of Legendre transformation can be introduced. By virtue of this variational principle, we can discretely deal with the variation problems in both the Lagrangian and Hamiltonian formalisms to get difference discrete Euler-Lagrange equations and canonical ones for the difference discrete versions of the classical mechanics and classical field theory.

  11. Electrical Resistance Tomography for Visualization of Moving Objects Using a Spatiotemporal Total Variation Regularization Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Chen

    2018-05-01

    Full Text Available Electrical resistance tomography (ERT) has been considered as a data collection and image reconstruction method in many multi-phase flow application areas due to its advantages of high speed, low cost and being non-invasive. In order to improve the quality of the reconstructed images, the Total Variation algorithm attracts considerable attention due to its ability to handle large piecewise and discontinuous conductivity distributions. In industrial process tomography (IPT), techniques such as ERT have been used to extract important flow measurement information. For a moving object inside a pipe, a velocity profile can be calculated from the cross correlation between signals generated from ERT sensors. Many previous studies have used two sets of 2D ERT measurements based on pixel-pixel cross correlation, which requires two ERT systems. In this paper, a method for carrying out flow velocity measurement using a single ERT system is proposed. A novel spatiotemporal total variation regularization approach is utilised to exploit sparsity both in space and time in 4D, and a voxel-voxel cross correlation method is adopted for measurement of the flow profile. Results show that the velocity profile can be calculated with a single ERT system and that the volume fraction and movement can be monitored using the proposed method. Both semi-dynamic experimental and static simulation studies verify the suitability of the proposed method. For the in-plane velocity profile, a 3D image based on temporal 2D images produces a velocity profile with an error of less than 1%, and a 4D image for 3D velocity profiling shows an error of 4%.
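
    The voxel-voxel cross-correlation step can be sketched as follows: the lag that maximizes the correlation between an upstream and a downstream time series gives the transit time between the two measurement planes, from which the axial velocity follows. The sampling rate, plane spacing and synthetic signals below are assumptions made for illustration, not the paper's data.

    ```python
    import numpy as np

    def flow_velocity(upstream, downstream, sample_rate_hz, plane_spacing_m):
        """Estimate axial flow velocity from two voxel time series."""
        up = upstream - upstream.mean()
        down = downstream - downstream.mean()
        xcorr = np.correlate(down, up, mode="full")
        # Lag (in samples) at which the downstream signal best matches the
        # upstream one; positive if the downstream plane lags behind.
        lag_samples = xcorr.argmax() - (len(up) - 1)
        transit_time = lag_samples / sample_rate_hz
        return plane_spacing_m / transit_time

    # Example: the downstream plane sees the same disturbance 0.05 s later.
    rng = np.random.default_rng(4)
    t = np.arange(0, 2, 0.001)                           # 1 kHz sampling
    signal = np.exp(-((t - 0.5) ** 2) / 0.01) + 0.05 * rng.normal(size=t.size)
    delayed = np.roll(signal, 50)                        # 50 samples = 0.05 s delay
    print(flow_velocity(signal, delayed, 1000.0, 0.05))  # roughly 1 m/s
    ```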

  12. Anatomical variations of the celiac trunk and hepatic arterial system: an analysis using multidetector computed tomography angiography

    International Nuclear Information System (INIS)

    Araujo Neto, Severino Aires; Franca, Henrique Almeida; Mello Junior, Carlos Fernando de; Silva Neto, Eulampio Jose; Negromonte, Gustavo Ramalho Pessoa; Duarte, Claudia Martina Araujo; Cavalcanti Neto, Bartolomeu Fragoso; Farias, Rebeca Danielly da Fonseca

    2015-01-01

    Objective: To analyze the prevalence of anatomical variations of celiac arterial trunk (CAT) branches and hepatic arterial system (HAS), as well as the CAT diameter, length and distance to the superior mesenteric artery. Materials And Methods: Retrospective, cross-sectional and predominantly descriptive study based on the analysis of multidetector computed tomography images of 60 patients. Results: The celiac trunk anatomy was normal in 90% of cases. Hepatosplenic trunk was found in 8.3% of patients, and hepatogastric trunk in 1.7%. Variation of the HAS was observed in 21.7% of cases, including anomalous location of the right hepatic artery in 8.3% of cases, and of the left hepatic artery, in 5%. Also, cases of joint relocation of right and left hepatic arteries, and trifurcation of the proper hepatic artery were observed, respectively, in 3 (5%) and 2 (3.3%) patients. Mean length and caliber of the CAT were 2.3 cm and 0.8 cm, respectively. Mean distance between CAT and superior mesenteric artery was 1.2 cm (standard deviation = 4.08). A significant correlation was observed between CAT diameter and length, and CAT diameter and distance to superior mesenteric artery. Conclusion: The pattern of CAT variations and diameter corroborate the majority of the literature data. However, this does not happen in relation to the HAS. (author)

  13. Anatomical variations of the celiac trunk and hepatic arterial system: an analysis using multidetector computed tomography angiography

    Energy Technology Data Exchange (ETDEWEB)

    Araujo Neto, Severino Aires; Franca, Henrique Almeida; Mello Junior, Carlos Fernando de; Silva Neto, Eulampio Jose; Negromonte, Gustavo Ramalho Pessoa; Duarte, Claudia Martina Araujo; Cavalcanti Neto, Bartolomeu Fragoso; Farias, Rebeca Danielly da Fonseca, E-mail: severinoaires@hotmail.com [Universidade Federal da Paraiba (UFPB), Joao Pessoa, PB (Brazil)

    2015-11-15

    Objective: To analyze the prevalence of anatomical variations of celiac arterial trunk (CAT) branches and hepatic arterial system (HAS), as well as the CAT diameter, length and distance to the superior mesenteric artery. Materials And Methods: Retrospective, cross-sectional and predominantly descriptive study based on the analysis of multidetector computed tomography images of 60 patients. Results: The celiac trunk anatomy was normal in 90% of cases. Hepatosplenic trunk was found in 8.3% of patients, and hepatogastric trunk in 1.7%. Variation of the HAS was observed in 21.7% of cases, including anomalous location of the right hepatic artery in 8.3% of cases, and of the left hepatic artery, in 5%. Also, cases of joint relocation of right and left hepatic arteries, and trifurcation of the proper hepatic artery were observed, respectively, in 3 (5%) and 2 (3.3%) patients. Mean length and caliber of the CAT were 2.3 cm and 0.8 cm, respectively. Mean distance between CAT and superior mesenteric artery was 1.2 cm (standard deviation = 4.08). A significant correlation was observed between CAT diameter and length, and CAT diameter and distance to superior mesenteric artery. Conclusion: The pattern of CAT variations and diameter corroborate the majority of the literature data. However, this does not happen in relation to the HAS. (author)

  14. Analysis of process parameters in surface grinding using single objective Taguchi and multi-objective grey relational grade

    Directory of Open Access Journals (Sweden)

    Prashant J. Patil

    2016-09-01

    Full Text Available Close tolerances and a good surface finish are achieved by means of the grinding process. This study was carried out for multi-objective optimization of MQL grinding process parameters. Water-based Al2O3 and CuO nanofluids of various concentrations are used as lubricants for the MQL system. Grinding experiments were carried out on an instrumented surface grinding machine. For experimentation purposes, Taguchi's method was used. Important process parameters that affect the G ratio and surface finish in MQL grinding are depth of cut, type of lubricant, feed rate, grinding wheel speed, coolant flow rate, and nanoparticle size. Grinding performance was evaluated by measuring the G ratio and surface finish. To improve the grinding process, a multi-objective process parameter optimization is performed by use of Taguchi-based grey relational analysis. To identify the most significant process factor, analysis of variance (ANOVA) has been used.
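
    The grey relational grade computation can be sketched as follows: each response is normalized (larger-the-better or smaller-the-better), converted into a grey relational coefficient, and the coefficients are averaged per experimental run. The response values, the distinguishing coefficient of 0.5 and the two-response setup below are illustrative assumptions, not the paper's measurements.

    ```python
    import numpy as np

    # Illustrative responses for four experimental runs.
    g_ratio = np.array([18.0, 22.5, 25.1, 20.3])       # larger is better
    roughness = np.array([0.42, 0.35, 0.30, 0.38])     # smaller is better

    def normalize(x, larger_is_better):
        span = x.max() - x.min()
        return (x - x.min()) / span if larger_is_better else (x.max() - x) / span

    def grey_coefficient(norm, zeta=0.5):
        delta = 1.0 - norm                             # deviation from the ideal
        return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    coeffs = np.vstack([grey_coefficient(normalize(g_ratio, True)),
                        grey_coefficient(normalize(roughness, False))])
    grade = coeffs.mean(axis=0)                        # grey relational grade per run
    print("best run:", int(grade.argmax()) + 1, grade.round(3))
    ```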

  15. Model-based object classification using unification grammars and abstract representations

    Science.gov (United States)

    Liburdy, Kathleen A.; Schalkoff, Robert J.

    1993-04-01

    The design and implementation of a high level computer vision system which performs object classification is described. General object labelling and functional analysis require models of classes which display a wide range of geometric variations. A large representational gap exists between abstract criteria such as `graspable' and current geometric image descriptions. The vision system developed and described in this work addresses this problem and implements solutions based on a fusion of semantics, unification, and formal language theory. Object models are represented using unification grammars, which provide a framework for the integration of structure and semantics. A methodology for the derivation of symbolic image descriptions capable of interacting with the grammar-based models is described and implemented. A unification-based parser developed for this system achieves object classification by determining if the symbolic image description can be unified with the abstract criteria of an object model. Future research directions are indicated.

  16. Advances in variational and hemivariational inequalities theory, numerical analysis, and applications

    CERN Document Server

    Migórski, Stanisław; Sofonea, Mircea

    2015-01-01

    Highlighting recent advances in variational and hemivariational inequalities with an emphasis on theory, numerical analysis and applications, this volume serves as an indispensable resource to graduate students and researchers interested in the latest results from recognized scholars in this relatively young and rapidly-growing field. Particularly, readers will find that the volume’s results and analysis present valuable insights into the fields of pure and applied mathematics, as well as civil, aeronautical, and mechanical engineering. Researchers and students will find new results on well posedness to stationary and evolutionary inequalities and their rigorous proofs. In addition to results on modeling and abstract problems, the book contains new results on the numerical methods for variational and hemivariational inequalities. Finally, the applications presented illustrate the use of these results in the study of miscellaneous mathematical models which describe the contact between deformable bodies and a...

  17. Variations in government contract in Malaysia

    Directory of Open Access Journals (Sweden)

    Jaspal Singh Nachatar

    2010-12-01

    Full Text Available The complexity of construction works means that it is hardly possible to complete a project without changes to the plans or the construction process itself. There can only be a minority of contracts of any size in which the subject matter when completed is identical in every respect with what was contemplated at the outset. As such, variations are inevitable in even the best-planned contracts. This study attempted to examine how variations arise in law and in projects, and whether the Standard Form of Contract used in Malaysia, particularly the government Public Works Department (PWD) form, has been utilized to the best level in variation cases. Additionally, this study examined the benefits of variations to parties in contract and also provides suggestions and assumptions in an effort to contribute solutions to the issues and problems detected. The research methodology used in this study was an extensive review of relevant literature, case study, empirical questionnaires and structured interviews, and general observations based on experience and surroundings. The academic study approach incorporated stages such as initial understanding, data and information gathering, analysis of data, findings and conclusion, and general suggestions in the study. The major findings of this study, among others, revealed that the existence of variations is common in projects. The main cause of variations was client request because of inadequate project objectives for the designer to develop a comprehensive design. Besides, the analysis pointed out that the government form of contract, the Public Works Department (PWD) 203/203A, can help in overcoming projects with variations because of its clearly defined procedure. This study also found that proper planning and coordination at tender stage can minimize the risk of ‘unwanted’ variations. In conclusion, this study recommended that future research should be done in design and build based contract

  18. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    Science.gov (United States)

    Bainbridge, Matthew B.; Webb, John K.

    2017-06-01

    A new and automated method is presented for the analysis of high-resolution absorption spectra. Three established numerical methods are unified into one `artificial intelligence' process: a genetic algorithm (Genetic Voigt Profile FIT, gvpfit); non-linear least-squares with parameter constraints (vpfit); and Bayesian model averaging (BMA). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. gvpfit is also motivated by the importance of obtaining a large statistical sample of measurements of Δα/α. Interactive analyses are both time consuming and complex and automation makes obtaining a large sample feasible. In contrast to previous methodologies, we use BMA to derive results using a large set of models and show that this procedure is more robust than a human picking a single preferred model since BMA avoids the systematic uncertainties associated with model choice. Numerical simulations provide stringent tests of the whole process and we show using both real and simulated spectra that the unified automated fitting procedure out-performs a human interactive analysis. The method should be invaluable in the context of future instrumentation like ESPRESSO on the VLT and indeed future ELTs. We apply the method to the z_abs = 1.8389 absorber towards the z_em = 2.145 quasar J110325-264515. The derived constraint of Δα/α = 3.3 ± 2.9 × 10⁻⁶ is consistent with no variation and also consistent with the tentative spatial variation reported in Webb et al. and King et al.
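
    The model-averaging step can be illustrated with a simple information-criterion weighting, a common approximation to Bayesian model averaging. The estimates, uncertainties and AICc values below are invented for illustration and are not the paper's results.

    ```python
    import numpy as np

    # Each candidate absorption-system model yields an estimate of da/a, its
    # uncertainty and an information criterion (here AICc); models are then
    # weighted by exp(-0.5 * delta_AICc).
    estimates = np.array([2.1e-6, 4.0e-6, 3.5e-6, 2.8e-6])   # da/a per model
    sigmas = np.array([3.0e-6, 3.2e-6, 2.9e-6, 3.1e-6])
    aicc = np.array([105.2, 103.8, 102.9, 104.5])

    weights = np.exp(-0.5 * (aicc - aicc.min()))
    weights /= weights.sum()

    mean = np.sum(weights * estimates)
    # Between-model spread is added to the within-model variance.
    variance = np.sum(weights * (sigmas**2 + (estimates - mean) ** 2))
    print(f"da/a = {mean:.2e} +/- {np.sqrt(variance):.2e}")
    ```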

  19. X-ray fluorescence analysis of archaeological finds and art objects: Recognizing gold and gilding

    International Nuclear Information System (INIS)

    Trojek, Tomáš; Hložek, Martin

    2012-01-01

    Many cultural heritage objects were gilded in the past, and nowadays they can be found in archeological excavations or in historical buildings dating back to the Middle Ages, or from the modern period. Old gilded artifacts have been studied using X-ray fluorescence analysis and 2D microanalysis. Several techniques that enable the user to distinguish gold and gilded objects are described and then applied to investigate artifacts. These techniques differ in instrumentation, data analysis and numbers of measurements. The application of Monte Carlo calculation to a quantitative analysis of gilded objects is also introduced. Highlights: Three techniques of gilding identification with XRF analysis are proposed. These techniques are applied to gold and gilded art and archeological objects. Composition of a substrate material is determined by a Monte Carlo simulation.

  20. Obtaining 'images' from iron objects using a 3-axis fluxgate magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    Chilo, Jose [University of Gaevle, S-80176 Gaevle (Sweden); Jabor, Abbas [Royal Institute of Technology, Department of Physics, S-106 91 Stockholm (Sweden); Lizska, Ludwik [Swedish Institute of Space Physics in Umea (Sweden); Eide, Age J. [Ostfold University College, N-1757 Halden (Norway); Lindblad, Thomas [Royal Institute of Technology, Department of Physics, S-106 91 Stockholm (Sweden)], E-mail: lindblad@particle.kth.se

    2007-10-01

    Magnetic objects can cause local variations in the Earth's magnetic field that can be measured with a magnetometer. Here we used tri-axial magnetometer measurements and an analysis method employing wavelet techniques to determine the 'signature' or 'fingerprint' of different iron objects. Clear distinctions among the iron samples were observed. The time-dependent changes in the frequency powers were extracted by use of the Morlet wavelet corresponding to frequency bands from 0.1 to 100 Hz.
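
    The wavelet step described above can be sketched with PyWavelets: a continuous Morlet transform turns a time series into a time-frequency power map covering roughly 0.1 to 100 Hz. The sampling rate and the synthetic two-component signal below are assumptions for illustration, not the magnetometer recordings.

    ```python
    import numpy as np
    import pywt

    fs = 250.0                                   # sampling rate in Hz (assumed)
    t = np.arange(0, 20, 1 / fs)
    signal = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.sin(2 * np.pi * 40.0 * t)

    # Scales chosen so that the analysed frequencies span roughly 0.1-100 Hz.
    frequencies = np.logspace(np.log10(0.1), np.log10(100.0), 60)
    scales = pywt.central_frequency("morl") * fs / frequencies
    coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)

    power = np.abs(coeffs) ** 2                  # time-frequency power map
    print(power.shape, freqs[power.mean(axis=1).argmax()])
    ```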

  1. Analysis and design of the SI-simulator software system for the VHTR-SI process by using the object-oriented analysis and object-oriented design methodology

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

    The SI-simulator is an application software system that simulates the dynamic behavior of the VHTR-SI process by the use of mathematical models. Object-oriented analysis (OOA) and object-oriented design (OOD) methodologies were employed for the SI simulator system development. OOA is concerned with developing software engineering requirements and specifications that are expressed as a system's object model (which is composed of a population of interacting objects), as opposed to the traditional data or functional views of systems. OOD techniques are useful for the development of large complex systems. Also, OOA/OOD methodology is usually employed to maximize the reusability and extensibility of a software system. In this paper, we present a design feature for the SI simulator software system by using the methodologies of OOA and OOD

  2. Objective Quantification of Immune Cell Infiltrates and Epidermal Proliferation in Psoriatic Skin

    DEFF Research Database (Denmark)

    Soendergaard, Christoffer; Nielsen, Ole H; Skak, Kresten

    2015-01-01

    assessments by pathologists with the interobserver and intraobserver variation this includes. Automated quantitative assessment of immunohistochemical staining has the potential to objectively extract numerical measures from cell and tissue structures, and allows efficient high-throughput analysis in clinical... research. Published data of manual cell counts in psoriatic skin samples were reevaluated in this study using digital image analysis (DIA) software. Whole slides immunohistochemically stained for CD3, CD4, CD8, CD45R0, and Ki-67 were scanned and quantitatively evaluated using simple threshold analysis...

  3. Analysis of Long-Term Temperature Variations in the Human Body.

    Science.gov (United States)

    Dakappa, Pradeepa Hoskeri; Mahabala, Chakrapani

    2015-01-01

    Body temperature is a continuous physiological variable. In normal healthy adults, oral temperature is estimated to vary between 36.1°C and 37.2°C. Fever is a complex host response to many external and internal agents and is a potential contributor to many clinical conditions. Despite temperature being one of the foremost vital signs, its variations during many pathological conditions have yet to be examined in detail using mathematical techniques. Classical fever patterns based on recordings obtained every 8-12 h have been developed. However, such patterns do not provide meaningful information in diagnosing diseases. Because fever is a host response, it is likely that there could be a unique response to specific etiologies. Continuous long-term temperature monitoring and pattern analysis using specific analytical methods developed in engineering and physics could aid in revealing unique fever responses of hosts and in different clinical conditions. Furthermore, such analysis can potentially be used as a novel diagnostic tool and to study the effect of pharmaceutical agents and other therapeutic protocols. Thus, the goal of our article is to present a comprehensive review of the recent relevant literature and analyze the current state of research regarding temperature variations in the human body.

  4. Sensitivity and Uncertainty Analysis for Streamflow Prediction Using Different Objective Functions and Optimization Algorithms: San Joaquin California

    Science.gov (United States)

    Paul, M.; Negahban-Azar, M.

    2017-12-01

    Hydrologic models usually need to be calibrated against observed streamflow at the outlet of a particular drainage area through a careful model calibration. However, a large number of parameters must be fitted in the model because field measurements of them are unavailable. Therefore, it is difficult to calibrate the model for a large number of potentially uncertain model parameters. This becomes even more challenging if the model is for a large watershed with multiple land uses and various geophysical characteristics. Sensitivity analysis (SA) can be used as a tool to identify the most sensitive model parameters, which affect the calibrated model performance. There are many different calibration and uncertainty analysis algorithms which can be performed with different objective functions. By incorporating sensitive parameters in streamflow simulation, the effects of a suitable algorithm in improving model performance can be demonstrated with Soil and Water Assessment Tool (SWAT) modeling. In this study, SWAT was applied in the San Joaquin Watershed in California, covering 19,704 km2, to calibrate the daily streamflow. Recently, severe water stress has been escalating in this watershed due to intensified climate variability, prolonged drought and groundwater depletion for agricultural irrigation. Given the uncertainties inherent in hydrologic modeling, it is therefore important to perform a proper uncertainty analysis when predicting the spatial and temporal variation of the hydrologic process and evaluating the impacts of different hydrologic variables. The purpose of this study was to evaluate the sensitivity and uncertainty of the calibrated parameters for predicting streamflow. To evaluate the sensitivity of the calibrated parameters, three different optimization algorithms (Sequential Uncertainty Fitting - SUFI-2, Generalized Likelihood Uncertainty Estimation - GLUE and Parameter Solution - ParaSol) were used with four different objective functions (coefficient of determination
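
    Two objective functions commonly used in such calibrations, the coefficient of determination mentioned above and the widely used Nash-Sutcliffe efficiency, can be sketched directly; the snippet below is independent of the SUFI-2, GLUE and ParaSol implementations themselves, and the flow values are invented for illustration.

    ```python
    import numpy as np

    def nse(observed, simulated):
        """Nash-Sutcliffe efficiency between observed and simulated flows."""
        observed, simulated = np.asarray(observed), np.asarray(simulated)
        return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
            (observed - observed.mean()) ** 2)

    def r_squared(observed, simulated):
        """Coefficient of determination between observed and simulated flows."""
        return np.corrcoef(observed, simulated)[0, 1] ** 2

    # Illustrative daily flows (m^3/s); in practice these would be gauge records
    # and model output at the watershed outlet.
    obs = np.array([12.0, 15.5, 30.2, 22.1, 18.4, 14.9, 13.3])
    sim = np.array([11.2, 16.8, 27.5, 24.0, 17.1, 15.5, 12.8])
    print(f"NSE = {nse(obs, sim):.3f}, R^2 = {r_squared(obs, sim):.3f}")
    ```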

  5. Mediman: Object oriented programming approach for medical image analysis

    International Nuclear Information System (INIS)

    Coppens, A.; Sibomana, M.; Bol, A.; Michel, C.

    1993-01-01

    Mediman is a new image analysis package which has been developed to analyze Positron Emission Tomography (PET) data quantitatively. It is object-oriented, written in C++, and its user interface is based on InterViews, on top of which new classes have been added. Mediman accesses data using external data representation or an import/export mechanism, which avoids data duplication. Multimodality studies are organized in a simple database which includes images, headers, color tables, lists and objects of interest (OOIs) and history files. Stored color table parameters allow the user to focus directly on the interesting portion of the dynamic range. Lists allow the study to be organized according to modality, acquisition protocol, time and spatial properties. OOIs (points, lines and regions) are stored in absolute 3-D coordinates, allowing correlation with other co-registered imaging modalities such as MRI or SPECT. OOIs have visualization properties and are organized into groups. Quantitative ROI analysis of anatomic images consists of position, distance and volume calculations on selected OOIs. An image calculator is connected to Mediman. Quantitation of metabolic images is performed via profiles, sectorization, time activity curves and kinetic modeling. Mediman is menu and mouse driven; macro-commands can be registered and replayed. Its interface is customizable through a configuration file. The benefits of the object-oriented approach are discussed from a development point of view

  6. Towards Uniform Accelerometry Analysis: A Standardization Methodology to Minimize Measurement Bias Due to Systematic Accelerometer Wear-Time Variation

    Directory of Open Access Journals (Sweden)

    Tarun R. Katapally, Nazeem Muhajarine

    2014-06-01

    Full Text Available Accelerometers are predominantly used to objectively measure the entire range of activity intensities – sedentary behaviour (SED), light physical activity (LPA) and moderate to vigorous physical activity (MVPA). However, studies consistently report results without accounting for systematic accelerometer wear-time variation (within and between participants), jeopardizing the validity of these results. This study describes the development of a standardization methodology to understand and minimize measurement bias due to wear-time variation. Accelerometry is generally conducted over seven consecutive days, with participants' data being commonly considered 'valid' only if wear-time is at least 10 hours/day. However, even within ‘valid’ data, there could be systematic wear-time variation. To explore this variation, accelerometer data of the Smart Cities, Healthy Kids study (www.smartcitieshealthykids.com) were analyzed descriptively and with repeated measures multivariate analysis of variance (MANOVA). Subsequently, a standardization method was developed, where case-specific observed wear-time is controlled to an analyst-specified time period. Next, case-specific accelerometer data are interpolated to this controlled wear-time to produce standardized variables. To understand discrepancies owing to wear-time variation, all analyses were conducted pre- and post-standardization. Descriptive analyses revealed systematic wear-time variation, both between and within participants. Pre- and post-standardized descriptive analyses of SED, LPA and MVPA revealed a persistent and often significant trend of wear-time’s influence on activity. SED was consistently higher on weekdays before standardization; however, this trend was reversed post-standardization. Even though MVPA was significantly higher on weekdays both pre- and post-standardization, the magnitude of this difference decreased post-standardization. Multivariable analyses with standardized SED, LPA and
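
    The standardization idea can be sketched as a simple rescaling: a participant-day's observed minutes in each intensity category are interpolated linearly to an analyst-specified wear-time. The function, the 10-hour reference and the example day below are assumptions made for illustration; the study's actual interpolation may differ in detail.

    ```python
    def standardize_to_weartime(minutes_by_intensity, observed_weartime_h,
                                target_weartime_h=10.0):
        """Linearly rescale SED/LPA/MVPA minutes to a controlled wear-time."""
        scale = target_weartime_h / observed_weartime_h
        return {intensity: minutes * scale
                for intensity, minutes in minutes_by_intensity.items()}

    # Example: a day with 12.5 h of wear rescaled to a 10 h reference,
    # making it comparable with days of different wear-time.
    day = {"SED": 460.0, "LPA": 230.0, "MVPA": 60.0}     # observed minutes
    print(standardize_to_weartime(day, observed_weartime_h=12.5))
    ```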

  7. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies, Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially

  8. A systematic review and meta-analysis of variations in branching patterns of the adult aortic arch.

    Science.gov (United States)

    Popieluszko, Patrick; Henry, Brandon Michael; Sanna, Beatrice; Hsieh, Wan Chin; Saganiak, Karolina; Pękala, Przemysław A; Walocha, Jerzy A; Tomaszewski, Krzysztof A

    2018-07-01

    The aortic arch (AA) is the main conduit of the left side of the heart, providing a blood supply to the head, neck, and upper limbs. As it travels through the thorax, the pattern in which it gives off the branches to supply these structures can vary. Variations of these branching patterns have been studied; however, a study providing a comprehensive incidence of these variations has not yet been conducted. The objective of this study was to perform a meta-analysis of all the studies that report prevalence data on AA variants and to provide incidence data on the most common variants. A systematic search of online databases including PubMed, Embase, Scopus, ScienceDirect, Web of Science, SciELO, BIOSIS, and CNKI was performed for literature describing incidence of AA variations in adults. Studies including prevalence data on adult patients or cadavers were collected and their data analyzed. A total of 51 articles were included (N = 23,882 arches). Seven of the most common variants were analyzed. The most common variants found included the classic branching pattern, defined as a brachiocephalic trunk, a left common carotid, and a left subclavian artery (80.9%); the bovine arch variant (13.6%); and the left vertebral artery variant (2.8%). Compared by geographic data, bovine arch variants were noted to have a prevalence as high as 26.8% in African populations. Although patients who have an AA variant are often asymptomatic, they compose a significant portion of the population of patients and pose a greater risk of hemorrhage and ischemia during surgery in the thorax. Because of the possibility of encountering such variants, it is prudent for surgeons to consider potential variations in planning procedures, especially of an endovascular nature, in the thorax. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  9. Analysis of conformational variations of the cricoid cartilages in Thoroughbred horses using computed tomography.

    Science.gov (United States)

    Dahlberg, J A; Valdes-Martinez, A; Boston, R C; Parente, E J

    2011-03-01

    Loss of arytenoid abduction is a common postoperative complication of laryngoplasty without a definitive cause. It has been a clinical impression during laryngoplasty surgery that there is great conformational variability along the caudal edge of the Thoroughbred cricoid cartilage that could impact postoperative retention of suture position. A change in suture position would probably lead to some loss of abduction. Defining any structural variability of the cricoid would be an initial step in determining whether this variability could impact on the retention of suture position. Anatomical variations in the larynx of Thoroughbred horses may be detected and measured using objective analysis and computed tomography. Larynges were harvested from 15 mature Thoroughbred horses. Helical CT scans were performed on each specimen. Three independent observers performed a series of measurements on 2D and 3D reconstruction images using digital software. Measurements included the lateral cricoid angle, the caudal cricoid prominences, the distance to the cricoid slope, the angle of the cricoarytenoid joints (CAJ), the cricoid thickness and the suture angle. Mean, standard deviation, coefficient of variation and linear regression analysis were performed among all observers and all measurements. Notable conformational differences were evident on the 3D reconstructions. The highest degree of variability was found in 3 measurements: the distance to the lateral cricoid slope, the lateral cricoid angle and the cricoid thickness. A larger left CAJ angle directly and significantly correlated with a larger suture angle. There are notable conformational differences among cricoid specimens in the Thoroughbred larynx. The morphometric differences identified may impact on optimal prosthesis placement and long-term retention. Since a larger lateral cricoid angle may facilitate abduction loss secondary to a displaced and loosened suture, alternative techniques for suture placement may be of

  10. Variational principles

    CERN Document Server

    Moiseiwitsch, B L

    2004-01-01

    This graduate-level text's primary objective is to demonstrate the expression of the equations of the various branches of mathematical physics in the succinct and elegant form of variational principles (and thereby illuminate their interrelationship). Its related intentions are to show how variational principles may be employed to determine the discrete eigenvalues for stationary state problems and to illustrate how to find the values of quantities (such as the phase shifts) that arise in the theory of scattering. Chapter-by-chapter treatment consists of analytical dynamics; optics, wave mechanics

  11. Object tracking using multiple camera video streams

    Science.gov (United States)

    Mehrubeoglu, Mehrube; Rojas, Diego; McLauchlan, Lifford

    2010-05-01

    Two synchronized cameras are utilized to obtain independent video streams to detect moving objects from two different viewing angles. The video frames are directly correlated in time. Moving objects in image frames from the two cameras are identified and tagged for tracking. One advantage of such a system is overcoming the effects of occlusions that could leave an object partially or fully occluded in one camera while the same object is fully visible in another camera. Object registration is achieved by determining the location of common features in the moving object across simultaneous frames. Perspective differences are adjusted. Combining information from images from multiple cameras increases robustness of the tracking process. Motion tracking is achieved by detecting anomalies caused by the objects' movement across frames in time, both in each video stream and in the combined video information. The path of each object is determined heuristically. Accuracy of detection is dependent on the speed of the object as well as variations in direction of motion. Fast cameras increase accuracy but limit the speed and complexity of the algorithm. Such an imaging system has applications in traffic analysis, surveillance and security, as well as object modeling from multi-view images. The system can easily be expanded by increasing the number of cameras such that there is an overlap between the scenes from at least two cameras in proximity. An object can then be tracked long distances or across multiple cameras continuously, applicable, for example, in wireless sensor networks for surveillance or navigation.

  12. Variation Trend Analysis of Runoff and Sediment Time Series Based on the R/S Analysis of Simulated Loess Tilled Slopes in the Loess Plateau, China

    Directory of Open Access Journals (Sweden)

    Ju Zhang

    2017-12-01

    The objective of this study was to illustrate the temporal variation of runoff and sediment of loess tilled slopes under successive rainfall conditions. Loess tilled slopes with four microtopography types (straight cultivated slope, artificial backhoe, artificial digging, and contour tillage) under five slope gradients (5°, 10°, 15°, 20°, 25°) were simulated, and a rainfall intensity of 60 mm/h was adopted. The temporal trends of runoff and sediment yield were predicted based on the Rescaled Range (R/S) analysis method. The results indicate that the Hurst indices of the runoff time series and sediment time series are higher than 0.5, and a long-term positive correlation exists between the future and the past. This means that runoff and sediment of loess tilled slopes in the future will have the same trends as in the past. The results obtained by the classical R/S analysis method were the same as those of the modified R/S analysis method. The rationality and reliability of the R/S analysis method were thus further confirmed, and the method can be used for predicting the trend of runoff and sediment yield. The correlations between the microtopography and the Hurst indices of the runoff and sediment yield time series, as well as between the slopes and the Hurst indices, were tested, and no significant correlation was found. The microtopography and slopes do not affect the correlation and continuity of the runoff and sediment yield time series. This study provides an effective method for predicting variations in the trends of runoff and sediment yield on loess tilled slopes.
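
    A minimal sketch of the classical R/S calculation used above, assuming an evenly sampled series; the window scheme is illustrative and the modified R/S variant (with its corrected scale term) is not shown.

        import numpy as np

        def hurst_rs(series, min_window=8):
            """Classical rescaled-range (R/S) estimate of the Hurst exponent."""
            series = np.asarray(series, dtype=float)
            n = len(series)
            window_sizes = np.unique(
                np.logspace(np.log10(min_window), np.log10(n // 2), num=20).astype(int))
            log_w, log_rs = [], []
            for w in window_sizes:
                rs_vals = []
                for start in range(0, n - w + 1, w):        # non-overlapping chunks
                    chunk = series[start:start + w]
                    dev = np.cumsum(chunk - chunk.mean())    # mean-adjusted cumulative deviation
                    r = dev.max() - dev.min()                # range
                    s = chunk.std()                          # standard deviation
                    if s > 0:
                        rs_vals.append(r / s)
                if rs_vals:
                    log_w.append(np.log(w))
                    log_rs.append(np.log(np.mean(rs_vals)))
            slope, _ = np.polyfit(log_w, log_rs, 1)          # Hurst index = slope of log(R/S) vs log(w)
            return slope

        # A persistent synthetic series (H > 0.5) stands in for a runoff record.
        rng = np.random.default_rng(0)
        runoff = np.cumsum(rng.normal(size=2000))
        print("Hurst index ~", round(hurst_rs(runoff), 2))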

  13. Cytoplasmic genetic variation and extensive cytonuclear interactions influence natural variation in the metabolome

    DEFF Research Database (Denmark)

    Joseph, Bindu; Corwin, Jason A.; Li, Baohua

    2013-01-01

    Understanding genome to phenotype linkages has been greatly enabled by genomic sequencing. However, most genome analysis is typically confined to the nuclear genome. We conducted a metabolomic QTL analysis on a reciprocal RIL population structured to examine how variation in the organelle genomes...... was a central hub in the epistatic network controlling the plant metabolome. This epistatic influence manifested such that the cytoplasmic background could alter or hide pairwise epistasis between nuclear loci. Thus, cytoplasmic genetic variation plays a central role in controlling natural variation...... in metabolomic networks. This suggests that cytoplasmic genomes must be included in any future analysis of natural variation....

  14. Objective image analysis of the meibomian gland area.

    Science.gov (United States)

    Arita, Reiko; Suehiro, Jun; Haraguchi, Tsuyoshi; Shirakawa, Rika; Tokoro, Hideaki; Amano, Shiro

    2014-06-01

    To evaluate objectively the meibomian gland area using newly developed software for non-invasive meibography. Eighty eyelids of 42 patients without meibomian gland loss (meiboscore=0), 105 eyelids of 57 patients with loss of less than one-third total meibomian gland area (meiboscore=1), 13 eyelids of 11 patients with between one-third and two-thirds loss of meibomian gland area (meiboscore=2) and 20 eyelids of 14 patients with more than two-thirds loss of meibomian gland area (meiboscore=3) were studied. Lid borders were automatically determined. The software evaluated the distribution of the luminance and, by enhancing the contrast and reducing image noise, the meibomian gland area was automatically discriminated. The software calculated the ratio of the total meibomian gland area relative to the total analysis area in all subjects. Repeatability of the software was also evaluated. The mean ratio of the meibomian gland area to the total analysis area in the upper/lower eyelids was 51.9±5.7%/54.7±5.4% in subjects with a meiboscore of 0, 47.7±6.0%/51.5±5.4% in those with a meiboscore of 1, 32.0±4.4%/37.2±3.5% in those with a meiboscore of 2 and 16.7±6.4%/19.5±5.8% in subjects with a meiboscore of 3. The meibomian gland area was objectively evaluated using the developed software. This system could be useful for objectively evaluating the effect of treatment on meibomian gland dysfunction. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  15. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several and different types of virtual learning objects in a common pedagogical context.

  16. Grasping and manipulation of deformable objects based on internal force requirements

    Directory of Open Access Journals (Sweden)

    Sohil Garg

    2008-11-01

    In this paper an analysis of grasping and manipulation of deformable objects by a three-finger robot hand has been carried out. It is shown that the required fingertip grasping forces and velocities vary with the change in object size due to deformation. The variation of the internal force with the change in fingertip-object contact angle has been investigated in detail. From the results it is concluded that it is very difficult to manipulate an object if the finger contact angle is not between 30° and 70°, as the internal forces or velocities become very large outside this range. Hence, even if the object is inside the work volume of the three fingers, it may still not be possible to manipulate it. A simple control model is proposed which can control the grasping and manipulation of a deformable object. Experimental results are also presented to validate the proposed method.

  17. Variational Transition State Theory

    Energy Technology Data Exchange (ETDEWEB)

    Truhlar, Donald G. [Univ. of Minnesota, Minneapolis, MN (United States)

    2016-09-29

    This is the final report on a project involving the development and applications of variational transition state theory. This project involved the development of variational transition state theory for gas-phase reactions, including optimized multidimensional tunneling contributions and the application of this theory to gas-phase reactions with a special emphasis on developing reaction rate theory in directions that are important for applications to combustion. The development of variational transition state theory with optimized multidimensional tunneling as a useful computational tool for combustion kinetics involved eight objectives.

  18. [Genetic variation analysis of canine parvovirus VP2 gene in China].

    Science.gov (United States)

    Yi, Li; Cheng, Shi-Peng; Yan, Xi-Jun; Wang, Jian-Ke; Luo, Bin

    2009-11-01

    To characterize the molecular biology, phylogenetic relationships and current prevalence of Canine parvovirus (CPV), faecal samples from pet dogs with acute enteritis in the cities of Beijing, Wuhan, and Nanjing were collected and tested for CPV by PCR and other assays between 2006 and 2008. No CPV-to-FPV (MEV) variation was detected by PCR-RFLP analysis in any of the samples. The complete ORFs of the VP2 genes were obtained by PCR from 15 clinical CPVs and 2 CPV vaccine strains. All amplicons were cloned and sequenced. Analysis of the VP2 sequences showed that the clinical CPVs all belong to the CPV-2a subtype and could be classified into a new cluster by amino acid comparison, characterized by a Tyr-->Ile (324) mutation. The 2 CPV vaccine strains belong to the CPV-2 subtype, and both of them have scattered variations in amino acid residues of the VP2 protein. A phylogenetic tree constructed from the CPV VP2 sequences showed that these 15 clinical CPV strains are more closely related to the Korean strain K001 than to earlier CPV-2a isolates from other countries, indicating that canine parvovirus genetic variation is associated with location and time to some degree. The survey of the CPV capsid protein VP2 gene provides useful information for the identification of CPV types and understanding of their genetic relationships.

  19. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    Science.gov (United States)

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of a substantial body of literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a metric to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disk with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S(1) can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines the multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  20. A Nationwide Analysis of Cost Variation for Autologous Free Flap Breast Reconstruction.

    Science.gov (United States)

    Billig, Jessica I; Lu, Yiwen; Momoh, Adeyiza O; Chung, Kevin C

    2017-11-01

    Cost variation among hospitals has been demonstrated for surgical procedures. Uncovering these differences has helped guide measures taken to reduce health care spending. To date, the fiscal consequence of hospital variation for autologous free flap breast reconstruction is unknown. To investigate factors that influence cost variation for autologous free flap breast reconstruction. A secondary cross-sectional analysis was performed using the Healthcare Cost and Utilization Project National Inpatient Sample database from 2008 to 2010. The dates of analysis were September 2016 to February 2017. The setting was a stratified sample of all US community hospitals. Participants were female patients who were diagnosed as having breast cancer or were at high risk for breast cancer and underwent autologous free flap breast reconstruction. Variables of interest included demographic data, hospital characteristics, length of stay, complications (surgical and systemic), and inpatient cost. The study used univariate and generalized linear mixed models to examine associations between patient and hospital characteristics and cost. A total of 3302 patients were included in the study, with a median age of 50 years (interquartile range, 44-57 years). The mean cost for autologous free flap breast reconstruction was $22 677 (interquartile range, $14 907-$33 391). Flap reconstructions performed at high-volume hospitals were significantly more costly than those performed at low-volume hospitals ($24 360 vs $18 918). Logistic regression demonstrated that hospital volume correlated with increased cost (Exp[β], 1.06; 95% CI, 1.02-1.11; P = .003). Fewer surgical complications were observed at high-volume hospitals (16.4% [169 of 1029] vs 23.7% [278 of 1174]). There is significant cost variation among patients undergoing autologous free flap breast reconstruction. Experience, as measured by a hospital's volume, provides quality health care with fewer complications but is more costly. Longer length of stay contributed to regional

  1. Analysis of genetic variation and potential applications in genome-scale metabolic modeling

    DEFF Research Database (Denmark)

    Cardoso, Joao; Andersen, Mikael Rørdam; Herrgard, Markus

    2015-01-01

    scale and resolution by re-sequencing thousands of strains systematically. In this article, we review challenges in the integration and analysis of large-scale re-sequencing data, present an extensive overview of bioinformatics methods for predicting the effects of genetic variants on protein function......Genetic variation is the motor of evolution and allows organisms to overcome the environmental challenges they encounter. It can be both beneficial and harmful in the process of engineering cell factories for the production of proteins and chemicals. Throughout the history of biotechnology......, there have been efforts to exploit genetic variation in our favor to create strains with favorable phenotypes. Genetic variation can either be present in natural populations or it can be artificially created by mutagenesis and selection or adaptive laboratory evolution. On the other hand, unintended genetic...

  2. Study of seasonal variation of the gamma radiation at Praia da Areia Preta, Guarapari, Espirito Santo, Brazil: radiometry and risk analysis

    International Nuclear Information System (INIS)

    Moura, Jorge Costa de

    2003-01-01

    The objective of this work is the study of the natural gamma radiation at Areia Preta Beach (APB) in Guarapari, state of Espirito Santo, Brazil. The level of this radiation depends on the concentration of the radioactive mineral monazite in the sand. Probable risks of exposure to gamma radiation at the APB were evaluated by the preliminary environmental risk analysis technique. For this purpose, two annual gamma radiation monitoring campaigns were conducted at the APB, with measurements every two months, yielding the seasonal variation of the radiation levels. Additionally, the granulometry of the heavy mineral fraction was investigated, and scanning electron microscopy and radiometric age dating of the APB monazites, mineral separation by magnetic susceptibility, and mineralogical determination of the sediment were carried out. In order to gain a more complete picture of the seasonal variation and, consequently, of the risk of exposure to ionizing radiation at the APB, the radiometric variation was also studied at some other beaches in the same region. The results indicate that the highest radiometric values are measured in summer and the lowest in winter. The radiometric dating of the monazites from the APB revealed ages of 475 and 530 Ma. The Preliminary Hazard Analysis indicates a minimal risk of excessive radiation exposure: it would take a period of approximately 870 years of a fully crowded beach to result in one case of adverse consequences due to exposure to gamma radiation. (author)

  3. 3D object-oriented image analysis in 3D geophysical modelling

    DEFF Research Database (Denmark)

    Fadel, I.; van der Meijde, M.; Kerle, N.

    2015-01-01

    Non-uniqueness of satellite gravity interpretation has traditionally been reduced by using a priori information from seismic tomography models. This reduction in the non-uniqueness has been based on velocity-density conversion formulas or user interpretation of the 3D subsurface structures (objects......) based on the seismic tomography models and then forward modelling these objects. However, this form of object-based approach has been done without a standardized methodology on how to extract the subsurface structures from the 3D models. In this research, a 3D object-oriented image analysis (3D OOA......) approach was implemented to extract the 3D subsurface structures from geophysical data. The approach was applied on a 3D shear wave seismic tomography model of the central part of the East African Rift System. Subsequently, the extracted 3D objects from the tomography model were reconstructed in the 3D...

  4. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
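
    A small sketch of one of the three techniques compared above, principal component analysis, applied to row-wise feature vectors; the feature matrix is random stand-in data, and the paper's actual features and classifier are not reproduced.

        import numpy as np

        def pca_reduce(features, n_components):
            """Project row-wise feature vectors onto their top principal components."""
            centered = features - features.mean(axis=0)
            # SVD of the centered data; the right singular vectors are the principal axes.
            _, _, vt = np.linalg.svd(centered, full_matrices=False)
            return centered @ vt[:n_components].T

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 64))        # 200 code segments x 64 raw features (stand-in)
        X_reduced = pca_reduce(X, n_components=8)
        print(X_reduced.shape)                # (200, 8) -> input to a simple classifier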

  5. Pedestrian-Vehicle Accidents Reconstruction with PC-Crash®: Sensibility Analysis of Factors Variation

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Gala, F.

    2016-07-01

    This paper describes the main findings of a study performed by INSIA-UPM on improving the reconstruction of real-world vehicle-pedestrian accidents using PC-Crash® software, aimed at developing a software tool for estimating the variability of the collision speed due to the lack of real values for some parameters required during the reconstruction task. The methodology was based on a sensitivity analysis of factor variation. A total of 9 factors were analyzed with the objective of identifying which ones were significant. Four of them (pedestrian height, collision angle, hood height and pedestrian-road friction coefficient) were significant and were included in a full factorial experiment with the collision speed as an additional factor in order to obtain a regression model with up to third-level interactions. Two different factorial experiments with the same structure were performed because of pedestrian gender differences. The tool was created as a collision speed predictor based on the regression models obtained, using the 4 significant factors and the projection distance measured or estimated at the accident site. The tool has been used in the analysis of real-world reconstructed accidents that occurred in the city of Madrid (Spain). The results have been adequate in most cases, with less than 10% deviation between the predicted speed and the one estimated in the reconstructions. (Author)

  6. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    Science.gov (United States)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    The acquisition of attitude information for unresolved space objects, such as micro-nano satellites and GEO objects, by means of ground-based optical observations is a challenge for space surveillance. In this paper, a useful method is proposed to estimate the space object (SO) attitude state according to the simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined; the space object photometric model was then established. Furthermore, the photometric data of space objects in different states are analyzed by simulation and the regular characteristics of the photometric curves are summarized. The simulation results show that the photometric characteristics are useful for attitude inversion in a unique way. Thus, a new idea is provided for space object identification in this paper.

  7. Genome size variation among and within Camellia species by using flow cytometric analysis.

    Directory of Open Access Journals (Sweden)

    Hui Huang

    BACKGROUND: The genus Camellia, belonging to the family Theaceae, is an economically important group of flowering plants. Frequent interspecific hybridization together with polyploidization has made them taxonomically "difficult taxa". The DNA content is often used to measure genome size variation and has largely advanced our understanding of plant evolution and genome variation. The goals of this study were to investigate patterns of interspecific and intraspecific variation of DNA contents and further explore genome size evolution in a phylogenetic context of the genus. METHODOLOGY/PRINCIPAL FINDINGS: The DNA amount in the genus was determined by propidium iodide flow cytometry analysis for a total of 139 individual plants representing almost all sections of the two subgenera, Camellia and Thea. An improved WPB buffer was proven to be suitable for the Camellia species, as it was able to counteract the negative effects of secondary metabolites and generated high-quality results with low coefficient of variation values (CV < 5%). Our results showed trivial effects of different tissues (flowers, leaves and buds) as well as cytosolic compounds on the estimation of DNA amount. The DNA content of C. sinensis var. assamica was estimated to be 1C = 3.01 pg by flow cytometric analysis, which is equal to a genome size of about 2940 Mb. CONCLUSION: Intraspecific and interspecific variations were observed in the genus Camellia, and as expected, the latter was larger than the former. Our study suggests a directional trend of increasing genome size in the genus Camellia, probably owing to frequent polyploidization events.
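
    The genome size quoted above follows from the commonly used conversion of 1 pg of DNA to roughly 978 Mbp; the conversion factor is assumed here, not stated in the abstract.

        MBP_PER_PG = 978            # widely used conversion: 1 pg of DNA ~ 0.978 x 10^9 bp
        c_value_pg = 3.01           # 1C DNA amount reported for C. sinensis var. assamica
        genome_size_mbp = c_value_pg * MBP_PER_PG
        print(round(genome_size_mbp))   # ~2944 Mbp, i.e. about 2940 Mb as reported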

  8. Multi-objective optimization in systematic conservation planning and the representation of genetic variability among populations.

    Science.gov (United States)

    Schlottfeldt, S; Walter, M E M T; Carvalho, A C P L F; Soares, T N; Telles, M P C; Loyola, R D; Diniz-Filho, J A F

    2015-06-18

    Biodiversity crises have led scientists to develop strategies for achieving conservation goals. The underlying principle of these strategies lies in systematic conservation planning (SCP), in which there are at least 2 conflicting objectives, making it a good candidate for multi-objective optimization. Although SCP is typically applied at the species level (or hierarchically higher), it can be used at lower hierarchical levels, such as using alleles as basic units for analysis, for conservation genetics. Here, we propose a method of SCP using a multi-objective approach. We used non-dominated sorting genetic algorithm II in order to identify the smallest set of local populations of Dipteryx alata (baru) (a Brazilian Cerrado species) for conservation, representing the known genetic diversity and using allele frequency information associated with heterozygosity and Hardy-Weinberg equilibrium. We worked in 3 variations for the problem. First, we reproduced a previous experiment, but using a multi-objective approach. We found that the smallest set of populations needed to represent all alleles under study was 7, corroborating the results of the previous study, but with more distinct solutions. In the 2nd and 3rd variations, we performed simultaneous optimization of 4 and 5 objectives, respectively. We found similar but refined results for 7 populations, and a larger portfolio considering intra-specific diversity and persistence with populations ranging from 8-22. This is the first study to apply multi-objective algorithms to an SCP problem using alleles at the population level as basic units for analysis.
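
    A minimal sketch of the multi-objective core of such an analysis: extracting the non-dominated (Pareto) set when minimising the number of selected populations while maximising the alleles represented. The candidate portfolios are toy data, and NSGA-II adds ranking, crowding distance and genetic operators on top of this dominance test.

        import numpy as np

        def dominates(a, b):
            """True if solution a is no worse than b in every objective and better in at least one.
            Objectives here (both minimised): (number of populations, -alleles represented)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def pareto_front(objectives):
            """Return indices of the non-dominated solutions."""
            front = []
            for i, obj_i in enumerate(objectives):
                if not any(dominates(obj_j, obj_i) for j, obj_j in enumerate(objectives) if j != i):
                    front.append(i)
            return front

        # Toy candidate portfolios: (populations selected, alleles represented)
        candidates = [(7, 96), (9, 96), (8, 100), (12, 100), (6, 88)]
        objectives = [(n_pop, -alleles) for n_pop, alleles in candidates]
        print([candidates[i] for i in pareto_front(objectives)])
        # (6, 88), (7, 96) and (8, 100) survive; the dominated portfolios are discarded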

  9. AFLP analysis of Cynodon dactylon (L.) Pers. var. dactylon genetic variation.

    Science.gov (United States)

    Wu, Y Q; Taliaferro, C M; Bai, G H; Anderson, M P

    2004-08-01

    Cynodon dactylon (L.) Pers. var. dactylon (common bermudagrass) is geographically widely distributed between about lat 45 degrees N and lat 45 degrees S, penetrating to about lat 53 degrees N in Europe. The extensive variation of morphological and adaptive characteristics of the taxon is substantially documented, but information is lacking on DNA molecular variation in geographically disparate forms. Accordingly, this study was conducted to assess molecular genetic variation and genetic relatedness among 28 C. dactylon var. dactylon accessions originating from 11 countries on 4 continents (Africa, Asia, Australia, and Europe). A fluorescence-labeled amplified fragment length polymorphism (AFLP) DNA profiling method was used to detect the genetic diversity and relatedness. On the basis of 443 polymorphic AFLP fragments from 8 primer combinations, the accessions were grouped into clusters and subclusters associating with their geographic origins. Genetic similarity coefficients (SC) for the 28 accessions ranged from 0.53 to 0.98. Accessions originating from Africa, Australia, Asia, and Europe formed major groupings as indicated by cluster and principal coordinate analysis. Accessions from Australia and Asia, though separately clustered, were relatively closely related and most distantly related to accessions of European origin. African accessions formed two distant clusters and had the greatest variation in genetic relatedness relative to accessions from other geographic regions. Sampling the full extent of genetic variation in C. dactylon var. dactylon would require extensive germplasm collection in the major geographic regions of its distributional range.
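
    A small sketch of the kind of computation behind the reported similarity coefficients and clusters, assuming Dice similarity on a binary band-presence matrix and UPGMA (average-linkage) clustering; the study's 443 AFLP fragments and its software are not reproduced.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        def dice_similarity(a, b):
            """Dice similarity coefficient for two binary band-presence vectors."""
            shared = np.sum((a == 1) & (b == 1))
            return 2.0 * shared / (a.sum() + b.sum())

        rng = np.random.default_rng(2)
        bands = rng.integers(0, 2, size=(6, 40))      # 6 accessions x 40 AFLP bands (toy data)

        n = bands.shape[0]
        sim = np.ones((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                sim[i, j] = sim[j, i] = dice_similarity(bands[i], bands[j])

        condensed = squareform(1.0 - sim, checks=False)   # similarities -> condensed distances
        tree = linkage(condensed, method="average")       # UPGMA clustering
        print(np.round(sim, 2))
        print(fcluster(tree, t=2, criterion="maxclust"))  # two major groupings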

  10. Topological situational analysis and synthesis of strategies of object management in the conditions of conflict, uncertainty of behaviour and variable amount of the observed objects

    Directory of Open Access Journals (Sweden)

    Віктор Володимирович Семко

    2016-09-01

    The conflict of interacting objects in the observation space is considered as an integral phenomenon with a variety of types of connections between its elements, objects, systems and environment, brought together into a single theoretical conception that comprehensively and deeply determines the real features of the object of research. The methodology of system-structural analysis of the conflict is used to study the phenomenon as a whole, and system-functional analysis is used to determine all of its basic interconnections with the environment

  11. Feasibility study for objective oriented design of system thermal hydraulic analysis program

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes, such as RELAP5, TRAC, CATHARE, etc., have been developed based on the Fortran language during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. In this work, object-oriented programming for a system safety analysis code has been attempted utilizing a modernized C language. The analysis, design, implementation and verification steps for OOP system code development are described with some implementation examples. The system code SYSTF, based on a three-fluid thermal-hydraulic solver, has been developed using an OOP design. The verification of feasibility is performed with simple fundamental problems and plant models. (author)

  12. Some remarks on variational and quasi-variational inequalities of monotone operators

    International Nuclear Information System (INIS)

    Siddiqi, A.H.

    1990-08-01

    In this paper we study a fairly general class of variational and quasi-variational inequality problems which represent some important physical phenomena. Several well-known results concerning variational inequalities are special cases of our results. Existence, uniqueness and numerical analysis of this problem have been studied. (author). 39 refs
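
    For orientation, the standard abstract form of such problems can be written as follows (a textbook formulation, assumed here rather than quoted from the report): given an operator A, data f and a closed convex set K, the variational inequality is

        u \in K:\qquad \langle A(u) - f,\; v - u \rangle \;\ge\; 0 \qquad \text{for all } v \in K,

    while the quasi-variational inequality lets the constraint set depend on the solution itself:

        u \in K(u):\qquad \langle A(u) - f,\; v - u \rangle \;\ge\; 0 \qquad \text{for all } v \in K(u).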

  13. Integrative analysis of RNA, translation, and protein levels reveals distinct regulatory variation across humans.

    Science.gov (United States)

    Cenik, Can; Cenik, Elif Sarinay; Byeon, Gun W; Grubert, Fabian; Candille, Sophie I; Spacek, Damek; Alsallakh, Bilal; Tilgner, Hagen; Araya, Carlos L; Tang, Hua; Ricci, Emiliano; Snyder, Michael P

    2015-11-01

    Elucidating the consequences of genetic differences between humans is essential for understanding phenotypic diversity and personalized medicine. Although variation in RNA levels, transcription factor binding, and chromatin have been explored, little is known about global variation in translation and its genetic determinants. We used ribosome profiling, RNA sequencing, and mass spectrometry to perform an integrated analysis in lymphoblastoid cell lines from a diverse group of individuals. We find significant differences in RNA, translation, and protein levels suggesting diverse mechanisms of personalized gene expression control. Combined analysis of RNA expression and ribosome occupancy improves the identification of individual protein level differences. Finally, we identify genetic differences that specifically modulate ribosome occupancy--many of these differences lie close to start codons and upstream ORFs. Our results reveal a new level of gene expression variation among humans and indicate that genetic variants can cause changes in protein levels through effects on translation. © 2015 Cenik et al.; Published by Cold Spring Harbor Laboratory Press.

  14. Art, historical and cultural heritage objects studied with different non-destructive analysis

    International Nuclear Information System (INIS)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M.

    2012-01-01

    Full text: Since 2003, the analysis of art, historical and cultural heritage objects has been performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated for better characterization of the objects, and the examinations were expanded to other non-destructive analytical techniques such as portable X-ray fluorescence (XRF), digitalized radiography, high-resolution photography with visible and UV (ultraviolet) light, and reflectography in the infrared region. These non-destructive analytical techniques, systematically applied to the objects, are helping to better understand them and allow studying their main components, their conservation status and the creative process of the artist; in easel paintings in particular they allow making new discoveries. The setup of the external beam in the LAMFI laboratory is configured to allow different simultaneous analyses by PIXE / PIGE (Particle Induced X-ray Emission / Particle Induced Gamma-ray Emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence) and to expand the archaeometric results using ion beams. PIXE and XRF analyses are important to characterize the elements present in the objects, pigments and other materials. The digitized radiography has provided important information about the internal structure of the objects, the manufacturing process and internal particles; in the case of easel paintings it can reveal features of the artist's creative process, showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  16. Extending Track Analysis from Animals in the Lab to Moving Objects Anywhere

    NARCIS (Netherlands)

    Dommelen, W. van; Laar, P.J.L.J. van de; Noldus, L.P.J.J.

    2013-01-01

    In this chapter we compare two application domains in which the tracking of objects and the analysis of their movements are core activities, viz. animal tracking and vessel tracking. More specifically, we investigate whether EthoVision XT, a research tool for video tracking and analysis of the

  17. Explicit area-based accuracy assessment for mangrove tree crown delineation using Geographic Object-Based Image Analysis (GEOBIA)

    Science.gov (United States)

    Kamal, Muhammad; Johansen, Kasper

    2017-10-01

    Effective mangrove management requires spatially explicit information in the form of a mangrove tree crown map as a basis for ecosystem diversity studies and health assessment. Accuracy assessment is an integral part of any mapping activity to measure the effectiveness of the classification approach. In geographic object-based image analysis (GEOBIA) the assessment of the geometric accuracy (shape, symmetry and location) of the image objects created by image segmentation is required. In this study we used an explicit area-based accuracy assessment to measure the degree of similarity between the classification results and reference data from different aspects, including overall quality (OQ), user's accuracy (UA), producer's accuracy (PA) and overall accuracy (OA). We developed a rule set to delineate the mangrove tree crowns using a WorldView-2 pan-sharpened image. The reference map was obtained by visual delineation of the mangrove tree crown boundaries from a very high-spatial-resolution aerial photograph (7.5 cm pixel size). Ten random points with a 10 m radius circular buffer were created to calculate the area-based accuracy assessment. The resulting circular polygons were used to clip both the classified image objects and the reference map for area comparisons. In this case, the area-based accuracy assessment resulted in 64% and 68% for the OQ and OA, respectively. The overall quality reflects the class-related area accuracy: the area correctly classified as tree crowns was 64% of the total tree crown area. On the other hand, the overall accuracy of 68% was calculated as the percentage of all correctly classified classes (tree crowns and canopy gaps) relative to the total class area (the entire image). Overall, the area-based accuracy assessment was simple to implement and easy to interpret. It also shows explicitly the omission and commission error variations of object boundary delineation with colour-coded polygons.
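
    A minimal sketch of the area-based measures described above, computed from boolean tree-crown masks clipped to an assessment buffer. One common set of definitions is assumed here (OQ as intersection over union, UA and PA as intersection over the classified and reference areas, OA over both classes); the paper's exact formulation may differ.

        import numpy as np

        def area_based_accuracy(classified, reference):
            """classified, reference: boolean arrays where True marks a tree-crown pixel."""
            inter = np.logical_and(classified, reference).sum()
            union = np.logical_or(classified, reference).sum()
            oq = inter / union                                    # overall quality
            ua = inter / classified.sum()                         # user's accuracy
            pa = inter / reference.sum()                          # producer's accuracy
            oa = (classified == reference).mean()                 # overall accuracy (crowns + gaps)
            return oq, ua, pa, oa

        rng = np.random.default_rng(3)
        ref = rng.random((100, 100)) > 0.5                        # toy reference crown mask
        cls = np.logical_xor(ref, rng.random((100, 100)) > 0.85)  # classification with ~15% disagreement
        print([round(v, 2) for v in area_based_accuracy(cls, ref)])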

  18. Joint Tensor Feature Analysis For Visual Object Recognition.

    Science.gov (United States)

    Wong, Wai Keung; Lai, Zhihui; Xu, Yong; Wen, Jiajun; Ho, Chu Po

    2015-11-01

    Tensor-based object recognition has been widely studied in the past several years. This paper focuses on the issue of joint feature selection from the tensor data and proposes a novel method called joint tensor feature analysis (JTFA) for tensor feature extraction and recognition. In order to obtain a set of jointly sparse projections for tensor feature extraction, we define the modified within-class tensor scatter value and the modified between-class tensor scatter value for regression. The k-mode optimization technique and the L(2,1)-norm jointly sparse regression are combined together to compute the optimal solutions. The convergence analysis, computational complexity analysis and the essence of the proposed method/model are also presented. It is interesting that the proposed method is very similar to singular value decomposition of the scatter matrix but with a sparsity constraint on the right singular value matrix, or to eigen-decomposition of the scatter matrix performed in a sparse manner. Experimental results on some tensor datasets indicate that JTFA outperforms some well-known tensor feature extraction and selection algorithms.

  19. Role of regression analysis and variation of rheological data in calculation of pressure drop for sludge pipelines.

    Science.gov (United States)

    Farno, E; Coventry, K; Slatter, P; Eshtiaghi, N

    2018-06-15

    Sludge pumps in wastewater treatment plants are often oversized due to uncertainty in the calculation of pressure drop. This issue costs industry millions of dollars to purchase and operate the oversized pumps. Besides costs, the higher electricity consumption is associated with extra CO2 emissions, which create substantial environmental impacts. Calculation of pressure drop via current pipe flow theory requires model estimation of flow curve data, which depends on regression analysis and also varies with the natural variation of rheological data. This study investigates the impact of variation of rheological data and of regression analysis on the variation of pressure drop calculated via current pipe flow theories. Results compare the variation of the calculated pressure drop between different models and regression methods and comment on the suitability of each method. Copyright © 2018 Elsevier Ltd. All rights reserved.
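
    To illustrate how the regression step feeds the pressure-drop calculation, here is a sketch using the power-law (Ostwald-de Waele) model and its standard laminar pipe-flow relation; the study compares several rheological models and regression methods, so this is only one possible choice, shown with made-up flow-curve data.

        import numpy as np

        # Illustrative flow-curve data for a sludge: shear rate (1/s) vs shear stress (Pa)
        shear_rate = np.array([5.0, 10.0, 20.0, 50.0, 100.0, 200.0])
        shear_stress = np.array([12.0, 16.0, 22.0, 34.0, 46.0, 63.0])

        # Regression in log-log space gives the power-law parameters tau = K * gamma_dot**n
        n_index, log_k = np.polyfit(np.log(shear_rate), np.log(shear_stress), 1)
        K = np.exp(log_k)

        def pressure_drop_power_law(flow_rate, diameter, length, K, n):
            """Laminar pipe-flow pressure drop for a power-law fluid:
            tau_w = K * ((3n+1)/(4n) * 8V/D)**n,  dP = 4 * tau_w * L / D."""
            velocity = flow_rate / (np.pi * diameter ** 2 / 4.0)
            tau_w = K * ((3 * n + 1) / (4 * n) * 8 * velocity / diameter) ** n
            return 4.0 * tau_w * length / diameter

        dp = pressure_drop_power_law(flow_rate=0.005, diameter=0.1, length=100.0, K=K, n=n_index)
        print(f"K = {K:.2f} Pa.s^n, n = {n_index:.2f}, pressure drop over 100 m ~ {dp / 1000:.0f} kPa")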

  20. Identifying the factors influencing practice variation in thrombosis medicine: A qualitative content analysis of published practice-pattern surveys.

    Science.gov (United States)

    Skeith, Leslie; Gonsalves, Carol

    2017-11-01

    Practice variation, the differences in clinical management between physicians, is one reason why patient outcomes may differ. Identifying factors that contribute to practice variation in areas of clinical uncertainty or equipoise may have implications for understanding and improving patient care. To discern what factors may influence practice variation, we completed a qualitative content analysis of all practice-pattern surveys in thrombosis medicine in the last 10years. Out of 2117 articles screened using a systematic search strategy, 33 practice-pattern surveys met eligibility criteria. Themes were identified using constant comparative analysis of qualitative data. Practice variation was noted in all 33 practice-pattern surveys. Contributing factors to variation included lack of available evidence, lack of clear and specific guideline recommendations, past experience, patient context, institutional culture and the perceived risk and benefit of a particular treatment. Additional themes highlight the value placed on expertise in challenging clinical scenarios, the complexity of practice variation and the value placed on minimizing practice variation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  2. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1986-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are presented. (author)

  3. A functional analysis of photo-object matching skills of severely retarded adolescents.

    Science.gov (United States)

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photos and the objects. Only one student demonstrated photo-object matching. The results of the four students who failed to demonstrate photo-object matching suggested that physical properties of photos (flat, rectangular) and depth dimensions of objects may exert more control over matching than the similarities of the objects and images within the photos. An analysis of figure-ground variables was conducted to provide an empirical basis for program development in the use of pictures. In one series of tests, rectangular shape and background were removed by cutting out the figures in the photos. The edge shape of the photo and the edge shape of the image were then identical. The results suggest that photo-object matching may be facilitated by using cut-out figures rather than the complete rectangular photo.

  4. Optimized variational analysis scheme of single Doppler radar wind data

    Science.gov (United States)

    Sasaki, Yoshi K.; Allen, Steve; Mizuno, Koki; Whitehead, Victor; Wilk, Kenneth E.

    1989-01-01

    A computer scheme for extracting singularities has been developed and applied to single Doppler radar wind data. The scheme is planned for use in real-time wind and singularity analysis and forecasting. The method, known as Doppler Operational Variational Extraction of Singularities is outlined, focusing on the principle of local symmetry. Results are presented from the application of the scheme to a storm-generated gust front in Oklahoma on May 28, 1987.

  5. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  6. Local Analysis Approach for Short Wavelength Geopotential Variations

    Science.gov (United States)

    Bender, P. L.

    2009-12-01

    The value of global spherical harmonic analyses for determining 15 day to 30 day changes in the Earth's gravity field has been demonstrated extensively using data from the GRACE mission and previous missions. However, additional useful information appears to be obtainable from local analyses of the data. A number of such analyses have been carried out by various groups. In the energy approximation, the changes in the height of the satellite altitude geopotential can be determined from the post-fit changes in the satellite separation during individual one-revolution arcs of data from a GRACE-type pair of satellites in a given orbit. For a particular region, it is assumed that short wavelength spatial variations for the arcs crossing that region during a time T of interest would be used to determine corrections to the spherical harmonic results. The main issue in considering higher measurement accuracy in future missions is how much improvement in spatial resolution can be achieved. For this, the shortest wavelengths that can be determined are the most important. And, while the longer wavelength variations are affected by mass distribution changes over much of the globe, the shorter wavelength ones hopefully will be determined mainly by more local changes in the mass distribution. Future missions are expected to have much higher accuracy for measuring changes in the satellite separation than GRACE. However, how large an improvement in the derived results in hydrology will be achieved is still very much a matter of study, particularly because of the effects of uncertainty in the time variations in the atmospheric and oceanic mass distributions. To be specific, it will be assumed that improving the spatial resolution in continental regions away from the coastlines is the objective, and that the satellite altitude is in the range of roughly 290 to 360 km made possible for long missions by drag-free operation. The advantages of putting together the short wavelength

  7. Quantitative measurement of phase variation amplitude of ultrasonic diffraction grating based on diffraction spectral analysis

    Energy Technology Data Exchange (ETDEWEB)

    Pan, Meiyan, E-mail: yphantomohive@gmail.com; Zeng, Yingzhi; Huang, Zuohua, E-mail: zuohuah@163.com [Laboratory of Quantum Engineering and Quantum Materials, School of Physics and Telecommunication Engineering, South China Normal University, Guangzhou, Guangdong 510006 (China)

    2014-09-15

    A new method based on diffraction spectral analysis is proposed for the quantitative measurement of the phase variation amplitude of an ultrasonic diffraction grating. For a traveling wave, the phase variation amplitude of the grating depends on the intensity of the zeroth- and first-order diffraction waves. By contrast, for a standing wave, this amplitude depends on the intensity of the zeroth-, first-, and second-order diffraction waves. The proposed method is verified experimentally. The measured phase variation amplitude ranges from 0 to 2π, with a relative error of approximately 5%. A nearly linear relation exists between the phase variation amplitude and driving voltage. Our proposed method can also be applied to ordinary sinusoidal phase grating.
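
    A sketch of the kind of inversion involved, assuming the Raman-Nath regime in which the m-th diffraction order of a sinusoidal phase grating has intensity proportional to the square of the Bessel function J_m of the phase variation amplitude; this standard relation is an assumption here, and the paper's own expressions for travelling and standing waves are not reproduced.

        import numpy as np
        from scipy.special import j0, j1
        from scipy.optimize import brentq

        def phase_amplitude_from_orders(i1_over_i0):
            """Recover the phase variation amplitude phi from the first/zeroth order intensity
            ratio, using I1/I0 = J1(phi)^2 / J0(phi)^2 (monotonic below the first zero of J0
            at ~2.405 rad; larger amplitudes need higher orders)."""
            f = lambda phi: (j1(phi) / j0(phi)) ** 2 - i1_over_i0
            return brentq(f, 1e-6, 2.3)

        phi_true = 1.2                                        # radians, synthetic test value
        ratio = (j1(phi_true) / j0(phi_true)) ** 2
        print(round(phase_amplitude_from_orders(ratio), 3))   # ~1.2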

  8. Feasibility analysis of CNP 1000 computerized I and C system design objectives

    International Nuclear Information System (INIS)

    Zhang Mingguang; Xu Jijun; Zhang Qinshen

    2000-01-01

    The author states the design objectives of the computerized I and C (CIC) system and advanced main control room (AMCR), which could and should be achieved in CNP 1000, based on the national 1E computer production technology including software and hardware, and current instrumentation and control design technique of nuclear power plant. The feasibility analysis on the design objectives and the reasons or necessity to do the design research projects have been described. The objectives of design research on CIC and AMCR as well as the self-design proficiency after the design research have been given

  9. A cluster analysis of patterns of objectively measured physical activity in Hong Kong.

    Science.gov (United States)

    Lee, Paul H; Yu, Ying-Ying; McDowell, Ian; Leung, Gabriel M; Lam, T H

    2013-08-01

    The health benefits of exercise are clear. In targeting interventions it would be valuable to know whether characteristic patterns of physical activity (PA) are associated with particular population subgroups. The present study used cluster analysis to identify characteristic hourly PA patterns measured by accelerometer. Cross-sectional design. Objectively measured PA in Hong Kong adults. Four-day accelerometer data were collected during 2009 to 2011 for 1714 participants in Hong Kong (mean age 44.2 years, 45.9% male). Two clusters were identified, one more active than the other. The ‘active cluster' (n = 480) was characterized by a routine PA pattern on weekdays and a more active and varied pattern on weekends; the other, the ‘less active cluster' (n = 1234), by a consistently low PA pattern on both weekdays and weekends with little variation from day to day. Demographic, lifestyle, PA level and health characteristics of the two clusters were compared. They differed in age, sex, smoking, income and level of PA required at work. The odds of having any chronic health conditions were lower for the active group (adjusted OR = 0.62; 95% CI 0.46, 0.84) but the two groups did not differ in terms of specific chronic health conditions or obesity. Implications are drawn for targeting exercise promotion programmes at the population level.
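
    A minimal sketch of the clustering step, grouping hourly activity profiles with k-means as one common choice; the profiles are synthetic and the study's accelerometer processing and exact clustering procedure are not reproduced.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(4)
        hours = np.arange(24)

        # Synthetic hourly activity counts: an "active" group with an evening peak
        # and a "less active" group with a flat, low profile.
        active = 200 + 150 * np.exp(-((hours - 18) ** 2) / 8) + rng.normal(0, 20, (300, 24))
        less_active = 80 + rng.normal(0, 15, (900, 24))
        profiles = np.vstack([active, less_active])

        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(profiles)
        for k in range(2):
            print(f"cluster {k}: n = {(labels == k).sum()}, mean counts = {profiles[labels == k].mean():.0f}")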

  10. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data to improve efficiency and assist, rather than replace the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of and further detection of structures e.g. as limited to specific orientations.

  11. SU-E-T-139: Automated Daily EPID Exit Dose Analysis Uncovers Treatment Variations

    Energy Technology Data Exchange (ETDEWEB)

    Olch, A [University of Southern California, Los Angeles, CA (United States)

    2015-06-15

    Purpose: To evaluate a fully automated EPID exit dose system for its ability to detect daily treatment deviations including patient setup, delivery, and anatomy changes. Methods: PerFRACTION (Sun Nuclear Corporation) is software that uses integrated EPID images taken during patient treatment, automatically pulled from the Aria database, and analyzed based on user-defined comparisons. It was used to monitor 20 plans consisting of a total of 859 fields for 18 patients, for a total of 251 fractions. Nine VMAT, 5 IMRT, and 6 3D plans were monitored. A Gamma analysis was performed for each field within a plan, comparing the first fraction against each of the other fractions in each treatment course. A 2% dose difference, 1 mm distance-to-agreement, and 10% dose threshold were used. These tight tolerances were chosen to achieve a high sensitivity to treatment variations. A field passed if 93% of the pixels had a Gamma of 1 or less. Results: Twenty-nine percent of the fields failed. The average plan passing rate was 92.5%. The average passing rate for 3D plans (84%) was lower than for VMAT or IMRT plans (average 96.2%). When fields failed, investigation revealed changes in patient anatomy or setup variations, often also leading to variations of transmission through immobilization devices. Conclusion: PerFRACTION is a fully automated system for determining daily changes in dose transmission through the patient that requires no effort other than for the imager panel to be deployed during treatment. A surprising number of fields failed the analysis, and the failures can be attributed to important treatment variations that would otherwise not be appreciated. Further study of inter-fraction treatment variations is possible and warranted. Sun Nuclear Corporation provided a license to the software described.
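
    A brute-force sketch of the gamma comparison described above (2% dose difference, 1 mm distance-to-agreement, 10% dose threshold, 93% pass criterion) is given below. PerFRACTION's actual implementation is proprietary and certainly more sophisticated, so this is only an illustration of the metric, not of the product.

      import numpy as np

      def gamma_pass_rate(ref, evl, pixel_mm=1.0, dose_diff=0.02, dta_mm=1.0, threshold=0.10):
          """Brute-force global gamma comparison of two EPID images (2D arrays).
          Returns the fraction of above-threshold pixels with gamma <= 1."""
          norm = ref.max()
          search = int(np.ceil(2 * dta_mm / pixel_mm))          # local search window half-width
          passed = total = 0
          for y in range(ref.shape[0]):
              for x in range(ref.shape[1]):
                  if ref[y, x] < threshold * norm:              # below the 10% dose threshold
                      continue
                  y0, y1 = max(0, y - search), min(ref.shape[0], y + search + 1)
                  x0, x1 = max(0, x - search), min(ref.shape[1], x + search + 1)
                  dy, dx = np.indices((y1 - y0, x1 - x0))
                  dist2 = ((dy + y0 - y) * pixel_mm) ** 2 + ((dx + x0 - x) * pixel_mm) ** 2
                  dose2 = ((evl[y0:y1, x0:x1] - ref[y, x]) / (dose_diff * norm)) ** 2
                  passed += np.sqrt(dist2 / dta_mm ** 2 + dose2).min() <= 1.0
                  total += 1
          return passed / max(total, 1)

      # A field would "pass" if at least 93% of pixels agree within 2%/1 mm, e.g.
      # field_ok = gamma_pass_rate(first_fraction_img, later_fraction_img) >= 0.93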

  12. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1987-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results are compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are also presented. (author) 3 refs.; 4 tabs

  13. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  14. Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines

    Directory of Open Access Journals (Sweden)

    Jeffrey Tuck

    2013-12-01

    Full Text Available Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost effective means of locating faults such as leaks, blockages and pipeline wall degradation within a pipeline while the system remains fully operational. The only requirement is that high speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptibility of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness will affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall conditions within the laboratory system an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method of characteristics transient model to the experimental time domain pressure responses. The accuracy of the detection technique is evaluated and benefits associated with various simplifying assumptions and simulation run times are investigated. It is found that for the case investigated, changes in the wavespeed and nominal diameter of the pipeline are both important

  15. Inverse Transient Analysis for Classification of Wall Thickness Variations in Pipelines

    Science.gov (United States)

    Tuck, Jeffrey; Lee, Pedro

    2013-01-01

    Analysis of transient fluid pressure signals has been investigated as an alternative method of fault detection in pipeline systems and has shown promise in both laboratory and field trials. The advantage of the method is that it can potentially provide a fast and cost effective means of locating faults such as leaks, blockages and pipeline wall degradation within a pipeline while the system remains fully operational. The only requirement is that high speed pressure sensors are placed in contact with the fluid. Further development of the method requires detailed numerical models and enhanced understanding of transient flow within a pipeline where variations in pipeline condition and geometry occur. One such variation commonly encountered is the degradation or thinning of pipe walls, which can increase the susceptibility of a pipeline to leak development. This paper aims to improve transient-based fault detection methods by investigating how changes in pipe wall thickness will affect the transient behaviour of a system; this is done through the analysis of laboratory experiments. The laboratory experiments are carried out on a stainless steel pipeline of constant outside diameter, into which a pipe section of variable wall thickness is inserted. In order to detect the location and severity of these changes in wall conditions within the laboratory system an inverse transient analysis procedure is employed which considers independent variations in wavespeed and diameter. Inverse transient analyses are carried out using a genetic algorithm optimisation routine to match the response from a one-dimensional method of characteristics transient model to the experimental time domain pressure responses. The accuracy of the detection technique is evaluated and benefits associated with various simplifying assumptions and simulation run times are investigated. It is found that for the case investigated, changes in the wavespeed and nominal diameter of the pipeline are both important
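
    The inverse fitting loop can be sketched as follows. The paper couples a genetic algorithm with a one-dimensional method-of-characteristics model; here a toy forward model and SciPy's differential evolution stand in for both, so every function, bound and number below is an assumption used only to show the structure of the analysis.

      import numpy as np
      from scipy.optimize import differential_evolution

      def simulate_pressure(wavespeed, diameter, t, length=37.5):
          """Toy stand-in for the 1D method-of-characteristics transient model: a damped
          oscillation whose period follows the pipeline reflection time and whose
          amplitude scales inversely with the pipe diameter."""
          period = 4.0 * length / wavespeed                  # fundamental reflection period
          return (0.02 / diameter) * np.exp(-t / 2.0) * np.cos(2 * np.pi * t / period)

      def misfit(params, t, measured):
          wavespeed, diameter = params
          return np.sum((simulate_pressure(wavespeed, diameter, t) - measured) ** 2)

      t = np.linspace(0.0, 5.0, 2000)
      measured = simulate_pressure(1200.0, 0.020, t)         # synthetic "measured" trace

      # Hypothetical search bounds for the degraded section: wavespeed (m/s), diameter (m)
      bounds = [(900.0, 1400.0), (0.015, 0.025)]
      result = differential_evolution(misfit, bounds, args=(t, measured), seed=0)
      print(result.x)                                        # recovers roughly (1200.0, 0.020)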

  16. Is there much variation in variation? Revisiting statistics of small area variation in health services research

    Directory of Open Access Journals (Sweden)

    Ibáñez Berta

    2009-04-01

    Full Text Available Background: The importance of small area variation analysis for policy-making contrasts with the scarcity of work on the validity of the statistics used in these studies. Our study aims at (1) determining whether variation in utilization rates between health areas is higher than would be expected by chance, (2) estimating the statistical power of the variation statistics, and (3) evaluating the ability of different statistics to compare the variability among different procedures regardless of their rates. Methods: Parametric bootstrap techniques were used to derive the empirical distribution of each statistic under the hypothesis of homogeneity across areas. Non-parametric procedures were used to analyze the empirical distribution of the observed statistics and to compare the results in six situations (low/medium/high utilization rates and low/high variability). A small-scale simulation study was conducted to assess the capacity of each statistic to discriminate between scenarios with different degrees of variation. Results: Bootstrap techniques proved to be good at quantifying the difference between the null hypothesis and the variation observed in each situation, and at constructing reliable tests and confidence intervals for each of the variation statistics analyzed. Despite the good performance of the Systematic Component of Variation (SCV), the Empirical Bayes (EB) statistic shows better behaviour under the null hypothesis: it is able to detect variability if present, it is not influenced by the procedure rate, and it is best able to discriminate between different degrees of heterogeneity. Conclusion: The EB statistic seems to be a good alternative to more conventional statistics used in small-area variation analysis in health services research because of its robustness.
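
    A minimal sketch of the parametric bootstrap under homogeneity is shown below: counts are resampled as Poisson draws around the expected values and the variation statistic is recomputed to build its null distribution. The SCV formula used is the common McPherson form and is an assumption about the exact estimator evaluated by the authors; the data are invented.

      import numpy as np

      def scv(observed, expected):
          """Systematic component of variation (McPherson form): mean squared relative
          deviation of observed from expected counts, minus the expected Poisson noise."""
          return np.mean(((observed - expected) / expected) ** 2 - 1.0 / expected)

      def bootstrap_null(expected, statistic=scv, n_boot=10000, seed=0):
          """Empirical null distribution of a variation statistic under homogeneity:
          area counts are drawn independently as Poisson(expected)."""
          rng = np.random.default_rng(seed)
          sims = rng.poisson(expected, size=(n_boot, expected.size))
          return np.array([statistic(sim, expected) for sim in sims])

      # Hypothetical data: expected counts for 30 health areas and observed counts with
      # genuine between-area heterogeneity added on top of Poisson noise.
      rng = np.random.default_rng(1)
      expected = np.full(30, 120.0)
      observed = rng.poisson(expected * rng.lognormal(0.0, 0.2, size=30))

      null = bootstrap_null(expected)
      print("SCV =", scv(observed, expected),
            " p =", np.mean(null >= scv(observed, expected)))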

  17. Scandinavian Object Shift and Optimality Theory

    DEFF Research Database (Denmark)

    Engels, Eva; Vikner, Sten

    This study presents an account of object shift, a word order phenomenon found in most of the Scandinavian languages where an object occurs unexpectedly to the left, and not to the right, of a sentential adverbial. The book examines object shift across many of the Scandinavian languages and dialects, and analyses the variation, for example whether object shift is optional or obligatory, whether it applies only to pronouns or to other objects as well, and whether it applies to adverbials. The authors show that optimality theory, traditionally used in phonology, is a useful framework for accounting for this variation. With its many original observations, this book is an important addition to the fields of phonology, optimality theory and theoretical syntax.

  18. AFLP and MS-AFLP analysis of the variation within saffron crocus (Crocus sativus L.) germplasm.

    Directory of Open Access Journals (Sweden)

    Matteo Busconi

    Full Text Available The presence and extent of genetic variation in saffron crocus are still debated, as testified by several contradictory articles providing contrasting results about the monomorphism or otherwise of the species. Remarkably, phenotypic variations have frequently been observed in the field; such variations are usually unstable and can change from one growing season to another. Considering that gene expression can be influenced by both genetic and epigenetic changes, epigenetics could be a plausible cause of the alternative phenotypes. In order to obtain new insights into this issue, we carried out a molecular marker analysis of 112 accessions from the World Saffron and Crocus Collection. The accessions were grown for at least three years under the same open-field conditions. The same samples were analysed using Amplified Fragment Length Polymorphism (AFLP) and Methyl-Sensitive AFLP (MS-AFLP) in order to search for variation at the genetic (DNA sequence) and epigenetic (cytosine methylation) levels. While the genetic variability was low (4.23% polymorphic peaks and twelve (12) effective different genotypes), the methyl-sensitive analysis showed the presence of high epigenetic variability (33.57% polymorphic peaks and twenty-eight (28) different effective epigenotypes). The pattern obtained by Factorial Correspondence Analysis of the AFLP and, in particular, of the MS-AFLP data was consistent with the geographical provenance of the accessions. Very interestingly, by focusing on the Spanish accessions, it was observed that the distribution of the accessions in the Factorial Correspondence Analysis is not random but tends to reflect the geographical origin. Two clearly defined clusters, grouping accessions from the West (Toledo and Ciudad Real) and accessions from the East (Cuenca and Teruel), were clearly recognised.

  19. Analysis of Geomagnetic Field Variations during Total Solar Eclipses Using INTERMAGNET Data

    Science.gov (United States)

    KIM, J. H.; Chang, H. Y.

    2017-12-01

    We investigate variations of the geomagnetic field observed by INTERMAGNET geomagnetic observatories over which the totality path passed during a solar eclipse. We compare results acquired by 6 geomagnetic observatories during 4 total solar eclipses (11 August 1999, 1 August 2008, 11 July 2010, and 20 March 2015) in terms of geomagnetic and solar ecliptic parameters. These are the only total solar eclipses during which the umbra of the Moon swept over an INTERMAGNET geomagnetic observatory while variations of the geomagnetic field were simultaneously recorded. We confirm previous findings that an increase in BY and decreases in BX, BZ and F are conspicuous. Interestingly, we note that the variations of the geomagnetic field components observed during the total solar eclipse at Isla de Pascua Mataveri (Easter Island) in Chile (IPM), in the southern hemisphere, show on the contrary a distinct decrease in BY and increases in BX and BZ. We find, however, that the variations of BX, BY, BZ and F observed at Hornsund in Norway (HRN) seem to be dominated by other geomagnetic activity. In addition, we have employed the wavelet analysis technique to search for signatures of the eclipse in the temporal behavior of the geomagnetic field signal. Finally, we conclude by pointing out that, despite this apparent success, a more sophisticated and reliable algorithm is required before quantitative comparisons can be made.

  20. Variation in payments for spine surgery episodes of care: implications for episode-based bundled payment.

    Science.gov (United States)

    Kahn, Elyne N; Ellimoottil, Chandy; Dupree, James M; Park, Paul; Ryan, Andrew M

    2018-05-25

    OBJECTIVE Spine surgery is expensive and marked by high variation across regions and providers. Bundled payments have the potential to reduce unwarranted spending associated with spine surgery. This study is a cross-sectional analysis of commercial and Medicare claims data from January 2012 through March 2015 in the state of Michigan. The objective was to quantify variation in payments for spine surgery in adult patients, document sources of variation, and determine the influence of patient-level, surgeon-level, and hospital-level factors. METHODS Hierarchical regression models were used to analyze contributions of patient-level covariates and the influence of individual surgeons and hospitals. The primary outcome was price-standardized 90-day episode payments. Intraclass correlation coefficients (measures of variability accounted for by each level of a hierarchical model) were used to quantify sources of spending variation. RESULTS The authors analyzed 17,436 spine surgery episodes performed by 195 surgeons at 50 hospitals. Mean price-standardized 90-day episode payments in the highest spending quintile exceeded mean payments for episodes in the lowest cost quintile by $42,953 (p < 0.001). After accounting for patient-level covariates, the remaining hospital-level and surgeon-level effects accounted for 2.0% (95% CI 1.1%-3.8%) and 4.0% (95% CI 2.9%-5.6%) of total variation, respectively. CONCLUSIONS Significant variation exists in total episode payments for spine surgery, driven mostly by variation in post-discharge and facility payments. Hospital and surgeon effects account for relatively little of the observed variation.

  1. NDVI-Based analysis on the influence of human activities on vegetation variation on Hainan Island

    Science.gov (United States)

    Luo, Hongxia; Dai, Shengpei; Xie, Zhenghui; Fang, Jihua

    2018-02-01

    Using the Moderate Resolution Imaging Spectroradiometer normalized difference vegetation index (NDVI) dataset, we analyzed the variation of predicted NDVI values and the influence of human activities on vegetation on Hainan Island during 2001-2015. We investigated the roles of human activities in vegetation variation, particularly after 2002, when the Grain-for-Green program was implemented on Hainan Island. Trend analysis, a linear regression model and residual analysis were used to analyze the data. The results of the study showed that (1) the predicted vegetation on Hainan Island showed a general upward trend, with a linear growth rate of 0.0025/10 yr (p ... human activities. (3) In general, human activities played a positive role in the vegetation increase on Hainan Island, and the residual NDVI trend of this region showed positive outcomes for vegetation variation after implementing ecological engineering projects. However, it also indicated a growing risk of vegetation degradation in the coastal region of Hainan Island as a result of rapid urbanization and land reclamation.
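
    The residual-analysis step can be sketched as follows: NDVI is regressed on climate predictors, and a trend remaining in the residuals is read as the non-climatic (human) contribution. The predictor names, the use of ordinary least squares and the synthetic numbers are all assumptions made for illustration only.

      import numpy as np

      # Hypothetical annual series for one pixel, 2001-2015
      rng = np.random.default_rng(0)
      years = np.arange(2001, 2016)
      temperature = 24.0 + 0.3 * rng.standard_normal(years.size)
      rainfall = 1800.0 + 200.0 * rng.standard_normal(years.size)
      ndvi = 0.55 + 0.004 * (years - 2001) + 0.01 * rng.standard_normal(years.size)

      # Step 1: regress NDVI on the climate predictors (ordinary least squares).
      X = np.column_stack([np.ones(years.size), temperature, rainfall])
      coef, *_ = np.linalg.lstsq(X, ndvi, rcond=None)
      ndvi_climate = X @ coef

      # Step 2: a trend in the residuals is attributed to non-climatic (human)
      # influence, e.g. ecological engineering such as Grain-for-Green.
      residual = ndvi - ndvi_climate
      human_trend = np.polyfit(years, residual, 1)[0]
      print("residual NDVI trend per year:", human_trend)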

  2. Objective Voice Parameters in Colombian School Workers with Healthy Voices

    Directory of Open Access Journals (Sweden)

    Lady Catherine Cantor Cutiva

    2015-09-01

    Full Text Available Objectives: To characterize the objective voice parameters among school workers, and to identify associated factors of three objective voice parameters, namely fundamental frequency (F0), sound pressure level (SPL) and maximum phonation time (MPT). Materials and methods: We conducted a cross-sectional study among 116 Colombian teachers and 20 Colombian non-teachers. After signing the informed consent form, participants filled out a questionnaire. Then, a voice sample was recorded and evaluated perceptually by a speech therapist and by objective voice analysis with Praat software. Short-term environmental measurements of sound level, temperature, humidity, and reverberation time were conducted during visits at the workplaces, such as classrooms and offices. Linear regression analysis was used to determine associations between individual and work-related factors and the objective voice parameters. Results: Compared with men, women had higher fundamental frequency (201 Hz for teachers and 209 Hz for non-teachers vs. 120 Hz for teachers and 127 Hz for non-teachers) and sound pressure level (82 dB vs. 80 dB), and shorter maximum phonation time (around 14 seconds vs. around 16 seconds). Female teachers younger than 50 years of age evidenced a significant tendency to speak with lower fundamental frequency and shorter MPT compared with female teachers older than 50 years of age. Female teachers had significantly higher fundamental frequency (66 Hz), higher sound pressure level (2 dB) and shorter maximum phonation time (2 seconds) than male teachers. Conclusion: Female teachers younger than 50 years of age had significantly lower F0 and shorter MPT compared with those older than 50 years of age. The multivariate analysis showed that gender was a much more important determinant of variations in F0, SPL and MPT than age and teaching occupation. Objectively measured temperature also contributed to the changes in SPL among school workers.

  3. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Science.gov (United States)

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and non-linear approach (S-criterion). Both approaches generate almost the same input mixture, however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position one labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $ 120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective
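
    A minimal sketch of ranking candidate tracer mixtures by the linear D-criterion (the determinant of the Fisher information matrix formed from the sensitivities of the measured labelling to the free fluxes) is given below. The sensitivity matrices are random placeholders; in a real study they would come from the (13)C labelling model evaluated at each candidate input design.

      import numpy as np

      def d_criterion(sensitivity, meas_cov):
          """Linear D-optimality: det(J^T Sigma^-1 J), the determinant of the Fisher
          information built from the sensitivity J of measurements w.r.t. free fluxes."""
          info = sensitivity.T @ np.linalg.inv(meas_cov) @ sensitivity
          return np.linalg.det(info)

      rng = np.random.default_rng(0)
      n_meas, n_flux = 40, 8
      meas_cov = np.diag(np.full(n_meas, 0.02 ** 2))          # labelling measurement noise

      # Placeholder sensitivity matrices for two hypothetical tracer mixtures.
      designs = {
          "100% U-13C glucose": rng.normal(size=(n_meas, n_flux)),
          "80% 1,2-13C2 + 20% U-13C glucose": rng.normal(size=(n_meas, n_flux)),
      }

      ranked = sorted(designs, key=lambda k: d_criterion(designs[k], meas_cov), reverse=True)
      print("preferred design by D-criterion:", ranked[0])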

  4. SVAMP: Sequence variation analysis, maps and phylogeny

    KAUST Repository

    Naeem, Raeece

    2014-04-03

    Summary: SVAMP is a stand-alone desktop application to visualize genomic variants (in variant call format) in the context of geographical metadata. Users of SVAMP are able to generate phylogenetic trees and perform principal coordinate analysis in real time from variant call format (VCF) and associated metadata files. Allele frequency map, geographical map of isolates, Tajima's D metric, single nucleotide polymorphism density, GC and variation density are also available for visualization in real time. We demonstrate the utility of SVAMP in tracking a methicillin-resistant Staphylococcus aureus outbreak from published next-generation sequencing data across 15 countries. We also demonstrate the scalability and accuracy of our software on 245 Plasmodium falciparum malaria isolates from three continents. Availability and implementation: The Qt/C++ software code, binaries, user manual and example datasets are available at http://cbrc.kaust.edu.sa/svamp. © The Author 2014.

  5. SEGMENT OF FINANCIAL CORPORATIONS AS AN OBJECT OF FINANCIAL AND STATISTICAL ANALYSIS

    OpenAIRE

    Marat F. Mazitov

    2013-01-01

    The article is devoted to the study of specific features of the formation and change of the economic assets of financial corporations as an object of management and financial analysis. The author identifies the features and gives a classification of the institutional units belonging to the financial corporations sector from the viewpoint of assessment and financial analysis of the flows reflecting the change in their assets.

  6. Context based Coding of Binary Shapes by Object Boundary Straightness Analysis

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2004-01-01

    A new lossless compression scheme for bilevel images targeted at binary shapes of image and video objects is presented. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used in the context definition for arithmetic encoding. Tested on individual images of binary shapes and on binary layers of digital maps, the algorithm outperforms PWC, JBIG and MPEG-4 CAE. On the binary shapes the code lengths are reduced by 21%, 25%, and 42%, respectively. On the maps the reductions are 34%, 32%, and 59%, respectively. The algorithm is also...

  7. A novel quantitative approach for eliminating sample-to-sample variation using a hue saturation value analysis program.

    Science.gov (United States)

    Yabusaki, Katsumi; Faits, Tyler; McMullen, Eri; Figueiredo, Jose Luiz; Aikawa, Masanori; Aikawa, Elena

    2014-01-01

    As computing technology and image analysis techniques have advanced, the practice of histology has grown from a purely qualitative method to one that is highly quantified. Current image analysis software is imprecise and prone to wide variation due to common artifacts and histological limitations. In order to minimize the impact of these artifacts, a more robust method for quantitative image analysis is required. Here we present a novel image analysis software, based on the hue saturation value color space, to be applied to a wide variety of histological stains and tissue types. By using hue, saturation, and value variables instead of the more common red, green, and blue variables, our software offers some distinct advantages over other commercially available programs. We tested the program by analyzing several common histological stains, performed on tissue sections that ranged from 4 µm to 10 µm in thickness, using both a red green blue color space and a hue saturation value color space. We demonstrated that our new software is a simple method for quantitative analysis of histological sections, which is highly robust to variations in section thickness, sectioning artifacts, and stain quality, eliminating sample-to-sample variation.
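
    The idea of quantifying a stain in hue-saturation-value space rather than RGB can be sketched as follows; the hue window, saturation cutoff and background threshold below are illustrative values, not the settings used by the authors' software.

      import numpy as np
      from matplotlib.colors import rgb_to_hsv

      def stained_fraction(rgb, hue_range=(0.85, 1.0), min_saturation=0.2, min_value=0.15):
          """Fraction of tissue pixels whose hue falls inside the target window.
          rgb: float array in [0, 1] of shape (rows, cols, 3)."""
          hsv = rgb_to_hsv(rgb)
          hue, sat, val = hsv[..., 0], hsv[..., 1], hsv[..., 2]
          tissue = val > min_value                           # drop near-black background
          stained = (hue >= hue_range[0]) & (hue <= hue_range[1]) & (sat >= min_saturation)
          return np.sum(stained & tissue) / np.sum(tissue)

      # Toy example: a reddish stained square on a pale, unstained background
      img = np.full((100, 100, 3), 0.9)
      img[30:70, 30:70] = [0.7, 0.1, 0.15]                   # hue near 1.0, high saturation
      print(stained_fraction(img))                           # ~0.16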

  8. Some analysis on the diurnal variation of rainfall over the Atlantic Ocean

    Science.gov (United States)

    Gill, T.; Perng, S.; Hughes, A.

    1981-01-01

    Data collected from the GARP Atlantic Tropical Experiment (GATE) were examined. The data were collected at 10,000 grid points arranged as a 100 x 100 array; each grid cell covered an area of 4 square km. The amount of rainfall was measured every 15 minutes during the experiment periods using C-band radars. Two types of analyses were performed on the data: an analysis of diurnal variation at each of the grid points, based on the rainfall averages at noon and at midnight, and a time series analysis at selected grid points, based on the hourly averages of rainfall. Since there is no known distribution model that best describes the rainfall amount, nonparametric methods were used to examine the diurnal variation. The Kolmogorov-Smirnov test was used to test whether the rainfalls at noon and at midnight have the same statistical distribution. The Wilcoxon signed-rank test was used to test whether the noon rainfall is heavier than, equal to, or lighter than the midnight rainfall. These tests were done at each of the 10,000 grid points at which data are available.
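
    The two nonparametric comparisons can be sketched with standard SciPy routines; the paired noon/midnight samples below are synthetic, and the use of the two-sample Kolmogorov-Smirnov variant is an assumption about the exact test form applied in the original analysis.

      import numpy as np
      from scipy.stats import ks_2samp, wilcoxon

      # Hypothetical paired samples at one grid point: average rainfall (mm) at noon
      # and at midnight over the observation days.
      rng = np.random.default_rng(0)
      noon = rng.gamma(shape=1.5, scale=2.0, size=60)
      midnight = rng.gamma(shape=1.5, scale=1.6, size=60)

      # Do noon and midnight rainfall follow the same distribution?
      ks_stat, ks_p = ks_2samp(noon, midnight)

      # Is noon rainfall heavier than midnight rainfall? (paired, one-sided)
      w_stat, w_p = wilcoxon(noon, midnight, alternative="greater")

      print(f"KS: D={ks_stat:.3f} p={ks_p:.3f}   Wilcoxon: W={w_stat:.1f} p={w_p:.3f}")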

  9. Toward Meaningful Manufacturing Variation Data in Design - Feature Based Description of Variation in Manufacturing Processes

    DEFF Research Database (Denmark)

    Eifler, Tobias; Boorla, Srinivasa Murthy; Howard, Thomas J.

    2016-01-01

    The need to mitigate the effects of manufacturing variation already in design is nowadays commonly acknowledged and has led to a wide use of predictive modeling techniques, tolerancing approaches, etc. in industry. The trustworthiness of corresponding variation analyses is, however, not ensured by the availability of sophisticated methods and tools alone, but does evidently also depend on the accuracy of the input information used. As existing approaches for the description of manufacturing variation focus, however, almost exclusively on monitoring and controlling production processes, there is frequently a lack of objective variation data in design. As a result, variation analyses and tolerancing activities rely on numerous assumptions made to fill the gaps of missing or incomplete data. To overcome this hidden subjectivity, a schema for a consistent and standardised description of manufacturing...

  10. Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy.

    Science.gov (United States)

    Vandenabeele, Peter; Tate, Jim; Moens, Luc

    2007-02-01

    Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. Experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.

  11. Meta-Analysis of Mitochondrial DNA Variation in the Iberian Peninsula.

    Directory of Open Access Journals (Sweden)

    Ruth Barral-Arca

    Full Text Available The Iberian Peninsula has been the focus of attention of numerous studies dealing with mitochondrial DNA (mtDNA) variation, most of them targeting the control region segment. In the present study we sequenced the control region of 3,024 Spanish individuals from areas where available data were still limited. We also compiled mtDNA haplotypes from the literature involving 4,588 sequences and 28 population groups or small regions. We meta-analyzed all these data in order to shed further light on patterns of geographic variation, taking advantage of the large sample size and geographic coverage, in contrast with the atomized sampling strategy of previous work. The results indicate that the main mtDNA haplogroups show primarily clinal geographic patterns across the Iberian geography, roughly along a North-South axis. Haplogroup HV0 (where haplogroup U is nested) is more prevalent in the Franco Cantabrian region, in good agreement with previous findings that identified this area as a climate refuge during the Last Glacial Maximum (LGM), prior to a subsequent demographic re-expansion towards Central Europe and the Mediterranean. Typical sub-Saharan and North African lineages are slightly more prevalent in South Iberia, although at low frequencies; this pattern has been shaped mainly by the transatlantic slave trade and the Arab invasion of the Iberian Peninsula. The results also indicate that summary statistics that aim to measure molecular variation, or AMOVA, have limited sensitivity to detect population substructure, in contrast to patterns revealed by phylogeographic analysis. Overall, the results suggest that mtDNA variation in Iberia is substantially stratified. These patterns might be relevant in biomedical studies given that stratification is a common cause of false positives in case-control mtDNA association studies, and should be also considered when weighting the DNA evidence in forensic casework, which is strongly dependent on haplotype frequencies.

  12. Meta-Analysis of Mitochondrial DNA Variation in the Iberian Peninsula.

    Science.gov (United States)

    Barral-Arca, Ruth; Pischedda, Sara; Gómez-Carballa, Alberto; Pastoriza, Ana; Mosquera-Miguel, Ana; López-Soto, Manuel; Martinón-Torres, Federico; Álvarez-Iglesias, Vanesa; Salas, Antonio

    2016-01-01

    The Iberian Peninsula has been the focus of attention of numerous studies dealing with mitochondrial DNA (mtDNA) variation, most of them targeting the control region segment. In the present study we sequenced the control region of 3,024 Spanish individuals from areas where available data were still limited. We also compiled mtDNA haplotypes from the literature involving 4,588 sequences and 28 population groups or small regions. We meta-analyzed all these data in order to shed further light on patterns of geographic variation, taking advantage of the large sample size and geographic coverage, in contrast with the atomized sampling strategy of previous work. The results indicate that the main mtDNA haplogroups show primarily clinal geographic patterns across the Iberian geography, roughly along a North-South axis. Haplogroup HV0 (where haplogroup U is nested) is more prevalent in the Franco Cantabrian region, in good agreement with previous findings that identified this area as a climate refuge during the Last Glacial Maximum (LGM), prior to a subsequent demographic re-expansion towards Central Europe and the Mediterranean. Typical sub-Saharan and North African lineages are slightly more prevalent in South Iberia, although at low frequencies; this pattern has been shaped mainly by the transatlantic slave trade and the Arab invasion of the Iberian Peninsula. The results also indicate that summary statistics that aim to measure molecular variation, or AMOVA, have limited sensitivity to detect population substructure, in contrast to patterns revealed by phylogeographic analysis. Overall, the results suggest that mtDNA variation in Iberia is substantially stratified. These patterns might be relevant in biomedical studies given that stratification is a common cause of false positives in case-control mtDNA association studies, and should be also considered when weighting the DNA evidence in forensic casework, which is strongly dependent on haplotype frequencies.

  13. Atlas of temporal variations - interdisciplinary scientific work

    Science.gov (United States)

    Gamburtsev, A. G.; Oleinik, O. V.

    2003-04-01

    The year 2002 will culminate in the publication of the third volume of the fundamental interdisciplinary work "Atlas of Temporal Variations in Natural, Anthropogenic and Social Processes", which now will comprise three volumes (1994, 1998, 2002). The Atlas has pooled the information on the main peculiarities of processes' behaviour in various natural and humanitarian spheres over the widest temporal and spatial range. The main scientific goal of the work consists in discovering the behaviour pattern of natural, anthropogenic and social processes and the cause-and-effect links between them. Thus, the Atlas contains extensive comparative generalisation from vastly different data. For one thing, it is a fundamental work on the law-governed nature of evolution in natural and social spheres; for another, it can be used as a reference book and valuable source of information for research in different directions. The authors seek to treat every piece of information as part of an integrated whole. When analysing the data, we operate on the premise that surrounding nature, society and their elements are open dynamic systems. Systems of this kind exhibit non-linear characteristics and a tendency towards ordered and chaotic behaviour. These features are revealed in the course of the analysis of time series. The data processing procedures applied are unified, all processes being generally expressed in terms of their time series and time-spectral diagrams. The technique is aimed at the determination of the investigated parameters' rhythms and the analysis of their evolution. This approach enables us to show the dynamics of processes occurring in absolutely dissimilar objects and to perform their comparative analysis, with particular emphasis placed on rhythms and trends. As a result, successions of illustrations are obtained, and these form the basis of the Atlas. The Atlas covers processes that occur in objects belonging to the lithosphere, atmosphere, hydrosphere and social sphere as well

  14. Consideration of Normal Variation of Perfusion Measurements in the Quantitative Analysis of Myocardial Perfusion SPECT: Usefulness in Assessment of Viable Myocardium

    International Nuclear Information System (INIS)

    Paeng, Jin Chul; Lim, Il Han; Kim, Ki Bong; Lee, Dong Soo

    2008-01-01

    Although automatic quantification software for myocardial perfusion SPECT provides highly objective and reproducible quantitative measurements, there is still some limitation in the direct use of these quantitative measurements. In this study we derived parameters using the normal variation of perfusion measurements, and tested the usefulness of these parameters. In order to calculate the normal variation of perfusion measurements on myocardial perfusion SPECT, 55 patients (M:F=28:27) with a low likelihood of coronary artery disease were enrolled and 201Tl rest / 99mTc-MIBI stress SPECT studies were performed. Using a 20-segment model, the mean (m) and standard deviation (SD) of perfusion were calculated in each segment. As a myocardial viability assessment group, another 48 patients with known coronary artery disease who underwent coronary artery bypass graft surgery (CABG) were enrolled. 201Tl rest / 99mTc-MIBI stress / 201Tl 24-hr delayed SPECT was performed before CABG, and SPECT was followed up 3 months after CABG. From the preoperative 24-hr delayed SPECT, Q delay (the perfusion measurement itself), Δ delay (Q delay − m) and Z delay ((Q delay − m)/SD) were defined, and their diagnostic performances for myocardial viability were evaluated using the area under the curve (AUC) on receiver operating characteristic (ROC) curve analysis. Segmental perfusion measurements showed considerable normal variation among segments. In men, the lowest segmental perfusion measurement was 51.8±6.5 and the highest was 87.0±5.9; in women they were 58.7±8.1 and 87.3±6.0, respectively. In the viability assessment, Q delay showed an AUC of 0.633, while those for Δ delay and Z delay were 0.735 and 0.716, respectively. The AUCs of Δ delay and Z delay were significantly higher than that of Q delay (p=0.001 and 0.018, respectively). The diagnostic performance of Δ delay, which showed the highest AUC, was 85% sensitivity and 53% specificity at the optimal cutoff of -24.7. On automatic
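
    The correction for normal variation can be sketched as follows: each segmental measurement Q is converted to Δ = Q − m and Z = (Q − m)/SD using the segment's normal mean and SD, and the three measures are compared by ROC AUC. The data below are synthetic and only illustrate why the normalized measures tend to separate viable from non-viable segments better than the raw measurement.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      n_cases = 300
      normal_mean = rng.uniform(55, 87, 20)          # per-segment normal mean (m), 20-segment model
      normal_sd = rng.uniform(5, 9, 20)              # per-segment normal standard deviation
      segment = rng.integers(0, 20, n_cases)         # which segment each dysfunctional case belongs to

      # Synthetic ground truth: viable segments sit closer to their own normal mean.
      viable = rng.integers(0, 2, n_cases)
      z_true = np.where(viable == 1, rng.normal(-1.0, 1.0, n_cases), rng.normal(-3.0, 1.0, n_cases))
      q_delay = normal_mean[segment] + z_true * normal_sd[segment]   # 24-hr delayed uptake

      delta_delay = q_delay - normal_mean[segment]   # deviation from the segmental normal mean
      z_delay = delta_delay / normal_sd[segment]     # deviation in units of the normal SD

      for name, score in (("Q_delay", q_delay), ("Delta_delay", delta_delay), ("Z_delay", z_delay)):
          # The raw measurement mixes true dysfunction with normal inter-segment
          # variation, so its AUC is lower than that of the normalized measures.
          print(name, round(roc_auc_score(viable, score), 3))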

  15. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    Science.gov (United States)

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands have raised the importance of just-in-time and agile manufacturing much more than before. Accordingly, the introduction of mixed-model assembly lines has become popular as a way to realize small-lot, multi-kind production. Since various models are produced on the same assembly line, rational management is of special importance. From this point of view, this study focuses on a sequencing problem for a mixed-model assembly line that includes a paint line as its preceding process. When the paint line is taken into account as well, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem, besides improving production efficiency. We therefore formulated the sequencing problem as a bi-objective optimization problem that aims to prevent various line stoppages and to reduce the volume of WIP inventory simultaneously, and we proposed a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front, and the resulting single-objective problems are solved by a meta-heuristic method, simulated annealing (SA). Through numerical experiments, we verified the validity of the proposed approach and discussed the significance of trade-off analysis between the conflicting objectives.
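
    The weighting method can be sketched as follows: the two objectives are combined into a single weighted cost, and a simple simulated annealing search over sequences is repeated for several weights to trace an approximate Pareto front. Both objective functions below are toy stand-ins, not the stoppage and WIP models of the paper.

      import math
      import random

      def stoppage_cost(seq):
          """Toy objective 1: penalise runs of identical models (work overload risk)."""
          return sum(1 for a, b in zip(seq, seq[1:]) if a == b)

      def wip_cost(seq):
          """Toy objective 2: penalise colour changes between the paint line and assembly."""
          colours = {0: "red", 1: "red", 2: "blue", 3: "white"}
          return sum(1 for a, b in zip(seq, seq[1:]) if colours[a] != colours[b])

      def anneal(seq, weight, n_iter=20000, t0=2.0):
          """Minimise weight*stoppage + (1-weight)*wip over sequences by pairwise swaps."""
          cost = lambda s: weight * stoppage_cost(s) + (1 - weight) * wip_cost(s)
          cur, cur_c = list(seq), None
          cur_c = cost(cur)
          best, best_c = list(cur), cur_c
          for k in range(n_iter):
              t = t0 * (1 - k / n_iter) + 1e-9
              i, j = random.sample(range(len(cur)), 2)
              cur[i], cur[j] = cur[j], cur[i]
              new_c = cost(cur)
              if new_c <= cur_c or random.random() < math.exp((cur_c - new_c) / t):
                  cur_c = new_c
                  if new_c < best_c:
                      best, best_c = list(cur), new_c
              else:
                  cur[i], cur[j] = cur[j], cur[i]     # reject: undo the swap
          return best

      random.seed(0)
      demand = [0] * 8 + [1] * 6 + [2] * 4 + [3] * 2   # production quota per model
      for w in (0.1, 0.5, 0.9):                        # weighting-method sweep
          s = anneal(demand, w)
          print(w, stoppage_cost(s), wip_cost(s))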

  16. Variational analysis and aerospace engineering mathematical challenges for the aerospace of the future

    CERN Document Server

    Mohammadi, Bijan; Pironneau, Olivier; Cipolla, Vittorio

    2016-01-01

    This book presents papers arising from the extensive discussions that took place at the ‘Variational Analysis and Aerospace Engineering’ workshop held at the Ettore Majorana Foundation and Centre for Scientific Culture in 2015. Contributions to this volume focus on advanced mathematical methods in aerospace engineering and industrial engineering such as computational fluid dynamics methods, optimization methods in aerodynamics, optimum controls, dynamic systems, the theory of structures, space missions, flight mechanics, control theory, algebraic geometry for CAD applications, and variational methods and applications. Advanced graduate students, researchers, and professionals in mathematics and engineering will find this volume useful as it illustrates current collaborative research projects in applied mathematics and aerospace engineering.

  17. Object detection using categorised 3D edges

    DEFF Research Database (Denmark)

    Kiforenko, Lilita; Buch, Anders Glent; Bodenhagen, Leon

    2015-01-01

    We present an edge categorisation algorithm for describing objects in terms of their different edge types. Relying on edge information allows our system to deal with objects with little or no texture or surface variation. We show that edge categorisation improves matching performance due to the higher level of discrimination, which is made possible by the explicit use of edge categories in the feature descriptor. We quantitatively compare our approach with the state-of-the-art template-based Linemod method, which also provides an effective way of dealing with texture-less objects; tests were performed on our own object dataset.

  18. Attachment to Inanimate Objects and Early Childcare: A Twin Study

    Directory of Open Access Journals (Sweden)

    Keren eFortuna

    2014-05-01

    Full Text Available Extensive nonmaternal childcare plays an important role in children's development. This study examined a potential coping mechanism for dealing with daily separation from caregivers involved in the childcare experience: children's development of attachments toward inanimate objects. We employed the twin design to estimate the relative environmental and genetic contributions to the presence of object attachment, and to assess whether childcare explains some of the environmental variation in this developmental phenomenon. Mothers reported on 1122 3-year-old twin pairs. Variation in object attachment was accounted for by heritability (48%) and shared environment (48%), with childcare quantity accounting for 2.2% of the shared environment effect. Children who spent half-days in childcare were significantly less likely to attach to objects relative to children who attended full-day childcare.

  19. Forest anisotropy assessment by means of spatial variations analysis of PolSAR backscattering

    Directory of Open Access Journals (Sweden)

    A. V. Dmitriev

    2017-06-01

    Full Text Available The possibility of synthesizing the polarization response of earth covers at any desired combination of transmit and receive antenna polarizations is a significant advantage of polarimetric radar. It permits better identification of dominant scattering mechanisms, especially when analyzing polarization signatures. These signatures depict more details of the physical information carried by target backscattering in various polarization bases. However, polarization signatures cannot reveal spatial variations of the radar backscattering caused by volume heterogeneity of a target. This paper proposes a new approach for estimating volume target heterogeneity from polarimetric synthetic aperture radar (PolSAR) images. The approach is based on the analysis of a novel type of polarization signature, which we call the fractal polarization signature (FPS). This signature is the result of polarization synthesis of the initial fully polarimetric data and subsequent fractal analysis of the synthesized images. It is displayed as a 3D plot and can be produced for each point in an image. It is shown that the FPS describes backscattering variations, or image roughness, at different states of polarization. Fully polarimetric data from SIR-C and ALOS PALSAR on ascending/descending orbits were used to test the proposed approach. An azimuthal dependence of the radar backscattering variations is discovered when analyzing backscattering from a pine forest. It correlates with the results of a field survey of tree branch distribution.
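
    One common way to carry out a fractal analysis of a synthesized image is a box-counting estimate of fractal dimension, sketched below for a binarized backscatter patch. The authors' fractal polarization signature is built from a fractal analysis of synthesized images, but the exact estimator is not specified here, so this is only an assumed illustration.

      import numpy as np

      def box_counting_dimension(binary, sizes=(2, 4, 8, 16, 32)):
          """Estimate the box-counting (fractal) dimension of a 2D boolean array."""
          counts = []
          for s in sizes:
              h = binary.shape[0] // s * s
              w = binary.shape[1] // s * s
              blocks = binary[:h, :w].reshape(h // s, s, w // s, s)
              counts.append(np.sum(blocks.any(axis=(1, 3))))   # boxes containing signal
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return slope

      # Example: threshold a synthesized-polarization backscatter patch at its median
      rng = np.random.default_rng(0)
      patch = rng.normal(size=(256, 256))
      print(box_counting_dimension(patch > np.median(patch)))   # ~2 for spatially random noise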

  20. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Full Text Available Purpose. The development of complicated techniques for production and management processes, information systems, computer science and applied objects of systems theory requires improved mathematical methods and new approaches for the investigation of application systems. The variety and diversity of subject systems makes it necessary to develop a model that generalizes classical sets and their development, sets of sets. Multiple objects, unlike sets, are constructed by multiple structures and are represented by structure and content. The aim of the work is the analysis of the multiple structures that generate multiple objects, and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of multiple objects is represented as a constructive trio consisting of a medium, signatures and axiomatics. A multiple object is determined by its structure and content, and is represented by a hybrid superposition composed of sets, multi-sets, ordered sets (lists) and heterogeneous sets (sequences, tuples). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on the objects of implementation. We introduce relations of arbitrary order over multiple objects, and we define the description of functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of the multiple structures that generate multiple objects. Practical value. The transition from abstract to subject multiple structures requires the transformation of the system and of the multiple objects. Transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites) and particularization (goals). The proposed approach to describing systems is based on hybrid sets

  1. Multidimensional analysis of Drosophila wing variation in Evolution ...

    Indian Academy of Sciences (India)

    2008-12-23

    Dec 23, 2008. The study analyses the different components of phenotypic variation of a complex trait, the wing, including the effect of slope on wing shape, in Drosophila from Evolution Canyon [Multidimensional analysis of Drosophila wing variation in Evolution Canyon. J. Genet. 87, 407-419].

  2. Functional analysis and applied optimization in Banach spaces applications to non-convex variational models

    CERN Document Server

    Botelho, Fabio

    2014-01-01

    This book introduces the basic concepts of real and functional analysis. It presents the fundamentals of the calculus of variations, convex analysis, duality, and optimization that are necessary to develop applications to physics and engineering problems. The book includes introductory and advanced concepts in measure and integration, as well as an introduction to Sobolev spaces. The problems presented are nonlinear, with non-convex variational formulation. Notably, the primal global minima may not be attained in some situations, in which cases the solution of the dual problem corresponds to an appropriate weak cluster point of minimizing sequences for the primal one. Indeed, the dual approach more readily facilitates numerical computations for some of the selected models. While intended primarily for applied mathematicians, the text will also be of interest to engineers, physicists, and other researchers in related fields.

  3. Variational method for objective analysis of scalar variable and its ...

    Indian Academy of Sciences (India)

    e-mail: sinha@tropmet.res.in. In this study real time data have been used to compare the standard and triangle methods.

  4. Vector optimization set-valued and variational analysis

    CERN Document Server

    Chen, Guang-ya; Yang, Xiaogi

    2005-01-01

    This book is devoted to vector or multiple criteria approaches in optimization. Topics covered include: vector optimization, vector variational inequalities, vector variational principles, vector minmax inequalities and vector equilibrium problems. In particular, problems with variable ordering relations and set-valued mappings are treated. The nonlinear scalarization method is extensively used throughout the book to deal with various vector-related problems. The results presented are original and should be interesting to researchers and graduates in applied mathematics and operations research

  5. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been done on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm, which is based on an auditory model that simulates the human auditory system. The auditory model is based on spectro-temporal modulation analysis of the spectrogram, which has been proven to be ...

  6. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed
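
    A single-object, 2D sketch using an off-the-shelf active contour (snake) from scikit-image is shown below; DMA's coordination of multiple deformable models, its explicit model formulations and its 3D support are not reflected in this minimal example, and the initial contour and parameters are illustrative only.

      import numpy as np
      from skimage import data, filters
      from skimage.segmentation import active_contour

      # Example image: a grayscale photograph with a roughly circular structure of interest
      img = data.astronaut()[..., 0] / 255.0
      img = filters.gaussian(img, sigma=3)                 # smooth before snake evolution

      # Initial contour: a circle placed near the structure (row, col coordinates)
      s = np.linspace(0, 2 * np.pi, 200)
      init = np.column_stack([100 + 80 * np.sin(s), 220 + 80 * np.cos(s)])

      # Evolve the contour towards strong image edges.
      snake = active_contour(img, init, alpha=0.015, beta=10.0, gamma=0.001)
      print(snake.shape)   # (200, 2): the converged contour coordinates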

  7. Water quality, Multivariate statistical techniques, submarine out fall, spatial variation, temporal variation

    International Nuclear Information System (INIS)

    Garcia, Francisco; Palacio, Carlos; Garcia, Uriel

    2012-01-01

    Multivariate statistical techniques were used to investigate the temporal and spatial variations of water quality in the Santa Marta coastal area, where a submarine outfall that discharges 1 m3/s of domestic wastewater is located. Two-way analysis of variance (ANOVA), cluster and principal component analysis, and Kriging interpolation were considered for this report. The temporal variation showed two heterogeneous periods: from December to April, and July, when the concentrations of the water quality parameters are higher; in the rest of the year (May, June, August-November) they were significantly lower. The spatial analysis identified two areas where the water quality is different; this difference is related to the proximity to the submarine outfall discharge.

  8. The Making of Paranormal Belief: History, Discourse Analysis and the Object of Belief

    OpenAIRE

    White, Lewis

    2013-01-01

    The present study comprises a discursive analysis of a cognitive phenomenon, paranormal beliefs. A discursive psychological approach to belief highlights that an important component of the cognitivist work has been how the object of paranormal belief has been defined in formal study. Using discourse analysis, as developed as a method in the history of psychology, this problem is explored through analysis of published scales. The findings highlight three rhetorical themes that are deployed in ...

  9. The clustering of quasars from an objective-prism survey

    International Nuclear Information System (INIS)

    Webster, A.

    1982-01-01

    The positions and redshifts of 108 quasars from the Cerro Tololo objective-prism survey are subjected to Fourier Power Spectrum Analysis in a search for clustering in their spatial distribution. It is found that, on the whole, these quasars are not clustered but are scattered in space independently at random. The sole exception is a group of four quasars at z = 0.37 which has a low probability of being a chance event and which, with a size of about 100 Mpc, may therefore be the largest known structure in the Universe. The conclusions disagree with Arp's analysis of this catalogue: his 'clouds of quasars' ejected by certain low-redshift galaxies, for example, are attributable to sensitivity variations among the different plates of the survey. It is shown that analysis of deeper surveys is likely to show up quasar clusters even at high redshift, and could therefore provide a useful new cosmological probe. (author)

  10. Thermohydromechanical stability analysis upon joint characteristics and depth variations in the region of an underground repository for the study of a disposal concept of high level radioactive wastes

    International Nuclear Information System (INIS)

    Kim, Jhin Wung; Bae, Dae Suk; Kang, Chul Hyung; Choi, Jong Won

    2003-02-01

    The objective of the present study is to understand the long-term (500 years) thermohydromechanical interaction behavior in the vicinity of a repository cavern under variations of joint location and repository depth, and thereby to contribute to the development of a disposal concept. The model includes a saturated rock mass, PWR spent fuel in a disposal canister surrounded by compacted bentonite inside a deposition hole, and mixed bentonite backfilled in the rest of the space within the cavern. Two joint sets are assumed to exist within the model: joint set 1 includes 56° dip joints spaced 20 m apart, and joint set 2, perpendicular to joint set 1, includes 34° dip joints spaced 20 m apart. To understand the change in behavior with joint location, 5 different models at 500 m depth are analyzed, and 3 additional models at 1 km depth are analyzed to understand the effect of depth variation. The two-dimensional distinct element code UDEC is used for the analysis, and the Barton-Bandis joint model is used to capture the joint behavior adjacent to the repository cavern. The effect of decay heat from the PWR spent fuel on the repository model is analyzed, and a steady-state algorithm is used for the hydraulic analysis. According to the thermohydromechanical interaction behavior of the repository model under variations of joint location and repository depth, during the 500 years following waste emplacement the effect of depth variation on the stress and displacement behavior of the model is comparatively smaller than the effect of decay heat from the radioactive materials. From the study of the joint location variation effect, it is advisable not to locate an underground opening very close to the joint crossings.

  11. Flexible Multi-Objective Transmission Expansion Planning with Adjustable Risk Aversion

    Directory of Open Access Journals (Sweden)

    Jing Qiu

    2017-07-01

    Full Text Available This paper presents a multi-objective transmission expansion planning (TEP) framework. Rather than using the conventional deterministic reliability criterion, a risk component based on the probabilistic reliability criterion is incorporated into the TEP objectives. This risk component can capture the stochastic nature of power systems, such as load and wind power output variations, component availability, and incentive-based demand response (IBDR) costs. Specifically, the formulation of risk value after risk aversion is explicitly given, and it aims to provide network planners with the flexibility to conduct risk analysis. Thus, a final expansion plan can be selected according to individual risk preferences. Moreover, the economic value of IBDR is modeled and integrated into the cost objective. In addition, a relatively new multi-objective evolutionary algorithm called the MOEA/D is introduced and employed to find Pareto optimal solutions, and tradeoffs between overall cost and risk are provided. The proposed approach is numerically verified on Garver’s six-bus, IEEE 24-bus RTS and Polish 2383-bus systems. Case study results demonstrate that the proposed approach can effectively reduce cost and hedge risk in relation to increasing wind power integration.
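    To make the cost-risk tradeoff concrete, the sketch below (an illustration only, not the MOEA/D algorithm used in the paper) filters candidate expansion plans down to the non-dominated, i.e. Pareto-optimal, cost/risk pairs; the candidate values are made up.

```python
# Minimal sketch: keep only non-dominated (cost, risk) pairs from candidate plans.
# A planner with a given risk aversion would then pick one point on this front.
import numpy as np

def pareto_front(points):
    """Return the subset of points not dominated in both objectives (minimization)."""
    pts = np.asarray(points, dtype=float)
    keep = []
    for i, p in enumerate(pts):
        dominated = np.any(np.all(pts <= p, axis=1) & np.any(pts < p, axis=1))
        if not dominated:
            keep.append(i)
    return pts[keep]

candidates = [(120.0, 0.30), (100.0, 0.55), (140.0, 0.20), (135.0, 0.33), (125.0, 0.60)]
print(pareto_front(candidates))   # tradeoff curve between overall cost and risk
```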

  12. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    Directory of Open Access Journals (Sweden)

    Adam W Green

    Full Text Available Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools) using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
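    For intuition only (this is a crude stand-in, not the authors' Bayesian dynamic occupancy model), the sketch below runs a stochastic patch-occupancy simulation and estimates a quasi-extinction probability as a function of the number of pools; all rates and thresholds are assumed values.

```python
# Crude Monte Carlo sketch of metapopulation quasi-extinction risk.
# Each pool is occupied/unoccupied; occupied pools persist or go locally extinct,
# empty pools are recolonized with probability proportional to current occupancy.
import numpy as np

def quasi_extinction_prob(n_pools, years=50, persist=0.85, colonize=0.4,
                          qe_threshold=0.1, n_sims=2000, seed=1):
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sims):
        occupied = np.ones(n_pools, dtype=bool)          # start fully occupied
        for _ in range(years):
            frac = occupied.mean()
            stay = rng.random(n_pools) < persist          # local persistence
            gain = rng.random(n_pools) < colonize * frac  # recolonization
            occupied = (occupied & stay) | (~occupied & gain)
        if occupied.mean() < qe_threshold:                # quasi-extinction criterion
            hits += 1
    return hits / n_sims

for n in (10, 25, 50):
    print(n, "pools ->", quasi_extinction_prob(n))
```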

  13. Prediction of ppm level electrical failure by using physical variation analysis

    Science.gov (United States)

    Hou, Hsin-Ming; Kung, Ji-Fu; Hsu, Y.-B.; Yamazaki, Y.; Maruyama, Kotaro; Toyoshima, Yuya; Chen, Chu-en

    2016-03-01

    their spatial correlation distance. For local variations (LV) there is no correlation, whereas for global variations (GV) the correlation distance is very large [7]-[9]. This is the first time the spatial distribution has been validated from the affordable bias-contour big-data infrastructure; statistical techniques are then applied to identify the variation sources. GV arise from systematic issues, which can be compensated by an adaptive lithography condition or OPC correction, whereas LV arise from random issues, considered intrinsic to structure, material, tool capability, etc. In this study we find that, at the advanced technology node, SRAM contact CD local variation (LV) dominates the total variation, at about 70%. It plays a significant role in in-line, real-time capture of the WP-DPMO of the product yield loss; the wafer edge shows the worst loss within the wafer distribution and raises serious reliability concerns. The major root cause of the variation is a burr defect induced by the photoresist (PR) material (LV); the second is a GV-enhanced shorting opportunity at the wafer edge, attributed to three factors: first, deliberate enlargement of the wafer-edge CD for yield improvement, as shown in Fig. 10; second, overlay/AA shifts due to the tool's ability to handle incoming-wafer warpage and the working-pitch dependence of the optical periphery layout, as shown in Fig. 9(1); and third, a wafer-edge burr enhanced by the larger photoresist (PR) spin centrifugal force at the wafer edge. After implementing KPIs such as the GV-related AA/CD indexes (Figs. 9(1) and 10, respectively) and the LV-related burr index (Fig. 11), we can construct a parts-per-million (PPM) level short-probability model via multivariable regression, canonical correlation analysis and a logistic transformation. The model provides prediction of PPM-level electrical failure by using in-line, real-time physical
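    The final modeling step (multivariable regression with a logistic transformation) can be sketched as follows; this uses synthetic data and assumed index names, and is only an illustration, not the authors' model.

```python
# Sketch: predict a ppm-level failure probability from physical variation indexes
# (e.g. a global CD/AA index and a local burr index) with logistic regression.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
cd_index = rng.normal(size=n)            # assumed global-variation (GV) KPI
burr_index = rng.normal(size=n)          # assumed local-variation (LV) KPI
logit = -6.0 + 1.2 * cd_index + 2.0 * burr_index   # rare-event ground truth
fail = rng.random(n) < 1 / (1 + np.exp(-logit))    # rare electrical failures

X = np.column_stack([cd_index, burr_index])
model = LogisticRegression().fit(X, fail)

# Predicted failure probability for a new die, reported in ppm.
p = model.predict_proba([[0.5, 1.0]])[0, 1]
print(f"predicted failure rate: {p * 1e6:.0f} ppm")
```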

  14. Object-Based Image Analysis in Wetland Research: A Review

    Directory of Open Access Journals (Sweden)

    Iryna Dronova

    2015-05-01

    Full Text Available The applications of object-based image analysis (OBIA) in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of the local noise, incorporating meaningful non-spectral features for class separation and accounting for landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scales and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications and the main research needs and directions to expand the OBIA capacity in future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales, together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring, will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  15. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training dataset. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
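    As a hedged illustration of how such fuzzy operators combine per-criterion memberships into a single class score (not the specific rule set of the study), consider:

```python
# Sketch: combining fuzzy membership values for one image object with
# different fuzzy operators (minimum/AND, maximum/OR, arithmetic mean, product).
import numpy as np

# Assumed membership values of a single object for several criteria,
# e.g. slope, brightness, GLCM texture, shape index (values are made up).
memberships = np.array([0.9, 0.7, 0.8, 0.6])

fuzzy_and = memberships.min()          # strictest: all criteria must support the class
fuzzy_or = memberships.max()           # most permissive: any criterion suffices
mean_arithmetic = memberships.mean()   # compromise operator
fuzzy_prod = memberships.prod()        # "AND (*)": penalizes every weak criterion

for name, value in [("AND (min)", fuzzy_and), ("OR (max)", fuzzy_or),
                    ("MEAN arithmetic", mean_arithmetic), ("AND (*) product", fuzzy_prod)]:
    print(f"{name:16s} -> {value:.2f}")

# A final crisp classification would compare the chosen score with a threshold,
# e.g. label the object "landslide" if the score exceeds 0.5 (assumed cut-off).
```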

  16. Study of Seasonal Variation in Groundwater Quality of Sagar City (India by Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Hemant Pathak

    2011-01-01

    Full Text Available Groundwater is one of the major sources of drinking water in Sagar city (India). In this study, 15 sampling stations were selected for investigation of 14 chemical parameters. The work was carried out during different months of the pre-monsoon, monsoon and post-monsoon seasons from June 2009 to June 2010. Multivariate statistical methods such as principal component and cluster analysis were applied to the datasets to investigate seasonal variations in groundwater quality. Principal axis factoring was used to observe the mode of association of parameters and their interrelationships for evaluating water quality. The average values of BOD, COD, ammonia and iron were high during the entire study period. Elevated values of BOD and ammonia in the monsoon, slightly higher BOD in the post-monsoon, and elevated BOD, ammonia and iron in the pre-monsoon period reflect a temporal effect on the groundwater. Results of principal component analysis showed that all the parameters contribute equally and significantly to groundwater quality variations. Factor 1 and factor 2 analysis revealed that the DO value deteriorates due to organic load (BOD/ammonia) in different seasons. Hierarchical cluster analysis grouped the 15 stations into four clusters in the monsoon, five clusters in the post-monsoon and five clusters in the pre-monsoon, each with similar water quality features. In each season, one clustered group consisted of a single station exhibiting significant spatial variation in physicochemical composition. The anthropogenic nitrogenous species are a fallout from modernization activities. The study indicated that the groundwater is sufficiently well oxygenated and nutrient-rich at the study sites.
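    A hedged sketch of the hierarchical clustering step (not the published analysis; the data are synthetic placeholders) could look like this:

```python
# Sketch: group monitoring stations by water-quality profiles
# using Ward's hierarchical clustering on standardized parameters.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.stats import zscore

rng = np.random.default_rng(42)
X = rng.normal(size=(15, 14))                   # placeholder: 15 stations x 14 parameters

Z = linkage(zscore(X, axis=0), method="ward")   # dendrogram structure
labels = fcluster(Z, t=5, criterion="maxclust") # cut the tree into 5 clusters
print("station cluster labels:", labels)
```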

  17. Describing shell shape variations and sexual dimorphism of Golden Apple Snail, Pomacea caniculata (Lamarck, 1822 using geometric morphometric analysis

    Directory of Open Access Journals (Sweden)

    C.C. Cabuga

    2017-09-01

    Full Text Available Pomacea caniculata, or Golden Apple Snail (GAS), is a rice pest in the Philippines and elsewhere in Asia. Geographic location also contributes to its increasing populations, making it invasive in freshwater habitats and rice field areas. This study was conducted in order to describe shell shape variations and sexual dimorphism among populations of P. caniculata. A total of 180 individuals were randomly collected from three lakes in Esperanza, Agusan del Sur (Lake Dakong Napo, Lake Oro, and Lake Cebulan), with each lake contributing 60 samples (30 males and 30 females). To determine the variations and sexual dimorphism in the shell shape of the golden apple snail, the coordinates were subjected to relative warp analysis and the resulting data were analysed with Multivariate Analysis of Variance (MANOVA), Principal Component Analysis (PCA) and Canonical Variate Analysis (CVA). The results show statistically significant differences (P<0.05) between males and females in the dorsal and ventral/apertural portions. Male and female spire height, body size, and shell opening shape also show significant variation. These phenotypic distinctions could be associated with geographic isolation, predation and the nutrient components available to the gastropods. This demonstrates the value of geometric morphometric methods in describing sexual dimorphism in the shell shape of P. caniculata.
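    A minimal, hedged sketch of a landmark-based shape comparison (scipy's two-shape Procrustes superimposition rather than the relative warp pipeline of the study; the landmark arrays are placeholders):

```python
# Sketch: compare two shell outlines described by 2-D landmarks after removing
# location, scale and rotation with a Procrustes superimposition.
import numpy as np
from scipy.spatial import procrustes

rng = np.random.default_rng(0)
male_landmarks = rng.normal(size=(12, 2))                      # 12 assumed landmarks
female_landmarks = male_landmarks + rng.normal(scale=0.05, size=(12, 2))

m1, m2, disparity = procrustes(male_landmarks, female_landmarks)
print(f"Procrustes disparity (shape difference): {disparity:.4f}")

# In a full analysis, the aligned coordinates of many specimens would be fed to
# PCA / CVA (e.g. sklearn.decomposition.PCA) to visualize shape variation.
```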

  18. Molecular Karyotyping and Exome Analysis of Salt-Tolerant Rice Mutant from Somaclonal Variation

    Directory of Open Access Journals (Sweden)

    Thanikarn Udomchalothorn

    2014-11-01

    Full Text Available LPT123-TC171 is a salt-tolerant (ST) and drought-tolerant (DT) rice line that was selected from somaclonal variation of the original Leuang Pratew 123 (LPT123) rice cultivar. The objective of this study was to identify the changes in the rice genome that possibly lead to ST and/or DT characteristics. The genomes of LPT123 and LPT123-TC171 were comparatively studied at four levels (whole chromosomes, chromosome structure including telomeres, transposable elements, and DNA sequence changes) by using next-generation sequencing analysis. Compared with LPT123, the LPT123-TC171 line displayed no changes in the ploidy level, but had a significant deficiency of chromosome ends (telomeres). The functional genome analysis revealed new aspects of the genome response to the in vitro cultivation condition, where exome sequencing revealed the molecular spectrum and pattern of changes in the somaclonal variant compared with the parental LPT123 cultivar. Mutation detection was performed, and the degree of mutations was evaluated to estimate the impact of mutagenesis on the protein functions. Mutations within the known genes responding to both drought and salt stress were detected in 493 positions, while mutations within the genes responding to only salt stress were found in 100 positions. The possible functions of the mutated genes contributing to salt or drought tolerance were discussed. It was concluded that the ST and DT characteristics in the somaclonal variant line resulted from the base changes in the salt- and drought-responsive genes rather than from changes in chromosome structure or large duplications or deletions in specific regions of the genome.

  19. The effects of local control station design variation on plant risk

    International Nuclear Information System (INIS)

    O'Hara, J.

    1989-01-01

    The existence of human engineering deficiencies at local control stations (LCSs) was addressed in a study (NUREG/CR-3696) conducted by the Pacific Northwest Laboratory (PNL). PNL concluded that the existence of these human factors deficiencies at safety significant LCSs increases the potential for operator errors that could be detrimental to plant and public safety. However, PNL did not perform a specific analysis to evaluate the effects of LCS design variations on human performance, on plant risk, or on the cost-benefit feasibility of upgrading LCSs. The purpose of the present investigation was to conduct such an analysis. The specific objectives of the research were (1) to further define important local control stations, human factors-related LCS design variations, and typical human engineering deficiencies (HEDs) at LCSs; (2) to determine the effect of LCS design variations on human performance, i.e., on risk-significant human errors (HEs); (3) to determine the effect of LCS-induced human performance variation on plant risk as measured by core melt frequency (CMF); and (4) to determine whether LCS improvements (upgrades in LCS design to mitigate HEDs) are feasible in a scoping-type value-impact analysis. The results can be summarized as follows. There was an overall effect of LCS variations on human performance. The transition from the worst LCS configuration to the best resulted in an absolute reduction or improvement of 0.82 in mean HEP (reduction by a factor of 20). The transition from low to high levels of functional centralization (FC) was associated with a 0.46 (86%) reduction in mean HEP. The majority of the effect was accounted for in the transition from the low to medium levels. The Panel Design dimension also had an effect on human performance, although not as large as that of functional centralization. Upgrading from a low to high panel design resulted in a 0.29 (69%) reduction in mean HEP.

  20. Transferability of Object-Oriented Image Analysis Methods for Slum Identification

    Directory of Open Access Journals (Sweden)

    Alfred Stein

    2013-08-01

    Full Text Available Updated spatial information on the dynamics of slums can be helpful to measure and evaluate progress of policies. Earlier studies have shown that semi-automatic detection of slums using remote sensing can be challenging considering the large variability in definition and appearance. In this study, we explored the potential of an object-oriented image analysis (OOA) method to detect slums, using very high resolution (VHR) imagery. This method integrated expert knowledge in the form of a local slum ontology. A set of image-based parameters was identified that was used for differentiating slums from non-slum areas in an OOA environment. The method was implemented on three subsets of the city of Ahmedabad, India. Results show that textural features such as entropy and contrast derived from a grey level co-occurrence matrix (GLCM) and the size of image segments are stable parameters for classification of built-up areas and the identification of slums. Relations with classified slum objects, in terms of being enclosed by slums and the relative border with slums, were used to refine the classification. The analysis on three different subsets showed final accuracies ranging from 47% to 68%. We conclude that our method produces useful results as it allows the inclusion of location-specific adaptation, whereas generically applicable rulesets for slums are still to be developed.
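    The GLCM texture features mentioned above (contrast and entropy) can be computed as in the following hedged sketch, assuming a recent scikit-image (graycomatrix/graycoprops); the patch, window size and quantization are assumptions, not those of the study.

```python
# Sketch: GLCM contrast and entropy for one image patch (e.g. a segment's bounding box).
import numpy as np
from skimage import data
from skimage.feature import graycomatrix, graycoprops

patch = data.camera()[100:164, 100:164]          # placeholder 64x64 grayscale patch
glcm = graycomatrix(patch, distances=[1], angles=[0], levels=256,
                    symmetric=True, normed=True)

contrast = graycoprops(glcm, "contrast")[0, 0]
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))  # GLCM entropy (not in graycoprops)

print(f"contrast: {contrast:.1f}, entropy: {entropy:.2f}")
```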

  1. Systematic documentation and analysis of human genetic variation using the microattribution approach

    Science.gov (United States)

    Giardine, Belinda; Borg, Joseph; Higgs, Douglas R.; Peterson, Kenneth R.; Maglott, Donna; Basak, A. Nazli; Clark, Barnaby; Faustino, Paula; Felice, Alex E.; Francina, Alain; Gallivan, Monica V. E.; Georgitsi, Marianthi; Gibbons, Richard J.; Giordano, Piero C.; Harteveld, Cornelis L.; Joly, Philippe; Kanavakis, Emmanuel; Kollia, Panagoula; Menzel, Stephan; Miller, Webb; Moradkhani, Kamran; Old, John; Papachatzopoulou, Adamantia; Papadakis, Manoussos N.; Papadopoulos, Petros; Pavlovic, Sonja; Philipsen, Sjaak; Radmilovic, Milena; Riemer, Cathy; Schrijver, Iris; Stojiljkovic, Maja; Thein, Swee Lay; Traeger-Synodinos, Jan; Tully, Ray; Wada, Takahito; Waye, John; Wiemann, Claudia; Zukic, Branka; Chui, David H. K.; Wajcman, Henri; Hardison, Ross C.; Patrinos, George P.

    2013-01-01

    We developed a series of interrelated locus-specific databases to store all published and unpublished genetic variation related to these disorders, and then implemented microattribution to encourage submission of unpublished observations of genetic variation to these public repositories 1. A total of 1,941 unique genetic variants in 37 genes, encoding globins (HBA2, HBA1, HBG2, HBG1, HBD, HBB) and other erythroid proteins (ALOX5AP, AQP9, ARG2, ASS1, ATRX, BCL11A, CNTNAP2, CSNK2A1, EPAS1, ERCC2, FLT1, GATA1, GPM6B, HAO2, HBS1L, KDR, KL, KLF1, MAP2K1, MAP3K5, MAP3K7, MYB, NOS1, NOS2, NOS3, NOX3, NUP133, PDE7B, SMAD3, SMAD6, and TOX) are currently documented in these databases with reciprocal attribution of microcitations to data contributors. Our project provides the first example of implementing microattribution to incentivise submission of all known genetic variation in a defined system. It has demonstrably increased the reporting of human variants and now provides a comprehensive online resource for systematically describing human genetic variation in the globin genes and other genes contributing to hemoglobinopathies and thalassemias. The large repository of previously reported data, together with more recent data, acquired by microattribution, demonstrates how the comprehensive documentation of human variation will provide key insights into normal biological processes and how these are perturbed in human genetic disease. Using the microattribution process set out here, datasets which took decades to accumulate for the globin genes could be assembled rapidly for other genes and disease systems. The principles established here for the globin gene system will serve as a model for other systems and the analysis of other common and/or complex human genetic diseases. PMID:21423179

  2. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Sachit Rajbhandari

    2017-11-01

    Full Text Available In Geographic Object-based Image Analysis (GEOBIA), identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL) is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in Semantic Web Rule Language (SWRL) and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method providing an alternative approach for image classification in the case study of landslides.

  3. Variational segmentation problems using prior knowledge in imaging and vision

    DEFF Research Database (Denmark)

    Fundana, Ketut

    This dissertation addresses the variational formulation of segmentation problems using prior knowledge. Variational models are among the most successful approaches for solving many Computer Vision and Image Processing problems. The models aim at finding the solution to a given energy functional defined......, prior knowledge is needed to obtain the desired solution. The introduction of shape priors, in particular, has proven to be an effective way to segment objects of interest. Firstly, we propose a prior-based variational segmentation model to segment objects of interest in image sequences that can deal....... Many objects have high variability in shape and orientation. This often leads to unsatisfactory results when using a segmentation model with a single shape template. One way to solve this is by using more sophisticated shape models. We propose to incorporate shape priors from a shape sub...

  4. Neurocomputational bases of object and face recognition.

    OpenAIRE

    Biederman, I; Kalocsai, P

    1997-01-01

    A number of behavioural phenomena distinguish the recognition of faces and objects, even when members of a set of objects are highly similar. Because faces have the same parts in approximately the same relations, individuation of faces typically requires specification of the metric variation in a holistic and integral representation of the facial surface. The direct mapping of a hypercolumn-like pattern of activation onto a representation layer that preserves relative spatial filter values in...

  5. General object recognition is specific: Evidence from novel and familiar objects.

    Science.gov (United States)

    Richler, Jennifer J; Wilmer, Jeremy B; Gauthier, Isabel

    2017-09-01

    In tests of object recognition, individual differences typically correlate modestly but nontrivially across familiar categories (e.g. cars, faces, shoes, birds, mushrooms). In theory, these correlations could reflect either global, non-specific mechanisms, such as general intelligence (IQ), or more specific mechanisms. Here, we introduce two separate methods for effectively capturing category-general performance variation, one that uses novel objects and one that uses familiar objects. In each case, we show that category-general performance variance is unrelated to IQ, thereby implicating more specific mechanisms. The first approach examines three newly developed novel object memory tests (NOMTs). We predicted that NOMTs would exhibit more shared, category-general variance than familiar object memory tests (FOMTs) because novel objects, unlike familiar objects, lack category-specific environmental influences (e.g. exposure to car magazines or botany classes). This prediction held, and remarkably, virtually none of the substantial shared variance among NOMTs was explained by IQ. Also, while NOMTs correlated nontrivially with two FOMTs (faces, cars), these correlations were smaller than among NOMTs and no larger than between the face and car tests themselves, suggesting that the category-general variance captured by NOMTs is specific not only relative to IQ, but also, to some degree, relative to both face and car recognition. The second approach averaged performance across multiple FOMTs, which we predicted would increase category-general variance by averaging out category-specific factors. This prediction held, and as with NOMTs, virtually none of the shared variance among FOMTs was explained by IQ. Overall, these results support the existence of object recognition mechanisms that, though category-general, are specific relative to IQ and substantially separable from face and car recognition. They also add sensitive, well-normed NOMTs to the tools available to study

  6. Bi-variate statistical attribute filtering : A tool for robust detection of faint objects

    NARCIS (Netherlands)

    Teeninga, Paul; Moschini, Ugo; Trager, Scott C.; Wilkinson, M.H.F.

    2013-01-01

    We present a new method for morphological connected attribute filtering for object detection in astronomical images. In this approach, a threshold is set on one attribute (power), based on its distribution due to noise, as a function of object area. The results show an order of magnitude higher

  7. Analysis of blended fuel properties and cycle-to-cycle variation in a diesel engine with a diethyl ether additive

    International Nuclear Information System (INIS)

    Ali, Obed M.; Mamat, Rizalman; Masjuki, H.H.; Abdullah, Abdul Adam

    2016-01-01

    Highlights: • Viability of diethyl ether additive to improve palm biodiesel–diesel blend. • Numerical analysis of engine cyclic variation at different additive ratios. • Physicochemical properties of the blends improved with diethyl ether additive. • Blended fuel heating value is significantly affected. • Blended fuel with 4% diethyl ether shows comparable engine cyclic variation to diesel. - Abstract: In this study, the effect of adding small portions of a diethyl ether additive to biodiesel–diesel blended fuel (B30) was investigated. This study includes an evaluation of the fuel properties and a combustion analysis, specifically, an analysis of the cyclic variations in diesel engines. The amount of additive used with B30 is 2%, 4%, 6% and 8% (by volume). The experimental engine test was conducted at 2500 rpm, which produces maximum torque, and the in-cylinder pressure data were collected over 200 consecutive engine cycles for each test. The indicated mean effective pressure time series is analyzed using the coefficient of variation and the wavelet analysis method. The test results for the properties show a slight improvement in density and acid value, and a significant decrease in the viscosity, pour point and cloud point of the blended fuel with an 8% additive ratio, by 26.5%, 4 °C and 3 °C, respectively, compared with the blended fuel without additive. However, the heating value is reduced by approximately 4% when the additive ratio is increased to 8%. From the wavelet power spectrum, it is observed that the intermediate and long-term periodicities appear in diesel fuel, while the short-period oscillations become intermittently visible in pure blended fuel. The coefficient of variation for B30 was the lowest and increased as the additive ratios increased, which agrees with the wavelet analysis results. Furthermore, the spectral power increased with an increase in the additive ratio, indicating that the additive has a noticeable effect on increasing the
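    The cycle-to-cycle metric used above, the coefficient of variation (CoV) of the indicated mean effective pressure (IMEP), is simple to compute; the sketch below uses synthetic IMEP values and a continuous wavelet transform via the PyWavelets package (both the data and the wavelet choice are assumptions, not taken from the study).

```python
# Sketch: coefficient of variation (CoV) of IMEP over consecutive engine cycles,
# plus a continuous wavelet transform of the IMEP series to look for periodicities.
import numpy as np
import pywt

rng = np.random.default_rng(3)
cycles = np.arange(200)
imep = 6.5 + 0.15 * np.sin(2 * np.pi * cycles / 25) + rng.normal(scale=0.1, size=200)

cov_imep = 100.0 * imep.std() / imep.mean()     # CoV in percent
print(f"CoV of IMEP: {cov_imep:.2f} %")

scales = np.arange(2, 64)
coeffs, freqs = pywt.cwt(imep - imep.mean(), scales, "morl")  # wavelet power ~ |coeffs|**2
print("wavelet coefficient matrix:", coeffs.shape)
```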

  8. Cultural Variations across Academic Genres: A Generic Analysis of Intertextuality in Master's Theses Introductions

    Science.gov (United States)

    Ketabi, Saeed; Rahavard, Shaahin

    2013-01-01

    Genre analysis of texts has always been significant. The current study aimed at investigating intertextuality considering cultural variations and differences in students' discourse communities. Social studies, philosophy, and biology were chosen as the representatives of social sciences, humanities and sciences. Tehran University, one of the most…

  9. Manifold-Based Visual Object Counting.

    Science.gov (United States)

    Wang, Yi; Zou, Yuexian; Wang, Wenwu

    2018-07-01

    Visual object counting (VOC) is an emerging area in computer vision which aims to estimate the number of objects of interest in a given image or video. Recently, the object-density-based estimation method has been shown to be promising for object counting as well as rough instance localization. However, the performance of this method tends to degrade when dealing with new objects and scenes. To address this limitation, we propose a manifold-based method for visual object counting (M-VOC), based on the manifold assumption that similar image patches share similar object densities. Firstly, the local geometry of a given image patch is represented linearly by its neighbors using a predefined patch training set, and the object density of this given image patch is reconstructed by preserving the local geometry using locally linear embedding. To improve the characterization of local geometry, additional constraints such as sparsity and non-negativity are also considered via regularization, nonlinear mapping, and the kernel trick. Compared with the state-of-the-art VOC methods, our proposed M-VOC methods achieve competitive performance on seven benchmark datasets. Experiments verify that the proposed M-VOC methods have several favorable properties, such as robustness to variation in the size of the training dataset and in image resolution, as often encountered in real-world VOC applications.
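    The core manifold assumption (a patch's density is combined with the same weights that reconstruct the patch from its neighbors) can be sketched as follows; this is a simplified illustration, not the authors' M-VOC pipeline, and the regularization constant and data are assumptions.

```python
# Sketch: locally-linear-embedding-style density reconstruction for one image patch.
# Weights that best reconstruct the patch from its nearest training patches are
# reused to combine the neighbors' known object densities.
import numpy as np

def lle_weights(x, neighbors, reg=1e-3):
    """Solve for affine reconstruction weights (summing to 1) of x from its neighbors."""
    G = neighbors - x                      # shift neighbors to the query patch
    C = G @ G.T                            # local Gram (covariance) matrix
    C += reg * np.trace(C) * np.eye(len(neighbors))   # regularize for stability
    w = np.linalg.solve(C, np.ones(len(neighbors)))
    return w / w.sum()

rng = np.random.default_rng(0)
train_patches = rng.normal(size=(50, 256))     # 50 training patches, 16x16 flattened
train_density = rng.uniform(0, 3, size=50)     # known object counts per patch

query = train_patches[7] + rng.normal(scale=0.05, size=256)
idx = np.argsort(np.linalg.norm(train_patches - query, axis=1))[:5]   # 5 nearest patches
w = lle_weights(query, train_patches[idx])
print("estimated patch density:", float(w @ train_density[idx]))
```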

  10. Adobe Boxes: Locating Object Proposals Using Object Adobes.

    Science.gov (United States)

    Fang, Zhiwen; Cao, Zhiguo; Xiao, Yang; Zhu, Lei; Yuan, Junsong

    2016-09-01

    Despite previous efforts on object proposals, the detection rates of existing approaches are still not satisfactory. To address this, we propose Adobe Boxes to efficiently locate the potential objects with fewer proposals, by searching for object adobes, the salient object parts that are easy to perceive. Because of the visual difference between the object and its surroundings, an object adobe obtained from the local region has a high probability of being part of an object, and is capable of depicting the locative information of the proto-object. Our approach comprises three main procedures. First, the coarse object proposals are acquired by employing randomly sampled windows. Then, based on local-contrast analysis, the object adobes are identified within the enlarged bounding boxes that correspond to the coarse proposals. The final object proposals are obtained by converging the bounding boxes to tightly surround the object adobes. Meanwhile, our object adobes can also refine the detection rate of most state-of-the-art methods as a refinement approach. The extensive experiments on four challenging datasets (PASCAL VOC2007, VOC2010, VOC2012, and ILSVRC2014) demonstrate that the detection rate of our approach generally outperforms the state-of-the-art methods, especially with a relatively small number of proposals. The average time consumed on one image is about 48 ms, which nearly meets the real-time requirement.

  11. Integrating population variation and protein structural analysis to improve clinical interpretation of missense variation: application to the WD40 domain.

    Science.gov (United States)

    Laskowski, Roman A; Tyagi, Nidhi; Johnson, Diana; Joss, Shelagh; Kinning, Esther; McWilliam, Catherine; Splitt, Miranda; Thornton, Janet M; Firth, Helen V; Wright, Caroline F

    2016-03-01

    We present a generic, multidisciplinary approach for improving our understanding of novel missense variants in recently discovered disease genes exhibiting genetic heterogeneity, by combining clinical and population genetics with protein structural analysis. Using six new de novo missense diagnoses in TBL1XR1 from the Deciphering Developmental Disorders study, together with population variation data, we show that the β-propeller structure of the ubiquitous WD40 domain provides a convincing way to discriminate between pathogenic and benign variation. Children with likely pathogenic mutations in this gene have severely delayed language development, often accompanied by intellectual disability, autism, dysmorphology and gastrointestinal problems. Amino acids affected by likely pathogenic missense mutations are either crucial for the stability of the fold, forming part of a highly conserved symmetrically repeating hydrogen-bonded tetrad, or located at the top face of the β-propeller, where 'hotspot' residues affect the binding of β-catenin to the TBLR1 protein. In contrast, those altered by population variation are significantly less likely to be spatially clustered towards the top face or to be at buried or highly conserved residues. This result is useful not only for interpreting benign and pathogenic missense variants in this gene, but also in other WD40 domains, many of which are associated with disease. © The Author 2016. Published by Oxford University Press.

  12. Novel variational approach for analysis of photonic crystal slabs

    International Nuclear Information System (INIS)

    Aram, Mohammad Hasan; Khorasani, Sina

    2015-01-01

    We propose a new method, based on the variational principle, for the analysis of photonic crystal (PC) slabs. Most of the methods used today treat PC slabs as a three-dimensional (3D) crystal, which makes these methods very time- and/or memory-consuming. In our proposed method, we use the Bloch theorem to expand the field in an infinite set of plane waves, whose amplitudes depend on the coordinate perpendicular to the slab surface. By approximating these amplitudes with appropriate functions, we can find the modes of PC slabs almost as fast as we can find the modes of two-dimensional crystals. In addition to this advantage, we can also calculate radiation modes with this method, which is not feasible with the 3D plane wave expansion method. (paper)
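    As a hedged sketch of the expansion being described (the notation is assumed, not taken from the paper): writing rho for the in-plane position, z for the coordinate perpendicular to the slab, k for the Bloch wavevector and G for the reciprocal lattice vectors, a field component is expanded as

```latex
H(\boldsymbol{\rho}, z) \;=\; \sum_{\mathbf{G}} h_{\mathbf{G}}(z)\,
  e^{\, i (\mathbf{k} + \mathbf{G}) \cdot \boldsymbol{\rho}} ,
```

    and the variational step then approximates each z-dependent amplitude h_G(z) by a small set of trial functions, which is what reduces the slab computation to nearly two-dimensional cost.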

  13. ACCOUNTING INFORMATION SYSTEMS: AN APPROACH FOCUSED ON OBJECTS WITH INTELLIGENT AGENTS

    Directory of Open Access Journals (Sweden)

    Marcelo Botelho da Costa Moraes

    2010-01-01

    Full Text Available Accounting aims at the treatment of information related to economic events within organizations. In order to do so, the double-entry method (debit and credit accounting) is used, which only considers monetary variations. With the development of information technologies, accounting information systems emerged. In the 1980s, the REA model (economic Resources, economic Events and economic Agents) was created, which focuses on accounting information records based on the association of economic resources, economic events and economic agents. The objective of this work is to demonstrate object-oriented modeling with the use of intelligent agents, for the development and analysis of information focused on users. The proposed model is also analyzed in terms of accounting information quality, as required by accounting information users, and its capacity to meet the needs of different user groups, with advantages in applications.
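    As a hedged illustration of how the REA pattern maps onto object-oriented code (a minimal sketch, not the model proposed in the paper), the classes below link an economic Event to the Resource it affects and the Agents involved:

```python
# Minimal sketch of the REA pattern (economic Resources, Events, Agents) as objects.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Resource:
    name: str
    quantity: float          # REA records stock/flow, not only monetary value

@dataclass
class Agent:
    name: str
    role: str                # e.g. "customer", "cashier"

@dataclass
class EconomicEvent:
    description: str
    resource: Resource
    delta: float             # change applied to the resource by this event
    agents: List[Agent] = field(default_factory=list)

    def apply(self) -> None:
        """Record the event by updating the affected resource."""
        self.resource.quantity += self.delta

cash = Resource("cash", 1000.0)
sale = EconomicEvent("cash sale", cash, +250.0,
                     agents=[Agent("Alice", "customer"), Agent("Bob", "cashier")])
sale.apply()
print(cash)      # Resource(name='cash', quantity=1250.0)
```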

  14. Calculating potential error in sodium MRI with respect to the analysis of small objects.

    Science.gov (United States)

    Stobbe, Robert W; Beaulieu, Christian

    2018-06-01

    To facilitate correct interpretation of sodium MRI measurements, calculation of error with respect to rapid signal decay is introduced and combined with that of spatially correlated noise to assess volume-of-interest (VOI) 23Na signal measurement inaccuracies, particularly for small objects. Noise and signal decay-related error calculations were verified using twisted projection imaging and a specially designed phantom with different-sized spheres of constant elevated sodium concentration. As a demonstration, lesion signal measurement variation (5 multiple sclerosis participants) was compared with that predicted from calculation. Both theory and phantom experiment showed that VOI signal measurement in a large 10-mL, 314-voxel sphere was 20% less than expected on account of point-spread-function smearing when the VOI was drawn to include the full sphere. Volume-of-interest contraction reduced this error but increased noise-related error. Errors were even greater for smaller spheres (40-60% less than expected for a 0.35-mL, 11-voxel sphere). Image-intensity VOI measurements varied and increased with multiple sclerosis lesion size in a manner similar to that predicted from theory. Correlation suggests large underestimation of 23Na signal in small lesions. Acquisition-specific measurement error calculation aids 23Na MRI data analysis and highlights the limitations of current low-resolution methodologies. Magn Reson Med 79:2968-2977, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
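    A hedged illustration of the partial-volume effect being quantified (not the authors' error calculation): blur a small "sphere" of elevated signal with a broad point-spread function and compare the VOI mean with the true value. The grid size, sphere radius and blur width are assumptions.

```python
# Sketch: signal underestimation in a small object after point-spread-function smearing.
import numpy as np
from scipy.ndimage import gaussian_filter

shape = (64, 64, 64)
zz, yy, xx = np.indices(shape)
center = np.array(shape) // 2

radius_vox = 3                                           # "small" sphere (few voxels across)
sphere = (np.sqrt((zz - center[0])**2 + (yy - center[1])**2 +
                  (xx - center[2])**2) <= radius_vox)
image = np.where(sphere, 100.0, 20.0)                    # elevated signal on a background

blurred = gaussian_filter(image, sigma=2.0)              # crude stand-in for a broad PSF

voi_mean = blurred[sphere].mean()                        # VOI drawn on the full sphere
print(f"true signal 100.0, measured VOI mean {voi_mean:.1f} "
      f"({100 * (100.0 - voi_mean) / 100.0:.0f}% underestimation)")
```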

  15. Analysis of Camera Parameters Value in Various Object Distances Calibration

    International Nuclear Information System (INIS)

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

    In photogrammetric applications, good camera parameters are needed for mapping purposes, for example with an Unmanned Aerial Vehicle (UAV) equipped with a non-metric camera. Simple camera calibration is a common laboratory procedure for obtaining the camera parameter values. In aerial mapping, the interior camera parameters obtained from close-range camera calibration are used to correct image errors. However, the causes and effects of the calibration steps used to achieve accurate mapping need to be analysed. Therefore, this research contributes an analysis of camera parameters obtained with a portable calibration frame of 1.5 × 1 m. Object distances of two, three, four, five, and six meters are the research focus. Results are analysed to determine the changes in the image and camera parameter values. Hence, the calibration parameters of a camera are considered to differ depending on the type of calibration parameter and the object distance.
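    For context, a standard close-range estimation of interior parameters can be sketched with OpenCV as below; this uses a planar chessboard target rather than the study's 1.5 × 1 m calibration frame, so the board size and image folder are assumptions.

```python
# Sketch: estimating interior camera parameters (camera matrix, distortion)
# from several images of a planar chessboard target with OpenCV.
import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # inner corners of the assumed chessboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)  # unit square size

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.jpg"):       # assumed folder of target photos
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)
        image_size = gray.shape[::-1]

if obj_points:
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points,
                                                     image_size, None, None)
    print("RMS reprojection error:", rms)
    print("camera matrix:\n", K)                   # focal lengths and principal point
    print("distortion coefficients:", dist.ravel())
```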

  16. Transcriptome analysis of the sea cucumber (Apostichopus japonicus) with variation in individual growth.

    Science.gov (United States)

    Gao, Lei; He, Chongbo; Bao, Xiangbo; Tian, Meilin; Ma, Zhen

    2017-01-01

    The sea cucumber (Apostichopus japonicus) is an economically important aquaculture species in China. However, serious individual growth variation often causes financial losses to farmers, and its genetic mechanisms are poorly understood. In the present study, an extensive transcriptome-level analysis of individual growth variation in the sea cucumber was carried out. A total of 118946 unigenes were assembled from 255861 transcripts, with an N50 of 1700. Of all unigenes, about 23% were identified with at least one significant match to known databases. In the four pairwise comparisons, 1840 genes were found to be differentially expressed. Global hypometabolism was found to occur in the slow-growing population, leading to the hypothesis that the growth retardation seen in individual growth variation of the sea cucumber is a type of dormancy used to cope with adverse circumstances. In addition, pathways such as ECM-receptor interaction and focal adhesion, related to the maintenance of cell and tissue structure and communication, were enriched. Further, 76645 SSRs, 765242 SNPs and 146886 ins-dels were detected in the current study, providing an extensive set of data for future studies of genetic mapping and selective breeding. In summary, these results provide deep insight into the molecular basis of individual growth variation in marine invertebrates and will be valuable for understanding the physiological differences underlying the growth process.

  17. Transcriptome analysis of the sea cucumber (Apostichopus japonicus with variation in individual growth.

    Directory of Open Access Journals (Sweden)

    Lei Gao

    Full Text Available The sea cucumber (Apostichopus japonicus) is an economically important aquaculture species in China. However, serious individual growth variation often causes financial losses to farmers, and its genetic mechanisms are poorly understood. In the present study, an extensive transcriptome-level analysis of individual growth variation in the sea cucumber was carried out. A total of 118946 unigenes were assembled from 255861 transcripts, with an N50 of 1700. Of all unigenes, about 23% were identified with at least one significant match to known databases. In the four pairwise comparisons, 1840 genes were found to be differentially expressed. Global hypometabolism was found to occur in the slow-growing population, leading to the hypothesis that the growth retardation seen in individual growth variation of the sea cucumber is a type of dormancy used to cope with adverse circumstances. In addition, pathways such as ECM-receptor interaction and focal adhesion, related to the maintenance of cell and tissue structure and communication, were enriched. Further, 76645 SSRs, 765242 SNPs and 146886 ins-dels were detected in the current study, providing an extensive set of data for future studies of genetic mapping and selective breeding. In summary, these results provide deep insight into the molecular basis of individual growth variation in marine invertebrates and will be valuable for understanding the physiological differences underlying the growth process.

  18. Infrared polarimetry and photometry of BL Lac objects. 3

    Energy Technology Data Exchange (ETDEWEB)

    Holmes, P A; Brand, P W.J.L. [Edinburgh Univ. (UK). Dept. of Astronomy; Impey, C D [Hawaii Univ., Honolulu (USA). Inst. for Astronomy; Williams, P M [UKIRT, Hilo, HI (USA)

    1984-10-15

    The data presented here are part of a continuing monitoring programme of BL Lac objects with J, H and K photometry and polarimetry. A total of 30 BL Lac objects have now been observed photometrically. Infrared polarimetry has also been obtained for 24 of these objects. The sample is sufficiently large to examine statistically, and several important correlations have emerged. Internight variations and wavelength dependence of polarization indicate that BL Lac objects, as a class, may be understood in terms of a relatively simple two-component model.

  19. GPR Detection of Buried Symmetrically Shaped Mine-like Objects using Selective Independent Component Analysis

    DEFF Research Database (Denmark)

    Karlsen, Brian; Sørensen, Helge Bjarup Dissing; Larsen, Jan

    2003-01-01

    This paper addresses the detection of mine-like objects in stepped-frequency ground penetrating radar (SF-GPR) data as a function of object size, object content, and burial depth. The detection approach is based on a Selective Independent Component Analysis (SICA). SICA provides an automatic ranking of components, which enables the suppression of clutter, hence extraction of components carrying mine information. The goal of the investigation is to evaluate various time and frequency domain ICA approaches based on SICA. Performance comparison is based on a series of mine-like objects ranging from small-scale anti-personnel (AP) mines to large-scale anti-tank (AT) mines, which were designed for the study. Large-scale SF-GPR measurements on this series of mine-like objects buried in soil were performed. The SF-GPR data were acquired using a wideband monostatic bow-tie antenna operating in the frequency range 750...
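    A hedged sketch of the ICA-based clutter-suppression idea (not the SICA algorithm itself): decompose a set of GPR traces with FastICA, rank the components by a simple statistic, and reconstruct the data from the highest-ranked components only. The data here are synthetic and the ranking statistic (kurtosis) is an assumption.

```python
# Sketch: ICA decomposition of GPR traces and reconstruction from selected components.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_traces, n_samples = 64, 256
t = np.linspace(0, 1, n_samples)
clutter = np.outer(np.ones(n_traces), np.sin(2 * np.pi * 5 * t))       # ground response
target = np.zeros((n_traces, n_samples))
target[28:36] = np.exp(-((t - 0.4) / 0.02) ** 2)                        # buried object echo
X = clutter + target + 0.1 * rng.normal(size=(n_traces, n_samples))

ica = FastICA(n_components=8, random_state=0)
S = ica.fit_transform(X)                     # per-trace component activations
A = ica.mixing_                              # component waveforms in the data space

scores = kurtosis(S, axis=0)                 # rank components; spiky ones are candidates
keep = np.argsort(scores)[-2:]               # keep the two most "mine-like" components

X_mine = S[:, keep] @ A[:, keep].T + ica.mean_   # reconstruction with clutter suppressed
print("selected components:", keep, "reconstruction shape:", X_mine.shape)
```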

  20. Geometrical treatment of non-potential interactions: the exterior variational calculus, dynamical systems, physical 1-forms and variational selfadjointness

    International Nuclear Information System (INIS)

    Trostel, R.

    1982-01-01

    A mathematical objective of this paper is to provide geometrical formulation of the integrability conditions for the existence of an action functional, that is, to provide a geometrical counterpart (similar to that by Abraham, Marsden, and Hughes) of the variational and functional approach to self-adjointness. This objective is achieved via the exterior variational calculus, an exterior differential calculus on the vector space of functions depending on time or space time, using from the outset extensively the concept of functional differentiation as its foundation. Variational self-adjointness equals the variational closure of the physical 1-form, the vanishing of a generalized curl-operation applied to the equations of motion. The convenience of this more formal approach is demonstrated, not only when deriving the conditions of variational self-adjointness for materials of differential type of arbitrary order (particles or fields), using roughly no more than Dirac's delta-distributions, but also when treating materials of a broader class (including causal and acausal constitutive functionals, materials of rate type, integral type, etc.). A physical objective of this paper is achieved by pointing out that, as physics is primarily concerned with the solutions of the evolution equations, i.e., with the set of the zero points of the physical 1-form, an equivalence relation among the physical 1-forms on the infinite dimensional vector space of functions is constructed by leaving the set of their zero points unchanged. Using this result, a direct Lagrangian universality is indicated and an almost one presented. Moreover, all physical 1-forms connected by invertible supermatrices (thus mixing the evolution law of different times or space-time) are equivalent. Choosing these supermatrices to be diagonal in time or space-time yields the indirect analytical representation factors

  1. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    Science.gov (United States)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed, based on statistical analysis of the envelopes of the received signals. The values of the components of the amplitude spectrum of the envelopes, obtained by applying Hilbert and Fourier transforms, are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
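    A hedged sketch of the envelope-spectrum features described above (the signal is synthetic, and the sampling rate and normalization are assumptions):

```python
# Sketch: amplitude spectrum of a seismic signal envelope (Hilbert + Fourier transforms).
import numpy as np
from scipy.signal import hilbert

fs = 500.0                                        # assumed sampling rate, Hz
t = np.arange(0, 4.0, 1 / fs)
# Synthetic footstep-like signal: ~2 Hz bursts of a higher-frequency carrier plus noise.
signal = (1 + np.sin(2 * np.pi * 2.0 * t)) * np.sin(2 * np.pi * 40.0 * t)
signal += 0.2 * np.random.default_rng(0).normal(size=t.size)

envelope = np.abs(hilbert(signal))                # instantaneous amplitude
envelope -= envelope.mean()

spectrum = np.abs(np.fft.rfft(envelope)) / t.size # amplitude spectrum of the envelope
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
print("dominant envelope frequency: %.1f Hz" % freqs[spectrum.argmax()])
# Such low-frequency envelope components (e.g. step rate) serve as classification features.
```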

  2. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    Science.gov (United States)

    2017-02-01

    Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed...prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks. The framework integrates multiple image processing...

  3. Analysis of temporal variation in human masticatory cycles during gum chewing.

    Science.gov (United States)

    Crane, Elizabeth A; Rothman, Edward D; Childers, David; Gerstner, Geoffrey E

    2013-10-01

    The study investigated modulation of fast and slow opening (FO, SO) and closing (FC, SC) chewing cycle phases using gum-chewing sequences in humans. Twenty-two healthy adult subjects participated by chewing gum for at least 20s on the right side and at least 20s on the left side while jaw movements were tracked with a 3D motion analysis system. Jaw movement data were digitized, and chewing cycle phases were identified and analysed for all chewing cycles in a complete sequence. All four chewing cycle phase durations were more variant than total cycle durations, a result found in other non-human primates. Significant negative correlations existed between the opening phases, SO and FO, and between the closing phases, SC and FC; however, there was less consistency in terms of which phases were negatively correlated both between subjects, and between chewing sides within subjects, compared with results reported in other species. The coordination of intra-cycle phases appears to be flexible and to follow complex rules during gum-chewing in humans. Alternatively, the observed intra-cycle phase relationships could simply reflect: (1) variation in jaw kinematics due to variation in how gum was handled by the tongue on a chew-by-chew basis in our experimental design or (2) by variation due to data sampling noise and/or how phases were defined and identified. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. Variational submanifolds of Euclidean spaces

    Science.gov (United States)

    Krupka, D.; Urban, Z.; Volná, J.

    2018-03-01

    Systems of ordinary differential equations (or dynamical forms in Lagrangian mechanics), induced by embeddings of smooth fibered manifolds over one-dimensional basis, are considered in the class of variational equations. For a given non-variational system, conditions assuring variationality (the Helmholtz conditions) of the induced system with respect to a submanifold of a Euclidean space are studied, and the problem of existence of these "variational submanifolds" is formulated in general and solved for second-order systems. The variational sequence theory on sheaves of differential forms is employed as a main tool for the analysis of local and global aspects (variationality and variational triviality). The theory is illustrated by examples of holonomic constraints (submanifolds of a configuration Euclidean space) which are variational submanifolds in geometry and mechanics.
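    For orientation, the Helmholtz conditions referred to above can be stated in their standard textbook form (the classical second-order case; this is not a quotation from the paper): a system F_i(t, q, dq/dt, d²q/dt²) = 0 is locally variational, i.e. admits a Lagrangian, when

```latex
\frac{\partial F_i}{\partial \ddot q^{\,j}} = \frac{\partial F_j}{\partial \ddot q^{\,i}}, \qquad
\frac{\partial F_i}{\partial \dot q^{\,j}} + \frac{\partial F_j}{\partial \dot q^{\,i}}
  = \frac{d}{dt}\!\left( \frac{\partial F_i}{\partial \ddot q^{\,j}}
  + \frac{\partial F_j}{\partial \ddot q^{\,i}} \right), \qquad
\frac{\partial F_i}{\partial q^{\,j}} - \frac{\partial F_j}{\partial q^{\,i}}
  = \frac{1}{2}\,\frac{d}{dt}\!\left( \frac{\partial F_i}{\partial \dot q^{\,j}}
  - \frac{\partial F_j}{\partial \dot q^{\,i}} \right).
```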

  5. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    Science.gov (United States)

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly (P < .05), with large effect sizes (d > 0.8) for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), and total bite-size-to-travel ratio (P < .05). The algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.

  6. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.
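
    For readers unfamiliar with the two skewing constructions referred to above, commonly used standardized densities (location 0, scale 1) are reproduced below; these are the usual parameterizations in the literature and the paper's exact notation may differ.

```latex
% Skew-t densities with shape parameter \alpha and \nu degrees of freedom.
% t_\nu and T_\nu denote the Student's t density and distribution function.
% Linear skewing function:
f(z;\alpha,\nu) = 2\, t_\nu(z)\, T_\nu(\alpha z)
% Nonlinear (Azzalini-Capitanio) skewing function:
f(z;\alpha,\nu) = 2\, t_\nu(z)\,
  T_{\nu+1}\!\left(\alpha z \sqrt{\tfrac{\nu+1}{\nu+z^{2}}}\right)
```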

  7. Anatomical variations of hepatic arterial system, coeliac trunk and renal arteries: an analysis with multidetector CT angiography.

    Science.gov (United States)

    Ugurel, M S; Battal, B; Bozlar, U; Nural, M S; Tasar, M; Ors, F; Saglam, M; Karademir, I

    2010-08-01

    The purpose of our investigation was to determine the anatomical variations in the coeliac trunk-hepatic arterial system and the renal arteries in patients who underwent multidetector CT (MDCT) angiography of the abdominal aorta for various reasons. A total of 100 patients were analysed retrospectively. The coeliac trunk, hepatic arterial system and renal arteries were analysed individually and anatomical variations were recorded. Statistical analysis of the relationship between hepatocoeliac variations and renal artery variations was performed using a chi-squared (χ²) test. There was a coeliac trunk trifurcation in 89% and bifurcation in 8% of the cases. Coeliac trunk was absent in 1%, a hepatosplenomesenteric trunk was seen in 1% and a splenomesenteric trunk was present in 1%. Hepatic artery variation was present in 48% of patients. Coeliac trunk and/or hepatic arterial variation was present in 23 (39.7%) of the 58 patients with normal renal arteries, and in 27 (64.3%) of the 42 patients with accessory renal arteries. There was a statistically significant correlation between renal artery variations and coeliac trunk-hepatic arterial system variations (p = 0.015). MDCT angiography permits a correct and detailed evaluation of hepatic and renal vascular anatomy. The prevalence of variations in the coeliac trunk and/or hepatic arteries is increased in people with accessory renal arteries. For that reason, when undertaking angiographic examinations directed towards any single organ, the possibility of variations in the vascular structure of other organs should be kept in mind.
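
    As an illustration of the kind of association test reported above, the sketch below applies a chi-squared test to a 2x2 contingency table. The counts are taken from the abstract (23 of 58 vs 27 of 42 patients with coeliac/hepatic variations), but the exact tabulation used by the authors is an assumption.

```python
# Minimal sketch of a chi-squared test of association, assuming a 2x2 tabulation
# of hepatocoeliac arterial variation vs. presence of accessory renal arteries.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: normal renal arteries, accessory renal arteries
# Columns: with hepatocoeliac variation, without
table = np.array([[23, 58 - 23],
                  [27, 42 - 27]])

# Note: chi2_contingency applies Yates continuity correction to 2x2 tables by default,
# so the p-value need not match the one reported in the paper exactly.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
```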

  8. Object-based vegetation classification with high resolution remote sensing imagery

    Science.gov (United States)

    Yu, Qian

    Vegetation species are valuable indicators to understand the earth system. Information from mapping of vegetation species and community distribution at large scales provides important insight for studying the phenological (growth) cycles of vegetation and plant physiology. Such information plays an important role in land process modeling including climate, ecosystem and hydrological models. The rapidly growing remote sensing technology has increased its potential in vegetation species mapping. However, extracting information at a species level is still a challenging research topic. I proposed an effective method for extracting vegetation species distribution from remotely sensed data and investigated some ways for accuracy improvement. The study consists of three phases. Firstly, a statistical analysis was conducted to explore the spatial variation and class separability of vegetation as a function of image scale. This analysis aimed to confirm that high resolution imagery contains the information on spatial vegetation variation and these species classes can be potentially separable. The second phase was a major effort in advancing classification by proposing a method for extracting vegetation species from high spatial resolution remote sensing data. The proposed classification employs an object-based approach that integrates GIS and remote sensing data and explores the usefulness of ancillary information. The whole process includes image segmentation, feature generation and selection, and nearest neighbor classification. The third phase introduces a spatial regression model for evaluating the mapping quality from the above vegetation classification results. The effects of six categories of sample characteristics on the classification uncertainty are examined: topography, sample membership, sample density, spatial composition characteristics, training reliability and sample object features. This evaluation analysis answered several interesting scientific questions
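
    A highly simplified sketch of the object-based workflow described above (segmentation, per-object feature extraction, nearest-neighbour classification) is given below using scikit-image and scikit-learn; the test image, feature set and class labels are placeholders for illustration, not the variables used in the study.

```python
# Minimal sketch of an object-based classification chain: segment the image,
# compute simple per-object features, then classify objects with a
# nearest-neighbour classifier. Data, features and labels are placeholders.
import numpy as np
from skimage import data
from skimage.segmentation import slic
from sklearn.neighbors import KNeighborsClassifier

image = data.astronaut()                            # stand-in for a high-resolution scene
segments = slic(image, n_segments=150, compactness=10, start_label=0)

# Per-object features: mean value of each band within every segment
n_seg = segments.max() + 1
features = np.array([image[segments == s].mean(axis=0) for s in range(n_seg)])

# Pretend a handful of segments were labelled by an analyst (training data)
rng = np.random.default_rng(0)
train_idx = rng.choice(n_seg, size=30, replace=False)
train_labels = rng.integers(0, 3, size=30)          # 3 hypothetical vegetation classes

clf = KNeighborsClassifier(n_neighbors=3).fit(features[train_idx], train_labels)
predicted = clf.predict(features)                   # one class per image object
print(predicted[:10])
```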

  9. Current evidence on hospital antimicrobial stewardship objectives : A systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E J L; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W T Cohen; Overdiek, Hans W P M; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M P M; Wolfs, Tom F W; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  10. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, E.C.; Hulscher, M.E.J.L.; Mouton, J.W.; Verduin, C.M.; Stuart, J.W.; Overdiek, H.W.; Linden, P.D. van der; Natsch, S.S.; Hertogh, C.M.; Wolfs, T.F.; Schouten, J.A.; Kullberg, B.J.; Prins, J.M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  11. Variation tolerant SoC design

    Science.gov (United States)

    Kozhikkottu, Vivek J.

    The scaling of integrated circuits into the nanometer regime has led to variations emerging as a primary concern for designers of integrated circuits. Variations are an inevitable consequence of the semiconductor manufacturing process, and also arise due to the side-effects of operation of integrated circuits (voltage, temperature, and aging). Conventional design approaches, which are based on design corners or worst-case scenarios, leave designers with an undesirable choice between the considerable overheads associated with over-design and significantly reduced manufacturing yield. Techniques for variation-tolerant design at the logic, circuit and layout levels of the design process have been developed and are in commercial use. However, with the incessant increase in variations due to technology scaling and design trends such as near-threshold computing, these techniques are no longer sufficient to contain the effects of variations, and there is a need to address variations at all stages of design. This thesis addresses the problem of variation-tolerant design at the earliest stages of the design process, where the system-level design decisions that are made can have a very significant impact. There are two key aspects to making system-level design variation-aware. First, analysis techniques must be developed to project the impact of variations on system-level metrics such as application performance and energy. Second, variation-tolerant design techniques need to be developed to absorb the residual impact of variations (that cannot be contained through lower-level techniques). In this thesis, we address both these facets by developing robust and scalable variation-aware analysis and variation mitigation techniques at the system level. The first contribution of this thesis is a variation-aware system-level performance analysis framework. We address the key challenge of translating the per-component clock frequency distributions into a system-level application
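
    The first contribution described above, translating per-component clock frequency distributions into a distribution of a system-level metric, can be illustrated with a small Monte Carlo sketch; the component set, distributions and the sequential performance model below are assumptions made for illustration, not the framework proposed in the thesis.

```python
# Minimal Monte Carlo sketch: sample per-component clock frequencies from
# process-variation distributions and propagate them to a system-level metric.
# Components, distributions and the performance model are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# Nominal frequencies (GHz) and variation (sigma as a fraction of nominal)
components = {"cpu": (2.0, 0.05), "accelerator": (1.5, 0.08), "memory_ctrl": (1.0, 0.04)}

# Work per component per task (cycles, arbitrary units)
cycles = {"cpu": 4.0e6, "accelerator": 2.5e6, "memory_ctrl": 1.0e6}

freqs = {name: rng.normal(nom, nom * sigma, n_samples)
         for name, (nom, sigma) in components.items()}

# Assumed model: components operate sequentially, so per-task latency adds up
latency = sum(cycles[name] / (freqs[name] * 1e9) for name in components)  # seconds

print(f"mean latency   : {latency.mean() * 1e3:.3f} ms")
print(f"99th percentile: {np.percentile(latency, 99) * 1e3:.3f} ms")
```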

  12. Measurement of isotope abundance variations in nature by gravimetric spiking isotope dilution analysis (GS-IDA).

    Science.gov (United States)

    Chew, Gina; Walczyk, Thomas

    2013-04-02

    Subtle variations in the isotopic composition of elements carry unique information about physical and chemical processes in nature and are now exploited widely in diverse areas of research. Reliable measurement of natural isotope abundance variations is among the biggest challenges in inorganic mass spectrometry as they are highly sensitive to methodological bias. For decades, double spiking of the sample with a mix of two stable isotopes has been considered the reference technique for measuring such variations both by multicollector-inductively coupled plasma mass spectrometry (MC-ICPMS) and multicollector-thermal ionization mass spectrometry (MC-TIMS). However, this technique can only be applied to elements having at least four stable isotopes. Here we present a novel approach that requires measurement of three isotope signals only and which is more robust than the conventional double spiking technique. This became possible by gravimetric mixing of the sample with an isotopic spike in different proportions and by applying principles of isotope dilution for data analysis (GS-IDA). The potential and principle use of the technique is demonstrated for Mg in human urine using MC-TIMS for isotopic analysis. Mg is an element inaccessible to double spiking methods as it consists of three stable isotopes only and shows great potential for metabolically induced isotope effects waiting to be explored.

  13. Analysis of substructural variation in families of enzymatic proteins with applications to protein function prediction

    Directory of Open Access Journals (Sweden)

    Fofanov Viacheslav Y

    2010-05-01

    Full Text Available Abstract Background Structural variations caused by a wide range of physico-chemical and biological sources directly influence the function of a protein. For enzymatic proteins, the structure and chemistry of the catalytic binding site residues can be loosely defined as a substructure of the protein. Comparative analysis of drug-receptor substructures across and within species has been used for lead evaluation. Substructure-level similarity between the binding sites of functionally similar proteins has also been used to identify instances of convergent evolution among proteins. In functionally homologous protein families, shared chemistry and geometry at catalytic sites provide a common, local point of comparison among proteins that may differ significantly at the sequence, fold, or domain topology levels. Results This paper describes two key results that can be used separately or in combination for protein function analysis. The Family-wise Analysis of SubStructural Templates (FASST) method uses all-against-all substructure comparison to determine Substructural Clusters (SCs). SCs characterize the binding site substructural variation within a protein family. In this paper we focus on examples of automatically determined SCs that can be linked to phylogenetic distance between family members, segregation by conformation, and organization by homology among convergent protein lineages. The Motif Ensemble Statistical Hypothesis (MESH) framework constructs a representative motif for each protein cluster among the SCs determined by FASST to build motif ensembles that are shown through a series of function prediction experiments to improve the function prediction power of existing motifs. Conclusions FASST contributes a critical feedback and assessment step to existing binding site substructure identification methods and can be used for the thorough investigation of structure-function relationships. The application of MESH allows for an automated

  14. A Psychoacoustic-Based Multiple Audio Object Coding Approach via Intra-Object Sparsity

    Directory of Open Access Journals (Sweden)

    Maoshen Jia

    2017-12-01

    Full Text Available Rendering spatial sound scenes via audio objects has become popular in recent years, since it can provide more flexibility for different auditory scenarios, such as 3D movies, spatial audio communication and virtual classrooms. To facilitate high-quality bitrate-efficient distribution for spatial audio objects, an encoding scheme based on intra-object sparsity (approximate k-sparsity) of the audio object itself is proposed in this paper. The statistical analysis is presented to validate the notion that the audio object has a stronger sparseness in the Modified Discrete Cosine Transform (MDCT) domain than in the Short Time Fourier Transform (STFT) domain. By exploiting intra-object sparsity in the MDCT domain, multiple simultaneously occurring audio objects are compressed into a mono downmix signal with side information. To ensure a balanced perception quality of audio objects, a Psychoacoustic-based time-frequency instants sorting algorithm and an energy equalized Number of Preserved Time-Frequency Bins (NPTF) allocation strategy are proposed, which are employed in the underlying compression framework. The downmix signal can be further encoded via the Scalar Quantized Vector Huffman Coding (SQVH) technique at a desirable bitrate, and the side information is transmitted in a lossless manner. Both objective and subjective evaluations show that the proposed encoding scheme outperforms the Sparsity Analysis (SPA) approach and Spatial Audio Object Coding (SAOC) in cases where eight objects were jointly encoded.
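
    The claim that an audio object is "sparser" in the MDCT domain than in the STFT domain is typically checked with an energy-compaction measure: the fraction of total energy captured by the k largest coefficients. The sketch below uses a plain DCT as a stand-in for the lapped MDCT and a synthetic test tone, so it only illustrates the measurement, not the paper's codec.

```python
# Sketch of an energy-compaction comparison between a DCT-domain and an
# STFT-domain representation of a signal. A plain DCT stands in for the MDCT,
# and the test signal is synthetic; both are simplifying assumptions.
import numpy as np
from scipy.fft import dct
from scipy.signal import stft

fs = 16_000
t = np.arange(0, 1.0, 1 / fs)
x = 0.6 * np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)

def compaction(coeffs, k):
    """Fraction of total energy captured by the k largest-magnitude coefficients."""
    mags = np.sort(np.abs(coeffs).ravel())[::-1]
    energy = mags ** 2
    return energy[:k].sum() / energy.sum()

dct_coeffs = dct(x, norm="ortho")
_, _, stft_coeffs = stft(x, fs=fs, nperseg=1024)

# Note: the two transforms have different numbers of coefficients; this is only
# a rough illustration of the sparsity comparison, not a rigorous benchmark.
k = 200
print(f"DCT : top-{k} coefficients hold {compaction(dct_coeffs, k):.3f} of the energy")
print(f"STFT: top-{k} coefficients hold {compaction(stft_coeffs, k):.3f} of the energy")
```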

  15. Contrasting Specific English Corpora: Language Variation

    Directory of Open Access Journals (Sweden)

    María Luisa Carrió Pastor

    2009-12-01

    Full Text Available The scientific community has traditionally considered technical English as neutral and objective, able to transmit ideas and research in simple sentences and specialized vocabulary. Nevertheless, global communication and intense information delivery have produced a range of different ways of knowledge transmission. Although technical English is considered an objective way to transmit science, writers of academic papers use some words or structures with different frequency in the same genre. As a consequence, contrastive studies about the use of second languages have been attracting increasing scholarly attention. In this research, we show that variation in language production is a reality and can be demonstrated by contrasting corpora written by native and non-native writers of English. The objectives of this paper are, first, to detect language variation in a technical English corpus; second, to demonstrate that this finding reveals which parts of the sentence are more sensitive to variation; and, finally, to provide evidence of the non-standardisation of technical English. In order to fulfil these objectives, we analysed a corpus of fifty scientific articles written by native speakers of English and fifty scientific articles written by non-native speakers of English. The occurrences were classified and counted in order to detect the most common variations. Further analysis indicated that the variations were caused by mother tongue interference in virtually all cases, although meaning was only very rarely obscured. These findings suggest that the use of certain patterns and expressions originating from L1 interference should be considered as correct as standard English.

  16. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2013-01-01

    Full Text Available Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish among the feasible solutions (involving different criteria) obtained for block layout and to identify a solution's suitability according to the set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to solve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices and showed that a minimum number of indices covering the overall quality criteria may be used when selecting alternative solutions.

  17. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis

    Science.gov (United States)

    Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.

    2012-04-01

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does. The latter uses the object's color (spectral information), size, texture, shape and relation to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image, grouping pixels into objects, and then uses a wide range of object properties to classify the objects or to extract object properties from the image. Significant advances and improvements in image analysis and interpretation are made thanks to GEOBIA. In June 2010 the third conference on GEOBIA took place at Ghent University after successful previous meetings in Calgary (2008) and Salzburg (2006). This special issue presents a selection of the 2010 conference papers that are worked out as full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques. The topics range from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping, land cover change, feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine papers were selected and are presented in this special issue. Selection was done on the basis of quality and topic of the studies. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012 where we hope to welcome even more scientists working in the field of GEOBIA.

  18. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E. J. L.; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W. T. Cohen; Overdiek, Hans W. P. M.; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M. P. M.; Wolfs, Tom F. W.; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes: clinical outcomes,

  19. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-09-01

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high energy neutrons suggests potentially unique applications. In addition, fusion R and D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other, are the two primary criteria for setting long range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R and D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long range objective of commercial fusion applications.

  20. Inter- and intra-industry variations of capital structure in the Czech manufacturing industry

    Directory of Open Access Journals (Sweden)

    Pavlína Pinková

    2013-01-01

    Full Text Available The objective of the paper is to investigate the existence of inter-industry variations in the capital structure of enterprises of the Czech manufacturing industry and to identify the intra-industry causes of these differences. Three measures of capital structure are employed to determine the inter-industry variations. These are total debt ratio, long-term debt and short-term debt ratios. The set of explanatory variables is included to clarify the intra-industry variations. These explanatory variables are size, asset structure, asset utilization, profitability, non-debt tax shield and growth. The paper reports the analysis of capital structure of five distinctive industrial branches, namely the manufacture of beverages, the manufacture of textiles, the manufacture of paper and paper products, the manufacture of chemicals and chemical products, and the manufacture of computer, electronic and optical products. The data come from the financial statements of selected companies and cover a period from 2008 to 2012. The analysis of variance, correlation and regression analyses are used to develop the statistical framework. The paper aims to study the impact of industry and firm characteristics on capital structure choice.

  1. Seven-Year Multi-Colour Optical Monitoring of BL Lacertae Object ...

    Indian Academy of Sciences (India)

    2016-01-27

    Jan 27, 2016 ... We monitored the BL Lac object S5 0716+714 in five intermediate optical passbands from 2004 September to 2011 April. The object was active most of the time and intra-day variability was frequently observed. The total variation amplitude tended to decrease with decreasing frequency. Strong ...

  2. Effects of memory colour on colour constancy for unknown coloured objects

    OpenAIRE

    Granzier, Jeroen J M; Gegenfurtner, Karl R

    2012-01-01

    The perception of an object's colour remains constant despite large variations in the chromaticity of the illumination—colour constancy. Hering suggested that memory colours, the typical colours of objects, could help in estimating the illuminant's colour and therefore be an important factor in establishing colour constancy. Here we test whether the presence of objects with diagnostical colours (fruits, vegetables, etc) within a scene influence colour constancy for unknown coloured objects in...

  3. Analysis of interfraction and intrafraction variation during tangential breast irradiation with an electronic portal imaging device

    International Nuclear Information System (INIS)

    Smith, Ryan P.; Bloch, Peter; Harris, Eleanor E.; McDonough, James; Sarkar, Abhirup; Kassaee, Alireza; Avery, Steven; Solin, Lawrence J.

    2005-01-01

    Purpose: To evaluate the daily setup variation and the anatomic movement of the heart and lungs during breast irradiation with tangential photon beams, as measured with an electronic portal imaging device. Methods and materials: Analysis of 1,709 portal images determined changes in the radiation field during a treatment course in 8 patients. Values obtained for every image included central lung distance (CLD) and area of lung and heart within the irradiated field. The data from these measurements were used to evaluate variation from setup between treatment days and motion due to respiration and/or patient movement during treatment delivery. Results: The effect of respiratory motion and movement during treatment was minimal: the maximum range in CLD for any patient on any day was 0.25 cm. The variation caused by day-to-day setup variation was greater, with CLD values for patients ranging from 0.59 cm to 2.94 cm. Similar findings were found for heart and lung areas. Conclusions: There is very little change in CLD and corresponding lung and heart area during individual radiation treatment fractions in breast tangential fields, compared with a relatively greater amount of variation that occurs between days

  4. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Science.gov (United States)

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, and the separation was shown to correlate strongly with Pfirrmann grade. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may be a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals need to be compared.
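
    A minimal sketch of the kind of histogram feature described above: build an intensity histogram of a segmented disc region, locate its peaks, and measure their separation. The synthetic two-population data and the smoothing and peak-detection parameters are assumptions made for illustration, not the authors' pipeline.

```python
# Sketch: histogram of voxel intensities in a region of interest, peak detection,
# and peak separation as a degeneration feature. Synthetic data and parameters
# are illustrative assumptions.
import numpy as np
from scipy.signal import find_peaks
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(1)
# Two voxel populations, mimicking annulus fibrosus (dark) and nucleus pulposus (bright)
roi_values = np.concatenate([rng.normal(40, 8, 4000), rng.normal(120, 15, 3000)])

counts, edges = np.histogram(roi_values, bins=64)
smoothed = gaussian_filter1d(counts.astype(float), sigma=2)

peaks, _ = find_peaks(smoothed, prominence=smoothed.max() * 0.1)
centres = 0.5 * (edges[:-1] + edges[1:])

if len(peaks) >= 2:
    top_two = peaks[np.argsort(smoothed[peaks])[-2:]]
    separation = abs(centres[top_two[0]] - centres[top_two[1]])
    print(f"peak separation: {separation:.1f} (intensity units)")
else:
    print("fewer than two peaks detected (degenerated-looking histogram)")
```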

  5. Comparing variation across European countries

    DEFF Research Database (Denmark)

    Thygesen, Lau C; Baixauli-Pérez, Cristobal; Librero-López, Julián

    2015-01-01

    BACKGROUND: In geographical studies, population distribution is a key issue. An unequal distribution across units of analysis might entail extra-variation and produce misleading conclusions on healthcare performance variations. This article aims at assessing the impact of building more homogeneou...

  6. Quantitative analysis of structural variations in corpus callosum in adults with multiple system atrophy (MSA)

    Science.gov (United States)

    Bhattacharya, Debanjali; Sinha, Neelam; Saini, Jitender

    2017-03-01

    Multiple system atrophy (MSA) is a rare, incurable, progressive neurodegenerative disorder that affects the nervous system and movement and poses a considerable diagnostic challenge to medical researchers. As the corpus callosum (CC) is the largest white matter structure in the brain, enabling inter-hemispheric communication, quantification of callosal atrophy may provide vital information at the earliest possible stages. The main objective is to identify differences in CC structure for this disease, based on a quantitative analysis of the pattern of callosal atrophy. We report results of quantifying structural changes in regional anatomical thickness, area and length of the CC in patient groups with MSA with respect to healthy controls. The method isolates and parcellates the mid-sagittal CC into 100 segments along its length, measuring the width of each segment. It also measures areas within five geometrically defined callosal compartments of the well-known Witelson and Hofer-Frahm schemes. For quantification, statistical tests are performed on these different callosal measurements. The statistical analysis shows that, compared to healthy controls, width is reduced drastically throughout the CC in the MSA group, and changes in area and length are also significant. The study is further extended to check whether any significant difference in thickness exists between the two variants of MSA, Parkinsonian MSA and Cerebellar MSA, using the same methodology; however, no substantial difference in area and length is obtained between these two MSA subgroups. The study is performed on twenty subjects in each of the control and MSA groups, all of whom had T1-weighted MRI.

  7. Context-based object-of-interest detection for a generic traffic surveillance analysis system

    NARCIS (Netherlands)

    Bao, X.; Javanbakhti, S.; Zinger, S.; Wijnhoven, R.G.J.; With, de P.H.N.

    2014-01-01

    We present a new traffic surveillance video analysis system, focusing on building a framework with robust and generic techniques, based on both scene understanding and moving object-of-interest detection. Since traffic surveillance is widely applied, we want to design a single system that can be

  8. Analysis of the average daily radon variations in the soil air

    International Nuclear Information System (INIS)

    Holy, K.; Matos, M.; Boehm, R.; Stanys, T.; Polaskova, A.; Hola, O.

    1998-01-01

    In this contribution, the search for a relation between the daily variations of the radon concentration and the regular daily oscillations of the atmospheric pressure is presented. The deviation of the radon activity concentration in the soil air from the average daily value reaches only a few percent. For the dry summer months, the average daily course of the radon activity concentration can be described by the derived equation. The analysis of the average daily courses could give information concerning the depth of the gas-permeable soil layer, a soil parameter that is otherwise determined only with difficulty by other methods.

  9. A measurement based analysis of the spatial distribution, temporal variation and chemical composition of particulate matter in Munich and Augsburg

    Directory of Open Access Journals (Sweden)

    Klaus Schäfer

    2011-02-01

    Full Text Available The objective of the studies presented in this paper is to analyse the spatial distribution and temporal variation of particulate matter in Munich and Augsburg, Germany, and to identify and discuss the factors determining the aerosol pollution in both areas. Surface-based in-situ and remote sensing measurements of particle mass and particle size distribution have been performed in, around, and above the two cities. Two measurement campaigns were conducted in Munich, one in late spring and one in winter 2003. Another campaign has been on-going in Augsburg since 2004. Spatial and temporal variations are analyzed from these data (PM10, PM2.5, and PM1). There are higher particle mass concentrations at the urban site than at the surrounding rural sites, especially in winter. No significant difference in the major ionic composition of the particles between the urban and the rural site was detected. This is considered to be related to the spatial distribution of secondary inorganic aerosol, which is more homogeneous than aerosol resulting from other sources like traffic or urban releases in general. During the measurement campaigns, mixing layer heights were determined continuously by remote sensing (SODAR, ceilometer, RASS). Significant dependence of particle size distribution and particle mass concentration on mixing layer height was found. This finding paves the way to new applications of satellite remote sensing products.

  10. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    Science.gov (United States)

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

    Technical development and practical evaluation of a laboratory built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, like fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct, numerical fitting or Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has a considerable potential to facilitate and render more precise the condition surveys of works of art.

  11. ANNOTATION SUPPORTED OCCLUDED OBJECT TRACKING

    Directory of Open Access Journals (Sweden)

    Devinder Kumar

    2012-08-01

    Full Text Available Tracking occluded objects at different depths has become an extremely important component of study for any video sequence, with wide applications in object tracking, scene recognition, coding, video editing and mosaicking. The paper studies the ability of annotation to track the occluded object based on pyramids with variation in depth, further establishing a threshold at which the ability of the system to track the occluded object fails. Image annotation is applied to 3 similar video sequences varying in depth. In the experiment, one bike occludes the other at a depth of 60 cm, 80 cm and 100 cm respectively. Another experiment is performed on tracking humans at similar depths to authenticate the results. The paper also computes the frame-by-frame error incurred by the system, supported by detailed simulations. This system can be effectively used to analyze the error in motion tracking and further to correct the error, leading to flawless tracking. This can be of great interest to computer scientists while designing surveillance systems.

  12. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes, such as RELAP5, TRAC and CATHARE, have been developed in the Fortran language during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance; TRACE, RELAP5-3D and MARS are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has since changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. It was not commonly used in mainstream software application development until the early 1990s. Many modern programming languages now support OOP. Although recent Fortran also supports OOP, it is considered to have limited functionality compared to modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design feasibility.
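
    The OOP techniques listed above (encapsulation, inheritance, polymorphism) can be sketched for a thermal-hydraulic component hierarchy as follows. The example is written in Python purely for brevity, and the classes and physics are invented for illustration; they are not taken from the code described in the paper.

```python
# Illustrative sketch of encapsulation, inheritance and polymorphism applied to
# thermal-hydraulic components. Class names and physics are invented placeholders.
from abc import ABC, abstractmethod

class Component(ABC):
    """Base class: encapsulates state and defines a common interface."""
    def __init__(self, name: str, volume_m3: float):
        self.name = name
        self._volume_m3 = volume_m3        # encapsulated attribute

    @abstractmethod
    def advance(self, dt: float) -> None:
        """Advance the component state by one time step."""

class Pipe(Component):
    def __init__(self, name, volume_m3, flow_kg_s):
        super().__init__(name, volume_m3)
        self.flow_kg_s = flow_kg_s

    def advance(self, dt):
        # placeholder for a momentum/energy update
        print(f"{self.name}: advancing pipe flow {self.flow_kg_s} kg/s by {dt} s")

class Valve(Pipe):
    def __init__(self, name, volume_m3, flow_kg_s, opening=1.0):
        super().__init__(name, volume_m3, flow_kg_s)
        self.opening = opening

    def advance(self, dt):
        effective = self.flow_kg_s * self.opening
        print(f"{self.name}: valve at {self.opening:.0%}, flow {effective} kg/s, dt={dt}")

# Polymorphism: the driver advances every component through the same interface
system = [Pipe("hot-leg", 0.5, 120.0), Valve("relief", 0.05, 10.0, opening=0.3)]
for comp in system:
    comp.advance(0.01)
```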

  13. Object-Oriented Programming in the Development of Containment Analysis Code

    International Nuclear Information System (INIS)

    Han, Tae Young; Hong, Soon Joon; Hwang, Su Hyun; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    After the mid 1980s, a new programming concept, Object-Oriented Programming (OOP), was introduced and designed, with features such as information hiding, encapsulation, modularity and inheritance. These offered a much more convenient programming paradigm to code developers. The OOP concept was readily incorporated into programming languages such as C++ in the 1990s and is widely used in the modern software industry. In this paper, we show that the OOP concept is successfully applicable to the development of a safety analysis code for containment and propose a more explicit and accessible OOP design for developers.

  14. Factors Associated with Variations in Population HIV Prevalence across West Africa: Findings from an Ecological Analysis

    Science.gov (United States)

    Prudden, Holly J.; Beattie, Tara S.; Bobrova, Natalia; Panovska-Griffiths, Jasmina; Mukandavire, Zindoga; Gorgens, Marelize; Wilson, David; Watts, Charlotte H.

    2015-01-01

    Background Population HIV prevalence across West Africa varies substantially. We assess the national epidemiological and behavioural factors associated with this. Methods National, urban and rural data on HIV prevalence, the percentage of younger (15–24) and older (25–49) women and men reporting multiple (2+) partners in the past year, HIV prevalence among female sex workers (FSWs), men who have bought sex in the past year (clients), and ART coverage, were compiled for 13 countries. An ecological analysis using linear regression assessed which factors are associated with national variations in population female and male HIV prevalence, and with each other. Findings National population HIV prevalence varies between 0.4–2.9% for men and 0.4–5.6% for women. ART coverage ranges from 6–23%. National variations in HIV prevalence are not shown to be associated with variations in HIV prevalence among FSWs or clients. Instead they are associated with variations in the percentage of younger and older males and females reporting multiple partners. HIV prevalence is weakly negatively associated with ART coverage, implying it is not increased survival that is the cause of variations in HIV prevalence. FSWs and younger female HIV prevalence are associated with client population sizes, especially older men. Younger female HIV prevalence is strongly associated with older male and female HIV prevalence. Interpretation In West Africa, population HIV prevalence is not significantly higher in countries with high FSW HIV prevalence. Our analysis suggests higher prevalence occurs where more men buy sex, and where a higher percentage of younger women, and older men and women, have multiple partnerships. If a sexual network between clients and young females exists, clients may potentially bridge infection to younger females. HIV prevention should focus both on commercial sex and transmission between clients and younger females with multiple partners. PMID:26698854
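
    The "ecological analysis using linear regression" mentioned above amounts to regressing country-level HIV prevalence on country-level behavioural indicators. A minimal sketch is given below; all numbers are clearly labelled placeholders invented for illustration, not the study's data.

```python
# Sketch of an ecological (country-level) linear regression: population HIV
# prevalence regressed on the share of men reporting multiple partners.
# All values are fabricated placeholders, not the study's data.
import numpy as np
from scipy import stats

# One row per hypothetical country
multiple_partners_pct = np.array([8.0, 12.0, 15.0, 20.0, 25.0, 30.0, 35.0, 40.0])
hiv_prevalence_pct    = np.array([0.5, 0.7, 1.0, 1.4, 1.9, 2.3, 2.8, 3.4])

result = stats.linregress(multiple_partners_pct, hiv_prevalence_pct)
print(f"slope = {result.slope:.3f} prevalence points per point of multiple-partner share")
print(f"r^2   = {result.rvalue ** 2:.3f}, p = {result.pvalue:.4f}")
```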

  15. Factors Associated with Variations in Population HIV Prevalence across West Africa: Findings from an Ecological Analysis.

    Directory of Open Access Journals (Sweden)

    Holly J Prudden

    Full Text Available Population HIV prevalence across West Africa varies substantially. We assess the national epidemiological and behavioural factors associated with this. National, urban and rural data on HIV prevalence, the percentage of younger (15-24) and older (25-49) women and men reporting multiple (2+) partners in the past year, HIV prevalence among female sex workers (FSWs), men who have bought sex in the past year (clients), and ART coverage, were compiled for 13 countries. An ecological analysis using linear regression assessed which factors are associated with national variations in population female and male HIV prevalence, and with each other. National population HIV prevalence varies between 0.4-2.9% for men and 0.4-5.6% for women. ART coverage ranges from 6-23%. National variations in HIV prevalence are not shown to be associated with variations in HIV prevalence among FSWs or clients. Instead they are associated with variations in the percentage of younger and older males and females reporting multiple partners. HIV prevalence is weakly negatively associated with ART coverage, implying it is not increased survival that is the cause of variations in HIV prevalence. FSWs and younger female HIV prevalence are associated with client population sizes, especially older men. Younger female HIV prevalence is strongly associated with older male and female HIV prevalence. In West Africa, population HIV prevalence is not significantly higher in countries with high FSW HIV prevalence. Our analysis suggests higher prevalence occurs where more men buy sex, and where a higher percentage of younger women, and older men and women, have multiple partnerships. If a sexual network between clients and young females exists, clients may potentially bridge infection to younger females. HIV prevention should focus both on commercial sex and transmission between clients and younger females with multiple partners.

  16. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  17. Remarks on Causative Verbs and Object Deletion in English

    OpenAIRE

    Onozuka, Hiromi

    2007-01-01

    Rappaport Hovav and Levin (1998) contend that result verbs disallow object deletion because of their lexical semantic properties. Their point is that the distinction between result verbs and manner verbs, with their different event structure representations, constitutes the important factor which dictates the possibility of the variation of argument realization, of which object deletion represents one instance. Responding to their claim, Goldberg (2001) presents the evidence which mainly concerns the...

  18. Sexual Orientation and Spatial Position Effects on Selective Forms of Object Location Memory

    Science.gov (United States)

    Rahman, Qazi; Newland, Cherie; Smyth, Beatrice Mary

    2011-01-01

    Prior research has demonstrated robust sex and sexual orientation-related differences in object location memory in humans. Here we show that this sexual variation may depend on the spatial position of target objects and the task-specific nature of the spatial array. We tested the recovery of object locations in three object arrays (object…

  19. An analysis of cross-sectional variations in total household energy requirements in India using micro survey data

    International Nuclear Information System (INIS)

    Pachauri, Shonali

    2004-01-01

    Using micro level household survey data from India, we analyse the variation in the pattern and quantum of household energy requirements, both direct and indirect, and the factors causing such variation. An econometric analysis using household survey data from India for the year 1993-1994 reveals that household socio-economic, demographic, geographic, family and dwelling attributes influence the total household energy requirements. There are also large variations in the pattern of energy requirements across households belonging to different expenditure classes. Results from the econometric estimation show that total household expenditure or income level is the most important explanatory variable causing variation in energy requirements across households. In addition, the size of the household dwelling and the age of the head of the household are related to higher household energy requirements. In contrast, the number of members in the household and literacy of the head are associated with lower household energy requirements

  20. An analysis of cross-sectional variations in total household energy requirements in India using micro survey data

    Energy Technology Data Exchange (ETDEWEB)

    Pachauri, Shonali E-mail: shonali.pachauri@cepe.mavt.ethz.ch

    2004-10-01

    Using micro level household survey data from India, we analyse the variation in the pattern and quantum of household energy requirements, both direct and indirect, and the factors causing such variation. An econometric analysis using household survey data from India for the year 1993-1994 reveals that household socio-economic, demographic, geographic, family and dwelling attributes influence the total household energy requirements. There are also large variations in the pattern of energy requirements across households belonging to different expenditure classes. Results from the econometric estimation show that total household expenditure or income level is the most important explanatory variable causing variation in energy requirements across households. In addition, the size of the household dwelling and the age of the head of the household are related to higher household energy requirements. In contrast, the number of members in the household and literacy of the head are associated with lower household energy requirements.

  1. Analysis and functional characterization of sequence variations in ligand binding domain of thyroid hormone receptors in autism spectrum disorder (ASD) patients.

    Science.gov (United States)

    Kalikiri, Mahesh Kumar; Mamidala, Madhu Poornima; Rao, Ananth N; Rajesh, Vidya

    2017-12-01

    Autism spectrum disorder (ASD) is a neurodevelopmental disorder, reported to be on the rise in the past two decades. Thyroid hormone T3 plays an important role in early embryonic and central nervous system development. T3 mediates its function by binding to the thyroid hormone receptors, TRα and TRβ. Alterations in T3 levels and thyroid receptor mutations have earlier been implicated in neuropsychiatric disorders and have been linked to environmental toxins. Limited reports from earlier studies have shown the effectiveness of T3 treatment, with promising results in children with ASD, and that the thyroid hormone levels in these children were normal. This makes it necessary to explore the genetic variations in the components of the thyroid hormone pathway in ASD children. To achieve this objective, we performed genetic analysis of the ligand binding domain of the THRA and THRB receptor genes in 30 ASD subjects and in age-matched controls from India. Our study for the first time reports novel single nucleotide polymorphisms in the THRA and THRB receptor genes of ASD individuals. Autism Res 2017, 10: 1919-1928. ©2017 International Society for Autism Research, Wiley Periodicals, Inc. Thyroid hormone (T3) and the thyroid receptors (TRα and TRβ) are the major components of the thyroid hormone pathway. The link between the thyroid pathway and neuronal development is proven in clinical medicine. Since the thyroid hormone levels in autistic children are normal, variations in their receptors need to be explored. To achieve this objective, changes in the THRA and THRB receptor genes were studied in 30 ASD and normal children from India. The impact of some of these mutations on receptor function was also studied. © 2017 International Society for Autism Research, Wiley Periodicals, Inc.

  2. Objective function analysis for electric soundings (VES), transient electromagnetic soundings (TEM) and joint inversion VES/TEM

    Science.gov (United States)

    Bortolozo, Cassiano Antonio; Bokhonok, Oleg; Porsani, Jorge Luís; Monteiro dos Santos, Fernando Acácio; Diogo, Liliana Alcazar; Slob, Evert

    2017-11-01

    Ambiguities in geophysical inversion results are always present. How these ambiguities appear is, in most cases, open to interpretation. It is interesting to investigate ambiguities with regard to the parameters of the models under study. The Residual Function Dispersion Map (RFDM) can be used to differentiate between global ambiguities and local minima in the objective function. We apply RFDM to Vertical Electrical Sounding (VES) and TEM sounding inversion results. Through topographic analysis of the objective function, we evaluate the advantages and limitations of electrical sounding data compared with TEM sounding data, and the benefits of joint inversion in comparison with the individual methods. The RFDM analysis proved to be a very useful tool for understanding the joint VES/TEM inversion method. The applicability of RFDM analysis to real data is also explored, to demonstrate both how the objective function behaves for real data and how the approach performs in real cases. With the analysis of the results, it is possible to understand how the joint inversion can reduce the ambiguity of the methods.
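
    The "topographic analysis of the objective function" can be illustrated by evaluating a misfit over a grid of two model parameters and inspecting the resulting map for flat valleys and multiple minima. The two-parameter forward model below is a toy stand-in, not the VES/TEM forward problem used in the paper.

```python
# Sketch: map an objective (misfit) function over a 2-D grid of model parameters
# to visualise ambiguity (flat valleys, local minima). The 'forward model' is a
# toy function standing in for the VES/TEM forward problems.
import numpy as np

def forward(rho1, rho2, abscissa):
    # Toy smooth response mixing the two parameters (not a real VES/TEM kernel)
    return rho1 * np.exp(-abscissa) + rho2 * (1 - np.exp(-abscissa))

abscissa = np.linspace(0.1, 3.0, 20)
observed = forward(50.0, 200.0, abscissa)          # synthetic "data" from a known model

rho1_grid = np.linspace(10, 150, 200)
rho2_grid = np.linspace(50, 400, 200)

# Residual (objective) value at every grid node
misfit = np.array([[np.sum((observed - forward(r1, r2, abscissa)) ** 2)
                    for r1 in rho1_grid] for r2 in rho2_grid])

i, j = np.unravel_index(np.argmin(misfit), misfit.shape)
print(f"minimum misfit at rho1 ~ {rho1_grid[j]:.0f}, rho2 ~ {rho2_grid[i]:.0f} ohm*m")
```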

  3. Statistical motion vector analysis for object tracking in compressed video streams

    Science.gov (United States)

    Leny, Marc; Prêteux, Françoise; Nicholson, Didier

    2008-02-01

    Compressed video is the digital raw material provided by video-surveillance systems and used for archiving and indexing purposes. Multimedia standards therefore have a direct impact on such systems. While MPEG-2 used to be the coding standard, MPEG-4 (part 2) has now replaced it in most installations, and MPEG-4 AVC/H.264 solutions are now being released. Finely analysing the complex and rich MPEG-4 streams is a challenging issue addressed in this paper. The system we designed is based on five modules: low-resolution decoder, motion estimation generator, object motion filtering, low-resolution object segmentation, and cooperative decision. Our contributions are the statistical analysis of the spatial distribution of the motion vectors, the computation of DCT-based confidence maps, automatic motion activity detection in the compressed file, and a rough indexation by dedicated descriptors. The robustness and accuracy of the system are evaluated on a large corpus (hundreds of hours of indoor and outdoor videos with pedestrians and vehicles). The objective benchmarking of the performance is carried out with respect to five metrics, allowing the error contribution of each module to be estimated for different implementations. This evaluation establishes that our system analyses up to 200 frames (720x288) per second (2.66 GHz CPU).
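
    One simple form of such a statistical analysis of a motion-vector field is to flag macroblocks whose vector deviates little from a strong local median, separating coherently moving objects from noisy background vectors. The sketch below works on a synthetic motion-vector field and only illustrates the idea; it is not the paper's five-module pipeline, and the thresholds are assumptions.

```python
# Sketch: flag macroblocks whose motion vector agrees with a strong local median,
# a simple statistical filter separating coherent motion from noise.
# The motion-vector field is synthetic; thresholds are illustrative.
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(7)
h, w = 18, 45                                      # macroblock grid (e.g. 720x288 / 16)
mv = rng.normal(0, 0.5, size=(h, w, 2))            # background: near-zero noisy vectors
mv[5:10, 10:20] += np.array([4.0, 1.0])            # a coherently moving object

# Local median of each vector component over a 3x3 macroblock neighbourhood
local_median = np.stack(
    [median_filter(mv[..., c], size=3) for c in range(2)], axis=-1)

deviation = np.linalg.norm(mv - local_median, axis=-1)
magnitude = np.linalg.norm(local_median, axis=-1)

moving = (magnitude > 2.0) & (deviation < 1.5)     # coherent, significant motion
print(f"macroblocks flagged as moving object: {moving.sum()}")
```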

  4. Robust video object cosegmentation.

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing; Li, Xuelong; Porikli, Fatih

    2015-10-01

    With ever-increasing volumes of video data, automatic extraction of salient object regions has become even more significant for visual analytic solutions. This surge has also opened up opportunities for taking advantage of collective cues encapsulated in multiple videos in a cooperative manner. However, it also brings up major challenges, such as handling drastic appearance, motion pattern, and pose variations of foreground objects, as well as indiscriminate backgrounds. Here, we present a cosegmentation framework to discover and segment out common object regions across multiple frames and multiple videos in a joint fashion. We incorporate three types of cues, i.e., intraframe saliency, interframe consistency, and across-video similarity, into an energy optimization framework that does not make restrictive assumptions on foreground appearance and motion model, and does not require objects to be visible in all frames. We also introduce a spatio-temporal scale-invariant feature transform (SIFT) flow descriptor to integrate across-video correspondence from the conventional SIFT-flow into interframe motion flow from optical flow. This novel spatio-temporal SIFT flow generates reliable estimations of common foregrounds over the entire video data set. Experimental results show that our method outperforms the state-of-the-art on a new extensive data set (ViCoSeg).

  5. Characterization of analysis activity in the development of object-oriented software. Application to a examination system in nuclear medicine

    International Nuclear Information System (INIS)

    Bayas, Marcos Raul Cordova.

    1995-01-01

    The object-oriented approach, formerly proposed as an alternative to conventional software coding techniques, has expanded its scope to other phases in software development, including the analysis phase. This work discusses basic concepts and major object oriented analysis methods, drawing comparisons with structured analysis, which has been the dominant paradigm in systems analysis. The comparison is based on three interdependent system aspects, that must be specified during the analysis phase: data, control and functionality. The specification of a radioisotope examination archive system is presented as a case study. (author). 45 refs., 87 figs., 1 tab

  6. Variational method for integrating radial gradient field

    Science.gov (United States)

    Legarda-Saenz, Ricardo; Brito-Loeza, Carlos; Rivera, Mariano; Espinosa-Romero, Arturo

    2014-12-01

    We propose a variational method for integrating information obtained from circular fringe pattern. The proposed method is a suitable choice for objects with radial symmetry. First, we analyze the information contained in the fringe pattern captured by the experimental setup and then move to formulate the problem of recovering the wavefront using techniques from calculus of variations. The performance of the method is demonstrated by numerical experiments with both synthetic and real data.
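
    The generic least-squares functional behind such gradient-field integration is sketched below; this is the standard form of the problem and may differ in detail (e.g., regularization, radial weighting) from the functional proposed by the authors.

```latex
% Recover the wavefront w from a measured gradient field g over the domain \Omega:
\min_{w}\; J(w) \;=\;
  \int_{\Omega} \bigl\lVert \nabla w(\mathbf{x}) - \mathbf{g}(\mathbf{x}) \bigr\rVert^{2}\, d\mathbf{x}
  \;+\; \lambda \int_{\Omega} \bigl\lVert \nabla w(\mathbf{x}) \bigr\rVert^{2}\, d\mathbf{x}
% For \lambda = 0 the Euler-Lagrange equation is the Poisson equation
% \Delta w = \nabla \cdot \mathbf{g}, typically with Neumann boundary conditions.
```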

  7. Amazing variational approach to chemical reactions

    OpenAIRE

    Fernández, Francisco M.

    2009-01-01

    In this letter we analyse an amazing variational approach to chemical reactions. Our results clearly show that the variational expressions are unsuitable for the analysis of empirical data obtained from chemical reactions.

  8. Pan-Genome Analysis Links the Hereditary Variation of Leptospirillum ferriphilum With Its Evolutionary Adaptation

    Directory of Open Access Journals (Sweden)

    Xian Zhang

    2018-03-01

    Full Text Available Niche adaptation has long been recognized to drive intra-species differentiation and speciation, yet knowledge about its relation to the hereditary variation of microbial genomes is relatively limited. Using the Leptospirillum ferriphilum species as a case study, we present a detailed analysis of the genomic features of five recognized strains. Genome-to-genome distance calculation preliminarily determined the roles of spatial distance and environmental heterogeneity that potentially contribute to intra-species variation within the L. ferriphilum species at the genome level. Mathematical models were further constructed to extrapolate the expansion of L. ferriphilum genomes (an ‘open’ pan-genome), indicating the emergence of novel genes with newly sequenced genomes. The identification of diverse mobile genetic elements (MGEs), such as transposases, integrases, and phage-associated genes, revealed the prevalence of horizontal gene transfer events, an important evolutionary mechanism that provides avenues for the recruitment of novel functionalities and further for the genetic divergence of microbial genomes. Comprehensive analysis also demonstrated that genome reduction by gene loss in a broad sense might contribute to the observed diversification. We thus inferred a plausible explanation to address this observation: community-dependent adaptation that potentially economizes the limiting resources of the entire community. Given that the introduction of new genes is accompanied by a parallel abandonment of others, our results provide snapshots of the biological fitness cost of environmental adaptation within the L. ferriphilum genomes. In short, our genome-wide analyses link the genetic variation of L. ferriphilum with its evolutionary adaptation.

  9. Parametric analysis of energy quality management for district in China using multi-objective optimization approach

    International Nuclear Information System (INIS)

    Lu, Hai; Yu, Zitao; Alanne, Kari; Xu, Xu; Fan, Liwu; Yu, Han; Zhang, Liang; Martinac, Ivo

    2014-01-01

    Highlights: • A time-effective multi-objective design optimization scheme is proposed. • The scheme aims at exploring suitable 3E energy systems for the specific case. • A realistic case located in China is used for the analysis. • A parametric study is carried out to test the effects of different parameters. - Abstract: Due to increasing energy demands and global warming, energy quality management (EQM) for districts has been gaining importance over the last few decades. The evaluation of the optimum energy systems for specific districts is an essential part of EQM. This paper presents an in-depth analysis of the optimum energy systems for a district sited in China. A multi-objective optimization approach based on a Genetic Algorithm (GA) is proposed for the analysis. The optimization process aims to search for suitable 3E (minimum economic cost and environmental burden as well as maximum efficiency) energy systems. Here, life cycle CO2 equivalent (LCCO2), life cycle cost (LCC) and exergy efficiency (EE) are set as the optimization objectives. Then, the optimum energy systems for the Chinese case are presented. Finally, the effects of different energy parameters are investigated. The results show that the optimum energy systems may vary significantly depending on some parameters.
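
    As a small illustration of the multi-objective selection step (not the paper's GA, and with invented candidate values), the sketch below keeps only the Pareto-optimal designs for the three 3E objectives: minimize LCC and LCCO2, maximize exergy efficiency.

```python
# Pareto (non-dominated) screening of evaluated candidate energy systems.
import numpy as np

def pareto_mask(objs):
    """objs: (n, m) array where every column is to be minimised."""
    n = objs.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # point i is dropped if any other point is at least as good everywhere
        # and strictly better somewhere
        dominates_i = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        if dominates_i.any():
            keep[i] = False
    return keep

# Hypothetical candidates: [LCC (M$), LCCO2 (kt), EE (-)]
cands = np.array([
    [12.0, 85.0, 0.38],
    [10.5, 92.0, 0.35],
    [11.2, 80.0, 0.41],
    [13.5, 95.0, 0.33],   # dominated by the first candidate
])
objs = cands * np.array([1.0, 1.0, -1.0])   # negate EE so every column is minimised
print(cands[pareto_mask(objs)])
```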

  10. Statistical analysis of activation and reaction energies with quasi-variational coupled-cluster theory

    Science.gov (United States)

    Black, Joshua A.; Knowles, Peter J.

    2018-06-01

    The performance of quasi-variational coupled-cluster (QV) theory applied to the calculation of activation and reaction energies has been investigated. A statistical analysis of results obtained for six different sets of reactions has been carried out, and the results have been compared to those from standard single-reference methods. In general, the QV methods lead to increased activation energies and larger absolute reaction energies compared to those obtained with traditional coupled-cluster theory.

  11. Quasi-static Cycle Performance Analysis of Micro Modular Reactor for Heat Sink Temperature Variation

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Seong Kuk; Lee, Jekyoung; Ahn, Yoonhan; Lee, Jeong Ik [KAIST, Daejeon (Korea, Republic of); Cha, Jae Eun [KAERI, Daejeon (Korea, Republic of)

    2015-10-15

    A supercritical CO2 (S-CO2) cycle has potential for high thermal efficiency at moderate turbine inlet temperatures (450-750 °C) and for a compact system size because of the small specific volume of the working fluid and simple cycle layouts. Owing to the small specific volume of S-CO2 and the development of heat exchanger technology, complete modularization of the system can be achieved. Previous work focused on cycle performance analysis for the design point only. However, the heat sink temperature can change depending on ambient conditions, i.e. weather and seasonal change. This can influence the compressor inlet temperature, which alters the overall cycle operating condition. To reflect the heat sink temperature variation, a quasi-static analysis code for a simple recuperated S-CO2 Brayton cycle has been developed by the KAIST research team. Thus, a cycle performance analysis with compressor inlet temperature variation is carried out in this research. In the case of a dry air-cooling system, the ambient temperature of the local surroundings can affect the compressor inlet temperature. As the compressor inlet temperature increases, thermal efficiency and generated electricity decrease. As further work, an S-CO2 integral test loop experiment will be performed to validate the in-house codes, such as KAIST_TMD and the quasi-static code.
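
    A toy sensitivity sketch of the same trend is given below; it is not the KAIST in-house quasi-static code. It uses CoolProp for real-gas CO2 properties, and the pressures, turbine inlet temperature, component efficiencies and recuperator effectiveness are all assumed values chosen only to show how efficiency responds to the compressor inlet temperature.

```python
# Effect of compressor inlet temperature on a simple recuperated S-CO2 Brayton cycle.
from CoolProp.CoolProp import PropsSI

P_LOW, P_HIGH = 8.0e6, 20.0e6      # Pa (assumed)
T_TURB_IN = 550.0 + 273.15         # K (assumed)
ETA_C, ETA_T, EFF_RECUP = 0.85, 0.90, 0.90   # assumed component performances

def cycle_efficiency(T_comp_in):
    # 1 -> 2: compression with isentropic efficiency
    h1 = PropsSI("H", "T", T_comp_in, "P", P_LOW, "CO2")
    s1 = PropsSI("S", "T", T_comp_in, "P", P_LOW, "CO2")
    h2s = PropsSI("H", "P", P_HIGH, "S", s1, "CO2")
    h2 = h1 + (h2s - h1) / ETA_C
    T2 = PropsSI("T", "P", P_HIGH, "H", h2, "CO2")
    # 4 -> 5: expansion with isentropic efficiency
    h4 = PropsSI("H", "T", T_TURB_IN, "P", P_HIGH, "CO2")
    s4 = PropsSI("S", "T", T_TURB_IN, "P", P_HIGH, "CO2")
    h5s = PropsSI("H", "P", P_LOW, "S", s4, "CO2")
    h5 = h4 - ETA_T * (h4 - h5s)
    # Recuperator: hot stream can at best be cooled to the cold-inlet temperature T2
    h5_min = PropsSI("H", "T", T2, "P", P_LOW, "CO2")
    q_recup = EFF_RECUP * (h5 - h5_min)
    h3 = h2 + q_recup
    q_in = h4 - h3
    w_net = (h4 - h5) - (h2 - h1)
    return w_net / q_in

for t_c in (32.0, 36.0, 40.0, 45.0):   # deg C, mimicking a seasonal heat-sink change
    print(f"compressor inlet {t_c:4.1f} C -> thermal efficiency {cycle_efficiency(t_c + 273.15):.3f}")
```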

  12. Thirty-day Readmission Rates and Associated Factors: A Multilevel Analysis of Practice Variations in French Public Psychiatry.

    Science.gov (United States)

    Gandré, Coralie; Gervaix, Jeanne; Thillard, Julien; Macé, Jean-Marc; Roelandt, Jean-Luc; Chevreul, Karine

    2018-03-01

    Inpatient psychiatric readmissions are often used as an indicator of the quality of care and their reduction is in line with international recommendations for mental health care. Research on variations in inpatient readmission rates among mental health care providers is therefore of key importance as these variations can impact equity, quality and efficiency of care when they do not result from differences in patients' needs. Our objectives were first to describe variations in inpatient readmission rates between public mental health care providers in France on a nationwide scale, and second, to identify their association with patient, health care providers and environment characteristics. We carried out a study for the year 2012 using data from ten administrative national databases. 30-day readmissions in inpatient care were identified in the French national psychiatric discharge database. Variations were described numerically and graphically between French psychiatric sectors and factors associated with these variations were identified by carrying out a multi-level logistic regression accounting for the hierarchical structure of the data. Significant practice variations in 30-day inpatient readmission rates were observed with a coefficient of variation above 50%. While a majority of those variations was related to differences within sectors, individual patient characteristics explained a lower part of the variations resulting from differences between sectors than the characteristics of sectors and of their environment. In particular, an increase in the mortality rate and in the acute admission rate for somatic disorders in sectors' catchment area was associated with a decrease in the probability of 30-day readmission. Similarly, an increase in the number of psychiatric inpatient beds in private for-profit hospitals per 1,000 inhabitants in sectors' catchment area was associated with a decrease in this probability, which also varied with overall sectors' case

  13. The Influence of Weather Variation, Urban Design and Built Environment on Objectively Measured Sedentary Behaviour in Children.

    Science.gov (United States)

    Katapally, Tarun Reddy; Rainham, Daniel; Muhajarine, Nazeem

    2016-01-01

    With emerging evidence indicating that independent of physical activity, sedentary behaviour (SB) can be detrimental to health, researchers are increasingly aiming to understand the influence of multiple contexts such as urban design and built environment on SB. However, weather variation, a factor that continuously interacts with all other environmental variables, has been consistently underexplored. This study investigated the influence of diverse environmental exposures (including weather variation, urban design and built environment) on SB in children. This cross-sectional observational study is part of an active living research initiative set in the Canadian prairie city of Saskatoon. Saskatoon's neighbourhoods were classified based on urban street design into grid-pattern, fractured grid-pattern and curvilinear types of neighbourhoods. Diverse environmental exposures were measured including, neighbourhood built environment, and neighbourhood and household socioeconomic environment. Actical accelerometers were deployed between April and June 2010 (spring-summer) to derive SB of 331 10-14 year old children in 25 one week cycles. Each cycle of accelerometry was conducted on a different cohort of children within the total sample. Accelerometer data were matched with localized weather patterns derived from Environment Canada weather data. Multilevel modeling using Hierarchical Linear and Non-linear Modeling software was conducted by factoring in weather variation to depict the influence of diverse environmental exposures on SB. Both weather variation and urban design played a significant role in SB. After factoring in weather variation, it was observed that children living in grid-pattern neighbourhoods closer to the city centre (with higher diversity of destinations) were less likely to be sedentary. This study demonstrates a methodology that could be replicated to integrate geography-specific weather patterns with existing cross-sectional accelerometry data to

  14. OBJECTIVE EVALUATION OF HYPERACTIVATED MOTILITY IN RAT SPERMATOZOA USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Science.gov (United States)

    Objective evaluation of hyperactivated motility in rat spermatozoa using computer-assisted sperm analysis. Cancel AM, Lobdell D, Mendola P, Perreault SD. Toxicology Program, University of North Carolina, Chapel Hill, NC 27599, USA. The aim of this study was t...

  15. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking, and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.
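
    A hedged sketch of the over-segmentation step only (not the full UAV pipeline) is given below: SLIC superpixels followed by per-object spectral statistics, the kind of input a BPT/OBIA stage would consume. A scikit-image sample picture stands in for a mosaicked UAV panorama, and the segment count and compactness are arbitrary.

```python
# SLIC over-segmentation plus per-superpixel mean colour as a simple spectral feature.
import numpy as np
from skimage import data, segmentation, measure

image = data.astronaut()                         # placeholder for the UAV mosaic
labels = segmentation.slic(image, n_segments=400, compactness=10, start_label=1)

props = measure.regionprops(labels, intensity_image=image)
features = np.array([p.mean_intensity for p in props])   # shape: (n_superpixels, 3)
print(labels.max(), "superpixels, feature matrix shape:", features.shape)
```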

  16. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

    Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the superpixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of the proposed method.

  17. Biological variation of total prostate-specific antigen

    DEFF Research Database (Denmark)

    Söletormos, Georg; Semjonow, Axel; Sibley, Paul E C

    2005-01-01

    BACKGROUND: The objectives of this study were to determine whether a single result for total prostate-specific antigen (tPSA) can be used confidently to guide the need for prostate biopsy and by how much serial tPSA measurements must differ to be significant. tPSA measurements include both analytical and biological components of variation. The European Group on Tumor Markers conducted a literature survey to determine both the magnitude and impact of biological variation on single, the mean of replicate, and serial tPSA measurements. METHODS: The survey yielded 27 studies addressing the topic, and estimates for the biological variation of tPSA could be derived from 12 of these studies. RESULTS: The mean biological variation was 20% in the concentration range 0.1-20 microg/L for men over 50 years. The biological variation means that the one-sided 95% confidence interval (CI) of the dispersion…

  18. EGYPTIAN MUTUAL FUNDS ANALYSIS: HISTORY, PERFORMANCE, OBJECTIVES, RISK AND RETURN

    Directory of Open Access Journals (Sweden)

    Petru STEFEA

    2013-10-01

    Full Text Available The present research aims to give an overview of mutual funds in Egypt. The first mutual funds were established in 1994. Nowadays, the total number of mutual funds has reached approximately 90. Income funds represent the largest share of the Egyptian mutual funds (40%), followed by growth funds (25%), while private equity funds account for at least 1%. The total population of Egyptian mutual funds studied reached 22. Finally, the study showed that Egyptian mutual funds have an impact on fund return, total risk and systematic risk when analysing the relationship between risk and return, and found an influence of the mutual funds' objectives on the Sharpe and Treynor ratios.

  19. [Analysis of genomic copy number variations in two sisters with primary amenorrhea and hyperandrogenism].

    Science.gov (United States)

    Zhang, Yanliang; Xu, Qiuyue; Cai, Xuemei; Li, Yixun; Song, Guibo; Wang, Juan; Zhang, Rongchen; Dai, Yong; Duan, Yong

    2015-12-01

    To analyze genomic copy number variations (CNVs) in two sisters with primary amenorrhea and hyperandrogenism, G-banding was performed for karyotype analysis. The whole genomes of the two sisters were scanned and analyzed by array-based comparative genomic hybridization (array-CGH), and the results were confirmed with real-time quantitative PCR (RT-qPCR). No abnormality was found by conventional G-banded chromosome analysis. Array-CGH identified 11 identical CNVs in the sisters which, however, overlapped with CNVs reported in the Database of Genomic Variants (http://projects.tcag.ca/variation/) and are therefore likely to be benign. In addition, an approximately 8.44 Mb 9p11.1-p13.1 duplication (38,561,587-47,002,387 bp, hg18) and an approximately 80.9 kb 4q13.2 deletion (70,183,990-70,264,889 bp, hg18) were detected in the elder and younger sister, respectively. The relationship between these CNVs and primary amenorrhea and hyperandrogenism remains uncertain. RT-qPCR results were in accordance with array-CGH. Two CNVs were thus detected in the two sisters by array-CGH, and further studies are needed to clarify their correlation with primary amenorrhea and hyperandrogenism.

  20. Objectivity And Moral Relativism

    OpenAIRE

    Magni, Sergio Filippo

    2017-01-01

    The relativity of morals has usually been taken as an argument against the objectivity of ethics. However, a more careful analysis can show that there are forms of moral objectivism which have relativistic implications, and that moral relativism can be compatible with the objectivity of ethics. Such an objectivity is not always in contrast to moral relativism and it is possible to be relativists without having to give up the claim of objectivity in ethics

  1. Measurement and Socio-Demographic Variation of Social Capital in a Large Population-Based Survey

    Science.gov (United States)

    Nieminen, Tarja; Martelin, Tuija; Koskinen, Seppo; Simpura, Jussi; Alanen, Erkki; Harkanen, Tommi; Aromaa, Arpo

    2008-01-01

    Objectives: The main objective of this study was to describe the variation of individual social capital according to socio-demographic factors, and to develop a suitable way to measure social capital for this purpose. The similarity of socio-demographic variation between the genders was also assessed. Data and methods: The study applied…

  2. Object of desire self-consciousness theory.

    Science.gov (United States)

    Bogaert, Anthony F; Brotto, Lori A

    2014-01-01

    In this article, the authors discuss the construct of object of desire self-consciousness, the perception that one is romantically and sexually desirable in another's eyes. The authors discuss the nature of the construct, variations in its expression, and how it may function as part of a self-schema or script related to romance and sexuality. The authors suggest that object of desire self-consciousness may be an adaptive, evolved psychological mechanism allowing sexual and romantic tactics suitable to one's mate value. The authors also suggest that it can act as a signal that one has high mate value in the sexual marketplace. The authors then review literature (e.g., on fantasies, on sexual activity preferences, on sexual dysfunctions, on language) suggesting that object of desire self-consciousness plays a particularly important role in heterosexual women's sexual/romantic functioning and desires.

  3. Designing and optimising anaerobic digestion systems: A multi-objective non-linear goal programming approach

    International Nuclear Information System (INIS)

    Nixon, J.D.

    2016-01-01

    This paper presents a method for optimising the design parameters of an anaerobic digestion (AD) system by using first-order kinetics and multi-objective non-linear goal programming. A model is outlined that determines the ideal operating tank temperature and hydraulic retention time, based on objectives for minimising levelised cost of electricity, and maximising energy potential and feedstock mass reduction. The model is demonstrated for a continuously stirred tank reactor processing food waste in two case study locations. These locations are used to investigate the influence of different environmental and economic climates on optimal conditions. A sensitivity analysis is performed to further examine the variation in optimal results for different financial assumptions and objective weightings. The results identify the conditions for the preferred tank temperature to be in the psychrophilic, mesophilic or thermophilic range. For a tank temperature of 35 °C, ideal hydraulic retention times, in terms of achieving a minimum levelised electricity cost, were found to range from 29.9 to 33 days. Whilst there is a need for more detailed information on rate constants for use in first-order models, multi-objective optimisation modelling is considered to be a promising option for AD design. - Highlights: • Nonlinear goal programming is used to optimise anaerobic digestion systems. • Multiple objectives are set including minimising the levelised cost of electricity. • A model is developed and applied to case studies for the UK and India. • Optimal decisions are made for tank temperature and retention time. • A sensitivity analysis is carried out to investigate different model objectives.
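
    The following is an illustrative sketch only, not the paper's model or data: it combines a first-order kinetic yield curve with a weighted goal score over the hydraulic retention time (HRT). Every coefficient is an invented placeholder, and the simple weighted sum stands in for the goal-programming formulation described above.

```python
# First-order kinetics plus a weighted multi-objective score over HRT.
import numpy as np
from scipy.optimize import minimize_scalar

B0 = 0.45   # ultimate methane yield, m3 CH4 / kg VS (assumed)
K = 0.12    # first-order rate constant at the chosen tank temperature, 1/day (assumed)

def methane_yield(hrt_days):
    """First-order kinetics: fraction of B0 realised at a given HRT."""
    return B0 * (1.0 - np.exp(-K * hrt_days))

def levelised_cost(hrt_days):
    """Toy cost curve: a bigger tank (longer HRT) costs more, more gas earns more."""
    capex_term = 0.004 * hrt_days              # grows with tank volume
    revenue_term = 0.10 * methane_yield(hrt_days)
    return capex_term - revenue_term

def weighted_goal(hrt_days, w_cost=0.6, w_energy=0.4):
    # Both goals expressed as quantities to minimise: cost down, negative yield down
    return w_cost * levelised_cost(hrt_days) - w_energy * methane_yield(hrt_days)

res = minimize_scalar(weighted_goal, bounds=(5.0, 60.0), method="bounded")
print(f"preferred HRT ~ {res.x:.1f} days, yield {methane_yield(res.x):.2f} m3/kg VS")
```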

  4. A formalism for the calculus of variations with spinors

    Energy Technology Data Exchange (ETDEWEB)

    Bäckdahl, Thomas, E-mail: thobac@chalmers.se [The School of Mathematics, University of Edinburgh, JCMB 6228, Peter Guthrie Tait Road, Edinburgh EH9 3FD, United Kingdom and Mathematical Sciences - Chalmers University of Technology and University of Gothenburg - SE-412 96 Gothenburg (Sweden); Valiente Kroon, Juan A., E-mail: j.a.valiente-kroon@qmul.ac.uk [School of Mathematical Sciences, Queen Mary, University of London, Mile End Road, London E1 4NS (United Kingdom)

    2016-02-15

    We develop a frame and dyad gauge-independent formalism for the calculus of variations of functionals involving spinorial objects. As a part of this formalism, we define a modified variation operator which absorbs frame and spin dyad gauge terms. This formalism is applicable to both the standard spacetime (i.e., SL(2, ℂ)) 2-spinors as well as to space (i.e., SU(2, ℂ)) 2-spinors. We compute expressions for the variations of the connection and the curvature spinors.

  5. A formalism for the calculus of variations with spinors

    International Nuclear Information System (INIS)

    Bäckdahl, Thomas; Valiente Kroon, Juan A.

    2016-01-01

    We develop a frame and dyad gauge-independent formalism for the calculus of variations of functionals involving spinorial objects. As a part of this formalism, we define a modified variation operator which absorbs frame and spin dyad gauge terms. This formalism is applicable to both the standard spacetime (i.e., SL(2, ℂ)) 2-spinors as well as to space (i.e., SU(2, ℂ)) 2-spinors. We compute expressions for the variations of the connection and the curvature spinors

  6. Challenges from variation across regions in cost effectiveness analysis in multi-regional clinical trials

    Directory of Open Access Journals (Sweden)

    Yunbo Chu

    2016-10-01

    Full Text Available Economic evaluation in the form of cost-effectiveness analysis has become a popular means to inform decisions in healthcare. With multi-regional clinical trials in a global development program becoming a new venue for drug efficacy testing in recent decades, questions about methods for cost-effectiveness analysis in the multi-regional clinical trial setting have also emerged. This paper addresses some challenges arising from variation across regions in cost-effectiveness analysis in multi-regional clinical trials. Several discussion points are raised for further attention, and a multi-regional clinical trial example is presented to illustrate the implications for industrial application. A general message is delivered calling for an in-depth discussion by all stakeholders to reach agreement on good practice in cost-effectiveness analysis in multi-regional clinical trials. Meanwhile, we recommend an additional consideration of cost-effectiveness analysis results based on the clinical evidence from a certain homogeneous population as a sensitivity or scenario analysis, upon data availability.
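
    A minimal illustration of the regional-variation issue raised above: with the same pooled treatment effect but region-specific incremental costs, the incremental cost-effectiveness ratio (ICER = ΔC/ΔE) can differ markedly by region. All numbers below are invented.

```python
# ICER computed with a pooled effect and region-specific incremental costs.
def icer(delta_cost, delta_effect):
    return delta_cost / delta_effect

delta_qaly = 0.30                                                 # pooled incremental effect (QALYs)
regional_delta_cost = {"region A": 9000.0, "region B": 21000.0}   # incremental cost per patient, USD

for region, d_cost in regional_delta_cost.items():
    print(f"{region}: ICER = {icer(d_cost, delta_qaly):,.0f} USD per QALY")
```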

  7. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-03-01

    Fusion is an essentially inexhaustible source of energy that has the potential for economically attractive commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion-energy development program is the generation of central-station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high-energy neutrons suggests potentially unique applications. These include breeding of fissile fuels, production of hydrogen and other chemical products, transmutation or “burning” of various nuclear or chemical wastes, radiation processing of materials, production of radioisotopes, food preservation, medical diagnosis and medical treatment, and space power and space propulsion. In addition, fusion R&D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other hand, are the two primary criteria for setting long-range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R&D program toward practical applications. The transfer of fusion technology and skills from the national laboratories and universities to industry is the key to achieving the long-range objective of commercial fusion applications.

  8. Extrapolating cosmic ray variations and impacts on life: Morlet wavelet analysis

    Science.gov (United States)

    Zarrouk, N.; Bennaceur, R.

    2009-07-01

    Exposure to cosmic rays may have both direct and indirect effects on Earth's organisms. The radiation may lead to higher rates of genetic mutations in organisms, or interfere with their ability to repair DNA damage, potentially leading to diseases such as cancer. Increased cloud cover, which may cool the planet by blocking out more of the Sun's rays, is also associated with cosmic rays. They also interact with molecules in the atmosphere to create nitrogen oxide, a gas that eats away at our planet's ozone layer, which protects us from the Sun's harmful ultraviolet rays. On the ground, humans are protected from cosmic particles by the planet's atmosphere. In this paper we present wavelet-analysis estimates based on solar modulation and cosmic ray data, incorporated in a time-dependent description of cosmic ray variation. Since solar activity can be described as a non-linear chaotic dynamic system, methods such as neural networks and wavelet methods should be very suitable analytical tools; we have therefore computed our results using Morlet wavelets, a technique widely used for studying solar activity. Here we have analysed and reconstructed cosmic ray variation, and we have better depicted periods or harmonics other than the 11-year solar modulation cycles.
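
    The sketch below shows the flavour of a Morlet continuous wavelet transform using PyWavelets; the synthetic series merely mimics an ~11-year modulation plus noise and is not the authors' data, and the scale range and sampling are arbitrary choices.

```python
# Morlet CWT of a synthetic cosmic-ray style time series.
import numpy as np
import pywt

dt = 1.0 / 12.0                          # monthly sampling, in years
t = np.arange(0, 60, dt)                 # 60 years of monthly values
series = np.cos(2 * np.pi * t / 11.0) + 0.3 * np.random.randn(t.size)

scales = np.arange(1, 256)
coef, freqs = pywt.cwt(series, scales, "morl", sampling_period=dt)
periods = 1.0 / freqs                    # years

power = np.abs(coef) ** 2
dominant = periods[power.mean(axis=1).argmax()]
print(f"dominant period ~ {dominant:.1f} years")   # expected near the 11-year cycle
```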

  9. Nictemeral Variation of Physical Chemical and Biological Parameters of Ribeirão das Cruzes, Araraquara-SP

    Directory of Open Access Journals (Sweden)

    Vitor Rocha Santos

    2015-12-01

    Full Text Available Improper use of water, its degradation and irregular distribution can affect the quantity and quality needed for future generations, as well as create conflicts of interest between the industrial, urban and agricultural segments. In this context, studies on the quality of water resources based on the analysis of the temporal variation of limnological parameters are of great importance. This study was conducted in the sub-basin of Ribeirão das Cruzes, which contributes around 30% of all water captured and supplied to the population of the city of Araraquara (SP). The objective of this research was to compare the water quality of the river upstream and downstream of the effluent discharge from a local treatment station over a 24-hour period (diurnal cycle variation). Data collection, comprising a period of one day, was carried out in order to observe the dynamics and range of variation of the ecological processes in the studied system. The parameters analyzed showed significant variations in the sections upstream and downstream of the effluent discharge. The nycthemeral analysis makes evident the influence of the effluents on the waters of Ribeirão das Cruzes, especially during certain periods of the day.

  10. Meningococcal genetic variation mechanisms viewed through comparative analysis of serogroup C strain FAM18.

    Directory of Open Access Journals (Sweden)

    Stephen D Bentley

    2007-02-01

    Full Text Available The bacterium Neisseria meningitidis is commonly found harmlessly colonising the mucosal surfaces of the human nasopharynx. Occasionally strains can invade host tissues causing septicaemia and meningitis, making the bacterium a major cause of morbidity and mortality in both the developed and developing world. The species is known to be diverse in many ways, as a product of its natural transformability and of a range of recombination and mutation-based systems. Previous work on pathogenic Neisseria has identified several mechanisms for the generation of diversity of surface structures, including phase variation based on slippage-like mechanisms and sequence conversion of expressed genes using information from silent loci. Comparison of the genome sequences of two N. meningitidis strains, serogroup B MC58 and serogroup A Z2491, suggested further mechanisms of variation, including C-terminal exchange in specific genes and enhanced localised recombination and variation related to repeat arrays. We have sequenced the genome of N. meningitidis strain FAM18, a representative of the ST-11/ET-37 complex, providing the first genome sequence for the disease-causing serogroup C meningococci; it has 1,976 predicted genes, of which 60 do not have orthologues in the previously sequenced serogroup A or B strains. Through genome comparison with Z2491 and MC58 we have further characterised specific mechanisms of genetic variation in N. meningitidis, describing specialised loci for generation of cell surface protein variants and measuring the association between noncoding repeat arrays and sequence variation in flanking genes. Here we provide a detailed view of novel genetic diversification mechanisms in N. meningitidis. Our analysis provides evidence for the hypothesis that the noncoding repeat arrays in neisserial genomes (neisserial intergenic mosaic elements) provide a crucial mechanism for the generation of surface antigen variants. Such variation will have an…

  11. OBJECT-SPACE MULTI-IMAGE MATCHING OF MOBILE-MAPPING-SYSTEM IMAGE SEQUENCES

    Directory of Open Access Journals (Sweden)

    Y. C. Chen

    2012-07-01

    Full Text Available This paper proposes an object-space multi-image matching procedure for terrestrial MMS (Mobile Mapping System) image sequences to determine the coordinates of an object point automatically and reliably. This image matching procedure can be applied to find conjugate points of MMS image sequences efficiently. Conventional area-based image matching methods are not reliable enough to deliver accurate matching results for this application due to image scale variations, viewing angle variations, and object occlusions. In order to deal with these three matching problems, an object-space multi-image matching is proposed. A modified NCC (Normalized Cross Correlation) coefficient is proposed to measure the similarity of image patches. A modified multi-window matching procedure is also introduced to solve the problem of object occlusion. A coarse-to-fine procedure with a combination of object-space multi-image matching and multi-window matching is adopted. The proposed procedure has been implemented for the purpose of matching terrestrial MMS image sequences. The ratio of correct matches in this experiment was about 80%. By providing an approximate conjugate point in an overlapping image manually, most of the incorrect matches could be fixed properly and the ratio of correct matches was improved up to 98%.
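
    For reference, the sketch below computes the plain zero-mean NCC between two image patches, i.e. the basic similarity score that the modified NCC mentioned above builds on; the paper's modification itself is not reproduced, and the patches are random arrays.

```python
# Zero-mean normalized cross-correlation between two patches.
import numpy as np

def ncc(patch_a, patch_b):
    a = patch_a.astype(float).ravel()
    b = patch_b.astype(float).ravel()
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

rng = np.random.default_rng(0)
template = rng.random((21, 21))
shifted = template + 0.05 * rng.random((21, 21))   # nearly identical patch -> NCC close to 1
unrelated = rng.random((21, 21))                   # unrelated patch -> NCC near 0
print(ncc(template, shifted), ncc(template, unrelated))
```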

  12. A comparative analysis of pixel- and object-based detection of landslides from very high-resolution images

    Science.gov (United States)

    Keyport, Ren N.; Oommen, Thomas; Martha, Tapas R.; Sajinkumar, K. S.; Gierke, John S.

    2018-02-01

    A comparative analysis of landslides detected by pixel-based and object-oriented analysis (OOA) methods was performed using very high-resolution (VHR) remotely sensed aerial images for San Juan La Laguna, Guatemala, which witnessed widespread devastation during the 2005 Hurricane Stan. A 3-band orthophoto of 0.5 m spatial resolution together with a field-based inventory of 115 landslides were used for the analysis. A binary reference was assigned with a zero value for landslide and unity for non-landslide pixels. The pixel-based analysis was performed using unsupervised classification, which resulted in 11 different trial classes. Detection of landslides using OOA includes 2-step K-means clustering to eliminate regions based on brightness, and elimination of false positives using object properties such as rectangular fit, compactness, length/width ratio, mean difference of objects, and slope angle. Both overall accuracy and F-score for the OOA method outperformed pixel-based unsupervised classification in both the landslide and non-landslide classes. The overall accuracy for OOA and pixel-based unsupervised classification was 96.5% and 94.3%, respectively, whereas the best F-scores for landslide identification for the OOA and pixel-based unsupervised methods were 84.3% and 77.9%, respectively. Results indicate that OOA is able to identify the majority of landslides with few false positives when compared to pixel-based unsupervised classification.
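
    A hedged sketch of the pixel-based baseline only is given below: unsupervised k-means clustering of pixel values followed by overall accuracy and F-score against a binary landslide reference. Synthetic arrays stand in for the orthophoto and the field-based inventory, so the printed scores are meaningless except as a demonstration of the workflow.

```python
# Unsupervised pixel classification and accuracy/F-score against a binary reference.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(1)
h, w = 100, 100
reference = np.zeros((h, w), dtype=int)
reference[30:60, 20:70] = 1                        # 1 = landslide pixels (toy ground truth)
# Fake 3-band image: landslide pixels slightly brighter than the background
image = rng.normal(0.4, 0.1, (h, w, 3)) + 0.3 * reference[..., None]

pixels = image.reshape(-1, 3)
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(pixels)
# Map the brighter cluster to the "landslide" label
bright_cluster = np.argmax([pixels[clusters == c].mean() for c in (0, 1)])
predicted = (clusters == bright_cluster).astype(int)

y_true = reference.ravel()
print("overall accuracy:", accuracy_score(y_true, predicted))
print("landslide F-score:", f1_score(y_true, predicted))
```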

  13. Circle of Willis and its variations; morphometric study in adult human cadavers

    OpenAIRE

    Raghavendra, Shirol VS, Daksha Dixit, Anil Kumar Reddy Y, Desai SP

    2014-01-01

    Background and Objectives: The circle of Willis plays a vital role in collateral circulation and redistribution of blood to all areas of the brain. Variation in the circle of Willis is known to cause grave disorders such as cerebrovascular disorders, subarachnoid haemorrhage, cerebral aneurysm and schizophrenia. The objectives of the present study are to study the formation and branching pattern of the circle of Willis and to study the distribution of variations. Materials and Methods: The study was cond...

  14. Sources of Variation in Sweat Chloride Measurements in Cystic Fibrosis

    Science.gov (United States)

    Blackman, Scott M.; Raraigh, Karen S.; Corvol, Harriet; Rommens, Johanna M.; Pace, Rhonda G.; Boelle, Pierre-Yves; McGready, John; Sosnay, Patrick R.; Strug, Lisa J.; Knowles, Michael R.; Cutting, Garry R.

    2016-01-01

    Rationale: Expanding the use of cystic fibrosis transmembrane conductance regulator (CFTR) potentiators and correctors for the treatment of cystic fibrosis (CF) requires precise and accurate biomarkers. Sweat chloride concentration provides an in vivo assessment of CFTR function, but the degree to which CFTR mutations account for sweat chloride variation is unknown. Objectives: To estimate potential sources of variation for sweat chloride measurements, including demographic factors, testing variability, recording biases, and CFTR genotype itself. Methods: A total of 2,639 sweat chloride measurements were obtained in 1,761 twins/siblings from the CF Twin-Sibling Study, French CF Modifier Gene Study, and Canadian Consortium for Genetic Studies. Variance component estimation was performed by nested mixed modeling. Measurements and Main Results: Across the tested CF population as a whole, CFTR gene mutations were found to be the primary determinant of sweat chloride variability (56.1% of variation), with contributions from variation over time (e.g., factors related to testing on different days; 13.8%), environmental factors (e.g., climate, family diet; 13.5%), other residual factors (e.g., test variability; 9.9%), and unique individual factors (e.g., modifier genes, unique exposures; 6.8%) (likelihood ratio test, P < 0.001). Twin analysis suggested that modifier genes did not play a significant role because the heritability estimate was negligible (H2 = 0; 95% confidence interval, 0.0–0.35). For an individual with CF, variation in sweat chloride was primarily caused by variation over time (58.1%) with the remainder attributable to residual/random factors (41.9%). Conclusions: Variation in the CFTR gene is the predominant cause of sweat chloride variation; most of the non-CFTR variation is caused by testing variability and unique environmental factors. If test precision and accuracy can be improved, sweat chloride measurement could be a valuable biomarker…
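
    The sketch below shows the flavour of variance partitioning with a simple random-intercept mixed model: repeated sweat chloride measurements nested within individuals, so between-individual and within-individual (residual) variance components can be separated. The data are simulated, and the study's full nested model (twin pairs, sites, genotypes) is not reproduced.

```python
# Random-intercept mixed model splitting between- vs within-individual variance.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n_subjects, n_repeats = 200, 3
subject_effect = rng.normal(0, 15, n_subjects)      # between-individual SD (toy value)
rows = []
for i in range(n_subjects):
    for _ in range(n_repeats):
        rows.append({"subject": i,
                     "sweat_cl": 60 + subject_effect[i] + rng.normal(0, 8)})
df = pd.DataFrame(rows)

model = smf.mixedlm("sweat_cl ~ 1", df, groups=df["subject"]).fit()
between = float(model.cov_re.iloc[0, 0])   # between-individual variance
within = model.scale                       # residual (within-individual) variance
print(f"between-individual share of variance: {between / (between + within):.2f}")
```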

  15. ART OF METALLOGRAPHY: POSSIBILITIES OF DARK-FIELD MICROSCOPY APPLICATION FOR COLORED OBJECTS STRUCTURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    A. G. Anisovich

    2015-01-01

    Full Text Available The application of dark-field microscopy to the study of coloured objects in materials technology was investigated. The capability of analysing corrosion damage and of determining the thickness of a metal coating was demonstrated. The performance of «reflection» analysis in the dark field was also tested in the study of non-metallic materials – orthopedic implants and fireclay refractory. An example of defect detection in a carbon coating is also presented.

  16. Application of LC-MS to the analysis of dyes in objects of historical interest

    Science.gov (United States)

    Zhang, Xian; Laursen, Richard

    2009-07-01

    High-performance liquid chromatography (HPLC) with photodiode array and mass spectrometric detection permits dyes extracted from objects of historical interest, or from natural plant or animal dyestuffs, to be characterized on the basis of three orthogonal properties: HPLC retention time, UV-visible spectrum and molecular mass. In the present study, we have focused primarily on yellow dyes, the bulk of which are flavonoid glycosides that would be almost impossible to characterize without mass spectrometric detection. Also critical for this analysis is a method for mild extraction of the dyes from objects (e.g., textiles) without hydrolyzing the glycosidic linkages. This was accomplished using 5% formic acid in methanol, rather than the more traditional 6 M HCl. Mass spectrometry, besides providing the molecular mass of the dye molecule, sometimes yields additional structural data based on fragmentation patterns. In addition, coeluting compounds can often be detected using extracted ion chromatography. The utility of mass spectrometry is illustrated by the analysis of historical specimens of silk that had been dyed yellow with flavonoid glycosides from Sophora japonica (pagoda tree) and curcumins from Curcuma longa (turmeric). In addition, we have used these techniques to identify the dye type, and sometimes the specific dyestuff, in a variety of objects, including a yellow varnish from a 19th century Tibetan altar and a 3000-year-old wool mortuary textile from Xinjiang, China. We are using HPLC with diode array and mass spectrometric detection to create a library of analyzed dyestuffs (>200 so far; mostly plants) to serve as references for identification of dyes in objects of historical interest.

  17. Advances in Neutron Activation Analysis of Large Objects with Emphasis on Archaeological Examples. Results of a Coordinated Research Project

    International Nuclear Information System (INIS)

    2018-03-01

    This publication is a compilation of the main results and findings of an IAEA coordinated research project (CRP). In particular, it discusses an innovative variation of neutron activation analysis (NAA) known as large sample NAA (LSNAA). There is no other way to measure the bulk mass fractions of the elements present in a large sample (up to kilograms in mass) non-destructively. Examples amenable to LSNAA include irregularly shaped archaeological artefacts, excavated rock samples, large samples of assorted ore, and finished products, such as nuclear reactor components. The CRP focused primarily on the application of LSNAA in the areas of archaeology and geology; however it was also open for further exploration in other areas such as industry and life sciences as well as in basic research. The CRP contributed to establish the validation of the methodology, and, in particular, it provided an opportunity for developing trained manpower. The specific objectives of this CRP were to: i) Validate and optimize the experimental procedures for LSNAA applications in archaeology and geology; ii) Identify the needs for development or upgrade of the neutron irradiation facility for irradiation of large samples; iii) Develop and standardize data acquisition and data analysis systems; iv) Harmonize and standardize data collection from facilities with similar kind of instrumentation for further analysis and benchmarking. Advantages of LSNAA applications, limitations and scientific and technological requirements are described in this publication, which serves as a reference of interest not only to the NAA experts, research reactor personnel, and those considering this technique, but also to various stakeholders and users such as researchers, industrialists, environmental and legal experts, and administrators.

  18. Sensitivity analysis in oxidation ditch modelling: the effect of variations in stoichiometric, kinetic and operating parameters on the performance indices

    NARCIS (Netherlands)

    Abusam, A.A.A.; Keesman, K.J.; Straten, van G.; Spanjers, H.; Meinema, K.

    2001-01-01

    This paper demonstrates the application of the factorial sensitivity analysis methodology in studying the influence of variations in stoichiometric, kinetic and operating parameters on the performance indices of an oxidation ditch simulation model (benchmark). Factorial sensitivity analysis

  19. Object-oriented Method of Hierarchical Urban Building Extraction from High-resolution Remote-Sensing Imagery

    Directory of Open Access Journals (Sweden)

    TAO Chao

    2016-02-01

    Full Text Available An automatic urban building extraction method for high-resolution remote-sensing imagery, which combines building segmentation based on neighbor total variation with object-oriented analysis, is presented in this paper. To address the differing extraction complexity of the various buildings in the segmented image, a hierarchical building extraction strategy with multi-feature fusion is adopted. Firstly, we extract rectangular buildings that remain intact after segmentation through shape analysis. Secondly, in order to ensure that each candidate building target is independent, a multidirectional morphological road-filtering algorithm is designed which can separate buildings from neighboring roads with similar spectra. Finally, we take the extracted buildings and the excluded non-buildings as samples to establish probability models, and a Bayesian discriminant classifier is used to judge the remaining candidate building objects to obtain the final extraction result. The experimental results show that the approach is able to detect buildings with different structural and spectral features in the same image. The results of the performance evaluation also support the robustness and precision of the developed approach.

  20. A Comparative Analysis of Structured and Object-Oriented ...

    African Journals Online (AJOL)

    The concepts of structured and object-oriented programming methods are not relatively new but these approaches are still very much useful and relevant in today's programming paradigm. In this paper, we distinguish the features of structured programs from that of object oriented programs. Structured programming is a ...

  1. Quantitative Analysis of Mixtures of Monoprotic Acids Applying Modified Model-Based Rank Annihilation Factor Analysis on Variation Matrices of Spectrophotometric Acid-Base Titrations

    Directory of Open Access Journals (Sweden)

    Ebrahim Ghorbani-Kalhor

    2015-04-01

    Full Text Available In the current work, a new version of rank annihilation factor analysis was developed to circumvent the rank deficiency problem in multivariate data measurements. Simultaneous determination of the dissociation constant and concentration of monoprotic acids was performed by applying model-based rank annihilation factor analysis to variation matrices of spectrophotometric acid-base titration data. Variation matrices can be obtained by subtracting the first row of the data matrix from all rows of the main data matrix. This method uses variation matrices instead of multivariate spectrophotometric acid-base titration matrices to circumvent the rank deficiency problem in the rank quantitation step. The applicability of this approach was first evaluated with simulated data; then binary mixtures of ascorbic and sorbic acids, as model compounds, were investigated by the proposed method. Finally, the proposed method was successfully applied to resolving ascorbic and sorbic acid in a real orange juice sample. Unique results were therefore achieved by applying rank annihilation factor analysis to the variation matrix, using the advantage of a hard-soft model combination, without any problem or difficulty in rank determination.
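
    The construction of a variation matrix as described above is straightforward: subtract the first spectrum (row) of the titration data matrix from every row. The sketch below uses random numbers and an arbitrary shape purely for illustration.

```python
# Variation matrix: first row of the titration data matrix subtracted from all rows.
import numpy as np

rng = np.random.default_rng(3)
D = rng.random((25, 120))        # 25 titration points x 120 wavelengths (toy data)

V = D - D[0]                     # variation matrix: row 0 becomes all zeros
print(V.shape, np.allclose(V[0], 0.0))
```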

  2. Perceived Effect of Climate Variation on Food Crop Production in ...

    African Journals Online (AJOL)

    The objective of the study is to determine the perception of food crop farmers in Oyo State of climate variation as it affects their production, because the relationship between climate variation and food security is direct and Oyo State has enormous potential to make Nigeria food secure. A multi-stage sampling technique was used to ...

  3. Multivariate analysis of seasonal variation in the composition and thermal properties of butterfat with an emphasis on authenticity assessment

    International Nuclear Information System (INIS)

    Tomaszewska-Gras, J.

    2016-01-01

    The aim of this study was to analyze the seasonal variation in the composition and thermal properties of butterfat (BF) in order to evaluate the applicability of differential scanning calorimetry (DSC) for the authenticity assessment of butter. The fatty acid (FA) and triacylglycerol (TAG) composition and the thermal properties of genuine BF purchased in the summer and in the winter from six producers were determined. Principal component analysis (PCA) was used to recognize variation and, as a result, all BF samples were classified into two groups: one composed of mixed samples from the summer and winter, and the other comprising only summer BF samples. DSC and GC analysis revealed that the group of only summer BF samples was characterized by lower melting temperatures and peak heights of the low- and medium-melting fractions and the highest proportions of unsaturated FAs (ΣC18:1, ΣC18:2, ΣC18:3). The results indicated that most of the variation in composition and thermal properties was driven by the summer BF samples, which may result from the alternative animal feeding systems employed in the summer season, i.e., pasture vs. indoor. Therefore, seasonal variation should be taken into consideration when developing analytical methods for authenticity assessment.
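
    A hedged sketch of the chemometric step is given below: standardise the compositional variables and inspect the first principal components for a seasonal grouping. The tiny matrix is invented; real butterfat data would have many more samples and FA/TAG variables.

```python
# PCA on standardised compositional variables to look for a seasonal grouping.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: butterfat samples (first three "summer", last three "winter");
# columns: selected fatty-acid proportions (hypothetical values)
X = np.array([
    [28.5, 2.9, 1.1],
    [29.1, 3.0, 1.2],
    [28.8, 2.8, 1.1],
    [24.0, 2.1, 0.7],
    [23.5, 2.2, 0.8],
    [24.3, 2.0, 0.7],
])
scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
print(np.round(scores, 2))   # summer and winter samples separate along PC1
```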

  4. Empirical analysis of skin friction under variations of temperature

    International Nuclear Information System (INIS)

    Parra Alvarez, A. R. de la; Groot Viana, M. de

    2014-01-01

    In geotechnical soil characterization, the strength parameters cohesion (c) and internal friction angle (Φ) have traditionally been measured without taking temperature into account, although temperature is a very important issue in energy geostructures. The present document analyzes the variation of these parameters at the soil-concrete interface at different temperatures. A traditional shear strength test with a forced failure plane was used. Several tests were carried out to determine the variation of skin friction with temperature in granular and cohesive soils. (Author)

  5. Approach to proliferation risk assessment based on multiple objective analysis framework

    Energy Technology Data Exchange (ETDEWEB)

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities) to be taken into account. Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  6. Approach to proliferation risk assessment based on multiple objective analysis framework

    International Nuclear Information System (INIS)

    Andrianov, A.; Kuptsov, I.

    2013-01-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities) to be taken into account. Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  7. Genetic Variation in Cardiomyopathy and Cardiovascular Disorders.

    Science.gov (United States)

    McNally, Elizabeth M; Puckelwartz, Megan J

    2015-01-01

    With the wider deployment of massively-parallel, next-generation sequencing, it is now possible to survey human genome data for research and clinical purposes. The reduced cost of producing short-read sequencing has now shifted the burden to data analysis. Analysis of genome sequencing remains challenged by the complexity of the human genome, including redundancy and the repetitive nature of genome elements and the large amount of variation in individual genomes. Public databases of human genome sequences greatly facilitate interpretation of common and rare genetic variation, although linking database sequence information to detailed clinical information is limited by privacy and practical issues. Genetic variation is a rich source of knowledge for cardiovascular disease because many, if not all, cardiovascular disorders are highly heritable. The role of rare genetic variation in predicting risk and complications of cardiovascular diseases has been well established for hypertrophic and dilated cardiomyopathy, where the number of genes that are linked to these disorders is growing. Bolstered by family data, where genetic variants segregate with disease, rare variation can be linked to specific genetic variation that offers profound diagnostic information. Understanding genetic variation in cardiomyopathy is likely to help stratify forms of heart failure and guide therapy. Ultimately, genetic variation may be amenable to gene correction and gene editing strategies.

  8. Genetic analysis of variation in human meiotic recombination.

    Directory of Open Access Journals (Sweden)

    Reshmi Chowdhury

    2009-09-01

    Full Text Available The number of recombination events per meiosis varies extensively among individuals. This recombination phenotype differs between female and male, and also among individuals of each gender. In this study, we used high-density SNP genotypes of over 2,300 individuals and their offspring in two datasets to characterize recombination landscape and to map the genetic variants that contribute to variation in recombination phenotypes. We found six genetic loci that are associated with recombination phenotypes. Two of these (RNF212 and an inversion on chromosome 17q21.31) were previously reported in the Icelandic population, and this is the first replication in any other population. Of the four newly identified loci (KIAA1462, PDZK1, UGCG, NUB1), results from expression studies provide support for their roles in meiosis. Each of the variants that we identified explains only a small fraction of the individual variation in recombination. Notably, we found different sequence variants associated with female and male recombination phenotypes, suggesting that they are regulated by different genes. Characterization of genetic variants that influence natural variation in meiotic recombination will lead to a better understanding of normal meiotic events as well as of non-disjunction, the primary cause of pregnancy loss.

  9. Ghost Imaging of Space Objects

    International Nuclear Information System (INIS)

    Strekalov, Dmitry V; Erkmen, Baris I; Yu Nan

    2013-01-01

    The term 'ghost imaging' was coined in 1995 when an optical correlation measurement in combination with an entangled photon-pair source was used to image a mask placed in one optical channel by raster-scanning a detector in the other, empty, optical channel. Later, it was shown that the entangled photon source could be replaced with thermal sources of light, which are abundantly available as natural illumination sources. It was also shown that the bucket detector could be replaced with a remote point-like detector, opening the possibility to remote-sensing imaging applications. In this paper, we discuss the application of ghost-imaging-like techniques to astronomy, with the objective of detecting intensity-correlation signatures resulting from space objects of interest, such as exo-planets, gas clouds, and gravitational lenses. An important aspect of being able to utilize ghost imaging in astronomy, is the recognition that in interstellar imaging geometries the object of interest can act as an effective beam splitter, yielding detectable variations in the intensity-correlation signature.

  10. Objective characterization of bruise evolution using photothermal depth profiling and Monte Carlo modeling

    Science.gov (United States)

    Vidovič, Luka; Milanič, Matija; Majaron, Boris

    2015-01-01

    Pulsed photothermal radiometry (PPTR) allows noninvasive determination of laser-induced temperature depth profiles in optically scattering layered structures. The obtained profiles provide information on spatial distribution of selected chromophores such as melanin and hemoglobin in human skin. We apply the described approach to study time evolution of incidental bruises (hematomas) in human subjects. By combining numerical simulations of laser energy deposition in bruised skin with objective fitting of the predicted and measured PPTR signals, we can quantitatively characterize the key processes involved in bruise evolution (i.e., hemoglobin mass diffusion and biochemical decomposition). Simultaneous analysis of PPTR signals obtained at various times post injury provides an insight into the variations of these parameters during the bruise healing process. The presented methodology and results advance our understanding of the bruise evolution and represent an important step toward development of an objective technique for age determination of traumatic bruises in forensic medicine.

  11. Exhaled nitric oxide - circadian variations in healthy subjects

    Directory of Open Access Journals (Sweden)

    Antosova M

    2009-12-01

    Full Text Available Abstract Objective Exhaled nitric oxide (eNO has been suggested as a marker of airway inflammatory diseases. The level of eNO is influenced by many various factor including age, sex, menstrual cycle, exercise, food, drugs, etc. The aim of our study was to investigate a potential influence of circadian variation on eNO level in healthy subjects. Methods Measurements were performed in 44 women and 10 men, non-smokers, without respiratory tract infection in last 2 weeks. The eNO was detected at 4-hour intervals from 6 a.m. to 10 p.m. using an NIOX analyzer. We followed the ATS/ERS guidelines for eNO measurement and analysis. Results Peak of eNO levels were observed at 10 a.m. (11.1 ± 7.2 ppb, the lowest value was detected at 10 p.m. (10.0 ± 5.8 ppb. The difference was statistically significant (paired t-test, P Conclusions The daily variations in eNO, with the peak in the morning hours, could be of importance in clinical practice regarding the choice of optimal time for monitoring eNO in patients with respiratory disease.

  12. Bi-objective optimization for multi-modal transportation routing planning problem based on Pareto optimality

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2015-09-01

    Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem, which aims to select an optimal route to move a consignment of goods from its origin to its destination through the multi-modal transportation network, with the optimization considered from two viewpoints: cost and time. Design/methodology/approach: In this study, a bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem. Minimizing the total transportation cost and the total transportation time are set as the optimization objectives of the model. In order to balance the benefit between the two objectives, Pareto optimality is utilized to solve the model by obtaining its Pareto frontier. The Pareto frontier of the model can provide the multi-modal transportation operator (MTO) and customers with better decision support, and it is obtained by the normalized normal constraint method. Then, an experimental case study is designed to verify the feasibility of the model and Pareto optimality by using the mathematical programming software Lingo. Finally, a sensitivity analysis of the demand and supply in the multi-modal transportation organization is performed based on the designed case. Findings: The calculation results indicate that the proposed model and Pareto optimality perform well in dealing with the bi-objective optimization. The sensitivity analysis also clearly shows the influence of variations in demand and supply on the multi-modal transportation organization. Therefore, this method can be further promoted to practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem, and the Pareto-frontier-based sensitivity analysis of demand and supply in the multi-modal transportation organization is performed based on the designed case.
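    As an illustration of the Pareto-optimality idea only (not the paper's normalized normal constraint method or its Lingo model), the short sketch below enumerates a few hypothetical candidate routes and filters out the dominated ones to obtain a cost-time Pareto frontier; all route names, costs, and times are assumptions.

```python
# Minimal sketch: Pareto frontier of candidate multi-modal routes scored by
# total cost and total time. Route data are hypothetical.
from typing import List, Tuple

Route = Tuple[str, float, float]  # (description, total cost, total time in hours)

def pareto_frontier(routes: List[Route]) -> List[Route]:
    """Return routes that are not dominated in both cost and time."""
    frontier = []
    for r in routes:
        dominated = any(
            (o[1] <= r[1] and o[2] <= r[2]) and (o[1] < r[1] or o[2] < r[2])
            for o in routes
        )
        if not dominated:
            frontier.append(r)
    return sorted(frontier, key=lambda r: r[1])  # sort by cost for readability

if __name__ == "__main__":
    candidates = [
        ("rail-road",   5200.0, 46.0),
        ("road only",   6100.0, 30.0),
        ("rail-water",  4300.0, 72.0),
        ("air-road",   11800.0, 14.0),
        ("water-road",  4900.0, 80.0),  # dominated by rail-water
    ]
    for name, cost, time in pareto_frontier(candidates):
        print(f"{name:10s}  cost={cost:8.1f}  time={time:5.1f} h")
```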

  13. A genome-wide analysis of putative functional and exonic variation associated with extremely high intelligence.

    Science.gov (United States)

    Spain, S L; Pedroso, I; Kadeva, N; Miller, M B; Iacono, W G; McGue, M; Stergiakouli, E; Davey Smith, G; Putallaz, M; Lubinski, D; Meaburn, E L; Plomin, R; Simpson, M A

    2016-08-01

    Although individual differences in intelligence (general cognitive ability) are highly heritable, molecular genetic analyses to date have had limited success in identifying specific loci responsible for its heritability. This study is the first to investigate exome variation in individuals of extremely high intelligence. Under the quantitative genetic model, sampling from the high extreme of the distribution should provide increased power to detect associations. We therefore performed a case-control association analysis with 1409 individuals drawn from the top 0.0003 (IQ >170) of the population distribution of intelligence and 3253 unselected population-based controls. Our analysis focused on putative functional exonic variants assayed on the Illumina HumanExome BeadChip. We did not observe any individual protein-altering variants that are reproducibly associated with extremely high intelligence and within the entire distribution of intelligence. Moreover, no significant associations were found for multiple rare alleles within individual genes. However, analyses using genome-wide similarity between unrelated individuals (genome-wide complex trait analysis) indicate that the genotyped functional protein-altering variation yields a heritability estimate of 17.4% (s.e. 1.7%) based on a liability model. In addition, investigation of nominally significant associations revealed fewer rare alleles associated with extremely high intelligence than would be expected under the null hypothesis. This observation is consistent with the hypothesis that rare functional alleles are more frequently detrimental than beneficial to intelligence.

  14. Solid mechanics a variational approach

    CERN Document Server

    Dym, Clive L

    2013-01-01

    Solid Mechanics: A Variational Approach, Augmented Edition presents a lucid and thoroughly developed approach to solid mechanics for students engaged in the study of elastic structures not seen in other texts currently on the market. This work offers a clear and carefully prepared exposition of variational techniques as they are applied to solid mechanics. Unlike other books in this field, Dym and Shames treat all the necessary theory needed for the study of solid mechanics and include extensive applications. Of particular note is the variational approach used in developing consistent structural theories and in obtaining exact and approximate solutions for many problems.  Based on both semester and year-long courses taught to undergraduate seniors and graduate students, this text is geared for programs in aeronautical, civil, and mechanical engineering, and in engineering science. The authors’ objective is two-fold: first, to introduce the student to the theory of structures (one- and two-dimensional) as ...

  15. Multi-objective optimization of GPU3 Stirling engine using third order analysis

    International Nuclear Information System (INIS)

    Toghyani, Somayeh; Kasaeian, Alibakhsh; Hashemabadi, Seyyed Hasan; Salimi, Morteza

    2014-01-01

    Highlights: • A third-order analysis is carried out for optimization of a Stirling engine. • The triple optimization is done on a GPU3 Stirling engine. • A multi-objective optimization is carried out for the Stirling engine. • The results are compared with a previous experimental work to check the model improvement. • The TOPSIS, Fuzzy, and LINMAP decision-making methods are compared with each other. - Abstract: The Stirling engine is an external combustion engine that operates on a closed cycle and can use any external heat source to generate mechanical power. These engines are good choices for use in power generation systems, because they offer a reasonable theoretical efficiency that can be closer to the Carnot efficiency than that of other reciprocating thermal engines. Hence, many studies have been conducted on Stirling engines, and third-order thermodynamic analysis is one of them. In this study, a multi-objective optimization with four decision variables, including the heat source temperature, stroke, mean effective pressure, and engine frequency, was applied in order to increase the efficiency and output power and reduce the pressure drop. Three decision-making procedures were applied to select among the optimization results. Finally, the applied methods were compared with the results of a previous experimental work, and good agreement was observed.
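    For readers unfamiliar with one of the decision-making steps named in the highlights, the following is a minimal sketch of TOPSIS ranking over a small set of hypothetical operating points (efficiency and output power to be maximized, pressure drop to be minimized); the candidate values and weights are illustrative assumptions, not the paper's data.

```python
# Hedged sketch of TOPSIS ranking over candidate operating points.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if larger is better."""
    m = matrix / np.linalg.norm(matrix, axis=0)      # vector normalization
    v = m * weights                                  # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti  = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                   # relative closeness to ideal

alternatives = np.array([
    # efficiency, output power (W), pressure drop (kPa) -- hypothetical values
    [0.38, 4200.0, 110.0],
    [0.41, 3900.0,  95.0],
    [0.35, 4600.0, 140.0],
])
scores = topsis(alternatives,
                weights=np.array([0.4, 0.4, 0.2]),
                benefit=np.array([True, True, False]))
print("TOPSIS closeness:", np.round(scores, 3), "best alternative:", int(scores.argmax()))
```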

  16. Sequence length variation, indel costs, and congruence in sensitivity analysis

    DEFF Research Database (Denmark)

    Aagesen, Lone; Petersen, Gitte; Seberg, Ole

    2005-01-01

    The behavior of two topological and four character-based congruence measures was explored using different indel treatments in three empirical data sets, each with different alignment difficulties. The analyses were done using direct optimization within a sensitivity analysis framework in which the cost of indels was varied. Indels were treated either as a fifth character state, or strings of contiguous gaps were considered single events by using linear affine gap cost. Congruence consistently improved when indels were treated as single events, but no congruence measure appeared as the obviously preferable one. However, when combining enough data, all congruence measures clearly tended to select the same alignment cost set as the optimal one. Disagreement among congruence measures was mostly caused by a dominant fragment or a data partition that included all or most of the length variation.

  17. Batch variation between branchial cell cultures: An analysis of variance

    DEFF Research Database (Denmark)

    Hansen, Heinz Johs. Max; Grosell, M.; Kristensen, L.

    2003-01-01

    We present in detail how a statistical analysis of variance (ANOVA) is used to sort out the effect of an unexpected batch-to-batch variation between cell cultures. Two separate cultures of rainbow trout branchial cells were grown on permeable filter supports ("inserts"). They were supposed … and introducing the observed difference between batches as one of the factors in an expanded three-dimensional ANOVA, we were able to overcome an otherwise crucial lack of sufficiently reproducible duplicate values. We could thereby show that the effect of changing the apical medium was much more marked when the radioactive lipid precursors were added on the apical, rather than on the basolateral, side. The insert cell cultures were obviously polarized. We argue that it is not reasonable to reject troublesome experimental results when we do not know a priori that something went wrong. The ANOVA is a very useful …
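    A minimal sketch of the approach described above: enter the batch as an explicit factor in a multi-way ANOVA (here with statsmodels) instead of discarding the divergent replicate. The data set, factor names, and effect sizes below are synthetic assumptions, not the trout-cell measurements.

```python
# Multi-way ANOVA with "batch" included as a factor, on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(1)
rows = []
for batch in ("batch1", "batch2"):
    for side in ("apical", "basolateral"):
        for medium in ("control", "changed"):
            for _ in range(4):  # four replicate inserts per cell
                effect = 0.8 if (side == "apical" and medium == "changed") else 0.0
                batch_shift = 0.5 if batch == "batch2" else 0.0
                rows.append({"batch": batch, "side": side, "medium": medium,
                             "incorporation": 2.0 + effect + batch_shift
                                              + rng.normal(0, 0.2)})
df = pd.DataFrame(rows)

# Batch enters as a main effect; side x medium interaction is of interest.
model = ols("incorporation ~ C(batch) + C(side) * C(medium)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))
```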

  18. Only 7% of the variation in feed efficiency in veal calves can be predicted from variation in feeding motivation, digestion, metabolism, immunology, and behavioral traits in early life.

    Science.gov (United States)

    Gilbert, M S; van den Borne, J J G C; van Reenen, C G; Gerrits, W J J

    2017-10-01

    High interindividual variation in growth performance is commonly observed in veal calf production and appears to depend on milk replacer (MR) composition. Our first objective was to examine whether variation in growth performance in healthy veal calves can be predicted from early life characterization of these calves. Our second objective was to determine whether these predictions differ between calves that are fed a high- or low-lactose MR in later life. A total of 180 male Holstein-Friesian calves arrived at the facilities at 17 ± 3.4 d of age, and blood samples were collected before the first feeding. Subsequently, calves were characterized in the following 9 wk (period 1) using targeted challenges related to traits within each of 5 categories: feeding motivation, digestion, postabsorptive metabolism, behavior and stress, and immunology. In period 2 (wk 10-26), 130 calves were equally divided over 2 MR treatments: a control MR that contained lactose as the only carbohydrate source and a low-lactose MR in which 51% of the lactose was isocalorically replaced by glucose, fructose, and glycerol (2:1:2 ratio). Relations between early life characteristics and growth performance in later life were assessed in 117 clinically healthy calves. Average daily gain (ADG) in period 2 tended to be greater for control calves (1,292 ± 111 g/d) than for calves receiving the low-lactose MR (1,267 ± 103 g/d). Observations in period 1 were clustered per category using principal component analysis, and the resulting principal components were used to predict performance in period 2 using multiple regression procedures. Variation in observations in period 1 predicted 17% of variation in ADG in period 2. However, this was mainly related to variation in solid feed refusals. When ADG was adjusted to equal solid feed intake, only 7% of the variation in standardized ADG in period 2, in fact reflecting feed efficiency, could be explained by early life measurements. This indicates that >90% of the variation in feed efficiency in veal calves cannot be predicted from these early life traits.

  19. Object 'Ukryttya' 1986-2006

    International Nuclear Information System (INIS)

    Klyuchnikov, A.A.; Krasnov, V.A.; Rud'ko, V.M.; Shcherbin, V.N.

    2006-01-01

    This monograph summarizes materials pertaining to the state of the 'Ukryttya' Object. The results of research on the condition of fuel-containing materials and forecasts of their future behavior are presented; aerosol characterization, mechanisms of liquid radioactive waste production, radiation conditions at the SO industrial site and the NSC assembly site, as well as an analysis of the 'Ukryttya' Object's environmental impact, are also covered. The condition of the 'Ukryttya' Object's building structures is described. The preparation and realization of international projects for conversion of the 'Ukryttya' Object into an ecologically safe system, including the 'Ukryttya' Implementation Plan (SIP), are considered.

  20. Analysis and Comparison of Objective Methods for Image Quality Assessment

    Directory of Open Access Journals (Sweden)

    P. S. Babkin

    2014-01-01

    The purpose of this work is research and modification of reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of formal assessments that corresponds more closely to subjective expert estimates (MOS). In considering the formal reference objective methods for image quality assessment, we used the results of other authors, who offer results and comparative analyses of the most effective algorithms. Based on these investigations we chose the two most successful algorithms, PQS and MSSSIM, for which further analysis was carried out in MATLAB 7.8 (R2009a). The publication focuses on features of the algorithms that are of great importance in practical implementation but are insufficiently covered in publications by other authors. In the implemented modification of the PQS algorithm, the Kirsch edge detector was replaced by the Canny edge detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the PQS objective assessment of image quality is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification. In the specialized literature on formal image quality evaluation methods, this type of modification has not been mentioned. The method described in the publication can be applied to various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.
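    As a stand-in illustration of the general workflow (not the authors' PQS or MSSSIM implementations, which are not reproduced here), the sketch below scores degraded images with single-scale SSIM from scikit-image and correlates the scores against hypothetical MOS ratings; all images and ratings are synthetic.

```python
# Correlating a formal quality score (SSIM) with subjective MOS on synthetic data.
import numpy as np
from scipy.stats import pearsonr
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(0)
reference = rng.random((128, 128))          # stand-in monochrome reference image

mos, scores = [], []
for noise_level in np.linspace(0.01, 0.3, 8):
    degraded = np.clip(reference + rng.normal(0, noise_level, reference.shape), 0, 1)
    scores.append(ssim(reference, degraded, data_range=1.0))
    # Hypothetical expert ratings: higher noise -> lower MOS (1..5 scale).
    mos.append(5.0 - 12.0 * noise_level + rng.normal(0, 0.1))

r, p = pearsonr(scores, mos)
print(f"Pearson correlation between SSIM and MOS: r={r:.3f}, p={p:.3g}")
```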

  1. Variational formulation based analysis on growth of yield front in ...

    African Journals Online (AJOL)

    The present study investigates the growth of the elastic-plastic front in rotating solid disks of non-uniform thickness having exponential and parabolic geometry variation. The problem is solved through an extension of a variational method to the elastoplastic regime. The formulation is based on the von Mises yield criterion and linear ...

  2. Towards a syntactic analysis of European Portuguese cognate objects

    Directory of Open Access Journals (Sweden)

    Celda Morgado Choupina

    2013-01-01

    The present paper aims at discussing selected syntactic aspects of cognate objects in European Portuguese, along the lines of Distributed Morphology (Haugen, 2009). Cognate objects may be readily discovered in numerous human languages, including European Portuguese (Chovia uma chuva miudinha). It is assumed in papers devoted to their English counterparts that they belong to various subclasses. Indeed, some of them are genuine cognates (to sleep a sleep...) or hyponyms (to dance a jig; Hale & Keyser, 2002). It turns out that in European Portuguese, they can be split into four different categories: (i) genuine cognate objects (chorar um choro...), (ii) similar cognate objects (dançar uma dança), (iii) object hyponyms (dançar um tango), and (iv) prepositional cognate objects (morrer de uma morte...). There are, then, significant differences between the various classes of cognate objects: whereas the genuine ones call imperatively for a restrictive modifier and a definite article, the remaining ones admit them only optionally. It might be concluded, then, that a lexicalist theory set up along the lines of Hale and Keyser is unable to deal successfully with the distributional facts proper to the various classes of cognate constructions in European Portuguese. That is why the present study is conducted more in accordance with the syntactic principles of Distributed Morphology, with a strong impact of the hypotheses put forward by Haugen (2009).

  3. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    Science.gov (United States)

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.

  4. Using Item Analysis to Assess Objectively the Quality of the Calgary-Cambridge OSCE Checklist

    Directory of Open Access Journals (Sweden)

    Tyrone Donnon

    2011-06-01

    Background: The purpose of this study was to investigate the use of item analysis to assess objectively the quality of items on the Calgary-Cambridge Communications OSCE checklist. Methods: A total of 150 first year medical students were provided with extensive teaching on the use of the Calgary-Cambridge Guidelines for interviewing patients and participated in a final year-end 20-minute communication OSCE station. Grouped into either the upper half (50%) or lower half (50%) communication skills performance groups, discrimination, difficulty, and point biserial values were calculated for each checklist item. Results: The mean score on the 33-item communication checklist was 24.09 (SD = 4.46) and the internal reliability coefficient was α = 0.77. Although most of the items were found to have moderate (k = 12, 36%) or excellent (k = 10, 30%) discrimination values, there were 6 (18%) identified as 'fair' and 3 (9%) as 'poor'. A post-examination review focused on item analysis findings resulted in an increase in checklist reliability (α = 0.80). Conclusions: Item analysis has been used extensively with MCQ exams. In this study, it was also found to be an objective and practical approach for evaluating the quality of a standardized OSCE checklist.
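    A minimal sketch of the item statistics named above (difficulty index, upper/lower-group discrimination, and point-biserial correlation), computed on a simulated 0/1 response matrix; the response model, group split, and sample sizes are assumptions for illustration only.

```python
# Classical item analysis on simulated dichotomous checklist responses.
import numpy as np
from scipy.stats import pointbiserialr

rng = np.random.default_rng(42)
n_students, n_items = 150, 33
ability = rng.normal(size=n_students)
item_difficulty = rng.normal(size=n_items)
# Simple logistic response model: P(correct) depends on ability - difficulty.
p_correct = 1.0 / (1.0 + np.exp(-(ability[:, None] - item_difficulty[None, :])))
responses = (rng.random((n_students, n_items)) < p_correct).astype(int)

total = responses.sum(axis=1)
order = np.argsort(total)
lower, upper = order[: n_students // 2], order[n_students // 2:]

for j in range(3):  # report the first three items
    p = responses[:, j].mean()                                   # difficulty index
    d = responses[upper, j].mean() - responses[lower, j].mean()  # discrimination index
    r_pb, _ = pointbiserialr(responses[:, j], total)             # point-biserial
    print(f"item {j + 1}: difficulty={p:.2f}  discrimination={d:.2f}  r_pb={r_pb:.2f}")
```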

  5. Relationships among and variation within rare breeds of swine.

    Science.gov (United States)

    Roberts, K S; Lamberson, W R

    2015-08-01

    Extinction of rare breeds of livestock threatens to reduce the total genetic variation available for selection in the face of a changing environment and new diseases. Swine breeds facing extinction typically share characteristics such as small size, slow growth rate, and high fat percentage, which limit their contribution to commercial production. Compounding the risk of loss of variation is the lack of pedigree information for many rare breeds due to inadequate herd books, which increases the chance that producers are breeding closely related individuals. By making genetic data available, producers can make more educated breeding decisions to preserve genetic diversity in future generations, and conservation organizations can prioritize investments in breed preservation. The objective of this study was to characterize genetic variation within and among breeds of swine and to prioritize heritage breeds for preservation. Genotypes from the Illumina PorcineSNP60 BeadChip (GeneSeek, Lincoln, NE) were obtained for Guinea, Ossabaw Island, Red Wattle, American Saddleback, Mulefoot, British Saddleback, Duroc, Landrace, Large White, Pietrain, and Tamworth pigs. A whole-genome analysis toolset was used to construct a genomic relationship matrix and to calculate inbreeding coefficients for the animals within each breed. Relatedness and average inbreeding coefficient differed among breeds, and pigs from rare breeds were generally more closely related and more inbred (…) Guinea pigs. Tamworth, Duroc, and Mulefoot tended not to cluster with the other 7 breeds.
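    As background on the genomic relationship matrix mentioned above, here is a small numpy sketch of the commonly used VanRaden formulation, G = ZZ'/(2Σp(1−p)), with simulated 0/1/2 genotypes; it is illustrative only and does not reproduce the whole-genome analysis toolset or the SNP data used in the study.

```python
# Genomic relationship matrix and genomic inbreeding from simulated genotypes.
import numpy as np

rng = np.random.default_rng(7)
n_animals, n_snps = 40, 5000
p = rng.uniform(0.05, 0.95, n_snps)                   # allele frequencies (assumed)
genotypes = rng.binomial(2, p, size=(n_animals, n_snps)).astype(float)

Z = genotypes - 2 * p                                 # centre each SNP by 2p
G = Z @ Z.T / (2 * np.sum(p * (1 - p)))               # genomic relationship matrix

inbreeding = np.diag(G) - 1                           # genomic inbreeding coefficient
off_diag_mean = (G.sum() - np.trace(G)) / (n_animals * (n_animals - 1))
print("mean genomic inbreeding:", round(inbreeding.mean(), 3))
print("mean off-diagonal relatedness:", round(off_diag_mean, 3))
```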

  6. Day-to-day and within-day variation in urinary iodine excretion

    DEFF Research Database (Denmark)

    Rasmussen, Lone Banke; Ovesen, L.; Christiansen, E.

    1999-01-01

    Objective: To examine the day-to-day and within-day variation in urinary iodine excretion and the day-to-day variation in iodine intake. Design: Collection of consecutive 24-h urine samples and casual urine samples over 24 h. Setting: The study population consisted of highly motivated subjects fr...

  7. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score with a high Pearson correlation coefficient of 0.909 (p …). The modified BAT® software improves the correlation between subjective and objective BSI values, and may become a new standard for trials evaluating breast symmetry.

  8. Software Analysis of Mining Images for Objects Detection

    Directory of Open Access Journals (Sweden)

    Jan Tomecek

    2013-11-01

    The contribution deals with the development of a new module of the robust FOTOMNG system for editing images from a video, or mining images from measurements, for subsequent improvement of the detection of required objects in a 2D image. The developed module allows creating a final high-quality picture by combining multiple images containing the searched objects. Input data can be combined according to parameters or based on reference frames. Correction of detected 2D objects is also part of this module. The solution is implemented into the FOTOMNG system, and the finished work has been tested in appropriate frames, which validated core functionality and usability. Tests confirmed the function of each part of the module, its accuracy, and the implications of integration.

  9. Variation Tolerant On-Chip Interconnects

    CERN Document Server

    Nigussie, Ethiopia Enideg

    2012-01-01

    This book presents design techniques, analysis and implementation of high performance and power efficient, variation tolerant on-chip interconnects.  Given the design paradigm shift to multi-core, interconnect-centric designs and the increase in sources of variability and their impact in sub-100nm technologies, this book will be an invaluable reference for anyone concerned with the design of next generation, high-performance electronics systems. Provides comprehensive, circuit-level explanation of high-performance, energy-efficient, variation-tolerant on-chip interconnect; Describes design techniques to mitigate problems caused by variation; Includes techniques for design and implementation of self-timed on-chip interconnect, delay variation insensitive communication protocols, high speed signaling techniques and circuits, bit-width independent completion detection and process, voltage and temperature variation tolerance.                          

  10. Object tracking by occlusion detection via structured sparse learning

    KAUST Repository

    Zhang, Tianzhu

    2013-06-01

    Sparse representation based methods have recently drawn much attention in visual tracking due to good performance against illumination variation and occlusion. They assume the errors caused by image variations can be modeled as pixel-wise sparse. However, in many practical scenarios these errors are not truly pixel-wise sparse but rather sparsely distributed in a structured way. In fact, pixels in error constitute contiguous regions within the object's track. This is the case when significant occlusion occurs. To accommodate for non-sparse occlusion in a given frame, we assume that occlusion detected in previous frames can be propagated to the current one. This propagated information determines which pixels will contribute to the sparse representation of the current track. In other words, pixels that were detected as part of an occlusion in the previous frame will be removed from the target representation process. As such, this paper proposes a novel tracking algorithm that models and detects occlusion through structured sparse learning. We test our tracker on challenging benchmark sequences, such as sports videos, which involve heavy occlusion, drastic illumination changes, and large pose variations. Experimental results show that our tracker consistently outperforms the state-of-the-art. © 2013 IEEE.

  11. Robust object tracking techniques for vision-based 3D motion analysis applications

    Science.gov (United States)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications including industry and science, virtual reality and movies, medicine and sports. For most applications, the reliability and accuracy of the obtained data, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed, potential for high accuracy, and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from 2 to 4 machine vision cameras for capturing video sequences of object motion. Original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms, both for detecting, identifying, and tracking similar targets and for marker-less object motion capture, has been developed and tested. The results of the algorithms' evaluation show high robustness and high reliability for various motion analysis tasks in technical and biomechanics applications.

  12. Only 7% of the variation in feed efficiency in veal calves can be predicted from variation in feeding motivation, digestion, metabolism, immunology, and behavioral traits in early life

    NARCIS (Netherlands)

    Gilbert, M.S.; Borne, van den J.J.G.C.; Reenen, van C.G.; Gerrits, W.J.J.

    2017-01-01

    High interindividual variation in growth performance is commonly observed in veal calf production and appears to depend on milk replacer (MR) composition. Our first objective was to examine whether variation in growth performance in healthy veal calves can be predicted from early life characterization of these calves.

  13. Face Recognition Is Affected by Similarity in Spatial Frequency Range to a Greater Degree Than Within-Category Object Recognition

    Science.gov (United States)

    Collin, Charles A.; Liu, Chang Hong; Troje, Nikolaus F.; McMullen, Patricia A.; Chaudhuri, Avi

    2004-01-01

    Previous studies have suggested that face identification is more sensitive to variations in spatial frequency content than object recognition, but none have compared how sensitive the 2 processes are to variations in spatial frequency overlap (SFO). The authors tested face and object matching accuracy under varying SFO conditions. Their results…

  14. Analysis of Idiom Variation in the Framework of Linguistic Subjectivity

    Science.gov (United States)

    Liu, Zhengyuan

    2012-01-01

    Idiom variation is a ubiquitous linguistic phenomenon which has raised a lot of research questions. Past approaches were either formal or functional, and neither paid much attention to cognitive factors of language users. By putting idiom variation in the framework of linguistic subjectivity, we have offered a new perspective in the…

  15. Analysis of Pressure Variations in a Low-Pressure Nickel-Hydrogen Battery - Part 1.

    Science.gov (United States)

    Purushothaman, B K; Wainright, J S

    2012-05-15

    A low pressure nickel-hydrogen battery using either a metal hydride or gaseous hydrogen for H2 storage has been developed for use in implantable neuroprosthetic devices. In this paper, pressure variations inside the cell for the gaseous hydrogen version are analyzed and correlated with the oxygen evolution side reaction at the end of charging, the recombination of oxygen with hydrogen during charging and a subsequent rest period, and the self-discharge of the nickel electrode. About 70% of the recombination occurred simultaneously with oxygen evolution during charging, and the remaining oxygen recombined with hydrogen during the first hour after charging. Self-discharge of the cell varies linearly with hydrogen pressure at a given state of charge and increases with increasing battery charge levels. The coulometric efficiency calculated based on analysis of the pressure-time data agreed well with the efficiency calculated based on the current-time data. Pressure variations in the battery are simulated accurately to predict the coulometric efficiency and the state of charge of the cell, factors of extreme importance for a battery intended for implantation within the human body.

  16. Relationship between climatic variables and the variation in bulk tank milk composition using canonical correlation analysis.

    Science.gov (United States)

    Stürmer, Morgana; Busanello, Marcos; Velho, João Pedro; Heck, Vanessa Isabel; Haygert-Velho, Ione Maria Pereira

    2018-06-04

    A number of studies have addressed the relations between climatic variables and milk composition, but these works used univariate statistical approaches. In our study, we used a multivariate approach (canonical correlation) to study the impact of climatic variables on milk composition, price, and monthly milk production at a dairy farm using bulk tank milk data. Data on milk composition, price, and monthly milk production were obtained from a dairy company that purchased the milk from the farm, while climatic variable data were obtained from the National Institute of Meteorology (INMET). The data are from January 2014 to December 2016. Univariate correlation analysis and canonical correlation analysis were performed. Few correlations between the climatic variables and milk composition were found using a univariate approach. However, using canonical correlation analysis, we found a strong and significant correlation (r_c = 0.95, p-value = 0.0029). Lactose, ambient temperature measures (mean, minimum, and maximum), and the temperature-humidity index (THI) were found to be the most important variables for the canonical correlation. Our study indicated that 10.2% of the variation in milk composition, pricing, and monthly milk production can be explained by climatic variables. Ambient temperature variables, together with THI, seem to have the most influence on variation in milk composition.
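    A brief sketch of the multivariate approach named above, using scikit-learn's CCA on simulated monthly climate and milk-composition blocks; the variables, sample size, and generated values are assumptions, not the farm's bulk tank records.

```python
# Canonical correlation between a climate block and a milk-composition block.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(3)
n_months = 36
temp_mean = rng.normal(20, 6, n_months)
temp_min = temp_mean - 5 + rng.normal(0, 1, n_months)
temp_max = temp_mean + 6 + rng.normal(0, 1, n_months)
thi = 0.8 * temp_mean + rng.normal(0, 2, n_months)
climate = np.column_stack([temp_mean, temp_min, temp_max, thi])

fat = rng.normal(3.9, 0.2, n_months)
protein = 3.3 + rng.normal(0, 0.1, n_months)
lactose = 4.7 - 0.01 * thi + rng.normal(0, 0.02, n_months)   # weak THI effect
volume = 800 + rng.normal(0, 50, n_months)
milk = np.column_stack([fat, protein, lactose, volume])

cca = CCA(n_components=1).fit(climate, milk)
u, v = cca.transform(climate, milk)
r_c = np.corrcoef(u[:, 0], v[:, 0])[0, 1]
print(f"first canonical correlation: {r_c:.2f}")
```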

  17. Landscape complexity and soil moisture variation in south Georgia, USA, for remote sensing applications

    Science.gov (United States)

    Giraldo, Mario A.; Bosch, David; Madden, Marguerite; Usery, Lynn; Kvien, Craig

    2008-08-01

    This research addressed the temporal and spatial variation of soil moisture (SM) in a heterogeneous landscape. The research objective was to investigate soil moisture variation in eight homogeneous 30 by 30 m plots, similar to the pixel size of a Landsat Thematic Mapper (TM) or Enhanced Thematic Mapper Plus (ETM+) image. The plots were adjacent to eight stations of an in situ soil moisture network operated by the United States Department of Agriculture-Agricultural Research Service (USDA-ARS) in Tifton, GA. We also studied five adjacent agricultural fields to examine the effect of different land uses/land covers (LULC) (grass, orchard, peanuts, cotton, and bare soil) on the temporal and spatial variation of soil moisture. Soil moisture field data were collected on eight occasions throughout 2005 and January 2006 to establish comparisons within and among the eight homogeneous plots. Consistently over time, analysis of variance (ANOVA) showed high variation in soil moisture behavior among the plots and high homogeneity in soil moisture behavior within them. A precipitation analysis for the eight sampling dates throughout 2005 showed similar rainfall conditions for the eight study plots. Therefore, soil moisture variation among locations was explained by in situ local conditions. Temporal stability geostatistical analysis showed that soil moisture has high temporal stability within the small plots and that a single point reading can be used to monitor soil moisture status for the plot within a maximum 3% volume/volume (v/v) soil moisture variation. Similarly, t-statistic analysis showed that soil moisture status in the upper soil layer changes within 24 h. We found statistical differences in soil moisture between the different LULC in the agricultural fields, as well as statistical differences between these fields and the adjacent 30 by 30 m plots. From this analysis, it was demonstrated that spatial proximity is not enough to produce similar soil moisture behavior.

  18. Aligning experimental design with bioinformatics analysis to meet discovery research objectives.

    Science.gov (United States)

    Kane, Michael D

    2002-01-01

    The utility of genomic technology and bioinformatic analytical support to provide new and needed insight into the molecular basis of disease, development, and diversity continues to grow as more research model systems and populations are investigated. Yet deriving results that meet a specific set of research objectives requires aligning or coordinating the design of the experiment, the laboratory techniques, and the data analysis. The following paragraphs describe several important interdependent factors that need to be considered to generate high quality data from the microarray platform. These factors include aligning oligonucleotide probe design with the sample labeling strategy if oligonucleotide probes are employed, recognizing that compromises are inherent in different sample procurement methods, normalizing 2-color microarray raw data, and distinguishing the difference between gene clustering and sample clustering. These factors do not represent an exhaustive list of technical variables in microarray-based research, but this list highlights those variables that span both experimental execution and data analysis. Copyright 2001 Wiley-Liss, Inc.

  19. Geographical variation in the prevalence of sensitization to common aeroallergens in adults

    DEFF Research Database (Denmark)

    Newson, R B; van Ree, R; Forsberg, B

    2014-01-01

    BACKGROUND: Geographical variation in the prevalence of sensitization to aeroallergens may reflect differences in exposure to risk factors such as having older siblings, being raised on a farm or other unidentified exposures. OBJECTIVE: We wanted to measure geographical variation in skin prick te...

  20. Objective and quantitative analysis of daytime sleepiness in physicians after night duties.

    Science.gov (United States)

    Wilhelm, Barbara J; Widmann, Anja; Durst, Wilhelm; Heine, Christian; Otto, Gerhard

    2009-06-01

    Work place studies often have the disadvantage of lacking objective data less prone to subject bias. The aim of this study was to contribute objective data to the discussion about safety aspects of night shifts in physicians. For this purpose we applied the Pupillographic Sleepiness Test (PST). The PST allows recording and analysis of pupillary sleepiness-related oscillations in darkness for 11 min in the sitting subject. The parameter of evaluation is the Pupillary Unrest Index (PUI; mm/min). For statistical analysis the natural logarithm of this parameter is used (lnPUI). Thirty-four physicians were examined by the PST and subjective scales during the first half of the day. Data taken during a day work period (D) were compared to those taken directly after night duty (N) by a Wilcoxon signed rank test. Night duty caused a mean sleep reduction of 3 h (difference N−D: median 3 h, minimum 0 h, maximum 7 h, p …).
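    For orientation, below is a hedged sketch of how a Pupillary Unrest Index can be computed from a pupil-diameter recording: cumulative absolute change of the segment-averaged diameter, normalized per minute of recording. The sampling rate, segment length, and simulated signal are assumptions and do not reproduce the exact PST processing chain.

```python
# Hedged sketch of a Pupillary Unrest Index (PUI) computation on simulated data.
import numpy as np

fs = 25.0                        # assumed sampling rate in Hz
duration_s = 11 * 60             # 11-minute recording, as in the PST
t = np.arange(0, duration_s, 1 / fs)

rng = np.random.default_rng(0)
# Simulated pupil diameter (mm): slow oscillations plus measurement noise.
diameter = 6.0 - 0.5 * np.sin(2 * np.pi * 0.05 * t) + rng.normal(0, 0.05, t.size)

# Average the signal in ~640 ms segments to suppress high-frequency noise.
seg = int(0.64 * fs)
n_seg = diameter.size // seg
means = diameter[: n_seg * seg].reshape(n_seg, seg).mean(axis=1)

pui = np.abs(np.diff(means)).sum() / (duration_s / 60.0)   # mm per minute
print(f"PUI = {pui:.2f} mm/min, ln(PUI) = {np.log(pui):.2f}")
```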

  1. Scanpath-based analysis of objects conspicuity in context of human vision physiology.

    Science.gov (United States)

    Augustyniak, Piotr

    2007-01-01

    This paper discusses principal aspects of object conspicuity investigated with the use of an eye tracker and interpreted against the background of human vision physiology. Proper management of object conspicuity is fundamental in several leading-edge applications in the information society, such as advertisement, web design, man-machine interfacing, and ergonomics. Although some common rules of human perception have been applied for centuries in art, interest in the human perception process is motivated today by the need to gather and maintain the recipient's attention by putting selected messages in front of the others. Our research uses the visual task methodology and series of progressively modified natural images. The modified details were characterized by their size, color, and position, while the scanpath-derived gaze points confirmed or not the act of perception. The statistical analysis yielded the probability of detail perception and correlations with the attributes. This probability conforms to the knowledge about retina anatomy and perception physiology, although we use only noninvasive methods.

  2. New methods of museum objects preservation

    International Nuclear Information System (INIS)

    Justa, P.

    1988-01-01

    The proceedings contain 22 papers, of which four discuss the use of ionizing radiation in the preservation and restoration of cultural objects, namely: radiation methods used for the impregnation of wooden cultural objects, a mobile irradiation robot and its uses for the preservation of museum objects, X-ray fluorescence analysis as an auxiliary scientific discipline for restorers, and the use of neutron activation analysis for the expertise of paintings. Some 230 participants attended the seminar. (J.B.)

  3. Urban Image Classification: Per-Pixel Classifiers, Sub-Pixel Analysis, Object-Based Image Analysis, and Geospatial Methods. 10; Chapter

    Science.gov (United States)

    Myint, Soe W.; Mesev, Victor; Quattrochi, Dale; Wentz, Elizabeth A.

    2013-01-01

    Remote sensing methods used to generate base maps to analyze the urban environment rely predominantly on digital sensor data from space-borne platforms. This is due in part to new sources of high spatial resolution data covering the globe, a variety of multispectral and multitemporal sources, sophisticated statistical and geospatial methods, and compatibility with GIS data sources and methods. The goal of this chapter is to review the four groups of classification methods for digital sensor data from space-borne platforms: per-pixel, sub-pixel, object-based (spatial-based), and geospatial methods. Per-pixel methods are widely used methods that classify pixels into distinct categories based solely on the spectral and ancillary information within that pixel. They are used for everything from simple calculations of environmental indices (e.g., NDVI) to sophisticated expert systems that assign urban land covers. Researchers recognize, however, that even with the smallest pixel size the spectral information within a pixel is really a combination of multiple urban surfaces. Sub-pixel classification methods therefore aim to statistically quantify the mixture of surfaces to improve overall classification accuracy. While within-pixel variations exist, there is also significant evidence that groups of nearby pixels have similar spectral information and therefore belong to the same classification category. Object-oriented methods have emerged that group pixels prior to classification based on spectral similarity and spatial proximity. Classification accuracy using object-based methods shows significant success and promise for numerous urban applications. Like the object-oriented methods that recognize the importance of spatial proximity, geospatial methods for urban mapping also utilize neighboring pixels in the classification process. The primary difference, though, is that geostatistical methods (e.g., spatial autocorrelation methods) are utilized during both the pre- and post-classification stages.

  4. Development of shearography for surface strain measurement of non planar objects

    International Nuclear Information System (INIS)

    Groves, Roger Michael

    2001-01-01

    The subject of this thesis is the development of optical instrumentation for surface strain measurement of non-planar objects. The speckle interferometry technique of shearography is used to perform quantitative measurements of surface strain on non-planar objects and to compensate these measurements for the errors that are due to the shape and slope of the object. Shearography is an optical technique that is usually used for defect location and for qualitative strain characterisation. In this thesis a multi-component shearography system is described that can measure the six components of displacement gradient. From these measurements the surface strain can be fully characterised. For non-planar objects an error is introduced into the displacement gradient measurement due to the variation of the sensitivity vector across the field of view and the variation in the magnitude of applied shear due to the curvature of the object surface. To correct for these errors requires a knowledge of the slope and shape of the object. Shearography may also be used to measure object slope and shape by a source displacement technique. Therefore slope, shape and surface strain may be measured using the same optical system. The thesis describes a method of multiplexing the shear direction using polarisation switching, a method of measuring the source position using shadow Moire and the shearography source displacement technique for measuring the surface slope and shape of objects. The multi-component shearography system is used to perform measurements of the six components of surface strain, on an industrial component, with a correction applied for errors due to the shape and slope of the object. (author)

  5. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.

  6. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    CERN Document Server

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared with respect to their performance and the distribution of their solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For our purposes, complex testing and analysis software was developed. Also, many conclusions and hypotheses have been formulated for further research.

  7. GRAIN-SIZE MEASUREMENTS OF FLUVIAL GRAVEL BARS USING OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Pedro Castro

    2018-01-01

    Traditional techniques for classifying the average grain size in gravel bars require manual measurements of each grain diameter. Aiming at productivity, more efficient methods have been developed by applying remote sensing techniques and digital image processing. This research proposes an Object-Based Image Analysis methodology to classify gravel bars in fluvial channels. First, the study evaluates the performance of the multiresolution segmentation algorithm (available in the software eCognition Developer) in performing shape recognition. A linear regression model was applied to assess the correlation between the gravels' reference delineation and the gravels recognized by the segmentation algorithm. Furthermore, the supervised classification was validated by comparing the results with field data using the t-statistic test and the kappa index. Afterwards, the grain size distribution in gravel bars along the upper Bananeiras River, Brazil, was mapped. The multiresolution segmentation results did not prove to be consistent for all the samples. Nonetheless, the P01 sample showed R² = 0.82 for the diameter estimation and R² = 0.45 for the recognition of the elliptical fit. The t-statistic showed no significant difference in the efficiencies of the grain size classifications by the field survey data and the object-based supervised classification (t = 2.133 for a significance level of 0.05). However, the kappa index was 0.54. The analysis of both the segmentation and classification results did not prove to be replicable.

  8. Short-time variations of the solar neutrino luminosity (Fourier analysis of the argon-37 production rate data)

    International Nuclear Information System (INIS)

    Haubold, H.J.; Gerth, E.

    1985-01-01

    We continue the Fourier analysis of the argon-37 production rate for runs 18-80 observed in Davis' well-known solar neutrino experiment. The method of Fourier analysis with the unequally spaced data of Davis and associates is described, and the discovered periods are compared with our recently published results from the analysis of runs 18-69 (Haubold and Gerth, 1983). The harmonic analysis of the data of runs 18-80 shows time variations of the solar neutrino flux with periods P = 8.33, 5.26, 2.13, 1.56, 0.83, 0.64, 0.54, and 0.50 years, respectively, which confirms our earlier computations.
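    One standard way to compute a periodogram for unequally spaced records of this kind is the Lomb-Scargle method; the sketch below uses scipy on a simulated series of run epochs and rates, with the period, noise level, and values all being assumptions rather than the actual argon-37 data.

```python
# Lomb-Scargle periodogram of an irregularly sampled, simulated rate series.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(5)
t = np.sort(rng.uniform(0, 20, 63))                   # observation epochs in years
rate = 0.4 + 0.15 * np.sin(2 * np.pi * t / 8.33) + rng.normal(0, 0.1, t.size)

periods = np.linspace(0.4, 12.0, 2000)                # trial periods in years
ang_freq = 2 * np.pi / periods                        # lombscargle expects angular freq.
power = lombscargle(t, rate - rate.mean(), ang_freq)

print("strongest period: %.2f years" % periods[power.argmax()])
```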

  9. "Life history space": a multivariate analysis of life history variation in extant and extinct Malagasy lemurs.

    Science.gov (United States)

    Catlett, Kierstin K; Schwartz, Gary T; Godfrey, Laurie R; Jungers, William L

    2010-07-01

    Studies of primate life history variation are constrained by the fact that all large-bodied extant primates are haplorhines. However, large-bodied strepsirrhines recently existed. If we can extract life history information from their skeletons, these species can contribute to our understanding of primate life history variation. This is particularly important in light of new critiques of the classic "fast-slow continuum" as a descriptor of variation in life history profiles across mammals in general. We use established dental histological methods to estimate gestation length and age at weaning for five extinct lemur species. On the basis of these estimates, we reconstruct minimum interbirth intervals and maximum reproductive rates. We utilize principal components analysis to create a multivariate "life history space" that captures the relationships among reproductive parameters and brain and body size in extinct and extant lemurs. Our data show that, whereas large-bodied extinct lemurs can be described as "slow" in some fashion, they also varied greatly in their life history profiles. Those with relatively large brains also weaned their offspring late and had long interbirth intervals. These were not the largest of extinct lemurs. Thus, we distinguish size-related life history variation from variation that linked more strongly to ecological factors. Because all lemur species larger than 10 kg, regardless of life history profile, succumbed to extinction after humans arrived in Madagascar, we argue that large body size increased the probability of extinction independently of reproductive rate. We also provide some evidence that, among lemurs, brain size predicts reproductive rate better than body size. (c) 2010 Wiley-Liss, Inc.

  10. A cross-sectional analysis of variation in charges and prices across California for percutaneous coronary intervention.

    Directory of Open Access Journals (Sweden)

    Renee Y Hsia

    Full Text Available Though past studies have shown wide variation in aggregate hospital price indices and specific procedures, few have documented or explained such variation for distinct and common episodes of care.We sought to examine the variability in charges for percutaneous coronary intervention (PCI with a drug-eluting stent and without major complications (MS-DRG-247, and determine whether hospital and market characteristics influenced these charges.We conducted a cross-sectional analysis of adults admitted to California hospitals in 2011 for MS-DRG-247 using patient discharge data from the California Office of Statewide Health Planning and Development. We used a two-part linear regression model to first estimate hospital-specific charges adjusted for patient characteristics, and then examine whether the between-hospital variation in those estimated charges was explained by hospital and market characteristics.Adjusted charges for the average California patient admitted for uncomplicated PCI ranged from $22,047 to $165,386 (median: $88,350 depending on which hospital the patient visited. Hospitals in areas with the highest cost of living, those in rural areas, and those with more Medicare patients had higher charges, while government-owned hospitals charged less. Overall, our model explained 43% of the variation in adjusted charges. Estimated discounted prices paid by private insurers ranged from $3,421 to $80,903 (median: $28,571.Charges and estimated discounted prices vary widely between hospitals for the average California patient undergoing PCI without major complications, a common and relatively homogeneous episode of care. Though observable hospital characteristics account for some of this variation, the majority remains unexplained.

  11. Temporal correlation in the Goldberg variations

    OpenAIRE

    Chestopal, Victor

    2010-01-01

    An interpreter of the Goldberg Variations is almost completely deprived of such utterly important guidance as the composer's tempo markings, which are as rare in the Goldberg Variations as they are in the other works of Bach. The final goal of my study is to suggest a logical foundation, upon which an interpreter of the Goldberg Variations can make his/her choice of tempi. Upon the analysis of opus's structure, which reveals an impressive panorama of symmetries, I suggest a multilevel system ...

  12. Concept Maps as Instructional Tools for Improving Learning of Phase Transitions in Object-Oriented Analysis and Design

    Science.gov (United States)

    Shin, Shin-Shing

    2016-01-01

    Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…

  13. Empirical study on mutual fund objective classification.

    Science.gov (United States)

    Jin, Xue-jun; Yang, Xiao-lan

    2004-05-01

    Mutual funds are usually classified on the basis of their objectives. If the activities of mutual funds are consistent with their stated objectives, investors may look at the latter as signals of their risks and incomes. This work analyzes mutual fund objective classification in China by the statistical methods of distance analysis and discriminant analysis, and examines whether the stated investment objectives of mutual funds adequately represent their attributes to investors. That is, if mutual funds adhere to their stated objectives, attributes must be heterogeneous between investment objective groups and homogeneous within them. Our conclusion is that, to some degree, the group of optimized exponential funds is heterogeneous with respect to the other groups. As a whole, there exist no significant differences between different objective groups, and 50% of mutual funds are not consistent with their objective groups.

  14. Overlapping constraint for variational surface reconstruction

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Solem, J.E.

    2005-01-01

    In this paper a counter example, illustrating a shortcoming in most variational formulations for 3D surface estimation, is presented. The nature of this shortcoming is a lack of an overlapping constraint. A remedy for this shortcoming is presented in the form of a penalty function, with an analysis of the effects of this function on surface motion. For practical purposes, this will only have minor influence on current methods. However, the insight provided in the analysis is likely to influence future developments in the field of variational surface reconstruction.

  15. Variational Bayesian Learning for Wavelet Independent Component Analysis

    Science.gov (United States)

    Roussos, E.; Roberts, S.; Daubechies, I.

    2005-11-01

    In an exploratory approach to data analysis, it is often useful to consider the observations as generated from a set of latent generators or "sources" via a generally unknown mapping. For the noisy overcomplete case, where we have more sources than observations, the problem becomes extremely ill-posed. Solutions to such inverse problems can, in many cases, be achieved by incorporating prior knowledge about the problem, captured in the form of constraints. This setting is a natural candidate for the application of the Bayesian methodology, allowing us to incorporate "soft" constraints in a natural manner. The work described in this paper is mainly driven by problems in functional magnetic resonance imaging of the brain, for the neuro-scientific goal of extracting relevant "maps" from the data. This can be stated as a `blind' source separation problem. Recent experiments in the field of neuroscience show that these maps are sparse, in some appropriate sense. The separation problem can be solved by independent component analysis (ICA), viewed as a technique for seeking sparse components, assuming appropriate distributions for the sources. We derive a hybrid wavelet-ICA model, transforming the signals into a domain where the modeling assumption of sparsity of the coefficients with respect to a dictionary is natural. We follow a graphical modeling formalism, viewing ICA as a probabilistic generative model. We use hierarchical source and mixing models and apply Bayesian inference to the problem. This allows us to perform model selection in order to infer the complexity of the representation, as well as automatic denoising. Since exact inference and learning in such a model is intractable, we follow a variational Bayesian mean-field approach in the conjugate-exponential family of distributions, for efficient unsupervised learning in multi-dimensional settings. The performance of the proposed algorithm is demonstrated on some representative experiments.

  16. Variation and diversity in Homo erectus: a 3D geometric morphometric analysis of the temporal bone.

    Science.gov (United States)

    Terhune, Claire E; Kimbel, William H; Lockwood, Charles A

    2007-07-01

    Although the level of taxonomic diversity within the fossil hominin species Homo erectus (sensu lato) is continually debated, there have been relatively few studies aiming to quantify the morphology of this species. Instead, most researchers have relied on qualitative descriptions or the evaluation of nonmetric characters, which in many cases display continuous variation. Also, only a few studies have used quantitative data to formally test hypotheses regarding the taxonomic composition of the "erectus" hypodigm. Despite these previous analyses, however, and perhaps in part due to these varied approaches for assessing variation within specimens typically referred to H. erectus (sensu lato) and the general lack of rigorous statistical testing of how variation within this taxon is partitioned, there is currently little consensus regarding whether this group is a single species, or whether it should instead be split into separate temporal or geographically delimited taxa. In order to evaluate possible explanations for variation within H. erectus, we tested the general hypothesis that variation within the temporal bone morphology of H. erectus is consistent with that of a single species, using great apes and humans as comparative taxa. Eighteen three-dimensional (3D) landmarks of the temporal bone were digitized on a total of 520 extant and fossil hominid crania. Landmarks were registered by Generalized Procrustes Analysis, and Procrustes distances were calculated for comparisons of individuals within and between the extant taxa. Distances between fossil specimens and between a priori groupings of fossils were then compared to the distances calculated within the extant taxa to assess the variation within the H. erectus sample relative to that of known species, subspecies, and populations. Results of these analyses indicate that shape variation within the entire H. erectus sample is generally higher than extant hominid intraspecific variation, and putative H. ergaster
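
    A minimal illustration of the landmark comparison used above, assuming two hypothetical 18 x 3 landmark configurations; scipy's procrustes superimposes the pair and returns a disparity that plays the role of a squared Procrustes distance, whereas the study itself applies Generalized Procrustes Analysis over all specimens jointly.

      # Sketch: pairwise Procrustes comparison of two temporal-bone landmark sets.
      # The landmark coordinates here are random placeholders, not real specimens.
      import numpy as np
      from scipy.spatial import procrustes

      rng = np.random.default_rng(2)
      specimen_a = rng.normal(size=(18, 3))                             # 18 3D landmarks
      specimen_b = specimen_a + rng.normal(scale=0.05, size=(18, 3))    # slightly deformed copy

      mtx1, mtx2, disparity = procrustes(specimen_a, specimen_b)
      print(f"Procrustes disparity between specimens: {disparity:.4f}")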

  17. Target Selection Models with Preference Variation Between Offenders

    NARCIS (Netherlands)

    Townsley, Michael; Birks, Daniel; Ruiter, Stijn; Bernasco, Wim; White, Gentry

    2016-01-01

    Objectives: This study explores preference variation in location choice strategies of residential burglars. Applying a model of offender target selection that is grounded in assertions of the routine activity approach, rational choice perspective, crime pattern and social disorganization theories,

  18. Application of Archimedean copulas to the analysis of drought decadal variation in China

    Science.gov (United States)

    Zuo, Dongdong; Feng, Guolin; Zhang, Zengping; Hou, Wei

    2017-12-01

    Based on daily precipitation data collected from 1171 stations in China during 1961-2015, the monthly standardized precipitation index was derived and used to extract two major drought characteristics, namely drought duration and severity. Next, a bivariate joint model was established based on the marginal distributions of the two variables and Archimedean copula functions. The joint probability and return period were calculated to analyze the drought characteristics and decadal variation. According to the fit analysis, the Gumbel-Hougaard copula provided the best fit to the observed data. Based on four drought duration classifications and four severity classifications, the drought events were divided into 16 drought types according to the different combinations of duration and severity classifications, and the probability and return period were analyzed for different drought types. The results showed that the probability of occurrence of six common drought types accounted for 76% of the total probability of all types. Moreover, due to their greater variation, two drought types were particularly notable, i.e., the drought types where D ≥ 6 and S ≥ 2. Analyzing the joint probability in different decades indicated that the location of the drought center had a distinctive stage feature, which cycled from north to northeast to southwest during 1961-2015. However, southwest, north, and northeast China had a higher drought risk. In addition, the drought situation in southwest China should be noted because the joint probability values, return period, and the analysis of trends in the drought duration and severity all indicated a considerable risk in recent years.
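
    A worked sketch of the joint-probability step described above: the Gumbel-Hougaard copula is C(u, v) = exp(-[(-ln u)^θ + (-ln v)^θ]^(1/θ)), and the joint exceedance probability follows from P(D ≥ d, S ≥ s) = 1 - F_D(d) - F_S(s) + C(F_D(d), F_S(s)). The marginal probabilities, the dependence parameter θ and the mean inter-arrival time below are placeholder values, not fitted results.

      # Sketch: joint exceedance probability and return period from a Gumbel-Hougaard copula.
      # Marginal CDF values and the copula parameter theta are illustrative placeholders.
      import math

      def gumbel_copula(u, v, theta):
          """Gumbel-Hougaard copula C(u, v) for theta >= 1."""
          return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

      u = 0.90          # F_D(d): marginal non-exceedance probability of drought duration d
      v = 0.85          # F_S(s): marginal non-exceedance probability of drought severity s
      theta = 2.0       # hypothetical dependence parameter fitted to duration/severity pairs

      p_joint_exceed = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(D >= d and S >= s)
      mean_interarrival = 0.5                                     # assumed years between drought events
      return_period = mean_interarrival / p_joint_exceed

      print(f"P(D>=d, S>=s) = {p_joint_exceed:.3f}, joint return period = {return_period:.1f} years")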

  19. Applications of RIGAKU Dmax Rapid II micro-X-ray diffractometer in the analysis of archaeological metal objects

    Science.gov (United States)

    Mozgai, Viktória; Szabó, Máté; Bajnóczi, Bernadett; Weiszburg, Tamás G.; Fórizs, István; Mráv, Zsolt; Tóth, Mária

    2017-04-01

    During the material analysis of archaeological metal objects, especially their inlays or corrosion products, not only the microstructure and chemical composition but also the mineralogical composition needs to be determined. X-ray powder diffraction (XRD) is a widely-used method to specify the mineralogical composition. However, when sampling is not allowed or is only allowed to a limited extent, due to e.g. the high value of the object, conventional XRD analysis can hardly be used. Laboratory micro-XRD instruments provide good alternatives, like the RIGAKU Dmax Rapid II micro-X-ray diffractometer, which is a unique combination of a MicroMax-003 third generation microfocus, sealed tube X-ray generator and a curved 'image plate' detector. With this instrument it is possible to measure an area as small as 10 µm in diameter on the object. Here we present case studies for the application of the micro-XRD technique in the study of archaeological metal objects. In the first case the niello inlay of a Late Roman silver augur staff was analysed. Due to the high value of the object, since it is the only piece known from the Roman Empire, only non-destructive analyses were allowed. To reconstruct the preparation of the niello, SEM-EDX analysis was performed on the niello inlays to characterise their chemical composition and microstructure. Two types of niello are present: a homogeneous, silver sulphide niello (acanthite) and an inhomogeneous silver-copper sulphide niello (exsolution of acanthite and jalpaite or jalpaite and stromeyerite). The micro-X-ray diffractometer was used to verify the mineralogical composition of the niello suggested by the SEM results. In the second case the corrosion products of a Late Roman copper cauldron with uncertain provenance were examined, since they may hold clues about the burial conditions (pH, Eh, etc.) of the object. A layer-by-layer analysis was performed in cross sections of small metal samples by using an electron microprobe and the micro-X-ray diffractometer. The results

  20. Quasi-objects, Cult Objects and Fashion Objects

    DEFF Research Database (Denmark)

    Andersen, Bjørn Schiermer

    2011-01-01

    This article attempts to rehabilitate the concept of fetishism and to contribute to the debate on the social role of objects as well as to fashion theory. Extrapolating from Michel Serres’ theory of the quasi-objects, I distinguish two phenomenologies possessing almost opposite characteristics. These two phenomenologies are, so I argue, essential to quasi-object theory, yet largely ignored by Serres’ sociological interpreters. They correspond with the two different theories of fetishism found in Marx and Durkheim, respectively. In the second half of the article, I introduce the fashion object as a unique opportunity for studying the interchange between these two forms of fetishism and their respective phenomenologies. Finally, returning to Serres, I briefly consider the theoretical consequences of introducing the fashion object as a quasi-object.

  1. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    Science.gov (United States)

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  2. Variational analysis for simulating free-surface flows in a porous medium

    Directory of Open Access Journals (Sweden)

    Shabbir Ahmed

    2003-01-01

    is used to obtain a discrete form of equations for a two-dimensional domain. The matrix characteristics and the stability criteria have been investigated to develop a stable numerical algorithm for solving the governing equation. A computer programme has been written to solve a symmetric positive definite system obtained from the variational finite element analysis. The system of equations is solved using the conjugate gradient method. The solution generates time-varying hydraulic heads in the subsurface. The interfacing free surface between the unsaturated and saturated zones in the variably saturated domain is located, based on the computed hydraulic heads. Example problems are investigated. The finite element solutions are compared with the exact solutions for the example problems. The numerical characteristics of the finite element solution method are also investigated using the example problems.
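
    A small sketch of the final linear-algebra step described above, assuming the variational finite element assembly has already produced a symmetric positive definite matrix and a right-hand-side vector; scipy's conjugate gradient routine stands in for the programme mentioned in the abstract, and the toy matrix is a placeholder for the assembled system.

      # Sketch: solving the symmetric positive definite FEM system A h = b
      # with the conjugate gradient method. A and b are toy placeholders.
      import numpy as np
      from scipy.sparse import diags
      from scipy.sparse.linalg import cg

      n = 50
      # 1D Laplacian-like SPD matrix standing in for the assembled FEM stiffness matrix.
      A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
      b = np.ones(n)                      # placeholder load / boundary contribution vector

      heads, info = cg(A, b, atol=1e-10)  # info == 0 means the iteration converged
      assert info == 0
      print("max hydraulic head in the toy problem:", heads.max())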

  3. Association analysis between spatiotemporal variation of vegetation greenness and precipitation/temperature in the Yangtze River Basin (China).

    Science.gov (United States)

    Cui, Lifang; Wang, Lunche; Singh, Ramesh P; Lai, Zhongping; Jiang, Liangliang; Yao, Rui

    2018-05-23

    The variation in vegetation greenness provides a good basis for the sustainable management and monitoring of land surface ecosystems. The present paper discusses the spatial-temporal changes in vegetation and their controlling factors in the Yangtze River Basin (YRB) using the Global Inventory Modeling and Mapping Studies (GIMMS) Normalized Difference Vegetation Index (NDVI) for the period 2001-2013. Theil-Sen Median trend analysis, Pearson correlation coefficients, and residual analysis have been used; the results show a decreasing trend of the annual mean NDVI over the whole YRB. Spatially, the regions with significant decreasing trends were mainly located in parts of the central YRB, and pronounced increasing trends were observed in parts of the eastern and western YRB. The mean NDVI during the spring and summer seasons increased, while it decreased during the autumn and winter seasons. The seasonal mean NDVI shows spatial heterogeneity due to the vegetation types. The correlation analysis shows a positive relation between NDVI and temperature over most of the YRB, whereas NDVI and precipitation show a negative correlation. The residual analysis shows an increase in NDVI in parts of the eastern and western YRB and a decrease in NDVI in a small part of the Yangtze River Delta (YRD) and the mid-western YRB due to human activities. In general, climate factors were the principal drivers of NDVI variation in the YRB in recent years.
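
    A brief sketch of the two statistical steps named above on made-up annual series: scipy's theilslopes gives the Theil-Sen median trend and pearsonr the NDVI-climate correlation; the NDVI and temperature values are synthetic placeholders rather than GIMMS data.

      # Sketch: Theil-Sen trend of an annual NDVI series and its correlation with temperature.
      import numpy as np
      from scipy.stats import theilslopes, pearsonr

      years = np.arange(2001, 2014)
      ndvi = 0.62 - 0.002 * (years - 2001) + np.random.default_rng(3).normal(0, 0.01, years.size)
      temperature = 15.0 + 0.03 * (years - 2001)

      slope, intercept, lo, hi = theilslopes(ndvi, years)
      r, p_value = pearsonr(ndvi, temperature)

      print(f"Theil-Sen NDVI trend: {slope:.4f} per year (95% CI {lo:.4f} to {hi:.4f})")
      print(f"NDVI-temperature correlation: r = {r:.2f}, p = {p_value:.3f}")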

  4. FEM analysis of impact of external objects to pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Gracie, Robert; Konuk, Ibrahim [Geological Survey of Canada, Ottawa, ON (Canada)]. E-mail: ikonuk@NRCan.gc.ca; Fredj, Abdelfettah [BMT Fleet Technology Limited, Ottawa, ON (Canada)

    2003-07-01

    One of the most common hazards to pipelines is the impact of external objects. Earth-moving machinery, farm equipment, or bullets can dent or cause failure of land pipelines. External objects such as anchors, fishing gear, or ice can damage offshore pipelines. This paper develops an FEM model to simulate the impact process and presents investigations using the FEM model to determine the influence of the geometry and velocity of the impacting object, as well as the influence of pipe diameter, wall thickness, and concrete thickness, along with internal pressure. The FEM model is developed using the LS-DYNA explicit FEM software, utilizing shell and solid elements. The model allows damage and removal of the concrete and corrosion coating elements during impact. Parametric studies will be presented relating the dent size to pipe diameter, wall thickness and concrete thickness, internal pipe pressure, and impacting object geometry. The primary objective of this paper is to develop and present the FEM model. The model can be applied to both offshore and land pipeline problems. Some examples are used to illustrate how the model can be applied to real life problems. A future paper will present more detailed parametric studies. (author)

  5. From Nonradiating Sources to Directionally Invisible Objects

    Science.gov (United States)

    Hurwitz, Elisa

    The goal of this dissertation is to extend the understanding of invisible objects, in particular nonradiating sources and directional nonscattering scatterers. First, variations of null-field nonradiating sources are derived from Maxwell's equations. Next, it is shown how to design a nonscattering scatterer by applying the boundary conditions for nonradiating sources to the scalar wave equation, referred to here as the "field cloak method". This technique is used to demonstrate directionally invisible scatterers for an incident field with one direction of incidence, and the influence of symmetry on the directionality is explored. This technique, when applied to the scalar wave equation, is extended to show that a directionally invisible object may be invisible for multiple directions of incidence simultaneously. This opens the door to the creation of optically switchable, directionally invisible objects which could be implemented in couplers and other novel optical devices. Next, a version of the "field cloak method" is extended to the Maxwell's electro-magnetic vector equations, allowing more flexibility in the variety of directionally invisible objects that can be designed. This thesis concludes with examples of such objects and future applications.

  6. from synchronic variation to a grammaticalization path

    African Journals Online (AJOL)

    Kate H

    Abstract. The authors argue that the synchronic variation of cognate objects of weather verbs exhibited in six African languages of South Africa (Sepedi, Sesotho, Tshivenda, isiXhosa, Xitsonga, and isiZulu) has a diachronic explanation, and may be represented as a grammaticalization path. This path gradually leads from ...

  7. Multi-objective optimization and exergoeconomic analysis of a combined cooling, heating and power based compressed air energy storage system

    International Nuclear Information System (INIS)

    Yao, Erren; Wang, Huanran; Wang, Ligang; Xi, Guang; Maréchal, François

    2017-01-01

    Highlights: • A novel tri-generation based compressed air energy storage system. • Trade-off between efficiency and cost to highlight the best compromise solution. • Components with largest irreversibility and potential improvements highlighted. - Abstract: Compressed air energy storage technologies can improve the supply capacity and stability of the electricity grid, particularly when fluctuating renewable energies are massively connected. Meanwhile, incorporating combined cooling, heating and power systems into compressed air energy storage could achieve stable operation as well as efficient energy utilization. In this paper, a novel combined cooling, heating and power based compressed air energy storage system is proposed. The system combines a gas engine, supplemental heat exchangers and an ammonia-water absorption refrigeration system. The design trade-off between the thermodynamic and economic objectives, i.e., the overall exergy efficiency and the total specific cost of product, is investigated by an evolutionary multi-objective algorithm for the proposed combined system. It is found that, with an increase in the exergy efficiency, the total product unit cost is less affected in the beginning, while it rises substantially afterwards. The best trade-off solution is selected with an overall exergy efficiency of 53.04% and a total product unit cost of 20.54 cent/kWh, respectively. The variation of the decision variables with the exergy efficiency indicates that the compressor, the turbine and the heat exchanger preheating the inlet air of the turbine are the key equipment for cost-effectively pursuing a higher exergy efficiency. It is also revealed by an exergoeconomic analysis that, for the best trade-off solution, the investment costs of the compressor and the two heat exchangers recovering compression heat and heating up compressed air for expansion should be reduced (particularly the latter), while the thermodynamic performance of the gas engine needs to be improved
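
    A small illustration of the efficiency-cost trade-off described above, assuming a table of candidate designs with exergy efficiency (to be maximized) and specific product cost (to be minimized) has already been produced by the optimizer; the sketch only extracts the non-dominated (Pareto) designs, and the candidate values are placeholders.

      # Sketch: extracting the Pareto front from candidate (exergy efficiency, unit cost) designs.
      # A real study would obtain the candidates from NSGA-II or a similar evolutionary algorithm.
      import numpy as np

      rng = np.random.default_rng(4)
      efficiency = rng.uniform(0.40, 0.60, size=200)                            # overall exergy efficiency
      cost = 15.0 + 60.0 * (efficiency - 0.40) ** 2 + rng.normal(0, 0.5, 200)   # cent/kWh, placeholder

      def is_pareto(eff, cst):
          """A design is Pareto-optimal if no other design has higher efficiency and lower cost."""
          flags = np.ones(eff.size, dtype=bool)
          for i in range(eff.size):
              dominates_i = (eff >= eff[i]) & (cst <= cst[i]) & ((eff > eff[i]) | (cst < cst[i]))
              if dominates_i.any():
                  flags[i] = False
          return flags

      front = is_pareto(efficiency, cost)
      print(f"{front.sum()} non-dominated designs out of {front.size}")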

  8. Population genetic variation in the tree fern Alsophila spinulosa (Cyatheaceae): effects of reproductive strategy.

    Science.gov (United States)

    Wang, Ting; Su, Yingjuan; Li, Yuan

    2012-01-01

    Essentially all ferns can perform both sexual and asexual reproduction. Their populations represent suitable study objects to test the population genetic effects of different reproductive systems. Using the diploid homosporous fern Alsophila spinulosa as an example species, the main purpose of this study was to assess the relative impact of sexual and asexual reproduction on the level and structure of population genetic variation. Inter-simple sequence repeats analysis was conducted on 140 individuals collected from seven populations (HSG, LCH, BPC, MPG, GX, LD, and ZHG) in China. Seventy-four polymorphic bands discriminated a total of 127 multilocus genotypes. Character compatibility analysis revealed that 50.0 to 70.0% of the genotypes had to be deleted in order to obtain a tree-like structure in the data set from populations HSG, LCH, MPG, BPC, GX, and LD; and there was a gradual decrease of conflict in the data set when genotypes with the highest incompatibility counts were successively deleted. In contrast, in population ZHG, only 33.3% of genotypes had to be removed to achieve complete compatibility in the data set, which showed a sharp decline in incompatibility upon the deletion of those genotypes. All populations examined possessed similar levels of genetic variation. Population ZHG was not found to be more differentiated than the other populations. Sexual recombination is the predominant source of genetic variation in most of the examined populations of A. spinulosa. However, somatic mutation contributes most to the genetic variation in population ZHG. This change of the primary mode of reproduction does not cause a significant difference in the population genetic composition. Character compatibility analysis represents an effective approach to separate the role of sexual and asexual components in shaping the genetic pattern of fern populations.

  9. Population genetic variation in the tree fern Alsophila spinulosa (Cyatheaceae): effects of reproductive strategy.

    Directory of Open Access Journals (Sweden)

    Ting Wang

    Full Text Available BACKGROUND: Essentially all ferns can perform both sexual and asexual reproduction. Their populations represent suitable study objects to test the population genetic effects of different reproductive systems. Using the diploid homosporous fern Alsophila spinulosa as an example species, the main purpose of this study was to assess the relative impact of sexual and asexual reproduction on the level and structure of population genetic variation. METHODOLOGY/PRINCIPAL FINDINGS: Inter-simple sequence repeats analysis was conducted on 140 individuals collected from seven populations (HSG, LCH, BPC, MPG, GX, LD, and ZHG) in China. Seventy-four polymorphic bands discriminated a total of 127 multilocus genotypes. Character compatibility analysis revealed that 50.0 to 70.0% of the genotypes had to be deleted in order to obtain a tree-like structure in the data set from populations HSG, LCH, MPG, BPC, GX, and LD; and there was a gradual decrease of conflict in the data set when genotypes with the highest incompatibility counts were successively deleted. In contrast, in population ZHG, only 33.3% of genotypes had to be removed to achieve complete compatibility in the data set, which showed a sharp decline in incompatibility upon the deletion of those genotypes. All populations examined possessed similar levels of genetic variation. Population ZHG was not found to be more differentiated than the other populations. CONCLUSIONS/SIGNIFICANCE: Sexual recombination is the predominant source of genetic variation in most of the examined populations of A. spinulosa. However, somatic mutation contributes most to the genetic variation in population ZHG. This change of the primary mode of reproduction does not cause a significant difference in the population genetic composition. Character compatibility analysis represents an effective approach to separate the role of sexual and asexual components in shaping the genetic pattern of fern populations.

  10. Intellectual capital: approaches to analysis as an object of the internal environment of an economic entity

    Directory of Open Access Journals (Sweden)

    O. E. Ustinova

    2017-01-01

    Full Text Available Intellectual capital is of strategic importance for a modern company. At the same time, its effective management, including a stimulating and creative approach to solving problems, helps to increase the competitiveness and development of economic entities. The article considers intellectual capital as an object of analysis of the internal environment and, in the context of the proposed approaches to its study, also considers its impact on the development of the company. Intellectual capital has a special significance and influence on internal processes, since the intellectual component allows a positive synergetic effect to be achieved from the interaction of different objects. In more detail, it is proposed to consider it in terms of the position the company occupies on the market, the principles of its activities, the formation of marketing policies, the use of resources, the methods and means of making managerial decisions, and the organizational culture that has been formed. For the analysis of the state of the internal environment, the main approaches in which intellectual capital is considered are proposed, among them: methods for analyzing cash flows, the economic efficiency and financial feasibility of a project, analysis of the consolidated financial flow by group of objects, assessment of the potential of the business entity, the technology for choosing an investment policy, and the technology for selecting incentive mechanisms. In this regard, it is advisable to analyze the company's internal environment from the standpoint of how its state is influenced by intellectual capital. A scheme for the interaction of intellectual capital with the objects of assessment of the internal environment of the economic entity is offered. The results of this study should be considered as initial data for the further development of the economic evaluation of the influence of intellectual capital on the competitiveness of companies.

  11. A REGION-BASED MULTI-SCALE APPROACH FOR OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    T. Kavzoglu

    2016-06-01

    Full Text Available Within the last two decades, object-based image analysis (OBIA), considering objects (i.e. groups of pixels) instead of pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter is crucially important for increasing the classification accuracy, which depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened Quickbird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). Comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
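
    A schematic sketch of the local variance / rate-of-change idea used above, assuming a list of local variance (LV) values has already been computed for a series of increasing scale parameters; peaks in the rate of change (RoC) then suggest candidate fine, moderate and coarse scales. The LV values here are placeholders, not ESP-2 output.

      # Sketch: rate-of-change (RoC) curve from local variance (LV) values across scale parameters.
      import numpy as np

      scales = np.arange(10, 210, 10)                  # candidate scale parameters
      lv = np.array([12.0, 14.5, 16.0, 18.5, 19.0, 19.3, 21.8, 22.0, 22.1, 22.2,
                     24.9, 25.0, 25.1, 25.2, 25.3, 27.8, 27.9, 28.0, 28.1, 28.2])

      # RoC(l) = (LV_l - LV_{l-1}) / LV_{l-1} * 100, defined from the second scale onward.
      roc = (lv[1:] - lv[:-1]) / lv[:-1] * 100.0

      # Local peaks in the RoC curve indicate candidate scales (fine, moderate, coarse).
      peaks = [scales[i + 1] for i in range(1, roc.size - 1)
               if roc[i] > roc[i - 1] and roc[i] > roc[i + 1]]
      print("candidate scale parameters:", peaks)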

  12. Green mathematics: Benefits of including biological variation in your data analysis

    NARCIS (Netherlands)

    Tijskens, L.M.M.; Schouten, R.E.; Unuk, T.; Simcic, M.

    2015-01-01

    Biological variation is omnipresent in nature. It contains useful information that is neglected by the usually applied statistical procedures. To extract this information special procedures have to be applied. Biological variation is seen in properties (e.g. size, colour, firmness), but the

  13. Size variation in Middle Pleistocene humans.

    Science.gov (United States)

    Arsuaga, J L; Carretero, J M; Lorenzo, C; Gracia, A; Martínez, I; Bermúdez de Castro, J M; Carbonell, E

    1997-08-22

    It has been suggested that European Middle Pleistocene humans, Neandertals, and prehistoric modern humans had a greater sexual dimorphism than modern humans. Analysis of body size variation and cranial capacity variation in the large sample from the Sima de los Huesos site in Spain showed instead that the sexual dimorphism is comparable in Middle Pleistocene and modern populations.

  14. School variation in asthma: compositional or contextual?

    Directory of Open Access Journals (Sweden)

    Tracy K Richmond

    2009-12-01

    Full Text Available Childhood asthma prevalence and morbidity have been shown to vary by neighborhood. Less is known about between-school variation in asthma prevalence and whether it exists beyond what one might expect due to students at higher risk of asthma clustering within different schools. Our objective was to determine whether between-school variation in asthma prevalence exists and if so, if it is related to the differential distribution of individual risk factors for and correlates of asthma or to contextual influences of schools. Cross-sectional analysis of 16,640 teens in grades 7-12 in Wave 1 (data collected in 1994-5) of the National Longitudinal Study of Adolescent Health. Outcome was current diagnosis of asthma as reported by respondents' parents. Two-level random effects models were used to assess the contribution of schools to the variance in asthma prevalence before and after controlling for individual attributes. The highest quartile schools had mean asthma prevalence of 21.9% compared to the lowest quartile schools with mean asthma prevalence of 7.1%. In our null model, the school contributed significantly to the variance in asthma (σ²u0 = 0.27, CI: 0.20-0.35). Controlling for individual, school and neighborhood attributes reduced the between-school variance modestly (σ²u0 = 0.19, CI: 0.13-0.29). Significant between-school variation in current asthma prevalence exists even after controlling for the individual, school and neighborhood factors. This provides evidence for school level contextual influences on asthma. Further research is needed to determine potential mechanisms through which schools may influence asthma outcomes.
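
    A small worked example of how the reported between-school variance can be read as an intraclass correlation for a two-level logistic model, using the latent-response convention that the level-1 variance is fixed at π²/3; the formula choice is an assumption rather than something stated in the abstract, and the variances reuse the estimates quoted above.

      # Sketch: intraclass correlation (ICC) from the between-school variance of a
      # two-level random-intercept logistic model, on the latent-response scale
      # where the within-school variance is pi^2 / 3 (an assumed convention).
      import math

      level1_variance = math.pi ** 2 / 3          # ~3.29, latent-scale residual variance

      for label, sigma2_u0 in [("null model", 0.27), ("adjusted model", 0.19)]:
          icc = sigma2_u0 / (sigma2_u0 + level1_variance)
          print(f"{label}: between-school variance = {sigma2_u0:.2f}, ICC = {icc:.3f}")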

  15. Variational principles for nonpotential operators

    CERN Document Server

    Filippov, V M

    1989-01-01

    This book develops a variational method for solving linear equations with B-symmetric and B-positive operators and generalizes the method to nonlinear equations with nonpotential operators. The author carries out a constructive extension of the variational method to "nonvariational" equations (including parabolic equations) in classes of functionals which differ from the Euler-Lagrange functionals. In this connection, some new function spaces are considered. Intended for mathematicians working in the areas of functional analysis and differential equations, this book would also prove useful for researchers in other areas and students in advanced courses who use variational methods in solving linear and nonlinear boundary value problems in continuum mechanics and theoretical physics.

  16. Analysis of art objects by means of ion beam induced luminescence

    International Nuclear Information System (INIS)

    Quaranta, A; Dran, J C; Salomon, J; Pivin, J C; Vomiero, A; Tonezzer, M; Maggioni, G; Carturan, S; Mea, G Della

    2006-01-01

    The impact of energetic ions on solid samples gives rise to the emission of visible light owing to the electronic excitation of intrinsic defects or extrinsic impurities. The intensity and position of the emission features provide information on the nature of the luminescence centers and on their chemical environments. This makes ion beam induced luminescence (IBIL) a useful complement to other ion beam analyses, like PIXE, in the cultural heritage field in characterizing the composition and the provenience of art objects. In the present paper, IBIL measurements have been performed on inorganic pigments to underline the complementary role played by IBIL in the analysis of artistic works. Some blue and red pigments are presented as case studies

  17. Prevalence of anatomical variations in maxillary sinus using cone beam computed tomography

    Directory of Open Access Journals (Sweden)

    Deepjyoti K Mudgade

    2018-01-01

    Full Text Available Introduction: The maxillary sinuses (MS) are of particular importance to the dentist because of their close proximity to the teeth and their associated structures, so an increased risk of maxillary sinusitis has been reported with periapical abscess, periodontal diseases, dental trauma, tooth extraction, and implant placement. Complications of the MS are related to its anatomic and pathologic variations. Thus, a study was conducted to assess the prevalence of anatomic variations in the MS by using cone-beam computed tomography (CBCT). Aims and Objectives: To determine different anatomical variations in the MS by using CBCT. Materials and Methods: CBCT scans of 150 subjects aged between 18 and 70 years were collected and analyzed for MS anatomical variations. Statistical Analysis: The distributions of age, sex, reasons for CBCT, and dimensions of the sinus were calculated using descriptive statistics, and the distribution of other anatomic findings using the Chi-square test. Results: The prevalence of an obstructed ostium is 23.3% and of septa is 66.7%. The average height, width, and antero-posterior (A-P) dimensions of the right MS are 34.13 mm, 26.09 mm, and 37.39 mm, and those of the left MS are 33.24 mm, 26.11 mm, and 37.72 mm, respectively. The average distance between the lower border of the ostium and the sinus floor is 32.17 mm in the right MS and 32.69 mm in the left. The average diameter of the ostium is 1.88 mm in the right MS and 1.67 mm in the left. Conclusion: The study highlights the importance of accurate assessment of the MS and its variations in order to properly differentiate pathologic lesions from anatomic variations, avoiding unnecessary surgical explorations.

  18. Static analysis of unbounded structures in object-oriented programs

    NARCIS (Netherlands)

    Grabe, Immo

    2012-01-01

    In this thesis we investigate different techniques and formalisms to address complexity introduced by unbounded structures in object-oriented programs. We give a representation of a weakest precondition calculus for abstract object creation in dynamic logic. Based on this calculus we define symbolic

  19. Genetic variation between ecotypic populations of Chloris ...

    African Journals Online (AJOL)

    Genetic variation between ecotypic populations of Chloris roxburghiana grass detected through RAPD analysis. ... frequency indicated that the four populations of C. roxburghiana were genetically distinct, probably as a result of variation in soil fertility, geographical isolation and socio-ecological history of the study sites.

  20. Programs as Data Objects

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the Second Symposium on Programs as Data Objects, PADO 2001, held in Aarhus, Denmark, in May 2001. The 14 revised full papers presented were carefully reviewed and selected from 30 submissions. Various aspects of looking at programs as data objects are covered from the point of view of program analysis, program transformation, computational complexity, etc.

  1. Can state-of-the-art HVS-based objective image quality criteria be used for image reconstruction techniques based on ROI analysis?

    Science.gov (United States)

    Dostal, P.; Krasula, L.; Klima, M.

    2012-06-01

    Various image processing techniques in multimedia technology are optimized using the visual attention feature of the human visual system. Spatial non-uniformity means that different locations in an image are of different importance in terms of perception of the image. In other words, the perceived image quality depends mainly on the quality of important locations known as regions of interest. The performance of such techniques is measured by subjective evaluation or objective image quality criteria. Many state-of-the-art objective metrics are based on HVS properties: SSIM and MS-SSIM are based on image structural information, VIF is based on the information that the human brain can ideally gain from the reference image, and FSIM utilizes low-level features to assign a different importance to each location in the image. But still none of these objective metrics utilizes the analysis of regions of interest. We address the question of whether these objective metrics can be used for effective evaluation of images reconstructed by processing techniques based on ROI analysis utilizing high-level features. In this paper the authors show that the state-of-the-art objective metrics do not correlate well with subjective evaluation when demosaicing based on ROI analysis is used for reconstruction. The ROI were computed from "ground truth" visual attention data. An algorithm combining two known demosaicing techniques on the basis of ROI location is proposed to reconstruct the ROI in fine quality while the rest of the image is reconstructed with low quality. The color image reconstructed by this ROI approach was compared with selected demosaicing techniques by objective criteria and subjective testing. The qualitative comparison of the objective and subjective results indicates that the state-of-the-art objective metrics are still not suitable for evaluating image processing techniques based on ROI analysis and that new criteria are needed.
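
    A brief sketch of one way to make a structural metric respect regions of interest, assuming a binary ROI mask for the reference image is available: SSIM is computed as a full map (scikit-image) and then averaged separately inside and outside the mask. This illustrates the paper's point rather than reproducing the authors' evaluation protocol.

      # Sketch: ROI-weighted SSIM, assuming scikit-image is available.
      # Images and the ROI mask are synthetic placeholders.
      import numpy as np
      from skimage.metrics import structural_similarity

      rng = np.random.default_rng(5)
      reference = rng.random((128, 128))
      distorted = np.clip(reference + rng.normal(0, 0.05, reference.shape), 0, 1)

      roi_mask = np.zeros_like(reference, dtype=bool)
      roi_mask[32:96, 32:96] = True                      # hypothetical region of interest

      score, ssim_map = structural_similarity(reference, distorted, data_range=1.0, full=True)
      ssim_roi = ssim_map[roi_mask].mean()
      ssim_background = ssim_map[~roi_mask].mean()

      print(f"global SSIM {score:.3f}, ROI SSIM {ssim_roi:.3f}, background SSIM {ssim_background:.3f}")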

  2. Detection of somaclonal variation in micropropagated Hibiscus ...

    African Journals Online (AJOL)

    The main objective of micropropagation is to produce clones i.e. plants which are phenotypically and genetically identical to the mother plants. The culture of organized meristems usually guarantees the production of true-to-type plants but variations in the progenies have been widely reported. Hibiscus sabdariffa L. plants ...

  3. Design analysis of vertical wind turbine with airfoil variation

    Science.gov (United States)

    Maulana, Muhammad Ilham; Qaedy, T. Masykur Al; Nawawi, Muhammad

    2016-03-01

    With an ever increasing electrical energy crisis occurring in Banda Aceh City, it is important to investigate alternative methods of generating power that differ from fossil fuels. In fact, one of the biggest sources of energy in Aceh is wind energy. It can be harnessed not only by big corporations but also by individuals using Vertical Axis Wind Turbines (VAWT). This paper presents a three-dimensional CFD analysis of the influence of airfoil design on the performance of a Darrieus-type vertical-axis wind turbine (VAWT). The main objective of this paper is to develop an airfoil design for a NACA 63-series vertical axis wind turbine, for an average wind velocity of 2.5 m/s. To utilize both lift and drag force, several airfoil designs are analyzed using a commercial computational fluid dynamics solver such as Fluent. Simulations are performed for these airfoils at different angles of attack ranging over -12°, -8°, -4°, 0°, 4°, 8°, and 12°. The analysis showed that the most significant enhancement in the value of the lift coefficient among the NACA 63-series airfoils occurred for NACA 63-412.

  4. Exploring Variation in Glycemic Control Across and Within Eight High-Income Countries

    DEFF Research Database (Denmark)

    Charalampopoulos, Dimitrios; Hermann, Julia M; Svensson, Jannet

    2018-01-01

    OBJECTIVE: International studies on childhood type 1 diabetes (T1D) have focused on whole-country mean HbA1c levels, thereby concealing potential variations within countries. We aimed to explore the variations in HbA1c across and within eight high-income countries to best inform international ben...

  5. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-08-01

    Full Text Available Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (> 20 echoes/m2) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure FWF ALS echoes are segmented by a seeded region growing procedure. All echoes sorted descending by their surface roughness are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data of three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original
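
    A minimal sketch of the segment-statistics step described above, assuming each echo already carries a segment label and per-echo features such as amplitude and echo width; the features are aggregated per segment into the mean, standard deviation and coefficient of variation used by the rule base. The values are synthetic placeholders.

      # Sketch: aggregating per-echo FWF features into per-segment statistics.
      import numpy as np
      import pandas as pd

      rng = np.random.default_rng(6)
      echoes = pd.DataFrame({
          "segment_id": rng.integers(0, 5, size=1000),
          "amplitude": rng.gamma(shape=2.0, scale=10.0, size=1000),
          "echo_width": rng.normal(4.0, 0.8, size=1000),
      })

      grouped = echoes.groupby("segment_id")
      means = grouped.mean()
      stds = grouped.std()
      cv = stds / means                                   # coefficient of variation per segment

      summary = means.join(stds, lsuffix="_mean", rsuffix="_std").join(cv.add_suffix("_cv"))
      print(summary.round(3))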

  6. Circadian variation of urinary albumin excretion in pregnancy

    NARCIS (Netherlands)

    Douma, C. E.; van der Post, J. A.; van Acker, B. A.; Boer, K.; Koopman, M. G.

    1995-01-01

    OBJECTIVE: The hypothesis was tested that circadian variations in urinary albumin excretion of pregnant women in the third trimester of normal pregnancy are different from nonpregnant individuals. DESIGN: Circadian variability in urinary albumin excretion was studied both in pregnant women and in

  7. Multi-objective optimization of a cascade refrigeration system: Exergetic, economic, environmental, and inherent safety analysis

    International Nuclear Information System (INIS)

    Eini, Saeed; Shahhosseini, Hamidreza; Delgarm, Navid; Lee, Moonyong; Bahadori, Alireza

    2016-01-01

    Highlights: • A multi-objective optimization is performed for a cascade refrigeration cycle. • The optimization problem considers inherently safe design as well as 3E analysis. • As a measure of inherent safety level a quantitative risk analysis is utilized. • A CO2/NH3 cascade refrigeration system is compared with a CO2/C3H8 system. - Abstract: Inherently safer design is the new approach to maximizing the overall safety of a process plant. This approach suggests some risk reduction strategies to be implemented in the early stages of design. In this paper a multi-objective optimization was performed considering economic, exergetic, and environmental aspects besides an evaluation of the inherent safety level of a cascade refrigeration system. The capital costs, the processing costs, and the social cost due to CO2 emission were included in the economic objective function. The exergetic efficiency of the plant was considered as the second objective function. As a measure of inherent safety level, a Quantitative Risk Assessment (QRA) was performed to calculate the total risk level of the cascade as the third objective function. Two cases (ammonia and propane) were considered and compared as the refrigerant of the high temperature circuit. The optimum solutions obtained from the multi-objective optimization process were given as a Pareto frontier. The ultimate optimal solution from the available solutions on the Pareto optimal curve was selected using decision-making approaches. The NSGA-II algorithm was used to obtain the Pareto optimal frontiers, and three decision-making approaches (TOPSIS, LINMAP, and Shannon’s entropy methods) were utilized to select the final optimum point. Considering continuous material release from the major equipment in the plant, flash and jet fire scenarios were considered for the CO2/C3H8 cycle and toxic hazards were considered for the CO2/NH3 cycle. The results showed no significant differences between CO2/NH3 and
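
    A compact sketch of the TOPSIS step mentioned above, applied to a hypothetical Pareto set with three criteria (exergy efficiency to maximize, product cost and risk to minimize); the criterion values and weights are placeholders, and LINMAP or Shannon-entropy weighting would replace the assumed weights in a fuller treatment.

      # Sketch: TOPSIS ranking of Pareto-optimal designs with placeholder criteria values.
      import numpy as np

      # rows = candidate designs; columns = [exergy efficiency, unit cost, total risk]
      designs = np.array([
          [0.53, 20.5, 1.0e-4],
          [0.50, 18.9, 1.2e-4],
          [0.56, 24.0, 0.9e-4],
      ])
      benefit = np.array([True, False, False])          # maximize efficiency, minimize cost and risk
      weights = np.array([0.4, 0.4, 0.2])               # assumed criterion weights

      norm = designs / np.linalg.norm(designs, axis=0)  # vector normalization per criterion
      weighted = norm * weights

      ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
      anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

      d_plus = np.linalg.norm(weighted - ideal, axis=1)
      d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
      closeness = d_minus / (d_plus + d_minus)

      print("TOPSIS closeness:", np.round(closeness, 3), "best design index:", int(closeness.argmax()))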

  8. Computed tomography angiography study of variations of the celiac trunk and hepatic artery in 100 patients

    Directory of Open Access Journals (Sweden)

    Ivelise Regina Canito Brasil

    Full Text Available Abstract Objective: To describe the main anatomical variations of the celiac trunk and the hepatic artery at their origins. Materials and Methods: This was a prospective analysis of 100 consecutive computed tomography angiography studies of the abdomen performed during a one-year period. The findings were stratified according to classification systems devised by Sureka et al. and Michels. Results: The celiac trunk was "normal" (i.e., the hepatogastrosplenic trunk and superior mesenteric artery originating separately from the abdominal aorta) in 43 patients. In our sample, we identified four types of variations of the celiac trunk. Regarding the hepatic artery, a normal anatomical pattern (i.e., the proper hepatic artery being a continuation of the common hepatic artery and bifurcating into the right and left hepatic arteries) was seen in 82 patients. We observed six types of variations of the hepatic artery. Conclusion: We found rates of variations of the hepatic artery that are different from those reported in the literature. Our findings underscore the need for proper knowledge and awareness of these anatomical variations, which can facilitate their recognition and inform decisions regarding the planning of surgical procedures, in order to avoid iatrogenic intraoperative injuries, which could lead to complications.

  9. Computed tomography angiography study of variations of the celiac trunk and hepatic artery in 100 patients

    Energy Technology Data Exchange (ETDEWEB)

    Brasil, Ivelise Regina Canito; Araujo, Igor Farias de; Lima, Adriana Augusta Lopes de Araujo; Melo, Ernesto Lima Araujo; Esmeraldo, Ronaldo de Matos, E-mail: igor_farias98@hotmail.com [Universidade Estadual do Ceará (UECE), Fortaleza, CE (Brazil). Escola de Medicina

    2018-01-15

    Objective: To describe the main anatomical variations of the celiac trunk and the hepatic artery at their origins. Materials and methods: This was a prospective analysis of 100 consecutive computed tomography angiography studies of the abdomen performed during a one-year period. The findings were stratified according to classification systems devised by Sureka et al. and Michels. Results: The celiac trunk was 'normal' (i.e., the hepatogastrosplenic trunk and superior mesenteric artery originating separately from the abdominal aorta) in 43 patients. In our sample, we identified four types of variations of the celiac trunk. Regarding the hepatic artery, a normal anatomical pattern (i.e., the proper hepatic artery being a continuation of the common hepatic artery and bifurcating into the right and left hepatic arteries) was seen in 82 patients. We observed six types of variations of the hepatic artery. Conclusion: We found rates of variations of the hepatic artery that are different from those reported in the literature. Our findings underscore the need for proper knowledge and awareness of these anatomical variations, which can facilitate their recognition and inform decisions regarding the planning of surgical procedures, in order to avoid iatrogenic intraoperative injuries, which could lead to complications. (author)

  10. Computed tomography angiography study of variations of the celiac trunk and hepatic artery in 100 patients

    International Nuclear Information System (INIS)

    Brasil, Ivelise Regina Canito; Araujo, Igor Farias de; Lima, Adriana Augusta Lopes de Araujo; Melo, Ernesto Lima Araujo; Esmeraldo, Ronaldo de Matos

    2018-01-01

    Objective: To describe the main anatomical variations of the celiac trunk and the hepatic artery at their origins. Materials and methods: This was a prospective analysis of 100 consecutive computed tomography angiography studies of the abdomen performed during a one-year period. The findings were stratified according to classification systems devised by Sureka et al. and Michels. Results: The celiac trunk was 'normal' (i.e., the hepatogastrosplenic trunk and superior mesenteric artery originating separately from the abdominal aorta) in 43 patients. In our sample, we identified four types of variations of the celiac trunk. Regarding the hepatic artery, a normal anatomical pattern (i.e., the proper hepatic artery being a continuation of the common hepatic artery and bifurcating into the right and left hepatic arteries) was seen in 82 patients. We observed six types of variations of the hepatic artery. Conclusion: We found rates of variations of the hepatic artery that are different from those reported in the literature. Our findings underscore the need for proper knowledge and awareness of these anatomical variations, which can facilitate their recognition and inform decisions regarding the planning of surgical procedures, in order to avoid iatrogenic intraoperative injuries, which could lead to complications. (author)

  11. Multidimensional analysis of Drosophila wing variation in Evolution ...

    Indian Academy of Sciences (India)

    In this study, using Drosophila melanogaster isofemale lines derived from wild flies collected on both slopes of the canyon, we investigated the effect of developmental temperature upon the different components of phenotypic variation of a complex trait: the wing. Combining geometric and traditional morphometrics, we find ...

  12. Robust Individual-Cell/Object Tracking via PCANet Deep Network in Biomedicine and Computer Vision

    Directory of Open Access Journals (Sweden)

    Bineng Zhong

    2016-01-01

    Full Text Available Tracking an individual cell/object over time is important in understanding drug treatment effects on cancer cells and in video surveillance. A fundamental problem of individual-cell/object tracking is to simultaneously address the cell/object appearance variations caused by intrinsic and extrinsic factors. In this paper, inspired by the architecture of deep learning, we propose a robust feature learning method for constructing discriminative appearance models without large-scale pretraining. Specifically, in the initial frames, an unsupervised method is first used to learn the abstract feature of a target by exploiting both classic principal component analysis (PCA) algorithms and recent deep learning representation architectures. We use the learned PCA eigenvectors as filters and develop a novel algorithm to represent a target by composing a PCA-based filter bank layer, a nonlinear layer, and a patch-based pooling layer, respectively. Then, based on the feature representation, a neural network with one hidden layer is trained in a supervised mode to construct a discriminative appearance model. Finally, to alleviate the tracker drifting problem, a sample update scheme is carefully designed to keep track of the most representative and diverse samples during tracking. We test the proposed tracking method on two standard individual cell/object tracking benchmarks to show our tracker's state-of-the-art performance.
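
    A small sketch of the PCA filter-bank idea summarized above, assuming grayscale patches around the target are available: local patches are vectorized, PCA eigenvectors are reshaped into convolution filters, and rectified filter responses form the feature maps that a pooling layer and classifier would consume. This simplifies the full pipeline, and the image and filter size are placeholders.

      # Sketch: learning PCA eigenvector filters from image patches and applying them as a filter bank.
      import numpy as np
      from numpy.lib.stride_tricks import sliding_window_view
      from scipy.signal import convolve2d
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(7)
      image = rng.random((64, 64))                      # placeholder for a target patch
      k = 7                                             # filter size

      # Collect all k x k patches, vectorize them, and remove the per-patch mean.
      patches = sliding_window_view(image, (k, k)).reshape(-1, k * k)
      patches = patches - patches.mean(axis=1, keepdims=True)

      pca = PCA(n_components=4)
      pca.fit(patches)
      filters = pca.components_.reshape(-1, k, k)       # PCA eigenvectors used as filters

      # Filter-bank layer followed by a simple nonlinearity (here: rectification).
      feature_maps = [np.maximum(convolve2d(image, f, mode="same"), 0.0) for f in filters]
      print("number of feature maps:", len(feature_maps), "map shape:", feature_maps[0].shape)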

  13. Neural regions supporting lexical processing of objects and actions: A case series analysis

    Directory of Open Access Journals (Sweden)

    Bonnie L Breining

    2014-04-01

    Full Text Available Introduction. Linking semantic representations to lexical items is an important cognitive process for both producing and comprehending language. Past research has suggested that the bilateral anterior temporal lobes are critical for this process (e.g. Patterson, Nestor, & Rogers, 2007). However, the majority of studies focused on object concepts alone, ignoring actions. The few that considered actions suggest that the temporal poles are not critical for their processing (e.g. Kemmerer et al., 2010). In this case series, we investigated the neural substrates of linking object and action concepts to lexical labels by correlating the volume of defined regions of interest with behavioral performance on picture-word verification and picture naming tasks of individuals with primary progressive aphasia (PPA). PPA is a neurodegenerative condition with heterogeneous neuropathological causes, characterized by increasing language deficits for at least two years in the face of relatively intact cognitive function in other domains (Gorno-Tempini et al., 2011). This population displays appropriate heterogeneity of performance and focal atrophy for investigating the neural substrates involved in lexical semantic processing of objects and actions. Method. Twenty-one individuals with PPA participated in behavioral assessment within six months of high resolution anatomical MRI scans. Behavioral assessments consisted of four tasks: picture-word verification and picture naming of objects and actions. Performance on these assessments was correlated with brain volume measured using atlas-based analysis in twenty regions of interest that are commonly atrophied in PPA and implicated in language processing. Results. Impaired performance for all four tasks significantly correlated with atrophy in the right superior temporal pole, left anterior middle temporal gyrus, and left fusiform gyrus. No regions were identified in which volume correlated with performance for both

  14. Seasonal variation in hemodialysis initiation: A single-center retrospective analysis.

    Directory of Open Access Journals (Sweden)

    Yujiro Maeoka

    Full Text Available The number of new dialysis patients has been increasing worldwide, particularly among elderly individuals. However, information on seasonal variation in hemodialysis initiation in recent decades is lacking, and the seasonal distribution of patients' conditions immediately prior to starting dialysis remains unclear. Having this information could help in developing a modifiable approach to improving pre-dialysis care. We retrospectively investigated the records of 297 patients who initiated hemodialysis at Hiroshima Prefectural Hospital from January 1st, 2009 to December 31st, 2013. Seasonal differences were assessed by χ2 or Kruskal-Wallis tests. Multiple comparison analysis was performed with the Steel test. The overall number of patients starting dialysis was greatest in winter (n = 85, 28.6%), followed by spring (n = 74, 24.9%), summer (n = 70, 23.6%), and autumn (n = 68, 22.9%), though the differences were not significant. However, there was a significant winter peak in dialysis initiation among patients aged ≥65 years, but not in those aged <65 years. Fluid overload assessed by clinicians was the most common uremic symptom among all patients, but a winter peak was only detected in patients aged ≥65 years. The body weight gain ratio showed a similar trend to fluid overload assessed by clinicians. Pulmonary edema was most pronounced in winter among patients aged ≥65 years compared with other seasons. The incidences of infection were modestly increased in summer and winter, but the differences were not statistically significant. Cardiac complications were similar in all seasons. This study demonstrated the existence of seasonal variation in dialysis initiation, with a winter peak among patients aged ≥65 years. The winter increment in dialysis initiation was mainly attributable to increased fluid overload. These findings suggest that elderly individuals should be monitored particularly closely during the winter.
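
    A minimal sketch of the seasonal comparison described above, with made-up per-season values for a continuous pre-dialysis variable (e.g., the body weight gain ratio): Kruskal-Wallis tests the overall seasonal difference, and a chi-square goodness-of-fit test compares the seasonal counts of initiations. The sample sizes mirror the seasonal counts quoted above, but the values themselves are synthetic.

      # Sketch: testing seasonal differences with Kruskal-Wallis and chi-square, on synthetic data.
      import numpy as np
      from scipy.stats import kruskal, chisquare

      rng = np.random.default_rng(8)
      winter = rng.normal(6.0, 1.5, 85)     # e.g. body weight gain ratio (%), placeholder values
      spring = rng.normal(5.2, 1.5, 74)
      summer = rng.normal(5.0, 1.5, 70)
      autumn = rng.normal(5.1, 1.5, 68)

      h_stat, p_kw = kruskal(winter, spring, summer, autumn)
      print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_kw:.3f}")

      # Chi-square goodness of fit for the seasonal counts of dialysis initiations.
      counts = np.array([85, 74, 70, 68])
      chi2, p_chi = chisquare(counts)
      print(f"chi-square on seasonal counts: chi2 = {chi2:.2f}, p = {p_chi:.3f}")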

  15. Objective-guided image annotation.

    Science.gov (United States)

    Mao, Qi; Tsang, Ivor Wai-Hung; Gao, Shenghua

    2013-04-01

    Automatic image annotation, which is usually formulated as a multi-label classification problem, is one of the major tools used to enhance the semantic understanding of web images. Many multimedia applications (e.g., tag-based image retrieval) can greatly benefit from image annotation. However, the insufficient performance of image annotation methods prevents these applications from being practical. On the other hand, specific measures are usually designed to evaluate how well one annotation method performs for a specific objective or application, but most image annotation methods do not consider optimization of these measures, so that they are inevitably trapped in suboptimal performance on these objective-specific measures. To address this issue, we first summarize a variety of objective-guided performance measures under a unified representation. Our analysis reveals that macro-averaging measures are very sensitive to infrequent keywords, and the Hamming measure is easily affected by skewed distributions. We then propose a unified multi-label learning framework, which directly optimizes a variety of objective-specific measures of multi-label learning tasks. Specifically, we first present a multilayer hierarchical structure of learning hypotheses for multi-label problems, based on which a variety of loss functions with respect to objective-guided measures are defined. We then formulate these loss functions as relaxed surrogate functions and optimize them by structural SVMs. According to the analysis of various measures and the high time complexity of optimizing micro-averaging measures, in this paper, we focus on example-based measures that are tailor-made for image annotation tasks but are seldom explored in the literature. Experiments show consistency with the formal analysis on two widely used multi-label datasets, and demonstrate the superior performance of our proposed method over state-of-the-art baseline methods in terms of example-based measures on four
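
    A short illustration of the sensitivity noted above: with one very infrequent keyword that is never predicted, the macro-averaged F1 drops sharply while the micro-averaged F1 barely moves. The labels are synthetic, and the measures come from scikit-learn rather than the authors' structural-SVM framework.

      # Sketch: macro- vs micro-averaged F1 on a synthetic multi-label problem with a rare keyword.
      import numpy as np
      from sklearn.metrics import f1_score

      rng = np.random.default_rng(9)
      n = 1000
      y_true = np.zeros((n, 3), dtype=int)
      y_true[:, 0] = rng.random(n) < 0.5        # two frequent keywords
      y_true[:, 1] = rng.random(n) < 0.4
      y_true[:5, 2] = 1                          # one very infrequent keyword (5 positives)

      y_pred = y_true.copy()
      y_pred[:, 2] = 0                           # the rare keyword is never predicted

      print("macro F1:", f1_score(y_true, y_pred, average="macro", zero_division=0))
      print("micro F1:", f1_score(y_true, y_pred, average="micro", zero_division=0))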

  16. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    Full Text Available According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA systems and three-stage iterative geographic object-oriented image analysis (GEOOIA systems, where GEOOIA/GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS image understanding system (RS-IUS, from sub-symbolic statistical model-based (inductive image segmentation to symbolic physical model-based (deductive image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a computational theory (system design; (b information/knowledge representation; (c algorithm design; and (d implementation. As proof-of-concept of symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™ is selected from existing literature. To the best of these authors’ knowledge, this is the first time a

  17. Selection problems and objectives in mutation breeding

    International Nuclear Information System (INIS)

    Mac Key, J.

    1984-01-01

    In plant breeding, major genes are preferably handled by inbreeding, back-crosses and selection through the family/pedigree method. Polygenic systems need gene accumulation, i.e. handling in bulk allowing natural/recurrent selection to operate. The two types of genetic control normally occur together irrespective of whether the variation is created by crossing or by mutagenesis. Cross-breeding can conveniently work with both types of variation and offers a range of genetic backgrounds. Problems are the often enormous recombination potential risking the break-down of already accomplished genic constellations or undesirable linkages. Mutation induction implies a scattered mono- to oligo-factorial variation mostly functioning as a negative load. As a result, it will be difficult and unrealistic to try to explore micromutations, as defined by Gaul, in vegetatively propagated and autogamous crop plants. Quantitative analyses have not been able to give guidance since the induced variation includes disturbed vitality and main or side-effects of events that are possible to define as macro-mutations. The possibility of better exhausting the variation induced will mainly depend on the precision in selection techniques, i.e. by dividing complex traits into their components, by improving environmental conditions for selection, and/or by sharpening the screening technique. Contrary to recombination breeding, mutation-induced variation does not fit a plan encompassing overall agronomic traits simultaneously. The progress has to go step by step. Thus, even more than in cross-breeding, it is important that accurately outlined objectives be set. Some characters, such as flower colour, can easily be defined while others, such as yield, may be more interdependent, calling for compromises difficult to foresee. The complexity of the latter category of traits is illustrated by the interaction pattern in relation to grain yield in cereals where both shoot and root are considered

  18. The effect of subdivision on variation at multi-allelic loci under balancing selection

    DEFF Research Database (Denmark)

    Schierup, M H; Vekemans, X; Charlesworth, D

    2000-01-01

    Simulations are used to investigate the expected pattern of variation at loci under different forms of multi-allelic balancing selection in a finite island model of a subdivided population. The objective is to evaluate the effect of restricted migration among demes on the distribution of polymorphism …

  19. GuidosToolbox: universal digital image object analysis

    Science.gov (United States)

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, userfriendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  20. National Variation in Urethroplasty Cost and Predictors of Extreme Cost: A Cost Analysis With Policy Implications.

    Science.gov (United States)

    Harris, Catherine R; Osterberg, E Charles; Sanford, Thomas; Alwaal, Amjad; Gaither, Thomas W; McAninch, Jack W; McCulloch, Charles E; Breyer, Benjamin N

    2016-08-01

    To determine which factors are associated with higher costs of the urethroplasty procedure and whether these factors have been increasing over time. Identification of determinants of extreme costs may help reduce cost while maintaining quality. We conducted a retrospective analysis using the 2001-2010 Healthcare Cost and Utilization Project-Nationwide Inpatient Sample (HCUP-NIS). The HCUP-NIS captures hospital charges, which we converted to cost using the HCUP cost-to-charge ratio. Log cost linear regression with sensitivity analysis was used to determine variables associated with increased costs. Extreme cost was defined as the top 20th percentile of expenditure, analyzed with logistic regression, and expressed as odds ratios (OR). A total of 2298 urethroplasties were recorded in NIS over the study period. The median (interquartile range) calculated cost was $7321 ($5677-$10,000). Patients with multiple comorbid conditions were associated with extreme costs [OR 1.56, 95% confidence interval (CI) 1.19-2.04, P = .02] compared with patients with no comorbid disease. Inpatient complications raised the odds of extreme costs (OR 3.2, CI 2.14-4.75), as did graft usage (OR 1.78, 95% CI 1.2-2.64, P = .005). Variations in patient age, race, hospital region, bed size, teaching status, payor type, and volume of urethroplasty cases were not associated with extremes of cost. Cost variation for perioperative inpatient urethroplasty procedures is dependent on preoperative patient comorbidities, postoperative complications, and surgical complexity related to graft usage. Procedural cost and cost variation are critical for understanding which aspects of care have the greatest impact on cost. Copyright © 2016 Elsevier Inc. All rights reserved.
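
    As a rough illustration of the two models named in the abstract (log-cost linear regression and logistic regression for the top-20th-percentile "extreme cost" group), the following sketch uses synthetic data and hypothetical column names; scikit-learn and pandas are assumed:

        # Illustrative sketch, not the study's analysis: OLS on log(cost) and
        # logistic regression for "extreme cost" (top 20th percentile of expenditure).
        import numpy as np
        import pandas as pd
        from sklearn.linear_model import LinearRegression, LogisticRegression

        rng = np.random.default_rng(1)
        df = pd.DataFrame({
            "cost": rng.lognormal(mean=8.9, sigma=0.5, size=500),
            "n_comorbidities": rng.integers(0, 5, size=500),
            "complication": rng.integers(0, 2, size=500),
            "graft_used": rng.integers(0, 2, size=500),
        })
        X = df[["n_comorbidities", "complication", "graft_used"]]

        # Log-cost linear regression
        ols = LinearRegression().fit(X, np.log(df["cost"]))
        print("log-cost coefficients:", dict(zip(X.columns, ols.coef_.round(3))))

        # Extreme cost = top 20th percentile of expenditure
        extreme = (df["cost"] >= df["cost"].quantile(0.80)).astype(int)
        logit = LogisticRegression(max_iter=1000).fit(X, extreme)
        print("odds ratios:", dict(zip(X.columns, np.exp(logit.coef_[0]).round(2))))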

  1. Duality for Multitime Multiobjective Ratio Variational Problems on First Order Jet Bundle

    Directory of Open Access Journals (Sweden)

    Mihai Postolache

    2012-01-01

    Full Text Available We consider a new class of multitime multiobjective variational problems of minimizing a vector of quotients of functionals of curvilinear integral type. Based on the efficiency conditions for multitime multiobjective ratio variational problems, we introduce a ratio dual of generalized Mond-Weir-Zalmai type, and under some assumptions of generalized convexity, duality theorems are stated. We prove our weak duality theorem for efficient solutions, showing that the value of the objective function of the primal cannot exceed the value of the dual. Direct and converse duality theorems are stated, underlying the connections between the values of the objective functions of the primal and dual programs. As special cases, duality results of Mond-Weir-Zalmai type for a multitime multiobjective variational problem are obtained. This work further develops our studies in Pitea and Postolache (2011).

  2. Approaches to defining «financial potential» concept as of economic analysis object

    Directory of Open Access Journals (Sweden)

    O.M. Dzyubenkо

    2017-12-01

    Full Text Available The research analyzes the works of scientists who have studied financial potential as an economic category. By analyzing their approaches to the concept of "financial potential", the author identifies six interpretations of its essence: the totality of the enterprise's financial resources; the sources of financing of the enterprise's economic activity; the development of the enterprise's economic activity; the enterprise's financial indicators; the system of enterprise financial management; and the enterprise's efficiency characteristics. It is established that financial potential is a multifaceted category that characterizes the financial and economic activity of enterprises. The author's definition of financial potential, in the context of its place among the objects of economic analysis, is proposed. It is established that financial potential is an object of enterprise management and is subject to analytical assessment to establish its state and directions of development.

  3. SECOND-ORDER VARIATIONAL ANALYSIS IN CONIC PROGRAMMING WITH APPLICATIONS TO OPTIMALITY AND STABILITY

    Czech Academy of Sciences Publication Activity Database

    Mordukhovich, B. S.; Outrata, Jiří; Ramírez, H. C.

    2015-01-01

    Roč. 25, č. 1 (2015), s. 76-101 ISSN 1052-6234 R&D Projects: GA ČR(CZ) GAP201/12/0671 Grant - others:Australian Research Council(AU) DP-110102011; USA National Science Foundation(US) DMS-1007132; Australian Reseach Council(AU) DP-12092508; Portuguese Foundation of Science and Technologies(PT) MAT/11109; FONDECYT Project(CL) 1110888; Universidad de Chile(CL) BASAL Project Centro de Modelamiento Matematico Institutional support: RVO:67985556 Keywords : variational analysis * second-order theory * conic programming * generalized differentiation * optimality conditions * isolated calmness * tilt stability Subject RIV: BA - General Mathematics Impact factor: 2.659, year: 2015 http://library.utia.cas.cz/separaty/2015/MTR/outrata-0439413.pdf

  4. Comparative Transcriptome Analysis of Chinary, Assamica and Cambod tea (Camellia sinensis) Types during Development and Seasonal Variation using RNA-seq Technology

    Science.gov (United States)

    Kumar, Ajay; Chawla, Vandna; Sharma, Eshita; Mahajan, Pallavi; Shankar, Ravi; Yadav, Sudesh Kumar

    2016-11-01

    Tea quality and yield is influenced by various factors including developmental tissue, seasonal variation and cultivar type. Here, the molecular basis of these factors was investigated in three tea cultivars namely, Him Sphurti (H), TV23 (T), and UPASI-9 (U) using RNA-seq. Seasonal variation in these cultivars was studied during active (A), mid-dormant (MD), dormant (D) and mid-active (MA) stages in two developmental tissues viz. young and old leaf. Development appears to affect gene expression more than the seasonal variation and cultivar types. Further, detailed transcript and metabolite profiling has identified genes such as F3‧H, F3‧5‧H, FLS, DFR, LAR, ANR and ANS of catechin biosynthesis, while MXMT, SAMS, TCS and XDH of caffeine biosynthesis/catabolism as key regulators during development and seasonal variation among three different tea cultivars. In addition, expression analysis of genes related to phytohormones such as ABA, GA, ethylene and auxin has suggested their role in developmental tissues during seasonal variation in tea cultivars. Moreover, differential expression of genes involved in histone and DNA modification further suggests role of epigenetic mechanism in coordinating global gene expression during developmental and seasonal variation in tea. Our findings provide insights into global transcriptional reprogramming associated with development and seasonal variation in tea.

  5. Weekday variation in triglyceride concentrations in 1.8 million blood samples

    DEFF Research Database (Denmark)

    Jaskolowski, Jörn; Ritz, Christian; Sjödin, Anders Mikael

    2017-01-01

    BACKGROUND: Triglyceride (TG) concentration is used as a marker of cardio-metabolic risk. However, diurnal and possibly weekday variation exists in TG concentrations. OBJECTIVE: To investigate weekday variation in TG concentrations among 1.8 million blood samples drawn between 2008 and 2015 from … Weekday variations in TG concentrations were recorded for out-patients between the ages of 9 and 26 years, with up to 20% higher values on Mondays compared to Fridays. Triglyceride concentrations were highest after the weekend and gradually declined during the week. We suggest that unhealthy …

  6. Nurse-surgeon object transfer: video analysis of communication and situation awareness in the operating theatre.

    Science.gov (United States)

    Korkiakangas, Terhi; Weldon, Sharon-Marie; Bezemer, Jeff; Kneebone, Roger

    2014-09-01

    One of the most central collaborative tasks during surgical operations is the passing of objects, including instruments. Little is known about how nurses and surgeons achieve this. The aim of the present study was to explore what factors affect this routine-like task, resulting in fast or slow transfer of objects. A qualitative video study, informed by an observational ethnographic approach, was conducted in a major teaching hospital in the UK. A total of 20 general surgical operations were observed. In total, approximately 68 h of video data have been reviewed. A subsample of 225 min has been analysed in detail using interactional video-analysis developed within the social sciences. Two factors affecting object transfer were observed: (1) relative instrument trolley position and (2) alignment. The scrub nurse's instrument trolley position (close to vs. further back from the surgeon) and alignment (gaze direction) impacts on the communication with the surgeon, and consequently, on the speed of object transfer. When the scrub nurse was standing close to the surgeon, and "converged" to follow the surgeon's movements, the transfer occurred more seamlessly and faster (1.0 s). The smoothness of object transfer can be improved by adjusting the scrub nurse's instrument trolley position, enabling a better monitoring of surgeon's bodily conduct and affording early orientation (awareness) to an upcoming request (changing situation). Object transfer is facilitated by the surgeon's embodied practices, which can elicit the nurse's attention to the request and, as a response, maximise a faster object transfer. A simple intervention to highlight the significance of these factors could improve communication in the operating theatre. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Object oriented approach to B reconstruction

    International Nuclear Information System (INIS)

    Katayama, N.

    1992-01-01

    The complexity of full and partial reconstruction of B mesons has led CLEO to use object-oriented techniques for physics data analyses. An object-oriented language for HEP data analysis was designed, and a compiler that translates the user code into C++ source code was written using the UNIX tools lex and yacc. The resulting C++ code can be linked and run in the normal CLEO data analysis system.

  8. Heart rate and blood pressure variations after transvascular patent ductus arteriosus occlusion in dogs.

    Science.gov (United States)

    De Monte, Valentina; Staffieri, Francesco; Caivano, Domenico; Nannarone, Sara; Birettoni, Francesco; Porciello, Francesco; Di Meo, Antonio; Bufalari, Antonello

    2017-08-01

    The objective of the study was to retrospectively analyse the cardiovascular effects that occur following the transvascular occlusion of patent ductus arteriosus in dogs. Sixteen anaesthesia records were included. Variables were recorded at the time of placing the arterial introducer, at occlusion of the ductus, and from 5 to 60 min thereafter, including, among others, heart rate and systolic, diastolic and mean arterial blood pressure. The maximal percentage variation of the aforementioned physiological parameters within 60 min of occlusion, compared with the values recorded at introducer placement, was calculated. The time at which maximal variation occurred was also computed. Correlations between the maximal percentage variation of the physiological parameters and the diameter of the ductus and the systolic and diastolic flow velocity through it were evaluated with linear regression analysis. Heart rate decreased after occlusion of the ductus with a mean maximal percentage variation of 41.0 ± 14.8% after 21.2 ± 13.7 min. Mean and diastolic arterial blood pressure increased after occlusion with mean maximal percentage variations of 30.6 ± 18.1 and 55.4 ± 27.1% after 19.6 ± 12.1 and 15.7 ± 10.8 min, respectively. Mean arterial blood pressure variation had a significant and moderate inverse correlation with diastolic and systolic flow velocity through the ductus. Transvascular patent ductus arteriosus occlusion in anaesthetised dogs causes a significant reduction in heart rate and an increase in diastolic and mean arterial blood pressure within 20 min of closure of the ductus. Copyright © 2017 Elsevier Ltd. All rights reserved.

  9. Geographic variation in health insurance benefit in Qianjiang District, China

    OpenAIRE

    Ye, Ting; Wu, Yue; Zhang, Liang

    2017-01-01

    Background: Health insurance coverage is of great importance; yet, it is unclear whether there is geographic variation in health insurance benefit for urban and rural patients covered by the same basic health insurance, especially in China. Objective: To identify the potential geographic variation in health insurance benefit and its possible socioeconomic and geographical factors at the town level. Methods: All the beneficiaries under the health insurance who had an in-hospital experience in...

  10. A simultaneous CONCOR algorithm for the analysis of two partitioned matrices

    NARCIS (Netherlands)

    Lafosse, R; Ten Berge, J.M.F.

    2006-01-01

    A standard approach to derive underlying components from two or more data matrices, holding data from the same individuals or objects, is the (generalized) canonical correlation analysis. This technique finds components (canonical variates) with maximal sums of correlations between them. The
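
    The (generalized) canonical correlation idea referred to here can be sketched as follows; the two data blocks are synthetic, and scikit-learn's CCA is used only as a stand-in for the CONCOR procedure itself:

        # Canonical correlation between two data blocks measured on the same objects.
        # Synthetic data; requires numpy and scikit-learn.
        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(0)
        n = 100
        latent = rng.normal(size=(n, 2))                        # shared structure
        X = latent @ rng.normal(size=(2, 5)) + 0.3 * rng.normal(size=(n, 5))
        Y = latent @ rng.normal(size=(2, 4)) + 0.3 * rng.normal(size=(n, 4))

        cca = CCA(n_components=2)
        Xc, Yc = cca.fit_transform(X, Y)
        for k in range(2):
            r = np.corrcoef(Xc[:, k], Yc[:, k])[0, 1]
            print(f"canonical correlation {k + 1}: {r:.3f}")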

  11. Interobserver delineation variation in lung tumour stereotactic body radiotherapy

    DEFF Research Database (Denmark)

    Persson, G. F.; Nygaard, D. E.; Hollensen, Christian

    2012-01-01

    The study assessed the interobserver delineation variation for stereotactic body radiotherapy (SBRT) of peripheral lung tumours using a cross-sectional study design. Methods: 22 consecutive patients with 26 tumours were included. Positron emission tomography/CT scans were acquired for planning of SBRT. Three oncologists and three … This cross-sectional analysis of delineation variation for peripheral lung tumours referred for SBRT establishes the evidence that interobserver variation is very small for these tumours.

  12. Automated Analysis of Flow Cytometry Data to Reduce Inter-Lab Variation in the Detection of Major Histocompatibility Complex Multimer-Binding T Cells

    DEFF Research Database (Denmark)

    Pedersen, Natasja Wulff; Chandran, P. Anoop; Qian, Yu

    2017-01-01

    Manual analysis of flow cytometry data and subjective gate-border decisions taken by individuals continue to be a source of variation in the assessment of antigen-specific T cells when comparing data across laboratories, and also over time in individual labs. Therefore, strategies to provide automated analysis of major histocompatibility complex (MHC) multimer-binding T cells represent an attractive solution to decrease subjectivity and technical variation. The challenge of using an automated analysis approach is that MHC multimer-binding T cell populations are often rare and therefore … laboratories. We used three different methods, FLOw Clustering without K (FLOCK), Scalable Weighted Iterative Flow-clustering Technique (SWIFT), and ReFlow, to analyze flow cytometry data files from 28 laboratories. Each laboratory screened for antigen-responsive T cell populations with frequencies ranging from 0 …

  13. Reduced object related negativity response indicates impaired auditory scene analysis in adults with autistic spectrum disorder

    Directory of Open Access Journals (Sweden)

    Veema Lodhia

    2014-02-01

    Full Text Available Auditory Scene Analysis provides a useful framework for understanding atypical auditory perception in autism. Specifically, a failure to segregate the incoming acoustic energy into distinct auditory objects might explain the aversive reaction autistic individuals have to certain auditory stimuli or environments. Previous research with non-autistic participants has demonstrated the presence of an Object Related Negativity (ORN) in the auditory event related potential that indexes pre-attentive processes associated with auditory scene analysis. Also evident is a later P400 component that is attention dependent and thought to be related to decision-making about auditory objects. We sought to determine whether there are differences between individuals with and without autism in the levels of processing indexed by these components. Electroencephalography (EEG) was used to measure brain responses from a group of 16 autistic adults, and 16 age- and verbal-IQ-matched typically-developing adults. Auditory responses were elicited using lateralized dichotic pitch stimuli in which inter-aural timing differences create the illusory perception of a pitch that is spatially separated from a carrier noise stimulus. As in previous studies, control participants produced an ORN in response to the pitch stimuli. However, this component was significantly reduced in the participants with autism. In contrast, processing differences were not observed between the groups at the attention-dependent level (P400). These findings suggest that autistic individuals have difficulty segregating auditory stimuli into distinct auditory objects, and that this difficulty arises at an early pre-attentive level of processing.

  14. Dual-band infrared capabilities for imaging buried object sites

    Energy Technology Data Exchange (ETDEWEB)

    Del Grande, N.K.; Durbin, P.F.; Gorvad, M.R.; Perkins, D.E.; Clark, G.A.; Hernandez, J.E.; Sherwood, R.J.

    1993-04-02

    We discuss dual-band infrared (DBIR) capabilities for imaging buried object sites. We identify physical features affecting the thermal contrast needed to distinguish buried object sites from undisturbed sites or surface clutter. Apart from atmospheric transmission and system performance, these features include: object size, shape, and burial depth; ambient soil, disturbed soil and object site thermal diffusivity differences; surface temperature, emissivity, plant-cover, slope, albedo and roughness variations; weather conditions and measurement times. We use good instrumentation to measure the time-varying temperature differences between buried object sites and undisturbed soil sites. We compare near surface soil temperature differences with radiometric infrared (IR) surface temperature differences recorded at 4.7 ± 0.4 µm and at 10.6 ± 1.0 µm. By producing selective DBIR image ratio maps, we distinguish temperature-difference patterns from surface emissivity effects. We discuss temperature differences between buried object sites, filled hole sites (without buried objects), cleared (undisturbed) soil sites, and grass-covered sites (with and without different types of surface clutter). We compare temperature, emissivity-ratio, visible and near-IR reflectance signatures of surface objects, leafy plants and sod. We discuss the physical aspects of environmental, surface and buried target features affecting interpretation of buried targets, surface objects and natural backgrounds.

  15. Genomic variation landscape of the human gut microbiome

    DEFF Research Database (Denmark)

    Schloissnig, Siegfried; Arumugam, Manimozhiyan; Sunagawa, Shinichi

    2013-01-01

    Whereas large-scale efforts have rapidly advanced the understanding and practical impact of human genomic variation, the practical impact of variation is largely unexplored in the human microbiome. We therefore developed a framework for metagenomic variation analysis and applied it to 252 faecal...... polymorphism rates of 0.11 was more variable between gut microbial species than across human hosts. Subjects sampled at varying time intervals exhibited individuality and temporal stability of SNP variation patterns, despite considerable composition changes of their gut microbiota. This indicates...

  16. Investigating the effects of climate variations on bacillary dysentery incidence in northeast China using ridge regression and hierarchical cluster analysis

    Directory of Open Access Journals (Sweden)

    Guo Junqiao

    2008-09-01

    Full Text Available Abstract Background The effects of climate variations on bacillary dysentery incidence have gained increasing attention in recent years. However, the multi-collinearity among meteorological factors affects the accuracy of their correlation with bacillary dysentery incidence. Methods As a remedy, a modified method combining ridge regression and hierarchical cluster analysis was proposed for investigating the effects of climate variations on bacillary dysentery incidence in northeast China. Results All weather indicators, temperatures, precipitation, evaporation and relative humidity showed a positive correlation with the monthly incidence of bacillary dysentery, while air pressure had a negative correlation with the incidence. Ridge regression and hierarchical cluster analysis showed that during 1987–1996, relative humidity, temperatures and air pressure affected the transmission of bacillary dysentery. During this period, all meteorological factors were divided into three categories: relative humidity and precipitation belonged to one class, temperature indexes and evaporation belonged to another class, and air pressure formed the third class. Conclusion Meteorological factors have affected the transmission of bacillary dysentery in northeast China. Bacillary dysentery prevention and control would benefit from giving more consideration to local climate variations.
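
    A minimal sketch of the combined approach, clustering correlated meteorological predictors hierarchically and then fitting a ridge regression of monthly incidence on them, is given below; the data and variable names are synthetic and scipy/scikit-learn are assumed:

        # Ridge regression under multi-collinearity plus hierarchical clustering of
        # the predictors, as a toy analogue of the method described above.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from sklearn.linear_model import Ridge

        rng = np.random.default_rng(0)
        months = 120
        temp = rng.normal(15, 8, months)
        humidity = 0.5 * temp + rng.normal(60, 5, months)      # correlated predictor
        pressure = -0.3 * temp + rng.normal(1010, 3, months)
        X = np.column_stack([temp, humidity, pressure])
        incidence = 2.0 * temp + 1.5 * humidity - 0.8 * pressure + rng.normal(0, 10, months)

        # Hierarchical clustering of predictors using a correlation-based distance
        corr = np.corrcoef(X, rowvar=False)
        dist = 1 - np.abs(corr)
        Z = linkage(dist[np.triu_indices(3, k=1)], method="average")
        print("variable clusters:", fcluster(Z, t=2, criterion="maxclust"))

        # Ridge regression stabilises coefficients despite the collinearity
        ridge = Ridge(alpha=1.0).fit(X, incidence)
        print("ridge coefficients:", ridge.coef_.round(2))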

  17. Testing and injury potential analysis of rollovers with narrow object impacts.

    Science.gov (United States)

    Meyer, Steven E; Forrest, Stephen; Herbst, Brian; Hayden, Joshua; Orton, Tia; Sances, Anthony; Kumaresan, Srirangam

    2004-01-01

    Recent statistics highlight the significant risk of serious and fatal injuries to occupants involved in rollover collisions due to excessive roof crush. The government has reported that in 2002, Sport Utility Vehicle rollover-related fatalities increased by 14% to more than 2400 annually, and 61% of all SUV fatalities included rollovers [1]. Rollover crashes rely primarily upon the roof structures to maintain occupant survival space. Frequently these crashes occur off the travel lanes of the roadway and, therefore, can include impacts with various types of narrow objects such as light poles, utility poles and/or trees. A test device and methodology are presented which facilitate dynamic, repeatable rollover impact evaluation of complete vehicle roof structures with such narrow objects. These tests allow for the incorporation of Anthropomorphic Test Dummies (ATDs) which can be instrumented to measure accelerations, forces and moments to evaluate injury potential. High-speed video permits detailed analysis of occupant kinematics and evaluation of injury causation. Criteria such as restraint performance, injury potential, survival space and the effect of roof crush associated with various types of design alternatives, countermeasures and impact circumstances can also be evaluated. In addition to presentation of the methodology, two representative vehicle crash tests are also reported. Results indicated that the reinforced roof structure significantly reduced the roof deformation compared to the production roof structure.

  18. Smart variations: Functional substructures for part compatibility

    KAUST Repository

    Zheng, Youyi

    2013-05-01

    As collections of 3D models continue to grow, reusing model parts allows generation of novel model variations. Naïvely swapping parts across models, however, leads to implausible results, especially when mixing parts across different model families. Hence, the user has to manually ensure that the final model remains functionally valid. We claim that certain symmetric functional arrangements (sFarr-s), which are special arrangements among symmetrically related substructures, bear close relation to object functions. Hence, we propose a purely geometric approach based on such substructures to match, replace, and position triplets of parts to create non-trivial, yet functionally plausible, model variations. We demonstrate that starting even from a small set of models such a simple geometric approach can produce a diverse set of non-trivial and plausible model variations. © 2013 The Author(s) Computer Graphics Forum © 2013 The Eurographics Association and Blackwell Publishing Ltd.

  19. Shapes, Proportions, and Variations in Breast Aesthetic Ideals: The Definition of Breast Beauty, Analysis, and Surgical Practice.

    Science.gov (United States)

    Mallucci, Patrick; Branford, Olivier Alexandre

    2015-10-01

    There are few objective analyses in the plastic surgical literature to define an aesthetically pleasing template for breast shape and proportion. The authors previously identified key objective parameters that define breast aesthetic ideals in 2 studies: an observational analysis of 100 models with natural breasts, and a population analysis with 1315 respondents. From these data a simple yet reproducible formula for surgical planning in breast augmentation has been developed to consistently achieve beautiful breasts, namely the ICE principle. This article proposes that this principle be used as the basis for design in aesthetic breast surgery. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Variational principles for locally variational forms

    International Nuclear Information System (INIS)

    Brajercik, J.; Krupka, D.

    2005-01-01

    We present the theory of higher order local variational principles in fibered manifolds, in which the fundamental global concept is a locally variational dynamical form. Any two Lepage forms defining a local variational principle for this form differ, on the intersection of their domains, by a variationally trivial form. In this sense, but in a different geometric setting, the local variational principles satisfy properties analogous to those of variational functionals of the Chern-Simons type. The resulting theory of extremals and symmetries extends the first order theories of the Lagrange-Souriau form, presented by Grigore and Popp, and closed equivalents of the first order Euler-Lagrange forms of Hakova and Krupkova. Conceptually, our approach differs from that of Prieto, who uses the Poincare-Cartan forms, which do not have higher order global analogues.

  1. Taxonomies of Educational Objective Domain

    OpenAIRE

    Eman Ghanem Nayef; Nik Rosila Nik Yaacob; Hairul Nizam Ismail

    2013-01-01

    This paper highlights an effort to study the educational objective domain taxonomies, including Bloom’s taxonomy, Lorin Anderson’s taxonomy, and Wilson’s taxonomy. In this study a comparison among these three taxonomies has been made. Results show that Bloom’s taxonomy is more suitable as an analysis tool for the educational objective domain.

  2. Factors influencing variation in dentist service rates.

    Science.gov (United States)

    Grembowski, D; Milgrom, P; Fiset, L

    1990-01-01

    In the previous article, we calculated dentist service rates for 200 general dentists based on a homogeneous, well-educated, upper-middle-class population of patients. Wide variations in the rates were detected. In this analysis, factors influencing variation in the rates were identified. Variation in rates for categories of dental services was explained by practice characteristics, patient exposure to fluoridated water supplies, and non-price competition in the dental market. Rates were greatest in large, busy practices in markets with high fees. Older practices consistently had lower rates across services. As a whole, these variables explained between 5 and 30 percent of the variation in the rates.

  3. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market of cham-cham, a traditional Indian dairy product, is expected in the coming years with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in sensory scores of this product using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant differences in the sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4% of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.
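
    The PCA step can be illustrated with a short sketch; the attribute names follow the abstract but the panel scores are invented, and scikit-learn is assumed:

        # PCA of sensory panel scores: explained variance per component and the
        # loadings of the first component. Synthetic data for illustration only.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        attributes = ["sweetness", "shape", "interior_dryness", "surface_dryness",
                      "rancid", "firmness"]
        scores = rng.normal(loc=6.0, scale=1.2, size=(40, len(attributes)))  # 40 samples

        pca = PCA(n_components=4)
        components = pca.fit_transform(StandardScaler().fit_transform(scores))
        print("explained variance ratio:", pca.explained_variance_ratio_.round(2))
        print("loadings of PC1:", dict(zip(attributes, pca.components_[0].round(2))))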

  4. Unexpectedly high genetic variation in large unisexual clumps of the subdioecious plant Honckenya peploides

    DEFF Research Database (Denmark)

    Sánchez-Vilas, Julia; Philipp, Marianne; Retuerto, Rubén

    2010-01-01

    Honckenya peploides is a subdioecious dune plant that reproduces both sexually and by clonal growth. In northwest Spain this species was found to exhibit an extreme spatial segregation of the sexes, and our objective was to investigate genetic variation in unisexual clumps. Genetic variation was ...

  5. Spectral analysis of pipe-to-soil potentials with variations of the Earth's magnetic field in the Australian region

    Science.gov (United States)

    Marshall, R. A.; Waters, C. L.; Sciffer, M. D.

    2010-05-01

    Long, steel pipelines used to transport essential resources such as gas and oil are potentially vulnerable to space weather. In order to inhibit corrosion, the pipelines are usually coated in an insulating material and maintained at a negative electric potential with respect to Earth using cathodic protection units. During periods of enhanced geomagnetic activity, potential differences between the pipeline and surrounding soil (referred to as pipe-to-soil potentials (PSPs)) may exhibit large voltage swings which place the pipeline outside the recommended "safe range" and at an increased risk of corrosion. The PSP variations result from the "geoelectric" field at the Earth's surface and associated geomagnetic field variations. Previous research investigating the relationship between the surface geoelectric field and geomagnetic source fields has focused on the high-latitude regions where line currents in the ionosphere E region are often the assumed source of the geomagnetic field variations. For the Australian region Sq currents also contribute to the geomagnetic field variations and provide the major contribution during geomagnetic quiet times. This paper presents the results of a spectral analysis of PSP measurements from four pipeline networks from the Australian region with geomagnetic field variations from nearby magnetometers. The pipeline networks extend from Queensland in the north of Australia to Tasmania in the south and provide PSP variations during both active and quiet geomagnetic conditions. The spectral analyses show both consistent phase and amplitude relationships across all pipelines, even for large separations between magnetometer and PSP sites and for small-amplitude signals. Comparison between the observational relationships and model predictions suggests a method for deriving a geoelectric field proxy suitable for indicating PSP-related space weather conditions.
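
    The kind of spectral comparison described, coherence and cross-phase between a pipe-to-soil potential record and a geomagnetic component, can be sketched as follows with synthetic signals and scipy:

        # Cross-spectral comparison of a synthetic geomagnetic component and a
        # synthetic PSP record: coherence and phase versus frequency. Requires scipy.
        import numpy as np
        from scipy import signal

        fs = 1.0 / 60.0                        # one sample per minute
        t = np.arange(0, 2 * 24 * 3600, 60)    # two days of data
        rng = np.random.default_rng(0)
        geomag = np.sin(2 * np.pi * t / (24 * 3600)) + 0.2 * rng.normal(size=t.size)
        # the geoelectric driver behaves roughly like dB/dt, hence the gradient
        psp = 0.8 * np.gradient(geomag, 60) * 3600 + 0.1 * rng.normal(size=t.size)

        f, coh = signal.coherence(geomag, psp, fs=fs, nperseg=512)
        f, pxy = signal.csd(geomag, psp, fs=fs, nperseg=512)
        phase = np.angle(pxy, deg=True)
        for fi, ci, ph in zip(f[1:5], coh[1:5], phase[1:5]):
            print(f"f = {fi * 1e3:.3f} mHz  coherence = {ci:.2f}  phase = {ph:6.1f} deg")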

  6. Multi-objective analysis of the conjunctive use of surface water and groundwater in a multisource water supply system

    Science.gov (United States)

    Vieira, João; da Conceição Cunha, Maria

    2017-04-01

    A multi-objective decision model has been developed to identify the Pareto-optimal set of management alternatives for the conjunctive use of surface water and groundwater of a multisource urban water supply system. A multi-objective evolutionary algorithm, Borg MOEA, is used to solve the multi-objective decision model. The multiple solutions can be shown to stakeholders, allowing them to choose their own solutions depending on their preferences. The multisource urban water supply system studied here is dependent on surface water and groundwater and located in the Algarve region, southernmost province of Portugal, with a typical warm Mediterranean climate. The rainfall is low, intermittent and concentrated in a short winter, followed by a long and dry period. A base population of 450 000 inhabitants and visits by more than 13 million tourists per year, mostly in summertime, make water management critical and challenging. Previous studies on single objective optimization after aggregating multiple objectives together have already concluded that only an integrated and interannual water resources management perspective can be efficient for water resource allocation in this drought prone region. A simulation model of the multisource urban water supply system, using mathematical functions to represent the water balance in the surface reservoirs, the groundwater flow in the aquifers, and the water transport in the distribution network with explicit representation of water quality, is coupled with Borg MOEA. The multi-objective problem formulation includes five objectives. Two objectives evaluate separately the water quantity and the water quality supplied for urban use over a finite time horizon, one objective calculates the operating costs, and two objectives appraise the state of the two water sources - the storage in the surface reservoir and the piezometric levels in the aquifer - at the end of the time horizon. The decision variables are the volume of withdrawals from …
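
    Independently of the specific solver (Borg MOEA here), any such formulation rests on Pareto dominance among candidate alternatives; a minimal numpy sketch of that dominance filter, with made-up objective values, is shown below:

        # Extract the Pareto-optimal set from a table of candidate alternatives
        # evaluated on several objectives (all minimised here; objectives to be
        # maximised can be negated first). Sketch only, pure numpy.
        import numpy as np

        def pareto_front(objs):
            """objs: array (n_alternatives, n_objectives), all objectives minimised."""
            n = objs.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                if not keep[i]:
                    continue
                # alternative i is dropped if some other alternative is no worse in
                # every objective and strictly better in at least one
                dominates_i = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
                if dominates_i.any():
                    keep[i] = False
            return np.where(keep)[0]

        # columns: operating cost, water-quality violation, negative final storage
        alternatives = np.array([[1.0, 0.2, -50.0],
                                 [1.2, 0.1, -55.0],
                                 [1.5, 0.3, -40.0]])
        print("Pareto-optimal alternatives:", pareto_front(alternatives))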

  7. Analysis of Scattering by Inhomogeneous Dielectric Objects Using Higher-Order Hierarchical MoM

    DEFF Research Database (Denmark)

    Kim, Oleksiy S.; Jørgensen, Erik; Meincke, Peter

    2003-01-01

    An efficient technique for the analysis of electromagnetic scattering by arbitrary shaped inhomogeneous dielectric objects is presented. The technique is based on a higher-order method of moments (MoM) solution of the volume integral equation. This higher-order MoM solution comprises recently … that the condition number of the resulting MoM matrix is reduced by several orders of magnitude in comparison to existing higher-order hierarchical basis functions and, consequently, an iterative solver can be applied even for high expansion orders. Numerical results demonstrate excellent agreement …

  8. SUPPORT VECTOR MACHINE CLASSIFICATION OF OBJECT-BASED DATA FOR CROP MAPPING, USING MULTI-TEMPORAL LANDSAT IMAGERY

    Directory of Open Access Journals (Sweden)

    R. Devadas

    2012-07-01

    Full Text Available Crop mapping and time series analysis of agronomic cycles are critical for monitoring land use and land management practices, and analysing the issues of agro-environmental impacts and climate change. Multi-temporal Landsat data can be used to analyse decadal changes in cropping patterns at field level, owing to its medium spatial resolution and historical availability. This study attempts to develop robust remote sensing techniques, applicable across a large geographic extent, for state-wide mapping of cropping history in Queensland, Australia. In this context, traditional pixel-based classification was analysed in comparison with image object-based classification using advanced supervised machine-learning algorithms such as the Support Vector Machine (SVM). For the Darling Downs region of southern Queensland we gathered a set of Landsat TM images from the 2010–2011 cropping season. Landsat data, along with the vegetation index images, were subjected to multiresolution segmentation to obtain polygon objects. Object-based methods enabled the analysis of aggregated sets of pixels, and exploited shape-related and textural variation, as well as spectral characteristics. SVM models were chosen after examining three shape-based parameters, twenty-three textural parameters and ten spectral parameters of the objects. We found that the object-based methods were superior to the pixel-based methods for classifying 4 major land use/land cover classes, considering the complexities of within-field spectral heterogeneity and spectral mixing. Comparative analysis clearly revealed that higher overall classification accuracy (95%) was observed in the object-based SVM compared with that of traditional pixel-based classification (89%) using the maximum likelihood classifier (MLC). Object-based classification also resulted in speckle-free images. Further, object-based SVM models were used to classify different broadacre crop types for summer and winter seasons. The influence of …
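
    A compact sketch of the object-based SVM classification step, with synthetic object-level features standing in for the spectral, shape and texture parameters mentioned above, might look like this (scikit-learn assumed):

        # Classify image objects (segments) into land-use/land-cover classes with an
        # SVM; the feature table and labels here are synthetic stand-ins.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(0)
        n_objects = 400
        features = rng.normal(size=(n_objects, 10))     # spectral/shape/texture stats
        labels = rng.integers(0, 4, size=n_objects)     # 4 land-use/land-cover classes
        features[:, 0] += labels                        # inject learnable signal (toy)

        X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3,
                                                  random_state=0, stratify=labels)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10, gamma="scale"))
        clf.fit(X_tr, y_tr)
        print("overall accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 2))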

  9. The impact of body mass index (BMI) variation on mortality of incident elderly patients on peritoneal dialysis: a joint model analysis

    Directory of Open Access Journals (Sweden)

    Marcia Regina Gianotti Franco

    Full Text Available Abstract Introduction: Data on the impact of high body mass index (BMI) on mortality of patients on peritoneal dialysis (PD), especially among the elderly, are inconsistent. Objective: To evaluate the impact of BMI on a cohort of incident elderly PD patients over time. Methods: Prospective multicenter cohort study (December 2004-October 2007) with 674 patients. Socio-demographic and clinical data were evaluated, with patients followed until death, transfer to hemodialysis (HD), recovery of renal function, loss of follow-up or transplant. Patients were divided into those incident on renal replacement therapy (RRT) with PD (PD first: 230) and those transferred from hemodialysis (HD first: 444). Analysis was performed comparing these two groups using chi-square or Kruskal-Wallis tests. Similar analysis was used to compare patients on automated peritoneal dialysis (APD) vs. continuous ambulatory peritoneal dialysis (CAPD). Data were compared between patients according to BMI by ANOVA, Kruskal-Wallis or chi-square. For analysis of survival, the Kaplan-Meier method was used and, to adjust for confounding variables, Cox proportional hazards regression. A joint model for longitudinal and time-to-event data was fitted, assessing the impact that a longitudinal variable has on survival time. Results: Malnourished patients (76.79 ± 7.53 years) were older (p < 0.0001), with a higher percentage of death (44.6%, p = 0.001); diabetes mellitus showed high prevalence in obese patients (68%, p < 0.0001); higher blood pressure levels (p = 0.002) were present in obese and overweight patients. Conclusions: Increased BMI variation over time proved to be a protective factor, with a decrease of about 1% in risk of death for every BMI unit earned.
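
    The Kaplan-Meier estimator named in the methods can be written in a few lines of numpy; the follow-up times and events below are synthetic and only illustrate the calculation:

        # Minimal Kaplan-Meier estimator (numpy only); times are months on dialysis,
        # event = 1 for death, 0 for censoring (e.g., transfer or transplant).
        import numpy as np

        def kaplan_meier(time, event):
            """Return event times and survival probabilities S(t)."""
            order = np.argsort(time)
            time, event = np.asarray(time)[order], np.asarray(event)[order]
            uniq = np.unique(time[event == 1])
            surv, s = [], 1.0
            for t in uniq:
                at_risk = np.sum(time >= t)
                deaths = np.sum((time == t) & (event == 1))
                s *= 1.0 - deaths / at_risk
                surv.append(s)
            return uniq, np.array(surv)

        t = [3, 6, 6, 10, 14, 18, 24, 24, 30, 36]
        e = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]
        times, s = kaplan_meier(t, e)
        for ti, si in zip(times, s):
            print(f"t = {ti:>2} months  S(t) = {si:.2f}")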

  10. Dynamic analysis of elastic rubber tired car wheel breaking under variable normal load

    Science.gov (United States)

    Fedotov, A. I.; Zedgenizov, V. G.; Ovchinnikova, N. I.

    2017-10-01

    The purpose of the paper is to analyze the dynamics of wheel braking under normal load variations. The paper uses a mathematical simulation method in which the calculation model of an object as a mechanical system is associated with a dynamically equivalent block structure of an automatic control system. Transfer function tools were used to analyze the structural and technical characteristics of the object as well as the force disturbances. It was shown that the analysis of the dynamic characteristics of a wheel subjected to external force disturbances has to take into account both amplitude- and phase-frequency characteristics. Normal load variations affect car wheel braking subjected to disturbances: the closer the slip is to the critical point, the greater the impact. In the super-critical region, load variations cause rapid wheel locking.
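
    The transfer-function view argued for here can be illustrated with a deliberately simple model; the first-order transfer function and its parameters below are hypothetical, not taken from the paper, and scipy is assumed:

        # Amplitude and phase characteristics of a hypothetical first-order transfer
        # function from normal-load disturbance to wheel slip: G(s) = K / (tau*s + 1).
        import numpy as np
        from scipy import signal

        K, tau = 0.8, 0.05                        # illustrative values only
        G = signal.TransferFunction([K], [tau, 1.0])

        w = np.logspace(0, 3, 7)                  # rad/s
        w, mag, phase = signal.bode(G, w)
        for wi, m, p in zip(w, mag, phase):
            print(f"w = {wi:8.1f} rad/s   |G| = {m:6.1f} dB   phase = {p:7.1f} deg")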

  11. Objective and subjective analysis of women's voice with idiopathic Parkinson's disease

    Directory of Open Access Journals (Sweden)

    Riviana Rodrigues das Graças

    2012-07-01

    Full Text Available OBJECTIVE: To compare the voice quality of women with idiopathic Parkinson's disease and those without it. METHODS: An evaluation was performed including 19 female patients diagnosed with idiopathic Parkinson's disease, with an average age of 66 years, and 27 women with an average age of 67 years in the Control Group. The assessment was performed by computed acoustic analysis and perceptual evaluation. RESULTS: Parkinson's disease patients presented moderately rough and unstable voice quality. The parameters of grade, roughness, and instability had higher scores in Parkinson's disease patients, with statistically significant differences. Acoustic measures of jitter and the period perturbation quotient (PPQ) differed significantly between groups. CONCLUSIONS: Female individuals with Parkinson's disease showed more vocal alterations compared to the Control Group, when both perceptual and acoustic evaluations were analyzed.
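
    The two acoustic perturbation measures reported, jitter and the period perturbation quotient, can be sketched from a sequence of glottal periods; the definitions below follow the commonly used Praat-style formulas (an assumption, not the authors' exact implementation) and the period data are synthetic:

        # Local jitter and five-point period perturbation quotient (PPQ5) from a
        # sequence of glottal periods. Pure numpy.
        import numpy as np

        def jitter_local(periods):
            periods = np.asarray(periods, dtype=float)
            return np.mean(np.abs(np.diff(periods))) / np.mean(periods)

        def ppq5(periods):
            periods = np.asarray(periods, dtype=float)
            window = np.array([np.mean(periods[i - 2:i + 3])
                               for i in range(2, len(periods) - 2)])
            return np.mean(np.abs(periods[2:-2] - window)) / np.mean(periods)

        rng = np.random.default_rng(0)
        T = 1.0 / 200.0                                   # ~200 Hz fundamental
        periods = T * (1 + 0.01 * rng.normal(size=100))   # 1% cycle-to-cycle variation
        print(f"jitter (local) = {100 * jitter_local(periods):.2f} %")
        print(f"PPQ5           = {100 * ppq5(periods):.2f} %")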

  12. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses there are two general conclusions concerning the two approaches to quality control: (1) Objective criteria (e.g., ± 25% precision, ± 30% accuracy) based on customer needs and the usually prescribed criteria for similar EPA-approved methods are consistently attained during routine analyses. (2) Statistical criteria based on short-term method performance are almost an order of magnitude more stringent than objective criteria and are difficult to satisfy following the same routine laboratory procedures which satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be either to utilize a moving average of P&A from control samples over a several-month time period, or to determine within-sample variation by one-way analysis of variance of several months' replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample in …

  13. A Brief Study of Variation Theory in Quality Management

    Directory of Open Access Journals (Sweden)

    Mostafa Farah Bakhsh

    2016-06-01

    Full Text Available Variation is part of everyday life and exists all the time. Variation is the product of differences: differences in the nature of processes result in different products over time. Proper diagnosis of variation patterns is necessary for minimizing loss. Continuous quality improvement is regarded as the successive reduction of performance variation in order to deliver high-quality products to customers. In Deming's view, quality variation is classified into two groups: common causes and special causes. Variation is not a new word, but understanding of and concern about it are modern. The first step in managing performance variation is accepting that variation exists. For proper management of variation, appropriate tools should be used for its detection and display; control charts are useful tools for recognizing, analysing and removing process performance variations.
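
    The common-cause/special-cause distinction is usually operationalised with control charts; a minimal Shewhart-style individuals chart (synthetic data, plain numpy, with sigma estimated directly from the data rather than from moving ranges as is often preferred in practice) is sketched below:

        # Individuals control chart: points outside mean ± 3*sigma are flagged as
        # likely special-cause variation; points inside reflect common causes.
        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.normal(loc=10.0, scale=0.5, size=50)
        x[30] += 3.0                                   # inject a special-cause shift

        center = x.mean()
        sigma = x.std(ddof=1)                          # simple estimate for the sketch
        ucl, lcl = center + 3 * sigma, center - 3 * sigma
        out = np.where((x > ucl) | (x < lcl))[0]
        print(f"center = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
        print("points signalling special causes:", out)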

  14. Computing Optical Variable Periods of BL Lac Object S5 0716+ 714 ...

    Indian Academy of Sciences (India)

    Computing Optical Variable Periods of BL Lac Object S5 0716+714 ... The study of long-term periodical variation is an important way to get the characteristics ... continuous Fourier transform together, define a window function, and finally obtain ...

  15. Open Issues in Object-Oriented Programming

    DEFF Research Database (Denmark)

    Madsen, Ole Lehrmann

    1995-01-01

    We discuss a number of open issues within object-oriented programming. The central mechanisms of object-oriented programming appeared with Simula, developed more than 30 years ago; these include class, subclass, virtual function, active object and the first application framework, Class Simulation. The core parts of object-oriented programming should be well understood, but there are still a large number of issues where there is no consensus. The term object-orientation has been applied to many subjects, such as analysis, design, implementation, data modeling in databases, and distribution.

  16. Procedural facade variations from a single layout

    KAUST Repository

    Bao, Fan

    2013-02-19

    We introduce a framework to generate many variations of a facade design that look similar to a given facade layout. Starting from an input image, the facade is hierarchically segmented and labeled with a collection of manual and automatic tools. The user can then model constraints that should be maintained in any variation of the input facade design. Subsequently, facade variations are generated for different facade sizes, where multiple variations can be produced for a certain size. Computing such new facade variations has many unique challenges, and we propose a new algorithm based on interleaving heuristic search and quadratic programming. In contrast to most previous work, we focus on the generation of new design variations and not on the automatic analysis of the input\\'s structure. Adding a modeling step with the user in the loop ensures that our results routinely are of high quality. © 2013 ACM.

  17. Procedural facade variations from a single layout

    KAUST Repository

    Bao, Fan; Schwarz, Michael; Wonka, Peter

    2013-01-01

    We introduce a framework to generate many variations of a facade design that look similar to a given facade layout. Starting from an input image, the facade is hierarchically segmented and labeled with a collection of manual and automatic tools. The user can then model constraints that should be maintained in any variation of the input facade design. Subsequently, facade variations are generated for different facade sizes, where multiple variations can be produced for a certain size. Computing such new facade variations has many unique challenges, and we propose a new algorithm based on interleaving heuristic search and quadratic programming. In contrast to most previous work, we focus on the generation of new design variations and not on the automatic analysis of the input's structure. Adding a modeling step with the user in the loop ensures that our results routinely are of high quality. © 2013 ACM.

  18. Sub-OBB based object recognition and localization algorithm using range images

    International Nuclear Information System (INIS)

    Hoang, Dinh-Cuong; Chen, Liang-Chia; Nguyen, Thanh-Hung

    2017-01-01

    This paper presents a novel approach to recognize and estimate the pose of 3D objects in cluttered range images. The key technical breakthrough of the developed approach is that it enables robust object recognition and localization under undesirable conditions such as environmental illumination variation, as well as optical occlusion in which the object is only partially visible. First, the acquired point clouds are segmented into individual object point clouds based on the developed 3D object segmentation for randomly stacked objects. Second, an efficient shape-matching algorithm, called Sub-OBB based object recognition, is performed using the proposed oriented bounding box (OBB) regional area-based descriptor to reliably recognize the object. Then, the 3D position and orientation of the object can be roughly estimated by aligning the OBB of the segmented object point cloud with the OBB of the matched point cloud in a database generated from a CAD model and a 3D virtual camera. To detect the accurate pose of the object, the iterative closest point (ICP) algorithm is used to match the object model with the segmented point clouds. From feasibility tests in several scenarios, the developed approach is verified to be feasible for object pose recognition and localization. (paper)
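
    The oriented bounding box (OBB) underlying the proposed descriptor can be obtained, in the simplest case, from a PCA of the segmented object point cloud; the following numpy sketch (synthetic point cloud, not the authors' implementation) shows the idea:

        # PCA-based oriented bounding box of a point cloud: the eigenvectors of the
        # covariance matrix define the box axes, the extents come from the spread of
        # the points expressed in that local frame.
        import numpy as np

        def oriented_bounding_box(points):
            """points: (N, 3) array. Returns centre, axes (3x3 columns), extents."""
            centre = points.mean(axis=0)
            cov = np.cov(points - centre, rowvar=False)
            _, axes = np.linalg.eigh(cov)                 # columns are the OBB axes
            local = (points - centre) @ axes              # points in the OBB frame
            mins, maxs = local.min(axis=0), local.max(axis=0)
            extents = maxs - mins
            centre = centre + axes @ ((mins + maxs) / 2.0)
            return centre, axes, extents

        rng = np.random.default_rng(0)
        cloud = rng.normal(size=(500, 3)) * np.array([3.0, 1.0, 0.2])  # elongated blob
        c, R, e = oriented_bounding_box(cloud)
        print("OBB extents:", e.round(2))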

  19. Effects of memory colour on colour constancy for unknown coloured objects.

    Science.gov (United States)

    Granzier, Jeroen J M; Gegenfurtner, Karl R

    2012-01-01

    The perception of an object's colour remains constant despite large variations in the chromaticity of the illumination (colour constancy). Hering suggested that memory colours, the typical colours of objects, could help in estimating the illuminant's colour and therefore be an important factor in establishing colour constancy. Here we test whether the presence of objects with diagnostic colours (fruits, vegetables, etc) within a scene influences colour constancy for unknown coloured objects in the scene. Subjects matched one of four Munsell papers placed in a scene illuminated under either a reddish or a greenish lamp with the Munsell book of colour illuminated by a neutral lamp. The Munsell papers were embedded in four different scenes: one scene containing diagnostically coloured objects, one scene containing incongruently coloured objects, a third scene with geometrical objects of the same colour as the diagnostically coloured objects, and one scene containing non-diagnostically coloured objects (eg, a yellow coffee mug). All objects were placed against a black background. Colour constancy was on average significantly higher for the scene containing the diagnostically coloured objects compared with the other scenes tested. We conclude that the colours of familiar objects help in obtaining colour constancy for unknown objects.

  20. Development of objective flow regime identification method using self-organizing neural network

    International Nuclear Information System (INIS)

    Lee, Jae Young; Kim, Nam Seok; Kwak, Nam Yee

    2004-01-01

    Two-phase flow shows various flow patterns according to the amount of void and its velocity relative to the liquid flow. This variation directly affects the interfacial transfer, which is the key factor in the design or analysis of phase-change systems. In particular, the safety analysis of nuclear power plants has been performed with numerical codes furnished with constitutive relations that depend strongly on the flow regime. Heavy efforts have been focused on identifying the flow regime, and the engineering background is now relatively stable compared with other research fields. However, issues related to objectiveness and transient flow regimes are still open to study. Lee et al. and Ishii developed a method for objective and instantaneous flow regime identification based on a neural network and a new index, the probability distribution of the flow regime, which requires only a one-second observation for flow regime identification. In the present paper, we developed a self-organized neural network for a more objective approach to this problem. Kohonen's Self-Organizing Map (SOM) has been used for clustering, visualization, and abstraction. The SOM is trained through unsupervised competitive learning using a 'winner takes it all' policy. Therefore, its unsupervised training character eliminates possible interference of the regime developer with the neural network training. After developing the computer code, we evaluated its performance with vertically upward two-phase flow in pipes of 25.4 and 50.4 mm I.D. The sensitivity of the flow regime identification to the number of clusters was also examined.
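
    A minimal SOM training loop in plain numpy, illustrating the unsupervised "winner takes it all" scheme described above with synthetic stand-ins for the flow signals, might look like this:

        # Tiny self-organizing map: each sample pulls its best-matching unit (BMU)
        # and the BMU's grid neighbours towards it; learning rate and neighbourhood
        # radius decay over epochs. Synthetic 3-D feature vectors stand in for the
        # measured flow signals.
        import numpy as np

        rng = np.random.default_rng(0)
        data = np.vstack([rng.normal(m, 0.05, size=(100, 3))
                          for m in (0.1, 0.5, 0.9)])      # three crude "flow regimes"

        grid = 4                                          # 4x4 map
        weights = rng.random((grid * grid, 3))
        coords = np.array([(i, j) for i in range(grid) for j in range(grid)], float)

        for epoch in range(20):
            lr = 0.5 * np.exp(-epoch / 10)                # learning-rate decay
            radius = 2.0 * np.exp(-epoch / 10)            # neighbourhood decay
            for x in rng.permutation(data):
                bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
                d = np.linalg.norm(coords - coords[bmu], axis=1)
                h = np.exp(-(d ** 2) / (2 * radius ** 2)) # neighbourhood function
                weights += lr * h[:, None] * (x - weights)

        labels = np.argmin(np.linalg.norm(data[:, None, :] - weights[None], axis=2), axis=1)
        print("units used as cluster prototypes:", np.unique(labels).size)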

  1. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  2. Denominative Variation in the Terminology of Fauna and Flora: Cultural and Linguistic (A)Symmetries

    Directory of Open Access Journals (Sweden)

    Sabrina de Cássia Martins

    2018-05-01

    Full Text Available The present work approaches denominative variation in Terminology. Its object of study is the specialized lexical units in the Portuguese language formed by at least one of the following colour names: black, white, yellow, blue, orange, gray, green, brown, red, pink, violet, purple and indigo. A comparative analysis of this vocabulary among the Portuguese, English and Italian languages was conducted considering two sub-areas of Biology: Botany, specifically Angiosperms (Monocotyledons and Eudicotyledons), and Zoology, exclusively Vertebrates (fish, amphibians, reptiles, birds and mammals). The following pages describe how common names are created in these three languages.

  3. Exploiting Higher Order and Multi-modal Features for 3D Object Detection

    DEFF Research Database (Denmark)

    Kiforenko, Lilita

    that describe object visual appearance such as shape, colour, texture etc. This thesis focuses on robust object detection and pose estimation of rigid objects using 3D information. The thesis main contributions are novel feature descriptors together with object detection and pose estimation algorithms....... The initial work introduces a feature descriptor that uses edge categorisation in combination with a local multi-modal histogram descriptor in order to detect objects with little or no texture or surface variation. The comparison is performed with a state-of-the-art method, which is outperformed...... of the methods work well for one type of objects in a specific scenario, in another scenario or with different objects they might fail, therefore more robust solutions are required. The typical problem solution is the design of robust feature descriptors, where feature descriptors contain information...

  4. RAPD analysis of colchicine induced variation of the Dendrobium ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-04-20

    Apr 20, 2009 ... species of the Dendrobium genus, and 13 orchids across genera. ... to detect variations at species level and among somaclonal variants in this study. ..... alternative for colchicine in in vitro chromosome doubling of Lilium.

  5. Pharmacogenetic effects of angiotensin-converting enzyme inhibitors over age-related urea and creatinine variations in patients with dementia due to Alzheimer disease

    OpenAIRE

    Ferreira de Oliveira, Fabricio; Berretta, Juliana Marília; Suchi Chen, Elizabeth; Cardoso Smith, Marilia; Ferreira Bertolucci, Paulo Henrique

    2016-01-01

    Background: Renal function declines according to age and vascular risk factors, whereas few data are available regarding genetically mediated effects of anti-hypertensives over renal function. Objective: To estimate urea and creatinine variations in dementia due to Alzheimer disease (AD) by way of a pharmacogenetic analysis of the anti-hypertensive effects of angiotensin-converting enzyme inhibitors (ACEis). Methods: Consecutive outpatients older than 60 years-old with AD and no history of kid...

  6. Characteristics of seasonal variation and solar activity dependence of the geomagnetic solar quiet daily variation

    Science.gov (United States)

    Shinbori, A.; Koyama, Y.; Nose, M.; Hori, T.

    2017-12-01

    Characteristics of the seasonal variation and solar activity dependence of the X- and Y-components of the geomagnetic solar quiet (Sq) daily variation at Memanbetsu in mid-latitudes and Guam near the equator have been investigated using long-term geomagnetic field data with 1-h time resolution from 1957 to 2016. In this analysis, we defined a quiet day as one on which the maximum value of the Kp index is less than 3, and we used the monthly average of the adjusted daily F10.7 corresponding to geomagnetically quiet days. To identify the monthly mean Sq variation in the X and Y components (Sq-X and Sq-Y), we first determined the baseline of the X and Y components from the average value from 22 to 2 h local time (LT) for each quiet day. Next, we calculated the deviation from this baseline of the X- and Y-components of the geomagnetic field for each quiet day, and computed the monthly mean value of the deviation for each local time. As a result, Sq-X and Sq-Y show a clear seasonal variation and solar activity dependence. The amplitude of the seasonal variation increases significantly during periods of high solar activity, and is proportional to the solar F10.7 index. The pattern of the seasonal variation is quite different between Sq-X and Sq-Y. Correlation analysis between the solar F10.7 index and Sq-X and Sq-Y shows an almost linear relationship, but the slope and intercept of the fitted line vary as a function of local time and month. This implies that the sensitivity of Sq-X and Sq-Y to solar activity differs for different local times and seasons. The local time dependence of the offset value of Sq-Y at Guam and its seasonal variation suggest a magnetic field produced by inter-hemispheric field-aligned currents (FACs). From the sign of the offset value of Sq-Y, it is inferred that the inter-hemispheric FACs flow from the summer to the winter hemisphere in the dawn and dusk sectors and from the winter to the summer hemisphere in

  7. Difference Discrete Variational Principle, Euler-Lagrange Cohomology and Symplectic, Multisymplectic Structures

    OpenAIRE

    Guo, H. Y.; Li, Y. Q.; Wu, K.; Wang, S. K.

    2001-01-01

    We study the difference discrete variational principle within the framework of a multi-parameter differential approach, by regarding the forward difference as an entire geometric object in view of noncommutative differential geometry. By virtue of this variational principle, we obtain the difference discrete Euler-Lagrange equations and canonical ones for the difference discrete versions of classical mechanics and classical field theory. We also explore the difference discrete versions for the Euler...

  8. Genomic Sequence Variation Markup Language (GSVML).

    Science.gov (United States)

    Nakaya, Jun; Kimura, Michio; Hiroi, Kaei; Ido, Keisuke; Yang, Woosung; Tanaka, Hiroshi

    2010-02-01

    With the aim of making good use of internationally accumulated genomic sequence variation data, which is increasing rapidly due to the present explosive growth of genomic research, the development of an interoperable data exchange format and its international standardization are necessary. Genomic Sequence Variation Markup Language (GSVML) focuses on genomic sequence variation data and human health applications, such as gene-based medicine or pharmacogenomics. We developed GSVML through eight steps, based on case analysis and domain investigations. By focusing the design scope on human health applications and genomic sequence variation, we attempted to eliminate ambiguity and to ensure practicability. We intended to satisfy the requirements derived from the use case analysis of human-based clinical genomic applications. Based on database investigations, we attempted to minimize the redundancy of the data format while maximizing the range of data covered. We also attempted to ensure communication and interface ability with other markup languages, for exchange of omics data among various omics researchers or facilities. The interface ability with developing clinical standards, such as the Health Level Seven Genotype Information model, was analyzed. We developed the human health-oriented GSVML comprising variation data, direct annotation, and indirect annotation categories; the variation data category is required, while the direct and indirect annotation categories are optional. The annotation categories contain omics and clinical information, and have internal relationships. For the design, we examined six cases against three criteria for human health applications and 15 data elements against three criteria for data formats for genomic sequence variation data exchange. The data formats of five international SNP databases and six markup languages, and the interface ability to the Health Level Seven Genotype Model in terms of 317 items, were investigated. GSVML was developed as

  9. Gamifying Video Object Segmentation.

    Science.gov (United States)

    Spampinato, Concetto; Palazzo, Simone; Giordano, Daniela

    2017-10-01

    Video object segmentation can be considered as one of the most challenging computer vision problems. Indeed, so far, no existing solution is able to effectively deal with the peculiarities of real-world videos, especially in cases of articulated motion and object occlusions; limitations that appear more evident when we compare the performance of automated methods with the human one. However, manually segmenting objects in videos is largely impractical as it requires a lot of time and concentration. To address this problem, in this paper we propose an interactive video object segmentation method, which exploits, on one hand, the capability of humans to identify correctly objects in visual scenes, and on the other hand, the collective human brainpower to solve challenging and large-scale tasks. In particular, our method relies on a game with a purpose to collect human inputs on object locations, followed by an accurate segmentation phase achieved by optimizing an energy function encoding spatial and temporal constraints between object regions as well as human-provided location priors. Performance analysis carried out on complex video benchmarks, and exploiting data provided by over 60 users, demonstrated that our method shows a better trade-off between annotation times and segmentation accuracy than interactive video annotation and automated video object segmentation approaches.

  10. Analysis of Pressure Variations in a Low-Pressure Nickel-Hydrogen Battery – Part 1

    Science.gov (United States)

    Purushothaman, B. K.; Wainright, J. S.

    2012-01-01

    A low pressure nickel-hydrogen battery using either a metal hydride or gaseous hydrogen for H2 storage has been developed for use in implantable neuroprosthetic devices. In this paper, pressure variations inside the cell for the gaseous hydrogen version are analyzed and correlated with oxygen evolution side reaction at the end of charging, the recombination of oxygen with hydrogen during charging and a subsequent rest period, and the self-discharge of the nickel electrode. About 70% of the recombination occurred simultaneously with oxygen evolution during charging and the remaining oxygen recombined with hydrogen during the 1st hour after charging. Self-discharge of the cell varies linearly with hydrogen pressure at a given state of charge and increased with increasing battery charge levels. The coulometric efficiency calculated based on analysis of the pressure-time data agreed well with the efficiency calculated based on the current-time data. Pressure variations in the battery are simulated accurately to predict coulometric efficiency and the state of charge of the cell, factors of extreme importance for a battery intended for implantation within the human body. PMID:22423175

  11. Conical differentiability for evolution variational inequalities

    Science.gov (United States)

    Jarušek, Jiří; Krbec, Miroslav; Rao, Murali; Sokołowski, Jan

    The conical differentiability of solutions to the parabolic variational inequality with respect to the right-hand side is proved in the paper. On the one hand, the result is based on the Lipschitz continuity in H^{1/2,1}(Q) of solutions to the variational inequality with respect to the right-hand side. On the other hand, in view of the polyhedricity of the convex cone K = {v ∈ H; v|_{Σ_c} ≥ 0, v|_{Σ_d} = 0}, we prove new results on the sensitivity analysis of parabolic variational inequalities. Therefore, we have a positive answer to the question raised by Fulbert Mignot (J. Funct. Anal. 22 (1976) 25-32).

  12. Teaching tools in Evidence Based Practice: evaluation of reusable learning objects (RLOs) for learning about Meta-analysis

    Directory of Open Access Journals (Sweden)

    Wharrad Heather

    2011-05-01

    Full Text Available Abstract. Background: All healthcare students are taught the principles of evidence based practice on their courses. The ability to understand the procedures used in systematically reviewing evidence reported in studies, such as meta-analysis, is an important element of evidence based practice. Meta-analysis is a difficult statistical concept for healthcare students to understand, yet it is an important technique used in systematic reviews to pool data from studies to look at the combined effectiveness of treatments. In other areas of the healthcare curricula, by supplementing lectures, workbooks and workshops with pedagogically designed, multimedia learning objects (known as reusable learning objects or RLOs) we have shown an improvement in students' perceived understanding of subjects they found difficult. In this study we describe the development and evaluation of two RLOs on meta-analysis. The RLOs supplement associated lectures and aim to improve healthcare students' understanding of meta-analysis. Methods: Following a quality-controlled design process, two RLOs were developed and delivered to two cohorts of students, a Master in Public Health course and a Postgraduate Diploma in Nursing course. Students' understanding of five key concepts of meta-analysis was measured before and after a lecture and again after RLO use. RLOs were also evaluated for their educational value, learning support, media attributes and usability using closed and open questions. Results: Students rated their understanding of meta-analysis as improved after a lecture and further improved after completing the RLOs (Wilcoxon paired test). Conclusions: Meta-analysis RLOs that are openly accessible and unrestricted by usernames and passwords provide flexible support for students who find the process of meta-analysis difficult.

  13. Chambolle's Projection Algorithm for Total Variation Denoising

    Directory of Open Access Journals (Sweden)

    Joan Duran

    2013-12-01

    Full Text Available Denoising is the problem of removing the inherent noise from an image. The standard noise model is additive white Gaussian noise, where the observed image f is related to the underlying true image u by the degradation model f = u + n, and n is assumed to be independently and identically distributed at each pixel as a zero-mean Gaussian random variable. Since this is an ill-posed problem, Rudin, Osher and Fatemi introduced the total variation as a regularizing term. It has proved to be quite efficient for regularizing images without smoothing the boundaries of the objects. This paper focuses on a simple description of the theory and on the implementation of Chambolle's projection algorithm for minimizing the total variation of a grayscale image. Furthermore, we adapt the algorithm to the vectorial total variation for color images. The implementation is described in detail and its parameters are analyzed and varied to arrive at a reliable implementation.
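
    The record describes the ROF model and Chambolle's dual projection scheme only at a high level; the sketch below is a standard grayscale implementation of Chambolle's fixed-point iteration written independently of the paper's reference code. The step size tau = 0.25 and the fidelity weight lam are conventional choices and are assumptions here.

```python
# Chambolle's projection algorithm for ROF total-variation denoising
# (independent sketch, not the IPOL reference implementation). Solves
#   min_u  TV(u) + (1 / (2 * lam)) * ||u - f||^2
# via the dual fixed-point iteration on the vector field p, with u = f - lam * div(p).
import numpy as np

def grad(u):
    """Forward-difference gradient with Neumann boundary handling."""
    gx = np.zeros_like(u); gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    """Discrete divergence, adjoint of -grad."""
    dx = np.zeros_like(px); dy = np.zeros_like(py)
    dx[:, 0] = px[:, 0]; dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]; dx[:, -1] = -px[:, -2]
    dy[0, :] = py[0, :]; dy[1:-1, :] = py[1:-1, :] - py[:-2, :]; dy[-1, :] = -py[-2, :]
    return dx + dy

def chambolle_tv_denoise(f, lam=0.1, tau=0.25, iters=100):
    px = np.zeros_like(f); py = np.zeros_like(f)
    for _ in range(iters):
        gx, gy = grad(div(px, py) - f / lam)
        norm = np.sqrt(gx ** 2 + gy ** 2)
        px = (px + tau * gx) / (1.0 + tau * norm)   # projection onto |p| <= 1
        py = (py + tau * gy) / (1.0 + tau * norm)
    return f - lam * div(px, py)                     # denoised image u
```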

  14. Introduction to the Special Issue on Advancing Methods for Analyzing Dialect Variation.

    Science.gov (United States)

    Clopper, Cynthia G

    2017-07-01

    Documenting and analyzing dialect variation is traditionally the domain of dialectology and sociolinguistics. However, modern approaches to acoustic analysis of dialect variation have their roots in Peterson and Barney's [(1952). J. Acoust. Soc. Am. 24, 175-184] foundational work on the acoustic analysis of vowels that was published in the Journal of the Acoustical Society of America (JASA) over 6 decades ago. Although Peterson and Barney (1952) were not primarily concerned with dialect variation, their methods laid the groundwork for the acoustic methods that are still used by scholars today to analyze vowel variation within and across languages. In more recent decades, a number of methodological advances in the study of vowel variation have been published in JASA, including work on acoustic vowel overlap and vowel normalization. The goal of this special issue was to honor that tradition by bringing together a set of papers describing the application of emerging acoustic, articulatory, and computational methods to the analysis of dialect variation in vowels and beyond.

  15. Fast Variations In Spectrum of Comet Halley

    Science.gov (United States)

    Borysenko, S. A.

    The goal of this work is to investigate fast variations of spectral line intensities in the spectrum of comet Halley. The present research was made on the basis of more than 500 high-resolution spectrograms obtained by L.M. Shulman and H.K. Nazarchuk in November-December 1985 at the 6-m telescope (SAO, Russia). Fast variations with different quasiperiods were detected in all the spectrograms, with quasiperiods ranging from 15-40 min to 1.5-2 hours. As data from the spacecraft "Vega-2" show, faster variations with quasiperiods of 5-10 min are apparently also present. Only the most important lines, such as C2, C3, CN, CH and NH2, were analyzed. False periods were checked by comparing the power spectra of the variations with the computed spectral window of the data; only false periods of about 400 s (the average exposure period) were detected. An algorithm for the analysis of locally Poissonian time series was proposed. Two types of fast variations were detected: 1) high-amplitude variations with longer quasiperiods (1.5-2 hours) and cross-correlation coefficients between line intensities of about 0.9-0.95; 2) low-amplitude variations with short periods (15-40 min), which look like white noise and have cross-correlation coefficients of about 0.1-0.3. This difference may be caused by the nature of the variations. The first type may be an effect of both active processes in the cometary nucleus and streams of solar protons; analysis of the variation of the solar proton flux with energies >1 MeV in November-December 1985 supports this interpretation. In the second case, only internal processes in the nucleus may generate the observed variations. To determine general parameters of the cometary atmosphere, such as the production rates of the radicals C2, C3, CN, CH, and NH2, it was necessary to estimate the contribution of dust grain luminescence to the continuum of the comet. Space and wavelength distribution

  16. Y-Chromosome variation in hominids: intraspecific variation is limited to the polygamous chimpanzee.

    Directory of Open Access Journals (Sweden)

    Gabriele Greve

    Full Text Available BACKGROUND: We have previously demonstrated that the Y-specific ampliconic fertility genes DAZ (deleted in azoospermia) and CDY (chromodomain protein Y) varied with respect to copy number and position among chimpanzees (Pan troglodytes). In comparison, seven Y-chromosomal lineages of the bonobo (Pan paniscus), the chimpanzee's closest living relative, showed no variation. We extend our earlier comparative investigation to include an analysis of the intraspecific variation of these genes in gorillas (Gorilla gorilla) and orangutans (Pongo pygmaeus), and examine the resulting patterns in the light of the species' markedly different social and mating behaviors. METHODOLOGY/PRINCIPAL FINDINGS: Fluorescence in situ hybridization (FISH) analysis of DAZ and CDY in 12 Y-chromosomal lineages of western lowland gorilla (G. gorilla gorilla) and a single lineage of the eastern lowland gorilla (G. beringei graueri) showed no variation among lineages. Similar findings were noted for the 10 Y-chromosomal lineages examined in the Bornean orangutan (Pongo pygmaeus), and 11 Y-chromosomal lineages of the Sumatran orangutan (P. abelii). We validated the contrasting DAZ and CDY patterns using quantitative real-time polymerase chain reaction (qPCR) in chimpanzee and bonobo. CONCLUSION/SIGNIFICANCE: High intraspecific variation in copy number and position of the DAZ and CDY genes is seen only in the chimpanzee. We hypothesize that this is best explained by sperm competition that results in the variant DAZ and CDY haplotypes detected in this species. In contrast, bonobos, gorillas and orangutans, species that are not subject to sperm competition, showed no intraspecific variation in DAZ and CDY, suggesting that monoandry in gorillas, and preferential female mate choice in bonobos and orangutans, probably permitted the fixation of a single Y variant in each taxon. These data support the notion that the evolutionary history of a primate Y chromosome is not simply encrypted in its DNA

  17. Variation in semen parameters derived from computer-aided semen analysis, within donors and between donors

    NARCIS (Netherlands)

    Wijchman, JG; De Wolf, BTHM; Graaff, R; Arts, EGJM

    2001-01-01

    The development of computer-aided semen analysis (CASA) has made it possible to study sperm motility characteristics objectively and longitudinally. In this 2-year study of 8 sperm donors, we used CASA to measure 7 semen parameters (concentration, percentage of motile spermatozoa, curvilinear

  18. Explaining variation in nascent entrepreneurship

    NARCIS (Netherlands)

    A.J. van Stel (André); A.R.M. Wennekers (Sander); P. Reynolds (Paul); A.R. Thurik (Roy)

    2004-01-01

    textabstractThis paper aims at explaining cross-country variation in nascent entrepreneurship. Regression analysis is applied using various explanatory variables derived from three different approaches. We make use of the Global Entrepreneurship Monitor database, including nascent entrepreneurship

  19. Slope instability caused by small variations in hydraulic conductivity

    Science.gov (United States)

    Reid, M.E.

    1997-01-01

    Variations in hydraulic conductivity can greatly modify hillslope ground-water flow fields, effective-stress fields, and slope stability. In materials with uniform texture, hydraulic conductivities can vary over one to two orders of magnitude, yet small variations can be difficult to determine. The destabilizing effects caused by small (one order of magnitude or less) hydraulic conductivity variations using ground-water flow modeling, finite-element deformation analysis, and limit-equilibrium analysis are examined here. Low hydraulic conductivity materials that impede downslope ground-water flow can create unstable areas with locally elevated pore-water pressures. The destabilizing effects of small hydraulic heterogeneities can be as great as those induced by typical variations in the frictional strength (approximately 4°-8°) of texturally similar materials. Common "worst-case" assumptions about ground-water flow, such as a completely saturated "hydrostatic" pore-pressure distribution, do not account for locally elevated pore-water pressures and may not provide a conservative slope stability analysis. In site characterization, special attention should be paid to any materials that might impede downslope ground-water flow and create unstable regions.
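
    As a hedged illustration of how locally elevated pore-water pressure lowers slope stability, the sketch below evaluates the classical infinite-slope limit-equilibrium factor of safety; this is a simplification of the paper's analyses (which used ground-water flow modeling and finite-element deformation analysis), and all parameter values are made up for demonstration.

```python
# Infinite-slope limit-equilibrium sketch: factor of safety versus pore pressure.
#   FS = (c' + (gamma*z*cos^2(beta) - u) * tan(phi')) / (gamma*z*sin(beta)*cos(beta))
# All numbers below are illustrative placeholders, not data from the paper.
import math

def factor_of_safety(slope_deg, depth, unit_weight, cohesion, phi_deg, pore_pressure):
    beta = math.radians(slope_deg)
    phi = math.radians(phi_deg)
    sigma_n = unit_weight * depth * math.cos(beta) ** 2           # normal stress on slip plane
    tau = unit_weight * depth * math.sin(beta) * math.cos(beta)   # driving shear stress
    return (cohesion + (sigma_n - pore_pressure) * math.tan(phi)) / tau

# A low-conductivity layer that impedes downslope flow raises the local pore
# pressure u and can push FS below 1 even though strength parameters are unchanged.
for u in (0.0, 20.0, 40.0):   # kPa
    print(u, round(factor_of_safety(30.0, 5.0, 19.0, 5.0, 32.0, u), 2))
```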

  20. Discrepancy variation of dinucleotide microsatellite repeats in eukaryotic genomes

    Directory of Open Access Journals (Sweden)

    HUAN GAO

    2009-01-01

    Full Text Available To address whether there are differences in variation among repeat motif types and among taxonomic groups, we present here an analysis of variation and correlation of dinucleotide microsatellite repeats in eukaryotic genomes. Ten taxonomic groups were compared: primates, mammalia (excluding primates and rodentia), rodentia, birds, fish, amphibians and reptiles, insects, molluscs, plants and fungi. The data used in the analysis are from the literature published in the journal Molecular Ecology Notes. Analysis of variation reveals that there are no significant differences between AC and AG repeat motif types. Moreover, the number of alleles correlates positively with the copy number in both AG and AC repeats. Similar conclusions can be obtained from each taxonomic group. These results strongly suggest that the increase of SSR variation is almost linear with the increase of the copy number of each repeat motif. The results also suggest that the variability of SSRs in the genomes of low-ranking species seems to be greater than that of high-ranking species, excluding primates and fungi.

  1. On the minimizers of calculus of variations problems in Hilbert spaces

    KAUST Repository

    Gomes, Diogo A.

    2014-01-19

    The objective of this paper is to discuss existence, uniqueness and regularity issues of minimizers of one dimensional calculus of variations problem in Hilbert spaces. © 2014 Springer-Verlag Berlin Heidelberg.
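
    The record states the topic only in general terms; as a hedged illustration, the standard form of such a one-dimensional problem and its first-order optimality condition can be written as follows. The specific Lagrangian L, the Hilbert space H, and the boundary conditions are assumptions for exposition, not taken from the paper.

```latex
% Generic one-dimensional calculus of variations problem in a Hilbert space H
% (illustrative form only; the paper's precise hypotheses on L are not stated here).
\[
  \min_{x(\cdot)} \; J[x] = \int_0^T L\bigl(t,\, x(t),\, \dot{x}(t)\bigr)\, dt,
  \qquad x(0)=x_0,\quad x(T)=x_T,\quad x(t)\in H,
\]
\[
  \text{Euler--Lagrange condition:}\qquad
  \frac{d}{dt}\, D_{\dot{x}} L\bigl(t, x(t), \dot{x}(t)\bigr)
  \;=\; D_{x} L\bigl(t, x(t), \dot{x}(t)\bigr).
\]
```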

  2. On the minimizers of calculus of variations problems in Hilbert spaces

    KAUST Repository

    Gomes, Diogo A.; Nurbekyan, Levon

    2014-01-01

    The objective of this paper is to discuss existence, uniqueness and regularity issues of minimizers of one dimensional calculus of variations problem in Hilbert spaces. © 2014 Springer-Verlag Berlin Heidelberg.

  3. Simulation of multicomponent light source for optical-electronic system of color analysis objects

    Science.gov (United States)

    Peretiagin, Vladimir S.; Alekhin, Artem A.; Korotaev, Valery V.

    2016-04-01

    Development of lighting technology has made it possible to use LEDs in specialized devices for outdoor, industrial (decorative and accent) and domestic lighting. In addition, LEDs and devices based on them are widely used for solving particular problems. For example, LED devices are widely used for lighting vegetables and fruit (for their sorting or growing), textile products (for quality control) and minerals (for sorting), etc. The reasons for the active introduction of LED technology into different systems, including optical-electronic devices and systems, are the large choice of emission colors and LED structures, which define the spatial, power, thermal and other parameters. Furthermore, multi-element, colored lighting devices with adjustable illumination properties can be designed and implemented using LEDs. However, devices based on LEDs require particular attention when a specific energy or color distribution must be provided over the whole work area (the area of analysis or observation) or over the surface of the object. This paper proposes a method for the theoretical modeling of such lighting devices. The authors present models of an RGB multicomponent light source applied to an optical-electronic system for the color analysis of mineral objects. The possibility of forming illumination of the work area that is uniform and homogeneous in energy and color is demonstrated. The authors also show how the parameters and characteristics of the optical radiation receiver of the optical-electronic system affect the energy, spatial, spectral and colorimetric properties of a multicomponent light source.
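
    The record does not give the radiometric model used; as a hedged illustration of checking illumination uniformity over a work area, the sketch below sums the contributions of individual LEDs under a simple Lambertian point-source assumption. Geometry, powers and the uniformity metric are illustrative choices, not the paper's.

```python
# Illustrative sketch: relative irradiance on a work plane below an LED array,
# assuming each LED is a Lambertian point source (inverse-square law, cos^2 factor).
import numpy as np

def irradiance_map(led_xy, led_power, height=0.3, extent=0.2, n=101):
    """Relative irradiance on a square work plane a distance `height` below the LEDs."""
    xs = np.linspace(-extent, extent, n)
    X, Y = np.meshgrid(xs, xs)
    E = np.zeros_like(X)
    for (lx, ly), P in zip(led_xy, led_power):
        r2 = (X - lx) ** 2 + (Y - ly) ** 2 + height ** 2
        cos_theta = height / np.sqrt(r2)
        E += P * cos_theta ** 2 / r2          # Lambertian emitter on a horizontal plane
    return E

# Four LEDs in a square layout; uniformity taken as min/max irradiance over the plane.
E = irradiance_map([(-0.05, -0.05), (-0.05, 0.05), (0.05, -0.05), (0.05, 0.05)],
                   [1.0, 1.0, 1.0, 1.0])
print("uniformity:", round(E.min() / E.max(), 3))
```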

  4. An encoder for the measurement of relative motions between two objects

    International Nuclear Information System (INIS)

    Saro, M.

    1995-01-01

    The motion encoder is composed of a measuring rule, mounted on one of the objects, which bears at least two tracks (X, Y) with multiple simple marks distributed in a similar pattern on the two tracks, plus at least one specific mark (each mark boundary defining a step variation on the rule), and at least two mark readers, mounted on the second object, each associated with one track. Data processing means are used to estimate distance and direction of motion. Applications include robotics and metrology

  5. A comparison of feature detectors and descriptors for object class matching

    DEFF Research Database (Denmark)

    Hietanen, Antti; Lankinen, Jukka; Kämäräinen, Joni-Kristian

    2016-01-01

    appearance variation can be large. We extend the benchmarks to the class matching setting and evaluate state-of-the-art detectors and descriptors with Caltech and ImageNet classes. Our experiments provide important findings with regard to object class matching: (1) the original SIFT is still the best...

  6. Classification of Archaeological Targets by the Use of Temporary Magnetic Variations Examination

    Science.gov (United States)

    Finkelstein, Michael; Eppelbaum, Lev

    2015-04-01

    Many buried magnetized archaeological and geological objects producing significant magnetic anomalies (for instance, ancient furnaces, weapons, agricultural targets and highly magnetized basalts) may be classified without expensive excavations. Such a classification may be conducted on the basis of a comprehensive study of temporary magnetic variations over these objects. It is especially significant for archaeogeophysical investigations in areas of world-recognized religious and cultural artifacts where all excavations are forbidden (Eppelbaum, 2010). Yanovsky's (1978) investigations laid the foundation for using magnetic variations to separate disturbing objects with high magnetic susceptibility (irrespective of the intensity of the studied magnetic anomalies). However, these procedures are inapplicable for studying low-intensity and negative magnetic anomalies, where the influence of residual magnetization may be substantial. At the same time, the approach presented below may be used to investigate the nature of magnetic anomalies of arbitrary intensity and origin. In the common case (we consider for simplicity that the anomalous object is a sphere) the value of the magnetic variations η could be estimated using the following expression (Finkelstein and Eppelbaum, 1997): η = f(P)·(δHa + δHo)/δHo, where the induction parameter P = α√(κγω) (Wait, 1951), Ho is the initial field of the magnetic variations, Ha is the anomalous component of the magnetic variations, κ is the magnetic susceptibility, γ is the electric conductivity, ω is the frequency of the geomagnetic variations, and α is the radius of the sphere. For an approximate estimation of possible values of anomalous geomagnetic variations (AGV) over a sphere within some domain T, we use an expression for the anomalous vertical magnetic component Z at any point M(x, y, z) in the external space (for the case of vertical magnetization) (Nepomnyaschikh, 1964): Za = (κ1 - κ

  7. The variational spiked oscillator

    International Nuclear Information System (INIS)

    Aguilera-Navarro, V.C.; Ullah, N.

    1992-08-01

    A variational analysis of the spiked harmonic oscillator Hamiltonian -d²/dx² + x² + δ/x^(5/2), δ > 0, is reported in this work. A trial function satisfying Dirichlet boundary conditions is suggested. The results are excellent for a large range of values of the coupling parameter. (author)
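
    As a hedged illustration of a variational estimate for this Hamiltonian on the half-line with ψ(0) = 0, the sketch below minimizes the Rayleigh quotient over a simple trial function ψ_a(x) = x·exp(-a x²/2); this trial function, and the closed form of the resulting energy (obtained from standard Gaussian integrals), are assumptions for demonstration and are not the paper's method.

```python
# Variational upper bound for H = -d^2/dx^2 + x^2 + delta / x^(5/2) on (0, inf),
# psi(0) = 0, with the illustrative trial function psi_a(x) = x * exp(-a*x**2/2).
# For this trial function the Rayleigh quotient reduces (via Gaussian integrals) to
#   E(a) = 1.5 * (a + 1/a) + delta * 2 * Gamma(1/4) / sqrt(pi) * a**(5/4).
import numpy as np
from scipy.special import gamma
from scipy.optimize import minimize_scalar

def energy(a, delta):
    return 1.5 * (a + 1.0 / a) + delta * 2.0 * gamma(0.25) / np.sqrt(np.pi) * a ** 1.25

def variational_bound(delta):
    res = minimize_scalar(lambda a: energy(a, delta), bounds=(1e-3, 10.0), method="bounded")
    return res.fun, res.x

for d in (0.0, 0.1, 1.0):
    E, a = variational_bound(d)
    print(f"delta={d}: E <= {E:.4f} (a = {a:.3f})")   # delta = 0 recovers E = 3 exactly
```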

  8. a Baseline for Upper Crustal Velocity Variations Along the East Pacific Rise

    Science.gov (United States)

    Kappus, Mary Elizabeth

    Seismic measurements of the oceanic crust and theoretical models of its generation at mid-ocean ridges suggest several systematic variations in upper crustal velocity structure, but without constraints on the inherent variation in newly formed crust these suggestions remain tentative. The Wide Aperture Profiles (WAPs) which form the database for this study have sufficient horizontal extent and resolution in the upper crust to establish a zero-age baseline. After assessing the adequacy of amplitude preservation in several tau-p transform methods, we make a precise estimate of the velocity at the top of the crust from analysis of amplitudes in the tau-p domain. Along a 52-km segment we find less than 5% variation from 2.45 km/s. Velocity models of the uppermost crust are constructed using waveform inversion for both reflection and refraction arrivals. This method exploits the high quality of both primary and secondary phases and provides an objective process for iteratively improving trial models and for measuring misfit. The resulting models show remarkable homogeneity: on-axis variation is 5% or less within layers 2A and 2B, increasing to 10% at the sharp 2A/2B boundary. The extrusive volcanic layer is only 130 m thick along-axis and corresponds to the triangular-shaped neovolcanic zone. From this we infer that the sheeted dikes feeding the extrusive layer 2A come up to very shallow depths on axis. Along axis, a fourth-order deviation from axial linearity identified geochemically is observed as a small increase in thickness of the extrusive layer. Off-axis, the velocity increases only slightly to 2.49 km/s, while the thickness of the extrusives increases to 217 m, and the variability in both parameters increases with distance from the ridge axis. In a separate section we present the first published analysis of seismic records of thunder. We calculate multi-taper spectra to determine the peak energy in the lightning bolt and apply time-dependent polarization

  9. Conscientious objection in health care

    Directory of Open Access Journals (Sweden)

    Kuře Josef

    2016-12-01

    Full Text Available The paper deals with conscientious objection in health care, addressing the problems of scope, verification and limitation of such refusal, paying attention to ideological agendas hidden behind the right of conscience where the claimed refusal can cause harm or where such a claim is an attempt to impose certain moral values on society or an excuse for not providing health care. The nature of conscientious objection will be investigated and an ethical analysis of conscientious objection will be conducted. Finally some suggestions for health care policy will be proposed.

  10. Investigation, sensitivity analysis, and multi-objective optimization of effective parameters on temperature and force in robotic drilling cortical bone.

    Science.gov (United States)

    Tahmasbi, Vahid; Ghoreishi, Majid; Zolfaghari, Mojtaba

    2017-11-01

    The bone drilling process is very prominent in orthopedic surgeries and in the repair of bone fractures. It is also very common in dentistry and bone sampling operations. Due to the complexity of bone and the sensitivity of the process, bone drilling is one of the most important and sensitive processes in biomedical engineering. Orthopedic surgeries can be improved using robotic systems and mechatronic tools. The most crucial problem during drilling is an unwanted increase in process temperature (above 47 °C), which causes thermal osteonecrosis or cell death and local burning of the bone tissue. Moreover, imposing higher forces on the bone may lead to breaking or cracking and consequently cause serious damage. In this study, a mathematical second-order linear regression model as a function of tool drilling speed, feed rate, tool diameter, and their effective interactions is introduced to predict temperature and force during the bone drilling process. This model can determine the maximum speed of surgery that remains within an acceptable temperature range. Moreover, for the first time, using designed experiments, the bone drilling process was modeled, and the drilling speed, feed rate, and tool diameter were optimized. Then, using response surface methodology and applying a multi-objective optimization, drilling force was minimized to sustain an acceptable temperature range without damaging the bone or the surrounding tissue. In addition, for the first time, Sobol statistical sensitivity analysis is used to ascertain the effect of the process input parameters on process temperature and force. The results show that among all effective input parameters, tool rotational speed, feed rate, and tool diameter have the highest influence on process temperature and force, respectively. The behavior of each output parameter with variation in each input parameter is further investigated. Finally, a multi-objective optimization has been performed considering all the
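
    The abstract describes a second-order regression model in drilling speed, feed rate and tool diameter; as a hedged illustration, the sketch below fits such a quadratic response-surface model by ordinary least squares. The design points, temperatures and function names are placeholders, not the paper's experimental data.

```python
# Fitting a second-order (quadratic) response-surface model for process
# temperature as a function of drilling speed n, feed rate f and tool diameter d:
#   T ≈ b0 + b1*n + b2*f + b3*d + b4*n*f + b5*n*d + b6*f*d + b7*n^2 + b8*f^2 + b9*d^2
import numpy as np

def quadratic_features(X):
    """Expand rows [n, f, d] into the full second-order design matrix."""
    n, f, d = X.T
    return np.column_stack([np.ones(len(X)), n, f, d,
                            n * f, n * d, f * d,
                            n ** 2, f ** 2, d ** 2])

# Hypothetical design: (speed rpm, feed mm/min, diameter mm) -> temperature (°C)
X = np.array([[500, 20, 2.0], [500, 40, 3.5], [1000, 20, 3.5], [1000, 40, 2.0],
              [1500, 30, 2.5], [2000, 20, 2.0], [2000, 40, 3.5], [1500, 25, 3.0],
              [800, 35, 2.5], [1200, 30, 2.0], [600, 25, 3.0], [1800, 35, 2.5]])
T = np.array([38.0, 41.5, 43.0, 45.5, 47.0, 49.5, 55.0, 48.5, 42.0, 44.0, 40.0, 51.0])

coef, *_ = np.linalg.lstsq(quadratic_features(X), T, rcond=None)   # OLS fit

def predict_temperature(n, f, d):
    """Predicted temperature at an untested parameter combination."""
    return float(quadratic_features(np.array([[n, f, d]])) @ coef)

print(round(predict_temperature(1200, 30, 2.5), 1))
```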

  11. Anatomical Variations of the Circulus Arteriosus in Cadaveric Human Brains

    Science.gov (United States)

    Gunnal, S. A.; Farooqui, M. S.; Wabale, R. N.

    2014-01-01

    Objective. The circulus arteriosus/circle of Willis (CW) is a polygonal anastomotic channel at the base of the brain which unites the internal carotid and vertebrobasilar systems. It maintains a steady and constant supply to the brain. Variations of the CW are often seen. The aim of the present work is to find out the percentage of normal patterns of the CW and the frequency of its variations, and to study the morphological and morphometric aspects of all components of the CW. Methods. The circuli arteriosi of 150 formalin-preserved brains were dissected. Dimensions of all the components forming the circles were measured. Variations of all the segments were noted and photographed. Variations such as aplasia, hypoplasia, duplication, fenestrations, and differences in dimensions from the opposite segments were noted. The data collected in the study were analyzed. Results. Twenty-one different types of CW were found in the present study. A normal and complete CW was found in 60%. A CW with gross morphological variations was seen in 40%. Maximum variations were seen in the PCoA followed by the ACoA, in 50% and 40%, respectively. Conclusion. As this confirms a high percentage of variations, all surgical interventions should be preceded by angiography. Awareness of these anatomical variations is important in neurovascular procedures. PMID:24891951

  12. Joint analysis of short-period variations of ionospheric parameters in Siberia and the Far East and processes of the tropical cyclogenesis

    Science.gov (United States)

    Chernigovskaya, M. A.; Kurkin, V. I.; Orlov, I. I.; Sharkov, E. A.; Pokrovskaya, I. V.

    2009-04-01

    In this work, the possibility that strong meteorological disturbances in the Earth's lower atmosphere manifest themselves in variations of ionospheric parameters in a zone remote from the disturbance source has been studied. A spectral analysis of short-period variations (tens of minutes to hours) in the maximum observed frequencies (MOF) of one-hop oblique-sounding signals has been carried out. These variations were induced by changes in upper-atmosphere parameters along the Magadan-Irkutsk oblique-incidence sounding path, superimposed on the diurnal variation of the parameter under study. Data from MOF measurements taken approximately every 5 min during the equinoxes (September, March) of 2005-2007 were used. The analysis was made using an improved ISTP-developed technique for determining periodicities in time series. The increase of signal spectrum energy at certain frequencies is interpreted as a manifestation of traveling ionospheric disturbances (TIDs) associated with the propagation of internal gravity waves in the atmosphere. The analysis revealed TIDs on the temporal scales under consideration. The question of the localization of possible sources of the revealed disturbances is discussed. Tropospheric meteorological disturbances of enormous energy (tropical cyclones, typhoons) are considered as potential sources of the observed TIDs. The necessary information on tropical cyclones that occurred in the northern Indian Ocean and the south-west and central Pacific Ocean in 2005-2007 is taken from the electronic database of satellite data on global tropical cyclogenesis "Global-TC" (ISR RAS). In order to effectively separate disturbances associated with magnetospheric-ionospheric interaction from disturbances induced by the influence of the lower atmosphere on the upper atmosphere, we analyze tropical cyclogenesis events that occurred in quiet helio-geomagnetic conditions. The study was supported by the Program of RAS Presidium N 16 (Part 3) and the RFBR Grant N 08-05-00658.

  13. Photometric observations of nine Transneptunian objects and Centaurs

    Science.gov (United States)

    Hromakina, T.; Perna, D.; Belskaya, I.; Dotto, E.; Rossi, A.; Bisi, F.

    2018-02-01

    We present the results of photometric observations of six Transneptunian objects and three Centaurs, estimates of their rotational periods and corresponding amplitudes, and, for six of them, lower limits on their densities. All observations were made using the 3.6-m TNG telescope (La Palma, Spain). For four objects - (148975) 2001 XA255, (281371) 2008 FC76, (315898) 2008 QD4, and 2008 CT190 - the estimation of short-term variability was made for the first time. We confirm the rotation period values for two objects, (55636) 2002 TX300 and (202421) 2005 UQ513, and improve the precision of previously reported rotational periods for three others - (120178) 2003 OP32, (145452) 2005 RN43, (444030) 2004 NT33 - by using both our and literature data. We also discuss that small distant bodies, similar to asteroids in the Main belt, tend to have double-peaked rotational periods caused by their elongated shapes rather than by surface albedo variations.
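
    The record does not describe the period-search method; as a hedged illustration, a rotation period can be estimated from sparse photometry with a Lomb-Scargle periodogram, as sketched below. The frequency range and the doubling of the best period (for an elongated, shape-dominated lightcurve) are illustrative assumptions, not the authors' pipeline.

```python
# Rotation-period estimate from sparse photometry via a Lomb-Scargle periodogram.
import numpy as np
from astropy.timeseries import LombScargle

def rotation_period(t_jd, mag, mag_err):
    """Return candidate single- and double-peaked periods in hours."""
    ls = LombScargle(t_jd, mag, mag_err)
    freq, power = ls.autopower(minimum_frequency=1.0,     # cycles per day
                               maximum_frequency=12.0)
    best = freq[np.argmax(power)]
    p_single_h = 24.0 / best
    # an elongated body produces a double-peaked lightcurve, so the rotation
    # period is usually twice the best single-peak period
    return p_single_h, 2.0 * p_single_h

# t_jd, mag, mag_err would come from the reduced photometry of one object.
```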

  14. Super-Relaxed (η)-Proximal Point Algorithms, Relaxed (η)-Proximal Point Algorithms, Linear Convergence Analysis, and Nonlinear Variational Inclusions

    Directory of Open Access Journals (Sweden)

    Agarwal RaviP

    2009-01-01

    Full Text Available We glance at recent advances in the general theory of maximal (set-valued) monotone mappings and their role in examining convex programming and the closely related field of nonlinear variational inequalities. We focus mostly on applications of the super-relaxed (η)-proximal point algorithm to the context of solving a class of nonlinear variational inclusion problems, based on the notion of maximal (η)-monotonicity. Investigations highlighted in this communication are greatly influenced by the celebrated work of Rockafellar (1976), while others have played a significant part as well in generalizing the proximal point algorithm considered by Rockafellar (1976) to the case of the relaxed proximal point algorithm by Eckstein and Bertsekas (1992). Even for the linear convergence analysis of the overrelaxed (or super-relaxed) (η)-proximal point algorithm, the fundamental model for Rockafellar's case does the job. Furthermore, we attempt to explore possibilities of generalizing the Yosida regularization/approximation in light of maximal (η)-monotonicity, and then apply it to first-order evolution equations/inclusions.

  15. Evidence of increment of efficiency of the Mexican Stock Market through the analysis of its variations

    Science.gov (United States)

    Coronel-Brizio, H. F.; Hernández-Montoya, A. R.; Huerta-Quintanilla, R.; Rodríguez-Achach, M.

    2007-07-01

    It is well known that there exist statistical and structural differences between the stock markets of developed and emerging countries. In this work, in order to find out whether the efficiency of the Mexican Stock Market has been changing over time, we have performed and compared several analyses of the variations of the Mexican Stock Market index (IPC) and the Dow Jones Industrial Average index (DJIA) for different periods of their historical daily data. We have analyzed the returns autocorrelation function (ACF) and used detrended fluctuation analysis (DFA) to study return variations. We also analyze the volatility, mean value and standard deviation of both markets and compare their evolution. We conclude from the overall result of these studies that they show compelling evidence of an increase in the efficiency of the Mexican Stock Market over time. The data samples analyzed here correspond to daily values of the IPC and DJIA for the period 10/30/1978-02/28/2006.
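
    As a hedged illustration of the DFA step mentioned above, the sketch below implements the standard first-order detrended fluctuation analysis (DFA-1) of a return series; it is an independent implementation, not the authors' code, and the scale grid is an arbitrary choice.

```python
# Detrended fluctuation analysis (DFA-1) of daily returns. A scaling exponent
# alpha close to 0.5 indicates uncorrelated returns, consistent with market efficiency.
import numpy as np

def dfa_exponent(returns, scales=None):
    x = np.asarray(returns, dtype=float)
    y = np.cumsum(x - x.mean())                       # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
    F = []
    for s in scales:
        n_seg = len(y) // s
        segs = y[:n_seg * s].reshape(n_seg, s)
        t = np.arange(s)
        rms = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)              # linear detrending within each window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        F.append(np.mean(rms))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return alpha

# Synthetic uncorrelated returns: alpha should come out close to 0.5.
print(round(dfa_exponent(np.random.default_rng(0).normal(size=4000)), 2))
```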

  16. High Recharge Areas in the Choushui River Alluvial Fan (Taiwan) Assessed from Recharge Potential Analysis and Average Storage Variation Indexes

    Directory of Open Access Journals (Sweden)

    Jui-Pin Tsai

    2015-03-01

    Full Text Available High recharge areas significantly influence the groundwater quality and quantity in regional groundwater systems. Many studies have applied recharge potential analysis (RPA) to estimate groundwater recharge potential (GRP) and have delineated high recharge areas based on the estimated GRP. However, most of these studies define the RPA parameters by supposition, and this represents a major source of uncertainty in applying RPA. To define the RPA parameter values objectively, without supposition, this study proposes a systematic method based on the theory of parameter identification. A surrogate variable, namely the average storage variation (ASV) index, is developed to calibrate the RPA parameters, because of the lack of direct GRP observations. The study results show that the correlations between the ASV indexes and computed GRP values improved from 0.67 before calibration to 0.85 after calibration, indicating that the calibrated RPA parameters represent the recharge characteristics of the study area well; these data also highlight how defining the RPA parameters with ASV indexes can help to improve the accuracy. The calibrated RPA parameters were used to estimate the GRP distribution of the study area, and the GRP values were graded into five levels. Areas at the high and excellent levels are defined as high recharge areas, which composed 7.92% of the study area. Overall, this study demonstrates that the developed approach can objectively define the RPA parameters and high recharge areas of the Choushui River alluvial fan, and the results should serve as valuable references for the Taiwanese government in its efforts to conserve the groundwater quality and quantity of the study area.

  17. Analysis on the variation of medial rotation values according to the position of the humeral diaphysis

    Science.gov (United States)

    Miyazaki, Alberto Naoki; Fregoneze, Marcelo; Santos, Pedro Doneux; da Silva, Luciana Andrade; do Val Sella, Guilherme; Cohen, Carina; Busin Giora, Taís Stedile; Checchia, Sergio Luiz; Raia, Fabio; Pekelman, Hélio; Cymrot, Raquel

    2012-01-01

    To analyze the validity of measurements of medial rotation (MR) of the shoulder, using vertebral levels, according to the variation in the position of the humeral diaphysis, and to test the bi-goniometer as a new measuring instrument. 140 shoulders (70 patients) were prospectively evaluated in cases presenting unilateral shoulder MR limitation. The vertebral level was evaluated by means of a visual scale and was correlated with the angle obtained according to the position of the humeral diaphysis, using the bi-goniometer developed with the Department of Mechanical Engineering of Mackenzie University. The maximum vertebral level reached through MR on the unaffected side ranged from T3 to T12, and on the affected side, from T6 to the trochanter. Repositioning of the affected limb in MR according to the angular values on the normal side showed that 57.13% of the patients reached lower levels, between the sacrum, gluteus and trochanter. From analysis on the maximum vertebral level attained and the variation between the affected angle x (frontal plane: abduction and MR of the shoulder) and the unaffected angle x in MR, we observed that the greater the angle of the diaphyseal axis was, the lower the variation in the vertebral level attained was. From evaluating the linear correlation between the variables of difference in maximum vertebral level reached and variation in the affected angle y (extension and abduction of the shoulder) and the unaffected angle y in MR, we observed that there was no well-established linear relationship between these variables. Measurement of MR using vertebral levels does not correspond to the real values, since it varies according to the positioning of the humeral diaphysis.

  18. Sub-Hour X-Ray Variability of High-Energy Peaked BL Lacertae Objects

    OpenAIRE

    Bidzina Kapanadze

    2018-01-01

    The study of multi-wavelength flux variability in BL Lacertae objects is very important to discern unstable processes and emission mechanisms underlying their extreme observational features. While the innermost regions of these objects are not accessible from direct observations, we may draw conclusions about their internal structure via the detection of flux variations on various timescales, based on the light-travel argument. In this paper, we review the sub-hour X-ray variability in high-e...

  19. Visual object tracking by correlation filters and online learning

    Science.gov (United States)

    Zhang, Xin; Xia, Gui-Song; Lu, Qikai; Shen, Weiming; Zhang, Liangpei

    2018-06-01

    Due to the complexity of background scenarios and the variation of target appearance, it is difficult to achieve high accuracy and fast speed in object tracking. Currently, correlation filter based trackers (CFTs) show promising performance in object tracking. CFTs estimate the target's position by correlation filters with different kinds of features. However, most CFTs can hardly re-detect the target in the case of long-term tracking drift. In this paper, a feature-integration object tracker named correlation filters and online learning (CFOL) is proposed. CFOL estimates the target's position and its corresponding correlation score using the same discriminative correlation filter with multiple features. To reduce tracking drift, a new sampling and updating strategy for online learning is proposed. Experiments conducted on 51 image sequences demonstrate that the proposed algorithm is superior to state-of-the-art approaches.
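
    As a hedged illustration of the correlation-filter principle underlying such trackers, the sketch below implements a minimal MOSSE-style filter with an online running-average update; CFOL's actual feature integration and sampling strategy are not described in enough detail here to reproduce, so everything below is an independent simplification.

```python
# Minimal MOSSE-style correlation filter tracker sketch (illustrative only).
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peak at the patch centre."""
    h, w = shape
    y, x = np.mgrid[:h, :w]
    return np.exp(-(((x - w // 2) ** 2 + (y - h // 2) ** 2) / (2 * sigma ** 2)))

class CorrelationFilterTracker:
    def __init__(self, lam=1e-2, lr=0.125):
        self.lam, self.lr = lam, lr        # regularizer and online learning rate

    def init(self, patch):
        """`patch` is the grayscale template cut around the target."""
        self.G = np.fft.fft2(gaussian_response(patch.shape))
        F = np.fft.fft2(patch)
        self.A = self.G * np.conj(F)                           # filter numerator
        self.B = F * np.conj(F) + self.lam                     # filter denominator

    def update(self, patch):
        """Locate the target in a new patch and adapt the filter online."""
        F = np.fft.fft2(patch)
        response = np.real(np.fft.ifft2((self.A / self.B) * F))
        dy, dx = np.unravel_index(np.argmax(response), response.shape)
        # online learning: running average of numerator and denominator
        self.A = (1 - self.lr) * self.A + self.lr * self.G * np.conj(F)
        self.B = (1 - self.lr) * self.B + self.lr * (F * np.conj(F) + self.lam)
        return dy, dx, response.max()      # peak location and correlation score
```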

  20. Precise Object Tracking under Deformation

    International Nuclear Information System (INIS)

    Saad, M.H.

    2010-01-01

    Precise object tracking is an essential issue in several serious applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems, satellite imaging, etc. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is an attempt to solve these serious problems in visual object tracking and thereby improve the quality of the overall system. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by OLS. The framework presents a robust ranging technique to track a visual target instead of the traditional expensive ranging sensors. The presented research work is applied to real video streams and achieves highly precise results.
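
    The abstract mentions predicting the object's future location with an FIR model learned by OLS but gives no details; the sketch below shows one plausible per-coordinate formulation, with the model order and data entirely illustrative assumptions.

```python
# One-step-ahead position prediction with a finite impulse response (FIR) model
# fitted by ordinary least squares (OLS). Model order and sample track are
# placeholders for illustration.
import numpy as np

def fit_fir_predictor(track, order=4):
    """track: (T,) past positions of one coordinate. Returns FIR coefficients."""
    T = len(track)
    X = np.column_stack([track[i:T - order + i] for i in range(order)])  # lagged inputs
    y = track[order:]                                                     # next positions
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)                          # OLS fit
    return coef

def predict_next(track, coef):
    order = len(coef)
    return float(track[-order:] @ coef)

# Example: fit a separate FIR predictor to the x-centroid history of a track.
xs = np.array([10.0, 11.2, 12.1, 13.4, 14.3, 15.6, 16.4, 17.7])
coef_x = fit_fir_predictor(xs)
print(predict_next(xs, coef_x))
```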