WorldWideScience

Sample records for objective analysis precision

  1. Precise object tracking under deformation

    Saad, M.H

    2010-01-01

    Precise object tracking is an essential issue in several demanding applications, such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems and satellite imaging. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. The research is an attempt to solve these problems in visual object tracking and thereby improve the quality of the overall system. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and to predict its future location based on an FIR model learned by ordinary least squares (OLS). The framework also presents a robust ranging technique for tracking a visual target in place of traditional, expensive ranging sensors. The work is applied to real video streams and achieves high-precision results.

  2. Precise Object Tracking under Deformation

    Saad, M.H.

    2010-01-01

    Precise object tracking is an essential issue in several demanding applications, such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems and satellite imaging. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. The research is an attempt to solve these problems in visual object tracking and thereby improve the quality of the overall system. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and to predict its future location based on an FIR model learned by ordinary least squares (OLS). The framework also presents a robust ranging technique for tracking a visual target in place of traditional, expensive ranging sensors. The work is applied to real video streams and achieves high-precision results.

  3. Analysis of Precision of Activation Analysis Method

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    Specifying the precision of an activation-analysis method requires an estimate of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T...
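
    A minimal sketch (in Python, not from the paper) of the duplicate-based test the abstract alludes to: if each result carries an a priori standard deviation, the statistic T compares the observed squared differences of duplicate pairs with their predicted variances and is referred to a chi-square distribution. The data values below are invented for illustration.

    ```python
    import numpy as np
    from scipy.stats import chi2

    # Hypothetical duplicate results and their a priori standard deviations
    x1 = np.array([10.2, 5.1, 7.8, 3.3])
    x2 = np.array([9.9, 5.4, 7.5, 3.6])
    s1 = np.array([0.2, 0.2, 0.3, 0.15])
    s2 = np.array([0.2, 0.2, 0.3, 0.15])

    # Each difference d_i has predicted variance s1_i^2 + s2_i^2; under the
    # hypothesis of statistical control, T follows a chi-square with n d.o.f.
    d = x1 - x2
    T = np.sum(d**2 / (s1**2 + s2**2))
    n = len(d)
    print(f"T = {T:.2f}, dof = {n}, p = {chi2.sf(T, df=n):.3f}")
    ```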

  4. Precise Analysis of String Expressions

    Christensen, Aske Simon; Møller, Anders; Schwartzbach, Michael Ignatieff

    2003-01-01

    We perform static analysis of Java programs to answer a simple question: which values may occur as results of string expressions? The answers are summarized for each expression by a regular language that is guaranteed to contain all possible values. We present several applications of this analysis, including statically checking the syntax of dynamically generated expressions, such as SQL queries. Our analysis constructs flow graphs from class files and generates a context-free grammar with a nonterminal for each string expression. The language of this grammar is then widened into a regular language... We present extensive benchmarks demonstrating that the analysis is efficient and produces results of useful precision.

  5. A linear actuator for precision positioning of dual objects

    Peng, Yuxin; Cao, Jie; Guo, Zhao; Yu, Haoyong

    2015-01-01

    In this paper, a linear actuator for precision positioning of dual objects is proposed based on a double friction drive principle using a single piezoelectric element (PZT). The linear actuator consists of an electromagnet and a permanent magnet, which are connected by the PZT. The electromagnet serves as object 1, and another object (object 2) is attached to the permanent magnet by the magnetic force. For positioning the dual objects independently, two different friction drive modes can be alternated by on–off control of the electromagnet. When the electromagnet is released from the guide way, it can be driven by the impact friction force generated by the PZT. Otherwise, when the electromagnet clamps onto the guide way and remains stationary, object 2 can be driven based on the principle of smooth impact friction drive. A prototype was designed and constructed, and experiments were carried out to test the basic performance of the actuator. It has been verified that, with a compact size of 31 mm (L) × 12 mm (W) × 8 mm (H), the two objects can achieve long strokes on the order of several millimeters and high resolutions of several tens of nanometers. Since the proposed actuator allows independent movement of two objects by a single PZT, the actuator has the potential to be constructed compactly. (paper)

  6. Functional Object Analysis

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we will focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  7. The neural basis of precise visual short-term memory for complex recognisable objects.

    Veldsman, Michele; Mitchell, Daniel J; Cusack, Rhodri

    2017-10-01

    Recent evidence suggests that visual short-term memory (VSTM) capacity estimated using simple objects, such as colours and oriented bars, may not generalise well to more naturalistic stimuli. More visual detail can be stored in VSTM when complex, recognisable objects are maintained compared to simple objects. It is not yet known if it is recognisability that enhances memory precision, nor whether maintenance of recognisable objects is achieved with the same network of brain regions supporting maintenance of simple objects. We used a novel stimulus generation method to parametrically warp photographic images along a continuum, allowing separate estimation of the precision of memory representations and the number of items retained. The stimulus generation method was also designed to create unrecognisable, though perceptually matched, stimuli, to investigate the impact of recognisability on VSTM. We adapted the widely-used change detection and continuous report paradigms for use with complex, photographic images. Across three functional magnetic resonance imaging (fMRI) experiments, we demonstrated greater precision for recognisable objects in VSTM compared to unrecognisable objects. This clear behavioural advantage was not the result of recruitment of additional brain regions, or of stronger mean activity within the core network. Representational similarity analysis revealed greater variability across item repetitions in the representations of recognisable, compared to unrecognisable complex objects. We therefore propose that a richer range of neural representations support VSTM for complex recognisable objects. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Numerical Analysis Objects

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.

  9. Per Object statistical analysis

    2008-01-01

    ... of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  10. Efficient Tracking of Moving Objects with Precision Guarantees

    Civilis, Alminas; Jensen, Christian Søndergaard; Nenortaite, Jovita

    2004-01-01

    Sustained advances in wireless communications, geo-positioning, and consumer electronics pave the way to a kind of location-based service that relies on the tracking of the continuously changing positions of an entire population of service users. This type of service is characterized by large...... an object is moving. Empirical performance studies based on a real road network and GPS logs from cars are reported....

  11. Chemical Shift Imaging (CSI) by precise object displacement

    Leclerc, Sebastien; Trausch, Gregory; Cordier, Benoit; Grandclaude, Denis; Retournard, Alain; Fraissard, Jacques; Canet, Daniel

    2006-01-01

    A mechanical device (NMR lift) has been built for vertically displacing an object (typically an NMR sample tube) inside the NMR probe with an accuracy of 1 μm. A series of single-pulse experiments is performed for incremented vertical positions of the sample. With a sufficiently spatially selective rf field, one obtains chemical shift information along the displacement direction (one-dimensional Chemical Shift Imaging – CSI). Knowing the vertical radio-frequency (rf) f...

  12. Evidence of gradual loss of precision for simple features and complex objects in visual working memory.

    Rademaker, Rosanne L; Park, Young Eun; Sack, Alexander T; Tong, Frank

    2018-03-01

    Previous studies have suggested that people can maintain prioritized items in visual working memory for many seconds, with negligible loss of information over time. Such findings imply that working memory representations are robust to the potential contaminating effects of internal noise. However, once visual information is encoded into working memory, one might expect it to inevitably begin degrading over time, as this actively maintained information is no longer tethered to the original perceptual input. Here, we examined this issue by evaluating working memory for single central presentations of an oriented grating, color patch, or face stimulus, across a range of delay periods (1, 3, 6, or 12 s). We applied a mixture-model analysis to distinguish changes in memory precision over time from changes in the frequency of outlier responses that resemble random guesses. For all 3 types of stimuli, participants exhibited a clear and consistent decline in the precision of working memory as a function of temporal delay, as well as a modest increase in guessing-related responses for colored patches and face stimuli. We observed a similar loss of precision over time while controlling for temporal distinctiveness. Our results demonstrate that visual working memory is far from lossless: while basic visual features and complex objects can be maintained in a quite stable manner over time, these representations are still subject to noise accumulation and complete termination. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
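
    The mixture-model analysis described above is commonly implemented, for circular feature spaces such as orientation or colour angle, as a mixture of a von Mises distribution (precise memory responses) and a uniform distribution (random guesses); the fitted concentration parameter indexes precision and the mixture weight indexes guessing. The Python sketch below, using invented error data, illustrates that general approach by maximum likelihood; it is not the authors' code or data.

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import i0  # modified Bessel function of order 0

    def neg_log_likelihood(params, errors):
        """Mixture of a von Mises (memory) and a uniform (guess) on [-pi, pi)."""
        kappa, p_mem = params
        vm = np.exp(kappa * np.cos(errors)) / (2 * np.pi * i0(kappa))
        lik = p_mem * vm + (1 - p_mem) / (2 * np.pi)
        return -np.sum(np.log(lik))

    # Invented response errors in radians (report minus target)
    rng = np.random.default_rng(0)
    errors = np.concatenate([rng.vonmises(0.0, 8.0, size=80),      # remembered trials
                             rng.uniform(-np.pi, np.pi, size=20)])  # guesses

    res = minimize(neg_log_likelihood, x0=[2.0, 0.5], args=(errors,),
                   bounds=[(0.01, 100.0), (0.01, 0.99)])
    kappa_hat, p_mem_hat = res.x
    print(f"memory precision (kappa) = {kappa_hat:.2f}, P(in memory) = {p_mem_hat:.2f}")
    ```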

  13. Accuracy and precision in activation analysis: counting

    Becker, D.A.

    1974-01-01

    Accuracy and precision in activation analysis were investigated with regard to the counting of induced radioactivity. The various parameters discussed include configuration, positioning, density, homogeneity, intensity, radioisotopic purity, peak integration, and nuclear constants. Experimental results are presented for many of these parameters. The results obtained indicate that counting errors often contribute significantly to the inaccuracy and imprecision of analyses. The magnitude of these errors ranges from less than 1 percent to 10 percent or more in many cases.

  14. Multiple-objective optimization in precision laser cutting of different thermoplastics

    Tamrin, K. F.; Nukman, Y.; Choudhury, I. A.; Shirley, S.

    2015-04-01

    Thermoplastics are increasingly being used in the biomedical, automotive and electronics industries due to their excellent physical and chemical properties. Because laser cutting is a localized, non-contact process, it can produce a precise cut with a small heat-affected zone (HAZ). Precision laser cutting of various materials is important in high-volume manufacturing processes to minimize operational cost, reduce errors and improve product quality. This study uses grey relational analysis to determine a single optimized set of cutting parameters for three different thermoplastics. The set of optimized processing parameters is determined based on the highest relational grade and was found at low laser power (200 W), high cutting speed (0.4 m/min) and low compressed air pressure (2.5 bar), which matches the objective set in the present study. Analysis of variance (ANOVA) is then carried out to ascertain the relative influence of process parameters on the cutting characteristics. It was found that the laser power has the dominant effect on HAZ for all thermoplastics.
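
    As a rough illustration of the grey relational analysis step mentioned above (not the authors' data or exact procedure), the Python sketch below normalises hypothetical "smaller is better" responses, computes grey relational coefficients against the ideal sequence, and ranks parameter settings by their grey relational grade.

    ```python
    import numpy as np

    # Hypothetical responses for 4 parameter settings:
    # columns = [HAZ width, kerf width], both "smaller is better"
    responses = np.array([[0.30, 0.20],
                          [0.25, 0.22],
                          [0.40, 0.18],
                          [0.28, 0.25]])

    # Normalise each response to [0, 1] with the "smaller is better" criterion
    norm = (responses.max(axis=0) - responses) / (responses.max(axis=0) - responses.min(axis=0))

    # Deviation from the ideal sequence (all ones) and grey relational coefficients
    delta = 1.0 - norm
    zeta = 0.5  # distinguishing coefficient, conventionally 0.5
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    # Grey relational grade = mean coefficient per setting; the highest grade wins
    grade = coeff.mean(axis=1)
    print("grades:", np.round(grade, 3), "-> best setting index:", int(grade.argmax()))
    ```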

  15. A Mission Planning Approach for Precision Farming Systems Based on Multi-Objective Optimization

    Zhaoyu Zhai

    2018-06-01

    As the demand for food grows continuously, intelligent agriculture has drawn much attention due to its capability of producing great quantities of food efficiently. The main purpose of intelligent agriculture is to plan agricultural missions properly and to use limited resources reasonably with minor human intervention. This paper proposes a Precision Farming System (PFS) as a Multi-Agent System (MAS). Components of the PFS are treated as agents with different functionalities. These agents can form several coalitions to complete complex agricultural missions cooperatively. In the PFS, mission planning should consider several criteria, such as expected benefit, energy consumption or equipment loss. Hence, mission planning can be treated as a Multi-objective Optimization Problem (MOP). In order to solve the MOP, an improved algorithm, MP-PSOGA, is proposed, taking advantage of genetic algorithms and particle swarm optimization. A simulation of a precise pesticide-spraying mission is performed to verify the feasibility of the proposed approach. Simulation results illustrate that the proposed approach works properly and enables the PFS to plan missions and allocate scarce resources efficiently. The theoretical analysis and simulation are a good foundation for future study. Once the proposed approach is applied to a real scenario, it is expected to bring significant economic improvement.
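
    MP-PSOGA itself is not specified in this record, but the general idea of casting mission planning as a multi-objective optimization problem and solving a scalarised version of it with particle swarm optimization can be sketched as follows. The objectives, decision encoding and all parameter values are invented placeholders, and the weighted-sum scalarisation is only one simple way to handle the multiple objectives.

    ```python
    import numpy as np

    # Two hypothetical, conflicting mission objectives over a decision vector x
    # (x could, for instance, encode spraying effort for several field plots).
    def expected_benefit(x):   # to be maximised
        return np.sum(np.sqrt(x))

    def energy_cost(x):        # to be minimised
        return np.sum(x**2)

    def scalarised(x, w=0.6):
        # Weighted-sum scalarisation turns the MOP into a single objective
        return -w * expected_benefit(x) + (1 - w) * energy_cost(x)

    rng = np.random.default_rng(1)
    dim, n_particles, iters = 5, 30, 200
    pos = rng.uniform(0, 1, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([scalarised(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()

    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, 1)
        vals = np.array([scalarised(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()

    print("best decision vector:", np.round(gbest, 3))
    ```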

  16. Objective-oriented financial analysis introduction

    Dessislava Kostova – Pickett

    2018-02-01

    The practice of financial analysis has been immeasurably strengthened in recent years thanks to the ongoing evolution of computerized approaches in the form of spreadsheets and computer-based financial models of different types. These devices have not only relieved the analyst's computing task, but also opened up a wide range of analyses and sensitivity studies of alternatives that were not previously possible. The main potential of object-oriented financial analysis consists in enormously expanding the analyst's capabilities through an online knowledge and information interface that has not yet been achieved through existing methods and software packages.

  17. High precision spectrophotometric analysis of thorium

    Palmieri, H.E.L.

    1984-01-01

    An accurate and precise determination of thorium is proposed. A precision of about 0.1% is required for the determination of macroquantities of thorium when processed. After an extensive literature search on this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance. Here, instead, a non-linear least-squares fit was used, based on the Fletcher and Powell minimization process and a computer programme. Besides the equivalence point, other titration parameters were determined: the indicator concentration, the absorbance of the metal-indicator complex, and the stability constants of the metal-indicator and metal-EDTA complexes. (Author) [pt]

  18. Thorium spectrophotometric analysis with high precision

    Palmieri, H.E.L.

    1983-06-01

    An accurate and precise determination of thorium is proposed. A precision of about 0.1% is required for the determination of macroquantities of processed thorium. After an extensive literature search on this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance. Here, instead, a non-linear least-squares fit was used, based on the Fletcher and Powell minimization process and a computer program. (author)

  19. Quantization and training of object detection networks with low-precision weights and activations

    Yang, Bo; Liu, Jian; Zhou, Li; Wang, Yun; Chen, Jie

    2018-01-01

    As convolutional neural networks have demonstrated state-of-the-art performance in object recognition and detection, there is a growing need for deploying these systems on resource-constrained mobile platforms. However, the computational burden and energy consumption of inference for these networks are significantly higher than what most low-power devices can afford. To address these limitations, this paper proposes a method to train object detection networks with low-precision weights and activations. The probability density functions of the weights and activations of each layer are first directly estimated using piecewise Gaussian models. Then, the optimal quantization intervals and step sizes for each convolution layer are adaptively determined according to the distribution of the weights and activations. As the most computationally expensive convolutions can be replaced by effective fixed-point operations, the proposed method can drastically reduce computation complexity and memory footprint. Applied to the tiny YOLO (you only look once) and YOLO architectures, the proposed method achieves accuracy comparable to their 32-bit counterparts. As an illustration, the proposed 4-bit and 8-bit quantized versions of the YOLO model achieve a mean average precision (mAP) of 62.6% and 63.9%, respectively, on the Pascal visual object classes 2012 test dataset. The mAP of the 32-bit full-precision baseline model is 64.0%.
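
    A highly simplified sketch of the per-layer uniform quantization idea described above (not the authors' piecewise-Gaussian procedure): choose a clipping range from the empirical weight distribution, derive a step size for the chosen bit width, and map each weight to the nearest quantization level.

    ```python
    import numpy as np

    def quantize_weights(w, n_bits=4, coverage=0.999):
        """Uniform symmetric quantizer; the clipping range is taken from the
        empirical weight distribution rather than the raw min/max."""
        # Clip at a high quantile so rare outliers do not blow up the step size
        w_max = np.quantile(np.abs(w), coverage)
        n_levels = 2 ** (n_bits - 1) - 1           # symmetric signed levels
        step = w_max / n_levels
        q = np.clip(np.round(w / step), -n_levels, n_levels)
        return q * step, step                       # dequantized weights and step size

    # Hypothetical convolution-layer weights
    rng = np.random.default_rng(42)
    w = rng.normal(0.0, 0.05, size=10_000)
    w_q, step = quantize_weights(w, n_bits=4)
    print(f"step = {step:.4f}, mean squared error = {np.mean((w - w_q) ** 2):.2e}")
    ```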

  20. Object properties and cognitive load in the formation of associative memory during precision lifting.

    Li, Yong; Randerath, Jennifer; Bauer, Hans; Marquardt, Christian; Goldenberg, Georg; Hermsdörfer, Joachim

    2009-01-03

    When we manipulate familiar objects in our daily life, our grip force anticipates the physical demands right from the moment of contact with the object, indicating the existence of a memory for relevant object properties. This study explores the formation and consolidation of the memory processes that associate either familiar (size) or arbitrary (color) object features with object weight. In the general task, participants repetitively lifted two differently weighted objects (580 and 280 g) in a pseudo-random order. Forty young healthy adults participated in this study and were randomly distributed into four groups: Color Cue Single task (CCS, blue and red, 9.8³ cm³), Color Cue Dual task (CCD), No Cue (NC) and Size Cue (SC, 9.8³ and 6³ cm³). All groups performed a repetitive precision grasp-lift task and were retested with the same protocol after a 5-min pause. The CCD group was also required to perform a memory task simultaneously during each lift of the differently weighted objects coded by color. The results show that groups lifting objects with arbitrary or familiar features successfully formed the association between object weight and the manipulated object features and incorporated it into grip force programming, as observed in the different scaling of grip force and grip force rate for different object weights. An arbitrary feature, i.e., color, can be associated with object weight, although less strongly than the familiar feature of size. The simultaneous memory task impaired anticipatory force scaling during repetitive object lifting but did not jeopardize the learning process or the consolidation of the associative memory.

  1. PRECISE - pregabalin in addition to usual care: Statistical analysis plan

    S. Mathieson (Stephanie); L. Billot (Laurent); C. Maher (Chris); A.J. McLachlan (Andrew J.); J. Latimer (Jane); B.W. Koes (Bart); M.J. Hancock (Mark J.); I. Harris (Ian); R.O. Day (Richard O.); J. Pik (Justin); S. Jan (Stephen); C.-W.C. Lin (Chung-Wei Christine)

    2016-01-01

    Background: Sciatica is a severe, disabling condition that lacks high quality evidence for effective treatment strategies. This a priori statistical analysis plan describes the methodology of analysis for the PRECISE study. Methods/design: PRECISE is a prospectively registered, double...

  2. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder

    Chen, Shuoyang; Xu, Tingfa; Li, Daqun; Zhang, Jizhou; Jiang, Shenwang

    2016-01-01

    During the process of moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods, such as “frame difference” and “optical flow”, may not be able to deal with the problem very well. In such scenarios, we use a modified algorithm to do the background modeling work. In this paper, we use edge detection to obtain an edge difference image in order to enhance robustness to illumination variation. Then we use a “multi-block temporal-analyzing LBP (Local Binary Pattern)” algorithm to do the segmentation. In the end, a connected component is used to locate the object. We also produce a hardware platform, the core of which consists of the DSP (Digital Signal Processor) and FPGA (Field Programmable Gate Array) platforms and the high-precision intelligent holder. PMID:27775671

  3. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder

    Shuoyang Chen

    2016-10-01

    During the process of moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods, such as “frame difference” and “optical flow”, may not be able to deal with the problem very well. In such scenarios, we use a modified algorithm to do the background modeling work. In this paper, we use edge detection to obtain an edge difference image in order to enhance robustness to illumination variation. Then we use a “multi-block temporal-analyzing LBP (Local Binary Pattern)” algorithm to do the segmentation. In the end, a connected component is used to locate the object. We also produce a hardware platform, the core of which consists of the DSP (Digital Signal Processor) and FPGA (Field Programmable Gate Array) platforms and the high-precision intelligent holder.
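
    A minimal sketch of the basic 8-neighbour LBP code that this family of background-modelling methods builds on (illustrative only; the paper's multi-block temporal-analyzing variant aggregates such codes over blocks and over time, which is not reproduced here).

    ```python
    import numpy as np

    def lbp_8neighbour(gray):
        """Basic 3x3 local binary pattern: each neighbour contributes one bit
        depending on whether it is >= the centre pixel."""
        g = gray.astype(np.int32)
        c = g[1:-1, 1:-1]
        # Offsets of the 8 neighbours, clockwise from the top-left corner
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        code = np.zeros_like(c)
        for bit, (dy, dx) in enumerate(offsets):
            neighbour = g[1 + dy:g.shape[0] - 1 + dy, 1 + dx:g.shape[1] - 1 + dx]
            code |= ((neighbour >= c).astype(np.int32) << bit)
        return code  # values in [0, 255], one code per interior pixel

    # Hypothetical grayscale frame
    frame = np.random.default_rng(0).integers(0, 256, size=(120, 160))
    codes = lbp_8neighbour(frame)
    print(codes.shape, codes.min(), codes.max())
    ```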

  4. Finger pressure adjustments to various object configurations during precision grip in humans and monkeys.

    Viaro, Riccardo; Tia, Banty; Coudé, Gino; Canto, Rosario; Oliynyk, Andriy; Salmas, Paola; Masia, Lorenzo; Sandini, Giulio; Fadiga, Luciano

    2017-06-01

    In this study, we recorded the pressure exerted onto an object by the index finger and the thumb of the preferred hand of 18 human subjects and of either hand of two macaque monkeys during a precision grasping task. The to-be-grasped object was a custom-made device composed of two plates which could be variably oriented by a motorized system while keeping the size, and thus the grip dimension, constant. The to-be-grasped plates were covered by an array of capacitive sensors to measure specific features of finger adaptation, namely pressure intensity and centroid location and displacement. Kinematic measurements demonstrated that, for human subjects and for monkeys, different plate configurations did not affect wrist velocity and grip aperture during the reaching phase. Consistently, at the instant of finger-plate contact, pressure centroids were clustered around the same point for all handle configurations. However, small pressure centroid displacements were specifically adopted for each configuration, indicating that both humans and monkeys can display finger adaptation during precision grip. Moreover, humans applied stronger thumb pressure, performed less centroid displacement and required reduced adjustment time compared to monkeys. These pressure patterns remained similar when different load forces were required to pull the handle, as ascertained by additional measurements in humans. The present findings indicate that, although humans and monkeys share common features in the motor control of grasping, they differ in the adjustment of fingertip pressure, probably because of skill and/or morphology divergences. Such a precision grip device may form the groundwork for future studies on prehension mechanisms. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  5. Integrated modeling and analysis methodology for precision pointing applications

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  6. Precision tomographic analysis of reactor fuels

    Lee, Yong Deok; Lee, Chang Hee; Kim, Jong Soo; Jeong, Jwong Hwan; Nam, Ki Yong

    2001-03-01

    For the tomographic assay, a survey of the current status, analysis of neutron beam characteristics, MCNP code simulation, sim-fuel fabrication, a neutron experiment with sim-fuel, and the design of a multi-axis operation system were carried out. In the sensitivity simulation, the reconstruction results showed good agreement. The scoping test at ANL was also very helpful for the actual assay. The results are therefore applied to the HANARO tomographic system setup and to subsequent research.

  7. Precision tomographic analysis of reactor fuels

    Lee, Yong Deok; Lee, Chang Hee; Kim, Jong Soo; Jeong, Jwong Hwan; Nam, Ki Yong

    2001-03-01

    For the tomographic assay, a survey of the current status, analysis of neutron beam characteristics, MCNP code simulation, sim-fuel fabrication, a neutron experiment with sim-fuel, and the design of a multi-axis operation system were carried out. In the sensitivity simulation, the reconstruction results showed good agreement. The scoping test at ANL was also very helpful for the actual assay. The results are therefore applied to the HANARO tomographic system setup and to subsequent research.

  8. [Refractive precision and objective quality of vision after toric lens implantation in cataract surgery].

    Debois, A; Nochez, Y; Bezo, C; Bellicaud, D; Pisella, P-J

    2012-10-01

    To study the efficacy and predictability of toric IOL implantation for correction of preoperative corneal astigmatism by analysing spherocylindrical refractive precision and objective quality of vision. Prospective study of 13 eyes undergoing micro-incisional cataract surgery through a 1.8 mm corneal incision with toric IOL implantation (Lentis L313T(®), Oculentis) to treat over one diopter (D) of preoperative corneal astigmatism. Preoperative evaluation included keratometry, subjective refraction, and total and corneal aberrometry (KR-1(®), Topcon). Six months postoperatively, measurements included slit lamp photography, documenting IOL rotation, tilt or decentration, uncorrected visual acuity, best-corrected visual acuity and objective quality of vision measurement (OQAS(®), Visiometrics, Spain). Postoperatively, mean uncorrected distance visual acuity was 8.33/10 ± 1.91 (0.09 ± 0.11 logMAR). Mean postoperative refractive sphere was 0.13 ± 0.73 diopters. Mean refractive astigmatism was -0.66 ± 0.56 diopters with corneal astigmatism of 2.17 ± 0.68 diopters. Mean IOL rotation was 4.4° ± 3.6° (range 0° to 10°). Mean rotation of this IOL at 6 months was less than 5°, demonstrating stability of the optic within the capsular bag. Objective quality of vision measurements were consistent with subjective uncorrected visual acuity. Implantation of the L313T(®) IOL is safe and effective for correction of corneal astigmatism in 1.8 mm micro-incisional cataract surgery. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  9. Precision-Recall-Gain Curves: PR Analysis Done Right

    Flach, Peter; Kull, Meelis

    2015-01-01

    Precision-Recall analysis abounds in applications of binary classification where true negatives do not add value and hence should not affect assessment of the classifier's performance. Perhaps inspired by the many advantages of receiver operating characteristic (ROC) curves and the area under such curves for accuracy-based performance assessment, many researchers have taken to report Precision-Recall (PR) curves and associated areas as performance metric. We demonstrate in this paper that thi...
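
    The core of the Precision-Recall-Gain framework is a transformation of precision and recall onto a gain scale relative to the baseline pi, the proportion of positives, so that the always-positive classifier scores zero. A short sketch of that transformation with an invented contingency example follows; it is a simplified illustration of the paper's construction, not its full analysis of areas and F-scores.

    ```python
    def precision_gain(prec, pi):
        # Gain relative to the baseline precision pi of an always-positive classifier
        return (prec - pi) / ((1 - pi) * prec)

    def recall_gain(rec, pi):
        return (rec - pi) / ((1 - pi) * rec)

    # Invented example: 100 positives out of 1000 examples; the classifier
    # retrieves 120 examples, of which 60 are true positives.
    pi = 100 / 1000
    prec = 60 / 120
    rec = 60 / 100
    print(f"precision gain = {precision_gain(prec, pi):.3f}, "
          f"recall gain = {recall_gain(rec, pi):.3f}")
    ```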

  10. System and method for high precision isotope ratio destructive analysis

    Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R

    2013-07-02

    A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).

  11. Optimization to improve precision in neutron activation analysis

    Yustina Tri Handayani

    2010-01-01

    The level of precision or accuracy required in an analysis should satisfy the general requirements and customer needs. In presenting the results of the analysis, the level of precision is expressed as an uncertainty. The general requirement is the Horwitz prediction. Factors affecting the uncertainty in Neutron Activation Analysis (NAA) include the sample mass, the standard mass, the concentration in the standard, the sample counts, the standard counts and the counting geometry. Therefore, to achieve the expected level of precision, these parameters need to be optimized. A standard concentration of similar materials is applied as a basis of calculation; in the calculation, NIST SRM 2704 is applied for sediment samples. The sample mass, irradiation time and cooling time can be modified to obtain the expected uncertainty. The prediction results show that the level of precision for Al, V, Mg, Mn, K, Na, As, Cr, Co, Fe and Zn meets the Horwitz criterion. The predicted counts and standard deviations for Mg-27 and Zn-65 were higher than the actual values because of overlapping of the Mg-27 and Mn-54 peaks and of the Zn-65 and Fe-59 peaks. The precision level of Ca is greater than the Horwitz prediction, since the microscopic cross-section, the radiation emission probability of Ca-49 and the gamma spectrometer efficiency at 3084 keV are relatively small. Precision can then only be increased by extending the counting time and increasing the number of samples, because these values are fixed. The prediction results are in accordance with the experimental results. (author)
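
    Two of the quantities mentioned above can be sketched directly: the Horwitz prediction of acceptable relative standard deviation as a function of concentration, and the combination of NAA uncertainty components in quadrature. The component values in the sketch are invented placeholders, not the paper's budget.

    ```python
    import numpy as np

    def horwitz_rsd_percent(mass_fraction):
        """Horwitz prediction of the acceptable relative standard deviation (%)
        for a concentration expressed as a dimensionless mass fraction."""
        return 2 ** (1 - 0.5 * np.log10(mass_fraction))

    # Combine relative uncertainty components (all in %) in quadrature:
    # sample mass, standard mass, standard concentration, sample counts,
    # standard counts, counting geometry (invented values).
    components = np.array([0.1, 0.1, 0.5, 1.2, 0.8, 0.3])
    u_combined = np.sqrt(np.sum(components ** 2))

    c = 5e-6  # e.g. 5 mg/kg expressed as a mass fraction
    print(f"combined RSD = {u_combined:.2f} %, Horwitz limit = {horwitz_rsd_percent(c):.2f} %")
    ```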

  12. Precise Plan in the analysis of volume precision in Synergy™ conebeam CT image

    Bai Sen; Xu Qingfeng; Zhong Renming; Jiang Xiaoqin; Jiang Qingfeng; Xu Feng

    2007-01-01

    Objective: To develop a method for checking volume precision in Synergy™ cone-beam CT images. Methods: Known phantoms (large, medium and small spheres, cubes and a cuneiform cavity) were scanned with cone-beam CT at different positions (the CBCT centre and 5, 8 and 10 cm away from the centre along the accelerator G-T direction), and the phantom volumes were measured in the reconstructed images. The volumes measured with Synergy™ cone-beam CT were then compared with fan-beam CT results and nominal values. Results: The medium spheres showed a 1.5% discrepancy between nominal and mean measured values at the CBCT centre and at 5 and 8 cm from the centre along the G-T direction. The small spheres showed 8.1%, the large cube 0.8% and the small cube 2.9% discrepancies between nominal and mean measured values at the CBCT centre and at 5, 8 and 10 cm from the centre along the G-T direction. Conclusion: Within the valid scan range of Synergy™ cone-beam CT, reconstruction precision is independent of the distance from the centre. (authors)

  13. Object-sensitive Type Analysis of PHP

    Van der Hoek, Henk Erik; Hage, J

    2015-01-01

    In this paper we develop an object-sensitive type analysis for PHP, based on an extension of the notion of monotone frameworks to deal with the dynamic aspects of PHP, and following the framework of Smaragdakis et al. for object-sensitive analysis. We consider a number of instantiations of the

  14. Variable precision rough set for multiple decision attribute analysis

    Lai, Kin Keung

    2008-01-01

    A variable precision rough set (VPRS) model is used to solve the multi-attribute decision analysis (MADA) problem with multiple conflicting decision attributes and multiple condition attributes. By introducing confidence measures and a β-reduct, the VPRS model can rationally solve the conflicting decision analysis problem with multiple decision attributes and multiple condition attributes. For illustration, a medical diagnosis example is utilized to show the feasibility of the VPRS model in solving the MADA...

  15. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Jianjun Tang

    2014-01-01

    Assembly precision optimization of complex products has a huge benefit in improving product quality. Due to the coupling of a variety of deviation sources, the goal of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, a sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to the assembly constraint relations, assembly sequences and locating scheme, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations of each dimension are established. Then, the assembly deviation source sensitivities are calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking the assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.
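
    A small numerical sketch of the sensitivity definition used above: for an assembly dimension expressed as a function of the deviation-source dimensions, each sensitivity is the ratio of the assembly-dimension variation to a small variation of that source, which a first-order (central) finite difference approximates. The assembly function below is a made-up stand-in for the paper's vector-loop equations.

    ```python
    import numpy as np

    def assembly_dimension(x):
        # Hypothetical scalar assembly dimension as a function of source dimensions x
        return 2.0 * x[0] - 1.5 * x[1] + 0.25 * x[0] * x[2]

    def sensitivities(f, x0, h=1e-6):
        """Central finite-difference approximation of df/dx_i at x0, i.e. assembly
        variation per unit variation of each deviation source."""
        x0 = np.asarray(x0, dtype=float)
        s = np.zeros_like(x0)
        for i in range(len(x0)):
            dx = np.zeros_like(x0)
            dx[i] = h
            s[i] = (f(x0 + dx) - f(x0 - dx)) / (2 * h)
        return s

    x_nominal = [10.0, 4.0, 2.0]
    print("sensitivities:", np.round(sensitivities(assembly_dimension, x_nominal), 3))
    ```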

  16. An Empirical Study of Precise Interprocedural Array Analysis

    Michael Hind

    1994-01-01

    In this article we examine the role played by the interprocedural analysis of array accesses in the automatic parallelization of Fortran programs. We use the PTRAN system to provide measurements of several benchmarks to compare different methods of representing interprocedurally accessed arrays. We examine issues concerning the effectiveness of automatic parallelization using these methods and the efficiency of a precise summarization method.

  17. Object-oriented analysis and design

    Deacon, John

    2005-01-01

    John Deacon’s in-depth, highly pragmatic approach to object-oriented analysis and design demonstrates how to lay the foundations for developing the best possible software. Students will learn how to ensure that analysis and design remain focused and productive. By working through the book, they will gain a solid working knowledge of best practices in software development.

  18. Neutron activation analysis of limestone objects

    Meyers, P.; Van Zelst, L.

    1977-01-01

    The elemental composition of samples from limestone objects was determined by neutron activation analysis to investigate whether this technique can be used to distinguish between objects made of limestone from different sources. Samples weighing between 0.2 and 2 grams were obtained by drilling from a series of ancient Egyptian and medieval Spanish objects. Analysis was performed on aliquots varying in weight from 40 to 100 milligrams. The following elements were determined quantitatively: Na, K, Rb, Cs, Ba, Sc, La, Ce, Sm, Eu, Hf, Th, Ta, Cr, Mn, Fe, Co and Zn. The data on Egyptian limestones indicate that, because of the inhomogeneous nature of the stone, 0.2-2 gram samples may not be representative of an entire object. Nevertheless, multivariate statistical methods produced a clear distinction between objects originating from the Luxor area (ancient Thebes) and objects found north of Luxor. The Spanish limestone studied appeared to be more homogeneous. Samples from stylistically related objects have similar elemental compositions, while relatively large differences were observed between objects having no relationship other than the common provenance of medieval Spain. (orig.) [de]

  19. Aspects of precision and accuracy in neutron activation analysis

    Heydorn, K.

    1980-03-01

    Analytical results without systematic errors and with accurately known random errors are normally distributed around their true values. Such results may be produced by means of neutron activation analysis both with and without radiochemical separation. When all sources of random variation are known a priori, their effect may be combined with the Poisson statistics characteristic of the counting process, and the standard deviation of a single analytical result may be estimated. The various steps of a complete neutron activation analytical procedure are therefore studied in detail with respect to determining their contribution to the overall variability of the final result. Verification of the estimated standard deviation is carried out by demonstrating the absence of significant unknown random errors through analysing, in replicate, samples covering the range of concentrations and matrices anticipated in actual use. Agreement between the estimated and the observed variability of replicate results is then tested by a simple statistic T based on the chi-square distribution. It is found that results from neutron activation analysis on biological samples can be brought into statistical control. In routine application of methods in statistical control the same statistical test may be used for quality control when some of the actual samples are analysed in duplicate. This analysis of precision serves to detect unknown or unexpected sources of variation of the analytical results, and both random and systematic errors have been discovered in practical trace element investigations in different areas of research. Particularly, at the ultratrace level of concentration where there are few or no standard reference materials for ascertaining the accuracy of results, the proposed quality control based on the analysis of precision combined with neutron activation analysis with radiochemical separation, with an a priori precision independent of the level of concentration, becomes a

  20. BEAMGAA. A chance for high precision analysis of big samples

    Goerner, W.; Berger, A.; Haase, O.; Segebade, Chr.; Alber, D.; Monse, G.

    2005-01-01

    In the activation analysis of traces in small samples, the non-equivalence of the activating radiation doses of the sample and the calibration material gives rise to sometimes tolerable systematic errors. Conversely, analysis of major components usually demands high trueness and precision. To meet this, beam geometry activation analysis (BEAMGAA) procedures have been developed for instrumental photon (IPAA) and neutron activation analysis (INAA) in which the activating neutron/photon beam exhibits broad, flat-topped characteristics. This results in a very low lateral activating flux gradient compared with known radiation facilities, albeit at a significantly lower flux density. The axial flux gradient can be accounted for by a monitor-sample-monitor assembly. As a first approach, major components were determined in high purity substances, as well as selenium in a cattle fodder additive. (author)

  1. Designing concept maps for a precise and objective description of pharmaceutical innovations

    Iordatii Maia

    2013-01-01

    Background: When a new drug is launched onto the market, information about the new manufactured product is contained in its monograph and evaluation report published by national drug agencies. Health professionals need to be able to determine rapidly and easily whether the new manufactured product is potentially useful for their practice. There is therefore a need to identify the best way to group together and visualize the main items of information describing the nature and potential impact of the new drug. The objective of this study was to identify these items of information and to bring them together in a model that could serve as the standard for presenting the main features of a new manufactured product. Methods: We developed a preliminary conceptual model of pharmaceutical innovations, based on the knowledge of the authors. We then refined this model, using a random sample of 40 new manufactured drugs recently approved by the national drug regulatory authorities in France and covering a broad spectrum of innovations and therapeutic areas. Finally, we used another sample of 20 new manufactured drugs to determine whether the model was sufficiently comprehensive. Results: The results of our modeling led to three sub-models described as conceptual maps representing (i) the medical context for use of the new drug (indications, type of effect, therapeutic arsenal for the same indications), (ii) the nature of the novelty of the new drug (new molecule, new mechanism of action, new combination, new dosage, etc.), and (iii) the impact of the drug in terms of efficacy, safety and ease of use, compared with other drugs with the same indications. Conclusions: Our model can help to standardize information about new drugs released onto the market. It is potentially useful to the pharmaceutical industry, medical journals, editors of drug databases and medical software, and national or international drug regulation agencies, as a means of describing the main...

  2. Analysis of Hall Probe Precise Positioning with Cylindrical Permanent Magnet

    Belicev, P.; Vorozhtsov, A.S.; Vorozhtsov, S.B.

    2007-01-01

    Precise positioning of a Hall probe for cyclotron magnetic field mapping, using cylindrical permanent magnets, was analyzed. The permanent magnet parameters necessary to achieve ±20 μm positioning precision were determined. (author)

  3. Precision of neutron activation analysis for environmental biological materials

    Hamaguchi, Hiroshi; Iwata, Shiro; Koyama, Mutsuo; Sasajima, Kazuhisa; Numata, Yuichi.

    1977-01-01

    Between 1973 and 1974 a special committee, "Research on the application of neutron activation analysis to environmental samples", was organized at the Research Reactor Institute, Kyoto University. Eleven research groups composed mainly of the committee members cooperated in an intercomparison programme of reactor neutron activation analysis of the NBS standard reference materials 1571 Orchard Leaves and 1577 Bovine Liver. Five different types of reactors were used for the neutron irradiation, i.e. the KUR reactor of the Research Reactor Institute, Kyoto University, the TRIGA MARK II reactor of the Institute for Atomic Energy, Rikkyo University, and the JRR-2, JRR-3 and JRR-4 reactors of the Japan Atomic Energy Research Institute. Analyses were performed mainly by instrumental methods. The precision of the analysis of 23 elements in Orchard Leaves and 13 elements in Bovine Liver reported by the different research groups is shown in Tables 4 and 5, respectively. The coefficient of variation for these elements ranged from a few percent to about 30 percent. The averages for these elements agreed well with the NBS certified or reference values. Thus, from the practical point of view of routine multielement analysis of environmental samples, the validity of the instrumental neutron activation technique for this purpose has been demonstrated. (auth.)

  4. Automated analysis of objective-prism spectra

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  5. Precision of Carbon-14 analysis in a single laboratory

    Nashriyah Mat; Misman Sumin; Holland, P.T.

    2009-01-01

    In a single laboratory, one operator used a Biological Material Oxidizer (BMO) unit to prepare (combust) solid samples before analyzing (counting) the radioactivity using various Liquid Scintillation Counters (LSCs). Different batches of commercially available solid Certified Reference Material (CRM, Amersham, UK) standards were analyzed, depending on the time of analysis, over a period of seven years. The certified radioactivity of the C-14 standards, supplied as cellulose tabs and designated as the Certified Reference Material (CRM), was 5000 DPM ± 3%. Each analysis was carried out using triplicate tabs. The counting medium was a commercially available cocktail containing the sorbent solution for the oxidizer gases, although different batches were used depending on the date of analysis. The mean DPM of the solutions was measured after correction for quenching by the LSC internal standard procedure and subtraction of the mean DPM of the control. The precisions of the standard and control counts and of the recovery percentage for the CRM were measured as coefficients of variation (CV) for the C-14 determination over the seven-year period. Results from a recently acquired Sample Oxidizer unit were also included for comparison. (Author)

  6. Objective analysis of toolmarks in forensics

    Grieve, Taylor N. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc., the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profiles of two different marks are characterized and the marks’ cross-sections are run through a comparative statistical algorithm to acquire a value that is intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm’s application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge’s primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges are evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.

  7. DDASAC, Double-Precision Differential or Algebraic Sensitivity Analysis

    Caracotsios, M.; Stewart, W.E.; Petzold, L.

    1997-01-01

    1 - Description of program or function: DDASAC solves nonlinear initial-value problems involving stiff implicit systems of ordinary differential and algebraic equations. Purely algebraic nonlinear systems can also be solved, given an initial guess within the region of attraction of a solution. Options include automatic reconciliation of inconsistent initial states and derivatives, automatic initial step selection, direct concurrent parametric sensitivity analysis, and stopping at a prescribed value of any user-defined functional of the current solution vector. Local error control (in the max-norm or the 2-norm) is provided for the state vector and can include the sensitivities on request. 2 - Method of solution: Reconciliation of initial conditions is done with a damped Newton algorithm adapted from Bain and Stewart (1991). Initial step selection is done by the first-order algorithm of Shampine (1987), extended here to differential-algebraic equation systems. The solution is continued with the DASSL predictor-corrector algorithm (Petzold 1983, Brenan et al. 1989) with the initial acceleration phase detected and with row scaling of the Jacobian added. The backward-difference formulas for the predictor and corrector are expressed in divided-difference form, and the fixed-leading-coefficient form of the corrector (Jackson and Sacks-Davis 1980, Brenan et al. 1989) is used. Weights for error tests are updated in each step with the user's tolerances at the predicted state. Sensitivity analysis is performed directly on the corrector equations as given by Caracotsios and Stewart (1985) and is extended here to the initialization when needed. 3 - Restrictions on the complexity of the problem: This algorithm, like DASSL, performs well on differential-algebraic systems of index 0 and 1 but not on higher-index systems; see Brenan et al. (1989). The user assigns the work array lengths and the output unit. The machine number range and precision are determined at run time by a
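
    DDASAC itself is a Fortran package, but the core idea of an implicit integrator for a stiff differential-algebraic system F(t, y, y') = 0 can be sketched in a few lines of Python: at each step a backward-difference approximation of y' is substituted into the residual and the resulting nonlinear system is solved by Newton's method. The toy system and fixed-step backward Euler scheme below are purely illustrative and omit DASSL/DDASAC features such as variable-order BDF formulas, error control and sensitivity analysis.

    ```python
    import numpy as np

    def residual(t, y, ydot):
        """Toy semi-explicit DAE:  y0' = -2*y0,   0 = y0 + y1 - 1."""
        return np.array([ydot[0] + 2.0 * y[0],
                         y[0] + y[1] - 1.0])

    def backward_euler_dae(residual, y0, t0, t1, n_steps=100, newton_iters=20, tol=1e-10):
        h = (t1 - t0) / n_steps
        t, y = t0, np.asarray(y0, dtype=float)
        for _ in range(n_steps):
            t_new, y_new = t + h, y.copy()
            for _ in range(newton_iters):
                ydot = (y_new - y) / h              # backward-difference approximation of y'
                F = residual(t_new, y_new, ydot)
                if np.linalg.norm(F) < tol:
                    break
                # Numerical Jacobian dF/dy_new (the 1/h term enters through ydot)
                J = np.zeros((len(y), len(y)))
                eps = 1e-8
                for j in range(len(y)):
                    yp = y_new.copy()
                    yp[j] += eps
                    J[:, j] = (residual(t_new, yp, (yp - y) / h) - F) / eps
                y_new = y_new - np.linalg.solve(J, F)
            t, y = t_new, y_new
        return y

    # Consistent initial condition: y0 = 1, y1 = 0 satisfies the algebraic constraint
    print(backward_euler_dae(residual, y0=[1.0, 0.0], t0=0.0, t1=1.0))
    ```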

  8. The PPP Precision Analysis Based on BDS Regional Navigation System

    ZHU Yongxing

    2015-04-01

    BeiDou navigation satellite system (BDS) has opened service in most of the Asia-Pacific region, and it offers the possibility of breaking the technological monopoly of GPS in the field of high-precision applications, so its precise point positioning (PPP) performance has been of great concern. Firstly, the constellation of the BeiDou regional navigation system and the BDS/GPS tracking network are introduced. Secondly, the precise ephemeris and clock offset accuracy of the BeiDou satellites, based on the domestic tracking network, are analyzed. Finally, the static and kinematic PPP accuracy is studied and compared with that of GPS. The measured numerical example shows that static and kinematic PPP based on BDS can achieve centimeter-level and decimeter-level accuracy, respectively, reaching the current level of GPS precise point positioning.

  9. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  10. Best, Useful and Objective Precisions for Information Retrieval of Three Search Methods in PubMed and iPubMed

    Somayyeh Nadi Ravandi

    2016-10-01

    MEDLINE is one of the valuable sources of medical information on the Internet. Among the different open access sites of MEDLINE, PubMed is the best-known site. In 2010, iPubMed was established with an interaction-fuzzy search method for MEDLINE access. In the present work, we aimed to compare the precision of the retrieved sources (Best, Useful and Objective precision) in PubMed and iPubMed using the simple and MeSH search methods in PubMed and the interaction-fuzzy method in iPubMed. During our semi-empirical study period, we held training workshops for 61 students of higher education to teach them the Simple Search, MeSH Search, and Fuzzy-Interaction Search methods. Then, the precision of 305 searches for each method prepared by the students was calculated on the basis of the Best precision, Useful precision, and Objective precision formulas. Analyses were done in SPSS version 11.5 using the Friedman and Wilcoxon tests, and the three precisions obtained with the three precision formulas were studied for the three search methods. The mean precision of the interaction-fuzzy search method was higher than that of the simple search and the MeSH search for all three types of precision, i.e., Best precision, Useful precision, and Objective precision, with the simple search method ranked next; the mean precisions were significantly different (P < 0.001). The precision of the interaction-fuzzy search method in iPubMed was investigated for the first time. Also for the first time, three types of precision were evaluated in PubMed and iPubMed. The results showed that the interaction-fuzzy search method is more precise than the natural language search (simple search and MeSH search), and users of this method found papers that were more related to their queries; even though searching in PubMed is useful, it is important that users apply new search methods to obtain the best results.
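
    The Best, Useful and Objective precision formulas are defined in the paper itself and are not reproduced here; as a baseline illustration, ordinary retrieval precision for a single query is simply the fraction of retrieved records judged relevant, as in the sketch below with invented relevance judgements.

    ```python
    # Hypothetical relevance judgements (True = relevant) for the records
    # returned by one query in each interface.
    retrieved = {
        "PubMed simple search": [True, True, False, True, False, False, True, False],
        "PubMed MeSH search":   [True, False, True, True, False, True],
        "iPubMed fuzzy search": [True, True, True, False, True, True, False],
    }

    for method, judgements in retrieved.items():
        precision = sum(judgements) / len(judgements)
        print(f"{method}: precision = {precision:.2f}")
    ```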

  11. Numerical Simulation Analysis of High-precision Dispensing Needles for Solid-liquid Two-phase Grinding

    Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming

    2018-03-01

    In order to investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object and the polishing process was simulated numerically. The distributions of dynamic pressure and turbulence viscosity of the abrasive flow field inside the high-precision dispensing needle were analyzed under different volume-fraction conditions. The comparative analysis shows that abrasive-grain polishing of high-precision dispensing needles is effective, and that controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality of the abrasive grains can be controlled.

  12. Objective Bayesian Analysis of Skew- t Distributions

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  13. Solution Method and Precision Analysis of Double-difference Dynamic Precise Orbit Determination of BeiDou Navigation Satellite System

    LIU Weiping

    2016-02-01

    To resolve the strong correlation between the transverse element of the GEO orbit and the double-difference ambiguities, the classical double-difference dynamic method is improved and a method that determines precise BeiDou satellite orbits using carrier phase and phase-smoothed pseudo-range is proposed. The feasibility of the method is discussed and its influence on ambiguity fixing is analyzed. Considering the characteristics of BeiDou, a method for fixing the double-difference ambiguities of BeiDou satellites by QIF is derived. Analysis of real data shows that the new method, which reduces the correlation while preserving precision, is better than the classical double-difference dynamic method. Ambiguity fixing with QIF performs well, but the overall ambiguity fixing success rate is not high, so the precision of the BeiDou orbits cannot be clearly improved after ambiguity fixing.

  14. Ion beam analysis in art and archaeology: attacking the power precisions paradigm

    Abraham, Meg

    2004-01-01

    It is a post-modern axiom that the closer one looks at something the more blinkered the view, and thus the result is often a failure to see the whole picture. With this in mind, the value of a tool for art and archaeology applications is greatly enhanced if the information is scientifically precise and yet is easily integrated into the broader study regarding the objects at hand. Art and archaeological objects offer some unique challenges for researchers. First, they are almost always extraordinarily inhomogeneous across individual pieces and across types. Second, they are often valuable and delicate, so sampling is discouraged. Finally, in most cases, each piece is unique, thus the data is also unique and is of greatest value when incorporated into the overall understanding of the object or of the culture of the artisan. Ion beam analysis solves many of these problems. With IBA, it is possible to avoid sampling by using an external beam setup or by manipulating small objects in a vacuum. The technique is largely non-destructive, allowing for multiple data points to be taken across an object. The X-ray yields are from deeper in the sample than those of other techniques, and using RBS one can attain bulk concentrations from microns into the sample. And finally, the resulting X-ray spectra are easily interpreted and understood by many conservators and curators, while PIXE maps are a wonderful visual record of the results of the analysis. Some examples of the special role that ion beam analysis plays in the examination of cultural objects will be covered in this talk.

  15. High Dynamics and Precision Optical Measurement Using a Position Sensitive Detector (PSD in Reflection-Mode: Application to 2D Object Tracking over a Smart Surface

    Ioan Alexandru Ivan

    2012-12-01

    When tracking a single, high-contrast object or a laser spot, position sensing (or sensitive) detectors (PSDs) have a series of advantages over classical camera sensors, including good positioning accuracy at a fast response time and very simple signal-conditioning circuits. To test the performance of this kind of sensor for microrobotics, we made a comparative analysis between a precise but slow video camera and a custom-made fast PSD system applied to the tracking of a diffuse-reflectivity object transported by a pneumatic microconveyor called Smart-Surface. Until now, the fast system dynamics prevented full control of the smart surface by visual servoing, unless a very expensive high-frame-rate camera was used. We built and tested a custom, low-cost PSD-based embedded circuit, optically connected with a camera to a single objective by means of a beam splitter. A stroboscopic light source enhanced the resolution. The results showed good linearity and a fast (over 500 frames per second) response time, which will enable future closed-loop control using the PSD.

  16. High precision isotopic ratio analysis of volatile metal chelates

    Hachey, D.L.; Blais, J.C.; Klein, P.D.

    1980-01-01

    High precision isotope ratio measurements have been made for a series of volatile alkaline earth and transition metal chelates using conventional GC/MS instrumentation. Electron ionization was used for alkaline earth chelates, whereas isobutane chemical ionization was used for transition metal studies. Natural isotopic abundances were determined for a series of Mg, Ca, Cr, Fe, Ni, Cu, Cd, and Zn chelates. Absolute accuracy ranged between 0.01 and 1.19 at. %. Absolute precision ranged between ±0.01-0.27 at. % (RSD ±0.07-10.26%) for elements that contained as many as eight natural isotopes. Calibration curves were prepared using natural abundance metals and their enriched 50Cr, 60Ni, and 65Cu isotopes covering the range 0.1-1010.7 at. % excess. A separate multiple isotope calibration curve was similarly prepared using enriched 60Ni (0.02-2.15 at. % excess) and 62Ni (0.23-18.5 at. % excess). The samples were analyzed by GC/CI/MS. Human plasma, containing enriched 26Mg and 44Ca, was analyzed by EI/MS. 1 figure, 5 tables

  17. Analysis of precision in chemical oscillators: implications for circadian clocks

    D'Eysmond, Thomas; De Simone, Alessandro; Naef, Felix

    2013-01-01

    Biochemical reaction networks often exhibit spontaneous self-sustained oscillations. An example is the circadian oscillator that lies at the heart of daily rhythms in behavior and physiology in most organisms including humans. While the period of these oscillators evolved so that it resonates with the 24 h daily environmental cycles, the precision of the oscillator (quantified via the Q factor) is another relevant property of these cell-autonomous oscillators. Since this quantity can be measured in individual cells, it is of interest to better understand how this property behaves across mathematical models of these oscillators. Current theoretical schemes for computing the Q factors show limitations for both high-dimensional models and in the vicinity of Hopf bifurcations. Here, we derive low-noise approximations that lead to numerically stable schemes also in high-dimensional models. In addition, we generalize normal form reductions that are appropriate near Hopf bifurcations. Applying our approximations to two models of circadian clocks, we show that while the low-noise regime is faithfully recapitulated, increasing the level of noise leads to species-dependent precision. We emphasize that subcomponents of the oscillator gradually decouple from the core oscillator as noise increases, which allows us to identify the subnetworks responsible for robust rhythms. (paper)

  18. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas, and of the thermal plasma of planetary atmospheres, mass spectrometers using time-of-flight mass analysers are widely applied. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak-finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution. Additionally, we show that the accuracy of isotope ratios depends on the sample width T_s as T_s^0.5. Copyright © 2017 John Wiley & Sons, Ltd.
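
    Read together, the three dependencies reported in this abstract suggest an overall scaling of the isotope-ratio accuracy σ_R. Combining them multiplicatively, and writing R_m for the mass resolution, are our assumptions rather than statements from the paper:

    $$ \sigma_R \;\propto\; \frac{T_s^{\,0.5}}{\mathrm{SNR}\cdot R_m} $$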

  19. In vivo glenohumeral analysis using 3D MRI models and a flexible software tool: feasibility and precision.

    Busse, Harald; Thomas, Michael; Seiwerts, Matthias; Moche, Michael; Busse, Martin W; von Salis-Soglio, Georg; Kahn, Thomas

    2008-01-01

    To implement a PC-based morphometric analysis platform and to evaluate the feasibility and precision of MRI measurements of glenohumeral translation. Using a vertically open 0.5T MRI scanner, the shoulders of 10 healthy subjects were scanned in apprehension (AP) and in neutral position (NP), respectively. Surface models of the humeral head (HH) and the glenoid cavity (GC) were created from segmented MR images by three readers. Glenohumeral translation was determined by the projection point of the manually fitted HH center on the GC plane defined by the two main principal axes of the GC model. Positional precision, given as mean (extreme value at 95% confidence level), was 0.9 (1.8) mm for the HH center and 0.7 (1.6) mm for the GC centroid; angular GC precision was 1.3 degrees (2.3 degrees) for the normal and about 4 degrees (7 degrees) for the anterior and superior coordinate axes. The two-dimensional (2D) precision of the HH projection point was 1.1 (2.2) mm. A significant HH translation between AP and NP was found. Despite a limited quality of the underlying model data, our PC-based analysis platform allows a precise morphometric analysis of the glenohumeral joint. The software is easily extendable and may potentially be used for an objective evaluation of therapeutical measures.
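
    The projection step described above (humeral-head centre projected onto the glenoid plane spanned by the two main principal axes) can be sketched with a few lines of linear algebra. The arrays below are mock stand-ins for the segmented surface models, and all variable names are ours:

```python
import numpy as np

# Mock stand-ins for the segmented surface models (all names are ours).
rng = np.random.default_rng(0)
gc_points = rng.normal(size=(500, 3)) * np.array([12.0, 9.0, 1.0])  # glenoid cavity vertices
hh_center = np.array([3.0, -2.0, 25.0])                             # fitted humeral-head centre

centroid = gc_points.mean(axis=0)
# Principal axes of the glenoid cavity via SVD of the centred vertex cloud;
# the first two right-singular vectors span the GC plane, the third is its normal.
_, _, vt = np.linalg.svd(gc_points - centroid, full_matrices=False)
ax1, ax2, normal = vt

# Project the humeral-head centre onto the GC plane and express it in the
# two in-plane principal-axis coordinates (the "projection point").
rel = hh_center - centroid
proj_2d = np.array([rel @ ax1, rel @ ax2])
offset = rel @ normal                      # signed distance from the plane

print("in-plane position (mm):", proj_2d.round(2))
print("distance from GC plane (mm):", round(float(offset), 2))
```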

  20. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

    The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can be used to overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  1. Noble gases in Mars atmosphere: new precise analysis with Paloma

    Sarda, Ph.; Paloma Team

    2003-04-01

    The Viking mission carried a mass spectrometer designed by Alfred O. Nier that yielded the first determination of the elemental and isotopic composition of noble gases in the Mars atmosphere. For example, the 40Ar/36Ar ratio in martian air is roughly 10-fold that in terrestrial air. This extraordinary accomplishment, however, has furnished only partial results with large analytical uncertainties. For example, we do not know the isotopic composition of helium, and only very poorly that of Ne, Kr and Xe. In planetary science, it is fundamental to have a good knowledge of the atmosphere because this serves as a reference for all further studies of volatiles. In addition, part of our present knowledge of the Mars atmosphere is based on the SNC meteorites, and again points to important differences between the atmospheres of Earth and Mars. For example, the 129Xe/132Xe ratio of the martian atmosphere would be twice that of terrestrial air and the 36Ar/38Ar ratio strongly different from the terrestrial or solar value. There is a need for confirming that the atmospheric components found in SNC meteorites actually represent the atmosphere of Mars, or to determine how different they are. Paloma is an instrument designed to generate elemental and isotopic data for He, Ne, Ar, Kr and Xe (and other gases) using a mass spectrometer with a purification and separation line. Gas purification and separation did not exist on the Viking instrument. Because Paloma includes purification and separation, we expect strong improvement in precision. Ne, Ar and Xe isotope ratios should be obtained with an accuracy of better than 1%. Determination of the presently unknown ^3He/^4He ratio is also awaited from this experiment. Knowledge of noble gas isotopes in the Mars atmosphere will allow some insight into major planetary processes such as degassing (^3He/^4He, 40Ar/36Ar, 129Xe/130Xe, 136Xe/130Xe), gravitational escape to space (^3He/^4He, 20Ne/22Ne), hydrodynamic escape and/or impact erosion of the

  2. Precise analysis of the metal package photomultiplier single photoelectron spectra

    Chirikov-Zorin, I.E.; Fedorko, I.; Sykora, I.; Tokar, S.; Menzione, A.

    2000-01-01

    A deconvolution method based on a sophisticated photomultiplier response function was used to analyse the compact metal package photomultiplier spectra taken in single photoelectron mode. The spectra taken by Hamamatsu R5600 and R5900 photomultipliers have been analysed. The detailed analysis shows that the method appropriately describes the process of charge multiplication in these photomultipliers in a wide range of working regimes, and the deconvoluted parameters are established with about 1% accuracy. The method can be used for a detailed analysis of photomultiplier noise and for calibration purposes.

  3. Precise design-based defect characterization and root cause analysis

    Xie, Qian; Venkatachalam, Panneerselvam; Lee, Julie; Chen, Zhijin; Zafar, Khurram

    2017-03-01

    that human operators will typically miss), to obtain the exact defect location on design, to compare all defective patterns thus detected against a library of known patterns, and to classify all defective patterns as either new or known. By applying the computer to these tasks, we automate the entire process from defective pattern identification to pattern classification with high precision, and we perform this operation en masse during R & D, ramp, and volume production. By adopting the methodology, whenever a specific weak pattern is identified, we are able to run a series of characterization operations to ultimately arrive at the root cause. These characterization operations can include (a) searching all pre-existing Review SEM images for the presence of the specific weak pattern to determine whether there is any spatial (within die or within wafer) or temporal (within any particular date range, before or after a mask revision, etc.) correlation, (b) understanding the failure rate of the specific weak pattern to prioritize the urgency of the problem, and (c) comparing the weak pattern against an OPC (Optical Proximity Correction) Verification report or a PWQ (Process Window Qualification)/FEM (Focus Exposure Matrix) result to assess the likelihood of it being a litho-sensitive pattern, etc. After resolving the specific weak pattern, we will categorize it as a known pattern, and the engineer will move forward with discovering new weak patterns.

  4. Precision Statistical Analysis of Images Based on Brightness Distribution

    Muzhir Shaban Al-Ani

    2017-07-01

    Studying the content of images is considered an important topic through which reasonable and accurate analyses of images are generated. Image analysis has recently become a vital field because of the huge number of images transferred via transmission media in our daily life, and these image-crowded media have highlighted image analysis as a research area. In this paper, the implemented system passes through several steps to compute the statistical measures of standard deviation and mean for both color and grey images, and the last step of the proposed method compares the obtained results across the different cases of the test phase. The statistical parameters are used to characterize the content of an image and its texture: standard deviation, mean and correlation values are used to study the intensity distribution of the tested images. Reasonable results are obtained for both standard deviation and mean via the implementation of the system. The major issue addressed in the work is the brightness distribution, studied via statistical measures under different types of lighting.
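
    A minimal sketch of the statistical measures named in this abstract (mean, standard deviation and correlation of image intensities), applied to a synthetic image since the original test images are not available; the luma weights and the brightness-shifted copy are our assumptions:

```python
import numpy as np

# Synthetic colour image (a real workflow would load an image file instead).
rng = np.random.default_rng(0)
rgb = rng.integers(0, 256, size=(256, 256, 3)).astype(float)   # mock RGB image
grey = rgb @ np.array([0.299, 0.587, 0.114])                   # luma conversion

print("RGB mean per channel:", rgb.reshape(-1, 3).mean(axis=0).round(2))
print("RGB std per channel :", rgb.reshape(-1, 3).std(axis=0).round(2))
print("grey mean / std     : %.2f / %.2f" % (grey.mean(), grey.std()))

# Correlation between the intensity distributions of two images (here the grey
# image and a brightness-shifted copy, standing in for two lighting conditions).
shifted = np.clip(grey + 30.0, 0, 255)
corr = np.corrcoef(grey.ravel(), shifted.ravel())[0, 1]
print("intensity correlation:", round(corr, 3))
```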

  5. High precision analysis of trace lithium isotope by thermal ionization mass spectrometry

    Tang Lei; Liu Xuemei; Long Kaiming; Liu Zhao; Yang Tianli

    2010-01-01

    A high-precision method for the analysis of nanogram amounts of lithium by thermal ionization mass spectrometry is developed. The precision of trace lithium analysis is improved by double-filament measurement, a phosphine acid ion enhancer and a sample pre-baking technique. For a 100 ng lithium isotope standard sample, the relative standard deviation is better than 0.086%; for a 10 ng lithium isotope standard sample, the relative standard deviation is better than 0.90%. (authors)

  6. Objectivity

    Daston, Lorraine

    2010-01-01

    Objectivity has a history, and it is full of surprises. In Objectivity, Lorraine Daston and Peter Galison chart the emergence of objectivity in the mid-nineteenth-century sciences--and show how the concept differs from its alternatives, truth-to-nature and trained judgment. This is a story of lofty epistemic ideals fused with workaday practices in the making of scientific images. From the eighteenth through the early twenty-first centuries, the images that reveal the deepest commitments of the empirical sciences--from anatomy to crystallography--are those featured in scientific atlases, the compendia that teach practitioners what is worth looking at and how to look at it. Galison and Daston use atlas images to uncover a hidden history of scientific objectivity and its rivals. Whether an atlas maker idealizes an image to capture the essentials in the name of truth-to-nature or refuses to erase even the most incidental detail in the name of objectivity or highlights patterns in the name of trained judgment is a...

  7. A strategic analysis of Business Objects' portal application

    Kristinsson, Olafur Oskar

    2007-01-01

    Business Objects is the leading software firm producing business intelligence software. Business intelligence is a growing market. Small to medium businesses are increasingly looking at business intelligence. Business Objects' flagship product in the enterprise market is Business Objects XI and for medium-size companies it has Crystal Decisions. Portals are the front end for the two products. InfoView, Business Objects portal application, lacks a long-term strategy. This analysis evaluates...

  8. The multi-filter rotating shadowband radiometer (MFRSR) - precision infrared radiometer (PIR) platform in Fairbanks: Scientific objectives

    Stamnes, K.; Leontieva, E. [Univ. of Alaska, Fairbanks (United States)]

    1996-04-01

    The multi-filter rotating shadowband radiometer (MFRSR) and precision infrared radiometer (PIR) have been employed at the Geophysical Institute in Fairbanks to check their performance under arctic conditions. Drawing on the experience of previous measurements in the Arctic, the PIR was equipped with a ventilator to prevent frost and moisture build-up. We adopted the Solar Infrared Observing System (SIROS) concept from the Southern Great Plains Cloud and Radiation Testbed (CART) to allow implementation of the same data processing software for a set of radiation and meteorological instruments. To validate the level of performance of the whole SIROS prior to its incorporation into the North Slope of Alaska (NSA) Cloud and Radiation Testbed Site instrumental suite for flux radiation measurements, a comparison between measurements and model predictions will be undertaken to assess the MFRSR-PIR Arctic data quality.

  9. Precision and Accuracy of k0-NAA Method for Analysis of Multi Elements in Reference Samples

    Sri-Wardani

    2004-01-01

    The accuracy and precision of the k0-NAA method were determined in the analysis of multiple elements contained in reference samples. The multi-element results for the SRM 1633b sample showed biases of up to 20% but with good accuracy and precision, while the results for As, Cd and Zn in the CCQM-P29 rice flour sample were very good, with biases of 0.5-5.6%. (author)

  10. GEOPOSITIONING PRECISION ANALYSIS OF MULTIPLE IMAGE TRIANGULATION USING LRO NAC LUNAR IMAGES

    K. Di

    2016-06-01

    This paper presents an empirical analysis of the geopositioning precision of multiple-image triangulation using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images at the Chang'e-3 (CE-3) landing site. Nine LROC NAC images are selected for comparative analysis of geopositioning precision. Rigorous sensor models of the images are established based on collinearity equations, with interior and exterior orientation elements retrieved from the corresponding SPICE kernels. Rational polynomial coefficients (RPCs) of each image are derived by least-squares fitting using a vast number of virtual control points generated according to the rigorous sensor models. Experiments with different combinations of images are performed for comparison. The results demonstrate that the plane coordinates can achieve a precision of 0.54 m to 2.54 m, with a height precision of 0.71 m to 8.16 m, when only two images are used for three-dimensional triangulation. There is a general trend that the geopositioning precision, especially the height precision, improves as the convergence angle of the two images increases from several degrees to about 50°. However, the image matching precision should also be taken into consideration when choosing image pairs for triangulation. The precisions obtained using all nine images are 0.60 m, 0.50 m and 1.23 m in the along-track, cross-track and height directions, which are better than most combinations of two or more images; however, triangulation with fewer, well-selected images can produce better precision than using all the images.

  11. Constraint Solver Techniques for Implementing Precise and Scalable Static Program Analysis

    Zhang, Ye

    solver using unification we could make a program analysis easier to design and implement, much more scalable, and still as precise as expected. We present an inclusion constraint language with the explicit equality constructs for specifying program analysis problems, and a parameterized framework...... developers to build reliable software systems more quickly and with fewer bugs or security defects. While designing and implementing a program analysis remains a hard work, making it both scalable and precise is even more challenging. In this dissertation, we show that with a general inclusion constraint...... data flow analyses for C language, we demonstrate a large amount of equivalences could be detected by off-line analyses, and they could then be used by a constraint solver to significantly improve the scalability of an analysis without sacrificing any precision....

  12. Use of objective analysis to estimate winter temperature and ...

    In the complex terrain of Himalaya, nonavailability of snow and meteorological data of the remote locations ... Precipitation intensity; spatial interpolation; objective analysis. J. Earth Syst. ... This technique needs historical database and unable ...

  13. An Information-Based Approach to Precision Analysis of Indoor WLAN Localization Using Location Fingerprint

    Mu Zhou

    2015-12-01

    In this paper, we propose a novel information-based approach to precision analysis of indoor wireless local area network (WLAN) localization using location fingerprints. First of all, by using the Fisher information matrix (FIM), we derive the fundamental limit of WLAN fingerprint-based localization precision, considering different signal distributions in characterizing the variation of received signal strengths (RSSs) in the target environment. After that, we explore the relationship between the localization precision and access point (AP) placement, which can provide valuable suggestions for the design of highly precise localization systems. Second, we adopt the heuristic simulated annealing (SA) algorithm to optimize the AP locations for the sake of approaching the fundamental limit of localization precision. Finally, extensive simulations and experiments are conducted in both regular line-of-sight (LOS) and irregular non-line-of-sight (NLOS) environments to demonstrate that the proposed approach can not only effectively improve the WLAN fingerprint-based localization precision, but also reduce the time overhead.
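
    As an illustration of the second step described above, the sketch below runs a toy simulated annealing search over access-point positions. The objective function is a simplified GDOP-like surrogate of our own, not the paper's RSS Fisher information criterion, and all geometry and cooling parameters are assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
AREA = 20.0                       # square room, metres (toy setup)
GRID = [np.array([x, y]) for x in np.linspace(1, 19, 7) for y in np.linspace(1, 19, 7)]

def score(aps):
    """GDOP-like surrogate for localization precision, summed over a grid of
    candidate user positions (a stand-in for the paper's Fisher information)."""
    total = 0.0
    for p in GRID:
        info = np.zeros((2, 2))
        for a in aps:
            d = np.linalg.norm(a - p) + 1e-6
            u = (a - p) / d
            info += np.outer(u, u) / d**2       # closer APs and diverse angles help
        total += np.log(np.linalg.det(info) + 1e-12)
    return total

def simulated_annealing(n_aps=4, iters=3000, t0=1.0, cooling=0.999):
    aps = rng.uniform(0, AREA, size=(n_aps, 2))
    cur_s = best_s = score(aps)
    best, t = aps.copy(), t0
    for _ in range(iters):
        cand = aps.copy()
        cand[rng.integers(n_aps)] += rng.normal(scale=0.5, size=2)   # perturb one AP
        cand = np.clip(cand, 0.0, AREA)
        s = score(cand)
        if s > cur_s or rng.random() < np.exp((s - cur_s) / t):      # Metropolis rule
            aps, cur_s = cand, s
            if s > best_s:
                best, best_s = cand.copy(), s
        t *= cooling
    return best, best_s

aps, s = simulated_annealing()
print("optimized AP positions:\n", aps.round(2))
print("surrogate score:", round(s, 2))
```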

  14. Geographic Object-Based Image Analysis: Towards a new paradigm

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.; Queiroz Feitosa, R.; van der Meer, F.D.; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  15. From Pixels to Geographic Objects in Remote Sensing Image Analysis

    Addink, E.A.; Van Coillie, Frieke M.B.; Jong, Steven M. de

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received

  16. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms

    Qianqian Wu

    2015-08-01

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms.

  17. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms.

    Wu, Qianqian; Yue, Honghao; Liu, Rongqiang; Zhang, Xiaoyou; Ding, Liang; Liang, Tian; Deng, Zongquan

    2015-08-14

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms.
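
    The two records above describe estimating angular acceleration from an array of accelerometers. The sketch below shows a generic rigid-body least-squares formulation for a hypothetical six-accelerometer layout; the positions, sensing axes and noise level are our assumptions, not the configuration proposed in the paper, and centripetal terms are neglected as is common at low angular rates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical accelerometer positions r_i (m) and single-axis sensing directions n_i.
r = np.array([[ 0.2, 0.0, 0.0], [-0.2, 0.0, 0.0],
              [ 0.0, 0.2, 0.0], [ 0.0, -0.2, 0.0],
              [ 0.0, 0.0, 0.2], [ 0.0, 0.0, -0.2]])
n = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0], [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0], [0.0, 1.0, 0.0]])

# Measurement model (centripetal terms neglected): y_i = n_i . (a_c + alpha x r_i).
# Using n.(alpha x r) = alpha.(r x n), each row of the design matrix is [n_i, r_i x n_i].
H = np.hstack([n, np.cross(r, n)])

a_c_true = np.array([0.10, -0.05, 0.20])     # platform linear acceleration (m/s^2)
alpha_true = np.array([0.30, -0.20, 0.50])   # platform angular acceleration (rad/s^2)
y = H @ np.concatenate([a_c_true, alpha_true]) + rng.normal(scale=1e-3, size=6)

x_hat, *_ = np.linalg.lstsq(H, y, rcond=None)
print("estimated linear acceleration :", x_hat[:3].round(3))
print("estimated angular acceleration:", x_hat[3:].round(3))
```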

  18. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

    Based on the working principle of the digital calibration instrument for the optical axis parallelism of binocular photoelectric instruments, and considering all components of the instrument, the various factors affecting system precision are analyzed and a precision analysis model is established. Using the error distributions, the Monte Carlo method is applied to analyze the relationship between the comprehensive error and the change of the center coordinate of the circular target image. The method can further guide the allocation of errors, optimize and control the factors that have a greater influence on the comprehensive error, and improve the measurement accuracy of the optical axis parallelism digital calibration instrument.
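
    A generic Monte Carlo error-budget sketch in the spirit of the analysis above: sample each error source from an assumed distribution, push the samples through a simple measurement model, and summarize the spread of the resulting target-centre coordinate. The pinhole-style model and all numbers below are illustrative placeholders, not the instrument's actual error model:

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Assumed error sources (all values are placeholders, not the instrument's budget).
focal_px    = rng.normal(2000.0, 2.0,  N)    # focal length, pixels
tilt_rad    = rng.normal(0.0,    2e-4, N)    # optical-axis tilt, radians
centroid_px = rng.normal(0.0,    0.05, N)    # centroiding noise of the circular target

# Image-plane x-coordinate of the target centre for each sampled error set.
x_center = focal_px * np.tan(tilt_rad) + centroid_px

lo, hi = np.percentile(x_center, [2.5, 97.5])
print("mean shift   : %.3f px" % x_center.mean())
print("std (1 sigma): %.3f px" % x_center.std())
print("95%% interval : [%.3f, %.3f] px" % (lo, hi))
```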

  19. Droplet-counting Microtitration System for Precise On-site Analysis.

    Kawakubo, Susumu; Omori, Taichi; Suzuki, Yasutada; Ueta, Ikuo

    2018-01-01

    A new microtitration system based on the counting of titrant droplets has been developed for precise on-site analysis. The dropping rate was controlled by inserting a capillary tube as a flow resistance in a laboratory-made micropipette. The error of titration was 3% in a simulated titration with 20 droplets. The pre-addition of a titrant was proposed for precise titration within an error of 0.5%. The analytical performances were evaluated for chelate titration, redox titration and acid-base titration.

  20. Data analysis in an Object Request Broker environment

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study

  1. Ten years of Object-Oriented analysis on H1

    Laycock, Paul

    2012-01-01

    Over a decade ago, the H1 Collaboration decided to embrace the object-oriented paradigm and completely redesign its data analysis model and data storage format. The event data model, based on the ROOT framework, consists of three layers - tracks and calorimeter clusters, identified particles and finally event summary data - with a singleton class providing unified access. This original solution was then augmented with a fourth layer containing user-defined objects. This contribution will summarise the history of the solutions used, from modifications to the original design, to the evolution of the high-level end-user analysis object framework which is used by H1 today. Several important issues are addressed - the portability of expert knowledge to increase the efficiency of data analysis, the flexibility of the framework to incorporate new analyses, the performance and ease of use, and lessons learned for future projects.

  2. Determining characteristics of artificial near-Earth objects using observability analysis

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
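
    For readers unfamiliar with the linear machinery referred to above, the sketch below builds the classical observability matrix from a state-transition matrix and a measurement matrix and inspects its rank and singular values. The toy double-integrator dynamics are our assumption and merely stand in for an orbit model with extended (e.g. solar-radiation-pressure) states:

```python
import numpy as np

# Toy double integrator with position-only measurements (stand-in dynamics;
# an orbit model with solar-radiation-pressure parameters would replace Phi and H).
dt = 1.0
Phi = np.array([[1.0, dt],
                [0.0, 1.0]])      # state transition (position, velocity)
H = np.array([[1.0, 0.0]])        # only position is measured

n = Phi.shape[0]
O = np.vstack([H @ np.linalg.matrix_power(Phi, k) for k in range(n)])

rank = np.linalg.matrix_rank(O)
svals = np.linalg.svd(O, compute_uv=False)
print("observability matrix:\n", O)
print("rank %d of %d ->" % (rank, n),
      "fully observable" if rank == n else "unobservable subspace exists")
print("singular values (conditioning):", svals.round(3))
```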

  3. Improvement of precision method of spectrophotometry with inner standardization and its use in plutonium solutions analysis

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    A precision spectrophotometric method with inner standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric analysis procedure are improved to decrease the random component of the method's relative error. The influence of U and Np impurities and of corrosion products on the systematic component of the method's error, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, are studied.

  4. Examination of objective analysis precision using wind profiler and radiosonde network data

    Mace, G.G.; Ackerman, T.P. [Penn State Univ., University Park, PA (United States)

    1996-04-01

    One of the principal research strategies that has emerged from the science team of the Atmospheric Radiation Measurement (ARM) Program is the use of a single column model (SCM). The basic assumption behind the SCM approach is that a cloud and radiation parameterization embedded in a general circulation model can be effectively tested and improved by extracting that column parameterization from the general circulation model and then driving this single column at the lateral boundaries of the column with diagnosed large-scale atmospheric forcing. A second and related assumption is that the large-scale atmospheric state, and hence the associated forcing, can be characterized directly from observations. One of the primary reasons that the Southern Great Plains (SGP) site is located in Lamont, Oklahoma, is that Lamont is at the approximate center of the NOAA Wind Profiler Demonstration Array (WPDA). The assumption was that hourly average wind profiles provided by the 7 wind profilers (one at Lamont and six surrounding it in a hexagon) coupled with radiosonde launches every three hours at 5 sites (Lamont plus four of the six profiler locations forming the hexagon) would be sufficient to characterize accurately the large-scale forcing at the site and thereby provide the required forcing for the SCM. The goal of this study was to examine these three assumptions.

  5. Comparative analysis of imaging configurations and objectives for Fourier microscopy.

    Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid

    2015-11-01

    Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.

  6. The emerging potential for network analysis to inform precision cancer medicine.

    Ozturk, Kivilcim; Dow, Michelle; Carlin, Daniel E; Bejar, Rafael; Carter, Hannah

    2018-06-14

    Precision cancer medicine promises to tailor clinical decisions to patients using genomic information. Indeed, successes of drugs targeting genetic alterations in tumors, such as imatinib that targets BCR-ABL in chronic myelogenous leukemia, have demonstrated the power of this approach. However biological systems are complex, and patients may differ not only by the specific genetic alterations in their tumor, but by more subtle interactions among such alterations. Systems biology and more specifically, network analysis, provides a framework for advancing precision medicine beyond clinical actionability of individual mutations. Here we discuss applications of network analysis to study tumor biology, early methods for N-of-1 tumor genome analysis and the path for such tools to the clinic. Copyright © 2018. Published by Elsevier Ltd.

  7. Evaluation of precision and accuracy of neutron activation analysis method of environmental samples analysis

    Wardani, Sri; Rina M, Th.; L, Dyah

    2000-01-01

    The precision and accuracy of the Neutron Activation Analysis (NAA) method used by P2TRR were evaluated by analyzing standard reference samples from the National Institute for Environmental Studies of Japan (NIES CRM No. 10, rice flour) and from the US National Bureau of Standards (NBS SRM 1573a, tomato leaves) by the NAA method. Qualitative multi-element NAA identified: Br, Ca, Co, Cl, Cs, Gd, I, K, La, Mg, Mn, Na, Pa, Sb, Sm, Sr, Ta, Th, and Zn (19 elements) for SRM 1573a; As, Br, Cr, Cl, Ce, Co, Cs, Fe, Ga, Hg, K, Mn, Mg, Mo, Na, Ni, Pb, Rb, Sr, Se, Sc, Sb, Ti, and Zn (25 elements) for CRM No. 10a; Ag, As, Br, Cr, Cl, Ce, Cd, Co, Cs, Eu, Fe, Ga, Hg, K, Mg, Mn, Mo, Na, Nb, Pb, Rb, Sb, Sc, Th, Tl, and Zn (26 elements) for CRM No. 10b; and As, Br, Co, Cl, Ce, Cd, Ga, Hg, K, Mn, Mg, Mo, Na, Nb, Pb, Rb, Sb, Se, Tl, and Zn (20 elements) for CRM No. 10c. In the quantitative analysis, only some of the elements could be determined, namely As, Co, Cd, Mo, Mn, and Zn. Compared with the NIES or NBS values, the results agree within deviations of 3% to 15%. Overall, the results show that the method and facilities have good capability, but the irradiation facility and the gamma-ray spectrometry software need further development or serious research.

  8. Frame sequences analysis technique of linear objects movement

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and in various light spectra, and quantitative analysis of the motion of the objects being studied becomes an important component of the research. This work discusses the analysis of the motion of linear objects in the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of the objects' motion. This velocity was found as an average over 8-12 objects with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro with subsequent approximation of the obtained data using the Hill equation.
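
    A simplified, single-object version of the frame-sequence analysis described above: an intensity-weighted centroid per frame, followed by the mean speed from consecutive displacements. The synthetic 62 × 62 frames and the segmentation threshold are our assumptions; GMimPro and the Hill-equation fitting step are not reproduced:

```python
import numpy as np

FRAME_RATE_HZ = 2.0
frames = []
for t in range(30):                      # synthetic 62x62 frames with a moving blob
    y, x = np.mgrid[0:62, 0:62]
    cx, cy = 5 + 1.5 * t, 10 + 0.8 * t   # true motion: (1.5, 0.8) px/frame
    frames.append(np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 8.0))

centroids = []
for f in frames:
    mask = f > 0.5 * f.max()             # crude segmentation of the object
    ys, xs = np.nonzero(mask)
    w = f[ys, xs]
    centroids.append([np.average(xs, weights=w), np.average(ys, weights=w)])
centroids = np.array(centroids)

steps = np.linalg.norm(np.diff(centroids, axis=0), axis=1)   # px per frame
print("mean speed: %.2f px/s" % (steps.mean() * FRAME_RATE_HZ))   # about 3.4 px/s here
```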

  9. Quantitative analysis of factors affecting intraoperative precision and stability of optoelectronic and electromagnetic tracking systems

    Wagner, A.; Schicho, K.; Birkfellner, W.; Figl, M.; Seemann, R.; Koenig, F.; Kainberger, Franz; Ewers, R.

    2002-01-01

    This study aims to provide a quantitative analysis of the factors affecting the actual precision and stability of optoelectronic and electromagnetic tracking systems in computer-aided surgery under real clinical/intraoperative conditions. A 'phantom-skull' with five precisely determined reference distances between marker spheres is used for all measurements. Three optoelectronic and one electromagnetic tracking systems are included in this study. The experimental design is divided into three parts: (1) evaluation of serial- and multislice-CT (computed tomography) images of the phantom-skull for the precision of distance measurements by means of navigation software without a digitizer, (2) digitizer measurements under realistic intraoperative conditions with the factors OR-lamp (radiating into the field of view of the digitizer) and/or 'handling with ferromagnetic surgical instruments' (in the field of view of the digitizer) and (3) 'point-measurements' to analyze the influence of changes in the angle of inclination of the stylus axis. Deviations between reference distances and measured values are statistically investigated by means of analysis of variance. Computerized measurements of distances based on serial-CT data were more precise than those based on multislice-CT data. All tracking systems included in this study proved to be considerably less precise under realistic OR conditions when compared to the technical specifications in the manuals of the systems. Changes in the angle of inclination of the stylus axis resulted in deviations of up to 3.40 mm (mean deviations for all systems ranging from 0.49 to 1.42 mm, variances ranging from 0.09 to 1.44 mm), indicating a strong need for improvements of stylus design. The electromagnetic tracking system investigated in this study was not significantly affected by small ferromagnetic surgical instruments

  10. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  11. Precision analysis of a multi-slice ultrasound sensor for non-invasive 3D kinematic analysis of knee joints.

    Masum, Md Abdullah; Lambert, Andrew J; Pickering, Mark R; Scarvell, J M; Smith, P N

    2012-01-01

    Currently the standard clinical practice for measuring the motion of bones in a knee joint with sufficient precision involves implanting tantalum beads into the bones to act as fiducial markers prior to imaging using X-ray equipment. This procedure is invasive in nature and exposure to ionizing radiation imposes a cancer risk and the patient's movements are confined to a narrow field of view. In this paper, an ultrasound based system for non-invasive kinematic evaluation of knee joints is proposed. The results of an initial analysis show that this system can provide the precision required for non-invasive motion analysis while the patient performs normal physical activities.

  12. Data analysis in an object request broker environment

    Malon, David M.; May, Edward N.; Grossman, Robert L.; Day, Christopher T.; Quarrie, David R.

    1996-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study. (author)

  13. Sternal instability measured with radiostereometric analysis. A study of method feasibility, accuracy and precision.

    Vestergaard, Rikke Falsig; Søballe, Kjeld; Hasenkam, John Michael; Stilling, Maiken

    2018-05-18

    A small, but unstable, saw-gap may hinder bone-bridging and induce development of painful sternal dehiscence. We propose the use of Radiostereometric Analysis (RSA) for evaluation of sternal instability and present a method validation. Four bone analogs (phantoms) were sternotomized and tantalum beads were inserted in each half. The models were reunited with wire cerclage and placed in a radiolucent separation device. Stereoradiographs (n = 48) of the phantoms in 3 positions were recorded at 4 imposed separation points. The accuracy and precision was compared statistically and presented as translations along the 3 orthogonal axes. 7 sternotomized patients were evaluated for clinical RSA precision by double-examination stereoradiographs (n = 28). In the phantom study, we found no systematic error (p > 0.3) between the three phantom positions, and precision for evaluation of sternal separation was 0.02 mm. Phantom accuracy was mean 0.13 mm (SD 0.25). In the clinical study, we found a detection limit of 0.42 mm for sternal separation and of 2 mm for anterior-posterior dislocation of the sternal halves for the individual patient. RSA is a precise and low-dose image modality feasible for clinical evaluation of sternal stability in research. ClinicalTrials.gov Identifier: NCT02738437 , retrospectively registered.

  14. Detailed precision and accuracy analysis of swarm parameters from a pulsed Townsend experiment

    Haefliger, P.; Franck, C. M.

    2018-02-01

    A newly built pulsed Townsend experimental setup which allows one to measure both electron and ion currents is presented. The principle of pulsed Townsend measurements itself is well established to obtain swarm parameters such as the effective ionization rate coefficient, the density-reduced mobility, and the density-normalized longitudinal diffusion coefficient. The main novelty of the present contribution is a detailed and comprehensive analysis of the entire measurement and evaluation chain with respect to accuracy, precision, and reproducibility. The influence of the input parameters (gap distance, applied voltage, measured pressure, and temperature) is analyzed in detail. An overall accuracy of ±0.5% in the density reduced electric field (E/N) is achieved, which is close to the theoretically possible limit using the chosen components. The precision of the experimental results is higher than the accuracy. Through an extensive measurement campaign, the repeatability of our measurements proved to be high and similar to the precision. The reproducibility of results at identical (E/N) is similar to the precision for different distances but decreases for varying pressures. For benchmark purposes, measurements for Ar, CO2, and N2 are presented and compared with our previous experimental setup, simulations, and other experimental references.

  15. Accurate and Precise Titriraetric Analysis of Uranium in Nuclear Fuels. RCN Report

    Tolk, A.; Lingerak, W.A.; Verheul-Klompmaker, T.A.

    1970-09-01

    For the accurate and precise titrimetric analysis of uranium in nuclear fuels, the material is dissolved in orthophosphoric and nitric acids. The nitric acid is fumed off, and the U(VI) present is analysed reductometrically in a CO2 atmosphere with iron(II) ammonium sulfate. For U3O8 test-sample aliquots of resp. 800 and 80 mg, coefficients of variation of 0.012 resp. 0.11% are measured. (author)

  16. Critical Steps in Data Analysis for Precision Casimir Force Measurements with Semiconducting Films

    Banishev, A. A.; Chang, Chia-Cheng; Mohideen, U.

    2011-06-01

    Some experimental procedures and corresponding results of the precision measurement of the Casimir force between low doped Indium Tin Oxide (ITO) film and gold sphere are described. Measurements were performed using an Atomic Force Microscope in high vacuum. It is shown that the magnitude of the Casimir force decreases after prolonged UV treatment of the ITO film. Some critical data analysis steps such as the correction for the mechanical drift of the sphere-plate system and photodiodes are discussed.

  17. Forest Rent as an Object of Economic Analysis

    Lisichko Andriyana M.

    2018-01-01

    The article is aimed at researching the concept of forest rent as an object of economic analysis. The essence of the concept of «forest rent» has been examined. It has been established that forest rent is an object of management for the forest complex of Ukraine as a whole and for forest enterprises in particular. Rent for the special use of forest resources is of interest to both the State and the corporate sector, because its value depends on the cost of timber for industry and households. Works of scholars on the classification of rents were studied. It has been determined that the rent for the special use of forest resources is a special kind of natural rent. The structure of constituents in the system of rent relations in the forest sector has been defined in accordance with the provisions of the Tax Code of Ukraine.

  18. Fast grasping of unknown objects using principal component analysis

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

    Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. In order to accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force-balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis from the component analysis is determined using the single-view partial point cloud. To cope with grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance based on the results of grasping a series of unknown objects. To minimize the grasping uncertainty, the merits of the robot hardware with two 3D cameras can be utilized to complete the partial point cloud. As a result of utilizing the robot hardware, the grasping reliability is highly enhanced. Therefore, this research demonstrates practical significance for increasing grasping speed and thus increasing robot efficiency in unpredictable environments.
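
    The PCA step described above can be sketched directly: compute the principal axis of a (mock) single-view partial point cloud and spread grasp candidates along it. The point cloud, candidate spacing and names below are ours, and the force-balance scoring of candidates used in the paper is not reproduced:

```python
import numpy as np

rng = np.random.default_rng(3)
# Mock partial point cloud of an elongated object (N x 3, metres).
cloud = rng.normal(size=(2000, 3)) * np.array([0.12, 0.02, 0.02]) + np.array([0.5, 0.0, 0.3])

centroid = cloud.mean(axis=0)
_, s, vt = np.linalg.svd(cloud - centroid, full_matrices=False)
principal_axis = vt[0]                   # direction of largest extent

# Spread grasp candidates along the principal axis within the object's extent.
extent = (cloud - centroid) @ principal_axis
candidates = [centroid + t * principal_axis
              for t in np.linspace(extent.min() * 0.8, extent.max() * 0.8, 7)]

print("principal axis:", principal_axis.round(3))
for i, c in enumerate(candidates):
    print("grasp candidate %d at %s" % (i, c.round(3)))
```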

  19. Objective Bayesian analysis of neutrino masses and hierarchy

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still impact noticeably the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data is hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
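
    For readers who prefer probabilities to odds, the quoted posterior odds of 5.1:1 correspond to

    $$ P(\mathrm{NH}\mid D) \;=\; \frac{5.1}{5.1+1} \;\approx\; 0.84 , \qquad P(\mathrm{IH}\mid D) \;\approx\; 0.16 , $$

    i.e. roughly an 84% posterior probability for the normal hierarchy, in line with the authors' description of the odds as inconclusive.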

  20. Scout: orbit analysis and hazard assessment for NEOCP objects

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP) until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. Also, NEOCP objects could have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission accessible targets, close approachers or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after the observations are available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are directly tied to the recorded observations. This scan allows us to derive a distribution of the possible orbits and in turn identify the NEOCP objects of most interest to prioritize follow-up efforts. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores to identify objects more likely to be an NEO, a km-sized NEO, a Potentially

  1. Some new mathematical methods for variational objective analysis

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  2. Analysis of manufacturing based on object oriented discrete event simulation

    Eirik Borgen

    1990-01-01

    This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed, and this can be done before the changes are made, and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
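
    As a rough orientation only (this is not SIMMEK, and the routing, distributions and parameters below are invented), the sketch shows the event-list mechanics that object-oriented discrete event simulation builds on: job entities flow through two machines in sequence, and a time-ordered event list drives the simulation clock.

      import heapq
      import random

      random.seed(1)
      event_list = []            # (time, sequence, kind, job, machine) kept in time order
      counter = 0

      def schedule(time, kind, job, machine):
          global counter
          heapq.heappush(event_list, (time, counter, kind, job, machine))
          counter += 1

      machines_busy = {"M1": False, "M2": False}
      queues = {"M1": [], "M2": []}
      routes = {"M1": "M2", "M2": None}          # every job visits M1 then M2
      finished = []

      def start_if_possible(machine, now):
          # Pull the next queued job onto an idle machine and schedule its finish.
          if not machines_busy[machine] and queues[machine]:
              job = queues[machine].pop(0)
              machines_busy[machine] = True
              schedule(now + random.expovariate(1.0), "finish", job, machine)

      # Seed the model with ten job arrivals at machine M1.
      t = 0.0
      for job in range(10):
          t += random.expovariate(0.8)
          schedule(t, "arrival", job, "M1")

      while event_list:
          now, _, kind, job, machine = heapq.heappop(event_list)
          if kind == "arrival":
              queues[machine].append(job)
              start_if_possible(machine, now)
          elif kind == "finish":
              machines_busy[machine] = False
              nxt = routes[machine]
              if nxt is None:
                  finished.append((job, now))
              else:
                  queues[nxt].append(job)
                  start_if_possible(nxt, now)
              start_if_possible(machine, now)

      print("jobs completed:", len(finished), "makespan:", round(finished[-1][1], 2))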

  3. Maintaining high precision of isotope ratio analysis over extended periods of time.

    Brand, Willi A

    2009-06-01

    Stable isotope ratios are reliable and long-lasting process tracers. In order to compare data from different locations or different sampling times at a high level of precision, a measurement strategy must include reliable traceability to an international stable isotope scale via a reference material (RM). Since these international RMs are available in low quantities only, we have developed our own analysis schemes involving laboratory working RMs. In addition, quality assurance RMs are used to control the long-term performance of the delta-value assignments. The analysis schemes allow the construction of quality assurance performance charts over years of operation. In this contribution, the performance of three typical techniques established in IsoLab at the MPI-BGC in Jena is discussed. The techniques are (1) isotope ratio mass spectrometry with an elemental analyser for δ¹⁵N and δ¹³C analysis of bulk (organic) material, (2) high precision δ¹³C and δ¹⁸O analysis of CO₂ in clean-air samples, and (3) stable isotope analysis of water samples using a high-temperature reaction with carbon. In addition, reference strategies on a laser ablation system for high spatial resolution δ¹³C analysis in tree rings are exemplified briefly.
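
    A minimal sketch of the delta notation and the two-point scale normalisation that such working-RM schemes rely on; the reference-material values used below are placeholders, not the laboratory's actual assignments.

      # Illustrative two-point normalisation of raw delta values to an
      # international scale using two reference materials.
      def delta(ratio_sample, ratio_reference):
          # Delta value in per mil: (R_sample / R_reference - 1) * 1000.
          return (ratio_sample / ratio_reference - 1.0) * 1000.0

      def normalise(raw, raw_rm1, raw_rm2, true_rm1, true_rm2):
          # Map a raw delta onto the reference scale with a two-point calibration.
          slope = (true_rm2 - true_rm1) / (raw_rm2 - raw_rm1)
          return true_rm1 + slope * (raw - raw_rm1)

      # Hypothetical numbers: raw deltas of two working RMs and of a sample.
      print(round(normalise(raw=-24.8, raw_rm1=-29.1, raw_rm2=-10.9,
                            true_rm1=-30.0, true_rm2=-11.4), 2))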

  4. Head First Object-Oriented Analysis and Design

    McLaughlin, Brett D; West, David

    2006-01-01

    "Head First Object Oriented Analysis and Design is a refreshing look at subject of OOAD. What sets this book apart is its focus on learning. The authors have made the content of OOAD accessible, usable for the practitioner." Ivar Jacobson, Ivar Jacobson Consulting "I just finished reading HF OOA&D and I loved it! The thing I liked most about this book was its focus on why we do OOA&D-to write great software!" Kyle Brown, Distinguished Engineer, IBM "Hidden behind the funny pictures and crazy fonts is a serious, intelligent, extremely well-crafted presentation of OO Analysis and Design

  5. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  6. Visual Field Preferences of Object Analysis for Grasping with One Hand

    Ada eLe

    2014-10-01

    When we grasp an object using one hand, the opposite hemisphere predominantly guides the motor control of grasp movements (Davare et al. 2007; Rice et al. 2007). However, it is unclear whether visual object analysis for grasp control relies more on inputs (a) from the contralateral than the ipsilateral visual field, (b) from one dominant visual field regardless of the grasping hand, or (c) from both visual fields equally. For bimanual grasping of a single object we have recently demonstrated a visual field preference for the left visual field (Le and Niemeier 2013a, 2013b), consistent with a general right-hemisphere dominance for sensorimotor control of bimanual grasps (Le et al., 2013). But visual field differences have never been tested for unimanual grasping. Therefore, here we asked right-handed participants to fixate to the left or right of an object and then grasp the object either with their right or left hand using a precision grip. We found that participants grasping with their right hand performed better with objects in the right visual field: maximum grip apertures (MGAs) were more closely matched to the object width and were smaller than for objects in the left visual field. In contrast, when people grasped with their left hand, preferences switched to the left visual field. What is more, MGA scaling showed greater visual field differences compared to right-hand grasping. Our data suggest that visual object analysis for unimanual grasping shows a preference for visual information from the ipsilateral visual field, and that the left hemisphere is better equipped to control grasps in both visual fields.

  7. Precision analysis for standard deviation measurements of immobile single fluorescent molecule images.

    DeSantis, Michael C; DeCenzo, Shawn H; Li, Je-Luen; Wang, Y M

    2010-03-29

    Standard deviation measurements of intensity profiles of stationary single fluorescent molecules are useful for studying axial localization, molecular orientation, and a fluorescence imaging system's spatial resolution. Here we report on the analysis of the precision of standard deviation measurements of intensity profiles of single fluorescent molecules imaged using an EMCCD camera. We have developed an analytical expression for the standard deviation measurement error of a single image which is a function of the total number of detected photons, the background photon noise, and the camera pixel size. The theoretical results agree well with the experimental, simulation, and numerical integration results. Using this expression, we show that single-molecule standard deviation measurements offer nanometer precision for a large range of experimental parameters.
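
    The analytical expression itself is not reproduced in this record, so the sketch below only mirrors the simulation side of such an analysis: a pixelated Gaussian spot with Poisson shot noise and a uniform background is generated repeatedly, the profile standard deviation is estimated from second moments, and the spread of those estimates is reported. All parameter values are assumptions for illustration.

      import numpy as np

      rng = np.random.default_rng(42)
      pixel = 0.5          # pixel size in units of the true profile standard deviation
      sigma_true = 1.0
      n_photons = 2000     # expected signal photons per image
      background = 2.0     # expected background photons per pixel

      # Pixel centre coordinates of a 20 x 20 detector window around the molecule.
      grid = (np.arange(-10, 10) + 0.5) * pixel
      xx, yy = np.meshgrid(grid, grid)
      psf = np.exp(-(xx**2 + yy**2) / (2 * sigma_true**2))
      psf /= psf.sum()

      estimates = []
      for _ in range(500):
          image = rng.poisson(n_photons * psf + background).astype(float)
          image -= background                      # subtract the known background level
          image = np.clip(image, 0.0, None)
          total = image.sum()
          x_mean = (image * xx).sum() / total
          var_x = (image * (xx - x_mean) ** 2).sum() / total
          estimates.append(np.sqrt(var_x))

      print("mean sigma estimate:", round(float(np.mean(estimates)), 3),
            "precision (sd of estimates):", round(float(np.std(estimates)), 3))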

  8. Precision Nutrition 4.0: A Big Data and Ethics Foresight Analysis--Convergence of Agrigenomics, Nutrigenomics, Nutriproteomics, and Nutrimetabolomics.

    Özdemir, Vural; Kolker, Eugene

    2016-02-01

    Nutrition is central to sustenance of good health, not to mention its role as a cultural object that brings together or draws lines among societies. Undoubtedly, understanding the future paths of nutrition science in the current era of Big Data remains firmly on science, technology, and innovation strategy agendas around the world. Nutrigenomics, the confluence of nutrition science with genomics, brought about a new focus on and legitimacy for "variability science" (i.e., the study of mechanisms of person-to-person and population differences in response to food, and the ways in which food variably impacts the host, for example, nutrient-related disease outcomes). Societal expectations, both public and private, and claims over genomics-guided and individually-tailored precision diets continue to proliferate. While the prospects of nutrition science, and nutrigenomics in particular, are established, there is a need to integrate the efforts in four Big Data domains that are naturally allied--agrigenomics, nutrigenomics, nutriproteomics, and nutrimetabolomics--that address complementary variability questions pertaining to individual differences in response to food-related environmental exposures. The joint use of these four omics knowledge domains, coined as Precision Nutrition 4.0 here, has sadly not been realized to date, but the potentials for such integrated knowledge innovation are enormous. Future personalized nutrition practices would benefit from a seamless planning of life sciences funding, research, and practice agendas from "farm to clinic to supermarket to society," and from "genome to proteome to metabolome." Hence, this innovation foresight analysis explains the already existing potentials waiting to be realized, and suggests ways forward for innovation in both technology and ethics foresight frames on precision nutrition. We propose the creation of a new Precision Nutrition Evidence Barometer for periodic, independent, and ongoing retrieval, screening

  9. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    Adam W Green

    Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools), using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence, with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
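
    A stylised stochastic patch-occupancy simulation, not the authors' Bayesian model, showing how a quasi-extinction probability can be estimated as a function of the number of pools; the colonisation and persistence probabilities, the threshold and the time horizon are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(7)

      def quasi_extinction_prob(n_pools, p_colonise=0.25, p_persist=0.8,
                                threshold=2, years=50, n_sims=2000):
          extinct = 0
          for _ in range(n_sims):
              occupied = rng.random(n_pools) < 0.5          # initial occupancy
              for _ in range(years):
                  frac = occupied.mean()
                  # Occupied pools persist with p_persist; empty pools are colonised
                  # in proportion to the currently occupied fraction.
                  stay = occupied & (rng.random(n_pools) < p_persist)
                  gain = ~occupied & (rng.random(n_pools) < p_colonise * frac)
                  occupied = stay | gain
                  if occupied.sum() < threshold:
                      extinct += 1
                      break
          return extinct / n_sims

      for n in (10, 30, 50, 80):
          print(n, "pools -> quasi-extinction probability:", quasi_extinction_prob(n))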

  10. Mediman: Object oriented programming approach for medical image analysis

    Coppens, A.; Sibomana, M.; Bol, A.; Michel, C.

    1993-01-01

    Mediman is a new image analysis package which has been developed to quantitatively analyze Positron Emission Tomography (PET) data. It is object-oriented, written in C++, and its user interface is based on InterViews, on top of which new classes have been added. Mediman accesses data using an external data representation or an import/export mechanism which avoids data duplication. Multimodality studies are organized in a simple database which includes images, headers, color tables, lists, objects of interest (OOIs) and history files. Stored color table parameters allow the user to focus directly on the interesting portion of the dynamic range. Lists allow the study to be organized according to modality, acquisition protocol, time and spatial properties. OOIs (points, lines and regions) are stored in absolute 3-D coordinates, allowing correlation with other co-registered imaging modalities such as MRI or SPECT. OOIs have visualization properties and are organized into groups. Quantitative ROI analysis of anatomic images consists of position, distance and volume calculation on selected OOIs. An image calculator is connected to Mediman. Quantitation of metabolic images is performed via profiles, sectorization, time activity curves and kinetic modeling. Mediman is menu and mouse driven, macro-commands can be registered and replayed, and its interface is customizable through a configuration file. The benefits of the object-oriented approach are discussed from a development point of view

  11. Elevation data fitting and precision analysis of Google Earth in road survey

    Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei

    2018-05-01

    Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper intends to apply Google Earth to the feasibility study stage of road survey and design. Because Google Earth elevation data lack precision, this paper focuses on finding several different fitting or interpolation methods to improve the data precision, in order to meet, as far as possible, the accuracy requirements of road survey and design specifications. Method: On the basis of the elevation differences at a limited number of common points, the elevation difference at any other point can be fitted or interpolated. Thus, a precise elevation can be obtained by subtracting the elevation difference from the Google Earth data. The quadratic polynomial surface fitting method, the cubic polynomial surface fitting method, the V4 interpolation method in MATLAB and a neural network method are used in this paper to process Google Earth elevation data, and internal conformity, external conformity and the cross correlation coefficient are used as evaluation indexes to evaluate the data processing effect. Results: There is no fitting difference at the fitting points when using the V4 interpolation method; its external conformity is the largest and its accuracy improvement is the worst, so the V4 interpolation method is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a similar fitting effect to the cubic polynomial surface fitting method, but its fitting effect is better in the case of a larger elevation difference. Because the neural network method is an unmanageable fitting model, the cubic polynomial surface fitting method should be mainly used and the neural network method can be used as the auxiliary method in the case of larger elevation differences. Conclusions: The cubic polynomial surface fitting method can obviously
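
    A hedged sketch of the cubic-polynomial-surface idea only (not the paper's data or code): the elevation difference at control points is fitted as a bivariate cubic polynomial by least squares, and the internal conformity is taken here as the RMS of the residuals at the fitting points. The coordinates, the assumed error surface and the noise level are synthetic.

      import numpy as np

      def cubic_design(x, y):
          # All monomials x**i * y**j with i + j <= 3 (10 terms).
          return np.column_stack([x**i * y**j
                                  for i in range(4) for j in range(4 - i)])

      rng = np.random.default_rng(3)
      x, y = rng.uniform(0.0, 1.0, (2, 60))                  # control point coordinates
      true_diff = 2 + 3*x - 1.5*y + 0.8*x*y - 0.6*x**3       # assumed elevation error surface
      observed_diff = true_diff + rng.normal(0.0, 0.05, 60)  # plus measurement noise

      coeffs, *_ = np.linalg.lstsq(cubic_design(x, y), observed_diff, rcond=None)

      # "Internal conformity" taken here as the RMS of residuals at the fitting points.
      residuals = observed_diff - cubic_design(x, y) @ coeffs
      print("internal RMS:", round(float(np.sqrt(np.mean(residuals**2))), 3))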

  12. Analysis and Optimization of Dynamic Measurement Precision of Fiber Optic Gyroscope

    Hui Li

    2013-01-01

    In order to improve the dynamic performance of a high precision interferometric fiber optic gyroscope (IFOG), the factors influencing its fast response characteristics are analyzed based on a proposed assistant design setup, and a high dynamic detection method is proposed to suppress the adverse effects of the key influencing factors. The assistant design platform is built by using virtual instrument technology for the IFOG, which can monitor the closed-loop state variables in real time for analyzing the influence of both the optical components and the detection circuit on the dynamic performance of the IFOG. The analysis results indicate that the nonlinearity of the optical Sagnac effect, optical parameter uncertainty, the dynamic characteristics of internal modules and the time delay of the signal detection circuit are the major causes of dynamic performance deterioration, which can induce potential system instability in practical control systems. By taking all these factors into consideration, we design a robust control algorithm to realize high dynamic closed-loop detection of the IFOG. Finally, experiments show that the improved 0.01 deg/h high precision IFOG with the proposed control algorithm can achieve fast tracking and good dynamic measurement precision.

  13. Iso-precision scaling of digitized mammograms to facilitate image analysis

    Karssmeijer, N.; van Erning, L.

    1991-01-01

    This paper reports on a 12 bit CCD camera equipped with a linear sensor of 4096 photodiodes which is used to digitize conventional mammographic films. An iso-precision conversion of the pixel values is performed to transform the image data to a scale on which the image noise is equal at each level. For this purpose, film noise and digitization noise have been determined as a function of optical density and pixel size. It appears that only at high optical densities is digitization noise comparable to or larger than film noise. The quantization error caused by compression of images recorded with 12 bits per pixel to 8 bit images by an iso-precision conversion has been calculated as a function of the number of quantization levels. For mammograms digitized in a 4096 × 4096 matrix, the additional error caused by such a scale transform is only about 1.5 percent. An iso-precision scale transform can be advantageous when automated procedures for quantitative image analysis are developed. Especially when detection of signals in noise is aimed at, a constant noise level over the whole pixel value range is very convenient. This is demonstrated by applying local thresholding to detect small microcalcifications. Results are compared to those obtained by using logarithmic or linearized scales

  14. EGYPTIAN MUTUAL FUNDS ANALYSIS: HISTORY, PERFORMANCE, OBJECTIVES, RISK AND RETURN

    Petru STEFEA

    2013-10-01

    The present research aims to give an overview of mutual funds in Egypt. The first mutual funds were established in 1994. Nowadays, the total number of mutual funds has reached approximately 90. Income funds represent the largest share of the Egyptian mutual funds (40%), followed by growth funds (25%), while private equity funds hold the smallest share (at least 1%). The total population of the Egyptian mutual funds reached 22. Finally, the study showed that the Egyptian mutual funds have an impact on fund return, total risk and systematic risk when analysing the relationship between risk and return. The study also found an influence of the mutual funds' objectives on the Sharpe and Treynor ratios.
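
    For reference, the two performance ratios named above relate excess return to total risk (Sharpe) and to systematic risk (Treynor); the brief sketch below uses invented numbers.

      # Sharpe and Treynor ratios: excess return per unit of total risk (standard
      # deviation) and per unit of systematic risk (beta). Numbers are invented.
      def sharpe(fund_return, risk_free, sigma):
          return (fund_return - risk_free) / sigma

      def treynor(fund_return, risk_free, beta):
          return (fund_return - risk_free) / beta

      print(round(sharpe(0.12, 0.09, 0.18), 3), round(treynor(0.12, 0.09, 1.1), 3))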

  15. voom: Precision weights unlock linear model analysis tools for RNA-seq read counts.

    Law, Charity W; Chen, Yunshun; Shi, Wei; Smyth, Gordon K

    2014-02-03

    New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods.
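
    A rough, voom-like sketch of the mean-variance weighting idea (this is not the limma/voom implementation, and the simulated counts, library-size handling and lowess span are assumptions): the square root of the per-gene standard deviation of log-CPM values is smoothed against the mean log-CPM, and the fitted trend is inverted to the fourth power to give precision weights.

      import numpy as np
      from statsmodels.nonparametric.smoothers_lowess import lowess

      rng = np.random.default_rng(0)
      counts = rng.negative_binomial(n=5, p=0.1, size=(2000, 6))   # genes x samples

      lib_size = counts.sum(axis=0)
      log_cpm = np.log2((counts + 0.5) / (lib_size + 1.0) * 1e6)   # log counts per million

      mean_log = log_cpm.mean(axis=1)
      sqrt_sd = np.sqrt(log_cpm.std(axis=1, ddof=1))

      # Lowess estimate of the mean-variance trend, evaluated at each gene's mean.
      trend = lowess(sqrt_sd, mean_log, frac=0.5, return_sorted=True)
      fitted = np.interp(mean_log, trend[:, 0], trend[:, 1])

      weights = fitted ** -4            # voom-style precision weights
      print("weight range:", round(float(weights.min()), 2), "-",
            round(float(weights.max()), 2))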

  16. Object-Based Image Analysis in Wetland Research: A Review

    Iryna Dronova

    2015-05-01

    The applications of object-based image analysis (OBIA) in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of the local noise, incorporating meaningful non-spectral features for class separation and accounting for landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scale and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications and the main research needs and directions to expand the OBIA capacity in the future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  17. Objective image analysis of the meibomian gland area.

    Arita, Reiko; Suehiro, Jun; Haraguchi, Tsuyoshi; Shirakawa, Rika; Tokoro, Hideaki; Amano, Shiro

    2014-06-01

    To evaluate objectively the meibomian gland area using newly developed software for non-invasive meibography. Eighty eyelids of 42 patients without meibomian gland loss (meiboscore=0), 105 eyelids of 57 patients with loss of less than one-third total meibomian gland area (meiboscore=1), 13 eyelids of 11 patients with between one-third and two-thirds loss of meibomian gland area (meiboscore=2) and 20 eyelids of 14 patients with two-thirds loss of meibomian gland area (meiboscore=3) were studied. Lid borders were automatically determined. The software evaluated the distribution of the luminance and, by enhancing the contrast and reducing image noise, the meibomian gland area was automatically discriminated. The software calculated the ratio of the total meibomian gland area relative to the total analysis area in all subjects. Repeatability of the software was also evaluated. The mean ratio of the meibomian gland area to the total analysis area in the upper/lower eyelids was 51.9±5.7%/54.7±5.4% in subjects with a meiboscore of 0, 47.7±6.0%/51.5±5.4% in those with a meiboscore of 1, 32.0±4.4%/37.2±3.5% in those with a meiboscore of 2 and 16.7±6.4%/19.5±5.8% in subjects with a meiboscore of 3. The meibomian gland area was objectively evaluated using the developed software. This system could be useful for objectively evaluating the effect of treatment on meibomian gland dysfunction.
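
    A simplified sketch of the area-ratio step only (lid border detection, contrast enhancement and noise reduction are omitted, and the image and threshold are synthetic): gland pixels inside the analysis area are thresholded and reported as a percentage of that area.

      import numpy as np

      def gland_area_ratio(image, analysis_mask, threshold):
          # Pixels at or above the threshold inside the analysis area count as gland.
          glands = (image >= threshold) & analysis_mask
          return 100.0 * glands.sum() / analysis_mask.sum()

      rng = np.random.default_rng(5)
      image = rng.integers(0, 256, size=(200, 300))    # stand-in for an enhanced meibography image
      mask = np.zeros((200, 300), dtype=bool)
      mask[40:160, 30:270] = True                      # analysis area inside the detected lid borders
      print(round(gland_area_ratio(image, mask, threshold=128), 1), "%")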

  18. Virtual learning object and environment: a concept analysis.

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concept of virtual learning object and environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with the search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed based on simple descriptive statistics and the concepts through lexicographic analysis with support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%) studies. A virtual learning environment includes several and different types of virtual learning objects in a common pedagogical context.

  19. Trace element analysis by EPMA in geosciences: detection limit, precision and accuracy

    Batanova, V. G.; Sobolev, A. V.; Magnin, V.

    2018-01-01

    Use of the electron probe microanalyser (EPMA) for trace element analysis has increased over the last decade, mainly because of improved stability of spectrometers and the electron column when operated at high probe current; development of new large-area crystal monochromators and ultra-high count rate spectrometers; full integration of energy-dispersive / wavelength-dispersive X-ray spectrometry (EDS/WDS) signals; and the development of powerful software packages. For phases that are stable under a dense electron beam, the detection limit and precision can be decreased to the ppm level by using high acceleration voltage and beam current combined with long counting time. Data on 10 elements (Na, Al, P, Ca, Ti, Cr, Mn, Co, Ni, Zn) in olivine obtained on a JEOL JXA-8230 microprobe with a tungsten filament show that the detection limit decreases in proportion to the inverse square root of the counting time and probe current. For all elements equal to or heavier than phosphorus (Z = 15), the detection limit decreases with increasing accelerating voltage. The analytical precision for minor and trace elements analysed in olivine at 25 kV accelerating voltage and 900 nA beam current is 4 - 18 ppm (2 standard deviations of repeated measurements of the olivine reference sample) and is similar to the detection limit of the corresponding elements. To analyse trace elements accurately requires careful estimation of background, and consideration of sample damage under the beam and secondary fluorescence from phase boundaries. The development and use of matrix reference samples with well-characterised trace elements of interest is important for monitoring and improving the accuracy. An evaluation of the accuracy of trace element analyses in olivine has been made by comparing EPMA data for new reference samples with data obtained by different in-situ and bulk analytical methods in six different laboratories worldwide. For all elements, the measured concentrations in the olivine reference sample

  20. The precision of textural analysis in ¹⁸F-FDG PET scans of oesophageal cancer

    Doumou, Georgia; Siddique, Musib [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Tsoumpas, Charalampos [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); University of Leeds, The Division of Medical Physics, Leeds (United Kingdom); Goh, Vicky [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Guy's and St Thomas' Hospitals NHS Foundation Trust, Radiology Department, London (United Kingdom); Cook, Gary J. [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Guy's and St Thomas' Hospitals NHS Foundation Trust, The PET Centre, London (United Kingdom); University of Leeds, The Division of Medical Physics, Leeds (United Kingdom); St Thomas' Hospital, Clinical PET Centre, Division of Imaging Sciences and Biomedical Engineering, Kings College London, London (United Kingdom)

    2015-09-15

    Measuring tumour heterogeneity by textural analysis in ¹⁸F-fluorodeoxyglucose positron emission tomography (¹⁸F-FDG PET) provides predictive and prognostic information but technical aspects of image processing can influence parameter measurements. We therefore tested effects of image smoothing, segmentation and quantisation on the precision of heterogeneity measurements. Sixty-four ¹⁸F-FDG PET/CT images of oesophageal cancer were processed using different Gaussian smoothing levels (2.0, 2.5, 3.0, 3.5, 4.0 mm), maximum standardised uptake value (SUVmax) segmentation thresholds (45 %, 50 %, 55 %, 60 %) and quantisation (8, 16, 32, 64, 128 bin widths). Heterogeneity parameters included grey-level co-occurrence matrix (GLCM), grey-level run length matrix (GLRL), neighbourhood grey-tone difference matrix (NGTDM), grey-level size zone matrix (GLSZM) and fractal analysis methods. The concordance correlation coefficient (CCC) for the three processing variables was calculated for each heterogeneity parameter. Most parameters showed poor agreement between different bin widths (CCC median 0.08, range 0.004-0.99). Segmentation and smoothing showed smaller effects on precision (segmentation: CCC median 0.82, range 0.33-0.97; smoothing: CCC median 0.99, range 0.58-0.99). Smoothing and segmentation have only a small effect on the precision of heterogeneity measurements in ¹⁸F-FDG PET data. However, quantisation often has larger effects, highlighting a need for further evaluation and standardisation of parameters for multicentre studies. (orig.)
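
    Lin's concordance correlation coefficient used as the agreement measure above can be computed directly; the sketch below uses the usual sample formula and invented parameter values for two processing settings.

      import numpy as np

      def ccc(x, y):
          # Lin's concordance correlation coefficient for paired measurements.
          x, y = np.asarray(x, float), np.asarray(y, float)
          covariance = np.cov(x, y, ddof=1)[0, 1]
          return 2 * covariance / (x.var(ddof=1) + y.var(ddof=1)
                                   + (x.mean() - y.mean()) ** 2)

      a = [1.0, 2.1, 3.2, 4.1, 5.0]    # a heterogeneity parameter at one bin width
      b = [1.1, 2.0, 3.4, 4.0, 5.2]    # the same parameter at another bin width
      print(round(ccc(a, b), 3))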

  1. Significant improvement of accuracy and precision in the determination of trace rare earths by fluorescence analysis

    Ozawa, L.; Hersh, H.N.

    1976-01-01

    Most of the rare earths in yttrium, gadolinium and lanthanum oxides emit characteristic fluorescent line spectra under irradiation with photons, electrons and x rays. The sensitivity and selectivity of the rare earth fluorescences are high enough to determine the trace amounts (0.01 to 100 ppm) of rare earths. The absolute fluorescent intensities of solids, however, are markedly affected by the synthesis procedure, level of contamination and crystal perfection, resulting in poor accuracy and low precision for the method (larger than 50 percent error). Special care in preparation of the samples is required to obtain good accuracy and precision. It is found that the accuracy and precision for the determination of trace (less than 10 ppm) rare earths by fluorescence analysis improved significantly, while still maintaining the sensitivity, when the determination is made by comparing the ratio of the fluorescent intensities of the trace rare earths to that of a deliberately added rare earth as reference. The variation in the absolute fluorescent intensity remains, but is compensated for by measuring the fluorescent line intensity ratio. Consequently, the determination of trace rare earths (with less than 3 percent error) is easily made by a photoluminescence technique in which the rare earths are excited directly by photons. Accuracy is still maintained when the absolute fluorescent intensity is reduced by 50 percent through contamination by Ni, Fe, Mn or Pb (about 100 ppm). Determination accuracy is also improved for fluorescence analysis by electron excitation and x-ray excitation. For some rare earths, however, accuracy by these techniques is reduced because indirect excitation mechanisms are involved. The excitation mechanisms and the interferences between rare earths are also reported
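
    A toy illustration of the ratio idea described above (the intensities and the calibration factor are invented): because a contamination- or preparation-related attenuation affects the analyte line and the deliberately added reference line by the same factor, the ratio-based determination is unchanged while an absolute-intensity determination would be halved.

      # Ratio-based determination with a deliberately added reference rare earth.
      # k_calibration is a hypothetical calibration factor relating the intensity
      # ratio to concentration; intensities are in arbitrary units.
      def concentration_from_ratio(i_analyte, i_reference, k_calibration=25.0):
          return k_calibration * (i_analyte / i_reference)

      # A 50 percent loss of absolute intensity affects both lines equally,
      # so the ratio-based result is unchanged.
      print(concentration_from_ratio(1200.0, 6000.0))
      print(concentration_from_ratio(600.0, 3000.0))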

  2. A method of precise profile analysis of diffuse scattering for the KENS pulsed neutrons

    Todate, Y.; Fukumura, T.; Fukazawa, H.

    2001-01-01

    An outline of our profile analysis method, which is now of practical use for the asymmetric KENS pulsed thermal neutrons, is presented. The analysis of the diffuse scattering from a single crystal of D₂O is shown as an example. The pulse shape function is based on the Ikeda-Carpenter function adjusted for the KENS neutron pulses. The convoluted intensity is calculated by a Monte-Carlo method and the precision of the calculation is controlled. Fitting parameters in the model cross section can be determined by the built-in nonlinear least square fitting procedure. Because this method is the natural extension of the procedure conventionally used for triple-axis data, it is easy to apply with generality and versatility. Most importantly, this method has the capability of precisely correcting the time shift of the observed peak position which is inevitably caused in the case of highly asymmetric pulses and a broad scattering function. It will be pointed out that the accurate determination of the true time-of-flight is important especially in single crystal inelastic experiments. (author)

  3. A novel algorithm for a precise analysis of subchondral bone alterations

    Gao, Liang; Orth, Patrick; Goebel, Lars K. H.; Cucchiarini, Magali; Madry, Henning

    2016-01-01

    Subchondral bone alterations are emerging as considerable clinical problems associated with articular cartilage repair. Their analysis exposes a pattern of variable changes, including intra-lesional osteophytes, residual microfracture holes, peri-hole bone resorption, and subchondral bone cysts. A precise distinction between them is becoming increasingly important. Here, we present a tailored algorithm based on continuous data to analyse subchondral bone changes using micro-CT images, allowing for a clear definition of each entity. We evaluated this algorithm using data sets originating from two large animal models of osteochondral repair. Intra-lesional osteophytes were detected in 3 of 10 defects in the minipig and in 4 of 5 defects in the sheep model. Peri-hole bone resorption was found in 22 of 30 microfracture holes in the minipig and in 17 of 30 microfracture holes in the sheep model. Subchondral bone cysts appeared in 1 microfracture hole in the minipig and in 5 microfracture holes in the sheep model (n = 30 holes each). Calculation of inter-rater agreement (90% agreement) and Cohen’s kappa (kappa = 0.874) revealed that the novel algorithm is highly reliable, reproducible, and valid. Comparison analysis with the best existing semi-quantitative evaluation method was also performed, supporting the enhanced precision of this algorithm. PMID:27596562
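
    Inter-rater agreement and Cohen's kappa of the kind reported above can be computed as follows; the two raters' classifications here are invented and are not the study's data.

      import numpy as np

      def cohens_kappa(rater1, rater2):
          r1, r2 = np.asarray(rater1), np.asarray(rater2)
          categories = np.union1d(r1, r2)
          p_observed = np.mean(r1 == r2)
          p_expected = sum(np.mean(r1 == c) * np.mean(r2 == c) for c in categories)
          return (p_observed - p_expected) / (1.0 - p_expected)

      r1 = ["resorption", "cyst", "normal", "resorption", "normal", "resorption"]
      r2 = ["resorption", "cyst", "normal", "resorption", "resorption", "resorption"]
      print("agreement:", round(float(np.mean(np.asarray(r1) == np.asarray(r2))), 2),
            "kappa:", round(float(cohens_kappa(r1, r2)), 2))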

  4. A high precision mass spectrometer for hydrogen isotopic analysis of water samples

    Murthy, M.S.; Prahallada Rao, B.S.; Handu, V.K.; Satam, J.V.

    1979-01-01

    A high precision mass spectrometer with two ion collector assemblies and a direct on-line reduction facility (with uranium at 700 °C) for water samples for hydrogen isotopic analysis has been designed and developed. The ion source gives particularly high sensitivity and at the same time limits the H₃⁺ ions to a minimum. A digital ratiometer with an H₂⁺ compensator has also been developed. The overall precision obtained on the spectrometer is 0.07% (2σ₁₀). Typical results on the performance of the spectrometer, which has been working for a year and a half, are given. Possible methods of extending the concentration ranges the spectrometer can handle, on both the lower and higher sides, are discussed. Problems of memory between samples are briefly listed. A multiple inlet system to overcome these problems is suggested. This will also enable faster analysis when samples of highly varying concentrations are to be analyzed. A few probable areas in which the spectrometer will shortly be put to use are given. (auth.)

  5. Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.

    LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M

    2009-06-01

    Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared to carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of δ¹⁸O(PO₄) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry that uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO₄³⁻ and permitting Ca, Sr and Nd to be eluted and purified further for the measurement of δ⁴⁴Ca and ⁸⁷Sr/⁸⁶Sr in modern biogenic apatite and ¹⁴³Nd/¹⁴⁴Nd in fossil apatite. Our methodology yields an external precision of ±0.15‰ (1σ) for δ¹⁸O(PO₄). The uncertainty is related to the preparation of the Ag₃PO₄ salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw δ¹⁸O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available. This fact, and the

  6. Joint Tensor Feature Analysis For Visual Object Recognition.

    Wong, Wai Keung; Lai, Zhihui; Xu, Yong; Wen, Jiajun; Ho, Chu Po

    2015-11-01

    Tensor-based object recognition has been widely studied in the past several years. This paper focuses on the issue of joint feature selection from tensor data and proposes a novel method called joint tensor feature analysis (JTFA) for tensor feature extraction and recognition. In order to obtain a set of jointly sparse projections for tensor feature extraction, we define the modified within-class tensor scatter value and the modified between-class tensor scatter value for regression. The k-mode optimization technique and the L(2,1)-norm jointly sparse regression are combined together to compute the optimal solutions. The convergence analysis, computational complexity analysis and the essence of the proposed method/model are also presented. It is interesting to show that the proposed method is very similar to singular value decomposition on the scatter matrix but with a sparsity constraint on the right singular value matrix, or to eigen-decomposition on the scatter matrix performed in a sparse manner. Experimental results on some tensor datasets indicate that JTFA outperforms some well-known tensor feature extraction and selection algorithms.

  7. Analysis and Comparison of Objective Methods for Image Quality Assessment

    P. S. Babkin

    2014-01-01

    The purpose of this work is the research and modification of reference objective methods for image quality assessment. The ultimate goal is to obtain a modification of formal assessments that corresponds more closely to the subjective expert estimates (MOS). In considering the formal reference objective methods for image quality assessment we used the results of other authors, who offer results and comparative analyses of the most effective algorithms. Based on these investigations we have chosen the two most successful algorithms, for which a further analysis was made in MATLAB 7.8 (R2009a) (PQS and MSSSIM). The publication focuses on features of the algorithms which have great importance in practical implementation but are insufficiently covered in the publications of other authors. In the implemented modification of the PQS algorithm, the Kirsch boundary detector was replaced by the Canny boundary detector. Further experiments were carried out according to the method of ITU-R BT.500-13 (01/2012) using monochrome images treated with different types of filters (it should be emphasized that the PQS objective image quality assessment is applicable only to monochrome images). The images were obtained with a thermal imaging surveillance system. The experimental results proved the effectiveness of this modification. In the specialized literature on formal image quality evaluation methods, this type of modification has not been mentioned. The method described in the publication can be applied to various practical implementations of digital image processing. The advisability and effectiveness of using the modified PQS method to assess structural differences between images are shown in the article, and this will be used in solving problems of identification and automatic control.

  8. Analysis of Camera Parameters Value in Various Object Distances Calibration

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

    In photogrammetric applications, good camera parameters are needed for mapping purposes, for example with an Unmanned Aerial Vehicle (UAV) equipped with non-metric cameras. Simple camera calibration is a common laboratory procedure for obtaining camera parameter values. In aerial mapping, interior camera parameter values from close-range camera calibration are used to correct the image error. However, the causes and effects of the calibration steps used to obtain accurate mapping need to be analysed. Therefore, this research aims to contribute an analysis of camera parameters obtained with a portable calibration frame of 1.5 × 1 meter in size. Object distances of two, three, four, five, and six meters are the research focus. Results are analysed to find out the changes in image and camera parameter values. Hence, the calibration parameters of a camera are considered to differ depending on the type of calibration parameters and the object distance

  9. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

    Technical development and practical evaluation of a laboratory built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, like fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct, numerical fitting or Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has a considerable potential to facilitate and render more precise the condition surveys of works of art.

  10. Evaluation and analysis of real-time precise orbits and clocks products from different IGS analysis centers

    Zhang, Liang; Yang, Hongzhou; Gao, Yang; Yao, Yibin; Xu, Chaoqian

    2018-06-01

    To meet the increasing demands of real-time Precise Point Positioning (PPP) users, real-time satellite orbit and clock products are generated by different International GNSS Service (IGS) real-time analysis centers and can be publicly received through the Internet. Based on different data sources and processing strategies, the real-time products from different analysis centers therefore differ in availability and accuracy. The main objective of this paper is to evaluate the availability and accuracy of different real-time products and their effects on real-time PPP. A total of nine commonly used Real-Time Service (RTS) products, namely IGS01, IGS03, CLK01, CLK15, CLK22, CLK52, CLK70, CLK81 and CLK90, are evaluated in this paper. Because not all RTS products support multi-GNSS, only GPS products are analyzed. Firstly, the availability of all RTS products is analyzed at two levels. The first level is the epoch availability, indicating whether there is an outage for that epoch. The second level is the satellite availability, which defines the number of available satellites for each epoch. Then the accuracy of different RTS products is investigated in terms of nominal accuracy and accuracy degradation over time. Results show that the Root-Mean-Square Error (RMSE) of the satellite orbits ranges from 3.8 cm to 7.5 cm for different RTS products, while the mean Standard Deviations of Errors (STDE) of the satellite clocks range from 1.9 cm to 5.6 cm. The modified Signal In Space Range Error (SISRE) values for all products are from 1.3 cm to 5.5 cm. The accuracy degradation of the orbit shows a linear trend for all RTS products, and the satellite clock degradation depends on the satellite clock type. The Rb clocks on board GPS IIF satellites have the smallest degradation rate of less than 3 cm over 10 min, while the Cs clocks on board GPS IIF have the largest degradation rate of more than 10 cm over 10 min. Finally, the real-time kinematic PPP is
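
    For orientation only, a commonly used global-average SISRE approximation for GPS combines the radial orbit error with the clock error and down-weights the along-track and cross-track components (weights of about 0.98 and 1/49 are often quoted); this is not necessarily the exact modified SISRE variant used in the paper, and the error values below are invented.

      import numpy as np

      # Common GPS SISRE approximation: sqrt((0.98*dR - dT)^2 + (dA^2 + dC^2)/49),
      # with all errors in metres (dR radial, dA along-track, dC cross-track, dT clock).
      def sisre(radial, along, cross, clock, w_radial=0.98, ac_divisor=49.0):
          return np.sqrt((w_radial * radial - clock) ** 2
                         + (along**2 + cross**2) / ac_divisor)

      # Example: 4 cm radial, 6 cm along-track, 5 cm cross-track, 3 cm clock error.
      print(round(float(sisre(0.04, 0.06, 0.05, 0.03)), 4), "m")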

  11. Tendency for interlaboratory precision in the GMO analysis method based on real-time PCR.

    Kodama, Takashi; Kurosawa, Yasunori; Kitta, Kazumi; Naito, Shigehiro

    2010-01-01

    The Horwitz curve estimates interlaboratory precision as a function only of concentration, and is frequently used as a method performance criterion in food analysis with chemical methods. Quantitative biochemical methods based on real-time PCR require an analogous criterion to progressively promote method validation. We analyzed the tendency of precision using a simplex real-time PCR technique in 53 collaborative studies of seven genetically modified (GM) crops. The reproducibility standard deviation (SR) and repeatability standard deviation (Sr) of the genetically modified organism (GMO) amount (%) were more or less independent of the GM crop (i.e., maize, soybean, cotton, oilseed rape, potato, sugar beet, and rice) and of the evaluation procedure steps. Some studies evaluated whole steps consisting of DNA extraction and PCR quantitation, whereas others focused only on the PCR quantitation step by using DNA extraction solutions. Therefore, SR and Sr for the GMO amount (%) are functions only of concentration, similar to the Horwitz curve. We proposed SR = 0.1971C^0.8685 and Sr = 0.1478C^0.8424, where C is the GMO amount (%). We also proposed a method performance index for GMO quantitative methods that is analogous to the Horwitz Ratio.

  12. Analysis and experiments of a novel and compact 3-DOF precision positioning platform

    Huang, Hu; Zhao, Hongwei; Fan, Zunqiang; Zhang, Hui; Ma, Zhichao; Yang, Zhaojun

    2013-01-01

    A novel 3-DOF precision positioning platform with dimensions of 48 mm × 50 mm × 35 mm was designed by integrating piezo actuators and flexure hinges. The platform has a compact structure but it can do high precision positioning in three axes. The dynamic model of the platform in a single direction was established. Stiffness of the flexure hinges and modal characteristics of the flexure hinge mechanism were analyzed by the finite element method. Output displacements of the platform along three axes were forecasted via stiffness analysis. Output performance of the platform in x and y axes with open-loop control as well as the z-axis with closed-loop control was tested and discussed. The preliminary application of the platform in the field of nanoindentation indicates that the designed platform works well during nanoindentation tests, and the closed-loop control ensures the linear displacement output. With suitable control, the platform has the potential to realize different positioning functions under various working conditions.

  13. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Taylan Ocalan

    Nowadays, the Global Positioning System (GPS) is used effectively in several engineering applications for survey purposes by multiple disciplines. Web-based online services developed by several organizations, which are user friendly, unlimited and mostly free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. When centimeter (cm) or decimeter (dm) level accuracies are desired, they can be obtained easily through these services for engineering applications of different quality requirements. In this paper, a test study was conducted on the ISKI-CORS network, Istanbul, Turkey, in order to figure out the accuracy of the most widely used web-based online services around the world (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS, magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated by using both the online services and the Bernese 5.0 scientific GPS processing software from a 24-hour GPS data set, and then the coordinate differences between the online services and the Bernese processing software were computed. From the evaluations, it was seen that the individual differences were less than 10 mm for the relative online services and less than 20 mm for the precise point positioning services. The accuracy analysis was based on these coordinate differences and the standard deviations of the coordinates obtained from the different techniques, and then the online services were compared to each other. The results show that the position accuracies obtained by the associated online services provide highly accurate solutions that may be used in many engineering applications and geodetic analyses.

  14. Objective high Resolution Analysis over Complex Terrain with VERA

    Mayer, D.; Steinacker, R.; Steiner, A.

    2012-04-01

    VERA (Vienna Enhanced Resolution Analysis) is a model independent, high resolution objective analysis of meteorological fields over complex terrain. This system consists of a specially developed quality control procedure and a combination of an interpolation and a downscaling technique. Whereas the so-called VERA-QC is presented at this conference in the contribution titled "VERA-QC, an approved Data Quality Control based on Self-Consistency" by Andrea Steiner, this presentation will focus on the method and the characteristics of the VERA interpolation scheme, which enables one to compute grid point values of a meteorological field based on irregularly distributed observations and topography-related a priori knowledge. Over complex topography, meteorological fields are not smooth in general. The roughness which is induced by the topography can be explained physically. The knowledge about this behavior is used to define so-called Fingerprints (e.g. a thermal Fingerprint reproducing heating or cooling over mountainous terrain or a dynamical Fingerprint reproducing a positive pressure perturbation on the windward side of a ridge) under idealized conditions. If the VERA algorithm recognizes patterns of one or more Fingerprints at a few observation points, the corresponding patterns are used to downscale the meteorological information in a greater surrounding. This technique makes it possible to achieve an analysis with a resolution much higher than that of the observational network. The interpolation of irregularly distributed stations to a regular grid (in space and time) is based on a variational principle applied to first and second order spatial and temporal derivatives. Mathematically, this can be formulated as a cost function that is equivalent to the penalty function of a thin plate smoothing spline. After the analysis field has been divided into the Fingerprint components and the unexplained part respectively, the requirement of a smooth distribution is applied to the
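
    For orientation, a generic two-dimensional thin-plate smoothing spline penalty of the kind referred to above can be written as follows; this is the textbook form under the simplifying assumption of a single spatial field, not the exact VERA cost function, which also involves first-order and temporal derivatives.

      J(f) = \sum_{i} \left[ f(x_i, y_i) - o_i \right]^2
             + \lambda \iint \left( f_{xx}^2 + 2 f_{xy}^2 + f_{yy}^2 \right)\, \mathrm{d}x\, \mathrm{d}y

    Here o_i denotes the (quality-controlled) observations, f the analysed field, and lambda controls how strongly smoothness is enforced relative to the fit to the data.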

  15. Decoupling of the leading contribution in the discrete BFKL analysis of high-precision HERA data

    Kowalski, H. [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany)]; Lipatov, L.N. [St. Petersburg State University, St. Petersburg (Russian Federation); Petersburg Nuclear Physics Institute, Gatchina (Russian Federation)]; Ross, D.A. [University of Southampton, School of Physics and Astronomy, Southampton (United Kingdom)]; Schulz, O. [Max Planck Institute for Physics, Munich (Germany)]

    2017-11-15

    We analyse, in NLO, the physical properties of the discrete eigenvalue solution for the BFKL equation. We show that a set of eigenfunctions with positive eigenvalues, ω, together with a small contribution from a continuum of eigenfunctions with negative ω, provide an excellent description of high-precision HERA F_2 data in the region x < 0.001, Q^2 > 6 GeV^2. The phases of the eigenfunctions can be obtained from a simple parametrisation of the pomeron spectrum, which has a natural motivation within BFKL. The data analysis shows that the first eigenfunction decouples completely or almost completely from the proton. This suggests that there exists an additional ground state, which is naturally saturated and may have the properties of the soft pomeron. (orig.)

  16. Analysis of residual stress in subsurface layers after precision hard machining of forging tools

    Czan Andrej

    2018-01-01

    Full Text Available This paper focuses on the analysis of residual stress in functional surfaces and subsurface layers created by precision hard-machining technologies for progressive structural materials of forging tools. The experiments are oriented toward monitoring the residual stress in surfaces produced by hard turning (roughing and finishing operations). Subsequently, these surfaces were etched away in thin layers by electro-chemical polishing, and the residual stress was measured in each etched layer. The measurements were performed with a portable X-ray diffractometer for the detection of residual stress and structural phases. The results clearly indicate the origin and distribution of residual stress in the surface and subsurface layers and their impact on the functional properties of surface integrity.

  17. Error analysis of marker-based object localization using a single-plane XRII

    Habets, Damiaan F.; Pollmann, Steven I.; Yuan, Xunhua; Peters, Terry M.; Holdsworth, David W.

    2009-01-01

    The role of imaging and image guidance is increasing in surgery and therapy, including treatment planning and follow-up. Fluoroscopy is used for two-dimensional (2D) guidance or localization; however, many procedures would benefit from three-dimensional (3D) guidance or localization. Three-dimensional computed tomography (CT) using a C-arm mounted x-ray image intensifier (XRII) can provide high-quality 3D images; however, patient dose and the required acquisition time restrict the number of 3D images that can be obtained. C-arm based 3D CT is therefore limited in applications for x-ray based image guidance or dynamic evaluations. 2D-3D model-based registration, using a single-plane 2D digital radiographic system, does allow for rapid 3D localization. It is our goal to investigate - over a clinically practical range - the impact of x-ray exposure on the resulting range of 3D localization precision. In this paper it is assumed that the tracked instrument incorporates a rigidly attached 3D object with a known configuration of markers. A 2D image is obtained by a digital fluoroscopic x-ray system and corrected for XRII distortions (±0.035 mm) and mechanical C-arm shift (±0.080 mm). A least-square projection-Procrustes analysis is then used to calculate the 3D position using the measured 2D marker locations. The effect of x-ray exposure on the precision of 2D marker localization and on 3D object localization was investigated using numerical simulations and x-ray experiments. The results show a nearly linear relationship between 2D marker localization precision and the 3D localization precision. However, a significant amplification of error, nonuniformly distributed among the three major axes, occurs, and that is demonstrated. To obtain a 3D localization error of less than ±1.0 mm for an object with 20 mm marker spacing, the 2D localization precision must be better than ±0.07 mm. This requirement was met for all investigated nominal x-ray exposures at 28 cm FOV, and
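
    The error-amplification study described above can be mimicked with a small Monte Carlo experiment: perturb the projected 2D marker positions with Gaussian noise and refit the rigid pose. The sketch below uses a generic nonlinear least-squares pose fit rather than the paper's projection-Procrustes method, and the camera geometry, marker configuration and noise level are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Illustrative Monte Carlo sketch (not the paper's projection-Procrustes
# method): perturb projected 2D marker positions with Gaussian noise and
# refit the rigid pose, then look at the spread of the recovered 3D position.
markers = np.array([[0., 0., 0.], [20., 0., 0.], [0., 20., 0.], [0., 0., 20.]])  # mm
focal = 1000.0                          # assumed projection scale (detector units)
true_rvec = np.array([0.1, -0.2, 0.05])
true_t = np.array([5.0, -3.0, 600.0])   # mm, object in front of the source

def project(rvec, t):
    pts = Rotation.from_rotvec(rvec).apply(markers) + t
    return focal * pts[:, :2] / pts[:, 2:3]   # simple pinhole projection

def residuals(params, uv_obs):
    return (project(params[:3], params[3:]) - uv_obs).ravel()

uv_true = project(true_rvec, true_t)
sigma_2d = 0.07                          # assumed 2D localization precision
errors = []
for _ in range(200):
    uv_noisy = uv_true + np.random.normal(0.0, sigma_2d, uv_true.shape)
    fit = least_squares(residuals, x0=np.r_[true_rvec, true_t], args=(uv_noisy,))
    errors.append(fit.x[3:] - true_t)
errors = np.array(errors)
print("3D localization std per axis (mm):", errors.std(axis=0))
```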

  18. Real-time GPS seismology using a single receiver: method comparison, error analysis and precision validation

    Li, Xingxing

    2014-05-01

    Earthquake monitoring and early warning systems for hazard assessment and mitigation have traditionally been based on seismic instruments. However, for large seismic events, it is difficult for traditional seismic instruments to produce accurate and reliable displacements because of the saturation of broadband seismometers and the problematic integration of strong-motion data. Compared with traditional seismic instruments, GPS can measure arbitrarily large dynamic displacements without saturation, making it particularly valuable in the case of large earthquakes and tsunamis. The GPS relative positioning approach is usually adopted to estimate seismic displacements, since centimeter-level accuracy can be achieved in real time by processing double-differenced carrier-phase observables. However, the relative positioning method requires a local reference station, which might itself be displaced during a large seismic event, resulting in misleading GPS analysis results. Meanwhile, the relative/network approach is time-consuming, making the simultaneous and real-time analysis of GPS data from hundreds or thousands of ground stations particularly difficult. In recent years, several single-receiver approaches for real-time GPS seismology, which can overcome the reference-station problem of the relative positioning approach, have been successfully developed and applied to GPS seismology. One available method is real-time precise point positioning (PPP), which relies on precise satellite orbit and clock products. However, real-time PPP needs a long (re)convergence period, of about thirty minutes, to resolve integer phase ambiguities and achieve centimeter-level accuracy. In comparison with PPP, Colosimo et al. (2011) proposed a variometric approach to determine the change of position between two adjacent epochs; displacements are then obtained by a single integration of the delta positions. This approach does not suffer from a convergence process, but the single integration from delta positions to
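
    The variometric step mentioned at the end of the abstract amounts to a single integration (cumulative sum) of the epoch-wise delta positions, usually followed by removal of the drift that integrating noisy deltas accumulates. The NumPy sketch below illustrates this on synthetic data; it is not Colosimo et al.'s implementation, and the sampling rate, noise level and detrending choice are assumptions.

```python
import numpy as np

# Illustrative sketch of the variometric idea: per-epoch position changes
# (delta positions, e.g. at 1 Hz) are integrated once (cumulative sum) to
# obtain a displacement time series.  Data are synthetic.
rate_hz = 1.0
t = np.arange(0, 300, 1.0 / rate_hz)                      # 5 minutes of epochs
true_disp = 0.05 * np.sin(2 * np.pi * t / 20.0)           # 5 cm oscillation (m)
delta = np.diff(true_disp, prepend=0.0)                   # per-epoch deltas
delta_noisy = delta + np.random.normal(0, 0.001, t.size)  # 1 mm epoch noise

disp = np.cumsum(delta_noisy)                             # single integration

# Remove the linear drift accumulated by integrating noisy deltas
coeff = np.polyfit(t, disp, 1)
disp_detrended = disp - np.polyval(coeff, t)
rms = np.sqrt(np.mean((disp_detrended - true_disp) ** 2))
print("RMS error after detrending (m):", rms)
```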

  19. Omics AnalySIs System for PRecision Oncology (OASISPRO): A Web-based Omics Analysis Tool for Clinical Phenotype Prediction.

    Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael

    2017-09-12

    Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro; source codes are available at http://tinyurl.com/oasisproSourceCode. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
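
    The analysis workflow described for OASISPRO (train a machine-learning model of choice on an omics matrix and evaluate it on a held-out test set) can be sketched generically with scikit-learn. The example below uses a synthetic feature matrix and label as stand-ins for TCGA data and a random forest as an arbitrary choice of classifier; it is not the OASISPRO code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for an omics matrix (samples x genes) and a clinical
# label such as tumor stage; not TCGA data and not OASISPRO's implementation.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 500))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

# Held-out test set, as in the evaluation scheme described above
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))

# Features most strongly associated with the label, by importance
top = np.argsort(model.feature_importances_)[::-1][:5]
print("top feature indices:", top)
```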

  20. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Machine learning and data mining advance predictive big data analysis in precision animal agriculture.

    Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C

    2018-04-14

    Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.

  1. Poka Yoke system based on image analysis and object recognition

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a method of quality management aimed at preventing faults from arising during production processes. It deals with “fail-safing” or “mistake-proofing”. The Poka Yoke concept was created and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a higher cost than the cost of disposal. Usually, Poka Yoke solutions are based on multiple sensors that identify certain nonconformities. This implies the presence of additional equipment (mechanical, electronic) on the production line. As a consequence, and because the method itself is invasive and affects the production process, the cost of diagnostics increases. The machines by which a Poka Yoke system can be implemented become bulkier and more sophisticated. In this paper we propose a solution for a Poka Yoke system based on image analysis and identification of faults. The solution consists of a module for image acquisition, mid-level processing and an object recognition module using an associative memory (Hopfield network type). All are integrated into an embedded system with an AD (Analog to Digital) converter and a Zync 7000 (22 nm technology).
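
    A Hopfield-type associative memory of the kind mentioned for the recognition module stores reference patterns with a Hebbian rule and recalls them from corrupted inputs by iterated sign updates. The NumPy sketch below is a textbook illustration with assumed pattern sizes, not the embedded implementation described above.

```python
import numpy as np

# Minimal Hopfield associative memory: store binary (+1/-1) reference
# patterns with a Hebbian rule, then recall from a corrupted input by
# repeated sign updates.  Purely illustrative of the recognition idea.
rng = np.random.default_rng(1)
n = 100                                  # pattern length (e.g. a small binary image)
patterns = rng.choice([-1, 1], size=(3, n))

# Hebbian weight matrix, zero diagonal
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0.0)

def recall(state, steps=20):
    state = state.copy()
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

# Corrupt 15% of the first pattern's bits and try to recover it
probe = patterns[0].copy()
flip = rng.choice(n, size=15, replace=False)
probe[flip] *= -1
recovered = recall(probe)
print("bits matching stored pattern:", int(np.sum(recovered == patterns[0])), "/", n)
```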

  2. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  3. Developing web-based data analysis tools for precision farming using R and Shiny

    Jahanshiri, Ebrahim; Mohd Shariff, Abdul Rashid

    2014-06-01

    Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data is becoming increasingly cheap to collect and maintain. The bulk of the data collected by sensors and samples needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that can assist farmers and managers. However, until recently, analysing the data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that can provide real-time web-based data analysis. In this paper, the development of a prototype web-based application for data analysis using new facilities of the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated, and future directions in web application development for agricultural data are discussed.

  4. Developing web-based data analysis tools for precision farming using R and Shiny

    Jahanshiri, Ebrahim; Shariff, Abdul Rashid Mohd

    2014-01-01

    Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data is becoming increasingly cheap to collect and maintain. The bulk of the data collected by sensors and samples needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that can assist farmers and managers. However, until recently, analysing the data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that can provide real-time web-based data analysis. In this paper, the development of a prototype web-based application for data analysis using new facilities of the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated, and future directions in web application development for agricultural data are discussed.

  5. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    The 2011 Great East Japan Earthquake (GEJE) has shown that tsunami disasters are not limited to inundation damage in a specified region, but may destroy a wide area, causing a major disaster. Evaluating standing land structures and damage to them requires highly precise evaluation of three-dimensional fluid motion - an expensive process. Our research goals were thus to couple STOC-CADMAS (Arikawa and Tomita, 2016) with structure analysis (Arikawa et al., 2009) to efficiently calculate all stages from tsunami source to runup, including the deformation of structures, and to verify their applicability. We also investigated the stability of breakwaters at Kamaishi Bay. Fig. 1 shows the whole of this calculation system. The STOC-ML simulator approximates pressure by hydrostatic pressure and calculates the wave profiles based on an equation of continuity, thereby lowering the calculation cost; it primarily calculates from the epicenter to the shallow region. STOC-IC solves the pressure based on a Poisson equation to account for shallower, more complex topography, but reduces the computation cost slightly in the area near a port by setting the water surface based on an equation of continuity. CS3D solves a Navier-Stokes equation and sets the water surface by VOF to deal with the runup area, with its complex surfaces of overflows and bores. STR performs the structure analysis, including the geotechnical analysis, based on Biot's formulation. By coupling these, the system efficiently calculates the tsunami profile from propagation to inundation. The numerical results were compared with the physical experiments of Arikawa et al. (2012) and showed good agreement. Finally, the system was applied to the local situation at Kamaishi Bay. Almost all breakwaters were washed away, which was similar to the actual damage at Kamaishi Bay. REFERENCES T. Arikawa and T. Tomita (2016): "Development of High Precision Tsunami Runup

  6. Precision manufacturing

    Dornfeld, David

    2008-01-01

    Today there is a high demand for high-precision products. The manufacturing processes are now highly sophisticated and derive from a specialized genre called precision engineering. Precision Manufacturing provides an introduction to precision engineering and manufacturing with an emphasis on the design and performance of precision machines and machine tools, metrology, tooling elements, machine structures, sources of error, precision machining processes and precision process planning, as well as the critical role that precision machine design for manufacturing has played in technological developments over the last few hundred years. In addition, the influence of sustainable manufacturing requirements in precision processes is introduced. Drawing upon years of practical experience and using numerous examples and illustrative applications, David Dornfeld and Dae-Eun Lee cover precision manufacturing as it applies to: The importance of measurement and metrology in the context of Precision Manufacturing. Th...

  7. Static analysis of unbounded structures in object-oriented programs

    Grabe, Immo

    2012-01-01

    In this thesis we investigate different techniques and formalisms to address complexity introduced by unbounded structures in object-oriented programs. We give a representation of a weakest precondition calculus for abstract object creation in dynamic logic. Based on this calculus we define symbolic

  8. A Comparative Analysis of Structured and Object-Oriented ...

    The concepts of structured and object-oriented programming are not new, but these approaches remain very useful and relevant in today's programming paradigms. In this paper, we distinguish the features of structured programs from those of object-oriented programs. Structured programming is a ...

  9. Towards a syntactic analysis of European Portuguese cognate objects

    Celda Morgado Choupina

    2013-01-01

    Full Text Available The present paper aims at discussing selected syntactic aspects of cognate objects in European Portuguese, along the lines of Distributed Morphology (Haugen, 2009). Cognate objects may be readily discovered in numerous human languages, including European Portuguese (Chovia uma chuva miudinha). It is assumed in papers devoted to their English counterparts that they belong to various subclasses. Indeed, some of them are genuine cognates (to sleep a sleep...) or hyponyms (to dance a jig; Hale & Keyser, 2002). It turns out that in European Portuguese, they can be split into four different categories: (i) genuine cognate objects (chorar um choro...), (ii) similar cognate objects (dançar uma dança), (iii) object hyponyms (dançar um tango) and (iv) prepositional cognate objects (morrer de uma morte...). There are, then, significant differences between the various classes of cognate objects: whereas the genuine ones call imperatively for a restrictive modifier and a definite article, the remaining ones admit it only optionally. It might be concluded, then, that a lexicalist theory set up along the lines of Hale and Keyser is unable to deal successfully with distributional facts proper to the various classes of cognate constructions in European Portuguese. That is why the present study is conducted more in accordance with the syntactic principles of Distributed Morphology, with a strong impact of the hypotheses put forward by Haugen (2009).

  10. Towards an understanding of dark matter: Precise gravitational lensing analysis complemented by robust photometric redshifts

    Coe, Daniel Aaron

    The goal of this thesis is to help scientists resolve one of the great mysteries of our time: the nature of Dark Matter. Dark Matter is currently believed to make up over 80% of the material in our universe, yet we have so far inferred but a few of its basic properties. Here we study the Dark Matter surrounding a galaxy cluster, Abell 1689, via the most direct method currently available--gravitational lensing. Abell 1689 is a "strong" gravitational lens, meaning it produces multiple images of more distant galaxies. The observed positions of these images can be measured very precisely and act as a blueprint allowing us to reconstruct the Dark Matter distribution of the lens. Until now, such mass models of Abell 1689 have reproduced the observed multiple images well but with significant positional offsets. Using a new method we develop here, we obtain a new mass model which perfectly reproduces the observed positions of 168 knots identified within 135 multiple images of 42 galaxies. An important ingredient to our mass model is the accurate measurement of distances to the lensed galaxies via their photometric redshifts. Here we develop tools which improve the accuracy of these measurements based on our study of the Hubble Ultra Deep Field, the only image yet taken to comparable depth as the magnified regions of Abell 1689. We present results both for objects in the Hubble Ultra Deep Field and for galaxies gravitationally lensed by Abell 1689. As part of this thesis, we also provide reviews of Dark Matter and Gravitational Lensing, including a chapter devoted to the mass profiles of Dark Matter halos realized in simulations. The original work presented here was performed primarily by myself under the guidance of Narciso Benítez and Holland Ford as a member of the Advanced Camera for Surveys GTO Science Team at Johns Hopkins University and the Instituto de Astrofísica de Andalucía. My advisors served on my thesis committee along with Rick White, Gabor Domokos, and Steve

  11. Theoretical Analysis of Heat Stress Prefabricating the Crack in Precision Cropping

    Lijun Zhang

    2013-07-01

    Full Text Available The mathematical model of the metal bar in the course of heat treatment is built by treating the convective heat transfer process of the metal bar as the heat conduction boundary condition. By theoretical analysis and numerical simulation, the theoretical expression of the unsteady multidimensional temperature field for the axisymmetric model of the metal bar is obtained. The temperature field distribution at the equivalent tip of the bar's V-shaped notch is given by ANSYS software. The quantitative relationship between the temperature at key points inside the bar and time is determined. Through polynomial curve fitting, the relation between the ultimate strength and the temperature is also given. Based on this, the influences of the width of the adiabatic boundary and the water velocity on the critical temperature gradient for initiating a heat crack at the tip of the V-shaped notch are analyzed. The experimental results in precision cropping show that the expression of the unsteady multidimensional temperature field is feasible for the rapid calculation of crack generation.

  12. Kinematic analysis and experimental verification of a eccentric wheel based precision alignment mechanism for LINAC

    Mundra, G.; Jain, V.; Singh, K.K.; Saxena, P.; Khare, R.K.; Bagre, M.

    2011-01-01

    An eccentric wheel based precision alignment system was designed for the remote motorized alignment of the proposed proton injector LINAC (SFDTL). As a part of the further development of the alignment and monitoring scheme, a menu-driven alignment system is being developed. The paper describes a general kinematic equation (with base line tilt correction) based on the various parameters of the mechanism, such as the eccentricity, the wheel diameter, the distance between the wheels and the diameter of the cylindrical accelerator component. Based on this equation, the extent of the alignment range for the 4 degrees of freedom is evaluated, and the effect of variations in some of the parameters as well as the theoretical accuracy/resolution is computed. For this purpose a computer program was written which can compute the positions for each discrete setting of the two motor combinations. The paper also describes the experimentally evaluated values of these positions (over the full extent of the range) and the matching/comparison of the two data sets. These data can now be used for the movement computation required for alignment with the four motors (two front and two rear motors of the support structure). (author)
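
    A simplified planar version of the geometry can illustrate the kind of kinematic relation described: each eccentric wheel's center moves on a circle of radius equal to the eccentricity as its shaft rotates, and the supported cylindrical component's axis lies at the intersection of two circles of radius (wheel radius + component radius) about the two wheel centers. The sketch below encodes only this simplified model, with made-up dimensions, and omits the base-line tilt correction of the paper's general equation.

```python
import numpy as np

def cylinder_axis(theta1, theta2, e=2.0, d_wheel=60.0, span=120.0, d_comp=200.0):
    """Simplified planar model: return the (horizontal, vertical) position of
    the supported cylinder's axis for eccentric rotation angles theta1, theta2
    (radians).  e = eccentricity, d_wheel = wheel diameter, span = distance
    between the nominal wheel axes, d_comp = component diameter.  All values
    are illustrative (mm), not the SFDTL mechanism's parameters."""
    # Wheel centers are displaced by the eccentricity as the shafts rotate
    c1 = np.array([-span / 2 + e * np.cos(theta1), e * np.sin(theta1)])
    c2 = np.array([ span / 2 + e * np.cos(theta2), e * np.sin(theta2)])
    R = (d_wheel + d_comp) / 2.0          # center-to-center distance at contact
    # Intersection of two circles of radius R about c1 and c2 (upper solution)
    d = np.linalg.norm(c2 - c1)
    h = np.sqrt(R**2 - (d / 2.0)**2)
    mid = (c1 + c2) / 2.0
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d
    return mid + h * perp

print(cylinder_axis(0.0, 0.0))
print(cylinder_axis(np.deg2rad(30), np.deg2rad(-20)))
```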

  13. FEM analysis of impact of external objects to pipelines

    Gracie, Robert; Konuk, Ibrahim [Geological Survey of Canada, Ottawa, ON (Canada)]. E-mail: ikonuk@NRCan.gc.ca; Fredj, Abdelfettah [BMT Fleet Technology Limited, Ottawa, ON (Canada)

    2003-07-01

    One of the most common hazards to pipelines is the impact of external objects. Earth-moving machinery, farm equipment or bullets can dent or fail land pipelines. External objects such as anchors, fishing gear and ice can damage offshore pipelines. This paper develops an FEM model to simulate the impact process and presents investigations using the model to determine the influence of the geometry and velocity of the impacting object, as well as the influence of the pipe diameter, wall thickness and concrete thickness along with the internal pressure. The FEM model is developed using the LS-DYNA explicit FEM software with shell and solid elements. The model allows damage to and removal of the concrete and corrosion coating elements during impact. Parametric studies are presented relating the dent size to pipe diameter, wall thickness and concrete thickness, internal pipe pressure, and impacting object geometry. The primary objective of this paper is to develop and present the FEM model. The model can be applied to both offshore and land pipeline problems. Some examples are used to illustrate how the model can be applied to real-life problems. A future paper will present more detailed parametric studies. (author)

  14. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

    In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of the geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs a fast feature extraction and matching by combining the local difference binary descriptor and the local sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts by the definition of an initial partition obtained by an over-segmentation algorithm, i.e., the simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to some criterions, such as the uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.
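
    The initial over-segmentation step can be sketched with scikit-image's SLIC implementation, followed by the per-superpixel spectral statistics that later region merging would use. The example below covers only that first step on a placeholder image; the mosaicking, BPT construction and hierarchy filtering described above are not reproduced.

```python
import numpy as np
from skimage.data import astronaut
from skimage.segmentation import slic

# Initial over-segmentation with SLIC, as used to seed the hierarchy above;
# the image is a stock placeholder, not a UAV mosaic.
image = astronaut()
segments = slic(image, n_segments=400, compactness=10)

# Per-superpixel mean color (spectral information for later region merging)
labels = np.unique(segments)
mean_colors = np.array([image[segments == lab].mean(axis=0) for lab in labels])

print("number of superpixels:", labels.size)
print("mean RGB of first superpixel:", mean_colors[0].round(1))
```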

  15. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    H. Yu

    2016-06-01

    Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of the geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs a fast feature extraction and matching by combining the local difference binary descriptor and the local sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts by the definition of an initial partition obtained by an over-segmentation algorithm, i.e., the simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to some criteria, such as the uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya’an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  16. Voice analysis as an objective state marker in bipolar disorder

    Faurholt-Jepsen, M.; Busk, Jonas; Frost, M.

    2016-01-01

    Changes in speech have been suggested as sensitive and valid measures of depression and mania in bipolar disorder. The present study aimed at investigating (1) voice features collected during phone calls as objective markers of affective states in bipolar disorder and (2) if combining voice...... features, automatically generated objective smartphone data on behavioral activities and electronic self-monitored data were collected from 28 outpatients with bipolar disorder in naturalistic settings on a daily basis during a period of 12 weeks. Depressive and manic symptoms were assessed using...... and electronic self-monitored data increased the accuracy, sensitivity and specificity of classification of affective states slightly. Voice features collected in naturalistic settings using smartphones may be used as objective state markers in patients with bipolar disorder....

  17. Software Analysis of Mining Images for Objects Detection

    Jan Tomecek

    2013-11-01

    Full Text Available The contribution deals with the development of a new module of the robust FOTOMNG system for editing images from a video, or mining images from measurements, for subsequent improvement of the detection of required objects in a 2D image. The developed module allows creating a final high-quality picture by combining multiple images containing the searched objects. Input data can be combined according to parameters or based on reference frames. Correction of detected 2D objects is also part of this module. The solution is implemented into the FOTOMNG system, and the finished work has been tested on appropriate frames, which validated its core functionality and usability. Tests confirmed the function of each part of the module, its accuracy and the implications of integration.

  18. Voice analysis as an objective state marker in bipolar disorder

    Faurholt-Jepsen, M.; Busk, Jonas; Frost, M.

    2016-01-01

    features with automatically generated objective smartphone data on behavioral activities (for example, number of text messages and phone calls per day) and electronic self-monitored data (mood) on illness activity would increase the accuracy as a marker of affective states. Using smartphones, voice...... features, automatically generated objective smartphone data on behavioral activities and electronic self-monitored data were collected from 28 outpatients with bipolar disorder in naturalistic settings on a daily basis during a period of 12 weeks. Depressive and manic symptoms were assessed using...... to be more accurate, sensitive and specific in the classification of manic or mixed states with an area under the curve (AUC)=0.89 compared with an AUC=0.78 for the classification of depressive states. Combining voice features with automatically generated objective smartphone data on behavioral activities...

  19. Analysis for the high-level waste disposal cost object

    Kim, S. K.; Lee, J. R.; Choi, J. W.; Han, P. S.

    2003-01-01

    The purpose of this study is to analyse the ratio of each cost object in the disposal cost estimation. According to the results, the operating cost is the most significant object in the total cost. There are many differences between disposal costs and product costs in terms of their constituents. While product costs may be classified into direct materials cost, direct manufacturing labor cost, and factory overhead, the disposal cost factors are constituted by technical factors and non-technical factors.

  20. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  1. Increased precision for analysis of protein-ligand dissociation constants determined from chemical shift titrations

    Markin, Craig J.; Spyracopoulos, Leo, E-mail: leo.spyracopoulos@ualberta.ca [University of Alberta, Department of Biochemistry (Canada)]

    2012-06-15

    NMR is ideally suited for the analysis of protein-protein and protein-ligand interactions with dissociation constants ranging from ~2 μM to ~1 mM, and with kinetics in the fast exchange regime on the NMR timescale. For the determination of dissociation constants (K_D) of 1:1 protein-protein or protein-ligand interactions using NMR, the protein and ligand concentrations must necessarily be similar in magnitude to the K_D, and nonlinear least squares analysis of chemical shift changes as a function of ligand concentration is employed to determine estimates for the parameters K_D and the maximum chemical shift change (Δδ_max). During a typical NMR titration, the initial protein concentration, [P_0], is held nearly constant. For this condition, to determine the most accurate parameters for K_D and Δδ_max from nonlinear least squares analyses requires initial protein concentrations that are ~0.5 × K_D, and a maximum concentration for the ligand, or titrant, of ~10 × [P_0]. From a practical standpoint, these requirements are often difficult to achieve. Using Monte Carlo simulations, we demonstrate that co-variation of the ligand and protein concentrations during a titration leads to an increase in the precision of the fitted K_D and Δδ_max values when [P_0] > K_D. Importantly, judicious choice of protein and ligand concentrations for a given NMR titration, combined with nonlinear least squares analyses using two independent variables (ligand and protein concentrations) and two parameters (K_D and Δδ_max) is a straightforward approach to increasing the accuracy of measured dissociation constants for 1:1 protein-ligand interactions.
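
    For reference, a fit of the exact 1:1 binding isotherm with both total ligand and total protein concentrations as independent variables, and K_D and Δδ_max as parameters, can be set up with SciPy as below. The titration series and noise level are synthetic assumptions used only to show the mechanics of the two-variable fit, not the paper's Monte Carlo analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def shift_1to1(X, Kd, dmax):
    """Exact 1:1 binding isotherm: observed chemical-shift change as a
    function of total ligand (L0) and protein (P0) concentrations."""
    L0, P0 = X
    b = P0 + L0 + Kd
    return dmax * (b - np.sqrt(b**2 - 4.0 * P0 * L0)) / (2.0 * P0)

# Synthetic titration in which both ligand and protein concentrations vary
# (the co-variation strategy discussed above); units are arbitrary.
rng = np.random.default_rng(3)
L0 = np.linspace(0.0, 2.0, 15)           # total ligand (mM)
P0 = np.full_like(L0, 0.2) - 0.05 * L0   # protein diluted as titrant is added
true_Kd, true_dmax = 0.15, 0.30          # mM, ppm (assumed)
ddelta = shift_1to1((L0, P0), true_Kd, true_dmax)
ddelta += rng.normal(scale=0.003, size=ddelta.size)

popt, pcov = curve_fit(shift_1to1, (L0, P0), ddelta, p0=[0.1, 0.2])
perr = np.sqrt(np.diag(pcov))
print("Kd = %.3f +/- %.3f mM, dmax = %.3f +/- %.3f ppm"
      % (popt[0], perr[0], popt[1], perr[1]))
```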

  2. Insurer’s activity as object of economic analysis

    O.O. Poplavskiy

    2015-12-01

    Full Text Available The article substantiates the theoretical foundations of the analysis of an insurer's activity and the peculiarities of its implementation. Attention is focused on the important role of economic analysis in economic science, which is confirmed by its active use in research and its practical orientation. The author summarizes the classification and principles of the analysis of an insurer's activity and supplements them with specific principles (regarding the insurer's environment, publicity and risk orientation) which make it increasingly possible to take into account the peculiarities of insurance relations. The paper pays attention to the specification of the elements of the analysis and its key directions, including the analysis of the insurer's financing, the analysis of insurance operations and the analysis of investment activity, which supports the effective functioning of the risk management system.

  3. GuidosToolbox: universal digital image object analysis

    Peter Vogt; Kurt Riitters

    2017-01-01

    The increased availability of mapped environmental data calls for better tools to analyze the spatial characteristics and information contained in those maps. Publicly available, user-friendly and universal tools are needed to foster the interdisciplinary development and application of methodologies for the extraction of image object information properties contained in...

  4. Contextual object understanding through geospatial analysis and reasoning (COUGAR)

    Douglas, Joel; Antone, Matthew; Coggins, James; Rhodes, Bradley J.; Sobel, Erik; Stolle, Frank; Vinciguerra, Lori; Zandipour, Majid; Zhong, Yu

    2009-05-01

    Military operations in urban areas often require detailed knowledge of the location and identity of commonly occurring objects and spatial features. The ability to rapidly acquire and reason over urban scenes is critically important to such tasks as mission and route planning, visibility prediction, communications simulation, target recognition, and inference of higher-level form and function. Under DARPA's Urban Reasoning and Geospatial ExploitatioN Technology (URGENT) Program, the BAE Systems team has developed a system that combines a suite of complementary feature extraction and matching algorithms with higher-level inference and contextual reasoning to detect, segment, and classify urban entities of interest in a fully automated fashion. Our system operates solely on colored 3D point clouds, and considers object categories with a wide range of specificity (fire hydrants, windows, parking lots), scale (street lights, roads, buildings, forests), and shape (compact shapes, extended regions, terrain). As no single method can recognize the diverse set of categories under consideration, we have integrated multiple state-of-the-art technologies that couple hierarchical associative reasoning with robust computer vision and machine learning techniques. Our solution leverages contextual cues and evidence propagation from features to objects to scenes in order to exploit the combined descriptive power of 3D shape, appearance, and learned inter-object spatial relationships. The result is a set of tools designed to significantly enhance the productivity of analysts in exploiting emerging 3D data sources.

  5. Multi-element analysis of unidentified fallen objects from Tatale in ...

    A multi-element analysis has been carried out on two fallen objects, # 01 and # 02, using the instrumental neutron activation analysis technique. A total of 17 elements were identified in object # 01 while 21 elements were found in object # 02. The two major elements in object # 01 were Fe and Mg, which together constitute ...

  6. X-ray analysis of objects of art and archaeology

    Mantler, M.; Schreiner, M.

    2001-01-01

    Some theoretical aspects and limitations of XRF are discussed, including information depths in layered materials, characterization of inhomogeneous specimens, light element analysis, and radiation damage. Worked examples of applications of XRF and XRD are pigment analysis in delicate Chinese Paper, corrosion of glass, and leaching effects in soil-buried medieval coins. (author)

  7. The MUSIC algorithm for sparse objects: a compressed sensing analysis

    Fannjiang, Albert C

    2011-01-01

    The multiple signal classification (MUSIC) algorithm, and its extension for imaging sparse extended objects, with noisy data is analyzed by compressed sensing (CS) techniques. A thresholding rule is developed to augment the standard MUSIC algorithm. The notion of restricted isometry property (RIP) and an upper bound on the restricted isometry constant (RIC) are employed to establish sufficient conditions for the exact localization by MUSIC with or without noise. In the noiseless case, the sufficient condition gives an upper bound on the numbers of random sampling and incident directions necessary for exact localization. In the noisy case, the sufficient condition assumes additionally an upper bound for the noise-to-object ratio in terms of the RIC and the dynamic range of objects. This bound points to the super-resolution capability of the MUSIC algorithm. Rigorous comparison of performance between MUSIC and the CS minimization principle, basis pursuit denoising (BPDN), is given. In general, the MUSIC algorithm guarantees to recover, with high probability, s scatterers with n = O(s^2) random sampling and incident directions and sufficiently high frequency. For the favorable imaging geometry where the scatterers are distributed on a transverse plane MUSIC guarantees to recover, with high probability, s scatterers with a median frequency and n = O(s) random sampling/incident directions. Moreover, for the problems of spectral estimation and source localizations both BPDN and MUSIC guarantee, with high probability, to identify exactly the frequencies of random signals with the number n = O(s) of sampling times. However, in the absence of abundant realizations of signals, BPDN is the preferred method for spectral estimation. Indeed, BPDN can identify the frequencies approximately with just one realization of signals with the recovery error at worst linearly proportional to the noise level. Numerical results confirm that BPDN outperforms MUSIC in the well-resolved case while
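
    For the spectral-estimation setting mentioned at the end of the abstract, the classical MUSIC pseudospectrum can be computed as sketched below: estimate a covariance from overlapping snapshots, take the noise subspace, and locate the frequencies where the steering vector is closest to orthogonal to it. This is the textbook algorithm on synthetic data, not the paper's thresholded variant or its comparison with BPDN.

```python
import numpy as np

# Classical MUSIC for frequency estimation from noisy uniform samples.
rng = np.random.default_rng(7)
n, s = 64, 2                                     # samples, number of frequencies
true_freqs = np.array([0.12, 0.31])              # cycles per sample (assumed)
t = np.arange(n)
signal = sum(np.exp(2j * np.pi * f * t) for f in true_freqs)
signal += 0.1 * (rng.normal(size=n) + 1j * rng.normal(size=n))

# Covariance estimate from overlapping snapshots of length m
m = 20
snapshots = np.array([signal[i:i + m] for i in range(n - m + 1)])
R = snapshots.conj().T @ snapshots / snapshots.shape[0]

# Noise subspace: eigenvectors beyond the s largest eigenvalues
eigvals, eigvecs = np.linalg.eigh(R)             # eigh sorts ascending
En = eigvecs[:, :-s]

# MUSIC pseudospectrum over a frequency grid
freq_grid = np.linspace(0.0, 0.5, 2000)
steering = np.exp(2j * np.pi * np.outer(np.arange(m), freq_grid))
pseudo = 1.0 / np.linalg.norm(En.conj().T @ steering, axis=0) ** 2

# Report the s largest local maxima of the pseudospectrum
is_peak = (pseudo[1:-1] > pseudo[:-2]) & (pseudo[1:-1] > pseudo[2:])
peak_idx = np.where(is_peak)[0] + 1
best = peak_idx[np.argsort(pseudo[peak_idx])[::-1][:s]]
print("estimated frequencies:", np.sort(freq_grid[best]))
```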

  8. Improvement in precision and trueness of quantitative XRF analysis with glass-bead method. 1

    Yamamoto, Yasuyuki; Ogasawara, Noriko; Yuhara, Yoshitaroh; Yokoyama, Yuichi

    1995-01-01

    The factors which lower the precision of simultaneous X-ray fluorescence (XRF) spectrometers were investigated. Especially in quantitative analyses of oxide powders with the glass-bead method, the X-ray optical characteristics of the equipment affect the precision of the X-ray intensities. In focused (curved) crystal spectrometers, the precision depends on the deviation of the actual size and position of the crystals from the theoretical design; thus the precision differs for each crystal and each element. When the deviation is large, the dispersion of the measured X-ray intensities is larger than the statistical dispersion, even though the intensity itself remains unchanged. Moreover, waviness of the glass-bead surface causes the height of the analyzed surface to deviate from the designed one. This deviation changes the amount of X-rays incident on the analyzing crystal and increases the dispersion of the X-ray intensity. Considering these factors, the level of waviness must be regulated to improve the precision with existing XRF equipment. In this study, the measurement precisions of 4 simultaneous XRF spectrometers were evaluated, and the element lead (Pb-Lβ1) was found to have the lowest precision. The relative standard deviation (RSD) of the measurements of 10 glass-beads for the same powder sample was 0.3% without regulation of the waviness of the analytical surface. With mechanical flattening of the glass-bead surface, the level of waviness, which is the maximum difference of the heights in a glass-bead, was regulated to under 30 μm, and the RSD was 0.038%, which is almost comparable to the statistical RSD of 0.033%. (author)

  9. Object-oriented data analysis framework for neutron scattering experiments

    Suzuki, Jiro; Nakatani, Takeshi; Ohhara, Takashi; Inamura, Yasuhiro; Yonemura, Masao; Morishima, Takahiro; Aoyagi, Tetsuo; Manabe, Atsushi; Otomo, Toshiya

    2009-01-01

    The Materials and Life Science Facility (MLF) of the Japan Proton Accelerator Research Complex (J-PARC) is one of the facilities providing the highest-intensity pulsed neutron and muon beams. The MLF computing environment design group organizes the computing environments of MLF and its instruments. It is important that the computing environment be provided by the facility side, because meta-data formats, analysis functions and the data analysis strategy should be shared among the many instruments in MLF. The C++ class library named Manyo-lib is a framework software for developing data reduction and analysis software. The framework is composed of the class library for data reduction and analysis operators, network-distributed data processing modules and data containers. The class library is wrapped by a Python interface created by SWIG. All classes of the framework can be called from the Python language, and Manyo-lib will cooperate with the data acquisition and data-visualization components through the MLF-platform, a user interface unified in MLF, which runs on the Python language. Raw data in the event-data format obtained by the data acquisition systems are converted into histogram-format data on Manyo-lib with high performance, and data reduction and analysis are performed with user-application software developed based on Manyo-lib. We enforce standardization of data containers with Manyo-lib, and many additional fundamental data containers in Manyo-lib have been designed and developed. Experimental and analysis data in the data containers can be converted into NeXus files. Manyo-lib is the standard framework for developing analysis software in MLF, and prototypes of data-analysis software for each instrument are being developed by the instrument teams.

  10. Multispectral image analysis for object recognition and classification

    Viau, C. R.; Payeur, P.; Cretu, A.-M.

    2016-05-01

    Computer and machine vision applications are used in numerous fields to analyze static and dynamic imagery in order to assist or automate decision-making processes. Advancements in sensor technologies now make it possible to capture and visualize imagery at various wavelengths (or bands) of the electromagnetic spectrum. Multispectral imaging has countless applications in various fields including (but not limited to) security, defense, space, medical, manufacturing and archeology. The development of advanced algorithms to process and extract salient information from the imagery is a critical component of the overall system performance. The fundamental objective of this research project was to investigate the benefits of combining imagery from the visual and thermal bands of the electromagnetic spectrum to improve the recognition rates and accuracy of commonly found objects in an office setting. A multispectral dataset (visual and thermal) was captured and features from the visual and thermal images were extracted and used to train support vector machine (SVM) classifiers. The SVM's class prediction ability was evaluated separately on the visual, thermal and multispectral testing datasets.
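
    The band-fusion experiment described above can be sketched with scikit-learn: train an SVM on visual-only features and on concatenated visual + thermal features and compare the held-out accuracy. The feature vectors below are synthetic placeholders for the extracted descriptors, so the numbers only illustrate the workflow, not the study's results.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Synthetic stand-ins for per-object feature vectors extracted from the
# visual and thermal bands (e.g. histograms or texture descriptors).
rng = np.random.default_rng(5)
n_obj, n_vis, n_thermal = 300, 32, 16
labels = rng.integers(0, 3, size=n_obj)                 # 3 object classes
visual = rng.normal(size=(n_obj, n_vis)) + labels[:, None] * 0.3
thermal = rng.normal(size=(n_obj, n_thermal)) + labels[:, None] * 0.3

def accuracy(features):
    X_tr, X_te, y_tr, y_te = train_test_split(features, labels, test_size=0.3,
                                              random_state=0, stratify=labels)
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
    clf.fit(X_tr, y_tr)
    return clf.score(X_te, y_te)

print("visual only:      %.3f" % accuracy(visual))
print("visual + thermal: %.3f" % accuracy(np.hstack([visual, thermal])))
```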

  11. Analysis of process parameters in surface grinding using single objective Taguchi and multi-objective grey relational grade

    Prashant J. Patil

    2016-09-01

    Full Text Available Close tolerances and a good surface finish are achieved by means of the grinding process. This study was carried out for the multi-objective optimization of MQL grinding process parameters. Water-based Al2O3 and CuO nanofluids of various concentrations were used as lubricants for the MQL system. Grinding experiments were carried out on an instrumented surface grinding machine. Taguchi's method was used for the experimental design. Important process parameters that affect the G ratio and surface finish in MQL grinding are depth of cut, type of lubricant, feed rate, grinding wheel speed, coolant flow rate, and nanoparticle size. Grinding performance was evaluated by measuring the G ratio and surface finish. To improve the grinding process, a multi-objective process parameter optimization was performed by means of Taguchi-based grey relational analysis. To identify the most significant process factors, analysis of variance (ANOVA) was used.
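
    The Taguchi-based grey relational step combines the responses as follows: normalize each response (larger-the-better for the G ratio, smaller-the-better for surface roughness), convert the deviations from the ideal into grey relational coefficients with a distinguishing coefficient (commonly 0.5), and average the coefficients into a grade used to rank the runs. The NumPy sketch below uses made-up response values purely to show the calculation.

```python
import numpy as np

# Illustrative grey relational grade calculation for a multi-objective
# grinding optimization: G ratio (larger-the-better) and surface roughness
# Ra (smaller-the-better).  Response values are invented for illustration.
g_ratio = np.array([18.2, 22.5, 25.1, 20.3, 27.8, 24.0])
ra      = np.array([0.42, 0.38, 0.35, 0.45, 0.31, 0.36])

def normalize(x, larger_is_better):
    if larger_is_better:
        return (x - x.min()) / (x.max() - x.min())
    return (x.max() - x) / (x.max() - x.min())

def grey_coeff(norm, zeta=0.5):
    delta = 1.0 - norm                       # deviation from the ideal (= 1)
    return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

coeffs = np.column_stack([grey_coeff(normalize(g_ratio, True)),
                          grey_coeff(normalize(ra, False))])
grade = coeffs.mean(axis=1)                  # equal weights for both responses
print("grey relational grades:", np.round(grade, 3))
print("best run (1-indexed):", int(np.argmax(grade)) + 1)
```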

  12. Impacts of the precision agricultural technologies in Iran: An analysis experts' perception & their determinants

    Somayeh Tohidyan Far

    2018-03-01

    Full Text Available Nowadays, agricultural development methods that are productive and economically, environmentally and socially sustainable are urgently required. The concept of precision agriculture is becoming an attractive idea for managing natural resources and realizing modern sustainable agricultural development. The purpose of this study was to investigate the factors influencing the impacts of precision agriculture from the viewpoint of experts in Boushehr Province. The research method was a cross-sectional survey, and multi-stage random sampling was used to collect data from 115 experts in Boushehr province. According to the results, experts regarded the conservation of groundwater and surface water, the development of rural areas, increased productivity and increased income as the most important impacts of precision agricultural technologies. Experts’ attitudes indicate a positive view toward these kinds of impacts. Behavioral attitude also had the greatest effect on the perceived impacts.

  13. Precision of coherence analysis to detect cerebral autoregulation by near-infrared spectroscopy in preterm infants

    Hahn, GH; Christensen, KB; Leung, TS

    2010-01-01

    Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with high signal-to-noise ratio, we hypothesized that coherence is more precisely...... determined when fluctuations in ABP are large rather than small. Therefore, we investigated whether adjusting for variability in ABP (variabilityABP) improves precision. We examined the impact of variabilityABP within the power spectrum in each measurement and between repeated measurements in preterm infants....... We also examined total monitoring time required to discriminate among infants with a simulation study. We studied 22 preterm infants (GAABP within the power spectrum did not improve the precision. However, adjusting...

  14. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Gunawan, Hendra; Micheldiament, Micheldiament; Mikhailov, Valentin

    2008-01-01

    http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of the Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of the Bouguer gravity anomaly and the Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under an assumed free-air anomaly consisting ...
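
    The Nettleton approach can be illustrated with a short sketch: for a range of candidate densities, form the simple Bouguer anomaly from the free-air anomaly and topographic height, and pick the density for which the correlation with topography is closest to zero. The profile below is synthetic and the slab constant (about 0.0419 mGal per g/cm³ per metre) is the usual approximation; this is not the analysis applied to the La Soufriere data.

```python
import numpy as np

# Nettleton-style density estimation (illustrative): choose the Bouguer
# reduction density that minimizes the correlation between the simple
# Bouguer anomaly and topography along a profile.  Synthetic profile data.
rng = np.random.default_rng(11)
h = 300.0 + 150.0 * np.sin(np.linspace(0, 3 * np.pi, 80))   # elevations (m)
true_rho = 2.45                                             # g/cm^3 (assumed)
free_air = 0.04193 * true_rho * h + 5.0 * rng.normal(size=h.size)  # mGal

densities = np.arange(1.8, 3.0, 0.05)
corr = []
for rho in densities:
    bouguer = free_air - 0.04193 * rho * h      # simple Bouguer anomaly (mGal)
    corr.append(np.corrcoef(bouguer, h)[0, 1])
best = densities[np.argmin(np.abs(corr))]
print("estimated Bouguer density: %.2f g/cm^3" % best)
```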

  15. Object permanence in cats: Analysis in locomotor space.

    Thinus-Blanc, C; Poucet, B; Chapuis, N

    1982-04-01

    Stages IV and V object permanence were studied with 38-40-week-old cats. A constraining apparatus preventing animals from pursuing the bowl containing meat before it was concealed was used. Either the bowl was seen moving and disappeared from view behind a screen (stage IV trials), or after this sequence, it reappeared from behind the first screen and disappeared behind a second screen (stage V trials). In both situations cats performed significantly above chance but the paths taken to reach the food were different according to the stage. In stage V trials, cats expressed a preference for the path leading to the end of the second screen where the food was last seen disappearing. Copyright © 1982. Published by Elsevier B.V.

  16. Which diabetic patients should receive podiatry care? An objective analysis.

    McGill, M; Molyneaux, L; Yue, D K

    2005-08-01

    Diabetes is the leading cause of lower limb amputation in Australia. However, due to limited resources, it is not feasible for everyone with diabetes to access podiatry care, and some objective guidelines of who should receive podiatry is required. A total of 250 patients with neuropathy (Biothesiometer; Biomedical Instruments, Newbury, Ohio, USA) ( > 30, age podiatry care (mean of estimates from 10 reports), the NNT to prevent one foot ulcer per year was: no neuropathy (vibration perception threshold (VPT) 30) alone, NNT = 45; +cannot feel monofilament, NNT = 18; +previous ulcer/amputation, NNT = 7. Provision of podiatry care to diabetic patients should not be only economically based, but should also be directed to those with reduced sensation, especially where there is a previous history of ulceration or amputation.

  17. Heating Development Analysis in Long HTS Objects - Updated Results

    Vysotsky, V S; Repnikov, V V; Lobanov, E A; Karapetyan, G H; Sytnikov, V E [All-Russian Scientific R and D Cable Institute, 5, Shosse Entuziastov, 111024, Moscow (Russian Federation)]

    2006-06-01

    During a fault in a grid, a large overload current, up to 30-fold, will be forced into an HTS superconducting cable installed in the grid, causing it to quench and heat. An upgraded model has been used to analyse the heating development in long HTS objects during overloads. The model better represents the real properties of the materials used. The new calculations coincide well with experiments and permit determination of the cooling coefficients. The stability limit (thermal runaway current) was determined for different cooling conditions and index n. The overload currents at which the superconductor will be heated up to 100 K within 250 ms can also be determined. The model may be used for practical evaluations of operational parameters.

  18. Introductory Psychology Textbooks: An Objective Analysis and Update.

    Griggs, Richard A.; Jackson, Sherri L.; Christopher, Andrew N.; Marek, Pam

    1999-01-01

    Explores changes in the introductory psychology textbook market through an analysis of edition, author, length, and content coverage of the volumes that comprise the current market. Finds a higher edition average, a decrease in the number of authors, an increase in text pages, and a focus on developmental psychology and sensation/perception. (CMK)

  19. Precision evaluation of pressed pastille preparation different methods for X-ray fluorescence analysis

    Lima, Raquel Franco de Souza; Melo Junior, Germano; Sa, Jaziel Martins

    1997-01-01

    This work reports the comparison between the results obtained with two different methods of preparing pressed pastilles from the crushed sample. In this study, the reproducibility is evaluated, aiming to define the method that provides better analytical precision. The analyses were performed with an X-ray fluorescence spectrometer at the Geology Department of the Federal University of Rio Grande do Norte.

  20. The precision of circadian clocks : Assessment and analysis in Syrian hamsters

    Daan, S; Oklejewicz, M

    2003-01-01

    Locomotor activity recordings of Syrian hamsters were systematically analyzed to estimate the precision of the overt circadian activity rhythm in constant darkness. Phase variation, i.e., the standard deviation of phase markers around the regression line, varied with the definition of phase.

  1. The Army Communications Objectives Measurement System (ACOMS): Survey Analysis Plan

    1988-05-01

    Gregory H. Gaertner (Westat) and Timothy W. Elig (ARI), editors. ...such as those of Lavidge and Steiner (1961), McGuire (1969), and Fishbein and Ajzen (1975). Fishbein and Ajzen (1975) and Aaker (1975) present...for college, challenge and personal development, or patriotic service). Corresponding to these beliefs are evaluations of the importance of these

  2. Robustness of Multiple Objective Decision Analysis Preference Functions

    2002-06-01

    Cited works include “Bayesian Decision Theory and Utilitarian Ethics,” American Economic Review Papers and Proceedings, 68: 223-228 (May 1978); Hartsough, Bruce R., “A... (1983); and Morrell, Darryl and Eric Driver, “Bayesian Network Implementation of Levi's Epistemic Utility Decision Theory,” International Journal of... The work concerns elicitation efficiency for the decision maker. Subject terms: decision analysis, utility theory, elicitation error, operations research, decision...

  3. Stochastic precision analysis of 2D cardiac strain estimation in vivo

    Bunting, E A; Provost, J; Konofagou, E E

    2014-01-01

    Ultrasonic strain imaging has been applied to echocardiography and carries great potential to be used as a tool in the clinical setting. Two-dimensional (2D) strain estimation may be useful when studying the heart due to the complex, 3D deformation of the cardiac tissue. Increasing the framerate used for motion estimation, i.e. motion estimation rate (MER), has been shown to improve the precision of the strain estimation, although maintaining the spatial resolution necessary to view the entire heart structure in a single heartbeat remains challenging at high MERs. Two previously developed methods, the temporally unequispaced acquisition sequence (TUAS) and the diverging beam sequence (DBS), have been used in the past to successfully estimate in vivo axial strain at high MERs without compromising spatial resolution. In this study, a stochastic assessment of 2D strain estimation precision is performed in vivo for both sequences at varying MERs (65, 272, 544, 815 Hz for TUAS; 250, 500, 1000, 2000 Hz for DBS). 2D incremental strains were estimated during left ventricular contraction in five healthy volunteers using a normalized cross-correlation function and a least-squares strain estimator. Both sequences were shown capable of estimating 2D incremental strains in vivo. The conditional expected value of the elastographic signal-to-noise ratio (E(SNRe|ε)) was used to compare strain estimation precision of both sequences at multiple MERs over a wide range of clinical strain values. The results here indicate that axial strain estimation precision is much more dependent on MER than lateral strain estimation, while lateral estimation is more affected by strain magnitude. MER should be increased at least above 544 Hz to avoid suboptimal axial strain estimation. Radial and circumferential strain estimations were influenced by the axial and lateral strain in different ways. Furthermore, the TUAS and DBS were found to be of comparable precision at similar MERs. (paper)
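
    The least-squares strain estimator mentioned above is, in its simplest axial form, a sliding-window linear regression of estimated displacement against depth, the local slope being the incremental strain. The sketch below is a generic 1D illustration with synthetic displacements and assumed sampling values, not the authors' implementation.

      import numpy as np

      def least_squares_strain(displacement, dz, kernel=9):
          """Strain as the slope of a sliding least-squares line fit of
          displacement versus depth over a window of `kernel` samples."""
          half = kernel // 2
          z = np.arange(kernel) * dz
          z = z - z.mean()                      # centred depth coordinate
          strain = np.full(displacement.size, np.nan)
          for i in range(half, displacement.size - half):
              d = displacement[i - half:i + half + 1]
              strain[i] = np.sum(z * (d - d.mean())) / np.sum(z ** 2)
          return strain

      # Synthetic example: uniform 1% incremental compression plus noise
      dz = 0.1e-3                               # axial sample spacing, m (assumed)
      depth = np.arange(200) * dz
      rng = np.random.default_rng(1)
      displacement = -0.01 * depth + 1e-6 * rng.normal(size=depth.size)
      print(np.nanmean(least_squares_strain(displacement, dz)))   # close to -0.01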

  4. Change Analysis and Decision Tree Based Detection Model for Residential Objects across Multiple Scales

    CHEN Liyan

    2018-03-01

    Full Text Available Change analysis and detection plays an important role in the updating of multi-scale databases. When an updated larger-scale dataset is overlaid on a to-be-updated smaller-scale dataset, people usually focus on temporal changes caused by the evolution of spatial entities. Little attention is paid to the representation changes introduced by map generalization. Using polygonal building data as an example, this study examines the changes from different perspectives, such as the reasons for their occurrence and the forms in which they appear. Based on this knowledge, we employ a decision tree, a machine learning method, to establish a change detection model. The aim of the proposed model is to distinguish temporal changes that need to be applied as updates to the smaller-scale dataset from representation changes. The proposed method is validated through tests using real-world building data from Guangzhou city. The experimental results show that the overall precision of change detection is more than 90%, which indicates our method is effective in identifying changed objects.
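
    A change-detection model of this kind can be prototyped with an off-the-shelf decision tree. The sketch below assumes hypothetical per-building-pair features (area ratio, shape similarity, centroid offset) and labels separating temporal from representation changes; it is not the feature set or training data used in the paper.

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.metrics import precision_score

      rng = np.random.default_rng(42)
      n = 600

      # Hypothetical features for pairs of corresponding building objects:
      # area ratio, shape similarity in [0, 1], centroid offset in metres.
      X = np.column_stack([
          rng.normal(1.0, 0.3, n),
          rng.uniform(0.3, 1.0, n),
          rng.exponential(2.0, n),
      ])
      # Hypothetical labels: 1 = temporal change (real-world update), 0 = representation change
      y = ((X[:, 0] > 1.2) | (X[:, 2] > 4.0)).astype(int)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
      clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
      print("precision on held-out pairs:", precision_score(y_te, clf.predict(X_te)))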

  5. Hadronic Triggers and trigger-object level analysis at ATLAS

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program, and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally, missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year to significantly improve the potential of the 2017 dataset and overcome the limiting factors to probing more deeply for new physics, such as storage and computing requirements f...

  6. Hadronic triggers and trigger object-level analysis at ATLAS

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program at the Large Hadron Collider (LHC), and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally, missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous event rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year to significantly improve the potential of the 2017 dataset and overcome the limiting factors, such as storage and computing requirements...

  7. Precise ground motion measurements to support multi-hazard analysis in Jakarta

    Koudogbo, Fifamè; Duro, Javier; Garcia Robles, Javier; Abidin, Hasanuddin Z.

    2015-04-01

    Jakarta is the capital of Indonesia and is home to approximately 10 million people on the coast of the Java Sea. The Capital District of Jakarta (DKI) sits in the lowest lying areas of the basin. Its topography varies, with the northern part just meters above current sea level and lying on a flood plain. Consequently, this portion of the city frequently floods. Flood events have been increasing in severity during the past decade. The February 2007 event inundated 235 km2 (about 36%) of the city, by up to seven meters in some areas. This event affected more than 2.6 million people; the estimated financial and economic losses from this event amounted to US$900 million [1][2]. Inundations continue to occur under any sustained rainfall conditions. Flood events in Jakarta are expected to become more frequent in coming years, with a shift from previously slow natural processes with low frequency to a high frequency process resulting in severe socio-economic damage. Land subsidence in Jakarta results in increased vulnerability to flooding due to the reduced gravitational capacity to channel storm flows to the sea and an increased risk of tidal flooding. It continues at increasingly alarming rates, principally caused by intensive deep groundwater abstraction [3]. Recent studies have found typical subsidence rates of 7.5-10 cm a year. In localized areas of north Jakarta subsidence in the range 15-25 cm a year is occurring which, if sustained, would result in these areas sinking to 4-5 m below sea level by 2025 [3]. ALTAMIRA INFORMATION, a company specialized in ground motion monitoring, has developed GlobalSAR™, which combines several processing techniques and algorithms based on InSAR technology, to achieve ground motion measurements with millimetric precision and high accuracy [4]. Within the RASOR (Rapid Analysis and Spatialisation Of Risk) project, ALTAMIRA INFORMATION will apply GlobalSAR™ to assess recent land subsidence in Jakarta, based on the processing of Very High

  8. Analysis on the precision of the dimensions of self-ligating brackets.

    Erduran, Rackel Hatice Milhomens Gualberto; Maeda, Fernando Akio; Ortiz, Sandra Regina Mota; Triviño, Tarcila; Fuziy, Acácio; Carvalho, Paulo Eduardo Guedes

    2016-12-01

    The present study aimed to evaluate the precision of the torque applied by 0.022" self-ligating brackets of different brands, the precision of parallelism between the inner walls of their slots, and precision of their slot height. Eighty brackets for upper central incisors of eight trademarked models were selected: Abzil, GAC, American Orthodontics, Morelli, Orthometric, Ormco, Forestadent, and Ortho Organizers. Images of the brackets were obtained using a scanning electron microscope (SEM) and these were measured using the AutoCAD 2011 software. The tolerance parameters stated in the ISO 27020 standard were used as references. The results showed that only the Orthometric, Morelli, and Ormco groups showed results inconsistent with the ISO standard. Regarding the parallelism of the internal walls of the slots, most of the models studied had results in line with the ISO prescription, except the Morelli group. In assessing bracket slot height, only the Forestadent, GAC, American Orthodontics, and Ormco groups presented results in accordance with the ISO standard. The GAC, Forestadent, and American Orthodontics groups did not differ in relation to the three factors of the ISO 27020 standard. Great variability of results is observed in relation to all the variables. © 2016 Wiley Periodicals, Inc.

  9. Categorical data processing for real estate objects valuation using statistical analysis

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
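
    One common way to code categorical property descriptors before regression-based valuation is one-hot encoding. The sketch below is a generic scikit-learn illustration with invented attributes and prices; it is not the coding method proposed in the paper.

      import pandas as pd
      from sklearn.compose import ColumnTransformer
      from sklearn.linear_model import LinearRegression
      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import OneHotEncoder

      # Invented toy listings: categorical and numeric descriptors plus a price
      data = pd.DataFrame({
          "district":   ["centre", "suburb", "centre", "outskirts", "suburb", "centre"],
          "wall_type":  ["brick", "panel", "brick", "panel", "monolith", "monolith"],
          "area_m2":    [54, 38, 72, 45, 60, 80],
          "price_kusd": [95, 52, 140, 48, 90, 160],
      })

      X, y = data.drop(columns="price_kusd"), data["price_kusd"]
      model = Pipeline([
          ("encode", ColumnTransformer(
              [("cat", OneHotEncoder(handle_unknown="ignore"), ["district", "wall_type"])],
              remainder="passthrough")),
          ("regress", LinearRegression()),
      ]).fit(X, y)

      print(model.predict(pd.DataFrame(
          {"district": ["suburb"], "wall_type": ["brick"], "area_m2": [50]})))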

  10. Why precision?

    Bluemlein, Johannes

    2012-05-15

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  11. Why precision?

    Bluemlein, Johannes

    2012-05-01

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  12. Objective analysis of image quality of video image capture systems

    Rowberg, Alan H.

    1990-07-01

    As Picture Archiving and Communication System (PACS) technology has matured, video image capture has become a common way of capturing digital images from many modalities. While digital interfaces, such as those which use the ACR/NEMA standard, will become more common in the future, and are preferred because of the accuracy of image transfer, video image capture will be the dominant method in the short term, and may continue to be used for some time because of the low cost and high speed often associated with such devices. Currently, virtually all installed systems use methods of digitizing the video signal that is produced for display on the scanner viewing console itself. A series of digital test images has been developed for display on either a GE CT9800 or a GE Signa MRI scanner. These images have been captured with each of five commercially available image capture systems, and the resultant images digitally transferred on floppy disk to a PC1286 computer containing Optimast image analysis software. Here the images can be displayed in a comparative manner for visual evaluation, in addition to being analyzed statistically. Each of the images has been designed to support certain tests, including noise, accuracy, linearity, gray scale range, stability, slew rate, and pixel alignment. These image capture systems vary widely in these characteristics, in addition to the presence or absence of other artifacts, such as shading and moiré pattern. Other accessories such as video distribution amplifiers and noise filters can also add or modify artifacts seen in the captured images, often giving unusual results. Each image is described, together with the tests which were performed using it. One image contains alternating black and white lines, each one pixel wide, after equilibration strips ten pixels wide. While some systems have a slew rate fast enough to track this correctly, others blur it to an average shade of gray, and do not resolve the lines, or give

  13. SOCIAL EXCLUSION AS AN OBJECT OF ECONOMIC ANALYSIS

    Z. Halushka

    2014-06-01

    Full Text Available The article defines the essence and forms of manifestation of the social exclusion of individual citizens and certain segments of the population as a socioeconomic phenomenon. Theoretical principles and methodologies for assessing social exclusion are analyzed. Characteristic features of social exclusion are identified: low consumption and income of individuals or groups; limited access to public mechanisms for increasing welfare; and a predominantly passive mode of interaction with society. Attention is drawn to the deprivation of a number of rights, the limited access to the institutions that distribute resources, and the limited access to the labour market. Poverty is identified as the main category of social exclusion. The concept of the "circle of poverty" and the mechanisms of its persistence are substantiated. Other manifestations of social exclusion are examined as direct violations of basic human rights: to quality education, to medical services and good health, to an acceptable standard of living, to access to cultural goods, to the defence of one's interests and, in general, to participation in the economic, social, cultural and political life of the country. Data are cited on the share of excluded households in Ukraine by individual indicators. The analysis of the distribution of households by the number of accumulated indicators of social exclusion made it possible to set a threshold, at the level of 5 indicators, beyond which social exclusion begins to manifest itself clearly. A second, critical degree of exclusion corresponds to the presence of 7 indicators; at this level in Ukraine there are 37.7 % of households, far more than those who are considered poor by the relative national criterion (24.0 %). It is established that the concept of social exclusion presents a "horizontal cut" of the system of social relations and the place of the individual, stratum or group in this system, defined by certain indicators. The necessity of the use of

  14. Theoretical analysis of hidden photon searches in high-precision experiments

    Beranek, Tobias

    2014-01-01

    Although the Standard Model of particle physics (SM) provides an extremely successful description of the ordinary matter, one knows from astronomical observations that it accounts only for around 5% of the total energy density of the Universe, whereas around 30% are contributed by the dark matter. Motivated by anomalies in cosmic ray observations and by attempts to solve questions of the SM like the (g-2)_μ discrepancy, proposed U(1) extensions of the Standard Model gauge group SU(3) x SU(2) x U(1) have raised attention in recent years. In the considered U(1) extensions a new, light messenger particle γ', the hidden photon, couples to the hidden sector as well as to the electromagnetic current of the SM by kinetic mixing. This allows for a search for this particle in laboratory experiments exploring the electromagnetic interaction. Various experimental programs have been started to search for the γ' boson, such as in electron-scattering experiments, which are a versatile tool to explore various physics phenomena. One approach is the dedicated search in fixed-target experiments at modest energies as performed at MAMI or at JLAB. In these experiments the scattering of an electron beam off a hadronic target, e(A,Z)→e(A,Z)l⁺l⁻, is investigated and a search for a very narrow resonance in the invariant mass distribution of the l⁺l⁻ pair is performed. This requires an accurate understanding of the theoretical basis of the underlying processes. For this purpose it is demonstrated in the first part of this work, in which way the hidden photon can be motivated from existing puzzles encountered at the precision frontier of the SM. The main part of this thesis deals with the analysis of the theoretical framework for electron scattering fixed-target experiments searching for hidden photons. As a first step, the cross section for the bremsstrahlung emission of hidden photons in such experiments is studied. Based on these results, the applicability of the Weizsaecker

  15. Diachronic and Synchronic Analysis - the Case of the Indirect Object in Spanish

    Dam, Lotte; Dam-Jensen, Helle

    2007-01-01

    The article deals with a monograph on the indirect object in Spanish. The book offers a many-faceted analysis of the indirect object, as it, on the one hand, gives a detailed diachronic analysis of what is known as clitic-doubled constructions and, on the other, a synchronic analysis of both...

  16. Performance Analysis of Several GPS/Galileo Precise Point Positioning Models.

    Afifi, Akram; El-Rabbany, Ahmed

    2015-06-19

    This paper examines the performance of several precise point positioning (PPP) models, which combine dual-frequency GPS/Galileo observations in the un-differenced and between-satellite single-difference (BSSD) modes. These include the traditional un-differenced model, the decoupled clock model, the semi-decoupled clock model, and the between-satellite single-difference model. We take advantage of the IGS-MGEX network products to correct for the satellite differential code biases and the orbital and satellite clock errors. Natural Resources Canada's GPSPace PPP software is modified to handle the various GPS/Galileo PPP models. A total of six data sets of GPS and Galileo observations at six IGS stations are processed to examine the performance of the various PPP models. It is shown that the traditional un-differenced GPS/Galileo PPP model, the GPS decoupled clock model, and the semi-decoupled clock GPS/Galileo PPP model improve the convergence time by about 25% in comparison with the un-differenced GPS-only model. In addition, the semi-decoupled GPS/Galileo PPP model improves the solution precision by about 25% compared to the traditional un-differenced GPS/Galileo PPP model. Moreover, the BSSD GPS/Galileo PPP model improves the solution convergence time by about 50%, in comparison with the un-differenced GPS PPP model, regardless of the type of BSSD combination used. As well, the BSSD model improves the precision of the estimated parameters by about 50% and 25% when the loose and the tight combinations are used, respectively, in comparison with the un-differenced GPS-only model. Comparable results are obtained through the tight combination when either a GPS or a Galileo satellite is selected as a reference.
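
    The between-satellite single-difference (BSSD) mode evaluated above can be illustrated on simplified pseudorange equations: differencing the observations of two satellites tracked by the same receiver cancels the receiver clock bias. The sketch below uses synthetic numbers and a deliberately reduced error model (no troposphere, ionosphere or hardware biases), so it only demonstrates the cancellation, not the full PPP models.

      import numpy as np

      C = 299_792_458.0                      # speed of light, m/s
      rng = np.random.default_rng(7)

      rec_clock_bias_s = 2.5e-6                                # receiver clock error, s (synthetic)
      true_ranges_m = np.array([21_450_320.0, 23_012_775.0])   # receiver-satellite ranges, m (synthetic)
      sat_clock_err_m = np.array([1.2, -0.8])                  # residual satellite clock errors, m (synthetic)
      noise_m = rng.normal(scale=0.3, size=2)                  # pseudorange noise, m

      # Simplified undifferenced pseudoranges: P = rho + c*dt_rec - dT_sat + noise
      P = true_ranges_m + C * rec_clock_bias_s - sat_clock_err_m + noise_m

      # Between-satellite single difference (satellite 1 minus satellite 2, same receiver):
      bssd = P[0] - P[1]
      geometry = true_ranges_m[0] - true_ranges_m[1]

      # The receiver clock term c*dt_rec cancels exactly in the difference.
      print(f"BSSD residual after removing geometry: {bssd - geometry:8.3f} m")
      print(f"receiver clock contribution it removed: {C * rec_clock_bias_s:10.1f} m")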

  17. Dipole model analysis of highest precision HERA data, including very low Q²'s

    Luszczak, A.; Kowalski, H.

    2016-12-01

    We analyse, within a dipole model, the final, inclusive HERA DIS cross section data in the low-x region, using fully correlated errors. We show that these highest precision data are very well described within the dipole model framework, starting from Q² values of 3.5 GeV² up to the highest values of Q² = 250 GeV². To analyze the saturation effects we also evaluated the data in the very low Q² region, down to 0.35 GeV². The fits including this region show a preference for the saturation ansatz.

  18. Analysis of diodes used as precision power detectors above the square law region

    Guldbrandsen, Tom

    1990-01-01

    The deviation from square law found in diode power detectors at moderate power levels has been modeled for a general system consisting of a number of diode detectors connected to a common arbitrary linear passive network containing an approximately sinusoidal source. This situation covers the case... if an extra set of measurements is made in situ. For precision measurements the maximum power level can be increased by about 10 dB. The dynamic range can thus be increased sufficiently to enable fast measurements to be made with an accuracy of 10⁻³ dB...

  19. Accuracy and precision of 3 intraoral scanners and accuracy of conventional impressions: A novel in vivo analysis method.

    Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A

    2018-02-01

    To evaluate a novel methodology using industrial scanners as a reference, and assess in vivo accuracy of 3 intraoral scanners (IOS) and conventional impressions. Further, to evaluate IOS precision in vivo. Four reference-bodies were bonded to the buccal surfaces of upper premolars and incisors in five subjects. After three reference-scans, ATOS Core 80 (ATOS), subjects were scanned three times with three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken, 3M Impregum Penta Soft, and poured models were digitized with laboratory scanner 3shape D1000 (D1000). Best-fit alignment of reference-bodies and 3D Compare Analysis was performed. Precision of ATOS and D1000 was assessed for quantitative evaluation and comparison. Accuracy of IOS and IMPR were analyzed using ATOS as reference. Precision of IOS was evaluated through intra-system comparison. Precision of ATOS reference scanner (mean 0.6 μm) and D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference-bodies located in different tooth positions displayed a statistically significant difference of accuracy between two scanner-groups: 3M and TRIOS, over OMNI (p value range 0.0001 to 0.0006). IMPR did not show any statistically significant difference to IOS. However, deviations of IOS and IMPR were within a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing accuracy of IOS and IMPR in vivo in up to five units bilaterally from midline. 3M and TRIOS had a higher accuracy than OMNI. IMPR overlapped both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  20. Estimation of Bouguer Density Precision: Development of Method for Analysis of La Soufriere Volcano Gravity Data

    Hendra Gunawan

    2014-06-01

    Full Text Available http://dx.doi.org/10.17014/ijog.vol3no3.20084 The precision of topographic density (Bouguer density) estimation by the Nettleton approach is based on a minimum correlation of the Bouguer gravity anomaly and topography. The other method, the Parasnis approach, is based on a minimum correlation of the Bouguer gravity anomaly and the Bouguer correction. The precision of Bouguer density estimates was investigated by both methods on simple 2D synthetic models and under the assumption of a free-air anomaly consisting of an effect of topography, an intracrustal effect, and an isostatic compensation. Based on the simulation results, Bouguer density estimates were then investigated for a 2005 gravity survey of the La Soufriere Volcano-Guadeloupe area (Antilles Islands). The Bouguer density based on the Parasnis approach is 2.71 g/cm3 for the whole area, except the edifice area where the average topography density estimate is 2.21 g/cm3; Bouguer density estimates from the previous gravity survey of 1975 are 2.67 g/cm3. The Bouguer density at La Soufriere Volcano was estimated with an uncertainty of 0.1 g/cm3. For the studied area, the density deduced from refraction seismic data is coherent with the recent Bouguer density estimates. A new Bouguer anomaly map based on these Bouguer density values allows a better geological interpretation.
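
    The Nettleton criterion described above (choose the reduction density that minimizes the correlation between the Bouguer anomaly and topography) is straightforward to prototype. The sketch below uses a synthetic profile and a simple-slab Bouguer correction, so it illustrates the principle only and is not the processing applied to the La Soufriere data.

      import numpy as np

      rng = np.random.default_rng(3)

      # Synthetic profile: topography (m) and free-air anomaly (mGal) generated with a
      # "true" topographic density of 2.3 g/cm3 plus a regional trend and noise.
      x = np.linspace(0.0, 10_000.0, 200)                      # profile coordinate, m
      topo = 400.0 + 150.0 * np.sin(2 * np.pi * x / 4000.0)    # elevation, m
      true_density = 2.3                                       # g/cm3
      slab = 0.04193                                           # mGal per (g/cm3 * m), Bouguer slab factor
      faa = slab * true_density * topo + 0.002 * x + rng.normal(scale=0.3, size=x.size)

      def bouguer_anomaly(density):
          return faa - slab * density * topo

      # Nettleton approach: scan densities and keep the one with minimum |correlation|
      densities = np.arange(1.8, 3.0, 0.01)
      corr = [abs(np.corrcoef(bouguer_anomaly(d), topo)[0, 1]) for d in densities]
      best = densities[int(np.argmin(corr))]
      print(f"estimated Bouguer density: {best:.2f} g/cm3 (true value {true_density})")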

  1. Detailed seismotectonic analysis of Sumatra subduction zone revealed by high precision earthquake location

    Sagala, Ricardo Alfencius; Harjadi, P. J. Prih; Heryandoko, Nova; Sianipar, Dimas

    2017-07-01

    Sumatra is one of the most seismically active regions in Indonesia. The subduction of the Indo-Australian plate beneath the Eurasian plate in western Sumatra contributes many significant earthquakes that occur in this area. These earthquake events can be used to analyze the seismotectonics of the Sumatra subduction zone and its system. In this study we use the teleseismic double-difference method to obtain a more precise earthquake distribution in the Sumatra subduction zone. We use a 3D nested regional-global velocity model and a combination of data from the ISC (International Seismological Center) and BMKG (Agency for Meteorology, Climatology and Geophysics, Indonesia). We successfully relocated about 6886 earthquakes that occurred in the period 1981-2015. We consider these new locations to be more precise than the regular bulletin; the relocation results show a greatly reduced RMS residual of travel time. Using these data, we construct a new seismotectonic map of Sumatra. A well-constrained geometry of the subducting slab, faults and volcanic arc can be obtained from the new bulletin. At depths of 140-170 km many events occur as moderate-to-deep earthquakes, and we consider the relation of these slab events to the volcanic arc and the inland fault system. A reliable slab model is also built from a regression equation using the newly relocated data. We also analyze the spatial-temporal seismotectonics using b-value mapping, inspected in detail in horizontal and vertical cross-sections.
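
    The b-value mapping mentioned at the end of the abstract is usually based on the Aki maximum-likelihood estimator applied to events above the completeness magnitude. The sketch below applies that estimator to a synthetic catalogue (magnitude binning ignored for simplicity); it illustrates the general technique, not the authors' workflow.

      import numpy as np

      rng = np.random.default_rng(11)

      # Synthetic catalogue: magnitudes following a Gutenberg-Richter law with b = 1.0
      # above an assumed completeness magnitude Mc.
      b_true, Mc = 1.0, 4.5
      mags = Mc + rng.exponential(scale=1.0 / (b_true * np.log(10)), size=5000)

      def aki_b_value(m, mc):
          """Aki (1965) maximum-likelihood b-value for magnitudes at or above mc."""
          m = m[m >= mc]
          b = np.log10(np.e) / (m.mean() - mc)
          return b, b / np.sqrt(m.size)          # estimate and a crude standard error

      b, b_err = aki_b_value(mags, Mc)
      print(f"estimated b-value: {b:.2f} +/- {b_err:.2f} (true value {b_true})")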

  2. Design and Analysis of a Compact Precision Positioning Platform Integrating Strain Gauges and the Piezoactuator

    Shunguang Wan

    2012-07-01

    Full Text Available Miniaturized precision positioning platforms are needed for in situ nanomechanical test applications. This paper proposes a compact precision positioning platform integrating strain gauges and the piezoactuator. Effects of geometric parameters of two parallel plates on Von Mises stress distribution as well as static and dynamic characteristics of the platform were studied by the finite element method. Results of the calibration experiment indicate that the strain gauge sensor has good linearity and its sensitivity is about 0.0468 mV/μm. A closed-loop control system was established to solve the problem of nonlinearity of the platform. Experimental results demonstrate that for the displacement control process, both the displacement increasing portion and the decreasing portion have good linearity, verifying that the control system is effective. The developed platform has a compact structure but can realize displacement measurement with the embedded strain gauges, which is useful for the closed-loop control and structure miniaturization of piezo devices. It has potential applications in nanoindentation and nanoscratch tests, especially in the field of in situ nanomechanical testing which requires compact structures.

  3. Analysis of the Murine Immune Response to Pulmonary Delivery of Precisely Fabricated Nano- and Microscale Particles

    Roberts, Reid A.; Shen, Tammy; Allen, Irving C.; Hasan, Warefta; DeSimone, Joseph M.; Ting, Jenny P. Y.

    2013-01-01

    Nanomedicine has the potential to transform clinical care in the 21st century. However, a precise understanding of how nanomaterial design parameters such as size, shape and composition affect the mammalian immune system is a prerequisite for the realization of nanomedicine's translational promise. Herein, we make use of the recently developed Particle Replication in Non-wetting Template (PRINT) fabrication process to precisely fabricate particles across the nano- and micro-scale with defined shapes and compositions to address the role of particle design parameters on the murine innate immune response in both in vitro and in vivo settings. We find that particles composed of either the biodegradable polymer poly(lactic-co-glycolic acid) (PLGA) or the biocompatible polymer polyethylene glycol (PEG) do not cause release of pro-inflammatory cytokines or inflammasome activation in bone marrow-derived macrophages. When instilled into the lungs of mice, particle composition and size can augment the number and type of innate immune cells recruited to the lungs without triggering inflammatory responses as assayed by cytokine release and histopathology. Smaller particles (80×320 nm) are more readily taken up in vivo by monocytes and macrophages than larger particles (6 µm diameter), yet particles of all tested sizes remained in the lungs for up to 7 days without clearance or triggering of host immunity. These results suggest rational design of nanoparticle physical parameters can be used for sustained and localized delivery of therapeutics to the lungs. PMID:23593509

  4. Intermediary object for participative design processes based on the ergonomic work analysis

    Souza da Conceição, Carolina; Duarte, F.; Broberg, Ole

    2012-01-01

    The objective of this paper is to present and discuss the use of an intermediary object, built from the ergonomic work analysis, in a participative design process. The object was a zoning pattern, developed as a visual representation ‘mapping’ of the interrelations among the functional units of t...

  5. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  6. Accuracy and Precision in Elemental Analysis of Environmental Samples using Inductively Coupled Plasma-Atomic Emission Spectrometry

    Quraishi, Shamsad Begum; Chung, Yong-Sam; Choi, Kwang Soon

    2005-01-01

    Inductively coupled plasma-atomic emission spectrometry (ICP-AES), preceded by microwave digestion, has been performed on different environmental Certified Reference Materials (CRMs). Analytical results show that accuracy and precision in the ICP-AES analysis were acceptable and satisfactory in the case of the soil and hair CRM samples. The relative error of most of the elements in these two CRMs is within 10%, with few exceptions, and the coefficient of variation is also less than 10%. The z-score, as a measure of analytical performance, was also within the acceptable range (±2). ICP-AES was found to be an inadequate method for the air filter CRM due to incomplete dissolution, low concentration of elements and the very low mass of the sample. However, real air filter samples could be analyzed with high accuracy and precision by increasing the sample mass during collection. (author)

  7. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  8. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of an important geotechnical parameter, the compression modulus Es, contributes considerably to the understanding of the underlying geological processes and to the adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. These analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve such a task, the Es distribution of a stratum of silty clay in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates the multiple precisions of different geotechnical investigations and sources of uncertainty. Single CPT samplings were modeled as a rational probability density curve by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF of the CPT positions were built from the borehole experiments and the potential value of the prediction point; then, through numerical integration over the CPT probability density curves, the posterior probability density curve of the prediction point was calculated by the Bayesian reverse interpolation framework. The results of the Gaussian Sequential Stochastic Simulation and the Bayesian methods were compared. The differences between single CPT samplings modeled with a normal distribution and with the simulated probability density curve based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, whereas more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization in a stratum, and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results will provide a multi-precision

  9. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  10. Ion chromatography for the precise analysis of chloride and sodium in sweat for the diagnosis of cystic fibrosis.

    Doorn, J; Storteboom, T T R; Mulder, A M; de Jong, W H A; Rottier, B L; Kema, I P

    2015-07-01

    Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method, suitable for the analysis of both chloride and sodium in small volumes of sweat. Precision, linearity and limit of detection of the in-house developed IC/HPLC method were established. Method comparison between the newly developed IC/HPLC method and the traditional Chlorocounter was performed, and trueness was determined using Passing Bablok method comparison with external quality assurance material (Royal College of Pathologists of Australasia). Precision and linearity fulfill the criteria established by UK guidelines and are comparable with inductively coupled plasma-mass spectrometry methods. Passing Bablok analysis demonstrated excellent correlation between IC/HPLC measurements and external quality assessment target values, for both chloride and sodium. With a limit of quantitation of 0.95 mmol/L, our method is suitable for the analysis of small amounts of sweat and can thus be used in combination with the Macroduct collection system. Although a chromatographic application results in a somewhat more expensive test compared to a Chlorocounter test, more accurate measurements are achieved. In addition, simultaneous measurements of sodium concentrations will result in better detection of false positives, less test repeating and thus faster and more accurate and effective diagnosis. The described IC/HPLC method, therefore, provides a precise, relatively cheap and easy-to-handle application for the analysis of both chloride and sodium in sweat. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.

  11. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    Ren, M J; Cheung, C F; Kong, L B

    2012-01-01

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of associated uncertainty in the characterization results which may result from those characterization methods being used. As a result, this paper presents a task specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with specific sampling strategy. Three factors are considered in this study which include measurement error, surface form error and sample size. The task specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the task specific uncertainty analysis method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement
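
    As a rough illustration of task-specific uncertainty evaluation for a least-squares form characterization, the sketch below repeatedly perturbs sampled surface points with an assumed measurement-noise level, refits a least-squares reference plane each time, and reports the spread of the resulting peak-to-valley form-error values. It is a generic Monte Carlo scheme with invented numbers, not the method developed in the paper.

      import numpy as np

      rng = np.random.default_rng(5)

      # Nominal surface sampled on a grid (mm), with a small built-in form error
      n = 25
      x, y = np.meshgrid(np.linspace(-5, 5, n), np.linspace(-5, 5, n))
      z_true = 0.002 * np.sin(0.8 * x) + 0.001 * (y / 5.0) ** 2      # mm

      def pv_form_error(x, y, z):
          """Peak-to-valley residual after removing a least-squares reference plane."""
          A = np.column_stack([x.ravel(), y.ravel(), np.ones(x.size)])
          coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
          residual = z.ravel() - A @ coef
          return residual.max() - residual.min()

      sigma = 0.0005     # assumed measurement noise, mm (0.5 micrometre)
      pv = [pv_form_error(x, y, z_true + rng.normal(scale=sigma, size=z_true.shape))
            for _ in range(500)]

      print(f"peak-to-valley form error: {np.mean(pv) * 1e3:.2f} um, "
            f"task-specific standard uncertainty: {np.std(pv, ddof=1) * 1e3:.2f} um")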

  12. Foreign object detection and removal to improve automated analysis of chest radiographs

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel to belong to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.

  13. Using beta-binomial regression for high-precision differential methylation analysis in multifactor whole-genome bisulfite sequencing experiments

    2014-01-01

    Background Whole-genome bisulfite sequencing currently provides the highest-precision view of the epigenome, with quantitative information about populations of cells down to single nucleotide resolution. Several studies have demonstrated the value of this precision: meaningful features that correlate strongly with biological functions can be found associated with only a few CpG sites. Understanding the role of DNA methylation, and more broadly the role of DNA accessibility, requires that methylation differences between populations of cells are identified with extreme precision and in complex experimental designs. Results In this work we investigated the use of beta-binomial regression as a general approach for modeling whole-genome bisulfite data to identify differentially methylated sites and genomic intervals. Conclusions The regression-based analysis can handle medium- and large-scale experiments where it becomes critical to accurately model variation in methylation levels between replicates and account for influence of various experimental factors like cell types or batch effects. PMID:24962134
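
    At the level of a single CpG site, the beta-binomial regression idea can be sketched as follows: model methylated read counts out of total reads with a beta-binomial likelihood whose mean follows a logistic regression on the experimental factor, and compare the full and intercept-only fits with a likelihood-ratio test. The code below is a minimal illustration on synthetic counts, with an invented parameterization, and is not the pipeline described in the paper.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import betabinom, chi2

      rng = np.random.default_rng(8)

      # Synthetic counts for one CpG site: methylated reads out of total reads in
      # two groups (0 = control, 1 = treated) with different true methylation levels.
      group = np.repeat([0, 1], 8)
      total = rng.integers(15, 40, size=group.size)
      p_true = np.where(group == 0, 0.30, 0.55)
      meth = rng.binomial(total, rng.beta(p_true * 20, (1 - p_true) * 20))

      def neg_loglik(params, design):
          *beta, log_phi = params
          mu = 1.0 / (1.0 + np.exp(-(design @ np.asarray(beta))))   # logistic link
          phi = np.exp(log_phi)                                     # dispersion parameter
          return -betabinom.logpmf(meth, total, mu * phi, (1 - mu) * phi).sum()

      X_full = np.column_stack([np.ones_like(group), group])        # intercept + group
      X_null = np.ones((group.size, 1))                             # intercept only
      full = minimize(neg_loglik, x0=[0.0, 0.0, 1.0], args=(X_full,), method="Nelder-Mead")
      null = minimize(neg_loglik, x0=[0.0, 1.0], args=(X_null,), method="Nelder-Mead")

      # Likelihood-ratio test for a group effect at this site (1 degree of freedom)
      lr = 2.0 * (null.fun - full.fun)
      print(f"LR statistic = {lr:.2f}, p = {chi2.sf(lr, df=1):.4f}")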

  14. Analysis tools for precision studies of hadronic three-body decays and transition form factors

    Schneider, Sebastian Philipp

    2013-01-01

    Due to the running coupling constant of Quantum Chromodynamics one of the pillars of the Standard Model, the strong interactions, is still insufficiently understood at low energies. In order to describe the interactions of hadrons that form in this physical regime, one has to devise methods that are non-perturbative in the strong coupling constant. In particular hadronic three-body decays and transition form factors present a great challenge due to the complex analytic structure ensued by strong final-state interactions. In this thesis we present two approaches to tackle these processes. In the first part we use a modified version of non-relativistic effective field theory to analyze the decay η→3π. This perturbative low-energy expansion is ideally suited to study the effects of ππ rescattering and contributes greatly to the understanding of the slope parameter of the η→3π 0 Dalitz plot, a quantity that is strongly influenced by final-state interactions and has presented a long-standing puzzle for theoretical approaches. In the second part we present dispersion relations as a non-perturbative means to study three-particle decays. Using the example of η'→ηππ we give a detailed introduction to the framework and its numerical implementation. We confront our findings with recent experimental data from the BES-III and VES collaborations and discuss whether the extraction of πη scattering parameters, one of the prime motives to study this decay channel, is feasible in such an approach. A more clear-cut application is given in our study of the decays ω/φ→3π due to the relative simplicity of this decay channel: our results are solely dependent on the ππ P-wave scattering phase shift. We give predictions for the Dalitz plot distributions and compare our findings to very precise data on φ→3π by the KLOE and CMD-2 collaborations. We also predict Dalitz plot parameters that may be determined in future high-precision measurements of ω→3π and

  15. Analysis tools for precision studies of hadronic three-body decays and transition form factors

    Schneider, Sebastian Philipp

    2013-02-14

    Due to the running coupling constant of Quantum Chromodynamics one of the pillars of the Standard Model, the strong interactions, is still insufficiently understood at low energies. In order to describe the interactions of hadrons that form in this physical regime, one has to devise methods that are non-perturbative in the strong coupling constant. In particular hadronic three-body decays and transition form factors present a great challenge due to the complex analytic structure ensued by strong final-state interactions. In this thesis we present two approaches to tackle these processes. In the first part we use a modified version of non-relativistic effective field theory to analyze the decay η→3π. This perturbative low-energy expansion is ideally suited to study the effects of ππ rescattering and contributes greatly to the understanding of the slope parameter of the η→3π⁰ Dalitz plot, a quantity that is strongly influenced by final-state interactions and has presented a long-standing puzzle for theoretical approaches. In the second part we present dispersion relations as a non-perturbative means to study three-particle decays. Using the example of η'→ηππ we give a detailed introduction to the framework and its numerical implementation. We confront our findings with recent experimental data from the BES-III and VES collaborations and discuss whether the extraction of πη scattering parameters, one of the prime motives to study this decay channel, is feasible in such an approach. A more clear-cut application is given in our study of the decays ω/φ→3π due to the relative simplicity of this decay channel: our results are solely dependent on the ππ P-wave scattering phase shift. We give predictions for the Dalitz plot distributions and compare our findings to very precise data on φ→3π by the KLOE and CMD-2 collaborations. We also predict Dalitz plot

  16. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
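
    The kind of simulation described can be reproduced in miniature: generate correlated pre/post scores with a known treatment effect and a deliberate baseline imbalance, then compare the treatment-effect estimates from a post-score-only analysis (ANOVA), a change-score analysis and ANCOVA. The sketch below uses arbitrary parameter values and plain OLS fits, purely to illustrate the mechanics rather than to reproduce the 126 published scenarios.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(2024)
      true_effect, rho, imbalance, n = 5.0, 0.6, 3.0, 200
      est = {"ANOVA (post only)": [], "Change score": [], "ANCOVA": []}

      for _ in range(2000):
          group = np.repeat([0, 1], n // 2)
          pre = rng.normal(50, 10, n) + imbalance * group            # baseline imbalance
          post = 20 + rho * pre + true_effect * group \
                 + rng.normal(0, 10 * np.sqrt(1 - rho ** 2), n)

          X = sm.add_constant(group)
          est["ANOVA (post only)"].append(sm.OLS(post, X).fit().params[1])
          est["Change score"].append(sm.OLS(post - pre, X).fit().params[1])
          X_ancova = sm.add_constant(np.column_stack([group, pre]))
          est["ANCOVA"].append(sm.OLS(post, X_ancova).fit().params[1])

      for name, values in est.items():
          values = np.asarray(values)
          print(f"{name:18s} bias = {values.mean() - true_effect:+.2f}, "
                f"empirical SD = {values.std(ddof=1):.2f}")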

  17. SU-F-I-56: High-Precision Gamma-Ray Analysis of Medical Isotopes

    Chopra, N; Chillery, T; Chowdhury, P; Lister, C [University of Massachusetts-Lowell, Lowell, MA (United States); McCutchan, E [National Nuclear Data Center, Brookhaven National Laboratory, Upton, NY (United States); Smith, C [BLIP Facility, Brookhaven National Laboratory, Upton, NY (United States)

    2016-06-15

    Purpose: Advanced, time-resolved, Compton-suppressed gamma-ray spectroscopy with germanium detectors is implemented for assaying medical isotopes to study the radioactive decay process, leading to a more accurate appraisal of the received dose and to better treatment planning. Lowell's Array for Radiological Assay (LARA), a detector array comprised of six Compton-suppressed high-purity germanium detectors, is currently under development at UMass-Lowell; it combines Compton suppression and time-and-angle correlations to allow for highly efficient and highly sensitive measurements. Methods: Two isotopes produced at the Brookhaven Linac Isotope Producer (BLIP) were investigated. ⁸²Sr, the parent isotope for producing ⁸²Rb, is often used in cardiac PET. The ⁸²Sr gamma-ray spectrum is dominated by the 511 keV photons from positron annihilation, which prevent precise measurement of co-produced contaminant isotopes. A second project was to investigate the production of platinum isotopes. Natural platinum was bombarded with protons from 53 MeV to 200 MeV. The resulting spectrum was complicated due to the large number of stable platinum isotopes in the target and the variety of open reaction channels (p,xn), (p,pxn), (p,αxn). Results: By using face-to-face NaI(Tl) counters at 90 degrees to the Compton-suppressed germaniums to detect the 511 keV photons, a much cleaner and more sensitive measurement of ⁸⁵Sr and other contaminants was obtained. For the platinum target, we identified the production of ¹⁸⁸,¹⁸⁹,¹⁹¹,¹⁹⁵Pt, ¹⁹¹,¹⁹²,¹⁹³,¹⁹⁴,¹⁹⁵,¹⁹⁶Au and ¹⁸⁶,¹⁸⁸,¹⁸⁹,¹⁹⁰,¹⁹²,¹⁹⁴Ir. For example, at the lower energies (53 and 65 MeV), we measured ¹⁹¹Pt production cross-sections of 144 mb and 157 mb. Considerable care was needed in following the process of dissolving and diluting the samples to get consistent results. The new LARA array will help us better ascertain the absolute efficiency of the counting

  18. Measures of precision for dissimilarity-based multivariate analysis of ecological communities.

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

    Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a permanova model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. © 2014 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
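
    For a single group of n samples, one common formulation of the pseudo multivariate standard error is MultSE = sqrt(V/n), where V = SS/(n-1) and SS is the sum of squared interpoint dissimilarities divided by n. The sketch below computes this for a synthetic species-abundance matrix with Bray-Curtis dissimilarities; it follows that general formulation and omits the double-resampling uncertainty procedure described in the paper.

      import numpy as np
      from scipy.spatial.distance import pdist

      def mult_se(data, metric="braycurtis"):
          """Pseudo multivariate dissimilarity-based standard error for one group:
          MultSE = sqrt(V / n), V = SS / (n - 1), SS = sum(d_ij^2) / n."""
          n = data.shape[0]
          d = pdist(data, metric=metric)
          ss = np.sum(d ** 2) / n
          return np.sqrt(ss / (n - 1) / n)

      rng = np.random.default_rng(9)
      # Synthetic community data: 30 samples x 12 species, Poisson abundances
      community = rng.poisson(lam=rng.uniform(0.5, 8.0, size=12), size=(30, 12))

      # MultSE shrinks as the sample size grows, indicating when adding samples
      # yields diminishing gains in multivariate precision.
      for n in (5, 10, 20, 30):
          print(f"n = {n:2d}: MultSE = {mult_se(community[:n]):.4f}")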

  19. Precision Analysis of the Lightest MSSM Higgs Boson at Future Colliders

    Ellis, Jonathan Richard; Olive, Keith A; Weiglein, Georg; Ellis, John; Heinemeyer, Sven; Olive, Keith A.; Weiglein, Georg

    2003-01-01

    We investigate the sensitivity of observables measurable in e^+ e^-, gamma gamma and mu^+ mu^- collisions for distinguishing the properties of the light neutral CP-even Higgs boson in the minimal supersymmetric extension of the Standard Model (MSSM) from those of a Standard Model (SM) Higgs boson with the same mass. We explore first the available parameter space in the constrained MSSM (CMSSM), with universal soft supersymmetry-breaking parameters, incorporating the most recent direct limits on sparticle and Higgs masses, the indirect constraints from b to s gamma and g_mu - 2, and the cosmological relic density. We calculate the products of the expected CMSSM Higgs production cross sections and decay branching ratios sigma X B normalized by the corresponding values expected for those of a SM Higgs boson of the same mass. The results are compared with the precisions expected at each collider, and allow for a direct comparison of the different channels. The measurements in the Higgs sector are found to provide...

  20. The Analysis of Height System Definition and the High Precision GNSS Replacing Leveling Method

    ZHANG Chuanyin

    2017-08-01

    Full Text Available Based on the definition of the height system, the gravitational equipotential property of the height datum surface is discussed in this paper, and the differences between the heights of ground points defined in different height systems are tested and analyzed as well. A new method for replacing leveling using GNSS is proposed to ensure consistency between GNSS replacing leveling and spirit leveling at the mm accuracy level. The main conclusions include: ① For determining normal height at the centimeter accuracy level, the datum surface of normal height should be the geoid. The 1985 national height datum of China adopts the normal height system; its datum surface is the geoid passing through the Qingdao zero point. ② The surface of equi-orthometric height in near-earth space is parallel to the geoid. The combination of GNSS precise positioning and a geoid model can be directly used for orthometric height determination. However, the normal height system is more advantageous for describing terrain and relief. ③ Based on the proposed method of GNSS replacing leveling, errors in the geodetic height affect the normal height result more than errors of the geoid model; the former is about 1.5 times the latter.

  1. Assessing the Potential Economic Viability of Precision Irrigation: A Theoretical Analysis and Pilot Empirical Evaluation

    Francesco Galioto

    2017-12-01

    Full Text Available The present study explores the value generated by the use of information to rationalize the use of water resources in agriculture. The study introduces the value-of-information concept in the field of irrigation, developing a theoretical assessment framework to evaluate whether the introduction of “Precision Irrigation” (PI) practices can improve income expectations. This is supported by a stakeholder consultation and by a numerical example, using secondary data and crop growth models. The study reveals that the value generated by the transition to PI varies with pedo-climatic, economic, technological and other conditions, and it depends on the initial status of the farmer’s information environment. These factors affect the prerequisites needed to make PI viable. To foster the adoption of PI, stakeholders envisaged the need to set up a free meteorological information and advisory service that supports farmers in using PI, as well as other types of instruments. The paper concludes that the profitability of adoption and the relevant impact on the environment cannot be considered as generally given, but must be evaluated case by case, justifying (or not) the activation of specific agricultural policy measures supporting PI practices in target regions.

  2. Fast and Precise Symbolic Analysis of Concurrency Bugs in Device Drivers

    Deligiannis, P; Donaldson, AF; Rakamaric, Z

    2015-01-01

    © 2015 IEEE. Concurrency errors, such as data races, make device drivers notoriously hard to develop and debug without automated tool support. We present Whoop, a new automated approach that statically analyzes drivers for data races. Whoop is empowered by symbolic pairwise lockset analysis, a novel analysis that can soundly detect all potential races in a driver. Our analysis avoids reasoning about thread interleavings and thus scales well. Exploiting the race-freedom guarantees provided by W...

  3. Semi-on-line analysis for fast and precise monitoring of bioreaction processes

    Christensen, L.H.; Marcher, J.; Schulze, Ulrik

    1996-01-01

    Monitoring of substrates and products during fermentation processes can be achieved either by on-line, in situ sensors or by semi-on-line analysis consisting of an automatic sampling step followed by an ex situ analysis of the retrieved sample. The potential risk of introducing time delays...

  4. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
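
    A hedged sketch of how differential evolution can drive wavelength selection for a quantitative model; it is not the authors' algorithm. Each DE variable in [0, 1] is thresholded into a keep/drop decision for one spectral point, and the objective is the cross-validated prediction error of a simple linear model restricted to the selected points. The synthetic spectra and concentrations are placeholders.

```python
# Hedged sketch of DE-based wavelength selection on synthetic data.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 40, 30
X = rng.normal(size=(n_samples, n_wavelengths))          # stand-in absorbance spectra
true_coef = np.zeros(n_wavelengths)
true_coef[[3, 7, 12]] = [1.0, -0.5, 0.8]
y = X @ true_coef + rng.normal(scale=0.05, size=n_samples)  # stand-in concentrations

def objective(weights):
    mask = weights > 0.5                                  # threshold into a selection
    if not mask.any():
        return 1e6                                        # penalize empty selections
    scores = cross_val_score(LinearRegression(), X[:, mask], y,
                             scoring="neg_root_mean_squared_error", cv=5)
    return -scores.mean()

result = differential_evolution(objective, bounds=[(0, 1)] * n_wavelengths,
                                maxiter=30, seed=1, polish=False)
print("selected spectral points:", np.flatnonzero(result.x > 0.5))
```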

  5. The analysis of the bedrock deformation in Olkiluoto using precise levelling measurements

    Saaranen, V.; Rouhiainen, P.; Suurmaeki, H.

    2014-01-01

    In order to research vertical bedrock deformations in the Olkiluoto area, Posiva Oy and the Finnish Geodetic Institute began monitoring with precise levelling in 2003. At the moment, the measuring plan includes a loop between the monitoring GPS stations around the island, a levelling line from the island to the mainland, levelling loops to ONKALO, the final disposal site, and VLJ, the low and intermediate level waste repository there. The levelling to the mainland has been performed every fourth year and the levelling of the GPS stations every second year. The micro loops (ONKALO and VLJ) have been measured annually. In this report, we use a three-step method to study vertical deformation of the Olkiluoto area. Firstly, the linear deformation rate in the area has been determined by the least squares adjustment of the levelling data. It varies from -0.2 mm/yr to +0.2 mm/yr. Secondly, local deformations have been analysed by comparing the height differences for different years. In this comparison a starting value for the yearly adjustment has been corrected for land uplift. Using this method the elevation changes are relative to the whole network. For a fixed benchmark, we correct its yearly deformation. Thirdly, the fault lines have been analysed by comparing the elevation changes between the successive benchmarks from one observation epoch to another. The results show that ONKALO and Lapijoki are in the subsidence area of the network, and VLJ has a small uplift rate. On the island some deformations exist, but the elevation difference from 2003 to 2011 is less than one millimetre at every benchmark. The measurements on the Lapijoki-Olkiluoto line in 2003, 2007 and 2011 show that the linear elevation change between the mainland and Olkiluoto island has been small since 2003. The elevation differences from Olkiluoto to Lapijoki measured in 2003 and 2011 differ from each other by less than one millimetre, but the 2007 observation differs by three millimetres from the other measurements.

  6. Precision Attitude Determination System (PADS) system design and analysis: Single-axis gimbal star tracker

    1974-01-01

    The feasibility is evaluated of an evolutionary development for use of a single-axis gimbal star tracker from prior two-axis gimbal star tracker based system applications. Detailed evaluation of the star tracker gimbal encoder is considered. A brief system description is given including the aspects of tracker evolution and encoder evaluation. System analysis includes evaluation of star availability and mounting constraints for the geosynchronous orbit application, and a covariance simulation analysis to evaluate performance potential. Star availability and covariance analysis digital computer programs are included.

  7. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  8. Evaluation of the prediction precision capability of partial least squares regression approach for analysis of high alloy steel by laser induced breakdown spectroscopy

    Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.

    2015-06-01

    Laser induced breakdown spectroscopy (LIBS) was applied for elemental characterization of high alloy steel using partial least squares regression (PLSR), with the objective of evaluating the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures on the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). The pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved by increasing the number of laser pulses accumulated per spectrum as well as by truncating the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present at hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
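
    A minimal sketch (not the authors' pipeline) of the component-number optimization step mentioned above: the number of PLSR components is chosen by cross-validated prediction error, here on synthetic stand-in spectra.

```python
# Hedged sketch: choose the number of PLSR components by cross-validation.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))                 # stand-in for LIBS spectra
y = X[:, :5] @ np.array([2.0, -1.0, 0.5, 1.5, -0.7]) + rng.normal(scale=0.1, size=60)

errors = []
for n_comp in range(1, 11):
    pls = PLSRegression(n_components=n_comp)
    rmse = -cross_val_score(pls, X, y, cv=5,
                            scoring="neg_root_mean_squared_error").mean()
    errors.append(rmse)

best = int(np.argmin(errors)) + 1
print("optimal number of components:", best)
```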

  9. Precise Model Analysis for 3-phase High Power Converter using the Harmonic State Space Modeling

    Kwon, Jun Bum; Wang, Xiongfei; Blaabjerg, Frede

    2015-01-01

    This paper presents a generalized multi-frequency modeling and analysis methodology, which can be used in control loop design and stability analysis. In terms of the switching frequency of a high power converter, there can be harmonics interruption if the voltage source converter has a low switching frequency ratio or a multi-sampling frequency. The range of the control bandwidth can then include the switching component, and the system becomes unstable. This paper applies the Harmonic State Space (HSS) modeling method in order to find the transfer function for each harmonic term...

  10. Sternal instability measured with radiostereometric analysis. A study of method feasibility, accuracy and precision

    Vestergaard, Rikke Falsig; Søballe, Kjeld; Hasenkam, John Michael

    2018-01-01

    BACKGROUND: A small, but unstable, saw-gap may hinder bone-bridging and induce development of painful sternal dehiscence. We propose the use of Radiostereometric Analysis (RSA) for evaluation of sternal instability and present a method validation. METHODS: Four bone analogs (phantoms) were sterno...... modality feasible for clinical evaluation of sternal stability in research. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT02738437, retrospectively registered.

  11. Analysis of 14C and 13C in teeth provides precise birth dating and clues to geographical origin.

    Alkass, K; Buchholz, B A; Druid, H; Spalding, K L

    2011-06-15

    The identification of human bodies in situations when there are no clues as to the person's identity from circumstantial data, poses a difficult problem to the investigators. The determination of age and sex of the body can be crucial in order to limit the search to individuals that are a possible match. We analyzed the proportion of bomb pulse derived carbon-14 ((14)C) incorporated in the enamel of teeth from individuals from different geographical locations. The 'bomb pulse' refers to a significant increase in (14)C levels in the atmosphere caused by above ground test detonations of nuclear weapons during the cold war (1955-1963). By comparing (14)C levels in enamel with (14)C atmospheric levels systematically recorded over time, high precision birth dating of modern biological material is possible. Above ground nuclear bomb testing was largely restricted to a couple of locations in the northern hemisphere, producing differences in atmospheric (14)C levels at various geographical regions, particularly in the early phase. Therefore, we examined the precision of (14)C birth dating of enamel as a function of time of formation and geographical location. We also investigated the use of the stable isotope (13)C as an indicator of geographical origin of an individual. Dental enamel was isolated from 95 teeth extracted from 84 individuals to study the precision of the (14)C method along the bomb spike. For teeth formed before 1955 (N=17), all but one tooth showed negative Δ(14)C values. Analysis of enamel from teeth formed during the rising part of the bomb-spike (1955-1963, N=12) and after the peak (>1963, N=66) resulted in an average absolute date of birth estimation error of 1.9±1.4 and 1.3±1.0 years, respectively. Geographical location of an individual had no adverse effect on the precision of year of birth estimation using radiocarbon dating. In 46 teeth, measurement of (13)C was also performed. Scandinavian teeth showed a substantially greater depression in
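
    A hedged sketch of the matching step described above: a measured enamel Δ14C value is placed on the falling limb of an atmospheric bomb-pulse record by interpolation. The atmospheric values below are an illustrative toy curve, not a real calibration dataset, and real applications must also resolve the rising/falling limb ambiguity.

```python
# Hedged sketch of bomb-pulse matching on a toy atmospheric Delta-14C curve.
import numpy as np

years = np.arange(1965, 2011)                        # falling limb only (monotonic)
atm_d14c = 800.0 * np.exp(-(years - 1964) / 16.0)    # toy exponential decline [permil]

def year_of_formation(measured_d14c):
    # np.interp needs increasing x, so interpolate on the reversed series.
    return float(np.interp(measured_d14c, atm_d14c[::-1], years[::-1]))

print(round(year_of_formation(150.0), 1))            # estimated enamel formation year
```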

  12. X-ray fluorescence analysis of archaeological finds and art objects: Recognizing gold and gilding

    Trojek, Tomáš; Hložek, Martin

    2012-01-01

    Many cultural heritage objects were gilded in the past, and nowadays they can be found in archeological excavations or in historical buildings dating back to the Middle Ages, or from the modern period. Old gilded artifacts have been studied using X-ray fluorescence analysis and 2D microanalysis. Several techniques that enable the user to distinguish gold and gilded objects are described and then applied to investigate artifacts. These techniques differ in instrumentation, data analysis and numbers of measurements. The application of Monte Carlo calculation to a quantitative analysis of gilded objects is also introduced. - Highlights: ► Three techniques of gilding identification with XRF analysis are proposed. ► These techniques are applied to gold and gilded art and archeological objects. ► Composition of a substrate material is determined by a Monte Carlo simulation.

  13. Precision translator

    Reedy, Robert P.; Crawford, Daniel W.

    1984-01-01

    A precision translator for focusing a beam of light on the end of a glass fiber which includes two tuning-fork-like members rigidly connected to each other. These members each have two prongs, with their separation adjusted by a screw, thereby adjusting the orthogonal positioning of a glass fiber attached to one of the members. This translator is made of simple parts and is capable of holding its adjustment even under rough handling.

  14. Precision Cosmology

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  15. Very high precision and accuracy analysis of triple isotopic ratios of water. A critical instrumentation comparison study.

    Gkinis, Vasileios; Holme, Christian; Morris, Valerie; Thayer, Abigail Grace; Vaughn, Bruce; Kjaer, Helle Astrid; Vallelonga, Paul; Simonsen, Marius; Jensen, Camilla Marie; Svensson, Anders; Maffrezzoli, Niccolo; Vinther, Bo; Dallmayr, Remi

    2017-04-01

    We present a performance comparison study between two state-of-the-art cavity ring-down spectrometers (Picarro L2130-i, L2140-i). The comparison took place during the Continuous Flow Analysis (CFA) campaign for the measurement of the Renland ice core, over a period of three months. Instant and complete vaporisation of the ice core melt stream, as well as of in-house water reference materials, is achieved by accurate control of microflows of liquid into a homemade calibration system, following simple principles of the Hagen-Poiseuille law. Both instruments share the same vaporisation unit in a configuration that minimises sample preparation discrepancies between the two analyses. We describe our SMOW-SLAP calibration and measurement protocols for such a CFA application and present quality control metrics acquired daily during the full period of the campaign. The results indicate an unprecedented performance for all three isotopic ratios (δ2H, δ17O, δ18O) in terms of precision, accuracy and resolution. We also comment on the precision and accuracy of the second-order excess parameters of HD16O and H217O over H218O (Dxs, Δ17O). To our knowledge these are the first reported CFA measurements at this level of precision and accuracy for all three isotopic ratios. Differences in the performance of the two instruments are carefully assessed during the measurement and reported here. Our quality control protocols extend to the area of low water mixing ratios, a regime in which atmospheric vapour measurements often take place and in which cavity ring-down analysers show poorer performance due to lower signal-to-noise ratios. We address such issues and propose calibration protocols from which water vapour isotopic analyses can benefit.
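
    A minimal sketch of a two-point SMOW-SLAP style normalisation of measured delta values, the kind of calibration protocol referred to above; the reference assignments and measured values are hypothetical, not those used in the study.

```python
# Hedged sketch of a two-point delta-scale normalisation using working standards.
import numpy as np

def two_point_calibration(measured_refs, assigned_refs):
    """Return slope/intercept mapping measured deltas onto the reference scale."""
    slope, intercept = np.polyfit(measured_refs, assigned_refs, 1)
    return slope, intercept

# Hypothetical measured vs assigned d18O values [permil] of two working standards.
slope, intercept = two_point_calibration([-20.1, -39.7], [-19.8, -40.0])
sample_measured = -27.3
print(slope * sample_measured + intercept)   # calibrated d18O of a sample
```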

  16. Global analysis of general SU(2) x SU(2) x U(1) models with precision data

    Hsieh, Ken; Yu, Jiang-Hao; Yuan, C.P. [Michigan State Univ., East Lansing, MI (United States). Dept. of Physics and Astronomy; Schmitz, Kai [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Michigan State Univ., East Lansing, MI (United States). Dept. of Physics and Astronomy

    2010-05-15

    We present the results of a global analysis of a class of models with an extended electroweak gauge group of the form SU(2) x SU(2) x U(1), often denoted as G(221) models, which include as examples the left-right, the lepto-phobic, the hadro-phobic, the fermio-phobic, the un-unified, and the non-universal models. Using an effective Lagrangian approach, we compute the shifts to the coefficients in the electroweak Lagrangian due to the new heavy gauge bosons, and obtain the lower bounds on the masses of the Z' and W' bosons. The analysis of the electroweak parameter bounds reveals a consistent pattern of several key observables that are especially sensitive to the effects of new physics and thus dominate the overall shape of the respective parameter contours. (orig.)

  17. Global analysis of general SU(2) x SU(2) x U(1) models with precision data

    Hsieh, Ken; Yu, Jiang-Hao; Yuan, C.P.; Schmitz, Kai; Michigan State Univ., East Lansing, MI

    2010-05-01

    We present the results of a global analysis of a class of models with an extended electroweak gauge group of the form SU(2) x SU(2) x U(1), often denoted as G(221) models, which include as examples the left-right, the lepto-phobic, the hadro-phobic, the fermio-phobic, the un-unified, and the non-universal models. Using an effective Lagrangian approach, we compute the shifts to the coefficients in the electroweak Lagrangian due to the new heavy gauge bosons, and obtain the lower bounds on the masses of the Z' and W' bosons. The analysis of the electroweak parameter bounds reveals a consistent pattern of several key observables that are especially sensitive to the effects of new physics and thus dominate the overall shape of the respective parameter contours. (orig.)

  18. Precise estimation of HPHT nanodiamond size distribution based on transmission electron microscopy image analysis

    Řehoř, Ivan; Cígler, Petr

    2014-01-01

    Vol. 46, Jun (2014), pp. 21-24. ISSN 0925-9635. R&D Projects: GA ČR GAP108/12/0640; GA MŠk(CZ) LH11027. Grant - others: OPPK(CZ) CZ.2.16/3.1.00/24016. Institutional support: RVO:61388963. Keywords: TEM * nanoparticles * nanodiamonds * size distribution * high-pressure high-temperature * image analysis. Subject RIV: CC - Organic Chemistry. Impact factor: 1.919, year: 2014

  19. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the Business Process Model and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesize workflows by composing predefined subprocesses, in order to achieve a configuration that is optimal for parameters of interest.

  20. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  1. An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking

    Jože Guna

    2014-02-01

    Full Text Available We present the results of an evaluation of the performance of the Leap Motion Controller with the aid of a professional, high-precision, fast motion tracking system. A set of static and dynamic measurements was performed with different numbers of tracking objects and configurations. For the static measurements, a plastic arm model simulating a human arm was used. A set of 37 reference locations was selected to cover the controller’s sensory space. For the dynamic measurements, a special V-shaped tool, consisting of two tracking objects maintaining a constant distance between them, was created to simulate two human fingers. In the static scenario, the standard deviation was less than 0.5 mm. The linear correlation revealed a significant increase in the standard deviation when moving away from the controller. The results of the dynamic scenario revealed the inconsistent performance of the controller, with a significant drop in accuracy for samples taken more than 250 mm above the controller’s surface. The Leap Motion Controller undoubtedly represents a revolutionary input device for gesture-based human-computer interaction; however, due to its rather limited sensory space and inconsistent sampling frequency, in its current configuration it cannot currently be used as a professional tracking system.

  2. An analysis of the precision and reliability of the leap motion sensor and its suitability for static and dynamic tracking.

    Guna, Jože; Jakus, Grega; Pogačnik, Matevž; Tomažič, Sašo; Sodnik, Jaka

    2014-02-21

    We present the results of an evaluation of the performance of the Leap Motion Controller with the aid of a professional, high-precision, fast motion tracking system. A set of static and dynamic measurements was performed with different numbers of tracking objects and configurations. For the static measurements, a plastic arm model simulating a human arm was used. A set of 37 reference locations was selected to cover the controller's sensory space. For the dynamic measurements, a special V-shaped tool, consisting of two tracking objects maintaining a constant distance between them, was created to simulate two human fingers. In the static scenario, the standard deviation was less than 0.5 mm. The linear correlation revealed a significant increase in the standard deviation when moving away from the controller. The results of the dynamic scenario revealed the inconsistent performance of the controller, with a significant drop in accuracy for samples taken more than 250 mm above the controller's surface. The Leap Motion Controller undoubtedly represents a revolutionary input device for gesture-based human-computer interaction; however, due to its rather limited sensory space and inconsistent sampling frequency, in its current configuration it cannot currently be used as a professional tracking system.

  3. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    Acquiring attitude information for unresolved space objects, such as micro-nano satellites and GEO objects, by means of ground-based optical observations is a challenge for space surveillance. In this paper, a method is proposed to estimate the space object (SO) attitude state based on simulation analysis of photometric data in different attitude states. The object shape model was established and the parameters of the BRDF model were determined; the space object photometric model was then established. Furthermore, the photometric data of space objects in different states are analyzed by simulation and the regular characteristics of the photometric curves are summarized. The simulation results show that the photometric characteristics are useful for attitude inversion in a unique way. Thus, this paper provides a new approach to space object identification.

  4. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model lowers computational efficiency, which is not suitable for high-frequency (e.g., 1 Hz) real-time GNSS clock estimation. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model to realize high-frequency updating of multi-GNSS real-time clocks, and a rigorous comparison and analysis under the same conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock-bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive accuracy measure of orbit and clock that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. The statistical analysis of the real-time augmentation message SISRE is about 4-7 cm for GPS, while 10 cm for BeiDou IGSO/MEO, Galileo and about 30 cm
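
    A hedged sketch of a signal-in-space ranging error (SISRE) computation combining orbit and clock errors, using weight factors commonly quoted for GPS-like MEO satellites; the exact weights are constellation-dependent and are an assumption here, not taken from the paper.

```python
# Hedged sketch: SISRE from radial/along/cross orbit errors and a clock error,
# with assumed GPS-like MEO weights (w_R ~ 0.98, w_AC^2 ~ 1/49).
import math

def sisre(radial_m, along_m, cross_m, clock_m, w_r=0.98, w_ac2=1.0 / 49.0):
    return math.sqrt((w_r * radial_m - clock_m) ** 2
                     + w_ac2 * (along_m ** 2 + cross_m ** 2))

# Example: 3 cm radial, 10 cm along-track, 8 cm cross-track, 2 cm clock error.
print(round(sisre(0.03, 0.10, 0.08, 0.02), 4), "m")
```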

  5. Precise and fast spatial-frequency analysis using the iterative local Fourier transform.

    Lee, Sukmock; Choi, Heejoo; Kim, Dae Wook

    2016-09-19

    The use of the discrete Fourier transform has decreased since the introduction of the fast Fourier transform (fFT), which is a numerically efficient computing process. This paper presents the iterative local Fourier transform (ilFT), a set of new processing algorithms that iteratively apply the discrete Fourier transform within a local and optimal frequency domain. The new technique achieves 2^10 times higher frequency resolution than the fFT within a comparable computation time. The method's superb computing efficiency, high resolution, spectrum zoom-in capability, and overall performance are evaluated and compared to other advanced high-resolution Fourier transform techniques, such as the fFT combined with several fitting methods. The effectiveness of the ilFT is demonstrated through the data analysis of a set of Talbot self-images (1280 × 1024 pixels) obtained with an experimental setup using grating in a diverging beam produced by a coherent point source.
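
    A hedged sketch of the zoom-in idea behind an iterative local Fourier transform: start from the coarse FFT peak and repeatedly re-evaluate the discrete-time Fourier transform on a finer frequency grid centred on the current peak estimate. This is a generic illustration of local spectral refinement, not the authors' ilFT algorithm.

```python
# Hedged sketch: iterative local refinement of a spectral peak estimate.
import numpy as np

def local_dtft(x, freqs, fs):
    n = np.arange(len(x))
    # Direct evaluation of the DTFT at arbitrary frequencies (Hz).
    return np.array([np.sum(x * np.exp(-2j * np.pi * f * n / fs)) for f in freqs])

fs = 1000.0
t = np.arange(2048) / fs
x = np.sin(2 * np.pi * 123.456 * t)

# Coarse estimate from the ordinary FFT.
spectrum = np.fft.rfft(x)
f_grid = np.fft.rfftfreq(len(x), 1 / fs)
f_est, span = f_grid[np.argmax(np.abs(spectrum))], f_grid[1]

for _ in range(6):                       # each pass zooms the local grid by 10x
    local = np.linspace(f_est - span, f_est + span, 41)
    f_est = local[np.argmax(np.abs(local_dtft(x, local, fs)))]
    span /= 10.0

print(round(f_est, 4))                   # converges near 123.456 Hz
```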

  6. Precision Airdrop (Largage de precision)

    2005-12-01

    ...the point from various compass headings. As the tests are conducted, the resultant...rate. This approach avoids including a magnetic compass for the heading reference, which has difficulties due to local changes in the magnetic field...

  7. Extending Track Analysis from Animals in the Lab to Moving Objects Anywhere

    Dommelen, W. van; Laar, P.J.L.J. van de; Noldus, L.P.J.J.

    2013-01-01

    In this chapter we compare two application domains in which the tracking of objects and the analysis of their movements are core activities, viz. animal tracking and vessel tracking. More specifically, we investigate whether EthoVision XT, a research tool for video tracking and analysis of the

  8. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  9. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.

  10. SEGMENT OF FINANCIAL CORPORATIONS AS AN OBJECT OF FINANCIAL AND STATISTICAL ANALYSIS

    Marat F. Mazitov

    2013-01-01

    The article is devoted to the study of specific features of the formation and change of economic assets of financial corporations as an object of management and financial analysis. The author identifies the features and gives a classification of institutional units belonging to the sector of financial corporations from the viewpoint of assessment and financial analysis of the flows reflecting changes in their assets.

  11. The economic case for precision medicine.

    Gavan, Sean P; Thompson, Alexander J; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is in meeting the requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine, by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges when demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case of a precision medicine, can be prioritized by the extent to which they reduce the uncertainty expressed by decision-makers.

  12. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    Hoelttae, P.; Rosenberg, R.J.

    1987-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results are compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are also presented. (author) 3 refs.; 4 tabs

  13. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    Hoelttae, P.; Rosenberg, R.J.

    1986-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results are compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are presented. (author)

  14. Research for developing precise tsunami evaluation methods. Probabilistic tsunami hazard analysis/numerical simulation method with dispersion and wave breaking

    2007-01-01

    The present report introduces the main results of investigations on precise tsunami evaluation methods, which were carried out from the viewpoint of safety evaluation for nuclear power facilities and deliberated by the Tsunami Evaluation Subcommittee. A framework for probabilistic tsunami hazard analysis (PTHA) based on a logic tree is proposed, and a calculation for the Pacific side of northeastern Japan is performed as a case study. Tsunami motions with dispersion and wave breaking were investigated both experimentally and numerically. The numerical simulation method is verified for its practicability by applying it to a historical tsunami. Tsunami force is also investigated, and formulae for tsunami pressure acting on breakwaters and on buildings due to inundating tsunami are proposed. (author)

  15. Isolation and genetic analysis of pure cells from forensic biological mixtures: The precision of a digital approach.

    Fontana, F; Rapone, C; Bregola, G; Aversa, R; de Meo, A; Signorini, G; Sergio, M; Ferrarini, A; Lanzellotto, R; Medoro, G; Giorgini, G; Manaresi, N; Berti, A

    2017-07-01

    The latest genotyping technologies make it possible to achieve a reliable genetic profile for offender identification even from extremely minute biological evidence. The ultimate challenge occurs when genetic profiles need to be retrieved from a mixture, which is composed of biological material from two or more individuals. In this case, DNA profiling will often result in a complex genetic profile, which is then the subject of statistical analysis. In principle, when several individuals contribute to a mixture with different biological fluids, their single genetic profiles can be obtained by separating the distinct cell types (e.g. epithelial cells, blood cells, sperm) prior to genotyping. Different approaches have been investigated for this purpose, such as fluorescence-activated cell sorting (FACS) or laser capture microdissection (LCM), but currently none of these methods can guarantee the complete separation of the different types of cells present in a mixture. In other fields of application, such as oncology, DEPArray™ technology, an image-based, microfluidic digital sorter, has been widely proven to enable the separation of pure cells with single-cell precision. This study investigates the applicability of DEPArray™ technology to forensic sample analysis, focusing on the resolution of the forensic mixture problem. For the first time, we report here the development of an application-specific DEPArray™ workflow enabling the detection and recovery of pure homogeneous cell pools from simulated blood/saliva and semen/saliva mixtures, providing full genetic match with genetic profiles of corresponding donors. In addition, we assess the performance of standard forensic methods for DNA quantitation and genotyping on low-count, DEPArray™-isolated cells, showing that pure, almost complete profiles can be obtained from as few as ten haploid cells. Finally, we explore the applicability in real casework samples, demonstrating that the described approach provides complete

  16. A functional analysis of photo-object matching skills of severely retarded adolescents.

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photos and the objects. Only one student demonstrated photo-object matching. The results of the four students who failed to demonstrate photo-object matching suggested that physical properties of photos (flat, rectangular) and depth dimensions of objects may exert more control over matching than the similarities of the objects and images within the photos. An analysis of figure-ground variables was conducted to provide an empirical basis for program development in the use of pictures. In one series of tests, rectangular shape and background were removed by cutting out the figures in the photos. The edge shape of the photo and the edge shape of the image were then identical. The results suggest that photo-object matching may be facilitated by using cut-out figures rather than the complete rectangular photo.

  17. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    2017-02-01

    Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed...prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks. The framework integrates multiple image processing...

  18. The Making of Paranormal Belief: History, Discourse Analysis and the Object of Belief

    White, Lewis

    2013-01-01

    The present study comprises a discursive analysis of a cognitive phenomenon, paranormal beliefs. A discursive psychological approach to belief highlights that an important component of the cognitivist work has been how the object of paranormal belief has been defined in formal study. Using discourse analysis, as developed as a method in the history of psychology, this problem is explored through analysis of published scales. The findings highlight three rhetorical themes that are deployed in ...

  19. Using Epistemic Network Analysis to understand core topics as planned learning objectives

    Allsopp, Benjamin Brink; Dreyøe, Jonas; Misfeldt, Morten

    Epistemic Network Analysis is a tool developed by the epistemic games group at the University of Wisconsin-Madison for tracking the relations between concepts in students' discourse (Shaffer 2017). In our current work we are applying this tool to learning objectives in teachers' digital preparation. The Danish mathematics curriculum is organised in six competencies and three topics. In the recently implemented learning platforms, teachers choose which of the mathematical competencies serves as the objective for a specific lesson or teaching sequence. Hence, learning objectives for lessons and teaching sequences define a network of competencies, where two competencies are closely related if they often are part of the same learning objective or teaching sequence. We are currently using Epistemic Network Analysis to study these networks. In the poster we will include examples of different networks...
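
    A minimal sketch of the competency co-occurrence structure described above: two competencies are linked more strongly the more often they appear together in the same lesson's learning objectives. The competency labels and lessons are invented for illustration, and the sketch is not the Epistemic Network Analysis tool itself.

```python
# Hedged sketch: count how often pairs of competencies co-occur in lessons.
from collections import Counter
from itertools import combinations

lessons = [                                  # hypothetical lesson objectives
    {"modelling", "reasoning"},
    {"modelling", "representation"},
    {"reasoning", "modelling"},
    {"communication"},
]

edge_weights = Counter()
for objectives in lessons:
    for pair in combinations(sorted(objectives), 2):
        edge_weights[pair] += 1

print(edge_weights.most_common())            # strongest tie: modelling-reasoning
```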

  20. An Analysis on Usage Preferences of Learning Objects and Learning Object Repositories among Pre-Service Teachers

    Yeni, Sabiha; Ozdener, Nesrin

    2014-01-01

    The purpose of the study is to investigate how pre-service teachers benefit from learning objects repositories while preparing course content. Qualitative and quantitative data collection methods were used in a mixed methods approach. This study was carried out with 74 teachers from the Faculty of Education. In the first phase of the study,…

  1. GPR Detection of Buried Symmetrically Shaped Mine-like Objects using Selective Independent Component Analysis

    Karlsen, Brian; Sørensen, Helge Bjarup Dissing; Larsen, Jan

    2003-01-01

    This paper addresses the detection of mine-like objects in stepped-frequency ground penetrating radar (SF-GPR) data as a function of object size, object content, and burial depth. The detection approach is based on a Selective Independent Component Analysis (SICA). SICA provides an automatic ranking of components, which enables the suppression of clutter, hence extraction of components carrying mine information. The goal of the investigation is to evaluate various time and frequency domain ICA approaches based on SICA. Performance comparison is based on a series of mine-like objects, ranging from small-scale anti-personal (AP) mines to large-scale anti-tank (AT) mines, that were designed for the study. Large-scale SF-GPR measurements on this series of mine-like objects buried in soil were performed. The SF-GPR data was acquired using a wideband monostatic bow-tie antenna operating in the frequency range 750...
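
    A hedged sketch of an ICA-based clutter-suppression step for a GPR B-scan, in the spirit of the approach above but not the authors' SICA ranking: FastICA decomposes the traces into independent components and only the most non-Gaussian (highest-kurtosis) components are kept for reconstruction; the data and the selection rule are illustrative assumptions.

```python
# Hedged sketch: ICA decomposition of a synthetic B-scan and component selection.
import numpy as np
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_traces, n_samples = 64, 256
clutter = np.outer(np.ones(n_traces), np.sin(np.linspace(0, 20, n_samples)))
target = np.zeros((n_traces, n_samples))
target[28:36, 100:120] = 1.0                       # compact "mine-like" response
bscan = clutter + target + 0.1 * rng.normal(size=(n_traces, n_samples))

ica = FastICA(n_components=8, random_state=0)
sources = ica.fit_transform(bscan.T)               # components along the time axis
keep = np.argsort(kurtosis(sources, axis=0))[-2:]  # most non-Gaussian components
mask = np.zeros(sources.shape[1], dtype=bool)
mask[keep] = True
reconstructed = ica.inverse_transform(sources * mask).T
print(reconstructed.shape)
```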

  2. 3D object-oriented image analysis in 3D geophysical modelling

    Fadel, I.; van der Meijde, M.; Kerle, N.

    2015-01-01

    Non-uniqueness of satellite gravity interpretation has traditionally been reduced by using a priori information from seismic tomography models. This reduction in the non-uniqueness has been based on velocity-density conversion formulas or user interpretation of the 3D subsurface structures (objects) based on the seismic tomography models and then forward modelling these objects. However, this form of object-based approach has been done without a standardized methodology on how to extract the subsurface structures from the 3D models. In this research, a 3D object-oriented image analysis (3D OOA) approach was implemented to extract the 3D subsurface structures from geophysical data. The approach was applied on a 3D shear wave seismic tomography model of the central part of the East African Rift System. Subsequently, the extracted 3D objects from the tomography model were reconstructed in the 3D...

  3. Precision digital control systems

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  4. Feasibility analysis of CNP 1000 computerized I and C system design objectives

    Zhang Mingguang; Xu Jijun; Zhang Qinshen

    2000-01-01

    The author states the design objectives of the computerized I and C (CIC) system and the advanced main control room (AMCR), which could and should be achieved in CNP 1000, based on national 1E computer production technology, including software and hardware, and on current instrumentation and control design techniques for nuclear power plants. The feasibility analysis of the design objectives and the reasons for, and necessity of, the design research projects are described. The objectives of design research on CIC and AMCR, as well as the self-design proficiency expected after the design research, are given

  5. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Jiangtao HUANG

    2017-08-01

    Full Text Available Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, which aims to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, the physical reasonableness and the reliability of PCA in dealing with redundant objectives are verified by the typical DTLZ5 test function and a multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into an aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and that the multi-point design requirements of the passenger aircraft are reached. The visualization level of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary feature of the problem.
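
    A minimal sketch of using PCA to flag redundant objectives before an optimization run, the role PCA plays in the method above: objectives that load almost entirely on the same principal component as another objective are candidates for removal or conversion to constraints. The objective samples are synthetic.

```python
# Hedged sketch: PCA on a matrix of sampled objective values to expose redundancy.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
f1 = rng.normal(size=200)
f2 = rng.normal(size=200)
f3 = 0.95 * f1 + 0.05 * rng.normal(size=200)   # nearly redundant with f1
objectives = np.column_stack([f1, f2, f3])

pca = PCA()
pca.fit(objectives)
print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
print("loadings of PC1:", np.round(pca.components_[0], 3))
```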

  6. Error analysis of motion correction method for laser scanning of moving objects

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned should be static. The need to scan moving objects has resulted in the development of new methods capable of generating correct 3D geometry of moving objects. Limited literature is available, describing only a few methods capable of addressing the problem of object motion during scanning. All the existing methods utilize their own models or sensors. Studies on error modelling or analysis of any of the motion correction methods are lacking in the literature. In this paper, we develop the error budget and present the analysis of one such `motion correction' method. This method assumes availability of position and orientation information of the moving object, which in general can be obtained by installing a POS system on board or by use of some tracking devices. It then uses this information along with laser scanner data to apply correction to the laser data, thus resulting in correct geometry despite the object being mobile during scanning. The major application of this method lies in the shipping industry, to scan ships either moving or parked in the sea, and to scan other objects like hot air balloons or aerostats. It is to be noted that the other methods of "motion correction" explained in the literature cannot be applied to scan the objects mentioned here, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the "motion correction" method as well as a detailed account of the behavior and variation of the error due to different sensor components alone and in combination with each other. The analysis can be used to obtain insights into the optimal utilization of available components for achieving the best results.

  7. Topological situational analysis and synthesis of strategies of object management in the conditions of conflict, uncertainty of behaviour and varible amount of the observed objects

    Віктор Володимирович Семко

    2016-09-01

    Full Text Available The conflict of interacting objects in the observation space is considered as an integral phenomenon with a certain variety of types of connections between its elements, objects, systems and the environment, brought together into a single theoretical conception that comprehensively and deeply determines the real features of the object of research. The methodology of system-structural analysis of the conflict is used to study the phenomenon as a whole, and system-functional analysis is used to determine all of its basic interconnections with the environment

  8. Real-time analysis of δ13C- and δD-CH4 by high precision laser spectroscopy

    Eyer, Simon; Emmenegger, Lukas; Tuzson, Béla; Fischer, Hubertus; Mohn, Joachim

    2014-05-01

    Methane (CH4) is the most important non-CO2 greenhouse gas (GHG), contributing 18% to total radiative forcing. Anthropogenic sources (e.g. ruminants, landfills) contribute 60% to total emissions and have led to an increase in its atmospheric mixing ratio from 700 ppb in pre-industrial times to 1819 ± 1 ppb in 2012 [1]. Analysis of the most abundant methane isotopologues 12CH4, 13CH4 and 12CH3D can be used to disentangle the various source/sink processes [2] and to develop target-oriented reduction strategies. High precision isotopic analysis of CH4 can be accomplished by isotope-ratio mass-spectrometry (IRMS) [2] and more recently by mid-infrared laser-based spectroscopic techniques. For high precision measurements in ambient air, however, both techniques rely on preconcentration of the target gas [3]. In an on-going project, we developed a fully-automated, field-deployable CH4 preconcentration unit coupled to a dual quantum cascade laser absorption spectrometer (QCLAS) for real-time analysis of CH4 isotopologues. The core part of the rack-mounted (19 inch) device is a highly-efficient adsorbent trap attached to a motorized linear drive system and enclosed in a vacuum chamber. Thereby, the adsorbent trap can be decoupled from the Stirling cooler during desorption for fast desorption and optimal heat management. A wide variety of adsorbents, including HayeSep D, molecular sieves, as well as novel metal-organic frameworks and carbon nanotubes, were characterized regarding their surface area, isosteric enthalpy of adsorption and selectivity for methane over nitrogen. The most promising candidates were tested on the preconcentration device and a preconcentration by a factor > 500 was obtained. Furthermore, analytical interferants (e.g. N2O, CO2) are separated by step-wise desorption of trace gases. A QCL absorption spectrometer previously described by Tuzson et al. (2010) for CH4 flux measurements was modified to obtain a platform for high precision and simultaneous
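
    A minimal sketch of the Hagen-Poiseuille relation mentioned above, Q = πr⁴ΔP/(8μL), as used to set a liquid microflow through a capillary; the capillary dimensions and driving pressure below are illustrative, not those of the calibration system.

```python
# Hedged sketch: volumetric flow through a capillary from the Hagen-Poiseuille law.
import math

def poiseuille_flow(radius_m, length_m, delta_p_pa, viscosity_pa_s):
    return math.pi * radius_m ** 4 * delta_p_pa / (8.0 * viscosity_pa_s * length_m)

q = poiseuille_flow(radius_m=25e-6, length_m=0.5, delta_p_pa=2.0e5,
                    viscosity_pa_s=1.0e-3)       # water at ~20 degC
print(q * 1e9 * 60, "microlitres per minute")    # convert m^3/s to uL/min
```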

  9. Exergoeconomic multi objective optimization and sensitivity analysis of a regenerative Brayton cycle

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2016-01-01

    Highlights: • Finite-time exergoeconomic multi-objective optimization of a Brayton cycle. • Comparing the exergoeconomic and the ecological function optimization results. • Inserting the cost of fluid streams concept into finite-time thermodynamics. • Exergoeconomic sensitivity analysis of a regenerative Brayton cycle. • Suggesting the drawing and utilization of the cycle performance curve. - Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power maximization and then exergoeconomic optimization, using the finite-time thermodynamic concept and finite-size components. Optimizations are performed using a genetic algorithm. In order to take the finite-time and finite-size concepts into account in the current problem, a dimensionless mass-flow parameter is used, deploying time variations. The decision variables for the optimum state (of the multi-objective exergoeconomic optimization) are compared to those of the maximum power state. One can see that the multi-objective exergoeconomic optimization results in a better performance than that obtained at the maximum power state. The results demonstrate that system performance at the optimum point of the multi-objective optimization yields 71% of the maximum power, but with an exergy destruction of only 24% of that at the maximum power state and a 67% lower total cost rate than that of the maximum power state. In order to assess the impact of the variation of the decision variables on the objective functions, a sensitivity analysis is conducted. Finally, drawing of the cycle performance curve according to the exergoeconomic multi-objective optimization results, and its utilization, are suggested.

  10. Worst-case execution time analysis-driven object cache design

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real-time systems need a time-predictable computing platform to enable static worst-case execution time (WCET) analysis. All performance-enhancing features need to be WCET analyzable. However, standard data caches containing heap-allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis-friendly design. Aiming for a time-predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field...

  11. Automated quantification and sizing of unbranched filamentous cyanobacteria by model based object oriented image analysis

    Zeder, M; Van den Wyngaert, S; Köster, O; Felder, K M; Pernthaler, J

    2010-01-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-...

  12. ART OF METALLOGRAPHY: POSSIBILITIES OF DARK-FIELD MICROSCOPY APPLICATION FOR COLORED OBJECTS STRUCTURE ANALYSIS

    A. G. Anisovich

    2015-01-01

    The application of the dark-field microscopy method to the study of colored objects in materials technology was researched. The capability of corrosion damage analysis and of determining the thickness of a metal coating was demonstrated. The performance of «reflection» analysis in the dark field was tested in the study of non-metallic materials – orthopedic implants and fireclay refractory. An example of defect detection in a carbon coating is displayed.

  13. Featureous: infrastructure for feature-centric analysis of object-oriented software

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for the addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective...

  14. Context-based object-of-interest detection for a generic traffic surveillance analysis system

    Bao, X.; Javanbakhti, S.; Zinger, S.; Wijnhoven, R.G.J.; With, de P.H.N.

    2014-01-01

    We present a new traffic surveillance video analysis system, focusing on building a framework with robust and generic techniques, based on both scene understanding and moving object-of-interest detection. Since traffic surveillance is widely applied, we want to design a single system that can be

  15. Analysis of Various Multi-Objective Optimization Evolutionary Algorithms for Monte Carlo Treatment Planning System

    Tydrichova, Magdalena

    2017-01-01

    In this project, various available multi-objective optimization evolutionary algorithms were compared with respect to their performance and the distribution of their solutions. The main goal was to select the most suitable algorithms for applications in cancer hadron therapy planning. For our purposes, a complex testing and analysis software was developed. Also, many conclusions and hypotheses were drawn for further research.

  16. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    Schuts, Emelie C.; Hulscher, Marlies E. J. L.; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W. T. Cohen; Overdiek, Hans W. P. M.; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M. P. M.; Wolfs, Tom F. W.; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes: clinical outcomes,

  17. Current evidence on hospital antimicrobial stewardship objectives : A systematic review and meta-analysis

    Schuts, Emelie C.; Hulscher, Marlies E J L; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W T Cohen; Overdiek, Hans W P M; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M P M; Wolfs, Tom F W; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  18. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    Schuts, E.C.; Hulscher, M.E.J.L.; Mouton, J.W.; Verduin, C.M.; Stuart, J.W.; Overdiek, H.W.; Linden, P.D. van der; Natsch, S.S.; Hertogh, C.M.; Wolfs, T.F.; Schouten, J.A.; Kullberg, B.J.; Prins, J.M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  19. OBJECTIVE EVALUATION OF HYPERACTIVATED MOTILITY IN RAT SPERMATOZOA USING COMPUTER-ASSISTED SPERM ANALYSIS (CASA)

    Objective evaluation of hyperactivated motility in rat spermatozoa using computer-assisted sperm analysis. Cancel AM, Lobdell D, Mendola P, Perreault SD. Toxicology Program, University of North Carolina, Chapel Hill, NC 27599, USA. The aim of this study was t...

  20. Analysis of micro computed tomography images; a look inside historic enamelled metal objects

    van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen

    2010-02-01

    In this study the usefulness of micro-Computed Tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually, investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or to analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). Obtaining virtual slices and information about the internal structure of these objects was made possible by CT analysis. With this technique the underlying metal work was studied without removing the decorative enamel layer. Moreover, visible defects such as cracks were measured in both width and depth, and previously invisible defects and weaker areas were visualised. All these features are of great interest to restorers and conservators as they allow a view inside these objects without so much as touching them.

  1. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis

    MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.

    2012-01-01

    Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack of precise canopy estimates has hindered quantification of these benefits in many municipalities. This problem was addressed for New York City using object-based image analysis (OBIA) to develop a comprehensive land-cover map, including tree canopy to the scale of individual trees. Mapping was performed using a rule-based expert system that relied primarily on high-resolution LIDAR, specifically its capacity for evaluating the height and texture of aboveground features. Multispectral imagery was also used, but shadowing and varying temporal conditions limited its utility. Contextual analysis was a key part of classification, distinguishing trees according to their physical and spectral properties as well as their relationships to adjacent, nonvegetated features. The automated product was extensively reviewed and edited via manual interpretation, and overall per-pixel accuracy of the final map was 96%. Although manual editing had only a marginal effect on accuracy despite requiring a majority of project effort, it maximized aesthetic quality and ensured the capture of small, isolated trees. Converting high-resolution LIDAR and imagery into usable information is a nontrivial exercise, requiring significant processing time and labor, but an expert system-based combination of OBIA and manual review was an effective method for fine-scale canopy mapping in a complex urban environment.

  2. Precise outage analysis of mixed RF/unified-FSO DF relaying with HD and 2 IM-DD channel models

    Al-Ebraheemy, Omer Mahmoud S.; Salhab, Anas M.; Chaaban, Anas; Zummo, Salam A.; Alouini, Mohamed-Slim

    2017-01-01

    This paper derives and analyzes the outage probability of a mixed radio frequency (RF)/unified free space optical (FSO) dual-hop decode-and-forward (DF) relaying scheme, where heterodyne detection (HD) and intensity modulation-direct detection (IM-DD) are considered for FSO detection. In doing so, we correctly utilize, for the first time to the best of our knowledge, a precise channel capacity result for the IM-DD channel. Moreover, this is the first time that not only the IM-DD input-independent but also the IM-DD cost-dependent AWGN channel is considered in such a system analysis. This work assumes that the first hop (RF link) follows Nakagami-m fading, while the second hop (FSO link) follows Málaga (M) turbulence with pointing error. These fading and turbulence models include other ones (such as Rayleigh fading and Gamma-Gamma turbulence) as special cases, so our analysis can be considered a generalized one from the point of view of both RF and FSO fading models. Additionally, the system outage probability is investigated asymptotically in the high signal-to-noise ratio (SNR) regime, where a new, previously unreported diversity order and coding gain analysis is presented. Interestingly, we find that in the FSO hop, based on SNR, the HD and the IM-DD cost-dependent models result in the same diversity order, which is twice that of the IM-DD input-independent model. However, based on transmitted power, all these FSO detectors result in the same diversity order. Furthermore, we offer simulation results which confirm the derived exact and asymptotic expressions.

  3. NiftyPET: a High-throughput Software Platform for High Quantitative Accuracy and Precision PET Imaging and Analysis.

    Markiewicz, Pawel J; Ehrhardt, Matthias J; Erlandsson, Kjell; Noonan, Philip J; Barnes, Anna; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Ourselin, Sebastien

    2018-01-01

    We present a standalone, scalable and high-throughput software platform for PET image reconstruction and analysis. We focus on high fidelity modelling of the acquisition processes to provide high accuracy and precision quantitative imaging, especially for large axial field of view scanners. All the core routines are implemented using parallel computing available from within the Python package NiftyPET, enabling easy access, manipulation and visualisation of data at any processing stage. The pipeline of the platform starts from MR and raw PET input data and is divided into the following processing stages: (1) list-mode data processing; (2) accurate attenuation coefficient map generation; (3) detector normalisation; (4) exact forward and back projection between sinogram and image space; (5) estimation of reduced-variance random events; (6) high accuracy fully 3D estimation of scatter events; (7) voxel-based partial volume correction; (8) region- and voxel-level image analysis. We demonstrate the advantages of this platform using an amyloid brain scan where all the processing is executed from a single and uniform computational environment in Python. The high accuracy acquisition modelling is achieved through span-1 (no axial compression) ray tracing for true, random and scatter events. Furthermore, the platform offers uncertainty estimation of any image derived statistic to facilitate robust tracking of subtle physiological changes in longitudinal studies. The platform also supports the development of new reconstruction and analysis algorithms through restricting the axial field of view to any set of rings covering a region of interest and thus performing fully 3D reconstruction and corrections using real data significantly faster. All the software is available as open source with the accompanying wiki-page and test data.

  4. Precise outage analysis of mixed RF/unified-FSO DF relaying with HD and 2 IM-DD channel models

    Al-Ebraheemy, Omer Mahmoud S.

    2017-07-20

    This paper derives and analyzes the outage probability of a mixed radio frequency (RF)/unified free space optical (FSO) dual-hop decode-and-forward (DF) relaying scheme, where heterodyne detection (HD) and intensity modulation-direct detection (IM-DD) are considered for FSO detection. In doing so, we correctly utilize, for the first time to the best of our knowledge, a precise channel capacity result for the IM-DD channel. Moreover, this is the first time that not only the IM-DD input-independent but also the IM-DD cost-dependent AWGN channel is considered in such a system analysis. This work assumes that the first hop (RF link) follows Nakagami-m fading, while the second hop (FSO link) follows Málaga (M) turbulence with pointing error. These fading and turbulence models include other ones (such as Rayleigh fading and Gamma-Gamma turbulence) as special cases, so our analysis can be considered a generalized one from the point of view of both RF and FSO fading models. Additionally, the system outage probability is investigated asymptotically in the high signal-to-noise ratio (SNR) regime, where a new, previously unreported diversity order and coding gain analysis is presented. Interestingly, we find that in the FSO hop, based on SNR, the HD and the IM-DD cost-dependent models result in the same diversity order, which is twice that of the IM-DD input-independent model. However, based on transmitted power, all these FSO detectors result in the same diversity order. Furthermore, we offer simulation results which confirm the derived exact and asymptotic expressions.

  5. Feasibility study for objective oriented design of system thermal hydraulic analysis program

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    The system safety analysis codes, such as RELAP5, TRAC, CATHARE etc., have been developed based on the Fortran language during the past few decades. Refactoring of conventional codes has also been performed to improve code readability and maintenance. However, the programming paradigm in software technology has changed to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing a modernized C language. The analysis, design, implementation and verification steps for OOP system code development are described with some implementation examples. The system code SYSTF, based on a three-fluid thermal hydraulic solver, has been developed by OOP design. The verification of feasibility is performed with simple fundamental problems and plant models. (author)
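
    The abstract names the core OOP techniques (encapsulation, modularity, polymorphism, inheritance) but not the SYSTF class design itself. As a generic, hedged illustration only (written in Python rather than the C-family language used by the authors; all class names are hypothetical), the sketch below shows how an abstract solver interface lets different fluid models be advanced polymorphically:

        from abc import ABC, abstractmethod
        from typing import List

        class FluidModel(ABC):
            """Abstract interface that concrete field-equation solvers implement."""

            @abstractmethod
            def solve_step(self, dt: float) -> None:
                ...

        class ThreeFluidModel(FluidModel):
            """Stand-in for a three-field (e.g. liquid/vapour/droplet) solver."""

            def solve_step(self, dt: float) -> None:
                print(f"three-fluid step, dt = {dt} s")

        class HomogeneousModel(FluidModel):
            """Stand-in for a simpler homogeneous-equilibrium solver."""

            def solve_step(self, dt: float) -> None:
                print(f"homogeneous-equilibrium step, dt = {dt} s")

        def advance(models: List[FluidModel], dt: float) -> None:
            # Polymorphic dispatch: the caller never needs to know the concrete type.
            for model in models:
                model.solve_step(dt)

        advance([ThreeFluidModel(), HomogeneousModel()], 0.01)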

  6. COMBO-FISH Enables High Precision Localization Microscopy as a Prerequisite for Nanostructure Analysis of Genome Loci

    Rainer Kaufmann

    2010-10-01

    With the completeness of genome databases, it has become possible to develop a novel FISH (Fluorescence in Situ Hybridization) technique called COMBO-FISH (COMBinatorial Oligo FISH). In contrast to other FISH techniques, COMBO-FISH makes use of a bioinformatics approach for probe set design. By means of computer genome database searching, several oligonucleotide stretches of typical lengths of 15–30 nucleotides are selected in such a way that all uniquely colocalize at the given genome target. The probes applied here were Peptide Nucleic Acids (PNAs) – synthetic DNA analogues with a neutral backbone – which were synthesized under high purity conditions. For a probe repetitively highlighted in centromere 9, PNAs labeled with different dyes were tested, among which Alexa 488® showed reversible photobleaching (blinking between dark and bright states), a prerequisite for the application of SPDM (Spectral Precision Distance/Position Determination Microscopy), a novel technique of high resolution fluorescence localization microscopy. Although COMBO-FISH labeled cell nuclei under SPDM conditions sometimes revealed fluorescent background, the specific locus was clearly discriminated by the signal intensity and the resulting localization accuracy in the range of 10–20 nm for a detected oligonucleotide stretch. The results indicate that COMBO-FISH probes with blinking dyes are well suited for SPDM, which will open new perspectives on molecular nanostructural analysis of the genome.

  7. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training dataset. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
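
    To make the role of the fuzzy operators concrete, the following minimal sketch (not the authors' implementation; the membership values and the 0.7 threshold are invented for illustration) combines per-object fuzzy memberships with the AND (minimum), OR (maximum) and arithmetic MEAN operators mentioned above:

        import numpy as np

        # Hypothetical fuzzy membership values for one image object,
        # one value per feature (e.g. slope, GLCM texture, spectral index).
        memberships = np.array([0.82, 0.64, 0.91])

        fuzzy_and = memberships.min()      # strict AND operator
        fuzzy_or = memberships.max()       # OR operator
        fuzzy_mean = memberships.mean()    # arithmetic MEAN operator

        # Classify the object as "landslide" if the combined membership
        # exceeds an (assumed) acceptance threshold of 0.7.
        threshold = 0.7
        is_landslide = fuzzy_and >= threshold
        print(fuzzy_and, fuzzy_or, fuzzy_mean, is_landslide)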

  8. The Analysis of Object-Based Change Detection in Mining Area: a Case Study with Pingshuo Coal Mine

    Zhang, M.; Zhou, W.; Li, Y.

    2017-09-01

    Accurate information on mining land use and land cover change is crucial for monitoring and environmental change studies. In this paper, a RapidEye Remote Sensing Image (Map 2012) and a SPOT7 Remote Sensing Image (Map 2015) of the Pingshuo Mining Area are selected to monitor changes using a combination of object-based classification and the change vector analysis method; we also used R on high-resolution remote sensing images for mining land classification, and found open source software to be feasible and flexible. The results show that (1) the classification of reclaimed mining land has higher precision; the overall accuracy and kappa coefficient of the classification of the change region map were 86.67 % and 89.44 %. Object-based classification and change vector analysis, which are of great significance for improving the monitoring accuracy, can therefore be used to monitor mining land, especially reclaimed mining land; (2) the vegetation area decreased from 46 % to 40 % of the total area between 2012 and 2015, and most of it was transformed into arable land. The sum of arable land and vegetation area increased from 51 % to 70 %; meanwhile, built-up land increased to a certain degree, and part of the water area was transformed into arable land, but the extent of these two changes is not obvious. The results illustrate the transformation of the reclaimed mining area; at the same time, some land is still being converted to mining land, which shows that the mine is still operating and that mining land use and land cover are dynamic processes.
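
    The change vector analysis step can be illustrated with a small sketch. The reflectance values, band count and change threshold below are hypothetical and only show how change magnitude and direction are derived from two dates:

        import numpy as np

        # Hypothetical per-object mean reflectance in two bands at two dates.
        t2012 = np.array([[0.21, 0.35], [0.18, 0.40], [0.33, 0.22]])
        t2015 = np.array([[0.25, 0.30], [0.19, 0.41], [0.45, 0.12]])

        delta = t2015 - t2012
        magnitude = np.linalg.norm(delta, axis=1)                     # change intensity
        direction = np.degrees(np.arctan2(delta[:, 1], delta[:, 0]))  # change type

        # Objects whose change magnitude exceeds an assumed threshold are flagged.
        changed = magnitude > 0.1
        print(magnitude, direction, changed)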

  9. Art, historical and cultural heritage objects studied with different non-destructive analysis

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M.

    2012-01-01

    Full text: Since 2003, the analysis of art, historical and cultural heritage objects has been performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated for better characterization of the objects, and the examinations were expanded to other non-destructive analytical techniques such as portable X-ray fluorescence (XRF), digitized radiography, high-resolution photography with visible and UV (ultraviolet) light, and reflectography in the infrared region. These non-destructive analytical techniques, applied systematically to the objects, are helping to understand them better and allow them to be studied by examining their main components and their conservation status, as well as the creative process of the artist; particularly in easel paintings, they allow new discoveries to be made. The setup of the external beam in the LAMFI laboratory is configured to allow different simultaneous analyses by PIXE/PIGE (Particle Induced X-ray Emission/Particle Induced Gamma-ray Emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence) and to expand the archaeometric results using ion beams. PIXE and XRF analysis are important to characterize the elements present in the objects, pigments and other materials. The digitized radiography has provided important information about the internal structure of the objects, the manufacturing process and existing internal particles, and in the case of easel paintings it can reveal features of the artist's creative process, showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  10. Art, historical and cultural heritage objects studied with different non-destructive analysis

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M. [Universidade de Sao Paulo (IF/USP), SP (Brazil). Inst. de Fisica

    2012-07-01

    Full text: Since 2003, the analysis of art, historical and cultural heritage objects has been performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated for better characterization of the objects, and the examinations were expanded to other non-destructive analytical techniques such as portable X-ray fluorescence (XRF), digitized radiography, high-resolution photography with visible and UV (ultraviolet) light, and reflectography in the infrared region. These non-destructive analytical techniques, applied systematically to the objects, are helping to understand them better and allow them to be studied by examining their main components and their conservation status, as well as the creative process of the artist; particularly in easel paintings, they allow new discoveries to be made. The setup of the external beam in the LAMFI laboratory is configured to allow different simultaneous analyses by PIXE/PIGE (Particle Induced X-ray Emission/Particle Induced Gamma-ray Emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence) and to expand the archaeometric results using ion beams. PIXE and XRF analysis are important to characterize the elements present in the objects, pigments and other materials. The digitized radiography has provided important information about the internal structure of the objects, the manufacturing process and existing internal particles, and in the case of easel paintings it can reveal features of the artist's creative process, showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  11. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
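
    A minimal sketch of the signal chain described above (Hilbert envelope followed by an amplitude spectrum) is given below; the sampling rate, test signal and modulation frequency are invented for illustration and do not come from the paper:

        import numpy as np
        from scipy.signal import hilbert

        def envelope_spectrum(signal, fs):
            """Amplitude spectrum of the Hilbert envelope of a seismic record."""
            envelope = np.abs(hilbert(signal))
            envelope -= envelope.mean()            # remove the DC offset
            spectrum = np.abs(np.fft.rfft(envelope))
            freqs = np.fft.rfftfreq(envelope.size, d=1.0 / fs)
            return freqs, spectrum

        # Hypothetical 2 s record sampled at 500 Hz: a footstep-like amplitude
        # modulation at 2 Hz on a 60 Hz carrier, plus noise.
        fs = 500.0
        t = np.arange(0, 2.0, 1.0 / fs)
        rng = np.random.default_rng(0)
        signal = (1 + 0.8 * np.sin(2 * np.pi * 2 * t)) * np.sin(2 * np.pi * 60 * t)
        signal += 0.1 * rng.standard_normal(t.size)

        freqs, spec = envelope_spectrum(signal, fs)
        # Low-frequency components of the envelope spectrum would serve as
        # classification features (person / group / animal / vehicle).
        print(freqs[np.argmax(spec)])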

  12. Context based Coding of Binary Shapes by Object Boundary Straightness Analysis

    Aghito, Shankar Manuel; Forchhammer, Søren

    2004-01-01

    A new lossless compression scheme for bilevel images targeted at binary shapes of image and video objects is presented. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used in the context definition for arithmetic encoding. Tested on individual images of binary shapes and binary layers of digital maps, the algorithm outperforms PWC, JBIG and MPEG-4 CAE. On the binary shapes the code lengths are reduced by 21%, 25%, and 42%, respectively. On the maps the reductions are 34%, 32%, and 59%, respectively. The algorithm is also...
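
    The paper's context definition is based on boundary straightness; as a simpler, generic illustration of context modelling for arithmetic coding of bilevel images (the causal template and example shape below are assumptions, not the authors' design), a causal neighbourhood can be packed into a context index as follows:

        import numpy as np

        def context_index(image, r, c, neighbourhood):
            """Pack causal neighbour pixels of a bilevel image into a context number."""
            index = 0
            for dr, dc in neighbourhood:
                rr, cc = r + dr, c + dc
                inside = 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
                bit = int(image[rr, cc]) if inside else 0   # pixels outside count as 0
                index = (index << 1) | bit
            return index

        # Assumed causal template (pixels above and to the left of the current one).
        template = [(-1, -1), (-1, 0), (-1, 1), (0, -1)]
        shape = np.array([[0, 1, 1, 0],
                          [0, 1, 1, 1],
                          [0, 0, 1, 1]], dtype=np.uint8)
        print(context_index(shape, 2, 2, template))   # context number for pixel (2, 2)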

  13. The precise measurement of Tl isotopic compositions by MC-ICPMS: Application to the analysis of geological materials and meteorites.

    Rehkämper, Mark; Halliday, Alex N.

    1999-07-01

    The precision of Tl isotopic measurements by thermal ionization mass spectrometry (TIMS) is severely limited by the fact that Tl possesses only two naturally occurring isotopes, such that there is no invariant isotope ratio that can be used to correct for instrumental mass discrimination. In this paper we describe new chemical and mass spectrometric techniques for the determination of Tl isotopic compositions at a level of precision hitherto unattained. Thallium is first separated from the geological matrix using a two-stage anion-exchange procedure. Thallium isotopic compositions are then determined by multiple-collector inductively coupled plasma-mass spectrometry with correction for mass discrimination using the known isotopic composition of Pb that is admixed to the sample solutions. With these procedures we achieve a precision of 0.01-0.02% for Tl isotope ratio measurements in geological samples, and this is a factor of ≥3-4 better than the best published results by TIMS. However, without adequate precautions, experimental artifacts can be generated that result in apparent Tl isotopic fractionations of up to one per mil. Analysis of five terrestrial samples indicates the existence of Tl isotopic variations related to natural fractionation processes on the Earth. Two of the three igneous rocks analyzed in this study display Tl isotopic compositions indistinguishable from our laboratory standard, the reference material NIST-997 Tl. A third sample, however, is characterized by ɛTl ≈ 2.5 ± 1.5, where ɛTl represents the deviation of the 205Tl/203Tl ratio of the sample relative to NIST-997 Tl in parts per 10^4. Even larger deviations were identified for two ferromanganese crusts from the Pacific Ocean, which display ɛTl values of +5.0 ± 1.5 and +11.7 ± 1.3. We suggest that the large variability of Tl isotopic compositions in the latter samples is caused by low-temperature processes related to the formation of the Fe-Mn crusts by precipitation and
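
    For reference, the epsilon notation described in the abstract corresponds to the standard definition (written here in LaTeX; the notation follows the abstract's own description, not a formula quoted from the paper):

        \epsilon_{\mathrm{Tl}} = \left( \frac{\left(^{205}\mathrm{Tl}/^{203}\mathrm{Tl}\right)_{\mathrm{sample}}}{\left(^{205}\mathrm{Tl}/^{203}\mathrm{Tl}\right)_{\mathrm{NIST\,997}}} - 1 \right) \times 10^{4}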

  14. Precision medicine in myasthenia gravis: begin from the data precision

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. The data on MG are far from individually precise at present, partially due to the rarity and heterogeneity of this disease. In this review, we provide basic insights into MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment, based on the literature and our previous studies. Subgroups and quantitative traits of MG are discussed in the sense of data precision. The role of disease registries and the scientific basis of precise analysis are also discussed to ensure better collection and analysis of MG data. PMID:27127759

  15. FEATUREOUS: AN INTEGRATED ENVIRONMENT FOR FEATURE-CENTRIC ANALYSIS AND MODIFICATION OF OBJECT-ORIENTED SOFTWARE

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2011-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand the implementations of user-observable program features and their respective interdependencies. As feature-centric program understanding and modification are essential during software maintenance and evolution, this situation needs to change. In this paper, we present Featureous, an integrated development environment built on top of the NetBeans IDE that facilitates feature-centric analysis of object-oriented software. Our integrated development environment encompasses a lightweight feature location mechanism, a number of reusable analytical views, and necessary APIs for supporting future extensions. The base of the integrated development environment is a conceptual framework comprising three complementary dimensions of comprehension: perspective, abstraction...

  16. The application of the unified modeling language in object-oriented analysis of healthcare information systems.

    Aggarwal, Vinod

    2002-10-01

    This paper concerns itself with the beneficial effects of the Unified Modeling Language (UML), a nonproprietary object modeling standard, in specifying, visualizing, constructing, documenting, and communicating the model of a healthcare information system from the user's perspective. The author outlines the process of object-oriented analysis (OOA) using the UML and illustrates this with healthcare examples to demonstrate the practicality of application of the UML by healthcare personnel to real-world information system problems. The UML will accelerate advanced uses of object-orientation such as reuse technology, resulting in significantly higher software productivity. The UML is also applicable in the context of a component paradigm that promises to enhance the capabilities of healthcare information systems and simplify their management and maintenance.
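
    Although the paper works with UML diagrams rather than code, the kind of class and association an object-oriented analysis would capture can be sketched as follows (a purely illustrative healthcare example in Python; the Patient/Encounter classes, attributes and codes are invented and are not taken from the paper):

        from dataclasses import dataclass, field
        from datetime import date
        from typing import List

        @dataclass
        class Encounter:
            encounter_date: date
            diagnosis_code: str            # e.g. an ICD-style code (illustrative only)

        @dataclass
        class Patient:
            medical_record_number: str
            name: str
            encounters: List[Encounter] = field(default_factory=list)

            def admit(self, encounter: Encounter) -> None:
                """Associate a new encounter with this patient (1-to-many link)."""
                self.encounters.append(encounter)

        p = Patient("MRN-001", "Jane Doe")
        p.admit(Encounter(date(2002, 10, 1), "I10"))
        print(len(p.encounters))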

  17. An Analysis of Periodic Components in BL Lac Object S5 0716 +714 with MUSIC Method

    Tang, J.

    2012-01-01

    Multiple signal classification (MUSIC) algorithms are introduced for the estimation of the variation period of BL Lac objects. The principle of the MUSIC spectral analysis method and a theoretical analysis of the frequency-spectrum resolution using analog signals are included. From the literature, we have collected a large amount of effective observation data of the BL Lac object S5 0716+714 in the V, R, I bands from 1994 to 2008. The light variation periods of S5 0716+714 are obtained by means of the MUSIC spectral analysis method and the periodogram spectral analysis method. There exist two major periods: (3.33±0.08) years and (1.24±0.01) years for all bands. The period estimates obtained with the MUSIC spectral analysis method are compared with those obtained with the periodogram spectral analysis method. MUSIC is a super-resolution algorithm that requires only a small data length and could be used to detect the variation periods of weak signals.
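
    As a rough illustration of the MUSIC approach mentioned above (not the authors' code; the correlation-matrix dimension, model order, synthetic light curve and regular sampling are simplifying assumptions, whereas real photometric data are unevenly sampled), a pseudospectrum can be computed as follows:

        import numpy as np

        def music_pseudospectrum(x, model_order, freqs, fs):
            """MUSIC pseudospectrum of a 1-D series (toy sketch)."""
            m = 20                                    # correlation-matrix dimension (assumed)
            n = x.size - m + 1
            X = np.column_stack([x[i:i + m] for i in range(n)])
            R = X @ X.T / n                           # sample autocorrelation matrix
            eigvals, eigvecs = np.linalg.eigh(R)      # eigenvalues in ascending order
            noise_subspace = eigvecs[:, : m - model_order]
            k = np.arange(m)
            p = []
            for f in freqs:
                a = np.exp(-2j * np.pi * f / fs * k)  # steering vector
                denom = np.linalg.norm(noise_subspace.conj().T @ a) ** 2
                p.append(1.0 / denom)
            return np.asarray(p)

        # Hypothetical magnitude series sampled 10 times per year with two periods.
        fs = 10.0
        t = np.arange(0, 15, 1 / fs)
        rng = np.random.default_rng(1)
        x = np.sin(2 * np.pi * t / 3.33) + 0.5 * np.sin(2 * np.pi * t / 1.24)
        x += 0.2 * rng.standard_normal(t.size)

        freqs = np.linspace(0.05, 2.0, 400)           # trial frequencies, cycles per year
        p = music_pseudospectrum(x - x.mean(), model_order=4, freqs=freqs, fs=fs)
        # The strongest pseudospectrum peak gives the dominant period (in years);
        # a full analysis would locate all local maxima.
        print(1.0 / freqs[np.argmax(p)])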

  18. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to calculate inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909 (p < 0.001). The BAT® software improves the correlation between subjective and objective BSI values, and may become a new standard for trials evaluating breast symmetry.

  19. Nurse-surgeon object transfer: video analysis of communication and situation awareness in the operating theatre.

    Korkiakangas, Terhi; Weldon, Sharon-Marie; Bezemer, Jeff; Kneebone, Roger

    2014-09-01

    One of the most central collaborative tasks during surgical operations is the passing of objects, including instruments. Little is known about how nurses and surgeons achieve this. The aim of the present study was to explore what factors affect this routine-like task, resulting in fast or slow transfer of objects. A qualitative video study, informed by an observational ethnographic approach, was conducted in a major teaching hospital in the UK. A total of 20 general surgical operations were observed. In total, approximately 68 h of video data have been reviewed. A subsample of 225 min has been analysed in detail using interactional video-analysis developed within the social sciences. Two factors affecting object transfer were observed: (1) relative instrument trolley position and (2) alignment. The scrub nurse's instrument trolley position (close to vs. further back from the surgeon) and alignment (gaze direction) impacts on the communication with the surgeon, and consequently, on the speed of object transfer. When the scrub nurse was standing close to the surgeon, and "converged" to follow the surgeon's movements, the transfer occurred more seamlessly and faster (1.0 s). The smoothness of object transfer can be improved by adjusting the scrub nurse's instrument trolley position, enabling a better monitoring of surgeon's bodily conduct and affording early orientation (awareness) to an upcoming request (changing situation). Object transfer is facilitated by the surgeon's embodied practices, which can elicit the nurse's attention to the request and, as a response, maximise a faster object transfer. A simple intervention to highlight the significance of these factors could improve communication in the operating theatre.

  20. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been made on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm, which is based on an auditory model that simulates human auditory system. The auditory model is based on spectro-temporal modulation analysis of spectrogram, which has been proven to be ...

  1. Parametric analysis of energy quality management for district in China using multi-objective optimization approach

    Lu, Hai; Yu, Zitao; Alanne, Kari; Xu, Xu; Fan, Liwu; Yu, Han; Zhang, Liang; Martinac, Ivo

    2014-01-01

    Highlights: • A time-effective multi-objective design optimization scheme is proposed. • The scheme aims at exploring suitable 3E energy systems for the specific case. • A realistic case located in China is used for the analysis. • A parametric study is performed to test the effects of different parameters. - Abstract: Due to increasing energy demands and global warming, energy quality management (EQM) for districts has been gaining importance over the last few decades. The evaluation of the optimum energy systems for specific districts is an essential part of EQM. This paper presents a deep analysis of the optimum energy systems for a district sited in China. A multi-objective optimization approach based on a Genetic Algorithm (GA) is proposed for the analysis. The optimization process aims to search for suitable 3E (minimum economic cost and environmental burden as well as maximum efficiency) energy systems. Here, life cycle CO2 equivalent (LCCO2), life cycle cost (LCC) and exergy efficiency (EE) are set as optimization objectives. Then, the optimum energy systems for the Chinese case are presented. The final step is to investigate the effects of different energy parameters. The results show that the optimum energy systems might vary significantly depending on some parameters

  2. Geographic Object-Based Image Analysis – Towards a new paradigm

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  3. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly (P 0.8) for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), total bite-size-to-travel ratio (P algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.

  4. Fourier analysis of intracranial aneurysms: towards an objective and quantitative evaluation of the shape of aneurysms

    Rohde, Stefan; Lahmann, Katharina; Nafe, Reinhold; Yan, Bernard; Berkefeld, Joachim; Beck, Juergen; Raabe, Andreas

    2005-01-01

    Shape irregularities of intracranial aneurysms may indicate an increased risk of rupture. To quantify morphological differences, Fourier analysis of the shape of intracranial aneurysms was introduced. We compared the morphology of 45 unruptured (UIA) and 46 ruptured intracranial aneurysms (RIA) in 70 consecutive patients on the basis of 3D-rotational angiography. Fourier analysis, coefficient of roundness and qualitative shape assessment were determined for each aneurysm. Morphometric analysis revealed significantly smaller coefficient of roundness (P<0.02) and higher values for Fourier amplitudes numbers 2, 3 and 7 (P<0.01) in the RIA group, indicating more complex and irregular morphology in RIA. Qualitative assessment from 3D-reconstructions showed surface irregularities in 78% of RIA and 42% of UIA (P<0.05). Our data have shown significant differences in shape between RIA and UIA, and further developments of Fourier analysis may provide an objective factor for the assessment of the risk of rupture. (orig.)
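
    Generic Fourier shape analysis of a closed contour can be sketched as below; this is a toy 2-D illustration (the paper works from 3D-rotational angiography), and the synthetic contour, harmonic count and normalisation are assumptions rather than the authors' exact formulation:

        import numpy as np

        def fourier_amplitudes(contour, n_harmonics=8):
            """Fourier amplitude spectrum of a closed 2-D contour (toy sketch)."""
            z = contour[:, 0] + 1j * contour[:, 1]   # complex representation of (x, y)
            z -= z.mean()                            # remove centroid (translation invariance)
            coeffs = np.fft.fft(z) / z.size
            amps = np.abs(coeffs)
            if amps[1] > 0:
                amps = amps / amps[1]                # scale-normalise by the first harmonic
            return amps[1:n_harmonics + 1]

        # Hypothetical contour: a slightly lobed circle sampled at 128 points.
        theta = np.linspace(0, 2 * np.pi, 128, endpoint=False)
        r = 1.0 + 0.15 * np.cos(3 * theta)           # irregular, bleb-like component
        contour = np.column_stack([r * np.cos(theta), r * np.sin(theta)])
        print(fourier_amplitudes(contour))           # higher harmonics flag irregularity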

  5. Chromatographic speciation of Cr(III)-species, inter-species equilibrium isotope fractionation and improved chemical purification strategies for high-precision isotope analysis

    Larsen, Kirsten Kolbjørn; Wielandt, Daniel Kim Peel; Schiller, Martin

    2016-01-01

    Chromatographic purification of chromium (Cr), which is required for high-precision isotope analysis, is complicated by the presence of multiple Cr-species with different effective charges in the acid digested sample aliquots. The differing ion exchange selectivity and sluggish reaction rates of ...

  6. A decision analysis approach for risk management of near-earth objects

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in

  7. Neural regions supporting lexical processing of objects and actions: A case series analysis

    Bonnie L Breining

    2014-04-01

    Introduction. Linking semantic representations to lexical items is an important cognitive process for both producing and comprehending language. Past research has suggested that the bilateral anterior temporal lobes are critical for this process (e.g. Patterson, Nestor, & Rogers, 2007). However, the majority of studies focused on object concepts alone, ignoring actions. The few that considered actions suggest that the temporal poles are not critical for their processing (e.g. Kemmerer et al., 2010). In this case series, we investigated the neural substrates of linking object and action concepts to lexical labels by correlating the volume of defined regions of interest with behavioral performance on picture-word verification and picture naming tasks in individuals with primary progressive aphasia (PPA). PPA is a neurodegenerative condition with heterogeneous neuropathological causes, characterized by increasing language deficits for at least two years in the face of relatively intact cognitive function in other domains (Gorno-Tempini et al., 2011). This population displays appropriate heterogeneity of performance and focal atrophy for investigating the neural substrates involved in lexical semantic processing of objects and actions. Method. Twenty-one individuals with PPA participated in behavioral assessment within six months of high resolution anatomical MRI scans. Behavioral assessments consisted of four tasks: picture-word verification and picture naming of objects and actions. Performance on these assessments was correlated with brain volume measured using atlas-based analysis in twenty regions of interest that are commonly atrophied in PPA and implicated in language processing. Results. Impaired performance for all four tasks significantly correlated with atrophy in the right superior temporal pole, left anterior middle temporal gyrus, and left fusiform gyrus. No regions were identified in which volume correlated with performance for both

  8. Intellectual capital: approaches to analysis as an object of the internal environment of an economic entity

    O. E. Ustinova

    2017-01-01

    Intellectual capital is of strategic importance for a modern company. At the same time, its effective management, including a stimulating and creative approach to solving problems, helps to increase the competitiveness and development of economic entities. The article considers intellectual capital as an object of analysis of the internal environment. In the context of the proposed approaches to its study, its impact on the development of the company is also considered. Intellectual capital has a special significance and influence on internal processes, since in each of them the intellectual component makes it possible to achieve a positive synergetic effect from the interaction of different objects. In more detail, it is proposed to consider it in terms of the position the company occupies in the market, the principles of its activities, the formation of marketing policies, the use of resources, the methods and means of making managerial decisions, and the organizational culture that has formed. For the analysis of the state of the internal environment, the main approaches in which intellectual capital is considered are proposed, among them: methods for analyzing cash flows, the economic efficiency and financial feasibility of a project, analysis of the consolidated financial flow by group of objects, assessment of the potential of the business entity, the technology for choosing an investment policy, and the technology for selecting incentive mechanisms. In this regard, it is advisable to analyze the company's internal environment from the standpoint of the influence of intellectual capital on its state. A scheme of the interaction between intellectual capital and the objects of assessment of the internal environment of an economic entity is offered. The results of this study should be considered as initial data for the further development of the economic evaluation of the influence of intellectual capital on the competitiveness of companies.

  9. A Retrospective Analysis of Precision Medicine Outcomes in Patients With Advanced Cancer Reveals Improved Progression-Free Survival Without Increased Health Care Costs.

    Haslem, Derrick S; Van Norman, S Burke; Fulde, Gail; Knighton, Andrew J; Belnap, Tom; Butler, Allison M; Rhagunath, Sharanya; Newman, David; Gilbert, Heather; Tudor, Brian P; Lin, Karen; Stone, Gary R; Loughmiller, David L; Mishra, Pravin J; Srivastava, Rajendu; Ford, James M; Nadauld, Lincoln D

    2017-02-01

    The advent of genomic diagnostic technologies such as next-generation sequencing has recently enabled the use of genomic information to guide targeted treatment in patients with cancer, an approach known as precision medicine. However, clinical outcomes, including survival and the cost of health care associated with precision cancer medicine, have been challenging to measure and remain largely unreported. We conducted a matched cohort study of 72 patients with metastatic cancer of diverse subtypes in the setting of a large, integrated health care delivery system. We analyzed the outcomes of 36 patients who received genomic testing and targeted therapy (precision cancer medicine) between July 1, 2013, and January 31, 2015, compared with 36 historical control patients who received standard chemotherapy (n = 29) or best supportive care (n = 7). The average progression-free survival was 22.9 weeks for the precision medicine group and 12.0 weeks for the control group (P = .002) with a hazard ratio of 0.47 (95% CI, 0.29 to 0.75) when matching on age, sex, histologic diagnosis, and previous lines of treatment. In a subset analysis of patients who received all care within the Intermountain Healthcare system (n = 44), per patient charges per week were $4,665 in the precision treatment group and $5,000 in the control group (P = .126). These findings suggest that precision cancer medicine may improve survival for patients with refractory cancer without increasing health care costs. Although the results of this study warrant further validation, this precision medicine approach may be a viable option for patients with advanced cancer.

  10. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, where the separation was shown to correlate strongly with Pfirrmann grade (P histogram appearances. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may be a clinical tool for characterization of regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features between asymptomatic and symptomatic individuals need to be compared.
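
    A minimal sketch of the peak-separation idea is given below (hypothetical T2 values, bin count and peak-spacing parameter; this is not the study's actual pipeline, which works on segmented mid-sagittal ROIs):

        import numpy as np
        from scipy.signal import find_peaks

        def peak_separation(t2_values, bins=64):
            """Separation of the two dominant histogram peaks of an IVD ROI (sketch)."""
            hist, edges = np.histogram(t2_values, bins=bins)
            centres = 0.5 * (edges[:-1] + edges[1:])
            peaks, _ = find_peaks(hist, distance=5)       # assumed minimum peak spacing
            if peaks.size < 2:
                return 0.0                                # degenerated disc: single peak
            top2 = peaks[np.argsort(hist[peaks])[-2:]]
            return abs(centres[top2[0]] - centres[top2[1]])

        # Hypothetical T2 values (ms) for a bimodal, well-hydrated disc.
        rng = np.random.default_rng(0)
        t2 = np.concatenate([rng.normal(60, 8, 2000),     # annulus fibrosus
                             rng.normal(120, 15, 2000)])  # nucleus pulposus
        print(peak_separation(t2))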

  11. Fast and objective detection and analysis of structures in downhole images

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data to improve efficiency and assist, rather than replace the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of and further detection of structures e.g. as limited to specific orientations.
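
    For context, a planar structure crossing a cylindrical borehole of radius r appears in the unrolled image as a sinusoid, commonly modelled as (a standard geometric relation, not a formula quoted from the paper):

        d(\theta) = d_{0} - r\,\tan(\delta)\,\cos(\theta - \varphi)

    where d is the depth at borehole azimuth θ, d₀ the mean depth of the structure, δ its dip and φ its dip azimuth; detection then amounts to finding such sinusoids, often in incomplete form, across the image.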

  12. Multi-objective optimization of a cascade refrigeration system: Exergetic, economic, environmental, and inherent safety analysis

    Eini, Saeed; Shahhosseini, Hamidreza; Delgarm, Navid; Lee, Moonyong; Bahadori, Alireza

    2016-01-01

    Highlights: • A multi-objective optimization is performed for a cascade refrigeration cycle. • The optimization problem considers inherently safe design as well as 3E analysis. • As a measure of inherent safety level a quantitative risk analysis is utilized. • A CO2/NH3 cascade refrigeration system is compared with a CO2/C3H8 system. - Abstract: Inherently safer design is the new approach to maximize the overall safety of a process plant. This approach suggests some risk reduction strategies to be implemented in the early stages of design. In this paper a multi-objective optimization was performed considering economic, exergetic, and environmental aspects besides evaluation of the inherent safety level of a cascade refrigeration system. The capital costs, the processing costs, and the social cost due to CO2 emission were considered in the economic objective function. The exergetic efficiency of the plant was considered as the second objective function. As a measure of inherent safety level, a Quantitative Risk Assessment (QRA) was performed to calculate the total risk level of the cascade as the third objective function. Two cases (ammonia and propane) were compared as the refrigerant of the high temperature circuit. The optimum solutions achieved from the multi-objective optimization process were given as a Pareto frontier. The ultimate optimal solution among the available solutions on the Pareto optimal curve was selected using decision-making approaches. The NSGA-II algorithm was used to obtain the Pareto optimal frontiers. Also, three decision-making approaches (TOPSIS, LINMAP, and Shannon's entropy methods) were utilized to select the final optimum point. Considering continuous material release from the major equipment in the plant, flash and jet fire scenarios were considered for the CO2/C3H8 cycle and toxic hazards were considered for the CO2/NH3 cycle. The results showed no significant differences between CO2/NH3 and
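
    The Pareto-front idea underlying the NSGA-II step can be illustrated with a small dominance filter (a sketch only; the candidate designs and objective values are invented, and all objectives are written so that smaller is better, e.g. negative exergy efficiency):

        import numpy as np

        def pareto_front(objectives):
            """Indices of non-dominated designs; all objectives are minimised (sketch)."""
            n = objectives.shape[0]
            keep = np.ones(n, dtype=bool)
            for i in range(n):
                # Design j dominates design i if it is no worse in every objective
                # and strictly better in at least one.
                dominators = (np.all(objectives <= objectives[i], axis=1) &
                              np.any(objectives < objectives[i], axis=1))
                if np.any(dominators):
                    keep[i] = False
            return np.where(keep)[0]

        # Hypothetical candidate cascades: columns are total cost rate, a risk/emission
        # measure, and negative exergy efficiency (so all three are minimised).
        designs = np.array([[1.00, 0.80, -0.45],
                            [0.90, 0.95, -0.42],
                            [1.20, 0.70, -0.50],
                            [0.95, 0.90, -0.48],
                            [1.10, 0.95, -0.40]])   # dominated by the first design
        print(pareto_front(designs))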

  13. Dynamical analysis of nearby clusters. Automated astrometry from the ground: precision proper motions over a wide field

    Bouy, H.; Bertin, E.; Moraux, E.; Cuillandre, J.-C.; Bouvier, J.; Barrado, D.; Solano, E.; Bayo, A.

    2013-06-01

    Context. The kinematic properties of the different classes of objects in a given association hold important clues about the history of its members, and offer a unique opportunity to test the predictions of the various models of stellar formation and evolution. Aims: DANCe (standing for dynamical analysis of nearby clusters) is a survey program aimed at deriving a comprehensive and homogeneous census of the stellar and substellar content of a number of nearby (history, and the presence of reference extragalactic sources for the anchoring onto the ICRS. Based on observations obtained with MegaPrime/MegaCam, a joint project of CFHT and CEA/DAPNIA, at the Canada-France-Hawaii Telescope (CFHT) which is operated by the National Research Council (NRC) of Canada, the Institut National des Sciences de l'Univers of the Centre National de la Recherche Scientifique (CNRS) of France, and the University of Hawaii.

  14. Objective voice and speech analysis of persons with chronic hoarseness by prosodic analysis of speech samples.

    Haderlein, Tino; Döllinger, Michael; Matoušek, Václav; Nöth, Elmar

    2016-10-01

    Automatic voice assessment is often performed using sustained vowels. In contrast, speech analysis of read-out texts can be applied to voice and speech assessment. Automatic speech recognition and prosodic analysis were used to find regression formulae between automatic and perceptual assessment of four voice and four speech criteria. The regression was trained with 21 men and 62 women (average age 49.2 years) and tested with another set of 24 men and 49 women (48.3 years), all suffering from chronic hoarseness. They read the text 'Der Nordwind und die Sonne' ('The North Wind and the Sun'). Five voice and speech therapists evaluated the data on 5-point Likert scales. Ten prosodic and recognition accuracy measures (features) were identified which describe all the examined criteria. Inter-rater correlation within the expert group was between r = 0.63 for the criterion 'match of breath and sense units' and r = 0.87 for the overall voice quality. Human-machine correlation was between r = 0.40 for the match of breath and sense units and r = 0.82 for intelligibility. The perceptual ratings of different criteria were highly correlated with each other. Likewise, the feature sets modeling the criteria were very similar. The automatic method is suitable for assessing chronic hoarseness in general and for subgroups of functional and organic dysphonia. In its current version, it is almost as reliable as a randomly picked rater from a group of voice and speech therapists.

  15. Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy.

    Vandenabeele, Peter; Tate, Jim; Moens, Luc

    2007-02-01

    Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. The experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland is also described.

  16. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Sachit Rajbhandari

    2017-11-01

    In Geographic Object-based Image Analysis (GEOBIA), identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL) is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in Semantic Web Rule Language (SWRL) and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method, providing an alternative approach for image classification in the case study of landslides.

  17. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands are raising the importance of just-in-time and agile manufacturing much more than before. Accordingly, the introduction of mixed-model assembly lines has become popular as a way to realize small-lot, multi-kind production. Since various models are produced on the same assembly line, rational management is of special importance. From this point of view, this study focuses on a sequencing problem for a mixed-model assembly line that includes a paint line as its preceding process. By taking the paint line into account, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem, besides improving production efficiency. We have formulated the sequencing problem as a bi-objective optimization problem to prevent various line stoppages and to reduce the volume of WIP inventory simultaneously, and we have proposed a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front, and the resulting problem is solved by a meta-heuristic method, SA (simulated annealing). Through numerical experiments, we verified the validity of the proposed approach and discussed the significance of trade-off analysis between the conflicting objectives.
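    The abstract combines the weighting method with simulated annealing to trace the Pareto front of the bi-objective sequencing problem. The sketch below shows that generic combination under stated assumptions: the two objective functions in the demo are stand-ins, not the paper's stoppage and WIP models, and sweeping the weight w approximates the front.

```python
import math
import random

def weighted_sum_sa(sequence, f_stoppage, f_wip, w, iters=20000, t0=1.0, cooling=0.9995):
    """Minimise w*f_stoppage + (1-w)*f_wip over permutations of `sequence`
    with simulated annealing; sweeping w in (0, 1) traces an approximate
    Pareto front, as in the weighting method."""
    def cost(seq):
        return w * f_stoppage(seq) + (1.0 - w) * f_wip(seq)

    current = list(sequence)
    cur_cost = cost(current)
    best, best_cost = list(current), cur_cost
    temp = t0
    for _ in range(iters):
        i, j = random.sample(range(len(current)), 2)
        cand = list(current)
        cand[i], cand[j] = cand[j], cand[i]          # swap-neighbourhood move
        delta = cost(cand) - cur_cost
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, cur_cost = cand, cur_cost + delta
            if cur_cost < best_cost:
                best, best_cost = list(current), cur_cost
        temp *= cooling
    return best, best_cost

# toy usage with hypothetical stand-in objectives on a two-model sequence
demo = [0, 1, 0, 1, 1, 0, 0, 1]
f1 = lambda s: sum(abs(s[k] - s[k - 1]) for k in range(1, len(s)))  # stand-in for stoppage risk
f2 = lambda s: sum(i * v for i, v in enumerate(s))                   # stand-in for WIP build-up
print(weighted_sum_sa(demo, f1, f2, w=0.5))
```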

  18. Precision of DVC approaches for strain analysis in bone imaged with μCT at different dimensional levels.

    Dall'Ara, Enrico; Peña-Fernández, Marta; Palanca, Marco; Giorgi, Mario; Cristofolini, Luca; Tozzi, Gianluca

    2017-11-01

    Accurate measurement of local strain in heterogeneous and anisotropic bone tissue is fundamental to understand the pathophysiology of musculoskeletal diseases, to evaluate the effect of interventions in preclinical studies, and to optimize the design and delivery of biomaterials. Digital volume correlation (DVC) can be used to measure the three-dimensional displacement and strain fields from micro-Computed Tomography (µCT) images of loaded specimens. However, this approach is affected by the quality of the input images, by the morphology and density of the tissue under investigation, by the correlation scheme, and by the operational parameters used in the computation. Therefore, for each application the precision of the method should be evaluated. In this paper we present results collected from datasets analyzed in previous studies, as well as new data from a recent experimental campaign, to characterize the relationship between the precision of two different DVC approaches and the spatial resolution of the outputs. Different bone structures scanned with laboratory-source µCT or Synchrotron light µCT (SRµCT) were processed in zero-strain tests to evaluate the precision of the DVC methods as a function of the subvolume size, which ranged from 8 to 2500 micrometers. The results confirmed that for every microstructure the precision of DVC improves for larger subvolume sizes, following power laws. However, for the first time large differences in the precision of both local and global DVC approaches have been highlighted when SRµCT or in vivo µCT images were used instead of conventional ex vivo µCT. These findings suggest that in situ mechanical testing protocols applied in SRµCT facilities should be optimized in order to allow DVC analyses of localized strain measurements. Moreover, for in vivo µCT applications DVC analyses should be performed only with relatively coarse spatial resolution to achieve a reasonable precision of the method. In conclusion
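    The reported power-law relation between precision and subvolume size can be recovered from zero-strain data by a simple log-log regression. The sketch below uses hypothetical precision values purely to illustrate the fitting step; it is not the authors' dataset.

```python
import numpy as np

# hypothetical zero-strain precision data: subvolume size (micrometres)
# versus standard deviation of the measured strain components (microstrain)
size = np.array([8, 16, 32, 64, 128, 256, 512, 1000, 2500], dtype=float)
precision = np.array([4200, 1800, 760, 330, 150, 70, 35, 20, 9], dtype=float)

# precision ~ a * size**b  =>  log(precision) = log(a) + b*log(size)
b, log_a = np.polyfit(np.log(size), np.log(precision), 1)
a = np.exp(log_a)
print(f"precision ~ {a:.1f} * size^{b:.2f}")

# predicted precision at a candidate subvolume size of 200 micrometres
print(a * 200.0 ** b)
```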

  19. A modified precise integration method based on Magnus expansion for transient response analysis of time varying dynamical structure

    Yue, Cong; Ren, Xingmin; Yang, Yongfeng; Deng, Wangqun

    2016-01-01

    This paper provides a precise and effective methodology for computing the forced vibration response of a time-variant linear rotational structure subjected to unbalanced excitation. A modified algorithm based on the time-step precise integration method and the Magnus expansion is developed for instantaneous dynamic problems. The iterative solution is achieved through transition and dimensional-increment matrices. Numerical examples on a typical accelerating rotating system, considering gyroscopic moment and mass unbalance force, demonstrate the validity, effectiveness and accuracy of the approach in comparison with the Newmark-β method. It is shown that the proposed algorithm has high accuracy without loss of efficiency.
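    The record describes a precise-integration scheme built on the Magnus expansion for a time-varying system. A minimal illustration of the underlying Magnus idea for a homogeneous system x'(t) = A(t)x(t) is given below; the midpoint approximation of the first Magnus term and the toy oscillator are assumptions for illustration, and the paper's treatment of the forced unbalance term is not reproduced.

```python
import numpy as np
from scipy.linalg import expm

def magnus_step(A, x, t, h):
    """One second-order Magnus step for x'(t) = A(t) x(t).

    The first Magnus term is approximated by the midpoint rule,
    Omega ~ h * A(t + h/2), and the update is x <- expm(Omega) @ x.
    """
    omega = h * A(t + 0.5 * h)
    return expm(omega) @ x

def integrate(A, x0, t0, t1, n_steps):
    x, t = np.asarray(x0, dtype=float), t0
    h = (t1 - t0) / n_steps
    for _ in range(n_steps):
        x = magnus_step(A, x, t, h)
        t += h
    return x

# toy time-varying system: harmonic oscillator with slowly varying stiffness
def A(t):
    k = 1.0 + 0.1 * t
    return np.array([[0.0, 1.0], [-k, 0.0]])

print(integrate(A, [1.0, 0.0], 0.0, 10.0, 1000))
```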

  20. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are extremely crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of a substantial body of literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a metric to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disk with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S^1 can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines the multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through the interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  1. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Andrea Cavallaro

    2004-06-01

    The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme based on region segmentation and semantic segmentation is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas the high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one “sees” in a scene depends on the scene itself (region segmentation) as well as on the cognitive task (semantic segmentation) at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties and the definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to
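    The semantic partition in this framework is driven by color change detection designed to be robust to sensor noise. The fragment below is a deliberately simplified, hypothetical version of such a detector (thresholding at a multiple of an assumed noise level, followed by median filtering); it is not the authors' algorithm.

```python
import numpy as np
from scipy.ndimage import median_filter

def change_mask(frame, previous, noise_sigma, k=3.0):
    """Mark pixels whose temporal difference exceeds k standard deviations of
    an assumed (pre-estimated) sensor noise level, then clean the raw mask
    with a median filter to suppress isolated false detections."""
    diff = np.abs(frame.astype(np.float32) - previous.astype(np.float32))
    raw = (diff > k * noise_sigma).astype(np.uint8)
    return median_filter(raw, size=3).astype(bool)

# synthetic example: a bright square "appears" between two frames
rng = np.random.default_rng(1)
prev = rng.normal(100.0, 2.0, (64, 64))
curr = prev.copy()
curr[20:30, 20:30] += 40.0                      # changed (object) region
print(change_mask(curr, prev, noise_sigma=2.0).sum())   # ~100 changed pixels
```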

  2. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offers the potential to objectively analyze report uncertainty in real time and correlate with outcomes analysis for the purpose of context and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
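    Among the strategies listed, string search is the simplest to make concrete. The sketch below scores report text against a small, hypothetical lexicon of hedging phrases with made-up weights; any real implementation would need a validated lexicon and, as the abstract notes, could be combined with NLP or machine learning.

```python
import re

# hypothetical lexicon: hedging phrases mapped to rough uncertainty weights
UNCERTAINTY_TERMS = {
    "cannot be excluded": 3,
    "possibly": 2,
    "may represent": 2,
    "suspicious for": 2,
    "likely": 1,
    "consistent with": 1,
}

def uncertainty_score(report_text):
    """Crude string-search characterisation of report uncertainty:
    weighted occurrences of hedging phrases per 100 words."""
    text = report_text.lower()
    words = max(len(text.split()), 1)
    hits = {term: len(re.findall(re.escape(term), text)) for term in UNCERTAINTY_TERMS}
    weighted = sum(UNCERTAINTY_TERMS[t] * n for t, n in hits.items())
    return {"hits": hits, "score_per_100_words": 100.0 * weighted / words}

print(uncertainty_score("Findings may represent pneumonia; malignancy cannot be excluded."))
```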

  3. High Precision Thermal, Structural and Optical Analysis of an External Occulter Using a Common Model and the General Purpose Multi-Physics Analysis Tool Cielo

    Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg

    2011-01-01

    The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.

  4. Objective and quantitative analysis of daytime sleepiness in physicians after night duties.

    Wilhelm, Barbara J; Widmann, Anja; Durst, Wilhelm; Heine, Christian; Otto, Gerhard

    2009-06-01

    Workplace studies often have the disadvantage of lacking objective data that are less prone to subject bias. The aim of this study was to contribute objective data to the discussion about safety aspects of night shifts in physicians. For this purpose we applied the Pupillographic Sleepiness Test (PST). The PST allows recording and analysis of pupillary sleepiness-related oscillations in darkness for 11 min in the sitting subject. The parameter of evaluation is the Pupillary Unrest Index (PUI; mm/min); for statistical analysis the natural logarithm of this parameter is used (lnPUI). Thirty-four physicians were examined by the PST and subjective scales during the first half of the day. Data taken during a day work period (D) were compared to those taken directly after night duty (N) by a Wilcoxon signed rank test. Night duty caused a mean sleep reduction of 3 h (Difference N-D: median 3 h, minimum 0 h, maximum 7 h, p home.
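    The paired day-versus-night comparison of the log-transformed Pupillary Unrest Index can be reproduced with a standard Wilcoxon signed-rank test; the sketch below uses invented lnPUI values only to show the call, not the study's data.

```python
import numpy as np
from scipy.stats import wilcoxon

# hypothetical paired lnPUI values for the same physicians:
# after a normal working day (D) and directly after night duty (N)
ln_pui_day   = np.array([1.38, 1.52, 1.10, 1.61, 1.25, 1.47, 1.33, 1.58])
ln_pui_night = np.array([1.65, 1.80, 1.42, 1.77, 1.49, 1.71, 1.60, 1.83])

# paired, non-parametric comparison as described in the abstract
stat, p = wilcoxon(ln_pui_night, ln_pui_day)
print(f"Wilcoxon W = {stat:.1f}, p = {p:.4f}")
```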

  5. Approaches to defining «financial potential» concept as of economic analysis object

    O.M. Dzyubenko

    2017-12-01

    The research analyzes the works of scientists who have studied financial potential as an economic category. By analyzing these approaches to the concept of "financial potential", the author identifies six interpretations of its essence: the totality of the enterprise's financial resources; the sources of financing of the enterprise's economic activity; the development of the enterprise's economic activity; the enterprise's financial indicators; the system of enterprise financial management; and the characteristics of enterprise efficiency. It is established that financial potential is a multifaceted category that characterizes the financial and economic activity of enterprises. The author's definition of financial potential, in the context of its place among the objects of economic analysis, is proposed. It is established that financial potential is an object of enterprise economic activity management and is subject to analytical assessment to establish its state and directions of development.

  6. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, as is the case for rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches has been increasing rapidly. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, GeoEye and WorldView, are increasingly being considered the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state-of-the-art image analysis tools, calling for automated methods specifically suited to mapping landslide events in VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis and a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, sensed 3 days after a damaging landslide event in Madeira Island, together with a pre-event LiDAR DEM. Our approach proved successful in the recognition of landslides over a 15 km2 study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance (false positive rate 60% and false positive rate below 36% in both validation regions) in the internal mapping of landslide source and transport areas, in particular in the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.
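    The classification step couples per-object features from the segmentation with a Support Vector Machine. The sketch below shows that supervised step in scikit-learn under stated assumptions: the feature names and training values are hypothetical, and the object-based segmentation that would produce them is not shown.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# hypothetical per-object features from the segmentation step, e.g.
# [mean NDVI, mean brightness, slope from the LiDAR DEM, texture]
X_train = np.array([[0.12, 410, 28, 0.9],
                    [0.55, 230, 12, 0.4],
                    [0.10, 445, 31, 1.1],
                    [0.60, 210, 10, 0.3]])
y_train = np.array([1, 0, 1, 0])          # 1 = landslide object, 0 = background

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)

X_new = np.array([[0.15, 430, 27, 1.0]])
print(clf.predict(X_new))                  # predicted class of a new image object
```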

  7. Application of LC-MS to the analysis of dyes in objects of historical interest

    Zhang, Xian; Laursen, Richard

    2009-07-01

    High-performance liquid chromatography (HPLC) with photodiode array and mass spectrometric detection permits dyes extracted from objects of historical interest, or from natural plant or animal dyestuffs, to be characterized on the basis of three orthogonal properties: HPLC retention time, UV-visible spectrum and molecular mass. In the present study, we have focused primarily on yellow dyes, the bulk of which are flavonoid glycosides that would be almost impossible to characterize without mass spectrometric detection. Also critical for this analysis is a method for mild extraction of the dyes from objects (e.g., textiles) without hydrolyzing the glycosidic linkages. This was accomplished using 5% formic acid in methanol, rather than the more traditional 6 M HCl. Mass spectrometry, besides providing the molecular mass of the dye molecule, sometimes yields additional structural data based on fragmentation patterns. In addition, coeluting compounds can often be detected using extracted ion chromatography. The utility of mass spectrometry is illustrated by the analysis of historical specimens of silk that had been dyed yellow with flavonoid glycosides from Sophora japonica (pagoda tree) and curcumins from Curcuma longa (turmeric). In addition, we have used these techniques to identify the dye type, and sometimes the specific dyestuff, in a variety of objects, including a yellow varnish from a 19th century Tibetan altar and 3000-year-old wool mortuary textiles from Xinjiang, China. We are using HPLC with diode array and mass spectrometric detection to create a library of analyzed dyestuffs (>200 so far, mostly plants) to serve as references for identification of dyes in objects of historical interest.

  8. Reduced object related negativity response indicates impaired auditory scene analysis in adults with autistic spectrum disorder

    Veema Lodhia

    2014-02-01

    Auditory Scene Analysis provides a useful framework for understanding atypical auditory perception in autism. Specifically, a failure to segregate the incoming acoustic energy into distinct auditory objects might explain the aversive reaction autistic individuals have to certain auditory stimuli or environments. Previous research with non-autistic participants has demonstrated the presence of an Object Related Negativity (ORN) in the auditory event-related potential that indexes pre-attentive processes associated with auditory scene analysis. Also evident is a later P400 component that is attention dependent and thought to be related to decision-making about auditory objects. We sought to determine whether there are differences between individuals with and without autism in the levels of processing indexed by these components. Electroencephalography (EEG) was used to measure brain responses from a group of 16 autistic adults, and 16 age- and verbal-IQ-matched typically-developing adults. Auditory responses were elicited using lateralized dichotic pitch stimuli in which inter-aural timing differences create the illusory perception of a pitch that is spatially separated from a carrier noise stimulus. As in previous studies, control participants produced an ORN in response to the pitch stimuli. However, this component was significantly reduced in the participants with autism. In contrast, processing differences were not observed between the groups at the attention-dependent level (P400). These findings suggest that autistic individuals have difficulty segregating auditory stimuli into distinct auditory objects, and that this difficulty arises at an early pre-attentive level of processing.

  9. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  10. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for extraction of anatomical or functional structures on medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing to select a suitable combination in different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed

  11. On-line HPLC Analysis System for Metabolism and Inhibition Studies in Precision-Cut Liver Slices

    van Midwoud, Paul M.; Janssen, Joost; Merema, M.T.; de Graaf, Inge A. M.; Groothuis, Geny M. M.; Verpoorte, Elisabeth

    2011-01-01

    A novel approach for on-line monitoring of drug metabolism in continuously perifused, precision-cut liver slices (PCLS) in a microfluidic system has been developed using high-performance liquid chromatography with UV detection (HPLC-UV). In this approach, PCLS are incubated in a microfluidic device

  12. Precision and accuracy of multi-element analysis of aerosols using energy-dispersive x-ray fluorescence

    Adams, F.; Van Espen, P.

    1976-01-01

    Measurements have been carried out for the determination of the inherent errors of energy-dispersive X-ray fluorescence and for the evaluation of its precision and accuracy. The accuracy of the method is confirmed by independent determinations on the same samples using other analytical methods

  13. Precision of radiostereometric analysis (RSA) of acetabular cup stability and polyethylene wear improved by adding tantalum beads to the liner

    Nebergall, Audrey K; Rader, Kevin; Palm, Henrik

    2015-01-01

    segment to measure wear and acetabular cup stability. The standard deviation multiplied by the critical value (from a t distribution) established the precision of each method. Results - Due to the imprecision of the automated edge detection, the shell-only method was least desirable. The shell + liner...
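    The precision computation described here (standard deviation of repeated zero-motion examinations multiplied by the critical value of a t distribution) is compact enough to write out directly; the measurement values below are invented.

```python
import numpy as np
from scipy.stats import t

def rsa_precision(repeated_measurements, confidence=0.95):
    """Precision as the standard deviation of double examinations multiplied
    by the critical value of a t distribution, as described in the abstract."""
    x = np.asarray(repeated_measurements, dtype=float)
    sd = x.std(ddof=1)
    t_crit = t.ppf(1 - (1 - confidence) / 2, df=x.size - 1)   # two-sided
    return sd * t_crit

# hypothetical zero-motion double-examination differences (mm)
print(rsa_precision([0.05, -0.02, 0.08, 0.01, -0.04, 0.03, 0.06, -0.01]))
```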

  14. Using Item Analysis to Assess Objectively the Quality of the Calgary-Cambridge OSCE Checklist

    Tyrone Donnon

    2011-06-01

    Background: The purpose of this study was to investigate the use of item analysis to assess objectively the quality of items on the Calgary-Cambridge Communications OSCE checklist. Methods: A total of 150 first-year medical students were provided with extensive teaching on the use of the Calgary-Cambridge Guidelines for interviewing patients and participated in a final year-end 20-minute communication OSCE station. Grouped into either the upper-half (50%) or lower-half (50%) communication skills performance groups, discrimination, difficulty and point-biserial values were calculated for each checklist item. Results: The mean score on the 33-item communication checklist was 24.09 (SD = 4.46) and the internal reliability coefficient was α = 0.77. Although most of the items were found to have moderate (k = 12, 36%) or excellent (k = 10, 30%) discrimination values, 6 (18%) were identified as ‘fair’ and 3 (9%) as ‘poor’. A post-examination review focused on the item analysis findings resulted in an increase in checklist reliability (α = 0.80). Conclusions: Item analysis has been used extensively with MCQ exams. In this study, it was also found to be an objective and practical approach for evaluating the quality of a standardized OSCE checklist.
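    The item statistics used in this study (difficulty, upper-lower discrimination, point-biserial correlation, and Cronbach's alpha for the checklist) can be computed from a 0/1 response matrix as sketched below; the random demo matrix is a placeholder, not the study data, and the discrimination index here uses a simple median split rather than any particular published convention.

```python
import numpy as np

def item_analysis(responses):
    """responses: examinees x items matrix of 0/1 checklist scores."""
    r = np.asarray(responses, dtype=float)
    total = r.sum(axis=1)
    difficulty = r.mean(axis=0)                      # proportion scoring the item

    # discrimination: upper-half minus lower-half item means (median split)
    order = np.argsort(total)
    half = r.shape[0] // 2
    lower, upper = r[order[:half]], r[order[-half:]]
    discrimination = upper.mean(axis=0) - lower.mean(axis=0)

    # point-biserial: correlation of each item with the rest-of-test score
    rest = total[:, None] - r
    pb = np.array([np.corrcoef(r[:, j], rest[:, j])[0, 1] for j in range(r.shape[1])])

    # Cronbach's alpha for internal consistency of the whole checklist
    k = r.shape[1]
    alpha = k / (k - 1) * (1 - r.var(axis=0, ddof=1).sum() / total.var(ddof=1))
    return difficulty, discrimination, pb, alpha

rng = np.random.default_rng(0)
demo = (rng.random((150, 33)) < 0.73).astype(int)    # hypothetical 150 x 33 checklist
print(item_analysis(demo)[3])                        # alpha of the random demo data
```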

  15. Mapping of crop calendar events by object-based analysis of MODIS and ASTER images

    A.I. De Castro

    2014-06-01

    A method to generate crop calendar and phenology-related maps at a parcel level for four major irrigated crops (rice, maize, sunflower and tomato) is shown. The method combines images from the ASTER and MODIS sensors in an object-based image analysis framework, as well as testing of three different fitting curves using the TIMESAT software. The average estimation accuracy for calendar dates was 85%, ranging from 92% for the emergence and harvest dates in rice to 69% for the harvest date in tomato.

  16. Analysis of Scattering by Inhomogeneous Dielectric Objects Using Higher-Order Hierarchical MoM

    Kim, Oleksiy S.; Jørgensen, Erik; Meincke, Peter

    2003-01-01

    An efficient technique for the analysis of electromagnetic scattering by arbitrary shaped inhomogeneous dielectric objects is presented. The technique is based on a higher-order method of moments (MoM) solution of the volume integral equation. This higher-order MoM solution comprises recently...... that the condition number of the resulting MoM matrix is reduced by several orders of magnitude in comparison to existing higher-order hierarchical basis functions and, consequently, an iterative solver can be applied even for high expansion orders. Numerical results demonstrate excellent agreement...

  17. Object-Oriented Programming in the Development of Containment Analysis Code

    Han, Tae Young; Hong, Soon Joon; Hwang, Su Hyun; Lee, Byung Chul; Byun, Choong Sup

    2009-01-01

    In the mid-1980s, a new programming concept, Object-Oriented Programming (OOP), was introduced, featuring information hiding, encapsulation, modularity and inheritance. This offered a much more convenient programming paradigm to code developers. The OOP concept was incorporated into programming languages such as C++ in the 1990s and is widely used in the modern software industry. In this paper, we show that the OOP concept is successfully applicable to the development of a containment safety analysis code and propose a more explicit and developer-friendly OOP design.

  18. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Dean, Stephen O.

    1988-09-01

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high energy neutrons suggests potentially unique applications. In addition, fusion R and D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other, are the two primary criteria for setting long range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R and D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long range objective of commercial fusion applications.

  19. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Dean, Stephen O.

    1988-03-01

    Fusion is an essentially inexhaustible source of energy that has the potential for economically attractive commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion-energy development program is the generation of centralstation electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high-energy neutrons suggests potentially unique applications. These include breeding of fissile fuels, production of hydrogen and other chemical products, transmutation or “burning” of various nuclear or chemical wastes, radiation processing of materials, production of radioisotopes, food preservation, medical diagnosis and medical treatment, and space power and space propulsion. In addition, fusion R&D will lead to new products and new markets. Each fusion application must meet certain standards of economic and safety and environmental attractiveness. For this reason, economics on the one hand, and safety and environment and licensing on the other hand, are the two primary criteria for setting long-range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R&D program toward practical applications. The transfer of fusion technology and skills from the national laboratories and universities to industry is the key to achieving the long-range objective of commercial fusion applications.

  20. Statistical motion vector analysis for object tracking in compressed video streams

    Leny, Marc; Prêteux, Françoise; Nicholson, Didier

    2008-02-01

    Compressed video is the digital raw material provided by video-surveillance systems and used for archiving and indexing purposes. Multimedia standards therefore have a direct impact on such systems. Whereas MPEG-2 used to be the coding standard, MPEG-4 (part 2) has now replaced it in most installations, and MPEG-4 AVC/H.264 solutions are now being released. Finely analysing these complex and rich MPEG-4 streams is the challenging issue addressed in this paper. The system we designed is based on five modules: low-resolution decoder, motion estimation generator, object motion filtering, low-resolution object segmentation, and cooperative decision. Our contributions are the statistical analysis of the spatial distribution of the motion vectors, the computation of DCT-based confidence maps, the automatic detection of motion activity in the compressed file, and a rough indexation by dedicated descriptors. The robustness and accuracy of the system are evaluated on a large corpus (hundreds of hours of indoor and outdoor videos with pedestrians and vehicles). The objective benchmarking of the performance is achieved with respect to five metrics that estimate the error contribution of each module and for different implementations. This evaluation establishes that our system analyses up to 200 frames (720x288) per second (2.66 GHz CPU).

  1. Approach to proliferation risk assessment based on multiple objective analysis framework

    Andrianov, A.; Kuptsov, I. [Obninsk Institute for Nuclear Power Engineering of NNRU MEPhI (Russian Federation); Studgorodok 1, Obninsk, Kaluga region, 249030 (Russian Federation)

    2013-07-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  2. A descriptive analysis of quantitative indices for multi-objective block layout

    Amalia Medina Palomera

    2013-01-01

    Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish the feasible solutions (involving different criteria) obtained for block layout and to identify a solution's suitability according to the set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to solve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices and showed that a minimum number of indices covering the overall quality criteria may be used when selecting alternative solutions.

  3. Approach to proliferation risk assessment based on multiple objective analysis framework

    Andrianov, A.; Kuptsov, I.

    2013-01-01

    The approach to the assessment of proliferation risk using the methods of multi-criteria decision making and multi-objective optimization is presented. The approach allows taking into account the specific features of the national nuclear infrastructure and possible proliferation strategies (motivations, intentions, and capabilities). Three examples of applying the approach are shown. First, the approach has been used to evaluate the attractiveness of HEU (highly enriched uranium) production scenarios at a clandestine enrichment facility using centrifuge enrichment technology. Secondly, the approach has been applied to assess the attractiveness of scenarios for undeclared production of plutonium or HEU by theft of materials circulating in nuclear fuel cycle facilities and thermal reactors. Thirdly, the approach has been used to perform a comparative analysis of the structures of developing nuclear power systems based on different types of nuclear fuel cycles, the analysis being based on indicators of proliferation risk.

  4. Prioritization of buffer areas with multi objective analysis: application in the Basin Creek St. Helena

    Zuluaga, Julian; Carvajal, Luis Fernando

    2006-01-01

    This paper presents a multi-objective analysis (AMO-ELECTRE III) combined with a Geographical Information System (GIS) to establish priorities among buffer zones on the drainage network of the Santa Elena Creek, in the middle-east zone of Medellin. Thirty-eight alternatives (small catchments) are evaluated against seven criteria derived from field work and maps. The criteria are: susceptibility to mass sliding, surface and linear erosion, conflict over land use, and the state of the waterway network with respect to hydrology, geology and human impact. The ELECTRE III method allows priorities to be established for the buffer zones of each catchment; the indifference, acceptance, veto, and credibility threshold values, as well as the criteria weighting factors, are very important. The results show that the northern zone of the catchment, commune 8, and in particular La Castro creek, is the most affected. The sensitivity analysis shows that the obtained solution is robust and that the anthropic and geologic criteria are paramount.

  5. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

    Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high-precision assembly using commodity parts and low-precision bonding, are described. Preliminary 2D experiments, motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive, stock hardware and low-precision welding, are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place while an external low-precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low-precision tools) to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work to extend the IPJR paradigm to building 3D structures at micron precision are also summarized.

  6. Systems Biology of Metabolism: A Driver for Developing Personalized and Precision Medicine

    Nielsen, Jens

    2017-01-01

    for advancing the development of personalized and precision medicine to treat metabolic diseases like insulin resistance, obesity, NAFLD, NASH, and cancer. It will be illustrated how the concept of genome-scale metabolic models can be used for integrative analysis of big data with the objective of identifying...... novel biomarkers that are foundational for personalized and precision medicine....

  7. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism, but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and a non-linear approach (S-criterion). Both approaches generate almost the same input mixture; however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further increased when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization makes it possible to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position-one-labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective
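    The D-criterion at the heart of the linear design approach is the determinant of the Fisher information matrix built from the sensitivities of the measured labelling patterns to the free fluxes. The sketch below shows only that scoring step with invented sensitivity matrices; computing the actual sensitivities requires a full 13C-MFA model, which is not reproduced here.

```python
import numpy as np

def d_criterion(jacobian, meas_cov):
    """D-optimality score: determinant of the Fisher information matrix
    J^T * Sigma^-1 * J, where J holds sensitivities of the measured labelling
    patterns with respect to the free fluxes."""
    J = np.asarray(jacobian, dtype=float)
    W = np.linalg.inv(np.asarray(meas_cov, dtype=float))
    return np.linalg.det(J.T @ W @ J)

# hypothetical sensitivities for two candidate tracer mixtures (rows =
# measurements, columns = free fluxes) and a diagonal measurement covariance
J_mix_a = np.array([[0.8, 0.1], [0.2, 0.9], [0.5, 0.4]])
J_mix_b = np.array([[0.3, 0.2], [0.1, 0.3], [0.2, 0.1]])
cov = np.diag([1e-4, 1e-4, 1e-4])

scores = {"mix A": d_criterion(J_mix_a, cov), "mix B": d_criterion(J_mix_b, cov)}
print(max(scores, key=scores.get), scores)   # larger D-criterion = more informative design
```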

  8. Robust object tracking techniques for vision-based 3D motion analysis applications

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications in industry and science, virtual reality and film, medicine and sports. For most applications, the reliability and accuracy of the data obtained, as well as convenience for the user, are the main characteristics defining the quality of a motion capture system. Among the existing systems for 3D data acquisition, based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems have a set of advantages such as high acquisition speed and potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capturing process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in synchronized mode. It includes from 2 to 4 technical vision cameras for capturing video sequences of object motion. The original camera calibration and external orientation procedures provide the basis for high accuracy of 3D measurements. A set of algorithms, both for detecting, identifying and tracking similar targets and for marker-less object motion capture, has been developed and tested. The results of the algorithms' evaluation show high robustness and reliability for various motion analysis tasks in technical and biomechanics applications.

  9. Object selection costs in visual working memory: A diffusion model analysis of the focus of attention.

    Sewell, David K; Lilburn, Simon D; Smith, Philip L

    2016-11-01

    A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can occur. The need to orient the focus of attention implies that single-object accounts typically predict response time costs associated with object selection even when working memory is not full (i.e., memory load is less than 4 items). For other theories that assume storage of multiple items in the focus of attention, predictions depend on specific assumptions about the way resources are allocated among items held in the focus, and how this affects the time course of retrieval of items from the focus. These broad theoretical accounts have been difficult to distinguish because conventional analyses fail to separate components of empirical response times related to decision-making from components related to selection and retrieval processes associated with accessing information in working memory. To better distinguish these response time components from one another, we analyze data from a probed visual working memory task using extensions of the diffusion decision model. Analysis of model parameters revealed that increases in memory load resulted in (a) reductions in the quality of the underlying stimulus representations in a manner consistent with a sample size model of visual working memory capacity and (b) systematic increases in the time needed to selectively access a probed representation in memory. The results are consistent with single-object theories of the focus of attention. The results are also consistent with a subset of theories that assume a multiobject focus of attention in which resource allocation diminishes both the quality and accessibility of the underlying representations.
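    The modeling logic described here, lower representation quality acting on drift rate and slower selection acting on non-decision time, can be illustrated with a bare-bones diffusion simulator. The sketch below is a generic Euler-type simulation with invented parameter values; it is not the authors' fitting procedure or parameterization.

```python
import numpy as np

def simulate_ddm(n_trials, drift, boundary, ter, noise=1.0, dt=0.001, seed=0):
    """Euler simulation of a simple diffusion decision model: evidence starts
    midway between boundaries 0 and `boundary`, accumulates with rate `drift`,
    and the response time is the first-passage time plus non-decision time."""
    rng = np.random.default_rng(seed)
    rts, upper = [], []
    for _ in range(n_trials):
        x, t = boundary / 2.0, 0.0
        while 0.0 < x < boundary:
            x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts.append(t + ter)
        upper.append(x >= boundary)          # True = upper (correct) boundary
    return np.array(rts), np.array(upper)

# illustrate the reported pattern: higher load -> lower drift (poorer
# representation quality) and longer non-decision time (slower selection)
for load, drift, ter in [(1, 2.5, 0.30), (4, 1.2, 0.38)]:
    rt, correct = simulate_ddm(500, drift, boundary=1.0, ter=ter)
    print(f"load {load}: mean RT {rt.mean():.3f} s, accuracy {correct.mean():.2f}")
```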

  10. Analysis of Lung Tumor Motion in a Large Sample: Patterns and Factors Influencing Precise Delineation of Internal Target Volume

    Knybel, Lukas [Department of Oncology, University Hospital Ostrava, Ostrava (Czech Republic); VŠB-Technical University of Ostrava, Ostrava (Czech Republic); Cvek, Jakub, E-mail: Jakub.cvek@fno.cz [Department of Oncology, University Hospital Ostrava, Ostrava (Czech Republic); Molenda, Lukas; Stieberova, Natalie; Feltl, David [Department of Oncology, University Hospital Ostrava, Ostrava (Czech Republic)

    2016-11-15

    Purpose/Objective: To evaluate lung tumor motion during respiration and to describe factors affecting the range and variability of motion in patients treated with stereotactic ablative radiation therapy. Methods and Materials: Log file analysis from online respiratory tumor tracking was performed in 145 patients. Geometric tumor location in the lungs, tumor volume and origin (primary or metastatic), sex, and tumor motion amplitudes in the superior-inferior (SI), latero-lateral (LL), and anterior-posterior (AP) directions were recorded. Tumor motion variability during treatment was described using intrafraction/interfraction amplitude variability and tumor motion baseline changes. Tumor movement dependent on the tumor volume, position and origin, and sex were evaluated using statistical regression and correlation analysis. Results: After analysis of >500 hours of data, the highest rates of motion amplitudes, intrafraction/interfraction variation, and tumor baseline changes were in the SI direction (6.0 ± 2.2 mm, 2.2 ± 1.8 mm, 1.1 ± 0.9 mm, and −0.1 ± 2.6 mm). The mean motion amplitudes in the lower/upper geometric halves of the lungs were significantly different (P<.001). Motion amplitudes >15 mm were observed only in the lower geometric quarter of the lungs. Higher tumor motion amplitudes generated higher intrafraction variations (R=.86, P<.001). Interfraction variations and baseline changes >3 mm indicated tumors contacting mediastinal structures or parietal pleura. On univariate analysis, neither sex nor tumor origin (primary vs metastatic) was an independent predictive factor of different movement patterns. Metastatic lesions in women, but not men, showed significantly higher mean amplitudes (P=.03) and variability (primary, 2.7 mm; metastatic, 4.9 mm; P=.002) than primary tumors. Conclusion: Online tracking showed significant irregularities in lung tumor movement during respiration. Motion amplitude was significantly lower in upper lobe

  11. Analysis of Lung Tumor Motion in a Large Sample: Patterns and Factors Influencing Precise Delineation of Internal Target Volume

    Knybel, Lukas; Cvek, Jakub; Molenda, Lukas; Stieberova, Natalie; Feltl, David

    2016-01-01

    Purpose/Objective: To evaluate lung tumor motion during respiration and to describe factors affecting the range and variability of motion in patients treated with stereotactic ablative radiation therapy. Methods and Materials: Log file analysis from online respiratory tumor tracking was performed in 145 patients. Geometric tumor location in the lungs, tumor volume and origin (primary or metastatic), sex, and tumor motion amplitudes in the superior-inferior (SI), latero-lateral (LL), and anterior-posterior (AP) directions were recorded. Tumor motion variability during treatment was described using intrafraction/interfraction amplitude variability and tumor motion baseline changes. Tumor movement dependent on the tumor volume, position and origin, and sex were evaluated using statistical regression and correlation analysis. Results: After analysis of >500 hours of data, the highest rates of motion amplitudes, intrafraction/interfraction variation, and tumor baseline changes were in the SI direction (6.0 ± 2.2 mm, 2.2 ± 1.8 mm, 1.1 ± 0.9 mm, and −0.1 ± 2.6 mm). The mean motion amplitudes in the lower/upper geometric halves of the lungs were significantly different (P<.001). Motion amplitudes >15 mm were observed only in the lower geometric quarter of the lungs. Higher tumor motion amplitudes generated higher intrafraction variations (R=.86, P<.001). Interfraction variations and baseline changes >3 mm indicated tumors contacting mediastinal structures or parietal pleura. On univariate analysis, neither sex nor tumor origin (primary vs metastatic) was an independent predictive factor of different movement patterns. Metastatic lesions in women, but not men, showed significantly higher mean amplitudes (P=.03) and variability (primary, 2.7 mm; metastatic, 4.9 mm; P=.002) than primary tumors. Conclusion: Online tracking showed significant irregularities in lung tumor movement during respiration. Motion amplitude was significantly lower in upper lobe tumors; higher interfraction amplitude variability indicated tumors in contact

  12. Simulation of multicomponent light source for optical-electronic system of color analysis objects

    Peretiagin, Vladimir S.; Alekhin, Artem A.; Korotaev, Valery V.

    2016-04-01

    The development of lighting technology has made it possible to use LEDs in specialized devices for outdoor, industrial (decorative and accent) and domestic lighting. In addition, LEDs and devices based on them are widely used for solving particular problems; for example, LED devices are widely used for illuminating vegetables and fruit (for sorting or growing), textile products (for quality control), minerals (for sorting), etc. The reasons for the active introduction of LED technology into different systems, including optical-electronic devices and systems, are the wide choice of emission colors and LED structures, which define the spatial, power, thermal and other parameters. Furthermore, multi-element, color lighting devices with adjustable illumination properties can be designed and implemented using LEDs. However, LED-based devices require additional care when a particular energy or color distribution must be provided over the entire work area (the area of analysis or observation) or over the surface of the object. This paper proposes a method for theoretical modeling of such lighting devices. The authors present models of an RGB multicomponent light source applied to an optical-electronic system for the color analysis of mineral objects. The possibility of producing energy- and color-uniform, homogeneous illumination of the work area for this system is demonstrated. The authors also show how the parameters and characteristics of the optical radiation receiver of the optical-electronic system affect the energy, spatial, spectral and colorimetric properties of a multicomponent light source.

  13. GRAIN-SIZE MEASUREMENTS OF FLUVIAL GRAVEL BARS USING OBJECT-BASED IMAGE ANALYSIS

    Pedro Castro

    2018-01-01

    Traditional techniques for classifying the average grain size in gravel bars require manual measurement of each grain diameter. Aiming at higher productivity, more efficient methods have been developed by applying remote sensing techniques and digital image processing. This research proposes an object-based image analysis methodology to classify gravel bars in fluvial channels. First, the study evaluates the performance of the multiresolution segmentation algorithm (available in the eCognition Developer software) in performing shape recognition. A linear regression model was applied to assess the correlation between the reference delineation of the gravels and the gravels recognized by the segmentation algorithm. Furthermore, the supervised classification was validated by comparing the results with field data using the t-statistic test and the kappa index. Afterwards, the grain size distribution in gravel bars along the upper Bananeiras River, Brazil, was mapped. The multiresolution segmentation results did not prove to be consistent across all the samples. Nonetheless, the P01 sample showed an R2 = 0.82 for the diameter estimation and R2 = 0.45 for the recognition of the elliptical fit. The t-statistic showed no significant difference between the efficiencies of the grain size classifications from the field survey data and from the object-based supervised classification (t = 2.133 at a significance level of 0.05). However, the kappa index was 0.54. The analysis of both the segmentation and classification results did not prove to be replicable.
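
    The record validates the object-based classification against field measurements using R2, a t-test and the kappa index. As a hedged illustration only (the confusion matrix below is hypothetical, not taken from the study), Cohen's kappa can be computed from a class-agreement table as follows:

```python
# Minimal sketch (not from the record): Cohen's kappa for agreement between
# field-surveyed and object-based grain-size classes. Counts are hypothetical.
import numpy as np

def cohens_kappa(confusion):
    """Kappa = (p_o - p_e) / (1 - p_e) for a square confusion matrix."""
    confusion = np.asarray(confusion, dtype=float)
    total = confusion.sum()
    p_observed = np.trace(confusion) / total
    # Expected agreement from the marginal (row/column) totals.
    p_expected = (confusion.sum(axis=0) * confusion.sum(axis=1)).sum() / total**2
    return (p_observed - p_expected) / (1.0 - p_expected)

# Hypothetical 3-class confusion matrix (rows: field survey, cols: OBIA result).
example = [[30, 5, 2],
           [6, 25, 4],
           [3, 5, 20]]
print(f"kappa = {cohens_kappa(example):.2f}")
```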

  14. Transferability of Object-Oriented Image Analysis Methods for Slum Identification

    Alfred Stein

    2013-08-01

    Updated spatial information on the dynamics of slums can be helpful for measuring and evaluating the progress of policies. Earlier studies have shown that semi-automatic detection of slums using remote sensing can be challenging, considering the large variability in their definition and appearance. In this study, we explored the potential of an object-oriented image analysis (OOA) method to detect slums, using very high resolution (VHR) imagery. This method integrated expert knowledge in the form of a local slum ontology. A set of image-based parameters was identified that was used for differentiating slums from non-slum areas in an OOA environment. The method was implemented on three subsets of the city of Ahmedabad, India. Results show that textural features such as entropy and contrast derived from a grey level co-occurrence matrix (GLCM), together with the size of image segments, are stable parameters for the classification of built-up areas and the identification of slums. Relations with already classified slum objects, in terms of being enclosed by slums and the relative border shared with slums, were used to refine the classification. The analysis of the three different subsets showed final accuracies ranging from 47% to 68%. We conclude that our method produces useful results, as it allows location-specific adaptation, whereas generically applicable rulesets for slums are still to be developed.

  15. Multi-objective optimization of GPU3 Stirling engine using third order analysis

    Toghyani, Somayeh; Kasaeian, Alibakhsh; Hashemabadi, Seyyed Hasan; Salimi, Morteza

    2014-01-01

    Highlights: • A third-order analysis is carried out for the optimization of a Stirling engine. • The triple optimization is performed on the GPU3 Stirling engine. • A multi-objective optimization is carried out for the Stirling engine. • The results are compared with a previous experimental work to check the improvement of the model. • The TOPSIS, fuzzy, and LINMAP decision-making methods are compared with each other. - Abstract: The Stirling engine is an external combustion engine operating on a closed cycle that can use any external heat source to generate mechanical power. These engines are good candidates for power generation systems because their theoretical efficiency can approach the Carnot efficiency more closely than that of other reciprocating thermal engines. Hence, many studies have been conducted on Stirling engines, and third-order thermodynamic analysis is one of them. In this study, a multi-objective optimization with four decision variables, namely the heat source temperature, stroke, mean effective pressure, and engine frequency, was applied in order to increase the efficiency and output power and to reduce the pressure drop. Three decision-making procedures (TOPSIS, fuzzy, and LINMAP) were applied to select among the optimal solutions. Finally, the applied methods were compared with the results of a previous experimental work, and good agreement was observed.

  16. Objective Oriented Design of System Thermal Hydraulic Analysis Program and Verification of Feasibility

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes such as RELAP5, TRAC and CATHARE have been developed in the Fortran language over the past few decades. Refactoring of these conventional codes has also been performed to improve code readability and maintainability; the TRACE, RELAP5-3D and MARS codes are examples of these activities. The codes were redesigned to have modular structures utilizing Fortran 90 features. However, the programming paradigm in software technology has shifted toward object-oriented programming (OOP), which is based on several techniques including encapsulation, modularity, polymorphism, and inheritance. OOP was not commonly used in mainstream software application development until the early 1990s, but many modern programming languages now support it. Although recent Fortran standards also support OOP, they are considered to offer limited functionality compared with modern software features. In this work, an object-oriented program for a system safety analysis code has been attempted utilizing modern C language features. The advantages of OOP are discussed after verification of the design feasibility.

  17. ANALYSIS AND PARTICULARITIES OF EXTERNAL FACTORS IMPACT ON ECONOMICAL RESULTS OF STRATEGIC OBJECTS PLANNING DEVELOPMENT

    V. V. Gromov

    2015-01-01

    The relevance of the scientific problem described in the article lies in determining how the economic performance and effectiveness of the sectoral components of the service sector change under the effects of environmental factors, which allows them to reach their planned long-term economic performance, and in supporting management decision-making on structural and organizational changes and on the implementation of investment projects for the renovation and modernization of fixed capital and the creation of technological, process and product innovations, which is directly connected with the analysis of the impact of such external factors as economic, socio-cultural, legal, political and innovation-related factors. The structure of the article is based on presenting the impact of specific groups of environmental factors on the competitiveness and economic performance of the sectoral components of the service sector using the technology of strategic planning, on a logical sequence of presentation of the material, and on establishing the causal relationships and interactions between the factors and elements of the problems and objects studied. The particular features of the impact of external factors on the effectiveness of macro-economic entities and sectoral components of the service sector lie in the adequacy of the measures and strategies adopted to counter their negative impact on the economic development of the objects of strategic development. The features of status changes and of the influence of internal factors on local and sectoral socio-economic systems dictate the need to take into account the available resources and the level of efficiency in the use of labor resources, fixed and current assets. The author's contribution to the scientific treatment of this topic consists in carrying out a comprehensive analysis of the impact of the main groups of external factors on the economic activities of service sector development, and in identifying the features of the impact of internal factors on the economic and innovative development of strategic planning objects.

  18. Laser-induced breakdown spectroscopy (LIBS) analysis of calcium ions dissolved in water using filter paper substrates: an ideal internal standard for precision improvement.

    Choi, Daewoong; Gong, Yongdeuk; Nam, Sang-Ho; Han, Song-Hee; Yoo, Jonghyun; Lee, Yonghoon

    2014-01-01

    We report an approach for selecting an internal standard to improve the precision of laser-induced breakdown spectroscopy (LIBS) analysis for determining the calcium (Ca) concentration in water. The dissolved Ca(2+) ions were pre-concentrated on filter paper by evaporating the water. The filter paper was dried and analyzed using LIBS. By adding strontium chloride to the sample solutions and using the Sr II line at 407.771 nm for intensity normalization of the Ca II lines at 393.366 or 396.847 nm, the analysis precision could be significantly improved. The Ca II and Sr II line intensities were mapped across the filter paper, and they showed a strong positive shot-to-shot correlation with the same spatial distribution on the filter paper surface. We applied this analysis approach to the measurement of Ca(2+) in tap, bottled, and ground water samples. The Ca(2+) concentrations determined using LIBS are in good agreement with those obtained from flame atomic absorption spectrometry. Finally, we suggest a homologous relation among the strongest emission lines of period 4 and 5 elements in groups IA and IIA based on their similar electronic structures. Our results indicate that LIBS can be effectively applied to liquid analysis at the sub-parts-per-million level with high precision, using simple drying of liquid solutions on filter paper and internal standard elements whose valence electronic structure is similar to that of the analytes of interest.
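
    As a rough illustration of the internal-standard idea described above (all intensities, concentrations and calibration values below are hypothetical, not data from the study), the Ca II signal can be normalized by the Sr II signal before calibration:

```python
# Minimal sketch (assumptions, not the authors' code): shot-to-shot
# normalization of a Ca II line intensity by a Sr II internal-standard line,
# followed by a linear calibration. All numbers below are hypothetical.
import numpy as np

# Hypothetical per-shot peak intensities (arbitrary units) for one sample.
ca_II_393 = np.array([1520., 980., 1310., 1750., 1120.])   # analyte line
sr_II_407 = np.array([760.,  500., 655.,  880.,  565.])    # internal standard

# Ratioing removes much of the shot-to-shot fluctuation (ablated mass, laser energy).
ratio = ca_II_393 / sr_II_407
print("RSD raw  : %.1f %%" % (100 * ca_II_393.std(ddof=1) / ca_II_393.mean()))
print("RSD ratio: %.1f %%" % (100 * ratio.std(ddof=1) / ratio.mean()))

# Linear calibration built from standards of known Ca concentration (ppm).
std_conc  = np.array([0.5, 1.0, 2.0, 5.0, 10.0])           # hypothetical standards
std_ratio = np.array([0.21, 0.40, 0.83, 2.05, 4.10])       # their mean Ca/Sr ratios
slope, intercept = np.polyfit(std_conc, std_ratio, 1)
unknown_conc = (ratio.mean() - intercept) / slope
print("Estimated Ca concentration: %.2f ppm" % unknown_conc)
```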

  19. High-precision drop shape analysis on inclining flat surfaces: introduction and comparison of this special method with commercial contact angle analysis.

    Schmitt, Michael; Heib, Florian

    2013-10-07

    Drop shape analysis is one of the most important and frequently used methods to characterise surfaces in the scientific and industrial communities. An especially large number of studies that use contact angle measurements to analyse surfaces are characterised by incorrect or misdirected conclusions, such as the determination of surface energies from poorly performed contact angle measurements. In particular, the characterisation of surfaces that leads to correlations between the contact angle and other effects must be critically validated for some publications. A large number of works exist concerning the theoretical and thermodynamic aspects of two- and three-phase boundaries. The linkage between theory and experiment is generally made by axisymmetric drop shape analysis, that is, simulation of the theoretical drop profile by numerical integration onto a number of points of the drop meniscus (approximately 20). These methods work very well for axisymmetric profiles such as those obtained by pendant drop measurements, but in the case of a sessile drop on real surfaces, additional unknown and poorly understood surface-dependent effects must be considered. We present a special experimental and practical investigation as another way to transition from experiment to theory. This procedure was developed to be especially sensitive to small variations in the dependence of the dynamic contact angle on the surface; as a result, it allows the properties of the surface to be monitored with higher precision and sensitivity. In this context, water drops on a (111) silicon wafer are dynamically measured by video recording while inclining the surface, which results in a sequence of non-axisymmetric drops. The drop profiles are analysed by commercial software and by the developed and presented high-precision drop shape analysis. In addition to the enhanced sensitivity for contact angle determination, this analysis technique, in

  20. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  1. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson's Disease.

    Memedi, Mevludin; Sadikov, Aleksander; Groznik, Vida; Žabkar, Jure; Možina, Martin; Bergquist, Filip; Johansson, Anders; Haubenberger, Dietrich; Nyholm, Dag

    2015-09-17

    test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  2. Hyper-Fractal Analysis: A visual tool for estimating the fractal dimension of 4D objects

    Grossu, I. V.; Grossu, I.; Felea, D.; Besliu, C.; Jipa, Al.; Esanu, T.; Bordeianu, C. C.; Stan, E.

    2013-04-01

    This work presents a new version of a Visual Basic 6.0 application for estimating the fractal dimension of images and 3D objects (Grossu et al. (2010) [1]). The program was extended for working with four-dimensional objects stored in comma-separated values files. This might be of interest in biomedicine, for analyzing the evolution in time of three-dimensional images.
    New version program summary
    Program title: Hyper-Fractal Analysis (Fractal Analysis v03)
    Catalogue identifier: AEEG_v3_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEG_v3_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 745761
    No. of bytes in distributed program, including test data, etc.: 12544491
    Distribution format: tar.gz
    Programming language: MS Visual Basic 6.0
    Computer: PC
    Operating system: MS Windows 98 or later
    RAM: 100M
    Classification: 14
    Catalogue identifier of previous version: AEEG_v2_0
    Journal reference of previous version: Comput. Phys. Comm. 181 (2010) 831-832
    Does the new version supersede the previous version?: Yes
    Nature of problem: Estimating the fractal dimension of 4D images.
    Solution method: Optimized implementation of the 4D box-counting algorithm.
    Reasons for new version: Inspired by existing applications of 3D fractals in biomedicine [3], we extended the optimized version of the box-counting algorithm [1, 2] to the four-dimensional case. This might be of interest in analyzing the evolution in time of 3D images. The box-counting algorithm was extended in order to support 4D objects, stored in comma-separated values files. A new form was added for generating 2D, 3D, and 4D test data. The application was tested on 4D objects with known dimension, e.g. the Sierpinski hypertetrahedron gasket, Df=ln(5)/ln(2) (Fig. 1). The algorithm could be extended, with minimum effort, to
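
    The program summary above centres on a 4D box-counting algorithm. A minimal, language-independent sketch of the idea (in Python rather than the authors' Visual Basic, with a synthetic point set standing in for real image data) is:

```python
# Minimal sketch of the box-counting idea behind the program (not the authors'
# Visual Basic code): count occupied boxes at several box sizes and estimate
# the dimension D as the slope of log(count) versus log(1/size).
import numpy as np

def box_counting_dimension(points, sizes):
    """points: (n, d) array of coordinates in [0, 1)^d; sizes: box edge lengths."""
    points = np.asarray(points, dtype=float)
    counts = []
    for s in sizes:
        # Assign each point to a box index and count distinct occupied boxes.
        boxes = np.floor(points / s).astype(int)
        counts.append(len({tuple(b) for b in boxes}))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Self-test on a filled unit square embedded in 4D (expected dimension close to 2).
rng = np.random.default_rng(0)
pts = np.zeros((20000, 4))
pts[:, :2] = rng.random((20000, 2))
print("estimated D =", round(box_counting_dimension(pts, [0.5, 0.25, 0.125, 0.0625]), 2))
```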

  3. Restructuring of burnup sensitivity analysis code system by using an object-oriented design approach

    Kenji, Yokoyama; Makoto, Ishikawa; Masahiro, Tatsumi; Hideaki, Hyoudou

    2005-01-01

    A new burnup sensitivity analysis code system was developed with the help of object-oriented techniques and written in the Python language. It was confirmed that these are powerful for supporting complex numerical calculation procedures such as reactor burnup sensitivity analysis. The new burnup sensitivity analysis code system PSAGEP was restructured from a complicated old code system and reborn as a user-friendly code system which can calculate the sensitivity coefficients of nuclear characteristics considering multicycle burnup effects, based on generalized perturbation theory (GPT). A new encapsulation framework for conventional codes written in Fortran was developed. This framework supported the restructuring of the software architecture of the old code system by hiding implementation details and allowed users of the new code system to easily calculate the burnup sensitivity coefficients. The framework can be applied to other development projects, since it is carefully designed to be independent of PSAGEP. Numerical results for the burnup sensitivity coefficients of a typical fast breeder reactor are given with their GPT-based components, and the multicycle burnup effects on the sensitivity coefficients are discussed. (authors)

  4. Infrared spectroscopy with multivariate analysis to interrogate endometrial tissue: a novel and objective diagnostic approach.

    Taylor, S E; Cheung, K T; Patel, I I; Trevisan, J; Stringfellow, H F; Ashton, K M; Wood, N J; Keating, P J; Martin-Hirsch, P L; Martin, F L

    2011-03-01

    Endometrial cancer is the most common gynaecological malignancy in the United Kingdom. Diagnosis currently involves subjective expert interpretation of highly processed tissue, primarily using microscopy. Previous work has shown that infrared (IR) spectroscopy can be used to distinguish between benign and malignant cells in a variety of tissue types. Tissue was obtained from 76 patients undergoing hysterectomy, 36 of whom had endometrial cancer. Slivers of endometrial tissue (tumour and, if present, tumour-adjacent tissue) were dissected and placed in fixative solution. Before analysis, tissues were thinly sliced, washed, mounted on low-E slides and desiccated; 10 IR spectra were obtained per slice by attenuated total reflection Fourier-transform IR (ATR-FTIR) spectroscopy. The derived data were subjected to principal component analysis followed by linear discriminant analysis. After the spectroscopic analyses, tissue sections were haematoxylin and eosin-stained to provide histological verification. Using this approach, it is possible to distinguish benign from malignant endometrial tissue, and various subtypes of both. Cluster vector plots of benign (verified post-spectroscopy to be free of identifiable pathology) vs malignant tissue indicate the importance of the lipid and secondary protein structure (Amide I and Amide II) regions of the spectrum. These findings point towards the possibility of a simple objective test for endometrial cancer using ATR-FTIR spectroscopy. This would facilitate earlier diagnosis and so reduce the morbidity and mortality associated with this disease.
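
    The analysis pipeline described above (principal component analysis followed by linear discriminant analysis on IR spectra) can be sketched as follows; the synthetic spectra, class shift and component count are assumptions for illustration, not the study's data or settings:

```python
# Minimal sketch (assumed workflow, not the authors' code): principal component
# analysis followed by linear discriminant analysis on spectra, evaluated by
# cross-validation. The synthetic "spectra" below are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_per_class, n_wavenumbers = 40, 200
benign = rng.normal(0.0, 1.0, (n_per_class, n_wavenumbers))
malignant = rng.normal(0.3, 1.0, (n_per_class, n_wavenumbers))  # shifted absorbance
X = np.vstack([benign, malignant])
y = np.array([0] * n_per_class + [1] * n_per_class)

# Reduce the correlated spectral variables before the discriminant step.
model = make_pipeline(PCA(n_components=10), LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print("cross-validated accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```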

  5. Qualitative content analysis experiences with objective structured clinical examination among Korean nursing students.

    Jo, Kae-Hwa; An, Gyeong-Ju

    2014-04-01

    The aim of this study was to explore the experiences of Korean nursing students with an objective structured clinical examination (OSCE) assessment regarding the 12 cranial nerves, using qualitative content analysis. Qualitative content analysis was used to explore the subjective experiences of baccalaureate nursing students after taking the OSCE. Convenience sampling was used to select 64 fourth-year nursing students who were interested in taking the OSCE. The participants learned content about the 12 cranial nerve assessment through lectures, demonstrations, and videos before the OSCE. The OSCE consisted of examinations at each of three stations over 2 days. The participants wrote about their experiences anonymously on sheets of paper in an adjacent room immediately after the OSCE. The submitted materials were analyzed via qualitative content analysis. The collected materials were classified into two themes and seven categories. One theme was "awareness of inner capabilities", which included three categories: "inner motivation", "inner confidence", and "creativity". The other theme was "barriers to nursing performance", which included four categories: "deficiency of knowledge", "deficiency of communication skill", "deficiency of attitude toward comfort", and "deficiency of repetitive practice". This study revealed that the participants simultaneously experienced the potential and the deficiencies of their nursing competency after an OSCE session on the cranial nerves. The OSCE also provided an opportunity for nursing students to appreciate nursing care in a holistic manner, contrary to the concern that OSCEs undermine holism. © 2013 The Authors. Japan Journal of Nursing Science © 2013 Japan Academy of Nursing Science.

  6. Multi-objective optimization and grey relational analysis on configurations of organic Rankine cycle

    Wang, Y.Z.; Zhao, J.; Wang, Y.; An, Q.S.

    2017-01-01

    Highlights: • The Pareto frontier is an effective way to make a comprehensive comparison of ORCs. • The comprehensive energy and economic performance of the basic ORC is the best. • R141b shows the best comprehensive energy and economic performance. - Abstract: Concerning the comprehensive performance of the organic Rankine cycle (ORC), comparisons and optimizations of 3 different ORC configurations (basic, regenerative and extractive ORCs) are investigated in this paper. Medium-temperature geothermal water is used for comparing the influence of configurations, working fluids and operating parameters on different evaluation criteria. Different evaluation and optimization methods, such as exergoeconomic analysis, bi-objective optimization and grey relational analysis, are adopted to identify the configuration with the best comprehensive performance. The results reveal that the basic ORC performs the best among these 3 ORCs in terms of comprehensive thermodynamic and economic performance when using R245fa and driven by geothermal water at 150 °C. Furthermore, R141b shows the best comprehensive performance among 14 working fluids based on the Pareto frontier solutions, without considering safety factors. Meanwhile, when all evaluation criteria are regarded as equally important in the grey relational analysis, R141b remains the working fluid with the best comprehensive performance.
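
    As a hedged sketch of the grey relational analysis step mentioned above (the alternatives, criteria values, weights and the distinguishing coefficient zeta = 0.5 are all illustrative assumptions), the grey relational grade of each candidate can be computed as:

```python
# Minimal sketch of grey relational analysis for ranking alternatives
# (working fluids / configurations). Data and weights are hypothetical.
import numpy as np

def grey_relational_grades(data, benefit, zeta=0.5, weights=None):
    """data: (alternatives, criteria); benefit[j] is True if larger-is-better."""
    data = np.asarray(data, dtype=float)
    norm = np.empty_like(data)
    for j in range(data.shape[1]):
        col = data[:, j]
        if benefit[j]:                       # larger-is-better criterion
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:                                # smaller-is-better criterion
            norm[:, j] = (col.max() - col) / (col.max() - col.min())
    delta = np.abs(1.0 - norm)               # deviation from the ideal sequence (all ones)
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    if weights is None:
        weights = np.full(data.shape[1], 1.0 / data.shape[1])
    return coeff @ weights

# Hypothetical alternatives evaluated on exergy efficiency (max) and cost (min).
table = [[0.52, 120.0],
         [0.47,  95.0],
         [0.55, 140.0]]
print(np.round(grey_relational_grades(table, benefit=[True, False]), 3))
```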

  7. Gaming Change: A Many-objective Analysis of Water Supply Portfolios under Uncertainty

    Reed, P. M.; Kasprzyk, J.; Characklis, G.; Kirsch, B.

    2008-12-01

    This study explores the uncertainty and tradeoffs associated with up to six conflicting water supply portfolio planning objectives. A ten-year Monte Carlo simulation model is used to evaluate water supply portfolios blending permanent rights, adaptive options contracts, and spot leases for a single city in the Lower Rio Grande Valley. Historical records of reservoir mass balance, lease pricing, and demand serve as the source data for the Monte Carlo simulation. Portfolio planning decisions include the initial volume and annual increases of permanent rights, thresholds for an adaptive options contract, and anticipatory decision rules for purchasing leases and exercising options. Our work distinguishes three cases: (1) permanent rights as the sole source of supply, (2) permanent rights and adaptive options, and (3) a combination of permanent rights, adaptive options, and leases. The problems have been formulated such that cases 1 and 2 are sub-spaces of the six objective formulation used for case 3. Our solution sets provide the tradeoff surfaces between portfolios' expected values for cost, cost variability, reliability, frequency of purchasing permanent rights increases, frequency of using leases, and dropped (or unused) transfers of water. The tradeoff surfaces for the three cases show that options and leases have a dramatic impact on the marginal costs associated with improving the efficiency and reliability of urban water supplies. Moreover, our many-objective analysis permits the discovery of a broad range of high quality portfolio strategies. We differentiate the value of adaptive options versus leases by testing a representative subset of optimal portfolios' abilities to effectively address regional increases in demand during drought periods. These results provide insights into the tradeoffs inherent to a more flexible, portfolio-style approach to urban water resources management, an approach that should become increasingly attractive in an environment of

  8. A REGION-BASED MULTI-SCALE APPROACH FOR OBJECT-BASED IMAGE ANALYSIS

    T. Kavzoglu

    2016-06-01

    Within the last two decades, object-based image analysis (OBIA), which considers objects (i.e. groups of pixels) instead of individual pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter, which depends on image resolution, image object size and the characteristics of the study area, is crucially important for increasing the classification accuracy. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments, and an equal number of pixels was randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). Comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
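
    A minimal sketch of the LV-RoC idea used for scale selection (this is not the ESP-2 tool itself; the local-variance values and the peak-picking rule below are illustrative assumptions) is:

```python
# Minimal sketch (not the ESP-2 tool): given mean local variance (LV) recorded
# at increasing scale parameters, compute the rate of change (RoC) and flag
# local RoC peaks as candidate segmentation scales. LV values are hypothetical.
import numpy as np

scales = np.arange(10, 101, 10)
lv = np.array([14.1, 18.9, 22.4, 24.1, 27.9, 28.6, 29.0, 31.8, 32.1, 32.3])

# RoC(l) = 100 * (LV_l - LV_{l-1}) / LV_{l-1}
roc = 100.0 * np.diff(lv) / lv[:-1]

# A candidate scale is one whose RoC is higher than both neighbours (a local peak).
candidates = [scales[i + 1] for i in range(1, len(roc) - 1)
              if roc[i] > roc[i - 1] and roc[i] > roc[i + 1]]
print("candidate scale parameters:", candidates)
```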

  9. MULTIPLE OBJECTS

    A. A. Bosov

    2015-04-01

    Purpose. The development of complicated production and management processes, information systems, computer science, applied objects of systems theory and others requires the improvement of mathematical methods and new approaches to the study of applied systems. The variety and diversity of subject systems make it necessary to develop a model that generalizes classical sets and their development, sets of sets. Multiple objects, unlike sets, are constructed from multiple structures and are represented by structure and content. The aim of the work is the analysis of the multiple structures generating multiple objects and the further development of operations on these objects in applied systems. Methodology. To achieve the objectives of the research, the structure of a multiple object is represented as a constructive triple consisting of a carrier, a signature and axiomatics. A multiple object is determined by its structure and content, and is represented by a hybrid superposition composed of sets, multisets, ordered sets (lists) and heterogeneous sets (sequences, tuples). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and give the rules of internal and external operations on the objects of implementation. We introduce a relation of arbitrary order over multiple objects, and we define descriptions of functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of the multiple structures that generate multiple objects. Practical value. The transition from abstract to subject multiple structures requires the transformation of the system and of the multiple objects. The transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites) and particularization (goals). The proposed approach describes systems on the basis of hybrid sets

  10. Systematic analysis of the heat exchanger arrangement problem using multi-objective genetic optimization

    Daróczy, László; Janiga, Gábor; Thévenin, Dominique

    2014-01-01

    A two-dimensional cross-flow tube bank heat exchanger arrangement problem with internal laminar flow is considered in this work. The objective is to optimize the arrangement of tubes and find the most favorable geometries, in order to simultaneously maximize the rate of heat exchange while obtaining a minimum pressure loss. A systematic study was performed involving a large number of simulations. The global optimization method NSGA-II was retained. A fully automatized in-house optimization environment was used to solve the problem, including mesh generation and CFD (computational fluid dynamics) simulations. The optimization was performed in parallel on a Linux cluster with a very good speed-up. The main purpose of this article is to illustrate and analyze a heat exchanger arrangement problem in its most general form and to provide a fundamental understanding of the structure of the Pareto front and optimal geometries. The considered conditions are particularly suited for low-power applications, as found in a growing number of practical systems in an effort toward increasing energy efficiency. For such a detailed analysis with more than 140 000 CFD-based evaluations, a design-of-experiment study involving a response surface would not be sufficient. Instead, all evaluations rely on a direct solution using a CFD solver. - Highlights: • Cross-flow tube bank heat exchanger arrangement problem. • A fully automatized multi-objective optimization based on genetic algorithm. • A systematic study involving a large number of CFD (computational fluid dynamics) simulations
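
    Independently of the NSGA-II machinery used in the study, the core notion of the Pareto front discussed above can be illustrated with a small non-dominated filter; the design names and objective values below are hypothetical:

```python
# Minimal sketch (not the authors' NSGA-II setup): extract the non-dominated
# (Pareto) set from evaluated designs with two objectives, heat transfer rate
# to be maximized and pressure loss to be minimized. Values are hypothetical.

def pareto_front(designs):
    """designs: list of (name, heat_rate, pressure_loss); returns non-dominated ones."""
    front = []
    for name, q, dp in designs:
        dominated = any(
            (q2 >= q and dp2 <= dp) and (q2 > q or dp2 < dp)
            for _, q2, dp2 in designs
        )
        if not dominated:
            front.append((name, q, dp))
    return front

candidates = [("A", 410.0, 95.0), ("B", 395.0, 60.0),
              ("C", 430.0, 150.0), ("D", 380.0, 70.0)]
for name, q, dp in pareto_front(candidates):
    print(f"{name}: Q = {q} W, dP = {dp} Pa")
```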

  11. Energy regulation in China: Objective selection, potential assessment and responsibility sharing by partial frontier analysis

    Xia, X.H.; Chen, Y.B.; Li, J.S.; Tasawar, H.; Alsaedi, A.; Chen, G.Q.

    2014-01-01

    To cope with the excessive growth of energy consumption, the Chinese government has been trying to strengthen the energy regulation system by introducing new initiatives that aim at controlling the total amount of energy consumption. A partial frontier analysis is performed in this paper to make a comparative assessment of the combinations of possible energy conservation objectives, new constraints and regulation strategies. According to the degree of coordination of the existing regulation structure and the optimality of the regulation strategy, four scenarios are constructed, and regional responsibilities are reasonably divided by fully considering the production technology in the economy. The relative importance of output objectives and total-amount control is compared, and the impacts on the regional economy caused by changes in regulation strategy are also evaluated to inform updates of regulation policy. - Highlights: • New initiatives to control the total amount of energy consumption are evaluated. • Twenty-four regulation strategies and four scenarios are designed and compared. • Crucial regions for each sector and regional potential are identified. • The national goals of energy abatement are decomposed into regional responsibilities. • The changes of regulation strategy are evaluated for updating regulation policy

  12. Testing and injury potential analysis of rollovers with narrow object impacts.

    Meyer, Steven E; Forrest, Stephen; Herbst, Brian; Hayden, Joshua; Orton, Tia; Sances, Anthony; Kumaresan, Srirangam

    2004-01-01

    Recent statistics highlight the significant risk of serious and fatal injuries to occupants involved in rollover collisions due to excessive roof crush. The government has reported that in 2002, sport utility vehicle (SUV) rollover-related fatalities increased by 14% to more than 2400 annually, and 61% of all SUV fatalities included rollovers [1]. Rollover crashes rely primarily upon the roof structures to maintain occupant survival space. Frequently these crashes occur off the travel lanes of the roadway and, therefore, can include impacts with various types of narrow objects such as light poles, utility poles and/or trees. A test device and methodology is presented which facilitates dynamic, repeatable rollover impact evaluation of complete vehicle roof structures with such narrow objects. These tests allow for the incorporation of Anthropomorphic Test Dummies (ATDs), which can be instrumented to measure accelerations, forces and moments to evaluate injury potential. High-speed video permits detailed analysis of occupant kinematics and evaluation of injury causation. Criteria such as restraint performance, injury potential, survival space and the effect of roof crush associated with various types of design alternatives, countermeasures and impact circumstances can also be evaluated. In addition to a presentation of the methodology, two representative vehicle crash tests are reported. Results indicated that the reinforced roof structure significantly reduced roof deformation compared to the production roof structure.

  13. Accelerometry-based gait analysis, an additional objective approach to screen subjects at risk for falling.

    Senden, R; Savelberg, H H C M; Grimm, B; Heyligers, I C; Meijer, K

    2012-06-01

    This study investigated whether the Tinetti scale, as a subjective measure of fall risk, is associated with objectively measured gait characteristics. It was studied whether gait parameters differ between groups stratified for fall risk using the Tinetti scale. Moreover, the discriminative power of gait parameters to classify the elderly according to the Tinetti scale was investigated. Gait of 50 elderly with a Tinetti score >24 and 50 elderly with a Tinetti score ≤24 was analyzed using acceleration-based gait analysis. Validated algorithms were used to derive spatio-temporal gait parameters, harmonic ratio, inter-stride amplitude variability and root mean square (RMS) from the accelerometer data. Clear differences in gait were found between the groups. All gait parameters correlated with the Tinetti scale (r-range: 0.20-0.73). Only walking speed, step length and RMS showed moderate to strong correlations and high discriminative power for classifying the elderly according to the Tinetti scale. It is concluded that subtle gait changes that have previously been related to fall risk are not captured by the subjective assessment. It is therefore worthwhile to include objective gait assessment in fall risk screening. Copyright © 2012 Elsevier B.V. All rights reserved.
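
    Two of the accelerometer-derived parameters mentioned above, the root mean square (RMS) of the signal and a simple measure of step-timing variability, can be sketched as follows; the synthetic signal, sampling rate and crude step-detection rule are assumptions for illustration, not the validated algorithms of the study:

```python
# Minimal sketch (assumed definitions): RMS of a trunk acceleration signal and
# step-time variability from detected step instants. The signal is synthetic.
import numpy as np

fs = 100.0                                   # sampling frequency, Hz (assumed)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(2)
accel = 1.2 * np.sin(2 * np.pi * 1.8 * t) + 0.05 * rng.standard_normal(t.size)

# RMS of the (zero-mean) acceleration reflects overall movement intensity.
rms = np.sqrt(np.mean((accel - accel.mean()) ** 2))

# Crude step detection: upward zero crossings of the dominant oscillation.
crossings = np.where((accel[:-1] < 0) & (accel[1:] >= 0))[0]
step_times = np.diff(crossings) / fs
cv_step_time = 100.0 * step_times.std(ddof=1) / step_times.mean()

print(f"RMS = {rms:.2f} (arbitrary units), step-time CV = {cv_step_time:.1f} %")
```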

  14. Scanpath-based analysis of objects conspicuity in context of human vision physiology.

    Augustyniak, Piotr

    2007-01-01

    This paper discusses principal aspects of object conspicuity investigated with the use of an eye tracker and interpreted against the background of human vision physiology. Proper management of object conspicuity is fundamental in several leading-edge applications in the information society, such as advertising, web design, man-machine interfacing and ergonomics. Although some common rules of human perception have been applied in art for centuries, interest in the human perception process is motivated today by the need to capture and maintain the recipient's attention by putting selected messages in front of the others. Our research uses the visual-task methodology and a series of progressively modified natural images. The modified details were described by their size, color and position, while the scanpath-derived gaze points confirmed whether or not they were perceived. The statistical analysis yielded the probability of detail perception and its correlations with the attributes. This probability conforms to the knowledge about retinal anatomy and perception physiology, although we used noninvasive methods only.

  15. Objective and subjective analysis of women's voice with idiopathic Parkinson's disease

    Riviana Rodrigues das Graças

    2012-07-01

    OBJECTIVE: To compare the voice quality of women with idiopathic Parkinson's disease and women without it. METHODS: An evaluation was performed including 19 female patients diagnosed with idiopathic Parkinson's disease, with an average age of 66 years, and 27 women with an average age of 67 years in the control group. The assessment was performed by computer-based acoustic analysis and perceptual evaluation. RESULTS: Parkinson's disease patients presented a moderately rough and unstable voice quality. The parameters of grade, roughness, and instability had higher scores in Parkinson's disease patients, with statistically significant differences. The acoustic measures of jitter and period perturbation quotient (PPQ) differed significantly between groups. CONCLUSIONS: Female individuals with Parkinson's disease showed more vocal alterations than the control group in both the perceptual and acoustic evaluations.
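
    Jitter and the period perturbation quotient (PPQ) mentioned above are defined on the sequence of glottal period lengths. A minimal sketch using textbook-style definitions (the period values are hypothetical, and the exact formulas used by the study's software may differ) is:

```python
# Minimal sketch (textbook-style definitions, not the software used in the
# study): local jitter and the five-point period perturbation quotient (PPQ5)
# from a sequence of glottal period lengths. The periods below are hypothetical.
import numpy as np

periods = np.array([4.95, 5.10, 4.88, 5.22, 4.97, 5.15, 4.90, 5.08])  # ms

def jitter_local(T):
    """Mean absolute difference of consecutive periods, relative to the mean period."""
    return 100.0 * np.mean(np.abs(np.diff(T))) / np.mean(T)

def ppq5(T):
    """Deviation of each period from its local 5-point average, relative to the mean."""
    devs = [abs(T[i] - T[i - 2:i + 3].mean()) for i in range(2, len(T) - 2)]
    return 100.0 * np.mean(devs) / np.mean(T)

print(f"jitter (local) = {jitter_local(periods):.2f} %")
print(f"PPQ5           = {ppq5(periods):.2f} %")
```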

  16. Comprehensive benefit analysis of regional water resources based on multi-objective evaluation

    Chi, Yixia; Xue, Lianqing; Zhang, Hui

    2018-01-01

    The purpose of the comprehensive benefit analysis of water resources is to maximize the comprehensive benefits in the social, economic and ecological-environmental aspects. Addressing the defects of the traditional analytic hierarchy process in the evaluation of water resources, this study proposes a comprehensive benefit evaluation index covering social, economic and environmental benefits, from the perspective of the comprehensive benefit of water resources in the social, economic and environmental systems; the index weights are determined by an improved fuzzy analytic hierarchy process (AHP), the relative index of the comprehensive benefit of water resources is calculated, and the comprehensive benefit of water resources in Xiangshui County is analyzed with a multi-objective evaluation model. Based on the water resources data of Xiangshui County, 20 main comprehensive benefit assessment factors for 5 districts belonging to Xiangshui County were evaluated. The results show that the comprehensive benefit of Xiangshui County was 0.7317, and that the social economy still has room for further development under the current water resources situation.
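
    The final aggregation step described above, combining social, economic and environmental indices with AHP-derived weights into a single comprehensive benefit score, can be sketched as follows; the district names, index values and weights are hypothetical, and the full fuzzy-AHP weighting is not reproduced:

```python
# Minimal sketch (not the paper's full fuzzy-AHP model): a weighted
# multi-objective evaluation in which normalized social, economic and
# environmental indices are combined into one comprehensive benefit score.
import numpy as np

districts = ["District A", "District B", "District C"]
# Columns: social, economic, environmental benefit indices, already scaled to [0, 1].
indices = np.array([[0.72, 0.65, 0.58],
                    [0.61, 0.80, 0.49],
                    [0.55, 0.59, 0.77]])
weights = np.array([0.35, 0.40, 0.25])        # hypothetical AHP-style weights, sum to 1

scores = indices @ weights                    # weighted comprehensive benefit
for name, s in zip(districts, scores):
    print(f"{name}: comprehensive benefit = {s:.3f}")
```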

  17. Analysis of art objects by means of ion beam induced luminescence

    Quaranta, A; Dran, J C; Salomon, J; Pivin, J C; Vomiero, A; Tonezzer, M; Maggioni, G; Carturan, S; Mea, G Della

    2006-01-01

    The impact of energetic ions on solid samples gives rise to the emission of visible light owing to the electronic excitation of intrinsic defects or extrinsic impurities. The intensity and position of the emission features provide information on the nature of the luminescence centers and on their chemical environments. This makes ion beam induced luminescence (IBIL) a useful complement to other ion beam analyses, like PIXE, in the cultural heritage field for characterizing the composition and the provenance of art objects. In the present paper, IBIL measurements have been performed on inorganic pigments to underline the complementary role played by IBIL in the analysis of artistic works. Some blue and red pigments are presented as a case study.

  18. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and easily extensible. A Python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  19. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Norbert Pfeifer

    2008-08-01

    Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (>20 echoes/m2) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs but excludes grassland and herbage. In the applied procedure, FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data from three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original

  20. Infant search and object permanence: a meta-analysis of the A-not-B error.

    Wellman, H M; Cross, D; Bartsch, K

    1987-01-01

    Research on Piaget's stage 4 object concept has failed to reveal a clear or consistent pattern of results. Piaget found that 8-12-month-old infants would make perseverative errors; his explanation for this phenomenon was that the infant's concept of the object was contextually dependent on his or her actions. Some studies designed to test Piaget's explanation have replicated Piaget's basic finding, yet many have found no preference for the A location or the B location or an actual preference for the B location. More recently, researchers have attempted to uncover the causes for these results concerning the A-not-B error. Again, however, different studies have yielded different results, and qualitative reviews have failed to yield a consistent explanation for the results of the individual studies. This state of affairs suggests that the phenomenon may simply be too complex to be captured by individual studies varying 1 factor at a time and by reviews based on similar qualitative considerations. Therefore, the current investigation undertook a meta-analysis, a synthesis capturing the quantitative information across the now sizable number of studies. We entered several important factors into the meta-analysis, including the effects of age, the number of A trials, the length of delay between hiding and search, the number of locations, the distances between locations, and the distinctive visual properties of the hiding arrays. Of these, the analysis consistently indicated that age, delay, and number of hiding locations strongly influence infants' search. The pattern of specific findings also yielded new information about infant search. A general characterization of the results is that, at every age, both above-chance and below-chance performance was observed. That is, at each age at least 1 combination of delay and number of locations yielded above-chance A-not-B errors or significant perseverative search. At the same time, at each age at least 1 alternative

  1. Quantification and Analysis of Icebergs in a Tidewater Glacier Fjord Using an Object-Based Approach.

    Robert W McNabb

    Tidewater glaciers are glaciers that terminate in, and calve icebergs into, the ocean. In addition to the influence that tidewater glaciers have on physical and chemical oceanography, floating icebergs serve as habitat for marine animals such as harbor seals (Phoca vitulina richardii). The availability and spatial distribution of glacier ice in the fjords is likely a key environmental variable that influences the abundance and distribution of selected marine mammals; however, the amount of ice and the fine-scale characteristics of ice in fjords have not been systematically quantified. Given the predicted changes in glacier habitat, there is a need for the development of methods that could be broadly applied to quantify changes in available ice habitat in tidewater glacier fjords. We present a case study to describe a novel method that uses object-based image analysis (OBIA) to classify floating glacier ice in a tidewater glacier fjord from high-resolution aerial digital imagery. Our objectives were to (i) develop workflows and rule sets to classify high spatial resolution airborne imagery of floating glacier ice; (ii) quantify the amount and fine-scale characteristics of floating glacier ice; and (iii) develop processes for automating the object-based analysis of floating glacier ice for a large number of images from a representative survey day during June 2007 in Johns Hopkins Inlet (JHI), a tidewater glacier fjord in Glacier Bay National Park, southeastern Alaska. On 18 June 2007, JHI was comprised of brash ice (mean = 45.2%, SD = 41.5%), water (mean = 52.7%, SD = 42.3%), and icebergs (mean = 2.1%, SD = 1.4%). Average iceberg size per scene was 5.7 m2 (SD = 2.6 m2). We estimate the total area (± uncertainty) of iceberg habitat in the fjord to be 455,400 ± 123,000 m2. The method works well for classifying icebergs across scenes (classification accuracy of 75.6%); the largest classification errors occur in areas

  2. Exploring the impact of learning objects in middle school mathematics and science classrooms: A formative analysis

    Robin H. Kay

    2008-12-01

    The current study offers a formative analysis of the impact of learning objects in middle school mathematics and science classrooms. Five reliable and valid measures of effectiveness were used to examine the impact of learning objects from the perspective of 262 students and 8 teachers (14 classrooms) in science or mathematics. The results indicate that teachers typically spend 1-2 hours finding and preparing for learning-object-based lesson plans that focus on the review of previous concepts. Both teachers and students are positive about the learning benefits, quality, and engagement value of learning objects, although teachers are more positive than students. Student performance increased significantly, by over 40%, when learning objects were used in conjunction with a variety of teaching strategies. It is reasonable to conclude that learning objects have potential as a teaching tool in a middle school environment.

  3. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Chung, Bub Dong

    2008-03-15

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. The work covered the implementation of the input and output design, the TH solver, component models, special TH models, the heat structure solver, general tables, trip and control, and on-line graphics. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The computer language C was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter than C++. The code has simple and essential features for models and correlations, special components, special TH models and heat structure models. However, the input features are able to simulate various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to establish the physical validity of SYSTF code simulations.

  4. Objective Oriented Design of Architecture for TH System Safety Analysis Code and Verification

    Chung, Bub Dong

    2008-03-01

    In this work, an object-oriented design of a generic system analysis code has been attempted, based on previous work at KAERI on a two-phase, three-field pilot code. The work covered the implementation of the input and output design, the TH solver, component models, special TH models, the heat structure solver, general tables, trip and control, and on-line graphics. All essential features for system analysis have been designed and implemented in the final product, the SYSTF code. The computer language C was used for the implementation in the Visual Studio 2008 IDE (Integrated Development Environment), since it is easier and lighter than C++. The code has simple and essential features for models and correlations, special components, special TH models and heat structure models. However, the input features are able to simulate various scenarios, such as steady state, non-LOCA transients and LOCA accidents. The structural validity has been tested through various verification tests, and it has been shown that the developed code can treat non-LOCA and LOCA simulations. However, more detailed design and implementation of models are required to establish the physical validity of SYSTF code simulations.

  5. Ethical objections against including life-extension costs in cost-effectiveness analysis: a consistent approach.

    Gandjour, Afschin; Müller, Dirk

    2014-10-01

    One of the major ethical concerns regarding cost-effectiveness analysis in health care has been the inclusion of life-extension costs ("it is cheaper to let people die"). For this reason, many analysts have opted to rule out life-extension costs from the analysis. However, surprisingly little has been written in the health economics literature regarding this ethical concern and the resulting practice. The purpose of this work was to present a framework and potential solution for ethical objections against life-extension costs. This work found three levels of ethical concern: (i) with respect to all life-extension costs (disease-related and -unrelated); (ii) with respect to disease-unrelated costs only; and (iii) regarding disease-unrelated costs plus disease-related costs not influenced by the intervention. Excluding all life-extension costs for ethical reasons would require, for reasons of consistency, a simultaneous exclusion of savings from reducing morbidity. At the other extreme, excluding only disease-unrelated life-extension costs for ethical reasons would require, again for reasons of consistency, the exclusion of health gains due to treatment of unrelated diseases. Therefore, addressing ethical concerns regarding the inclusion of life-extension costs necessitates fundamental changes in the calculation of cost effectiveness.

  6. Objective Analysis of Performance of Activities of Daily Living in People With Central Field Loss.

    Pardhan, Shahina; Latham, Keziah; Tabrett, Daryl; Timmis, Matthew A

    2015-11-01

    People with central visual field loss (CFL) adopt various strategies to complete activities of daily living (ADL). Using objective movement analysis, we compared how three ADLs were completed by people with CFL and by age-matched, visually healthy individuals. Fourteen participants with CFL (age 81 ± 10 years) and 10 age-matched, visually healthy participants (age 75 ± 5 years) took part. Three ADLs were assessed: picking up food from a plate, pouring liquid from a bottle, and inserting a key into a lock. Participants with CFL completed each ADL habitually (as they would in their home). Data were compared with visually healthy participants who were asked to complete the tasks as they would normally, but under specified experimental conditions. Movement kinematics were compared using three-dimensional motion analysis (Vicon). Visual functions (distance and near acuities, contrast sensitivity, visual fields) were recorded. All CFL participants were able to complete each ADL. However, participants with CFL demonstrated significantly (P approach. Various kinematic indices correlated significantly with visual function parameters, including visual acuity and midperipheral visual field loss.

  7. Aligning experimental design with bioinformatics analysis to meet discovery research objectives.

    Kane, Michael D

    2002-01-01

    The utility of genomic technology and bioinformatic analytical support to provide new and needed insight into the molecular basis of disease, development, and diversity continues to grow as more research model systems and populations are investigated. Yet deriving results that meet a specific set of research objectives requires aligning or coordinating the design of the experiment, the laboratory techniques, and the data analysis. The following paragraphs describe several important interdependent factors that need to be considered to generate high quality data from the microarray platform. These factors include aligning oligonucleotide probe design with the sample labeling strategy if oligonucleotide probes are employed, recognizing that compromises are inherent in different sample procurement methods, normalizing 2-color microarray raw data, and distinguishing the difference between gene clustering and sample clustering. These factors do not represent an exhaustive list of technical variables in microarray-based research, but this list highlights those variables that span both experimental execution and data analysis. Copyright 2001 Wiley-Liss, Inc.
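
    One of the factors listed above is the normalization of 2-color microarray raw data. A common approach (shown here only as a hedged example; the paper does not prescribe a specific method, and the intensities below are synthetic) is global median normalization of the log-ratios:

```python
# Minimal sketch (one common normalization, not necessarily the author's):
# global median normalization of two-color microarray log-ratios so that the
# median log2(Cy5/Cy3) is zero. Intensities are synthetic.
import numpy as np

rng = np.random.default_rng(3)
cy3 = rng.lognormal(mean=7.0, sigma=1.0, size=1000)          # reference channel
cy5 = cy3 * rng.lognormal(mean=0.2, sigma=0.3, size=1000)    # sample channel with dye bias

m = np.log2(cy5 / cy3)                  # log-ratio per spot
m_normalized = m - np.median(m)         # remove the global dye/label bias

print("median M before: %.2f, after: %.2f" % (np.median(m), np.median(m_normalized)))
```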

  8. WE-AB-202-09: Feasibility and Quantitative Analysis of 4DCT-Based High Precision Lung Elastography

    Hasse, K; Neylon, J; Low, D; Santhanam, A

    2016-01-01

    Purpose: The purpose of this project is to derive high precision elastography measurements from 4DCT lung scans to facilitate the implementation of elastography in a radiotherapy context. Methods: 4DCT scans of the lungs were acquired, and breathing stages were subsequently registered to each other using an optical flow DIR algorithm. The displacement of each voxel gleaned from the registration was taken to be the ground-truth deformation. These vectors, along with the 4DCT source datasets, were used to generate a GPU-based biomechanical simulation that acted as a forward model to solve the inverse elasticity problem. The lung surface displacements were applied as boundary constraints for the model-guided lung tissue elastography, while the inner voxels were allowed to deform according to the linear elastic forces within the model. A biomechanically-based anisotropic convergence magnification technique was applied to the inner voxels in order to amplify the subtleties of the interior deformation. Solving the inverse elasticity problem was accomplished by modifying the tissue elasticity and iteratively deforming the biomechanical model. Convergence occurred when each voxel was within 0.5 mm of the ground-truth deformation and 1 kPa of the ground-truth elasticity distribution. To analyze the feasibility of the model-guided approach, we present the results for regions of low ventilation, specifically, the apex. Results: The maximum apical boundary expansion was observed to be between 2 and 6 mm. Simulating this expansion within an apical lung model, it was observed that 100% of voxels converged within 0.5 mm of ground-truth deformation, while 91.8% converged within 1 kPa of the ground-truth elasticity distribution. A mean elasticity error of 0.6 kPa illustrates the high precision of our technique. Conclusion: By utilizing 4DCT lung data coupled with a biomechanical model, high precision lung elastography can be accurately performed, even in low ventilation regions of the lung.
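
    A minimal sketch of the iterative inverse-elasticity loop described above is given below; the forward model stands in for the GPU-based biomechanical simulation (which is not reproduced here), and the update rule and gain are illustrative assumptions rather than the authors' implementation.

      # Sketch only: iterate per-voxel elasticity until the predicted deformation
      # matches the ground-truth deformation within 0.5 mm at every voxel.
      import numpy as np

      def solve_inverse_elasticity(forward_model, target_disp_mm, E_init_kpa,
                                   tol_mm=0.5, gain=0.1, max_iter=500):
          E = E_init_kpa.copy()                    # per-voxel elasticity estimate (kPa)
          for _ in range(max_iter):
              disp = forward_model(E)              # predicted voxel displacements (mm)
              err = disp - target_disp_mm
              if np.all(np.abs(err) < tol_mm):     # convergence criterion from the abstract
                  break
              # Heuristic update: stiffen voxels that move too far, soften the rest.
              E *= 1.0 + gain * np.clip(err, -1.0, 1.0)
          return E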

  10. Ultra-precision bearings

    Wardle, F

    2015-01-01

    Ultra-precision bearings can achieve extreme accuracy of rotation, making them ideal for use in numerous applications across a variety of fields, including hard disk drives, roundness measuring machines and optical scanners. Ultraprecision Bearings provides a detailed review of the different types of bearing and their properties, as well as an analysis of the factors that influence motion error, stiffness and damping. Following an introduction to basic principles of motion error, each chapter of the book is then devoted to the basic principles and properties of a specific type of bearing.

  11. Characterization of analysis activity in the development of object-oriented software. Application to an examination system in nuclear medicine

    Bayas, Marcos Raul Cordova.

    1995-01-01

    The object-oriented approach, formerly proposed as an alternative to conventional software coding techniques, has expanded its scope to other phases in software development, including the analysis phase. This work discusses basic concepts and major object oriented analysis methods, drawing comparisons with structured analysis, which has been the dominant paradigm in systems analysis. The comparison is based on three interdependent system aspects, that must be specified during the analysis phase: data, control and functionality. The specification of a radioisotope examination archive system is presented as a case study. (author). 45 refs., 87 figs., 1 tab

  12. MAPPING ERODED AREAS ON MOUNTAIN GRASSLAND WITH TERRESTRIAL PHOTOGRAMMETRY AND OBJECT-BASED IMAGE ANALYSIS

    A. Mayr

    2016-06-01

    In the Alps, as well as in other mountain regions, steep grassland is frequently affected by shallow erosion. Often small landslides or snow movements displace the vegetation together with soil and/or unconsolidated material. This results in bare earth surface patches within the grass covered slope. Close-range and remote sensing techniques are promising for both mapping and monitoring these eroded areas. This is essential for a better geomorphological process understanding, to assess past and recent developments, and to plan mitigation measures. Recent developments in image matching techniques make it feasible to produce high resolution orthophotos and digital elevation models from terrestrial oblique images. In this paper we propose to delineate the boundary of eroded areas for selected scenes of a study area, using close-range photogrammetric data. Striving for an efficient, objective and reproducible workflow for this task, we developed an approach for automated classification of the scenes into the classes grass and eroded. We propose an object-based image analysis (OBIA) workflow which consists of image segmentation and automated threshold selection for classification using the Excess Green Vegetation Index (ExG). The automated workflow is tested with ten different scenes. Compared to a manual classification, grass and eroded areas are classified with an overall accuracy between 90.7% and 95.5%, depending on the scene. The methods proved to be insensitive to differences in illumination of the scenes and greenness of the grass. The proposed workflow reduces user interaction and is transferable to other study areas. We conclude that close-range photogrammetry is a valuable low-cost tool for mapping this type of eroded areas in the field with a high level of detail and quality. In the future, the output will be used as ground truth for an area-wide mapping of eroded areas in coarser resolution aerial orthophotos acquired at the same time.
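
    The segmentation-plus-threshold idea can be sketched as follows; Otsu's method is used here only as a stand-in for the paper's automated threshold selection, and the file name is hypothetical.

      # Illustrative sketch: Excess Green index (ExG = 2g - r - b on normalized
      # channels) followed by an automatically selected threshold.
      import numpy as np
      from skimage import io
      from skimage.filters import threshold_otsu

      rgb = io.imread("scene_orthophoto.tif")[..., :3].astype(float)  # hypothetical input
      total = rgb.sum(axis=2) + 1e-9
      r, g, b = rgb[..., 0] / total, rgb[..., 1] / total, rgb[..., 2] / total
      exg = 2 * g - r - b

      t = threshold_otsu(exg)
      grass_mask = exg >= t            # high ExG -> grass
      eroded_mask = ~grass_mask        # low ExG -> bare earth / eroded
      print("eroded fraction of scene:", eroded_mask.mean())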

  13. Object-based image analysis and data mining for building ontology of informal urban settlements

    Khelifa, Dejrriri; Mimoun, Malki

    2012-11-01

    During recent decades, unplanned settlements have appeared around the big cities in most developing countries and, as a consequence, numerous problems have emerged. Thus the identification of different kinds of settlements is a major concern and challenge for authorities of many countries. Very High Resolution (VHR) remotely sensed imagery has proved to be a very promising way to detect different kinds of settlements, especially through the use of new object-based image analysis (OBIA). The most important key is in understanding what characteristics make unplanned settlements differ from planned ones, where most experts characterize unplanned urban areas by small building sizes at high densities, no orderly road arrangement and lack of green spaces. Knowledge about different kinds of settlements can be captured as a domain ontology that has the potential to organize knowledge in a formal, understandable and sharable way. In this work we focus on extracting knowledge from VHR images and expert knowledge. We used an object-based strategy by segmenting a VHR image taken over an urban area into regions of homogeneous pixels at an adequate scale level and then computing spectral, spatial and textural attributes for each region to create objects. A genetic-based data mining approach was applied to generate highly predictive and comprehensible classification rules based on selected samples from the OBIA result. Optimized intervals of relevant attributes are found, linked with land use types for forming classification rules. The unplanned areas were separated from the planned ones through analysis of the line segments detected from the input image. Finally a simple ontology was built based on the previous processing steps. The approach has been tested on VHR images of one of the biggest Algerian cities, which has grown considerably in recent decades.

  14. Video image analysis in the Australian meat industry - precision and accuracy of predicting lean meat yield in lamb carcasses.

    Hopkins, D L; Safari, E; Thompson, J M; Smith, C R

    2004-06-01

    A wide selection of lamb types of mixed sex (ewes and wethers) were slaughtered at a commercial abattoir and during this process images of 360 carcasses were obtained online using the VIAScan® system developed by Meat and Livestock Australia. Soft tissue depth at the GR site (thickness of tissue over the 12th rib 110 mm from the midline) was measured by an abattoir employee using the AUS-MEAT sheep probe (PGR). Another measure of this thickness was taken in the chiller using a GR knife (NGR). Each carcass was subsequently broken down to a range of trimmed boneless retail cuts and the lean meat yield determined. The current industry model for predicting meat yield uses hot carcass weight (HCW) and tissue depth at the GR site. A low level of accuracy and precision was found when HCW and PGR were used to predict lean meat yield (R² = 0.19, r.s.d. = 2.80%), which could be improved markedly when PGR was replaced by NGR (R² = 0.41, r.s.d. = 2.39%). If the GR measures were replaced by 8 VIAScan® measures then greater prediction accuracy could be achieved (R² = 0.52, r.s.d. = 2.17%). A similar result was achieved when the model was based on principal components (PCs) computed from the 8 VIAScan® measures (R² = 0.52, r.s.d. = 2.17%). The use of PCs also improved the stability of the model compared to a regression model based on HCW and NGR. The transportability of the models was tested by randomly dividing the data set and comparing coefficients and the level of accuracy and precision. Those models based on PCs were superior to those based on regression. It is demonstrated that with the appropriate modeling the VIAScan® system offers a workable method for predicting lean meat yield automatically.
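
    The modelling approach described (hot carcass weight plus principal components of the 8 VIA measurements as predictors of lean meat yield) can be sketched as below; the data are synthetic and the variable names are placeholders, not the study's measurements.

      # Sketch with synthetic data: PCA of the VIA measures, then linear regression.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LinearRegression
      from sklearn.preprocessing import StandardScaler

      rng = np.random.default_rng(1)
      via = rng.normal(size=(360, 8))               # 8 VIAScan measures per carcass
      hcw = rng.normal(22.0, 3.0, size=(360, 1))    # hot carcass weight (kg)
      lean_yield = rng.normal(55.0, 3.0, size=360)  # lean meat yield (%)

      pcs = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(via))
      X = np.hstack([hcw, pcs])
      model = LinearRegression().fit(X, lean_yield)
      print("R^2 on this synthetic training set:", model.score(X, lean_yield))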

  15. A Comprehensive Proteomics Analysis of the Human Iris Tissue: Ready to Embrace Postgenomics Precision Medicine in Ophthalmology?

    Murthy, Krishna R; Dammalli, Manjunath; Pinto, Sneha M; Murthy, Kalpana Babu; Nirujogi, Raja Sekhar; Madugundu, Anil K; Dey, Gourav; Subbannayya, Yashwanth; Mishra, Uttam Kumar; Nair, Bipin; Gowda, Harsha; Prasad, T S Keshava

    2016-09-01

    The annual economic burden of visual disorders in the United States was estimated at $139 billion. Ophthalmology is therefore one of the salient application fields of postgenomics biotechnologies such as proteomics in the pursuit of global precision medicine. Interestingly, the protein composition of the human iris tissue still remains largely unexplored. In this context, the uveal tract constitutes the vascular middle coat of the eye and is formed by the choroid, ciliary body, and iris. The iris forms the anterior most part of the uvea. It is a thin muscular diaphragm with a central perforation called the pupil. Inflammation of the uvea is termed uveitis and causes reduced vision or blindness. However, the pathogenesis of the spectrum of diseases causing uveitis is still not very well understood. We investigated the proteome of the iris tissue harvested from healthy donor eyes that were enucleated within 6 h of death using high-resolution Fourier transform mass spectrometry. A total of 4959 nonredundant proteins were identified in the human iris, which included proteins involved in signaling, cell communication, metabolism, immune response, and transport. This study is the first attempt to comprehensively profile the global proteome of the human iris tissue and, thus, offers the potential to facilitate biomedical research into pathological diseases of the uvea such as Behcet's disease, Vogt-Koyanagi-Harada disease, and juvenile rheumatoid arthritis. Finally, we make a call to the broader visual health and ophthalmology community that proteomics offers a veritable prospect to obtain a systems scale, functional, and dynamic picture of the eye tissue in health and disease. This knowledge is ultimately pertinent for precision medicine diagnostics and therapeutics innovation to address the pressing needs of 21st century visual health.

  16. An experimental analysis of design choices of multi-objective ant colony optimization algorithms

    Lopez-Ibanez, Manuel; Stutzle, Thomas

    2012-01-01

    There have been several proposals on how to apply the ant colony optimization (ACO) metaheuristic to multi-objective combinatorial optimization problems (MOCOPs). This paper proposes a new formulation of these multi-objective ant colony optimization (MOACO) algorithms. This formulation is based on adding specific algorithm components for tackling multiple objectives to the basic ACO metaheuristic. Examples of these components are how to represent multiple objectives using pheromone and heuris...

  17. Precision Joining Center

    Powell, J. W.; Westphal, D. A.

    1991-08-01

    A workshop to obtain input from industry on the establishment of the Precision Joining Center (PJC) was held on July 10-12, 1991. The PJC is a center for training Joining Technologists in advanced joining techniques and concepts in order to promote the competitiveness of U.S. industry. The center will be established as part of the DOE Defense Programs Technology Commercialization Initiative, and operated by EG&G Rocky Flats in cooperation with the American Welding Society and the Colorado School of Mines Center for Welding and Joining Research. The overall objectives of the workshop were to validate the need for a Joining Technologist to fill the gap between the welding operator and the welding engineer, and to assure that the PJC will train individuals to satisfy that need. The consensus of the workshop participants was that the Joining Technologist is a necessary position in industry, and is currently used, with some variation, by many companies. It was agreed that the PJC core curriculum, as presented, would produce a Joining Technologist of value to industries that use precision joining techniques. The advantage of the PJC would be to train the Joining Technologist much more quickly and more completely. The proposed emphasis of the PJC curriculum on equipment-intensive and hands-on training was judged to be essential.

  18. Analysis and optimization with ecological objective function of irreversible single resonance energy selective electron heat engines

    Zhou, Junle; Chen, Lingen; Ding, Zemin; Sun, Fengrui

    2016-01-01

    An ecological performance analysis of a single resonance ESE heat engine with heat leakage is conducted by applying finite-time thermodynamics. By introducing the Nielsen function and numerical calculations, expressions for power output, efficiency, entropy generation rate and ecological objective function are derived; relationships between ecological objective function and power output, between ecological objective function and efficiency, as well as between power output and efficiency are demonstrated; influences of the system parameters of heat leakage, boundary energy and resonance width on the optimal performances are investigated in detail; a specific range of boundary energy is given as a compromise to make the ESE heat engine system work at optimal operation regions. Comparing performance characteristics under different optimization objective functions, the significance of selecting the ecological objective function as the design objective is clarified specifically: when changing the design objective from maximum power output to maximum ecological objective function, the improvement in efficiency is 4.56%, while the power output drop is only 2.68%; when changing the design objective from maximum efficiency to maximum ecological objective function, the improvement in power output is 229.13%, and the efficiency drop is only 13.53%. - Highlights: • An irreversible single resonance energy selective electron heat engine is studied. • Heat leakage between two reservoirs is considered. • Power output, efficiency and ecological objective function are derived. • Optimal performance comparison for three objective functions is carried out.
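
    For reference, the ecological objective function is commonly written (following Angulo-Brown's ecological criterion; the paper's exact formulation is not reproduced in this record and may differ in detail) as the power output penalized by the dissipation term:

      % Assumed standard form of the ecological objective function
      E \;=\; P \;-\; T_{0}\,\sigma
      % P: power output, \sigma: entropy generation rate,
      % T_{0}: temperature of the environment (heat sink)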

  19. Object methods of analysis and design: presentation of U R L ...

    Objects have invaded the world of data processing, and there is no field that has not felt their effects. The object approach originates in object-oriented programming, whose best-known representatives are the languages Smalltalk and C++. Thereafter, its application spread to many other fields, such as software engineering, ...

  20. Object-oriented analysis and design of a GEANT based detector simulator

    Amako, K.; Kanzaki, J.; Sasaki, T.; Takaiwa, Y.; Nakagawa, Y.; Yamagata, T.

    1994-01-01

    The authors give a status report on the project to design a detector simulation program by reengineering GEANT with object-oriented methodology. They followed the Object Modeling Technique. They explain the object model they constructed. Problems with the technique found during their study are also discussed.

  1. Report on the French objectives of electricity consumption produced from renewable energy sources and on the analysis of their realization

    2007-01-01

    This report presents the French objectives for the share of internal electricity consumption produced from renewable energy sources over the next ten years, as well as an analysis of their realization, taking into account the climatic factors likely to affect the achievement of these objectives. It also discusses the adequacy of these actions with respect to the national commitment on climate change. (A.L.B.)

  2. Pigments analysis and gold layer thickness evaluation of polychromy on wood objects by PXRF

    Blonski, M.S.; Appoloni, C.R.

    2014-01-01

    Energy-dispersive X-ray fluorescence (EDXRF), being a multi-elemental and non-destructive technique, has been widely used in the analysis of artworks and in archaeometry. A portable X-ray fluorescence instrument from the Laboratory of Applied Nuclear Physics of the State University of Londrina (LFNA/UEL) was used for the measurement of pigments in golden parts of a Gilding Preparation Standard Plaque and also for pigment measurements on the Wood Adornment of the High Altar Column of the Side Pulpit of the Immaculate Conception Church Parish, Sao Paulo-SP. The portable PXRF-LFNA-02 system consists of an X-ray tube with Ag anode, a Si-PIN detector (FWHM = 221 eV for the Mn line at 5.9 keV), a standard nuclear electronics chain for X-ray spectrometry, an 8K multichannel analyzer, a notebook and a mechanical system designed for the positioning of detector and X-ray tube, which allows movement of the excitation–detection system with two degrees of freedom. The excitation–detection times of the measurements were 100 and 500 s, respectively. The presence of the elements Ti, Cr, Fe, Cu, Zn and Au was found in the golden area of the Altar Column ornament. Moreover, analysis of the ratios of the intensities of the Kα/Kβ lines measured in these areas made it possible to explore the possibility of measuring the stratigraphy of the pigment layers and to estimate their thickness. - Highlights: • Energy-dispersive X-ray fluorescence (EDXRF) with a portable instrument is used for the measurement of pigments. • Analysis of the ratios of the Kα/Kβ line intensities measured in the areas made it possible to explore the possibility of measuring the stratigraphy of the pigment layers and to estimate their thickness. • The result of the pigment analysis performed on these objects indicates that they date from the twentieth century.
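
    One common way to turn a measured Kα/Kβ intensity ratio into a covering-layer thickness exploits the different attenuation of the two lines in the layer; the sketch below illustrates that idea only, and all numerical values, coefficients and the geometry are placeholders rather than parameters from this work.

      # Hedged sketch: layer thickness from differential attenuation of Ka/Kb lines.
      import math

      def layer_thickness_cm(ratio_measured, ratio_unattenuated,
                             mu_alpha, mu_beta, density_g_cm3, takeoff_deg=45.0):
          """mu_* are mass attenuation coefficients of the covering layer (cm^2/g)
          at the Ka and Kb energies; Ka is normally attenuated more strongly."""
          path_factor = math.sin(math.radians(takeoff_deg))
          delta_mu_linear = (mu_alpha - mu_beta) * density_g_cm3
          return path_factor * math.log(ratio_unattenuated / ratio_measured) / delta_mu_linear

      # Placeholder numbers only:
      print(layer_thickness_cm(5.0, 6.0, mu_alpha=50.0, mu_beta=40.0, density_g_cm3=8.9))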

  3. Bridge Crack Detection Using Multi-Rotary Uav and Object-Base Image Analysis

    Rau, J. Y.; Hsiao, K. W.; Jhan, J. P.; Wang, S. H.; Fang, W. C.; Wang, J. L.

    2017-08-01

    Bridges are important infrastructure for human life, so bridge safety monitoring and maintenance is an important issue for the government. Conventionally, bridge inspection was conducted by human in-situ visual examination. This procedure sometimes requires an under-bridge inspection vehicle or climbing under the bridge personally. Thus, its cost and risk are high, and it is labor intensive and time consuming. In particular, its documentation procedure is subjective and lacks 3D spatial information. In order to cope with these challenges, this paper proposes the use of a multi-rotor UAV equipped with a SONY A7r2 high resolution digital camera, a 50 mm fixed focal length lens, and a 135 degree up-down rotating gimbal. The target bridge contains three spans with a total length of 60 meters, a width of 20 meters and a height of 8 meters above the water level. In the end, we took about 10,000 images, but some of them were acquired by a hand-held method on the ground using a pole 2-8 meters long. Those images were processed by Agisoft PhotoscanPro to obtain exterior and interior orientation parameters. A local coordinate system was defined using 12 ground control points measured by a total station. After triangulation and camera self-calibration, the RMS of the control points is less than 3 cm. A 3D CAD model that describes the bridge surface geometry was manually measured in PhotoscanPro. It is composed of planar polygons and is used for searching related UAV images. Additionally, a photorealistic 3D model can be produced for 3D visualization. In order to detect cracks on the bridge surface, we utilize the object-based image analysis (OBIA) technique to segment the image into objects. Later, we derive several object features, such as density, area/bounding box ratio, length/width ratio, length, etc. Then, we can set up a classification rule set to distinguish cracks. Further, we apply semi-global matching (SGM) to obtain 3D crack information and, based on image scale, we
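
    A rule set of the kind described (thresholds on object shape features such as length/width ratio and area/bounding-box ratio) might look like the sketch below; the thresholds and example objects are assumptions for illustration, not the rules used in this work.

      # Illustrative rule set for flagging crack-like image objects.
      from dataclasses import dataclass

      @dataclass
      class ImageObject:
          area: float        # object area in pixels
          bbox_area: float   # bounding-box area in pixels
          length: float      # major-axis length in pixels
          width: float       # minor-axis length in pixels

      def is_crack_candidate(obj: ImageObject) -> bool:
          elongation = obj.length / max(obj.width, 1.0)
          fill_ratio = obj.area / max(obj.bbox_area, 1.0)
          # Cracks tend to be long, thin and sparse within their bounding box.
          return elongation > 5.0 and fill_ratio < 0.4 and obj.length > 50

      objects = [ImageObject(400, 2000, 200, 4), ImageObject(900, 1000, 35, 30)]
      print([is_crack_candidate(o) for o in objects])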

  5. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose: Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess validity and interrater reliability of automated analysis across various levels of FCD severity. Methods: Retroillumination photographs of 97 FCD-affected corneas were acquired and total counts of guttae previously summated manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. Noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results: A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the set of analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R² = 0.79) and manual counts (R² = 0.88). The intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870-0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions: Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
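
    The counting idea (background subtraction followed by detection of local intensity maxima, with a per-image noise tolerance) is sketched below; the original work used ImageJ, so the scikit-image calls, file name and parameter values here are stand-ins for illustration.

      # Sketch: count guttae as local maxima after background subtraction.
      from skimage import io, filters
      from skimage.feature import peak_local_max

      img = io.imread("retroillumination.tif", as_gray=True).astype(float)  # hypothetical file
      background = filters.gaussian(img, sigma=25)      # coarse background estimate
      flattened = img - background

      noise_tolerance = 0.02                            # titrated per cornea in the study
      peaks = peak_local_max(flattened, min_distance=3, threshold_abs=noise_tolerance)
      print("automated gutta count:", len(peaks))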

  6. Sliding thin slab, minimum intensity projection imaging for objective analysis of emphysema

    Satoh, Shiro; Ohdama, Shinichi; Shibuya, Hitoshi

    2006-01-01

    The aim of this study was to determine whether sliding thin slab, minimum intensity projection (STS-MinIP) imaging is more advantageous than thin-section computed tomography (CT) for detecting and assessing emphysema. Objective quantification of emphysema by STS-MinIP and thin-section CT was defined as the percentage of area lower than the threshold in the lung section at the level of the aortic arch, the tracheal carina, and 5 cm below the carina. Quantitative analysis in 100 subjects was performed and compared with pulmonary function test results. The ratio of the low attenuation area in the lung measured by STS-MinIP was significantly higher than that found by thin-section CT (P<0.01). The difference between STS-MinIP and thin-section CT was statistically evident even for mild emphysema and increased as the low attenuation area in the lung increased. Moreover, STS-MinIP showed a stronger regression relation with pulmonary function results than did thin-section CT (P<0.01). STS-MinIP can be recommended as a new morphometric method for detecting and assessing the severity of emphysema. (author)
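
    The quantification metric described (percentage of lung area below an attenuation threshold) reduces to a few lines of code; the -950 HU cutoff below is a commonly used emphysema threshold assumed for illustration, since the record does not state the exact value, and the data are synthetic.

      # Illustrative %LAA (low attenuation area) computation for one CT slice.
      import numpy as np

      def low_attenuation_percentage(ct_slice_hu, lung_mask, threshold_hu=-950):
          lung_voxels = ct_slice_hu[lung_mask]
          return 100.0 * np.mean(lung_voxels < threshold_hu)

      slice_hu = np.random.default_rng(2).normal(-800, 120, size=(512, 512))  # synthetic slice
      mask = np.ones_like(slice_hu, dtype=bool)                               # stand-in lung mask
      print(f"%LAA = {low_attenuation_percentage(slice_hu, mask):.1f}")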

  7. UAV-based urban structural damage assessment using object-based image analysis and semantic reasoning

    Fernandez Galarreta, J.; Kerle, N.; Gerke, M.

    2015-06-01

    Structural damage assessment is critical after disasters but remains a challenge. Many studies have explored the potential of remote sensing data, but limitations of vertical data persist. Oblique imagery has been identified as more useful, though the multi-angle imagery also adds a new dimension of complexity. This paper addresses damage assessment based on multi-perspective, overlapping, very high resolution oblique images obtained with unmanned aerial vehicles (UAVs). 3-D point-cloud assessment for the entire building is combined with detailed object-based image analysis (OBIA) of façades and roofs. This research focuses not on automatic damage assessment, but on creating a methodology that supports the often ambiguous classification of intermediate damage levels, aiming at producing comprehensive per-building damage scores. We identify completely damaged structures in the 3-D point cloud, and for all other cases provide the OBIA-based damage indicators to be used as auxiliary information by damage analysts. The results demonstrate the usability of the 3-D point-cloud data to identify major damage features. Also the UAV-derived and OBIA-processed oblique images are shown to be a suitable basis for the identification of detailed damage features on façades and roofs. Finally, we also demonstrate the possibility of aggregating the multi-perspective damage information at building level.

  8. A cluster analysis of patterns of objectively measured physical activity in Hong Kong.

    Lee, Paul H; Yu, Ying-Ying; McDowell, Ian; Leung, Gabriel M; Lam, T H

    2013-08-01

    The health benefits of exercise are clear. In targeting interventions it would be valuable to know whether characteristic patterns of physical activity (PA) are associated with particular population subgroups. The present study used cluster analysis to identify characteristic hourly PA patterns measured by accelerometer. Cross-sectional design. Objectively measured PA in Hong Kong adults. Four-day accelerometer data were collected during 2009 to 2011 for 1714 participants in Hong Kong (mean age 44.2 years, 45.9% male). Two clusters were identified, one more active than the other. The ‘active cluster’ (n = 480) was characterized by a routine PA pattern on weekdays and a more active and varied pattern on weekends; the other, the ‘less active cluster’ (n = 1234), by a consistently low PA pattern on both weekdays and weekends with little variation from day to day. Demographic, lifestyle, PA level and health characteristics of the two clusters were compared. They differed in age, sex, smoking, income and level of PA required at work. The odds of having any chronic health condition were lower for the active group (adjusted OR = 0.62, 95% CI 0.46, 0.84) but the two groups did not differ in terms of specific chronic health conditions or obesity. Implications are drawn for targeting exercise promotion programmes at the population level.
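
    The clustering step can be illustrated as follows: each participant is summarized as a vector of mean hourly accelerometer counts and the vectors are partitioned into two groups. K-means and the synthetic data below are stand-ins; the record does not specify the exact clustering algorithm or features.

      # Sketch: two-cluster partition of synthetic hourly activity profiles.
      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(3)
      hourly_counts = rng.gamma(2.0, 150.0, size=(1714, 24))  # participants x 24 hours

      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(hourly_counts)
      print("cluster sizes:", np.bincount(labels))
      print("mean daily counts per cluster:",
            [float(hourly_counts[labels == k].sum(axis=1).mean()) for k in (0, 1)])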

  9. Cognition and objectively measured sleep duration in children: a systematic review and meta-analysis.

    Short, Michelle A; Blunden, Sarah; Rigney, Gabrielle; Matricciani, Lisa; Coussens, Scott; M Reynolds, Chelsea; Galland, Barbara

    2018-06-01

    Sleep recommendations are widely used to guide communities on children's sleep needs. Following recent adjustments to guidelines by the National Sleep Foundation and the subsequent consensus statement by the American Academy of Sleep Medicine, we undertook a systematic literature search to evaluate the current evidence regarding relationships between objectively measured sleep duration and cognitive function in children aged 5 to 13 years. Cognitive function included measures of memory, attention, processing speed, and intelligence. Keyword searches of 7 databases to December 2016 found 23 studies meeting inclusion criteria from 137 full articles reviewed, 19 of which were suitable for meta-analysis. A significant effect (r = .06) was found between sleep duration and cognition, suggesting that longer sleep durations were associated with better cognitive functioning. Analyses of different cognitive domains revealed that full/verbal IQ was significantly associated with sleep loss, but memory, fluid IQ, processing speed and attention were not. Comparison of study sleep durations with current sleep recommendations showed that most children studied had sleep durations that were not within the range of recommended sleep. As such, the true effect of sleep loss on cognitive function may be obscured in these samples, as most children were sleep restricted. Future research using more rigorous experimental methodologies is needed to properly elucidate the relationship between sleep duration and cognition in this age group. Copyright © 2018 National Sleep Foundation. Published by Elsevier Inc. All rights reserved.

  10. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations, which may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not health and safety issues.

  11. Calculating potential error in sodium MRI with respect to the analysis of small objects.

    Stobbe, Robert W; Beaulieu, Christian

    2018-06-01

    To facilitate correct interpretation of sodium MRI measurements, calculation of error with respect to rapid signal decay is introduced and combined with that of spatially correlated noise to assess volume-of-interest (VOI) 23Na signal measurement inaccuracies, particularly for small objects. Noise and signal decay-related error calculations were verified using twisted projection imaging and a specially designed phantom with different sized spheres of constant elevated sodium concentration. As a demonstration, lesion signal measurement variation (5 multiple sclerosis participants) was compared with that predicted from calculation. Both theory and phantom experiment showed that VOI signal measurement in a large 10-mL, 314-voxel sphere was 20% less than expected on account of point-spread-function smearing when the VOI was drawn to include the full sphere. Volume-of-interest contraction reduced this error but increased noise-related error. Errors were even greater for smaller spheres (40-60% less than expected for a 0.35-mL, 11-voxel sphere). Image-intensity VOI measurements varied and increased with multiple sclerosis lesion size in a manner similar to that predicted from theory. Correlation suggests large underestimation of 23Na signal in small lesions. Acquisition-specific measurement error calculation aids 23Na MRI data analysis and highlights the limitations of current low-resolution methodologies. Magn Reson Med 79:2968-2977, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
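
    The point-spread-function effect described above can be reproduced qualitatively with a toy simulation: smoothing a small hot sphere lowers the mean signal measured in a VOI drawn over the full sphere. The Gaussian PSF width below is an arbitrary stand-in for the acquisition-specific PSF used in the paper's error calculations.

      # Toy illustration of PSF smearing causing VOI signal underestimation.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      shape = (64, 64, 64)
      coords = np.indices(shape)
      center = (np.array(shape) - 1) / 2.0
      dist = np.sqrt(((coords - center[:, None, None, None]) ** 2).sum(axis=0))
      sphere = dist <= 3                                   # small "lesion" of ~100 voxels

      true_image = np.where(sphere, 1.0, 0.2)              # elevated sphere on background
      blurred = gaussian_filter(true_image, sigma=1.5)     # assumed PSF

      voi_mean = blurred[sphere].mean()
      print(f"measured VOI mean = {voi_mean:.2f} vs true 1.00 "
            f"({100 * (1 - voi_mean):.0f}% underestimation)")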

  12. Application of Object Based Classification and High Resolution Satellite Imagery for Savanna Ecosystem Analysis

    Jane Southworth

    2010-12-01

    Savanna ecosystems are an important component of dryland regions and yet are exceedingly difficult to study using satellite imagery. Savannas are composed of varying amounts of trees, shrubs and grasses, and typically traditional classification schemes or vegetation indices cannot differentiate across class type. This research utilizes object-based classification (OBC) for a region in Namibia, using IKONOS imagery, to help differentiate tree canopies, and therefore woodland savanna, from shrub or grasslands. The methodology involved the identification and isolation of tree canopies within the imagery and the creation of tree polygon layers, which had an overall accuracy of 84%. In addition, the results were scaled up to a corresponding Landsat image of the same region, and the OBC results compared to corresponding pixel values of NDVI. The results were not compelling, indicating once more the problems of these traditional image analysis techniques for savanna ecosystems. Overall, the use of OBC holds great promise for this ecosystem and could be utilized more frequently in studies of vegetation structure.

  13. Objective classification of ecological status in marine water bodies using ecotoxicological information and multivariate analysis.

    Beiras, Ricardo; Durán, Iria

    2014-12-01

    Some relevant shortcomings have been identified in the current approach for the classification of ecological status in marine water bodies, leading to delays in the fulfillment of the Water Framework Directive objectives. Natural variability makes it difficult to set fixed reference values and boundary values for the Ecological Quality Ratios (EQR) for the biological quality elements. Biological responses to environmental degradation are frequently of a nonmonotonic nature, hampering the EQR approach. Community structure traits respond only once ecological damage has already been done and do not provide early warning signals. An alternative methodology for the classification of ecological status is proposed, integrating chemical measurements, ecotoxicological bioassays and community structure traits (species richness and diversity), and using multivariate analyses (multidimensional scaling and cluster analysis). This approach does not depend on the arbitrary definition of fixed reference values and EQR boundary values, and it is suitable for integrating nonlinear, sensitive signals of ecological degradation. As a disadvantage, this approach demands the inclusion of sampling sites representing the full range of ecological status in each monitoring campaign. National or international agencies in charge of coastal pollution monitoring have comprehensive data sets available to overcome this limitation.

  14. Experimental assessment of precision and accuracy of radiostereometric analysis for the determination of polyethylene wear in a total hip replacement model.

    Bragdon, Charles R; Malchau, Henrik; Yuan, Xunhua; Perinchief, Rebecca; Kärrholm, Johan; Börlin, Niclas; Estok, Daniel M; Harris, William H

    2002-07-01

    The purpose of this study was to develop and test a phantom model based on actual total hip replacement (THR) components to simulate the true penetration of the femoral head resulting from polyethylene wear. This model was used to study both the accuracy and the precision of radiostereometric analysis, RSA, in measuring wear. We also used this model to evaluate optimum tantalum bead configuration for this particular cup design when used in a clinical setting. A physical model of a total hip replacement (a phantom) was constructed which could simulate progressive, three-dimensional (3-D) penetration of the femoral head into the polyethylene component of a THR. Using a coordinate measuring machine (CMM), the positioning of the femoral head using the phantom was measured to be accurate to within 7 µm. The accuracy and precision of an RSA analysis system was determined from five repeat examinations of the phantom using various experimental set-ups of the phantom. The accuracy of the radiostereometric analysis in the optimal experimental set-up studied was 33 µm for the medial direction, 22 µm for the superior direction, 86 µm for the posterior direction and 55 µm for the resultant 3-D vector length. The corresponding precision at the 95% confidence interval of the test results for repositioning the phantom five times measured 8.4 µm for the medial direction, 5.5 µm for the superior direction, 16.0 µm for the posterior direction, and 13.5 µm for the resultant 3-D vector length. This in vitro model is proposed as a useful tool for developing a standard for the evaluation of radiostereometric and other radiographic methods used to measure in vivo wear.

  15. Experience in usage of T-108 titrimetric laboratory unit for precision analysis of uranium-containing materials

    Ryzhinskij, M.V.; Bronzov, P.A.

    1989-01-01

    The possibilities of the T-108 potentiometric titration device for precise determination of uranium in various uranium-containing materials are studied and the results presented. The principal flowsheet of the device and the sequence of the analytical procedure of uranium potentiometric titration are considered. U3O8, UO2 and UF4 were used as the materials to be analyzed, and state standard samples of K2Cr2O7 (SSS 2215-81) and U3O8 (SSS 2396-83P) as standard samples. It is shown that the relative standard deviation during titration using the T-108 device is mainly determined by the error of determination of the final titration point potential and must not exceed 0.002 for the uranium titrations considered. The conclusion is made that this variant of potentiometric titration of uranium with the use of the T-108 device is not inferior in accuracy to gravimetry and surpasses it in productivity and possibility of automation. 4 refs.; 2 figs.; 2 tabs

  16. Object-oriented analysis and design of a health care management information system.

    Krol, M; Reich, D L

    1999-04-01

    We have created a prototype for a universal object-oriented model of a health care system compatible with the object-oriented approach used in version 3.0 of the HL7 standard for communication messages. A set of three models has been developed: (1) the Object Model describes the hierarchical structure of objects in a system--their identity, relationships, attributes, and operations; (2) the Dynamic Model represents the sequence of operations in time as a collection of state diagrams for object classes in the system; and (3) the Functional Model represents the transformation of data within a system by means of data flow diagrams. Within these models, we have defined major object classes of health care participants and their subclasses, associations, attributes and operators, states, and behavioral scenarios. We have also defined the major processes and subprocesses. The top-down design approach allows use, reuse, and cloning of standard components.
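
    As a purely illustrative sketch (not the authors' HL7-compatible model), an Object Model fragment for health care participants with attributes and operations might look like this:

      # Illustrative class hierarchy only; names and attributes are assumptions.
      from dataclasses import dataclass, field
      from typing import List

      @dataclass
      class Participant:
          identifier: str
          name: str

      @dataclass
      class Patient(Participant):
          encounters: List[str] = field(default_factory=list)

          def admit(self, encounter_id: str) -> None:
              # A state change of the kind captured by the Dynamic Model.
              self.encounters.append(encounter_id)

      @dataclass
      class Provider(Participant):
          specialty: str = "general"

      patient = Patient(identifier="MRN-001", name="Example Patient")
      patient.admit("ENC-42")
      print(patient)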

  17. An empirical analysis of the precision of estimating the numbers of neurons and glia in human neocortex using a fractionator-design with sub-sampling

    Lyck, L.; Santamaria, I.D.; Pakkenberg, B.

    2009-01-01

    Improving histomorphometric analysis of the human neocortex by combining stereological cell counting with immunohistochemical visualisation of specific neuronal and glial cell populations is a methodological challenge. To enable standardized immunohistochemical staining, the amount of brain tissue ... at each level of sampling was determined empirically. The methodology was tested in three brains, analysing the contribution of the multi-step sampling procedure to the precision of the estimated total numbers of immunohistochemically defined NeuN expressing (NeuN(+)) neurons and CD45(+) microglia...

  18. A functional analysis of photo-object matching skills of severely retarded adolescents.

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photo...

  19. Advanced bioanalytics for precision medicine.

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  20. Impact of PET/CT system, reconstruction protocol, data analysis method, and repositioning on PET/CT precision: An experimental evaluation using an oncology and brain phantom.

    Mansor, Syahir; Pfaehler, Elisabeth; Heijtel, Dennis; Lodge, Martin A; Boellaard, Ronald; Yaqub, Maqsood

    2017-12-01

    In longitudinal oncological and brain PET/CT studies, it is important to understand the repeatability of quantitative PET metrics in order to assess change in tracer uptake. The present studies were performed in order to assess precision as a function of PET/CT system, reconstruction protocol, analysis method, scan duration (or image noise), and repositioning in the field of view. Multiple (repeated) scans have been performed using a NEMA image quality (IQ) phantom and a 3D Hoffman brain phantom filled with 18F solutions on two systems. Studies were performed with and without randomly (PET/CT, especially in the case of smaller spheres (PET metrics depends on the combination of reconstruction protocol, data analysis methods and scan duration (scan statistics). Moreover, precision was also affected by phantom repositioning, but its impact depended on the data analysis method in combination with the reconstructed voxel size (tissue fraction effect). This study suggests that for oncological PET studies the use of SUVpeak may be preferred over SUVmax because SUVpeak is less sensitive to patient repositioning/tumor sampling. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.

  1. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Andrea Baraldi

    2012-09-01

    According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA/GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design); (b) information/knowledge representation; (c) algorithm design; and (d) implementation. As proof-of-concept of symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™) is selected from existing literature. To the best of these authors’ knowledge, this is the first time a

  2. Analysis of the application of poly-nanocrystalline diamond tools for ultra precision machining of steel with ultrasonic assistance

    Doetz, M.; Dambon, O.; Klocke, F.; Bulla, B.; Schottka, K.; Robertson, D. J.

    2017-10-01

    Ultra-precision diamond turning enables the manufacturing of parts with mirror-like surfaces and highest form accuracies out of non-ferrous metals, a few crystalline materials and plastics. Furthermore, ultrasonic assistance has the ability to push these boundaries and enables the machining of materials like steel, which is not possible in a conventional way due to the excessive tool wear caused by the affinity of carbon to iron. Usually monocrystalline diamond tools are applied due to their unsurpassed cutting edge properties. New cutting tool material developments have shown that it is possible to produce tools made of nano-polycrystalline diamond with cutting edges equivalent to those of monocrystalline diamonds. In nano-polycrystalline diamond, ultra-fine grains of a few tens of nanometers are firmly and directly bonded together, creating an isotropic structure. The properties of this material are described to be isotropic, harder and tougher than those of monocrystalline diamonds, which are anisotropic. This publication presents machining results from the newest investigations of the process potential of this new polycrystalline cutting material. In order to provide a baseline with which to characterize the cutting material, cutting experiments on different conventionally machinable materials like copper or aluminum are performed. The results provide information on the roughness and the topography of the surface, focusing on a comparison with the results obtained when machining with monocrystalline diamond. Furthermore, the cutting material is tested in machining steel with ultrasonic assistance, with a focus on tool lifetime and surface roughness. An outlook on the machinability of other materials will be given.

  3. A sensitive, reproducible and objective immunofluorescence analysis method of dystrophin in individual fibers in samples from patients with duchenne muscular dystrophy.

    Chantal Beekman

    Duchenne muscular dystrophy (DMD) is characterized by the absence or reduced levels of dystrophin expression on the inner surface of the sarcolemmal membrane of muscle fibers. Clinical development of therapeutic approaches aiming to increase dystrophin levels requires sensitive and reproducible measurement of differences in dystrophin expression in muscle biopsies of treated patients with DMD. This, however, poses a technical challenge due to intra- and inter-donor variance in the occurrence of revertant fibers and low trace dystrophin expression throughout the biopsies. We have developed an immunofluorescence and semi-automated image analysis method that measures the sarcolemmal dystrophin intensity per individual fiber for the entire fiber population in a muscle biopsy. Cross-sections of muscle co-stained for dystrophin and spectrin have been imaged by confocal microscopy, and image analysis was performed using Definiens software. Dystrophin intensity has been measured in the sarcolemmal mask of spectrin for each individual muscle fiber, and multiple membrane intensity parameters (mean, maximum, quantiles) were calculated per fiber. A histogram can depict the distribution of dystrophin intensities for the fiber population in the biopsy. This method was tested by measuring dystrophin in DMD, Becker muscular dystrophy, and healthy muscle samples. Analysis of duplicate or quadruplicate sections of DMD biopsies on the same or multiple days, by different operators, or using different antibodies, was shown to be objective and reproducible (inter-assay precision, CV 2-17%, and intra-assay precision, CV 2-10%). Moreover, the method was sufficiently sensitive to detect consistently small differences in dystrophin between two biopsies from a patient with DMD before and after treatment with an investigational compound.
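
    The per-fiber measurement can be sketched as follows: given a labeled sarcolemmal mask derived from the spectrin channel, dystrophin intensity statistics are collected for each fiber and summarized in a histogram. The original analysis used Definiens; scikit-image and NumPy are used here purely to illustrate the idea, and the inputs are hypothetical arrays.

      # Sketch: per-fiber dystrophin intensity statistics within a spectrin-derived mask.
      import numpy as np
      from skimage.measure import regionprops

      def per_fiber_dystrophin(dystrophin_img, sarcolemma_labels, quantile=0.95):
          stats = []
          for region in regionprops(sarcolemma_labels, intensity_image=dystrophin_img):
              vals = region.intensity_image[region.image]   # pixels inside this fiber's membrane mask
              stats.append({"label": region.label,
                            "mean": float(vals.mean()),
                            "max": float(vals.max()),
                            "q": float(np.quantile(vals, quantile))})
          return stats

      # Example: histogram of per-fiber mean intensities for the whole fiber population.
      # hist, edges = np.histogram([s["mean"] for s in fiber_stats], bins=50)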

  4. An Exploration and Analysis of the Relationships among Object Oriented Programming, Hypermedia, and Hypertalk.

    Milet, Lynn K.; Harvey, Francis A.

    Hypermedia and object oriented programming systems (OOPs) represent examples of "open" computer environments that allow the user access to parts of the code or operating system. Both systems share fundamental intellectual concepts (objects, messages, methods, classes, and inheritance), so that an understanding of hypermedia can help in…

  5. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  6. Forecast skill score assessment of a relocatable ocean prediction system, using a simplified objective analysis method

    Onken, Reiner

    2017-11-01

    A relocatable ocean prediction system (ROPS) was applied to an observational data set which was collected in June 2014 in the waters to the west of Sardinia (western Mediterranean) in the framework of the REP14-MED experiment. The observational data, comprising more than 6000 temperature and salinity profiles from a fleet of underwater gliders and shipborne probes, were assimilated in the Regional Ocean Modeling System (ROMS), which is the heart of ROPS, and verified against independent observations from ScanFish tows by means of the forecast skill score as defined by Murphy (1993). A simplified objective analysis (OA) method was utilised for assimilation, taking account of only those profiles which were located within a predetermined time window W. As a result of a sensitivity study, the highest skill score was obtained for a correlation length scale C = 12.5 km, W = 24 h, and r = 1, where r is the ratio between the error of the observations and the background error, both for temperature and salinity. Additional ROPS runs showed that (i) the skill score of assimilation runs was mostly higher than the score of a control run without assimilation, (ii) the skill score increased with increasing forecast range, and (iii) the skill score for temperature was higher than the score for salinity in the majority of cases. Furthermore, it is demonstrated that the vast number of observations can be managed by the applied OA method without data reduction, enabling timely operational forecasts even on a commercially available personal computer or a laptop.
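
    The forecast skill score in the sense of Murphy (1993) reduces to SS = 1 − MSE(forecast)/MSE(reference), so a positive score means the forecast beats the reference (for example, a run without assimilation). A minimal sketch, with invented profile values standing in for the ScanFish verification data:

    import numpy as np

    def mse(pred, obs):
        pred, obs = np.asarray(pred, float), np.asarray(obs, float)
        return np.mean((pred - obs) ** 2)

    def skill_score(forecast, reference, observations):
        """Murphy-style MSE skill score: 1 - MSE(forecast)/MSE(reference)."""
        return 1.0 - mse(forecast, observations) / mse(reference, observations)

    obs         = [14.2, 14.0, 13.7, 13.1]   # verifying observations (e.g. temperature, deg C)
    assim_run   = [14.1, 13.9, 13.8, 13.2]   # forecast with assimilation
    control_run = [13.6, 13.5, 13.2, 12.8]   # control run without assimilation
    print(skill_score(assim_run, control_run, obs))  # > 0: assimilation beats the control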

  7. EXTRACTION OF BENTHIC COVER INFORMATION FROM VIDEO TOWS AND PHOTOGRAPHS USING OBJECT-BASED IMAGE ANALYSIS

    M. T. L. Estomata

    2012-07-01

    Full Text Available Mapping benthic cover in deep waters comprises a very small proportion of studies in the field of research. The majority of benthic cover mapping makes use of satellite images and usually, classification is carried out only for shallow waters. To map the seafloor in optically deep waters, underwater videos and photos are needed. Some researchers have applied this method on underwater photos, but made use of different classification methods such as neural networks and rapid classification via down-sampling. In this study, an attempt was made to use accurate bathymetric data obtained using a multi-beam echo sounder (MBES) as complementary data with the underwater photographs. Due to the absence of a motion reference unit (MRU), which applies correction to the data gathered by the MBES, the accuracy of the depth data was compromised. Nevertheless, even with the absence of accurate bathymetric data, object-based image analysis (OBIA), which used rule sets based on information such as shape, size, area, relative distance, and spectral information, was still applied. Compared to pixel-based classifications, OBIA was able to classify more specific benthic cover types other than coral and sand, such as rubble and fish. Through the use of rule sets on area, less than or equal to 700 pixels for fish and between 700 and 10,000 pixels for rubble, as well as standard deviation values to distinguish texture, fish and rubble were identified. OBIA produced benthic cover maps that had higher overall accuracy, 93.78±0.85%, as compared to pixel-based methods that had an average accuracy of only 87.30±6.11% (p-value = 0.0001, α = 0.05).
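
    The area and texture thresholds quoted above translate directly into a small rule set. The sketch below uses the pixel-count limits from the abstract; the texture threshold and the handling of the remaining classes are assumptions for illustration.

    def classify_object(area_px, std_dev, texture_threshold=25.0):
        """Assign a benthic cover label to one image object from its area and texture."""
        if area_px <= 700 and std_dev > texture_threshold:
            return "fish"
        if 700 < area_px <= 10_000 and std_dev > texture_threshold:
            return "rubble"
        return "other"   # coral, sand, etc. would be handled by further rules

    print(classify_object(450, 31.2))    # -> "fish"
    print(classify_object(5200, 28.7))   # -> "rubble"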

  8. Analysis Of Tourism Object Demand In The Pekanbaru City With Travel Cost Method

    Eriyati

    2017-11-01

    Full Text Available The tourism sector gets attention when world oil prices are decreasing. It cannot be denied that during this time the largest contribution to Pekanbaru city revenue from profit-sharing funding has come from the oil and gas sector. Currently, Pekanbaru's revenue from the oil and gas sector is small as oil prices continue to decline. The location of Pekanbaru City, away from the coast and mountains, has caused a focus on the development of artificial attractions such as Alam Mayang, the artificial lake Bandar Kayangan Lembah Sari, the Pekanbaru Mosque, and the tomb of the founder of Pekanbaru city. Many people bring their families to visit artificial tourist attractions on weekends and holidays. This study aims to determine the factors that affect the demand for, and economic value of, tourist attractions in Kota Pekanbaru with the travel cost method. Non-probability sampling of 100 respondents visiting attractions in Pekanbaru City was drawn from a population of 224,896 people using the Slovin formula; the data analysis method used in this research is a descriptive quantitative method. The results state that the factors that influence the demand for tourist attractions in the city of Pekanbaru are income, cost, and distance. The economic value of the tourism objects of Pekanbaru city, estimated with the travel cost method, is Rp42.679.638.400 per year. This reflects the price a person assigns to a site at a certain place and time, measured by the time, goods, or money they are willing to sacrifice to own or use the goods and services they want.
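
    As an illustration of the individual travel cost method mentioned above, a semi-log demand curve can be fitted to visit counts and travel costs, and its cost coefficient converted into a per-visit consumer surplus (CS ≈ −1/b for ln(visits) = a + b·cost). The data and functional form below are invented, not the study's.

    import numpy as np

    cost   = np.array([20, 35, 50, 65, 25, 80], dtype=float)  # travel cost (thousand Rp per trip)
    visits = np.array([6, 4, 3, 2, 5, 1], dtype=float)        # observed trips per year

    # Semi-log demand ln(visits) = a + b*cost; income and distance would enter as extra columns.
    X = np.column_stack([np.ones_like(cost), cost])
    (a, b), *_ = np.linalg.lstsq(X, np.log(visits), rcond=None)
    print("per-visit consumer surplus ~", -1.0 / b, "thousand Rp")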

  9. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    MULKEY, C.H.

    1999-07-06

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  10. Monitoring Urban Tree Cover Using Object-Based Image Analysis and Public Domain Remotely Sensed Data

    Meghan Halabisky

    2011-10-01

    Full Text Available Urban forest ecosystems provide a range of social and ecological services, but due to the heterogeneity of these canopies their spatial extent is difficult to quantify and monitor. Traditional per-pixel classification methods have been used to map urban canopies; however, such techniques are not generally appropriate for assessing these highly variable landscapes. Landsat imagery has historically been used for per-pixel driven land use/land cover (LULC) classifications, but the spatial resolution limits our ability to map small urban features. In such cases, hyperspatial resolution imagery such as aerial or satellite imagery with a resolution of 1 meter or below is preferred. Object-based image analysis (OBIA) allows for use of additional variables such as texture, shape, context, and other cognitive information provided by the image analyst to segment and classify image features, and thus, improve classifications. As part of this research we created LULC classifications for a pilot study area in Seattle, WA, USA, using OBIA techniques and freely available public aerial photography. We analyzed the differences in accuracies which can be achieved with OBIA using multispectral and true-color imagery. We also compared our results to a satellite based OBIA LULC and discussed the implications of per-pixel driven vs. OBIA-driven field sampling campaigns. We demonstrated that the OBIA approach can generate good and repeatable LULC classifications suitable for tree cover assessment in urban areas. Another important finding is that spectral content appeared to be more important than spatial detail of hyperspatial data when it comes to an OBIA-driven LULC.

  11. Pricing index-based catastrophe bonds: Part 2: Object-oriented design issues and sensitivity analysis

    Unger, André J. A.

    2010-02-01

    This work is the second installment in a two-part series, and focuses on object-oriented programming methods to implement an augmented-state variable approach to aggregate the PCS index and introduce the Bermudan-style call feature into the proposed CAT bond model. The PCS index is aggregated quarterly using a discrete Asian running-sum formulation. The resulting aggregate PCS index augmented-state variable is used to specify the payoff (principal) on the CAT bond based on reinsurance layers. The purpose of the Bermudan-style call option is to allow the reinsurer to minimize their interest rate risk exposure on making fixed coupon payments under prevailing interest rates. A sensitivity analysis is performed to determine the impact of uncertainty in the frequency and magnitude of hurricanes on the price of the CAT bond. Results indicate that while the CAT bond is highly sensitive to the natural variability in the frequency of landfalling hurricanes between El Niño and non-El Niño years, it remains relatively insensitive to uncertainty in the magnitude of damages. In addition, results indicate that the maximum price of the CAT bond is insensitive to whether it is engineered to cover low frequency high magnitude events in a 'high' reinsurance layer relative to high frequency low magnitude events in a 'low' reinsurance layer. Also, while it is possible for the reinsurer to minimize their interest rate risk exposure on the fixed coupon payments, the impact of this risk on the price of the CAT bond appears small relative to the natural variability in the CAT bond price, and consequently catastrophic risk, due to uncertainty in the frequency and magnitude of landfalling hurricanes.
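
    The quarterly discrete Asian running sum and the layer-based payoff can be sketched in a few lines. This is not the authors' object-oriented implementation; the layer attachment and exhaustion points and the simulated losses are illustrative.

    import numpy as np

    def aggregate_index(quarterly_losses):
        """Discrete Asian running sum of the PCS index (the augmented state variable)."""
        return np.cumsum(np.asarray(quarterly_losses, dtype=float))

    def principal_payout(aggregate_loss, attachment=10.0, exhaustion=30.0, principal=100.0):
        """Principal left to bondholders after the layer between attachment and exhaustion erodes."""
        eroded = np.clip(aggregate_loss - attachment, 0.0, exhaustion - attachment)
        return principal * (1.0 - eroded / (exhaustion - attachment))

    losses = [0.0, 12.0, 3.0, 9.0]            # simulated quarterly PCS losses (index points)
    agg = aggregate_index(losses)
    print(principal_payout(agg[-1]))          # principal remaining at maturity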

  12. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  13. Data Quality Objectives for Regulatory Requirements for Hazardous and Radioactive Air Emissions Sampling and Analysis; FINAL

    MULKEY, C.H.

    1999-01-01

    This document describes the results of the data quality objective (DQO) process undertaken to define data needs for state and federal requirements associated with toxic, hazardous, and/or radiological air emissions under the jurisdiction of the River Protection Project (RPP). Hereafter, this document is referred to as the Air DQO. The primary drivers for characterization under this DQO are the regulatory requirements pursuant to Washington State regulations that may require sampling and analysis. The federal regulations concerning air emissions are incorporated into the Washington State regulations. Data needs exist for nonradioactive and radioactive waste constituents and characteristics as identified through the DQO process described in this document. The purpose is to identify current data needs for complying with regulatory drivers for the measurement of air emissions from RPP facilities in support of air permitting. These drivers include best management practices; similar analyses may have more than one regulatory driver. This document should not be used for determining overall compliance with regulations because the regulations are in constant change, and this document may not reflect the latest regulatory requirements. Regulatory requirements are also expected to change as various permits are issued. Data needs require samples for both radionuclides and nonradionuclide analytes of air emissions from tanks and stored waste containers. The collection of data is to support environmental permitting and compliance, not for health and safety issues. This document does not address health or safety regulations or requirements (those of the Occupational Safety and Health Administration or the National Institute of Occupational Safety and Health) or continuous emission monitoring systems. This DQO is applicable to all equipment, facilities, and operations under the jurisdiction of RPP that emit or have the potential to emit regulated air pollutants.

  14. Feature extraction and selection for objective gait analysis and fall risk assessment by accelerometry

    Cremer Gerald

    2011-01-01

    Full Text Available Abstract Background Falls in the elderly are nowadays a major concern because of their consequences on elderly people's general health and morale. Moreover, the aging of the population and the increasing life expectancy make the prediction of falls more and more important. The analysis presented in this article makes a first step in this direction, providing a way to analyze gait and classify hospitalized elderly fallers and non-fallers. This tool, based on an accelerometer network and signal processing, gives objective information about the gait and does not need any special gait laboratory as optical analysis does. The tool is also simple to use by a non-expert and can therefore be widely used on a large set of patients. Method A population of 20 hospitalized elderly subjects was asked to execute several classical clinical tests evaluating their risk of falling. They were also asked if they had experienced any fall in the last 12 months. The accelerations of the limbs were recorded during the clinical tests with an accelerometer network distributed on the body. A total of 67 features were extracted from the accelerometric signal recorded during a simple 25 m walking test at comfort speed. A feature selection algorithm was used to select those able to classify subjects at risk and not at risk for several types of classification algorithms. Results The results showed that several classification algorithms were able to discriminate people from the two groups of interest: faller and non-faller hospitalized elderly subjects. The classification performances of the used algorithms were compared. Moreover, a subset of the 67 features was considered to be significantly different between the two groups using a t-test. Conclusions This study gives a method to classify a population of hospitalized elderly subjects into two groups, at risk of falling or not at risk, based on accelerometric data. This is a first step to design a risk of falling assessment system that could be used to provide
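
    A univariate screen of the kind mentioned above can be sketched as a two-sample t-test per feature, keeping only features whose group means differ significantly between fallers and non-fallers. The feature values below are simulated; the study also used dedicated feature-selection and classification algorithms beyond this simple filter.

    import numpy as np
    from scipy import stats

    def select_features(X_fallers, X_nonfallers, alpha=0.05):
        """Indices of features whose means differ between groups (Welch two-sample t-test)."""
        _, p = stats.ttest_ind(X_fallers, X_nonfallers, axis=0, equal_var=False)
        return np.flatnonzero(p < alpha)

    rng = np.random.default_rng(0)
    X_f  = rng.normal(loc=[1.2, 0.8, 0.5], scale=0.2, size=(10, 3))   # fallers, 3 gait features
    X_nf = rng.normal(loc=[1.2, 1.4, 0.5], scale=0.2, size=(10, 3))   # non-fallers
    print(select_features(X_f, X_nf))   # likely keeps only the shifted feature (index 1)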

  15. Analysis of disease-associated objects at the Rat Genome Database

    Wang, Shur-Jen; Laulederkind, Stanley J. F.; Hayman, G. T.; Smith, Jennifer R.; Petri, Victoria; Lowry, Timothy F.; Nigam, Rajni; Dwinell, Melinda R.; Worthey, Elizabeth A.; Munzenmaier, Diane H.; Shimoyama, Mary; Jacob, Howard J.

    2013-01-01

    The Rat Genome Database (RGD) is the premier resource for genetic, genomic and phenotype data for the laboratory rat, Rattus norvegicus. In addition to organizing biological data from rats, the RGD team focuses on manual curation of gene–disease associations for rat, human and mouse. In this work, we have analyzed disease-associated strains, quantitative trait loci (QTL) and genes from rats. These disease objects form the basis for seven disease portals. Among disease portals, the cardiovascular disease and obesity/metabolic syndrome portals have the highest number of rat strains and QTL. These two portals share 398 rat QTL, and these shared QTL are highly concentrated on rat chromosomes 1 and 2. For disease-associated genes, we performed gene ontology (GO) enrichment analysis across portals using RatMine enrichment widgets. Fifteen GO terms, five from each GO aspect, were selected to profile enrichment patterns of each portal. Of the selected biological process (BP) terms, ‘regulation of programmed cell death’ was the top enriched term across all disease portals except in the obesity/metabolic syndrome portal where ‘lipid metabolic process’ was the most enriched term. ‘Cytosol’ and ‘nucleus’ were common cellular component (CC) annotations for disease genes, but only the cancer portal genes were highly enriched with ‘nucleus’ annotations. Similar enrichment patterns were observed in a parallel analysis using the DAVID functional annotation tool. The relationship between the preselected 15 GO terms and disease terms was examined reciprocally by retrieving rat genes annotated with these preselected terms. The individual GO term–annotated gene list showed enrichment in physiologically related diseases. For example, the ‘regulation of blood pressure’ genes were enriched with cardiovascular disease annotations, and the ‘lipid metabolic process’ genes with obesity annotations. Furthermore, we were able to enhance enrichment of neurological

  16. Concept Maps as Instructional Tools for Improving Learning of Phase Transitions in Object-Oriented Analysis and Design

    Shin, Shin-Shing

    2016-01-01

    Students attending object-oriented analysis and design (OOAD) courses typically encounter difficulties transitioning from requirements analysis to logical design and then to physical design. Concept maps have been widely used in studies of user learning. The study reported here, based on the relationship of concept maps to learning theory and…

  17. Medical Assistance in Dying in Canada: An Ethical Analysis of Conscientious and Religious Objections

    Christie, Timothy

    2016-08-01

    Full Text Available Background: The Supreme Court of Canada (SCC) has ruled that the federal government is required to remove the provisions of the Criminal Code of Canada that prohibit medical assistance in dying (MAID). The SCC has stipulated that individual physicians will not be required to provide MAID should they have a religious or conscientious objection. Therefore, the pending legislative response will have to balance the rights of the patients with the rights of physicians, other health care professionals, and objecting institutions. Objective: The objective of this paper is to critically assess, within the Canadian context, the moral probity of individual or institutional objections to MAID that are for either religious or conscientious reasons. Methods: Deontological ethics and the Doctrine of Double Effect. Results: The religious or conscientious objector has conflicting duties, i.e., a duty to respect the “right to life” (section 7 of the Charter) and a duty to respect the tenets of his or her religious or conscientious beliefs (protected by section 2 of the Charter). Conclusion: The discussion of religious or conscientious objections to MAID has not explicitly considered the competing duties of the conscientious objector. It has focussed on the fact that a conscientious objection exists and has ignored the normative question of whether the duty to respect one’s conscience or religion supersedes the duty to respect the patient’s right to life.

  18. N-of-1-pathways MixEnrich: advancing precision medicine via single-subject analysis in discovering dynamic changes of transcriptomes.

    Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A

    2017-05-24

    Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed to address single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples of a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways with up- and down-regulated genes (bidirectional dysregulation) that are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectional and concordantly dysregulated pathways one patient at a time. We assess its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in the pathway or in the presence of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and the HNSCCs data analysis (ROC Curves; higher true positive rates; lower false positive rates). Bidirectional and concordant dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard compared to other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods to meet the promise of providing accurate personal transcriptome analysis to support precision medicine at point of care.
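
    The second stage of such an approach can be illustrated with a plain gene set enrichment test: once a mixture model has labelled each gene as dysregulated (up- and down-regulated pooled together, so bidirectional pathways are not missed) or unchanged, the pathway is tested for over-representation of dysregulated genes. The sketch below is not the published MixEnrich code; gene identifiers and counts are invented.

    from scipy.stats import fisher_exact

    def pathway_enrichment(dysregulated, background, pathway_genes):
        """One-sided Fisher test: dysregulated vs. unchanged, inside vs. outside the pathway."""
        dys, bg, path = set(dysregulated), set(background), set(pathway_genes)
        in_dys   = len(path & dys)
        in_unch  = len(path & bg) - in_dys
        out_dys  = len(dys) - in_dys
        out_unch = len(bg) - len(dys) - in_unch
        _, p = fisher_exact([[in_dys, in_unch], [out_dys, out_unch]], alternative="greater")
        return p

    background   = [f"g{i}" for i in range(100)]
    dysregulated = [f"g{i}" for i in range(15)]              # up- and down-regulated pooled
    pathway      = [f"g{i}" for i in range(10)] + ["g50", "g60"]
    print(pathway_enrichment(dysregulated, background, pathway))   # small p: pathway is enriched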

  19. High-Precision Land-Cover-Land-Use GIS Mapping and Land Availability and Suitability Analysis for Grass Biomass Production in the Aroostook River Valley, Maine, USA

    Chunzeng Wang

    2015-03-01

    Full Text Available High-precision land-cover-land-use GIS mapping was performed in four major townships in Maine’s Aroostook River Valley, using on-screen digitization and direct interpretation of very high spatial resolution satellite multispectral imagery (15–60 cm) and high spatial resolution LiDAR data (2 m), and the field mapping method. The project not only provides the first-ever high-precision land-use maps for northern Maine, but it also yields accurate hectarage estimates of different land-use types, in particular grassland, defined as fallow land, pasture, and hay field. This enables analysis of potential land availability and suitability for grass biomass production and other sustainable land uses. The results show that the total area of fallow land in the four towns is 7594 hectares, which accounts for 25% of total open land, and that fallow plots equal to or over four hectares in size total 4870 hectares, or 16% of open land. Union overlay analysis, using the Natural Resources Conservation Service (NRCS) soil data, indicates that only a very small percentage of grassland (4.9%) is on "poorly-drained" or "very-poorly-drained" soils, and that most grassland (85%) falls into the "farmland of state importance" or "prime farmland" categories, as determined by NRCS. It is concluded that Maine’s Aroostook River Valley has an ample base of suitable, underutilized land for producing grass biomass.

  20. Development and application of objective uncertainty measures for nuclear power plant transient analysis [Dissertation 3897]

    Vinai, P.

    2007-10-15

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire

  1. Development and application of objective uncertainty measures for nuclear power plant transient analysis

    Vinai, P.

    2007-10-01

    For the development, design and licensing of a nuclear power plant (NPP), a sound safety analysis is necessary to study the diverse physical phenomena involved in the system behaviour under operational and transient conditions. Such studies are based on detailed computer simulations. With the progress achieved in computer technology and the greater availability of experimental and plant data, the use of best estimate codes for safety evaluations has gained increasing acceptance. The application of best estimate safety analysis has raised new problems that need to be addressed: it has become more crucial to assess how reliable code predictions are, especially when they need to be compared against safety limits that must not be crossed. It becomes necessary to identify and quantify the various possible sources of uncertainty that affect the reliability of the results. Currently, such uncertainty evaluations are generally based on experts' opinion. In the present research, a novel methodology based on a non-parametric statistical approach has been developed for objective quantification of best-estimate code uncertainties related to the physical models used in the code. The basis is an evaluation of the accuracy of a given physical model achieved by comparing its predictions with experimental data from an appropriate set of separate-effect tests. The differences between measurements and predictions can be considered stochastically distributed, and thus a statistical approach can be employed. The first step was the development of a procedure for investigating the dependence of a given physical model's accuracy on the experimental conditions. Each separate-effect test effectively provides a random sample of discrepancies between measurements and predictions, corresponding to a location in the state space defined by a certain number of independent system variables. As a consequence, the samples of 'errors', achieved from analysis of the entire database, are

  2. A Case Study on Coloured Petri Nets in Object-oriented Analysis and Design

    Barros, Joao Paulo; Jørgensen, Jens Bæk

    2005-01-01

    In this paper, we first demonstrate how a coloured Petri nets (CPN) model can be used to capture requirements for a considered example system, an elevator controller. Then, we show how this requirements-level CPN model is transformed into a design-level object-oriented CPN model, which is structurally and conceptually closer to class diagrams and object-oriented programming languages. The CPN models reduce the gap between user-level requirements and the respective implementation, thus simplifying the implementation or code generation. Finally, we discuss the code generation from object-oriented...

  3. Detecting peatland drains with Object Based Image Analysis and Geoeye-1 imagery.

    Connolly, J; Holden, N M

    2017-12-01

    Peatlands play an important role in the global carbon cycle. They provide important ecosystem services including carbon sequestration and storage. Drainage disturbs peatland ecosystem services. Mapping drains is difficult and expensive and their spatial extent is, in many cases, unknown. An object based image analysis (OBIA) was performed on a very high resolution satellite image (Geoeye-1) to extract information about drain location and extent on a blanket peatland in Ireland. Two accuracy assessment methods, the error matrix and the completeness, correctness and quality (CCQ), were used to assess the extracted data across the peatland and at several sub sites. The cost of the OBIA method was compared with manual digitisation and field survey. The drain maps were also used to assess the costs relating to blocking drains vs. a business-as-usual scenario and estimating the impact of each on carbon fluxes at the study site. The OBIA method performed well at almost all sites. Almost 500 km of drains were detected within the peatland. In the error matrix method, overall accuracy (OA) of detecting the drains was 94% and the kappa statistic was 0.66. The OA for all sub-areas, except one, was 95-97%. The CCQ was 85%, 85% and 71% respectively. The OBIA method was the most cost effective way to map peatland drains and was at least 55% cheaper than field survey or manual digitisation. The extracted drain maps were used to constrain the study area CO2 flux, which was 19% smaller than the prescribed Peatland Code value for drained peatlands. The OBIA method used in this study showed that it is possible to accurately extract maps of fine scale peatland drains over large areas in a cost effective manner. The development of methods to map the spatial extent of drains is important as they play a critical role in peatland carbon dynamics. The objective of this study was to extract data on the spatial extent of drains on a blanket bog in the west of Ireland. The
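
    The two headline accuracy figures, overall accuracy and the kappa statistic, come straight from a 2×2 error (confusion) matrix of drain/non-drain reference points. A minimal sketch with invented counts:

    import numpy as np

    def overall_accuracy_and_kappa(confusion):
        """confusion[i, j] = reference class i mapped to class j (here: drain / non-drain)."""
        c = np.asarray(confusion, dtype=float)
        n = c.sum()
        po = np.trace(c) / n                                   # observed agreement (OA)
        pe = (c.sum(axis=0) * c.sum(axis=1)).sum() / n ** 2    # chance agreement
        return po, (po - pe) / (1.0 - pe)

    oa, kappa = overall_accuracy_and_kappa([[60, 10],     # reference drain points
                                            [15, 415]])   # reference non-drain points
    print(f"OA = {oa:.2%}, kappa = {kappa:.2f}")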

  4. Detecting peatland drains with Object Based Image Analysis and Geoeye-1 imagery

    J. Connolly

    2017-03-01

    Full Text Available Abstract Background Peatlands play an important role in the global carbon cycle. They provide important ecosystem services including carbon sequestration and storage. Drainage disturbs peatland ecosystem services. Mapping drains is difficult and expensive and their spatial extent is, in many cases, unknown. An object based image analysis (OBIA) was performed on a very high resolution satellite image (Geoeye-1) to extract information about drain location and extent on a blanket peatland in Ireland. Two accuracy assessment methods, the error matrix and the completeness, correctness and quality (CCQ), were used to assess the extracted data across the peatland and at several sub sites. The cost of the OBIA method was compared with manual digitisation and field survey. The drain maps were also used to assess the costs relating to blocking drains vs. a business-as-usual scenario and estimating the impact of each on carbon fluxes at the study site. Results The OBIA method performed well at almost all sites. Almost 500 km of drains were detected within the peatland. In the error matrix method, overall accuracy (OA) of detecting the drains was 94% and the kappa statistic was 0.66. The OA for all sub-areas, except one, was 95–97%. The CCQ was 85%, 85% and 71% respectively. The OBIA method was the most cost effective way to map peatland drains and was at least 55% cheaper than field survey or manual digitisation. The extracted drain maps were used to constrain the study area CO2 flux, which was 19% smaller than the prescribed Peatland Code value for drained peatlands. Conclusions The OBIA method used in this study showed that it is possible to accurately extract maps of fine scale peatland drains over large areas in a cost effective manner. The development of methods to map the spatial extent of drains is important as they play a critical role in peatland carbon dynamics. The objective of this study was to extract data on the spatial extent of

  5. Objective Acoustic-Phonetic Speech Analysis in Patients Treated for Oral or Oropharyngeal Cancer

    de Bruijn, Marieke J.; ten Bosch, Louis; Kuik, Dirk J.; Quene, Hugo; Langendijk, Johannes A.; Leemans, C. Rene; Verdonck-de Leeuw, Irma M.

    2009-01-01

    Objective: Speech impairment often occurs in patients after treatment for head and neck cancer. New treatment modalities such as surgical reconstruction or (chemo) radiation techniques aim at sparing anatomical structures that are correlated with speech and swallowing. In randomized trials

  6. Learning Objectives and Testing: An Analysis of Six Principles of Economics Textbooks, Using Bloom's Taxonomy.

    Karns, James M. L.; And Others

    1983-01-01

    Significant differences were found between the stated objectives of most college level economics textbooks and the instruments included in the instructor's manuals to measure student achievement. (Author/RM)

  7. Analysis of students’ spatial thinking in geometry: 3D object into 2D representation

    Fiantika, F. R.; Maknun, C. L.; Budayasa, I. K.; Lukito, A.

    2018-05-01

    The aim of this study is to find out the spatial thinking process of students in transforming a 3-dimensional (3D) object into a 2-dimensional (2D) representation. Spatial thinking is helpful in using maps, planning routes, designing floor plans, and creating art. Students can engage with geometric ideas by using concrete models and drawing. Spatial thinking in this study is identified through geometrical problems of transforming a 3-dimensional object into a 2-dimensional object image. The problem was solved by the subjects and analyzed by reference to predetermined spatial thinking indicators. Two representative elementary school subjects were chosen based on mathematical ability and visual learning style. An explorative descriptive qualitative approach was used in this study. The results of this study are: 1) there are different representations of spatial thinking between the boy and the girl subject, 2) the subjects have their own ways of finding the fastest way to draw a cube net.

  8. Analysis of double support phase of biped robot and multi-objective ...

    ing objectives, namely power consumption and dynamic balance margin have been ... in detail to arrive at a complete knowledge of the biped walking systems on .... measured in the anti-clockwise sense with respect to the vertical axis.

  9. DGTD Analysis of Electromagnetic Scattering from Penetrable Conductive Objects with IBC

    Li, Ping; Shi, Yifei; Jiang, Li; Bagci, Hakan

    2015-01-01

    To avoid straightforward volumetric discretization, a discontinuous Galerkin time-domain (DGTD) method integrated with the impedance boundary condition (IBC) is presented in this paper to analyze the scattering from objects with finite conductivity

  10. Measurement precision and biological variation of cranial arteries using automated analysis of 3 T magnetic resonance angiography

    Amin, Faisal Mohammad; Lundholm, Elisabet; Hougaard, Anders

    2014-01-01

    BACKGROUND: Non-invasive magnetic resonance angiography (MRA) has facilitated repeated measurements of human cranial arteries in several headache and migraine studies. To ensure comparability across studies the same automated analysis software has been used, but the intra- and interobserver, day-...

  11. Ion chromatography for the precise analysis of chloride and sodium in sweat for the diagnosis of cystic fibrosis

    Doorn, J.; Storteboom, T. T. R.; Mulder, A. M.; de Jong, W. H. A.; Rottier, B. L.; Kema, I. P.

    BACKGROUND: Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method, suitable for the analysis

  12. Scientific analysis of a calcified object from a post-medieval burial in Vienna, Austria.

    Binder, Michaela; Berner, Margit; Krause, Heike; Kucera, Matthias; Patzak, Beatrix

    2016-09-01

    Calcifications commonly occur in association with soft tissue inflammation. However, they are not often discussed in the palaeopathological literature, frequently due to problems of identification and diagnosis. We present a calcified object (40×27×27 mm) found with a middle-aged male from a post-medieval cemetery in Vienna. It was not recognized during excavation, thus its anatomical location within the body remains unknown. The object was subjected to X-ray, SEM and CT scanning and compared to historic pathological objects held in the collection of the Natural History Museum Vienna. Two of the closest resemblance, a thyroid adenoma and a goitre, were subjected to similar analytical techniques for comparison. Despite similarities between all objects, the structure of the object most closely conforms to a thyroid tumor. Nevertheless, due to similar pathophysiological pathways and biochemical composition of calcified soft tissue, a secure identification outside of its anatomical context is not possible. The research further highlights the fact that recognition of such objects during excavation is crucial for a more conclusive diagnosis. Historic medical records indicate that they were common and might therefore be expected to frequently occur in cemeteries. Consequently, increasing the dataset of calcifications would also aid in extending the knowledge about diseases in past human populations. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Introduction to precise numerical methods

    Aberth, Oliver

    2007-01-01

    Precise numerical analysis may be defined as the study of computer methods for solving mathematical problems either exactly or to prescribed accuracy. This book explains how precise numerical analysis is constructed. The book also provides exercises which illustrate points from the text and references for the methods presented. All disc-based content for this title is now available on the Web. · Clearer, simpler descriptions and explanations of the various numerical methods · Two new types of numerical problems; accurately solving partial differential equations with the included software and computing line integrals in the complex plane.

  14. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience.

    Dunstone, Kimberley; Brennan, Emily; Slater, Michael D; Dixon, Helen G; Durkin, Sarah J; Pettigrew, Simone; Wakefield, Melanie A

    2017-04-11

    Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of different characteristics of alcohol harm reduction ads. Given

  15. Alcohol harm reduction advertisements: a content analysis of topic, objective, emotional tone, execution and target audience

    Kimberley Dunstone

    2017-04-01

    Full Text Available Abstract Background Public health mass media campaigns may contribute to reducing the health and social burden attributed to alcohol consumption, but little is known about which advertising characteristics have been used, or have been effective, in alcohol harm reduction campaigns to date. As a first step towards encouraging further research to identify the impact of various advertising characteristics, this study aimed to systematically identify and examine the content of alcohol harm reduction advertisements (ads). Method Ads were identified through an exhaustive internet search of Google, YouTube, Vimeo, and relevant government and health agency websites. Eligible ads were: English language, produced between 2006 and 2014, not primarily focused on drink-driving or alcohol in pregnancy, and not alcohol industry funded. Systematic content analysis of all ads was performed; each ad was double-coded. Results In total, 110 individual ads from 72 different alcohol harm reduction campaigns were identified, with the main source countries being Australia (40%) and the United Kingdom (26%). The dominant topic for 52% of ads was short-term harms, while 10% addressed long-term harms, 18% addressed underage drinking, 17% communicated a how-to-change message, and 3% advocated for policy change. The behavioural objective of most ads was to motivate audiences to reduce their alcohol consumption (38%) or to behave responsibly and/or not get drunk when drinking (33%). Only 10% of all ads mentioned low-risk drinking guidelines. Eighty-seven percent of ads used a dramatisation execution style and 74% had a negative emotional tone. Ninety percent of ads contained messages or content that appeared to target adults, and 36% specifically targeted young adults. Conclusions Some message attributes have been employed more frequently than others, suggesting several promising avenues for future audience or population-based research to compare the relative effectiveness of

  16. Optically continuous silcrete quartz cements of the St. Peter Sandstone: High precision oxygen isotope analysis by ion microprobe

    Kelly, Jacque L.; Fu, Bin; Kita, Noriko T.; Valley, John W.

    2007-08-01

    A detailed oxygen isotope study of detrital quartz and authigenic quartz overgrowths from shallowly buried (ratio by laser fluorination, resulting in an average δ 18O of 10.0 ± 0.2‰ (1SD, n = 109). Twelve thin sections were analyzed by CAMECA-1280 ion microprobe (6-10 μm spot size, analytical precision better than ±0.2‰, 1SD). Detrital quartz grains have an average δ 18O of 10.0 ± 1.4‰ (1SD, n = 91) identical to the data obtained by laser fluorination. The ion microprobe data reveal true variability that is otherwise lost by homogenization of powdered samples necessary for laser fluorination. Laser fluorination uses samples that are one million times larger than the ion microprobe. Whole rock (WR) samples from the 53 rocks were analyzed by laser fluorination, giving δ 18O between 9.8‰ and 16.7‰ ( n = 110). Quartz overgrowths in thin sections from 10 rocks were analyzed by ion microprobe and average δ 18O = 29.3 ± 1.0‰ (1SD, n = 161). Given the similarity, on average, of δ 18O for all detrital quartz grains and for all quartz overgrowths, samples with higher δ 18O(WR) values can be shown to have more cement. The quartz cement in the 53 rocks, calculated by mass balance, varies from outlier at 33 vol.% cement. Eolian samples have an average of 11% cement compared to marine samples, which average 4% cement. Two models for quartz cementation have been investigated: high temperature (50-110 °C) formation from ore-forming brines related to Mississippi Valley Type (MVT) mineralization and formation as silcretes at low temperature (10-30 °C). The homogeneity of δ 18O for quartz overgrowths determined by ion microprobe rules out a systematic regional variation of temperature as predicted for MVT brines and there are no other known heating events in these sediments that were never buried to depths >1 km. The data in this study suggest that quartz overgrowths formed as silcretes in the St. Peter Sandstone from meteoric water with δ 18O values of -10

  17. Precise determination of the f0(500) and f0(980) parameters in dispersive analysis of the ππ data

    Kamiński, Robert; Garcia-Martin, R.; Pelaez, J.R.; Ruiz de Elvira, J.

    2013-01-01

    Use of the new and precise dispersive equations with an imposed crossing symmetry condition to solve the long-standing puzzle in the parameters of the f0(500), as well as the f0(980), is presented. This puzzle is finally being settled thanks to analyses carried out during the last years [J. Beringer et al. (Particle Data Group), Phys. Rev. D86, (2012) 010001]. In this report we show how our very recent dispersive data analysis allowed for a precise and model-independent determination of the amplitudes for the S, P, D and F waves [R. Garcia-Martin, R. Kaminski, J. R. Pelaez, J. Ruiz de Elvira and F.J. Yndurain, Phys. Rev. D83, (2011) 074004; R. Garcia-Martin, R. Kamiński, J.R. Pelaez and J. Ruiz de Elvira, Phys. Rev. Lett. 107, (2011) 072001; R. Kamiński, Phys. Rev. D83, (2011) 076008]. In particular, we show that the analytic continuation of once-subtracted dispersion relations for the S0 wave to the complex energy plane leads to very precise results for the f0(500) pole: √(s_pole) = 457^{+14}_{−13} − i 279^{+11}_{−7} MeV, and for the f0(980) pole: √(s_pole) = 996 ± 7 − i 25^{+10}_{−6} MeV. We also mention one of the first practical applications of the presented dispersion relations in refitting and significantly improving the ππ S-wave amplitudes below 1000 MeV.

  18. Land Cover/Land Use Classification and Change Detection Analysis with Astronaut Photography and Geographic Object-Based Image Analysis

    Hollier, Andi B.; Jagge, Amy M.; Stefanov, William L.; Vanderbloemen, Lisa A.

    2017-01-01

    For over fifty years, NASA astronauts have taken exceptional photographs of the Earth from the unique vantage point of low Earth orbit (as well as from lunar orbit and the surface of the Moon). The Crew Earth Observations (CEO) Facility is the NASA ISS payload supporting astronaut photography of the Earth surface and atmosphere. From aurora to mountain ranges, deltas, and cities, there are over two million images of the Earth's surface dating back to the Mercury missions in the early 1960s. The Gateway to Astronaut Photography of Earth website (eol.jsc.nasa.gov) provides a publicly accessible platform to query and download these images at a variety of spatial resolutions and perform scientific research at no cost to the end user. As a demonstration to the science, application, and education user communities we examine astronaut photography of the Washington D.C. metropolitan area for three time steps between 1998 and 2016 using Geographic Object-Based Image Analysis (GEOBIA) to classify and quantify land cover/land use and provide a template for future change detection studies with astronaut photography.

  19. Global analysis of general SU(2)xSU(2)xU(1) models with precision data

    Hsieh, Ken; Yu, Jiang-Hao; Yuan, C.-P.; Schmitz, Kai

    2010-01-01

    We present the results of a global analysis of a class of models with an extended electroweak gauge group of the form SU(2)xSU(2)xU(1), often denoted as G(221) models, which include as examples the left-right, the leptophobic, the hadrophobic, the fermiophobic, the un-unified, and the nonuniversal models. Using an effective Lagrangian approach, we compute the shifts to the coefficients in the electroweak Lagrangian due to the new heavy gauge bosons, and obtain the lower bounds on the masses of the Z ' and W ' bosons. The analysis of the electroweak parameter bounds reveals a consistent pattern of several key observables that are especially sensitive to the effects of new physics and thus dominate the overall shape of the respective parameter contours.

  20. Precision analysis of 15N-labelled samples with the emission spectrometer NOI-5 for nitrogen balance in field trials

    Lippold, H.

    1984-01-01

    A technique was adapted for the preparation of samples with 15N to be analyzed with the emission spectrometer NOI-5. This technique is based on methods of analyzing 15N-labelled gas samples in denitrification experiments. Nitrogen released from ammonium compounds by using hypobromite is injected into a repeatedly usable gaseous discharge tube where it is freed from water traces by means of the molecular sieve 5A. The described procedure of activating the molecular sieve allows spectra of reproducible quality to be recorded, thus promising an accuracy of analysis of ±0.003 at% in the range of natural isotope abundance and the possibility of soil nitrogen analysis in field trials with fertilizers of low nitrogen content (3 to 6.5 at%; corresponding to 0.055 to 0.14% N_t of soils) without being dependent on mass spectrometers. (author)

  1. Accuracy and precision of oscillometric blood pressure in standing conscious horses

    Olsen, Emil; Pedersen, Tilde Louise Skovgaard; Robinson, Rebecca

    2016-01-01

    from a teaching and research herd. HYPOTHESIS/OBJECTIVE: To evaluate the accuracy and precision of systolic arterial pressure (SAP), diastolic arterial pressure (DAP), and mean arterial pressure (MAP) in conscious horses obtained with an oscillometric NIBP device when compared to invasively measured...... administration. Agreement analysis with replicate measures was utilized to calculate bias (accuracy) and standard deviation (SD) of bias (precision). RESULTS: A total of 252 pairs of invasive arterial BP and NIBP measurements were analyzed. Compared to the direct BP measures, the NIBP MAP had an accuracy of -4...... mm Hg and precision of 10 mm Hg. SAP had an accuracy of -8 mm Hg and a precision of 17 mm Hg and DAP had an accuracy of -7 mm Hg and a precision of 14 mm Hg. CONCLUSIONS AND CLINICAL RELEVANCE: MAP from the evaluated NIBP monitor is accurate and precise in the adult horse across a range of BP...
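
    The agreement analysis described above boils down to the mean and standard deviation of the paired NIBP-minus-invasive differences (bias and precision). The sketch below ignores the replicate-measures correction used in the study, and the paired pressures are invented:

    import numpy as np

    def bias_and_precision(nibp, invasive):
        """Bias = mean of paired differences; precision = SD of those differences."""
        d = np.asarray(nibp, dtype=float) - np.asarray(invasive, dtype=float)
        return d.mean(), d.std(ddof=1)

    nibp_map     = [78, 85, 92, 70, 88, 95]     # oscillometric MAP readings (mm Hg)
    invasive_map = [82, 88, 95, 76, 90, 101]    # paired invasive MAP readings (mm Hg)
    bias, sd = bias_and_precision(nibp_map, invasive_map)
    print(f"MAP bias = {bias:.1f} mm Hg, precision (SD) = {sd:.1f} mm Hg")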

  2. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
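
    The core comparison in a marker-free migration analysis of this kind can be sketched with homogeneous 4×4 transforms: the stem and bone registrations between the two CT datasets are composed, and the relative transform is decomposed into a translation vector and a rotation angle. This is a hedged sketch, not the published CTSA implementation; the example transforms are invented.

    import numpy as np

    def relative_motion(T_stem, T_bone):
        """Stem motion relative to bone between two CT datasets (4x4 homogeneous transforms)."""
        T_rel = np.linalg.inv(T_bone) @ T_stem
        translation = T_rel[:3, 3]
        cos_angle = np.clip((np.trace(T_rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
        return translation, np.degrees(np.arccos(cos_angle))

    T_stem = np.eye(4); T_stem[:3, 3] = [0.10, -0.05, 0.30]   # stem registration result (mm shift)
    T_bone = np.eye(4)                                         # bone registration result (no motion)
    t, angle = relative_motion(T_stem, T_bone)
    print(t, angle)   # ~[0.10, -0.05, 0.30] mm translation, ~0 deg rotation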

  3. Analysis of Greedy Decision Making for Geographic Routing for Networks of Randomly Moving Objects

    Amber Israr

    2016-04-01

    Full Text Available Autonomous and self-organizing wireless ad-hoc communication networks for moving objects consist of nodes, which use no centralized network infrastructure. Examples of moving object networks are networks of flying objects, networks of vehicles, and networks of moving people or robots. Moving object networks have to face many critical challenges in terms of routing because of dynamic topological changes and asymmetric network links. A suitable and effective routing mechanism helps to extend the deployment of moving nodes. In this paper an attempt has been made to analyze the performance of the Greedy Decision method (a position-aware, distance-based algorithm) for geographic routing for network nodes moving according to the random waypoint mobility model. The widely used GPSR (Greedy Perimeter Stateless Routing) protocol utilizes geographic distance and position-based data of nodes to transmit packets towards destination nodes. In this paper different scenarios have been tested to develop a concrete set of recommendations for optimum deployment of the distance-based Greedy Decision method for geographic routing in networks of randomly moving objects.
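
    The greedy forwarding decision at the heart of position-based protocols such as GPSR can be stated in a few lines: relay the packet to the neighbour geographically closest to the destination, and declare a local maximum when no neighbour makes progress (GPSR would then switch to perimeter/face routing, which is not shown). The coordinates below are illustrative.

    import math

    def greedy_next_hop(current, destination, neighbours):
        """Return the neighbour closest to the destination, or None at a local maximum."""
        dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
        best = min(neighbours, key=lambda n: dist(n, destination), default=None)
        if best is None or dist(best, destination) >= dist(current, destination):
            return None   # no neighbour makes progress: greedy forwarding fails here
        return best

    print(greedy_next_hop((0, 0), (10, 0), [(3, 1), (2, -4), (-1, 0)]))   # -> (3, 1)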

  4. Measuring systems of hard to get objects: problems with analysis of measurement results

    Gilewska, Grazyna

    2005-02-01

    Limited access to the metrological parameters of an object is a problem in many measurements, especially for biological objects, whose parameters are often determined indirectly. When access to the measured object is very limited, random components dominate the measurement results. Every measuring process is also subject to conditions that restrict how the data can be processed (for example, how far the number of repeated measurements can be increased to decrease the random limiting error). These may be time or financial constraints or, in the case of biological objects, a small sample volume, the influence of the measuring tool and observer on the object, or fatigue effects in the patient. Taking these difficulties into account, the author developed and tested the practical application of methods for rejecting outlying observations and, subsequently, novel methods for eliminating measured data with excess variance, in order to decrease the standard deviation of the mean for a limited amount of data at an accepted confidence level. The methods were verified on measurements of knee-joint space width obtained from radiographs; the measurements were carried out indirectly on digital images of the radiographs. The results confirmed the validity of the proposed methodology and measurement procedures, which are especially important when standard approaches do not bring the expected results.
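
    As a rough illustration of this kind of data reduction (the paper's exact criteria are not reproduced here), the sketch below rejects observations lying more than k sample standard deviations from the mean and reports the standard deviation of the mean before and after:

```python
import numpy as np

def reject_outliers(x, k=2.0):
    """Keep observations within k sample standard deviations of the mean.
    A simple stand-in for the outlier-reduction step, not the author's method."""
    x = np.asarray(x, dtype=float)
    keep = np.abs(x - x.mean()) <= k * x.std(ddof=1)
    return x[keep]

def sd_of_mean(x):
    """Standard deviation of the mean of a sample."""
    return np.std(x, ddof=1) / np.sqrt(len(x))

# Hypothetical knee-joint space widths (mm) with one gross outlier.
widths = np.array([4.1, 4.3, 4.0, 4.2, 6.8, 4.1, 4.2])
trimmed = reject_outliers(widths)
print(sd_of_mean(widths), sd_of_mean(trimmed))
```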

  5. Designing personal grief rituals: An analysis of symbolic objects and actions.

    Sas, Corina; Coman, Alina

    2016-10-01

    Personal grief rituals are beneficial in dealing with complicated grief, but challenging to design, as they require symbolic objects and actions meeting clients' emotional needs. The authors report interviews with 10 therapists with expertise in both grief therapy and grief rituals. Findings indicate three types of rituals supporting honoring, letting go, and self-transformation, with the latter being particularly complex. Outcomes also point to a taxonomy of ritual objects for framing and remembering ritual experience, and for capturing and processing grief. Besides symbolic possessions, the authors identified other types of ritual objects, including transformational and future-oriented ones. Symbolic actions include the creative crafting of ritual objects, respectful handling, disposal, and symbolic play. They conclude with the theoretical implications of these findings and a reflection on their value for tailored, creative co-design of grief rituals. In particular, several implications for designing grief rituals were identified, including accounting for the client's needs, selecting (or creating) the most appropriate objects and actions from the identified types, integrating principles of both grief and art/drama therapy, exploring clients' affinity for the ancient elements as a medium of disposal in letting-go rituals, and the value of technology for recording and reflecting on the ritual experience.

  6. Multi-objective game-theory models for conflict analysis in reservoir watershed management.

    Lee, Chih-Sheng

    2012-05-01

    This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assisting decision making. Game theory is used as an alternative tool for analyzing the strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). A geographic information system is used to concisely illustrate and calculate the areas of various land-use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and which can be interpreted easily by decision makers. For comparison, the decision-making process using a conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.
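
    At its simplest, balancing the two objectives means enumerating candidate strategies and keeping those that are not dominated on (economic benefit, environmental quality). A toy Pareto filter in that spirit, with invented strategy scores rather than anything from the Tseng-Wen case study:

```python
def pareto_front(strategies):
    """Return strategies not dominated on (economic benefit, water quality),
    both to be maximised. Purely illustrative of the trade-off analysis;
    assumes all (benefit, quality) pairs are distinct."""
    front = []
    for name, econ, quality in strategies:
        dominated = any(e >= econ and q >= quality and (e, q) != (econ, quality)
                        for _, e, q in strategies)
        if not dominated:
            front.append((name, econ, quality))
    return front

# Hypothetical (strategy, economic benefit, water-quality index) triples.
candidates = [("intensive development", 9, 2),
              ("mixed use", 6, 6),
              ("strict protection", 2, 9),
              ("poor compromise", 4, 4)]
print(pareto_front(candidates))  # the compromise (4, 4) is dominated by (6, 6)
```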

  7. A novel no-reference objective stereoscopic video quality assessment method based on visual saliency analysis

    Yang, Xinyan; Zhao, Wei; Ye, Long; Zhang, Qin

    2017-07-01

    This paper proposes a no-reference objective stereoscopic video quality assessment method, motivated by the goal of making objective assessment agree more closely with subjective evaluation. We believe that image regions with different degrees of visual saliency should not receive the same weights when designing an assessment metric. Therefore, we first apply the GBVS algorithm to each frame pair and separate both the left and right viewing images into regions of strong, general and weak saliency. In addition, local feature information such as blockiness, zero-crossing and depth is extracted and combined in a mathematical model to calculate a quality assessment score. Regions with different saliency degrees are assigned different weights in the model. Experimental results demonstrate the superiority of our method compared with existing state-of-the-art no-reference objective stereoscopic video quality assessment methods.
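
    The weighting scheme described above amounts to scoring each saliency class separately and combining the scores with larger weights on the more salient regions. A schematic sketch; the weights and per-region scores are placeholders, not the paper's trained model:

```python
def weighted_quality(region_scores, weights=(0.6, 0.3, 0.1)):
    """Combine per-region quality scores for strongly, generally and weakly
    salient regions using fixed weights (placeholder for the paper's model)."""
    strong, general, weak = region_scores
    w_strong, w_general, w_weak = weights
    return w_strong * strong + w_general * general + w_weak * weak

# Hypothetical per-region scores derived from blockiness/zero-crossing/depth features.
print(weighted_quality((0.72, 0.81, 0.90)))
```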

  8. Analysis of indicators and objectives for sustainable development and environmental sustainability

    Pedro Noboa-Romero

    2016-09-01

    Full Text Available This article is the product of qualitative, descriptive and analytical research on indicators and objectives aimed at sustainable development. The main objective of this essay is to analyze sustainability indicators: the Human Development Index (IDH), the Sustainable Development Goals (SDGS), the Millennium Development Goals (MDGS) and the Multidimensional Poverty Index (IPM), through a review of research and work on these issues, in order to establish the progress and results generated by the use of these indicators in the fields of health, education, technology and the environment. The findings show that inequality between nations persists and that the prevailing approach is oriented toward short-term development that benefits current generations exclusively and exhausts natural resources, without a long-term vision for future generations.

  9. Error Analysis: How Precise is Fused Deposition Modeling in Fabrication of Bone Models in Comparison to the Parent Bones?

    Reddy, M V; Eachempati, Krishnakiran; Gurava Reddy, A V; Mugalur, Aakash

    2018-01-01

    Rapid prototyping (RP) is used widely in dental and faciomaxillary surgery, with anecdotal uses in orthopedics. The purview of RP in orthopedics is vast. However, no error analysis of bone models generated using office-based RP has been reported in the literature. This study evaluates the accuracy of fused deposition modeling (FDM) using standard tessellation language (STL) files and the errors generated during the fabrication of bone models. Nine dry bones were selected and computed tomography (CT) scanned. STL files were procured from the CT scans and three-dimensional (3D) models of the bones were printed on our in-house FDM-based 3D printer using acrylonitrile butadiene styrene (ABS) filament. Measurements were made on the bones and the 3D models according to data collection procedures for forensic skeletal material. Statistical analysis was performed with SPSS version 13.0 software to establish interobserver correlation for measurements on the dry bones and the 3D bone models. Interobserver reliability was established using the intraclass correlation coefficient for both the dry bones and the 3D models. The mean absolute difference was 0.4, which is minimal; the 3D models are comparable to the dry bones. STL-file-dependent FDM using ABS material produces near-anatomical 3D models. The high 3D accuracy holds promise in the clinical setting for preoperative planning, mock surgery, and the choice of implants and prostheses, especially in complicated acetabular trauma and complex hip surgeries.
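
    The headline statistic, the mean absolute difference between corresponding measurements on the dry bones and the printed models, is trivial to reproduce on paired data. A sketch with invented landmark values:

```python
import numpy as np

def mean_absolute_difference(bone, model):
    """Mean absolute difference between measurements taken on the parent
    bones and on the FDM-printed models (same landmark order assumed)."""
    bone = np.asarray(bone, dtype=float)
    model = np.asarray(model, dtype=float)
    return np.mean(np.abs(bone - model))

# Hypothetical paired landmark measurements (mm), not the study's data.
dry_bone = [42.1, 37.8, 55.2, 61.0]
printed = [42.5, 37.3, 55.5, 60.7]
print(mean_absolute_difference(dry_bone, printed))  # ~0.4
```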

  10. Error analysis: How precise is fused deposition modeling in fabrication of bone models in comparison to the parent bones?

    M V Reddy

    2018-01-01

    Full Text Available Background: Rapid prototyping (RP) is used widely in dental and faciomaxillary surgery, with anecdotal uses in orthopedics. The purview of RP in orthopedics is vast. However, no error analysis of bone models generated using office-based RP has been reported in the literature. This study evaluates the accuracy of fused deposition modeling (FDM) using standard tessellation language (STL) files and the errors generated during the fabrication of bone models. Materials and Methods: Nine dry bones were selected and computed tomography (CT) scanned. STL files were procured from the CT scans and three-dimensional (3D) models of the bones were printed on our in-house FDM-based 3D printer using acrylonitrile butadiene styrene (ABS) filament. Measurements were made on the bones and the 3D models according to data collection procedures for forensic skeletal material. Statistical analysis was performed with SPSS version 13.0 software to establish interobserver correlation for measurements on the dry bones and the 3D bone models. Results: Interobserver reliability was established using the intraclass correlation coefficient for both the dry bones and the 3D models. The mean absolute difference was 0.4, which is minimal; the 3D models are comparable to the dry bones. Conclusions: STL-file-dependent FDM using ABS material produces near-anatomical 3D models. The high 3D accuracy holds promise in the clinical setting for preoperative planning, mock surgery, and the choice of implants and prostheses, especially in complicated acetabular trauma and complex hip surgeries.

  11. Analysis and design of the SI-simulator software system for the VHTR-SI process by using the object-oriented analysis and object-oriented design methodology

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

    The SI-simulator is an application software system that simulates the dynamic behavior of the VHTR-SI process by means of mathematical models. Object-oriented analysis (OOA) and object-oriented design (OOD) methodologies were employed for the development of the SI-simulator system. OOA is concerned with developing software engineering requirements and specifications expressed as a system's object model (composed of a population of interacting objects), as opposed to the traditional data or functional views of systems. OOD techniques are useful for the development of large, complex systems, and the OOA/OOD methodology is usually employed to maximize the reusability and extensibility of a software system. In this paper, we present the design features of the SI-simulator software system obtained by using the OOA and OOD methodologies

  12. Ontology-based coupled optimisation design method using state-space analysis for the spindle box system of large ultra-precision optical grinding machine

    Wang, Qianren; Chen, Xing; Yin, Yuehong; Lu, Jian

    2017-08-01

    With the increasing complexity of mechatronic products, traditional empirical or step-by-step design methods face great challenges, since various factors and different stages inevitably become coupled during the design process. Management of massive information, or big data, as well as the efficient operation of information flow, is deeply involved in the coupled design process, and designers have to handle increasingly sophisticated situations when coupled optimisation is also engaged. Aiming to overcome these difficulties in the design of the spindle box system of a large ultra-precision optical grinding machine, this paper proposes a coupled optimisation design method based on state-space analysis, with the design knowledge represented by ontologies and their semantic networks. An electromechanical coupled model integrating the mechanical structure, the control system and the motor driving system is established, mainly concerning the stiffness matrices of the hydrostatic bearings, the ball screw nut and the rolling guide sliders. The effectiveness and precision of the method are validated by simulation results for the natural frequency and deformation of the spindle box when an impact force is applied to the grinding wheel.
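
    Once the coupled stiffness and mass matrices of such a structure are assembled, the undamped natural frequencies follow from the generalised eigenvalue problem K x = w^2 M x. A minimal numerical sketch with a 2-DOF placeholder model; the matrices are invented, not the machine's:

```python
import numpy as np
from scipy.linalg import eigh

# Placeholder 2-DOF lumped model: masses in kg, stiffnesses in N/m.
M = np.diag([120.0, 80.0])
K = np.array([[5.0e7, -2.0e7],
              [-2.0e7, 3.0e7]])

# Generalised symmetric eigenvalue problem K x = w^2 M x.
w2, modes = eigh(K, M)
freqs_hz = np.sqrt(w2) / (2 * np.pi)
print(freqs_hz)  # undamped natural frequencies of the toy model
```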

  13. Java programming fundamentals problem solving through object oriented analysis and design

    Nair, Premchand S

    2008-01-01

    While Java texts are plentiful, it's difficult to find one that takes a real-world approach and encourages novice programmers to build on their Java skills through practical exercise. Written by an expert with 19 years of experience teaching computer programming, Java Programming Fundamentals presents object-oriented programming by employing examples taken from everyday life. It provides a foundation in object-oriented design principles and UML notation, describes common pitfalls and good programming practices, and furnishes supplemental links, documents, and programs on its companion website, www.premnair.net.

  14. Metrology and statistical analysis for the precise standardisation of cobalt-60 by 4πβ-γ coincidence counting

    Buckman, S.M.

    1995-03-01

    The major part of the thesis is devoted to the theoretical development of a comprehensive PC-based statistical package for the analysis of data from coincidence-counting experiments. This analysis is applied to primary standardisations of cobalt-60 performed in Australia and Japan. The Australian standardisation, the accuracy of which is confirmed through international comparison, is used to re-calibrate the ionisation chamber; from this result a more accurate cobalt-60 calibration is obtained for the Australian working standard. Both the Australian and Japanese coincidence-counting systems are interfaced to personal computers to enable replicated sets of measurements to be made under computer control. Further research to confirm the validity of the statistical model includes an experimental investigation into the non-Poisson behaviour of radiation detectors due to the effect of deadtime, and an experimental investigation is conducted to determine which areas are most likely to limit the ultimate accuracy achievable with coincidence counting. The thesis concludes by discussing the concept and benefits of digital coincidence counting and outlines the design of a prototype system presently under development. Based on the work of this thesis, uncertainties in coincidence-counting experiments can be better handled, with resulting improvements in measurement reliability. All of the data and software associated with this thesis are provided on computer discs. 237 refs., figs., tabs
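
    The measurement principle behind 4πβ-γ coincidence counting is that, for an ideal source, the disintegration rate follows from the three observed rates as N0 ≈ Nβ·Nγ/Nc. The sketch below applies that textbook relation with a simple non-extending dead-time correction; it is a schematic of the principle, not of the thesis' full statistical treatment, and the rates are invented:

```python
def deadtime_corrected(rate, tau):
    """Correct an observed count rate for a non-extending dead time tau (s)."""
    return rate / (1.0 - rate * tau)

def coincidence_activity(n_beta, n_gamma, n_coinc, tau_beta=0.0, tau_gamma=0.0):
    """Ideal 4pi beta-gamma coincidence estimate N0 = Nb * Ng / Nc
    (background and efficiency corrections omitted)."""
    nb = deadtime_corrected(n_beta, tau_beta)
    ng = deadtime_corrected(n_gamma, tau_gamma)
    return nb * ng / n_coinc

# Hypothetical observed rates in counts per second.
print(coincidence_activity(9500.0, 4200.0, 3990.0, tau_beta=2e-6, tau_gamma=3e-6))
```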

  15. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis

    Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.

    2012-04-01

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does: the latter uses an object's color (spectral information), size, texture, shape and spatial relation to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image, grouping pixels together into objects, and then uses a wide range of object properties to classify the objects or to extract object properties from the image. Significant advances and improvements in image analysis and interpretation have been made thanks to GEOBIA. In June 2010 the third conference on GEOBIA took place at Ghent University, after successful previous meetings in Calgary (2008) and Salzburg (2006). This special issue presents a selection of the 2010 conference papers that have been worked out as full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques, with topics ranging from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping and land cover change to feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine were selected on the basis of quality and topic and are presented in this special issue. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012, where we hope to welcome even more scientists working in the field of GEOBIA.
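
    The pixel-to-object step that GEOBIA builds on can be mimicked in a few lines: label connected regions of a thresholded image and compute per-object properties (area, mean intensity) instead of classifying individual pixels. A toy example using scipy.ndimage as a stand-in for a real segmentation algorithm:

```python
import numpy as np
from scipy import ndimage

# Toy "image": two bright patches on a dark background.
img = np.zeros((8, 8))
img[1:3, 1:4] = 0.9
img[5:7, 4:7] = 0.6

labels, n_objects = ndimage.label(img > 0.5)          # group pixels into objects
idx = list(range(1, n_objects + 1))
areas = ndimage.sum(np.ones_like(img), labels, index=idx)   # object sizes in pixels
means = ndimage.mean(img, labels, index=idx)                 # per-object mean intensity

for i, (a, m) in enumerate(zip(areas, means), start=1):
    print(f"object {i}: area={a:.0f} px, mean intensity={m:.2f}")
```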

  16. Precision Guidance with Impact Angle Requirements

    Ford, Jason

    2001-01-01

    This paper examines a weapon system precision guidance problem in which the objective is to guide a weapon onto a non-manoeuvring target so that a particular desired angle of impact is achieved using...

  17. A Component Analysis of the Impact of Evaluative and Objective Feedback on Performance

    Johnson, Douglas A.

    2013-01-01

    Despite the frequency with which performance feedback interventions are used in organizational behavior management, component analyses of such feedback are rare. It has been suggested that evaluation of performance and objective details about performance are two necessary components for performance feedback. The present study was designed to help…

  18. A Case Study on Coloured Petri Nets in Object-oriented Analysis and Design

    Barros, Joao Paulo; Jørgensen, Jens Bæk

    2005-01-01

    In this paper, we first demonstrate how a coloured Petri nets (CPN) model can be used to capture requirements for a considered example system, an elevator controller. Then, we show how this requirements-level CPN model is transformed into a design-level object-oriented CPN model, which...

  19. An integrated approach for visual analysis of a multisource moving objects knowledge base

    Willems, N.; van Hage, W.R.; de Vries, G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from ongoing research by four partners in the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  20. An Integrated Approach for Visual Analysis of a Multi-Source Moving Objects Knowledge Base

    Willems, C.M.E.; van Hage, W.R.; de Vries, G.K.D.; Janssens, J.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from ongoing research by four partners in the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  1. An integrated approach for visual analysis of a multi-source moving objects knowledge base

    Willems, N.; Hage, van W.R.; Vries, de G.; Janssens, J.H.M.; Malaisé, V.

    2010-01-01

    We present an integrated and multidisciplinary approach for analyzing the behavior of moving objects. The results originate from ongoing research by four partners in the Dutch Poseidon project (Embedded Systems Institute (2007)), which aims to develop new methods for Maritime Safety

  2. Optimum analysis of pavement maintenance using multi-objective genetic algorithms

    Amr A. Elhadidy

    2015-04-01

    Full Text Available Road network expansion in Egypt is considered a vital issue for the development of the country, alongside upgrading current road networks to increase safety and efficiency. A pavement management system (PMS) is a set of tools or methods that assist decision makers in finding optimum strategies for providing and maintaining pavements in a serviceable condition over a given period of time. A multi-objective optimization problem for pavement maintenance and rehabilitation strategies at the network level is discussed in this paper. A two-objective optimization model considers minimum action costs and maximum condition for the road network in use. In the proposed approach, Markov-chain models are used to predict the performance of road pavement and to calculate the expected decline at different periods of time. A genetic-algorithm-based procedure is developed for solving the multi-objective optimization problem. The model searches for the optimum maintenance actions, and the appropriate times to implement them, for each pavement. Based on the computed results, the Pareto optimal solutions of the two-objective optimization functions are obtained. From the optimal solutions, represented by cost and condition, a decision maker can easily obtain maintenance and rehabilitation plans with minimum action costs and maximum condition. The developed model has been implemented on a network of roads and showed its ability to derive the optimal solution.
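
    The prediction side of such a model is just repeated multiplication of a condition-state distribution by a Markov transition matrix; the optimisation side then trades the cost of an action against the condition it preserves. A sketch of the prediction step only, with an invented transition matrix rather than the paper's calibrated one:

```python
import numpy as np

# Hypothetical yearly transition matrix over condition states
# (good, fair, poor) under a do-nothing policy; rows sum to 1.
P = np.array([[0.80, 0.15, 0.05],
              [0.00, 0.75, 0.25],
              [0.00, 0.00, 1.00]])

def predict_condition(state, years, P=P):
    """Propagate a condition-state distribution `years` steps ahead."""
    state = np.asarray(state, dtype=float)
    return state @ np.linalg.matrix_power(P, years)

# Network initially all in 'good' condition.
print(predict_condition([1.0, 0.0, 0.0], years=5))
```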

  3. An Analysis of Learning Objectives and Content Coverage in Introductory Psychology Syllabi

    Homa, Natalie; Hackathorn, Jana; Brown, Carrie M.; Garczynski, Amy; Solomon, Erin D.; Tennial, Rachel; Sanborn, Ursula A.; Gurung, Regan A. R.

    2013-01-01

    Introductory psychology is one of the most popular undergraduate courses and often serves as the gateway to choosing psychology as an academic major. However, little research has examined the typical structure of introductory psychology courses. The current study examined student learning objectives (SLOs) and course content in introductory…

  4. Object Selection Costs in Visual Working Memory: A Diffusion Model Analysis of the Focus of Attention

    Sewell, David K.; Lilburn, Simon D.; Smith, Philip L.

    2016-01-01

    A central question in working memory research concerns the degree to which information in working memory is accessible to other cognitive processes (e.g., decision-making). Theories assuming that the focus of attention can only store a single object at a time require the focus to orient to a target representation before further processing can…
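
    For readers unfamiliar with the method named in the title: a diffusion-model analysis fits choice and response-time data with a noisy evidence-accumulation process that terminates at one of two boundaries. The sketch below simulates that basic process with invented parameters; it illustrates the model class only, not the authors' fitted model:

```python
import numpy as np

def simulate_ddm(drift=0.25, boundary=1.0, noise=1.0, dt=0.001, max_t=3.0, rng=None):
    """Simulate one trial of a two-boundary drift-diffusion process.
    Returns (response time in s, +1/-1 for upper/lower boundary, or None)."""
    rng = rng or np.random.default_rng(0)
    x, t = 0.0, 0.0
    while t < max_t:
        # Euler-Maruyama step: deterministic drift plus Gaussian diffusion noise.
        x += drift * dt + noise * np.sqrt(dt) * rng.standard_normal()
        t += dt
        if x >= boundary:
            return t, +1
        if x <= -boundary:
            return t, -1
    return max_t, None  # no decision within the simulated window

print(simulate_ddm())
```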

  5. Simple proteomics data analysis in the object-oriented PowerShell.

    Mohammed, Yassene; Palmblad, Magnus

    2013-01-01

    Scripting languages such as Perl and Python are appreciated for solving simple, everyday tasks in bioinformatics. A more recent, object-oriented command shell and scripting language, Windows PowerShell, has many attractive features: an object-oriented interactive command line, fluent navigation and manipulation of XML files, ability to consume Web services from the command line, consistent syntax and grammar, rich regular expressions, and advanced output formatting. The key difference between classical command shells and scripting languages, such as bash, and object-oriented ones, such as PowerShell, is that in the latter the result of a command is a structured object with inherited properties and methods rather than a simple stream of characters. Conveniently, PowerShell is included in all new releases of Microsoft Windows and therefore already installed on most computers in classrooms and teaching labs. In this chapter we demonstrate how PowerShell in particular allows easy interaction with mass spectrometry data in XML formats, connection to Web services for tools such as BLAST, and presentation of results as formatted text or graphics. These features make PowerShell much more than "yet another scripting language."

  6. Analysis of Buried Dielectric Objects Using Higher-Order MoM for Volume Integral Equations

    Kim, Oleksiy S.; Meincke, Peter; Breinbjerg, Olav

    2004-01-01

    A higher-order method of moments (MoM) is applied to solve a volume integral equation for dielectric objects in layered media. In comparison to low-order methods, the higher-order MoM, which is based on higher-order hierarchical Legendre vector basis functions and curvilinear hexahedral elements,...

  7. Analysis of porous media and objects of cultural heritage by mobile NMR

    Haber, Agnes

    2012-01-01

    Low-field NMR techniques are used to study porous systems, from simple to complex structures, and objects of cultural heritage. It is shown that NMR relaxometry can be used to study the fluid dynamics inside a porous system. A simple theoretical model for multi-site relaxation exchange NMR is used to extract exchange kinetic parameters when applied to model porous systems. This provides a first step towards the study of more complex systems with continuous relaxation distributions, such as soils or building materials. Moisture migration is observed in the soil systems with the help of 1D and 2D NMR relaxometry methods. In the case of the concrete samples, differences in composition lead to significant differences in water uptake. The single-sided NMR sensor proves to be a useful tool for on-site measurements. This is especially important for cultural heritage objects, as most of them cannot be moved out of their environment. Mobile NMR turns out to be a simple but reliable and powerful tool for investigating moisture distributions and pore structures in porous media as well as the conservation state and history of objects of cultural heritage.
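
    In its simplest form, relaxometry of this kind reduces to fitting a (multi-)exponential decay to the measured magnetisation and reading off relaxation times that reflect different pore environments. A hedged sketch fitting a bi-exponential T2 decay to synthetic data with scipy; the time constants and noise level are invented:

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp(t, a1, t2_1, a2, t2_2):
    """Bi-exponential transverse relaxation decay."""
    return a1 * np.exp(-t / t2_1) + a2 * np.exp(-t / t2_2)

# Synthetic CPMG-like decay: two pore environments with T2 = 5 ms and 50 ms.
t = np.linspace(0.1e-3, 200e-3, 200)
rng = np.random.default_rng(1)
signal = biexp(t, 0.6, 5e-3, 0.4, 50e-3) + 0.005 * rng.standard_normal(t.size)

params, _ = curve_fit(biexp, t, signal, p0=(0.5, 2e-3, 0.5, 20e-3))
print(params)  # amplitudes and T2 values recovered from the synthetic decay
```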

  8. Analysis of Optical Variations of BL Lac Object AO 0235+164

    Wang ...

    The multi-band optical data are collected on the object AO 0235+164. Cross-correlation analysis is used to obtain statistically meaningful values for the time lags among the B, V, R and I bands.

  9. Development and Factor Analysis of an Instrument to Measure Preservice Teachers' Perceptions of Learning Objects

    Sahin, Sami

    2010-01-01

    The purpose of this study was to develop a questionnaire to measure student teachers' perception of digital learning objects. The participants included 308 voluntary senior students attending courses in a college of education of a public university in Turkey. The items were extracted to their related factors by the principal axis factoring method.…

  10. An analysis of nature and mechanisms of the Lira objects territories' radionuclide contamination

    Kadyrzhanov, K.K; Tuleushev, A.Zh.; Lukashenko, S.N.; Solodukhin, V.P.; Kazachevskij, I.V.; Reznikov, S.V.

    2001-01-01

    This paper presents the results of a study of the radioactive contamination of the territories of the 'Lira' objects. The data obtained indicate that the existing radiation situation poses no threat to the personnel working at the deposit and its objects, let alone to the inhabitants of the nearest settlements, although the radionuclide concentrations in the soils of the examined areas slightly exceed the background values characteristic of this region. Two hypotheses for the observed radionuclide contamination were considered: (1) release to the surface and distribution over the territory, immediately after the explosions, of the inert gases 137Xe and 90Kr, the genetic precursors of 137Cs and 90Sr, respectively; and (2) a constant efflux of these radionuclides to the surface from the 'ditch cavities' of the 'Lira' objects through zones of de-consolidation and crack propagation in the Earth's crust. To test these hypotheses, the distribution of radionuclides with soil depth was examined in the vicinity of the wells (TK-2 and TK-5) as well as in the bed of the Berezovka river. No data confirm the hypothesis of a constant radionuclide influx from the 'ditch cavities', so the hypothesis that the radionuclide contamination of the 'Lira' objects' territories is due to the release of inert gases to the surface is the more plausible one.

  11. A multi-level object store and its application to HEP data analysis

    May, E.; Lifka, D.; Malon, D.; Grossman, R.L.; Qin, X.; Valsamis, D.; Xu, W.

    1994-01-01

    We present the design and demonstration of a scientific data manager consisting of a low-overhead, high-performance object store interfaced to a hierarchical storage system. This was done within the framework of the Mark1 testbeds of the PASS project

  12. An Achievement Degree Analysis Approach to Identifying Learning Problems in Object-Oriented Programming

    Allinjawi, Arwa A.; Al-Nuaim, Hana A.; Krause, Paul

    2014-01-01

    Students often face difficulties while learning object-oriented programming (OOP) concepts. Many papers have presented various assessment methods for diagnosing learning problems to improve the teaching of programming in computer science (CS) higher education. The research presented in this article illustrates that although max-min composition is…
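
    The max-min composition mentioned above is the standard fuzzy-relation operation: entry (i, k) of R composed with S is the maximum over j of min(R[i, j], S[j, k]). Since the abstract is truncated before the method is spelled out, the sketch below only illustrates the operation itself, on invented membership matrices:

```python
import numpy as np

def max_min_composition(R, S):
    """Max-min composition of two fuzzy relations given as matrices:
    (R o S)[i, k] = max_j min(R[i, j], S[j, k])."""
    R, S = np.asarray(R, dtype=float), np.asarray(S, dtype=float)
    return np.max(np.minimum(R[:, :, None], S[None, :, :]), axis=1)

# Toy membership matrices, e.g. students x concepts and concepts x skills.
R = [[0.8, 0.3], [0.4, 0.9]]
S = [[0.6, 0.2], [0.5, 0.7]]
print(max_min_composition(R, S))
```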

  13. Object and Objective Lost?

    Lopdrup-Hjorth, Thomas

    2015-01-01

    This paper explores the erosion and problematization of ‘the organization’ as a demarcated entity. Utilizing Foucault's reflections on ‘state-phobia’ as a source of inspiration, I show how an organization-phobia has gained a hold within Organization Theory (OT). By attending to the history...... of this organization-phobia, the paper argues that OT has become increasingly incapable of speaking about its core object. I show how organizations went from being conceptualized as entities of major importance to becoming theoretically deconstructed and associated with all kinds of ills. Through this history......, organizations as distinct entities have been rendered so problematic that they have gradually come to be removed from the center of OT. The costs of this have been rather significant. Besides undermining the grounds that gave OT intellectual credibility and legitimacy to begin with, the organization-phobia...

  14. The newest precision measurement

    Lee, Jing Gu; Lee, Jong Dae

    1974-05-01

    This book introduces the basics of precision measurement, measurement of length, limit gauges, measurement of angles, measurement of surface roughness, measurement of shapes and locations, measurement of contours, measurement of external and internal threads, gear testing, accuracy inspection of machine tools, three-dimensional coordinate measuring machines, digitalisation of precision measurement, automation of precision measurement, measurement of cutting tools, measurement using lasers, and points to consider when choosing a length-measuring instrument.

  15. Practical precision measurement

    Kwak, Ho Chan; Lee, Hui Jun

    1999-01-01

    This book introduces basic knowledge of precision measurement, measurement of length, precision measurement of minor diameters, measurement of angles, measurement of surface roughness, three-dimensional measurement, measurement of locations and shapes, measurement of screw threads, gear testing, cutting-tool testing, rolling-bearing testing, and digital measurement. It covers the height gauge, how to test surface roughness, measurement of flatness and straightness, external and internal thread testing, gear tooth measurement, milling cutters, taps, rotation precision measurement, and optical transducers.

  16. Precision surveying system for PEP

    Gunn, J.; Lauritzen, T.; Sah, R.; Pellisier, P.F.

    1977-01-01

    A semi-automatic precision surveying system is being developed for PEP. Reference elevations for vertical alignment will be provided by a liquid level. The short range surveying will be accomplished using a Laser Surveying System featuring automatic data acquisition and analysis

  17. Proton gyromagnetic precision measurement system

    Zhu Deming; Deming Zhu

    1991-01-01

    A computerized control and measurement system used in the precision measurement of the proton gyromagnetic ratio is described. It adopts CAMAC data acquisition equipment, with on-line control and analysis performed on HP85 and PDP-11/60 computer systems under the RSX11M operating system; the control software is written in FORTRAN

  18. Specifications for trueness and precision of a reference measurement system for serum/plasma 25-hydroxyvitamin D analysis.

    Stöckl, Dietmar; Sluss, Patrick M; Thienpont, Linda M

    2009-10-01

    The divergence in the analytical quality of serum/plasma 25-hydroxyvitamin D analysis calls for defining specifications for a reference measurement system. Fundamentally, in a reference measurement system there should be a relationship between the analytical specifications for higher-order (reference) and lower-order (routine) measurements. Therefore, when setting specifications, we started with limits for routine imprecision (CV(rou)) and bias (B(rou)) using 4 models: (1) misclassifications in diagnosis, (2) biological variation data (reference interval (RI) and monitoring), (3) expert recommendations, and (4) state-of-the-art performance. We then used the derived goals to tailor those for reference measurements and for certified reference materials (CRMs) used for calibration, by setting the limits for CV(ref) at 0.5 CV(rou), B(ref) at 0.33 B(rou), and the maximum uncertainty (U(max)) at 0.33 B(ref). The established specifications ranged between those derived from model 3 and those derived from model 2 (monitoring); model 2 (monitoring) gave the most stringent goals and model 3 the most liberal ones. Accounting for state-of-the-art performance and certification capabilities, we used model 2 (RI) to recommend achievable goals for routine testing, reference measurements, and CRM uncertainty.
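
    The scaling rules quoted in the abstract translate directly into arithmetic: CV(ref) = 0.5 CV(rou), B(ref) = 0.33 B(rou), U(max) = 0.33 B(ref). A worked sketch; the routine-level inputs are placeholders, not the paper's recommended values:

```python
def reference_goals(cv_rou, b_rou):
    """Derive reference-method and CRM goals from routine-level limits using
    the scaling given in the abstract: CV_ref = 0.5 * CV_rou,
    B_ref = 0.33 * B_rou, U_max = 0.33 * B_ref."""
    cv_ref = 0.5 * cv_rou
    b_ref = 0.33 * b_rou
    u_max = 0.33 * b_ref
    return cv_ref, b_ref, u_max

# Placeholder routine goals (percent CV and percent bias).
print(reference_goals(cv_rou=10.0, b_rou=5.0))  # -> (5.0, 1.65, 0.5445)
```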

  19. [Precision and personalized medicine].

    Sipka, Sándor

    2016-10-01

    The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms "phenotype", "endotype" and "biomarker" in order to characterize the various diseases more precisely. Using "biomarkers", a homogeneous type of disease (a "phenotype") can be divided into subgroups, called "endotypes", requiring different forms of treatment and financing. The benefits of "precision medicine" have become especially apparent in relation to allergic and autoimmune diseases. The application of this new way of thinking will also become necessary in Hungary in the near future for the participants, controllers and financing bodies of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.

  20. Precision Clock Evaluation Facility

    Federal Laboratory Consortium — FUNCTION: Tests and evaluates high-precision atomic clocks for spacecraft, ground, and mobile applications. Supports performance evaluation, environmental testing,...