WorldWideScience

Sample records for objective analysis precision

  1. Precise Object Tracking under Deformation

    International Nuclear Information System (INIS)

    Saad, M.H.

    2010-01-01

    Precise object tracking is an essential issue in several demanding applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems and satellite imaging. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is an attempt to solve these serious problems in visual object tracking and thereby improve the quality of the overall system. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by ordinary least squares (OLS). The framework also presents a robust ranging technique for tracking a visual target in place of traditional, expensive ranging sensors. The work is applied to real video streams and achieves high-precision results.
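
    The abstract names an FIR model fitted by ordinary least squares (OLS) for predicting the object's future location. As a rough illustration of that idea only (the model order, variable names and per-axis treatment below are assumptions, not details taken from the thesis), a minimal sketch:

        import numpy as np

        def fit_fir_ols(positions, order=3):
            """Fit an FIR predictor x[t] ~ sum_k h[k] * x[t-1-k] by ordinary
            least squares. positions: 1-D sequence of one coordinate of the track."""
            x = np.asarray(positions, dtype=float)
            X = np.column_stack([x[order - 1 - k:len(x) - 1 - k] for k in range(order)])
            y = x[order:]
            h, *_ = np.linalg.lstsq(X, y, rcond=None)
            return h

        def predict_next(positions, h):
            # One-step-ahead prediction from the most recent len(h) samples.
            x = np.asarray(positions, dtype=float)
            return float(np.dot(h, x[-1:-len(h) - 1:-1]))

    One such predictor per coordinate could feed the pose-prediction stage the abstract describes.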

  2. Precise object tracking under deformation

    International Nuclear Information System (INIS)

    Saad, M.H.

    2010-01-01

    Precise object tracking is an essential issue in several demanding applications such as robot vision, automated surveillance (civil and military), inspection, biomedical image analysis, video coding, motion segmentation, human-machine interfaces, visualization, medical imaging, traffic systems and satellite imaging. This framework focuses on precise object tracking under deformations such as scaling, rotation, noise, blurring and changes of illumination. This research is an attempt to solve these serious problems in visual object tracking and thereby improve the quality of the overall system. A three-dimensional (3D) geometrical model is developed to determine the current pose of an object and predict its future location based on an FIR model learned by ordinary least squares (OLS). The framework also presents a robust ranging technique for tracking a visual target in place of traditional, expensive ranging sensors. The work is applied to real video streams and achieves high-precision results.

  3. The neural basis of precise visual short-term memory for complex recognisable objects.

    Science.gov (United States)

    Veldsman, Michele; Mitchell, Daniel J; Cusack, Rhodri

    2017-10-01

    Recent evidence suggests that visual short-term memory (VSTM) capacity estimated using simple objects, such as colours and oriented bars, may not generalise well to more naturalistic stimuli. More visual detail can be stored in VSTM when complex, recognisable objects are maintained than when simple objects are. It is not yet known whether recognisability itself enhances memory precision, nor whether maintenance of recognisable objects is achieved with the same network of brain regions that supports maintenance of simple objects. We used a novel stimulus generation method to parametrically warp photographic images along a continuum, allowing separate estimation of the precision of memory representations and the number of items retained. The stimulus generation method was also designed to create unrecognisable, though perceptually matched, stimuli, to investigate the impact of recognisability on VSTM. We adapted the widely used change detection and continuous report paradigms for use with complex, photographic images. Across three functional magnetic resonance imaging (fMRI) experiments, we demonstrated greater precision for recognisable objects in VSTM compared to unrecognisable objects. This clear behavioural advantage was not the result of recruitment of additional brain regions, or of stronger mean activity within the core network. Representational similarity analysis revealed greater variability across item repetitions in the representations of recognisable, compared to unrecognisable, complex objects. We therefore propose that a richer range of neural representations supports VSTM for complex recognisable objects.

  4. Multiple-objective optimization in precision laser cutting of different thermoplastics

    Science.gov (United States)

    Tamrin, K. F.; Nukman, Y.; Choudhury, I. A.; Shirley, S.

    2015-04-01

    Thermoplastics are increasingly used in the biomedical, automotive and electronics industries due to their excellent physical and chemical properties. Because the process is localized and non-contact, laser cutting can produce precise cuts with a small heat-affected zone (HAZ). Precision laser cutting of various materials is important in high-volume manufacturing processes to minimize operational cost, reduce errors and improve product quality. This study uses grey relational analysis to determine a single optimized set of cutting parameters for three different thermoplastics. The set of optimized processing parameters is determined from the highest relational grade and was found at low laser power (200 W), high cutting speed (0.4 m/min) and low compressed air pressure (2.5 bar). This result matches the objective set in the present study. Analysis of variance (ANOVA) is then carried out to ascertain the relative influence of process parameters on the cutting characteristics. Laser power was found to have the dominant effect on HAZ for all thermoplastics.
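
    Grey relational analysis, the method named above, reduces several quality responses (HAZ, kerf width, etc.) to a single grade per experimental run. A minimal sketch under common conventions (equal criterion weights and a distinguishing coefficient of 0.5; these choices and the function names are assumptions, not taken from the paper):

        import numpy as np

        def grey_relational_grade(responses, larger_is_better, zeta=0.5):
            """responses: (n_runs, n_criteria) array; larger_is_better: bool per criterion."""
            R = np.asarray(responses, dtype=float)
            lo, hi = R.min(axis=0), R.max(axis=0)
            # Normalise each criterion to [0, 1] (assumes hi > lo for every criterion).
            norm = np.where(larger_is_better, (R - lo) / (hi - lo), (hi - R) / (hi - lo))
            delta = 1.0 - norm                      # deviation from the ideal sequence
            xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
            return xi.mean(axis=1)                  # grey relational grade per run

    The run with the highest grade is then taken as the single optimized parameter set.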

  5. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T...
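
    The statistic T referred to here compares observed differences between duplicate results with their a priori uncertainty estimates against a chi-square distribution. A sketch of that comparison (the exact form and conventions used by Heydorn and Nørgaard may differ):

        import numpy as np
        from scipy.stats import chi2

        def duplicate_t_test(a, b, sigma_a, sigma_b, alpha=0.05):
            """a, b: paired duplicate results; sigma_a, sigma_b: their a priori
            standard deviations. Returns T and whether precision is 'in control'."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            s2 = np.asarray(sigma_a, float) ** 2 + np.asarray(sigma_b, float) ** 2
            t = np.sum((a - b) ** 2 / s2)
            return t, t < chi2.ppf(1 - alpha, df=len(a))  # one dof per duplicate pair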

  6. A linear actuator for precision positioning of dual objects

    International Nuclear Information System (INIS)

    Peng, Yuxin; Cao, Jie; Guo, Zhao; Yu, Haoyong

    2015-01-01

    In this paper, a linear actuator for precision positioning of dual objects is proposed based on a double friction drive principle using a single piezoelectric element (PZT). The linear actuator consists of an electromagnet and a permanent magnet, which are connected by the PZT. The electromagnet serves as object 1, and another object (object 2) is attached to the permanent magnet by magnetic force. For positioning the two objects independently, two different friction drive modes can be alternated by on-off control of the electromagnet. When the electromagnet is released from the guideway, it can be driven by the impact friction force generated by the PZT. Otherwise, when the electromagnet clamps onto the guideway and remains stationary, object 2 can be driven based on the principle of smooth impact friction drive. A prototype was designed and constructed, and experiments were carried out to test the basic performance of the actuator. It has been verified that, with a compact size of 31 mm (L) × 12 mm (W) × 8 mm (H), the two objects can achieve long strokes on the order of several millimeters and high resolutions of several tens of nanometers. Since the proposed actuator allows independent movement of two objects by a single PZT, it can be constructed compactly. (paper)

  7. Numerical Simulation Analysis of High-precision Dispensing Needles for Solid-liquid Two-phase Grinding

    Science.gov (United States)

    Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming

    2018-03-01

    In order to investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object, and the process of polishing them was numerically simulated. The distributions of dynamic pressure and turbulence viscosity of the abrasive flow field in the needle were analyzed under different volume fraction conditions. Comparative analysis confirmed the effectiveness of abrasive grain polishing for high-precision dispensing needles: controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality of the abrasive grains can be controlled.

  8. Quantization and training of object detection networks with low-precision weights and activations

    Science.gov (United States)

    Yang, Bo; Liu, Jian; Zhou, Li; Wang, Yun; Chen, Jie

    2018-01-01

    As convolutional neural networks have demonstrated state-of-the-art performance in object recognition and detection, there is a growing need for deploying these systems on resource-constrained mobile platforms. However, the computational burden and energy consumption of inference for these networks are significantly higher than what most low-power devices can afford. To address these limitations, this paper proposes a method to train object detection networks with low-precision weights and activations. The probability density functions of the weights and activations of each layer are first estimated directly using piecewise Gaussian models. Then, the optimal quantization intervals and step sizes for each convolution layer are adaptively determined according to the distribution of the weights and activations. As the most computationally expensive convolutions can be replaced by efficient fixed-point operations, the proposed method can drastically reduce computation complexity and memory footprint. Applied to the tiny you-only-look-once (YOLO) and YOLO architectures, the proposed method achieves accuracy comparable to their 32-bit counterparts. As an illustration, the proposed 4-bit and 8-bit quantized versions of the YOLO model achieve a mean average precision (mAP) of 62.6% and 63.9%, respectively, on the Pascal visual object classes 2012 test dataset; the mAP of the 32-bit full-precision baseline model is 64.0%.
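
    The per-layer interval selection described above is specific to the paper, but the fixed-point mapping it builds on can be shown in a few lines. A sketch of symmetric uniform quantization (the clipping threshold would come from the fitted weight/activation distribution; here it is simply the maximum magnitude):

        import numpy as np

        def quantize_uniform(w, bits=4, clip=None):
            """Map w onto 2**(bits-1) - 1 signed levels and back (simulated quantization)."""
            clip = np.abs(w).max() if clip is None else clip
            n_levels = 2 ** (bits - 1) - 1          # e.g. 7 positive levels for 4 bits
            step = clip / n_levels                  # quantization step size
            q = np.clip(np.round(w / step), -n_levels, n_levels)
            return q * step                         # dequantized values used at inference

        w = np.random.randn(1000) * 0.1
        w4 = quantize_uniform(w, bits=4)            # simulated 4-bit weights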

  9. A Mission Planning Approach for Precision Farming Systems Based on Multi-Objective Optimization

    Directory of Open Access Journals (Sweden)

    Zhaoyu Zhai

    2018-06-01

    Full Text Available: As the demand for food grows continuously, intelligent agriculture has drawn much attention due to its capability of producing great quantities of food efficiently. The main purpose of intelligent agriculture is to plan agricultural missions properly and use limited resources reasonably with minor human intervention. This paper proposes a Precision Farming System (PFS) as a Multi-Agent System (MAS). Components of the PFS are treated as agents with different functionalities, which can form several coalitions to complete complex agricultural missions cooperatively. In a PFS, mission planning should consider several criteria, such as expected benefit, energy consumption or equipment loss. Hence, mission planning can be treated as a Multi-objective Optimization Problem (MOP). In order to solve the MOP, an improved algorithm, MP-PSOGA, is proposed, taking advantage of Genetic Algorithms and Particle Swarm Optimization. A simulation, called the precise pesticide spraying mission, is performed to verify the feasibility of the proposed approach. The simulation results illustrate that the approach works properly and enables the PFS to plan missions and allocate scarce resources efficiently. The theoretical analysis and simulation are a good foundation for future study. Once the proposed approach is applied to a real scenario, it is expected to bring significant economic improvement.
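
    Treating mission planning as a multi-objective optimization problem means candidate plans are compared by Pareto dominance rather than by a single score. A minimal dominance check of the kind any MOP solver, including a PSO/GA hybrid such as MP-PSOGA, relies on (assuming all objectives are expressed as minimizations; the function names are illustrative):

        import numpy as np

        def dominates(f_a, f_b):
            """True if objective vector f_a Pareto-dominates f_b (minimization)."""
            f_a, f_b = np.asarray(f_a), np.asarray(f_b)
            return bool(np.all(f_a <= f_b) and np.any(f_a < f_b))

        def pareto_front(objectives):
            """Indices of non-dominated plans among the rows of `objectives`."""
            return [i for i, fi in enumerate(objectives)
                    if not any(dominates(fj, fi)
                               for j, fj in enumerate(objectives) if j != i)]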

  10. Best, Useful and Objective Precisions for Information Retrieval of Three Search Methods in PubMed and iPubMed

    Directory of Open Access Journals (Sweden)

    Somayyeh Nadi Ravandi

    2016-10-01

    Full Text Available: MEDLINE is one of the most valuable sources of medical information on the Internet. Among the different open-access sites for MEDLINE, PubMed is the best known. In 2010, iPubMed was established with an interaction-fuzzy search method for MEDLINE access. In the present work, we aimed to compare the precision of the retrieved sources (Best, Useful and Objective precision) in PubMed and iPubMed using two search methods in PubMed (simple and MeSH search) and the interaction-fuzzy method in iPubMed. During our semi-empirical study period, we held training workshops for 61 students of higher education to teach them the Simple Search, MeSH Search, and Fuzzy-Interaction Search methods. Then, the precision of 305 searches for each method prepared by the students was calculated on the basis of the Best precision, Useful precision, and Objective precision formulas. Analyses were done in SPSS version 11.5 using the Friedman and Wilcoxon tests, and the three precisions obtained with the three precision formulas were studied for the three search methods. The mean precision of the interaction-fuzzy search method was higher than that of the simple search and MeSH search for all three types of precision, i.e., Best precision, Useful precision, and Objective precision, with the simple search method in the next rank; their mean precisions were significantly different (P < 0.001). The precision of the interaction-fuzzy search method in iPubMed was investigated for the first time. Also for the first time, three types of precision were evaluated in PubMed and iPubMed. The results showed that the interaction-fuzzy search method is more precise than natural-language search (simple search and MeSH search), and users of this method found papers that were more related to their queries; even though searching in PubMed is useful, it is important that users apply new search methods to obtain the best results.

  11. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  12. Determining characteristics of artificial near-Earth objects using observability analysis

    Science.gov (United States)

    Friedman, Alex M.; Frueh, Carolin

    2018-03-01

    Observability analysis is a method for determining whether a chosen state of a system can be determined from the output or measurements. Knowledge of state information availability resulting from observability analysis leads to improved sensor tasking for observation of orbital debris and better control of active spacecraft. This research performs numerical observability analysis of artificial near-Earth objects. Analysis of linearization methods and state transition matrices is performed to determine the viability of applying linear observability methods to the nonlinear orbit problem. Furthermore, pre-whitening is implemented to reformulate classical observability analysis. In addition, the state in observability analysis is typically composed of position and velocity; however, including object characteristics beyond position and velocity can be crucial for precise orbit propagation. For example, solar radiation pressure has a significant impact on the orbit of high area-to-mass ratio objects in geosynchronous orbit. Therefore, determining the time required for solar radiation pressure parameters to become observable is important for understanding debris objects. In order to compare observability analysis results with and without measurement noise and an extended state, quantitative measures of observability are investigated and implemented.
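
    For a linearized system x' = Ax with measurements y = Cx, the classical test that such analyses build on checks the rank of the observability matrix; the state is observable exactly when that matrix has full rank. A minimal sketch (for the orbit problem, A would be the dynamics linearized along the trajectory; the matrices here are generic placeholders):

        import numpy as np

        def observability_matrix(A, C):
            """Stack [C; CA; CA^2; ...; CA^(n-1)] for an n-state linear system."""
            n = A.shape[0]
            blocks = [np.atleast_2d(C)]
            for _ in range(n - 1):
                blocks.append(blocks[-1] @ A)
            return np.vstack(blocks)

        def is_observable(A, C):
            return np.linalg.matrix_rank(observability_matrix(A, C)) == A.shape[0]

    Pre-whitening and quantitative measures of observability, as discussed in the abstract, refine this binary rank test into graded information about how well each state component can be estimated.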

  13. Evidence of gradual loss of precision for simple features and complex objects in visual working memory.

    Science.gov (United States)

    Rademaker, Rosanne L; Park, Young Eun; Sack, Alexander T; Tong, Frank

    2018-03-01

    Previous studies have suggested that people can maintain prioritized items in visual working memory for many seconds, with negligible loss of information over time. Such findings imply that working memory representations are robust to the potential contaminating effects of internal noise. However, once visual information is encoded into working memory, one might expect it to inevitably begin degrading over time, as this actively maintained information is no longer tethered to the original perceptual input. Here, we examined this issue by evaluating working memory for single central presentations of an oriented grating, color patch, or face stimulus, across a range of delay periods (1, 3, 6, or 12 s). We applied a mixture-model analysis to distinguish changes in memory precision over time from changes in the frequency of outlier responses that resemble random guesses. For all three types of stimuli, participants exhibited a clear and consistent decline in the precision of working memory as a function of temporal delay, as well as a modest increase in guessing-related responses for colored patches and face stimuli. We observed a similar loss of precision over time while controlling for temporal distinctiveness. Our results demonstrate that visual working memory is far from lossless: while basic visual features and complex objects can be maintained in a quite stable manner over time, these representations are still subject to noise accumulation and complete termination.
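
    The mixture-model analysis referred to here, in the style of Zhang and Luck (2008), treats each response error as drawn either from a von Mises distribution centred on the target (its concentration reflecting memory precision) or from a uniform guessing distribution. A minimal negative log-likelihood fit of the guess rate and concentration (a sketch; the published fitting procedure may differ in its details):

        import numpy as np
        from scipy.stats import vonmises
        from scipy.optimize import minimize

        def neg_log_likelihood(params, errors):
            """params = (g, kappa): guess rate and von Mises concentration.
            errors: response errors in radians, in [-pi, pi]."""
            g, kappa = params
            p = (1 - g) * vonmises.pdf(errors, kappa) + g / (2 * np.pi)
            return -np.sum(np.log(p))

        def fit_mixture(errors):
            res = minimize(neg_log_likelihood, x0=[0.1, 5.0], args=(errors,),
                           bounds=[(1e-6, 1 - 1e-6), (1e-3, 200.0)])
            return res.x  # estimated guess rate and precision (concentration)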

  14. Optimization to improve precision in neutron activation analysis

    International Nuclear Information System (INIS)

    Yustina Tri Handayani

    2010-01-01

    The level of precision or accuracy achieved in an analysis should satisfy both general requirements and customer needs. In presenting the results of an analysis, the level of precision is expressed as an uncertainty; the general requirement is the Horwitz prediction. Factors affecting the uncertainty in Neutron Activation Analysis (NAA) include the mass of the sample, the mass of the standards, the concentration in the standard, the counts of the sample and the standard, and the counting geometry. Therefore, to achieve the expected level of precision, these parameters need to be optimized. A standard concentration of similar material is applied as the basis of calculation; in this calculation, NIST SRM 2704 is used for sediment samples. Sample mass, irradiation time and cooling time can be modified to obtain the expected uncertainty. The predictions show that the precision for Al, V, Mg, Mn, K, Na, As, Cr, Co, Fe, and Zn satisfies the Horwitz criterion. The predicted counts and standard deviations for Mg-27 and Zn-65 were higher than the actual values because of the overlap of the Mg-27 and Mn-54 peaks and of the Zn-65 and Fe-59 peaks. The predicted uncertainty for Ca exceeds the Horwitz prediction, since the microscopic cross section, the probability of radiation emission of Ca-49 and the gamma spectrometer efficiency at 3084 keV are all relatively small; as these values are fixed, its precision can only be increased by extending the counting time and increasing the number of samples. The predictions are in accordance with the experimental results. (author)

  15. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder

    Science.gov (United States)

    Chen, Shuoyang; Xu, Tingfa; Li, Daqun; Zhang, Jizhou; Jiang, Shenwang

    2016-01-01

    During moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods, such as “frame difference” and “optical flow”, may not be able to deal with this problem well. In such scenarios, we use a modified algorithm for background modeling. In this paper, we use edge detection to obtain an edge difference image in order to enhance robustness to illumination variation. We then use a “multi-block temporal-analyzing LBP (Local Binary Pattern)” algorithm for segmentation. Finally, a connected-component analysis is used to locate the object. We also produce a hardware platform, the core of which consists of DSP (Digital Signal Processor) and FPGA (Field Programmable Gate Array) platforms and the high-precision intelligent holder. PMID:27775671
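
    The “multi-block temporal-analyzing LBP” variant is specific to this paper, but the basic local binary pattern it extends compares each pixel with its 8 neighbours and packs the comparisons into one byte. A plain 3×3 LBP sketch (not the authors' implementation):

        import numpy as np

        def lbp_3x3(img):
            """8-neighbour LBP code for each interior pixel of a grayscale image."""
            c = img[1:-1, 1:-1]
            offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                       (1, 1), (1, 0), (1, -1), (0, -1)]
            code = np.zeros_like(c, dtype=np.uint8)
            for bit, (dy, dx) in enumerate(offsets):
                rows = slice(1 + dy, img.shape[0] - 1 + dy)
                cols = slice(1 + dx, img.shape[1] - 1 + dx)
                code |= (img[rows, cols] >= c).astype(np.uint8) << bit
            return code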

  16. Moving Object Detection Using Scanning Camera on a High-Precision Intelligent Holder

    Directory of Open Access Journals (Sweden)

    Shuoyang Chen

    2016-10-01

    Full Text Available: During moving object detection in an intelligent visual surveillance system, scenarios with complex backgrounds are sure to appear. Traditional methods, such as “frame difference” and “optical flow”, may not be able to deal with this problem well. In such scenarios, we use a modified algorithm for background modeling. In this paper, we use edge detection to obtain an edge difference image in order to enhance robustness to illumination variation. We then use a “multi-block temporal-analyzing LBP (Local Binary Pattern)” algorithm for segmentation. Finally, a connected-component analysis is used to locate the object. We also produce a hardware platform, the core of which consists of DSP (Digital Signal Processor) and FPGA (Field Programmable Gate Array) platforms and the high-precision intelligent holder.

  17. PRECISE - pregabalin in addition to usual care: Statistical analysis plan

    NARCIS (Netherlands)

    S. Mathieson (Stephanie); L. Billot (Laurent); C. Maher (Chris); A.J. McLachlan (Andrew J.); J. Latimer (Jane); B.W. Koes (Bart); M.J. Hancock (Mark J.); I. Harris (Ian); R.O. Day (Richard O.); J. Pik (Justin); S. Jan (Stephen); C.-W.C. Lin (Chung-Wei Christine)

    2016-01-01

    Background: Sciatica is a severe, disabling condition that lacks high-quality evidence for effective treatment strategies. This a priori statistical analysis plan describes the methodology of analysis for the PRECISE study. Methods/design: PRECISE is a prospectively registered, double...

  18. Precise Analysis of String Expressions

    DEFF Research Database (Denmark)

    Christensen, Aske Simon; Møller, Anders; Schwartzbach, Michael Ignatieff

    2003-01-01

    We perform static analysis of Java programs to answer a simple question: which values may occur as results of string expressions? The answers are summarized for each expression by a regular language that is guaranteed to contain all possible values. We present several applications of this analysis, including statically checking the syntax of dynamically generated expressions, such as SQL queries. Our analysis constructs flow graphs from class files and generates a context-free grammar with a nonterminal for each string expression. The language of this grammar is then widened into a regular language. If a program error is detected, examples of invalid strings are automatically produced. We present extensive benchmarks demonstrating that the analysis is efficient and produces results of useful precision.

  19. Finger pressure adjustments to various object configurations during precision grip in humans and monkeys.

    Science.gov (United States)

    Viaro, Riccardo; Tia, Banty; Coudé, Gino; Canto, Rosario; Oliynyk, Andriy; Salmas, Paola; Masia, Lorenzo; Sandini, Giulio; Fadiga, Luciano

    2017-06-01

    In this study, we recorded the pressure exerted onto an object by the index finger and thumb of the preferred hand of 18 human subjects, and by either hand of two macaque monkeys, during a precision grasping task. The to-be-grasped object was a custom-made device composed of two plates which could be variably oriented by a motorized system while keeping the size, and thus the grip dimension, constant. The plates were covered by an array of capacitive sensors to measure specific features of finger adaptation, namely pressure intensity and centroid location and displacement. Kinematic measurements demonstrated that, for human subjects and for monkeys, different plate configurations did not affect wrist velocity and grip aperture during the reaching phase. Consistently, at the instant of finger-plate contact, pressure centroids were clustered around the same point for all handle configurations. However, small pressure centroid displacements were specifically adopted for each configuration, indicating that both humans and monkeys can display finger adaptation during precision grip. Moreover, humans applied stronger thumb pressure, performed less centroid displacement and required reduced adjustment time compared to monkeys. These pressure patterns remained similar when different load forces were required to pull the handle, as ascertained by additional measurements in humans. The present findings indicate that, although humans and monkeys share common features in the motor control of grasping, they differ in the adjustment of fingertip pressure, probably because of skill and/or morphological divergences. Such a precision grip device may form the groundwork for future studies on prehension mechanisms.

  20. Functional Object Analysis

    DEFF Research Database (Denmark)

    Raket, Lars Lau

    We propose a direction in the field of statistics which we will call functional object analysis. This subfield considers the analysis of functional objects defined on continuous domains. In this setting we focus on model-based statistics, with a particular emphasis on mixed-effect formulations, where the observed functional signal is assumed to consist of both fixed and random functional effects. This thesis takes the initial steps toward the development of likelihood-based methodology for functional objects. We first consider analysis of functional data defined on high...

  1. In vivo glenohumeral analysis using 3D MRI models and a flexible software tool: feasibility and precision.

    Science.gov (United States)

    Busse, Harald; Thomas, Michael; Seiwerts, Matthias; Moche, Michael; Busse, Martin W; von Salis-Soglio, Georg; Kahn, Thomas

    2008-01-01

    To implement a PC-based morphometric analysis platform and to evaluate the feasibility and precision of MRI measurements of glenohumeral translation. Using a vertically open 0.5 T MRI scanner, the shoulders of 10 healthy subjects were scanned in the apprehension position (AP) and in the neutral position (NP). Surface models of the humeral head (HH) and the glenoid cavity (GC) were created from segmented MR images by three readers. Glenohumeral translation was determined from the projection of the manually fitted HH center onto the GC plane defined by the two main principal axes of the GC model. Positional precision, given as the mean (extreme value at the 95% confidence level), was 0.9 (1.8) mm for the HH center and 0.7 (1.6) mm for the GC centroid; angular GC precision was 1.3° (2.3°) for the normal and about 4° (7°) for the anterior and superior coordinate axes. The two-dimensional (2D) precision of the HH projection point was 1.1 (2.2) mm. A significant HH translation between AP and NP was found. Despite the limited quality of the underlying model data, our PC-based analysis platform allows a precise morphometric analysis of the glenohumeral joint. The software is easily extendable and may potentially be used for objective evaluation of therapeutic measures.

  2. The economic case for precision medicine.

    Science.gov (United States)

    Gavan, Sean P; Thompson, Alexander J; Payne, Katherine

    2018-01-01

    Introduction: The advancement of precision medicine into routine clinical practice has been highlighted as an agenda for national and international health care policy. A principal barrier to this advancement is meeting the requirements of the payer or reimbursement agency for health care. This special report aims to explain the economic case for precision medicine by accounting for the explicit objectives defined by decision-makers responsible for the allocation of limited health care resources. Areas covered: The framework of cost-effectiveness analysis, a method of economic evaluation, is used to describe how precision medicine can, in theory, exploit identifiable patient-level heterogeneity to improve population health outcomes and the relative cost-effectiveness of health care. Four case studies are used to illustrate potential challenges in demonstrating the economic case for a precision medicine in practice. Expert commentary: The economic case for a precision medicine should be considered at an early stage during its research and development phase. Clinical and economic evidence can be generated iteratively and should be in alignment with the objectives and requirements of decision-makers. Programmes of further research, to demonstrate the economic case for a precision medicine, can be prioritized by the extent to which they reduce the uncertainty expressed by decision-makers.

  3. Ion beam analysis in art and archaeology: attacking the power precisions paradigm

    International Nuclear Information System (INIS)

    Abraham, Meg

    2004-01-01

    It is a post-modern axiom that the closer one looks at something, the more blinkered the view; the result is often a failure to see the whole picture. With this in mind, the value of a tool for art and archaeology applications is greatly enhanced if the information is scientifically precise and yet easily integrated into the broader study of the objects at hand. Art and archaeological objects offer some unique challenges for researchers. First, they are almost always extraordinarily inhomogeneous, across individual pieces and across types. Second, they are often valuable and delicate, so sampling is discouraged. Finally, in most cases each piece is unique; thus the data are also unique and are of greatest value when incorporated into the overall understanding of the object or of the culture of the artisan. Ion beam analysis solves many of these problems. With IBA, it is possible to avoid sampling by using an external beam setup or by manipulating small objects in a vacuum. The technique is largely non-destructive, allowing multiple data points to be taken across an object. The X-ray yields come from deeper in the sample than those of other techniques, and using RBS one can obtain bulk concentrations from microns into the sample. Finally, the resulting X-ray spectra are easily interpreted and understood by many conservators and curators, while PIXE maps are a wonderful visual record of the results of the analysis. Some examples of the special role that ion beam analysis plays in the examination of cultural objects will be covered in this talk.

  4. Precision-Recall-Gain Curves: PR Analysis Done Right

    OpenAIRE

    Flach, Peter; Kull, Meelis

    2015-01-01

    Precision-Recall analysis abounds in applications of binary classification where true negatives do not add value and hence should not affect assessment of the classifier's performance. Perhaps inspired by the many advantages of receiver operating characteristic (ROC) curves and the area under such curves for accuracy-based performance assessment, many researchers have taken to reporting Precision-Recall (PR) curves and associated areas as a performance metric. We demonstrate in this paper that this...
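
    In Flach and Kull's formulation, precision and recall are mapped to “gains” measured against the always-positive baseline, whose precision equals the positive prevalence pi. To the best of my recollection (readers should verify against the paper), the transforms are:

        def precision_gain(prec, pi):
            # pi: proportion of positives; the baseline classifier has precision pi.
            return (prec - pi) / ((1 - pi) * prec)

        def recall_gain(rec, pi):
            return (rec - pi) / ((1 - pi) * rec)

    These transforms linearise the hyperbolic isometrics of PR space, which is what makes areas under the resulting PRG curves meaningful.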

  5. Object properties and cognitive load in the formation of associative memory during precision lifting.

    Science.gov (United States)

    Li, Yong; Randerath, Jennifer; Bauer, Hans; Marquardt, Christian; Goldenberg, Georg; Hermsdörfer, Joachim

    2009-01-03

    When we manipulate familiar objects in daily life, our grip force anticipates the physical demands right from the moment of contact with the object, indicating the existence of a memory for relevant object properties. This study explores the formation and consolidation of the memory processes that associate either familiar (size) or arbitrary (color) object features with object weight. In the general task, participants repetitively lifted two differently weighted objects (580 and 280 g) in pseudo-random order. Forty young healthy adults participated in this study and were randomly distributed into four groups: Color Cue Single task (CCS; blue and red, 9.8³ cm³), Color Cue Dual task (CCD), No Cue (NC) and Size Cue (SC; 9.8³ and 6³ cm³). All groups performed a repetitive precision grasp-lift task and were retested with the same protocol after a 5-min pause. The CCD group was also required to perform a memory task simultaneously during each lift of the differently weighted objects coded by color. The results show that groups lifting objects with arbitrary or familiar features successfully formed the association between object weight and the manipulated object features and incorporated it into grip force programming, as observed in the different scaling of grip force and grip force rate for different object weights. An arbitrary feature, i.e., color, can be associated with object weight, although less strongly than the familiar feature of size. The simultaneous memory task impaired anticipatory force scaling during repetitive object lifting but did not jeopardize the learning process or the consolidation of the associative memory.

  6. Constraint Solver Techniques for Implementing Precise and Scalable Static Program Analysis

    DEFF Research Database (Denmark)

    Zhang, Ye

    ...developers to build reliable software systems more quickly and with fewer bugs or security defects. While designing and implementing a program analysis remains hard work, making it both scalable and precise is even more challenging. In this dissertation, we show that with a general inclusion constraint solver using unification we could make a program analysis easier to design and implement, much more scalable, and still as precise as expected. We present an inclusion constraint language with explicit equality constructs for specifying program analysis problems, and a parameterized framework... Using data flow analyses for the C language, we demonstrate that a large number of equivalences can be detected by off-line analyses, which can then be used by a constraint solver to significantly improve the scalability of an analysis without sacrificing any precision.

  7. Precise Plan in the analysis of volume precision in Synergy™ cone-beam CT images

    International Nuclear Information System (INIS)

    Bai Sen; Xu Qingfeng; Zhong Renming; Jiang Xiaoqin; Jiang Qingfeng; Xu Feng

    2007-01-01

    Objective: To develop a method for checking the volume precision of Synergy™ cone-beam CT (CBCT) images. Methods: Known phantoms (big, middle and small spheres, cubes and a cuneiform cavity) were scanned with cone-beam CT at different positions (the CBCT centre and 5, 8 and 10 cm from the centre along the accelerator G-T direction), and the phantom volumes were measured on the reconstructed images. The volumes measured with Synergy™ cone-beam CT were then compared with fan-beam CT results and nominal values. Results: The middle spheres showed a 1.5% discrepancy between nominal and mean measured values at the CBCT centre and at 5 and 8 cm from the centre along the G-T direction. The small spheres showed 8.1%, the big cube 0.8% and the small cube 2.9% between nominal and mean measured values at the CBCT centre and at 5, 8 and 10 cm from the centre along the G-T direction. Conclusion: Within the valid scan range of Synergy™ cone-beam CT, reconstruction precision is independent of the distance from the centre. (authors)

  8. Sensitivity Analysis of Deviation Source for Fast Assembly Precision Optimization

    Directory of Open Access Journals (Sweden)

    Jianjun Tang

    2014-01-01

    Full Text Available: Assembly precision optimization of a complex product has a huge benefit in improving the quality of our products. Because many deviation sources couple with one another, the goal of assembly precision optimization is difficult to determine accurately. In order to optimize assembly precision accurately and rapidly, sensitivity analysis of deviation sources is proposed. First, deviation source sensitivity is defined as the ratio of assembly dimension variation to deviation source dimension variation. Second, according to the assembly constraint relations, assembly sequences and locating scheme, deviation transmission paths are established by locating the joints between adjacent parts and establishing each part's datum reference frame. Third, assembly multidimensional vector loops are created using the deviation transmission paths, and the corresponding scalar equations of each dimension are established. Then, assembly deviation source sensitivity is calculated using a first-order Taylor expansion and a matrix transformation method. Finally, taking assembly precision optimization of a wing flap rocker as an example, the effectiveness and efficiency of the deviation source sensitivity analysis method are verified.

  9. Visual Field Preferences of Object Analysis for Grasping with One Hand

    Directory of Open Access Journals (Sweden)

    Ada Le

    2014-10-01

    Full Text Available: When we grasp an object using one hand, the opposite hemisphere predominantly guides the motor control of grasp movements (Davare et al. 2007; Rice et al. 2007). However, it is unclear whether visual object analysis for grasp control relies more on inputs (a) from the contralateral than the ipsilateral visual field, (b) from one dominant visual field regardless of the grasping hand, or (c) from both visual fields equally. For bimanual grasping of a single object we have recently demonstrated a visual field preference for the left visual field (Le and Niemeier 2013a, 2013b), consistent with a general right-hemisphere dominance for sensorimotor control of bimanual grasps (Le et al., 2013). But visual field differences have never been tested for unimanual grasping. Therefore, here we asked right-handed participants to fixate to the left or right of an object and then grasp the object with either their right or left hand using a precision grip. We found that participants grasping with their right hand performed better with objects in the right visual field: maximum grip apertures (MGAs) were more closely matched to the object width and were smaller than for objects in the left visual field. In contrast, when people grasped with their left hand, the preference switched to the left visual field. What is more, MGA scaling showed greater visual field differences than right-hand grasping. Our data suggest that visual object analysis for unimanual grasping shows a preference for visual information from the ipsilateral visual field, and that the left hemisphere is better equipped to control grasps in both visual fields.

  10. Droplet-counting Microtitration System for Precise On-site Analysis.

    Science.gov (United States)

    Kawakubo, Susumu; Omori, Taichi; Suzuki, Yasutada; Ueta, Ikuo

    2018-01-01

    A new microtitration system based on counting titrant droplets has been developed for precise on-site analysis. The dropping rate was controlled by inserting a capillary tube as a flow resistance in a laboratory-made micropipette. The titration error was 3% in a simulated titration with 20 droplets. Pre-addition of titrant was proposed to achieve precise titration within an error of 0.5%. The analytical performance was evaluated for chelate, redox and acid-base titrations.

  11. Fully automatic and precise data analysis developed for time-of-flight mass spectrometry.

    Science.gov (United States)

    Meyer, Stefan; Riedo, Andreas; Neuland, Maike B; Tulej, Marek; Wurz, Peter

    2017-09-01

    Scientific objectives of current and future space missions are focused on the investigation of the origin and evolution of the solar system, with particular emphasis on habitability and signatures of past and present life. For in situ measurements of the chemical composition of solid samples on planetary surfaces, of the neutral atmospheric gas and of the thermal plasma of planetary atmospheres, mass spectrometers making use of time-of-flight mass analysers are a widely used technique. However, such investigations imply measurements with good statistics and, thus, a large amount of data to be analysed. Therefore, faster and especially robust automated data analysis with enhanced accuracy is required. In this contribution, an automatic data analysis software, which allows fast and precise quantitative analysis of time-of-flight mass spectrometric data, is presented and discussed in detail. A crucial part of this software is a robust and fast peak finding algorithm with a consecutive numerical integration method allowing precise data analysis. We tested our analysis software with data from different time-of-flight mass spectrometers and different measurement campaigns thereof. The quantitative analysis of isotopes, using automatic data analysis, yields isotope ratios with an accuracy of up to 100 ppm for a signal-to-noise ratio (SNR) of 10^4. We show that the accuracy of isotope ratios is in fact proportional to SNR^-1. Furthermore, we observe that the accuracy of isotope ratios is inversely proportional to the mass resolution, and that it depends on the sample width T_s as T_s^0.5.

  12. Defining precision: The precision medicine initiative trials NCI-MPACT and NCI-MATCH.

    Science.gov (United States)

    Coyne, Geraldine O'Sullivan; Takebe, Naoko; Chen, Alice P

    "Precision" trials, using rationally incorporated biomarker targets and molecularly selective anticancer agents, have become of great interest to both patients and their physicians. In the endeavor to test the cornerstone premise of precision oncotherapy, that is, determining if modulating a specific molecular aberration in a patient's tumor with a correspondingly specific therapeutic agent improves clinical outcomes, the design of clinical trials with embedded genomic characterization platforms which guide therapy are an increasing challenge. The National Cancer Institute Precision Medicine Initiative is an unprecedented large interdisciplinary collaborative effort to conceptualize and test the feasibility of trials incorporating sequencing platforms and large-scale bioinformatics processing that are not currently uniformly available to patients. National Cancer Institute-Molecular Profiling-based Assignment of Cancer Therapy and National Cancer Institute-Molecular Analysis for Therapy Choice are 2 genomic to phenotypic trials under this National Cancer Institute initiative, where treatment is selected according to predetermined genetic alterations detected using next-generation sequencing technology across a broad range of tumor types. In this article, we discuss the objectives and trial designs that have enabled the public-private partnerships required to complete the scale of both trials, as well as interim trial updates and strategic considerations that have driven data analysis and targeted therapy assignment, with the intent of elucidating further the benefits of this treatment approach for patients. Copyright © 2017. Published by Elsevier Inc.

  13. High precision analysis of trace lithium isotope by thermal ionization mass spectrometry

    International Nuclear Information System (INIS)

    Tang Lei; Liu Xuemei; Long Kaiming; Liu Zhao; Yang Tianli

    2010-01-01

    A high-precision method for the analysis of nanogram amounts of lithium by thermal ionization mass spectrometry is developed. By means of double-filament measurement, a phosphine acid ion enhancer and a sample pre-baking technique, the precision of trace lithium analysis is improved. For a 100 ng lithium isotope standard sample, the relative standard deviation is better than 0.086%; for a 10 ng lithium isotope standard sample, it is better than 0.90%. (authors)

  14. GEOPOSITIONING PRECISION ANALYSIS OF MULTIPLE IMAGE TRIANGULATION USING LRO NAC LUNAR IMAGES

    Directory of Open Access Journals (Sweden)

    K. Di

    2016-06-01

    Full Text Available: This paper presents an empirical analysis of the geopositioning precision of multiple-image triangulation using Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) images at the Chang’e-3 (CE-3) landing site. Nine LROC NAC images are selected for comparative analysis of geopositioning precision. Rigorous sensor models of the images are established based on collinearity equations, with interior and exterior orientation elements retrieved from the corresponding SPICE kernels. Rational polynomial coefficients (RPCs) of each image are derived by least-squares fitting using a vast number of virtual control points generated according to the rigorous sensor models. Experiments with different combinations of images are performed for comparison. The results demonstrate that the plane coordinates can achieve a precision of 0.54 m to 2.54 m, with a height precision of 0.71 m to 8.16 m, when only two images are used for three-dimensional triangulation. There is a general trend that the geopositioning precision, especially the height precision, improves as the convergence angle of the two images increases from several degrees to about 50°. However, the image matching precision should also be taken into consideration when choosing image pairs for triangulation. The precisions obtained using all nine images are 0.60 m, 0.50 m and 1.23 m in the along-track, cross-track and height directions, which are better than most combinations of two or more images. However, triangulation with fewer, selected images can produce better precision than using all of the images.

  15. System and method for high precision isotope ratio destructive analysis

    Science.gov (United States)

    Bushaw, Bruce A; Anheier, Norman C; Phillips, Jon R

    2013-07-02

    A system and process are disclosed that provide high accuracy and high precision destructive analysis measurements for isotope ratio determination of relative isotope abundance distributions in liquids, solids, and particulate samples. The invention utilizes a collinear probe beam to interrogate a laser ablated plume. This invention provides enhanced single-shot detection sensitivity approaching the femtogram range, and isotope ratios that can be determined at approximately 1% or better precision and accuracy (relative standard deviation).

  16. Aspects of precision and accuracy in neutron activation analysis

    International Nuclear Information System (INIS)

    Heydorn, K.

    1980-03-01

    Analytical results without systematic errors and with accurately known random errors are normally distributed around their true values. Such results may be produced by means of neutron activation analysis both with and without radiochemical separation. When all sources of random variation are known a priori, their effect may be combined with the Poisson statistics characteristic of the counting process, and the standard deviation of a single analytical result may be estimated. The various steps of a complete neutron activation analytical procedure are therefore studied in detail with respect to determining their contribution to the overall variability of the final result. Verification of the estimated standard deviation is carried out by demonstrating the absence of significant unknown random errors through analysing, in replicate, samples covering the range of concentrations and matrices anticipated in actual use. Agreement between the estimated and the observed variability of replicate results is then tested by a simple statistic T based on the chi-square distribution. It is found that results from neutron activation analysis on biological samples can be brought into statistical control. In routine application of methods in statistical control the same statistical test may be used for quality control when some of the actual samples are analysed in duplicate. This analysis of precision serves to detect unknown or unexpected sources of variation of the analytical results, and both random and systematic errors have been discovered in practical trace element investigations in different areas of research. Particularly, at the ultratrace level of concentration where there are few or no standard reference materials for ascertaining the accuracy of results, the proposed quality control based on the analysis of precision combined with neutron activation analysis with radiochemical separation, with an a priori precision independent of the level of concentration, becomes a

  17. Error analysis of marker-based object localization using a single-plane XRII

    International Nuclear Information System (INIS)

    Habets, Damiaan F.; Pollmann, Steven I.; Yuan, Xunhua; Peters, Terry M.; Holdsworth, David W.

    2009-01-01

    The role of imaging and image guidance is increasing in surgery and therapy, including treatment planning and follow-up. Fluoroscopy is used for two-dimensional (2D) guidance or localization; however, many procedures would benefit from three-dimensional (3D) guidance or localization. Three-dimensional computed tomography (CT) using a C-arm mounted x-ray image intensifier (XRII) can provide high-quality 3D images; however, patient dose and the required acquisition time restrict the number of 3D images that can be obtained. C-arm based 3D CT is therefore limited in applications for x-ray based image guidance or dynamic evaluations. 2D-3D model-based registration, using a single-plane 2D digital radiographic system, does allow for rapid 3D localization. Our goal is to investigate, over a clinically practical range, the impact of x-ray exposure on the achievable 3D localization precision. In this paper it is assumed that the tracked instrument incorporates a rigidly attached 3D object with a known configuration of markers. A 2D image is obtained by a digital fluoroscopic x-ray system and corrected for XRII distortions (±0.035 mm) and mechanical C-arm shift (±0.080 mm). A least-squares projection-Procrustes analysis is then used to calculate the 3D position from the measured 2D marker locations. The effect of x-ray exposure on the precision of 2D marker localization and on 3D object localization was investigated using numerical simulations and x-ray experiments. The results show a nearly linear relationship between 2D marker localization precision and 3D localization precision. However, a significant amplification of error, nonuniformly distributed among the three major axes, occurs, as is demonstrated. To obtain a 3D localization error of less than ±1.0 mm for an object with 20 mm marker spacing, the 2D localization precision must be better than ±0.07 mm. This requirement was met for all investigated nominal x-ray exposures at 28 cm FOV, and
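
    The least-squares projection-Procrustes step pairs the known 3D marker configuration with its measured 2D projections. The orthogonal-Procrustes core of such a fit, aligning two corresponding point sets by a rotation and translation (a Kabsch-style SVD sketch; the projection part of the authors' method is omitted here):

        import numpy as np

        def rigid_fit(P, Q):
            """Least-squares R, t with R @ P[i] + t ~= Q[i]; P, Q: (n, 3) arrays."""
            Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
            U, _, Vt = np.linalg.svd(Pc.T @ Qc)
            D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflection
            R = (U @ D @ Vt).T
            t = Q.mean(axis=0) - R @ P.mean(axis=0)
            return R, t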

  18. Systems Biology of Metabolism: A Driver for Developing Personalized and Precision Medicine

    DEFF Research Database (Denmark)

    Nielsen, Jens

    2017-01-01

    ...for advancing the development of personalized and precision medicine to treat metabolic diseases like insulin resistance, obesity, NAFLD, NASH, and cancer. It will be illustrated how the concept of genome-scale metabolic models can be used for integrative analysis of big data with the objective of identifying novel biomarkers that are foundational for personalized and precision medicine.

  19. Precision medicine in myasthenia gravis: begin from the data precision

    Science.gov (United States)

    Hong, Yu; Xie, Yanchen; Hao, Hong-Jun; Sun, Ren-Cheng

    2016-01-01

    Myasthenia gravis (MG) is a prototypic autoimmune disease with overt clinical and immunological heterogeneity. MG data are currently far from individually precise, partially due to the rarity and heterogeneity of this disease. In this review, we provide basic insights into MG data precision, including onset age, presenting symptoms, generalization, thymus status, pathogenic autoantibodies, muscle involvement, severity and response to treatment, based on the literature and our previous studies. Subgroups and quantitative traits of MG are discussed with respect to data precision. The role of disease registries and the scientific basis of precise analysis are also discussed, to ensure better collection and analysis of MG data. PMID:27127759

  20. Thorium spectrophotometric analysis with high precision

    International Nuclear Information System (INIS)

    Palmieri, H.E.L.

    1983-06-01

    An accurate and precise determination of thorium is proposed. A precision of about 0.1% is required for the determination of macroquantities of thorium processed. After an extensive literature search on this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution with alizarin S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution is added and the titration is completed with less than 5 ml of 0.0025 M EDTA solution. It is usual to locate the end-point graphically, by plotting added titrant versus absorbance; here a non-linear least-squares fit was used, based on the Fletcher and Powell minimization method implemented in a computer program. (author)

  1. Accuracy and precision in activation analysis: counting

    International Nuclear Information System (INIS)

    Becker, D.A.

    1974-01-01

    Accuracy and precision in activation analysis were investigated with regard to the counting of induced radioactivity. The parameters discussed include configuration, positioning, density, homogeneity, intensity, radioisotopic purity, peak integration, and nuclear constants. Experimental results are presented for many of these parameters. The results obtained indicate that counting errors often contribute significantly to the inaccuracy and imprecision of analyses. The magnitude of these errors ranges from less than 1 percent to 10 percent or more in many cases.

  2. The emerging potential for network analysis to inform precision cancer medicine.

    Science.gov (United States)

    Ozturk, Kivilcim; Dow, Michelle; Carlin, Daniel E; Bejar, Rafael; Carter, Hannah

    2018-06-14

    Precision cancer medicine promises to tailor clinical decisions to individual patients using genomic information. Indeed, the success of drugs targeting genetic alterations in tumors, such as imatinib, which targets BCR-ABL in chronic myelogenous leukemia, has demonstrated the power of this approach. However, biological systems are complex, and patients may differ not only in the specific genetic alterations in their tumor, but in more subtle interactions among such alterations. Systems biology and, more specifically, network analysis provide a framework for advancing precision medicine beyond the clinical actionability of individual mutations. Here we discuss applications of network analysis to the study of tumor biology, early methods for N-of-1 tumor genome analysis, and the path for such tools to the clinic.

  3. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms.

    Science.gov (United States)

    Wu, Qianqian; Yue, Honghao; Liu, Rongqiang; Zhang, Xiaoyou; Ding, Liang; Liang, Tian; Deng, Zongquan

    2015-08-14

    High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms.

  4. Measurement Model and Precision Analysis of Accelerometers for Maglev Vibration Isolation Platforms

    Directory of Open Access Journals (Sweden)

    Qianqian Wu

    2015-08-01

    Full Text Available High precision measurement of acceleration levels is required to allow active control for vibration isolation platforms. It is necessary to propose an accelerometer configuration measurement model that yields such a high measuring precision. In this paper, an accelerometer configuration to improve measurement accuracy is proposed. The corresponding calculation formulas of the angular acceleration were derived through theoretical analysis. A method is presented to minimize angular acceleration noise based on analysis of the root mean square noise of the angular acceleration. Moreover, the influence of installation position errors and accelerometer orientation errors on the calculation precision of the angular acceleration is studied. Comparisons of the output differences between the proposed configuration and the previous planar triangle configuration under the same installation errors are conducted by simulation. The simulation results show that installation errors have a relatively small impact on the calculation accuracy of the proposed configuration. To further verify the high calculation precision of the proposed configuration, experiments are carried out for both the proposed configuration and the planar triangle configuration. On the basis of the results of simulations and experiments, it can be concluded that the proposed configuration has higher angular acceleration calculation precision and can be applied to different platforms.
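Both versions of this record derive configuration-specific formulas that are not reproduced here. As a hedged illustration of the generic principle behind them (a hypothetical two-accelerometer arrangement with assumed noise figures, not the authors' configuration), angular acceleration follows from the difference of two linear accelerometers over a known baseline, and the noise of the estimate scales inversely with that baseline:

```python
import numpy as np

def angular_acceleration(a1, a2, baseline):
    """Angular acceleration (rad/s^2) about the axis normal to two
    parallel linear accelerometers separated by `baseline` metres:
    alpha = (a2 - a1) / d for a rigid body."""
    return (np.asarray(a2) - np.asarray(a1)) / baseline

# RMS noise of the estimate: two independent sensors of noise sigma
# give sqrt(2) * sigma / d, so a longer baseline means less noise.
sigma_sensor = 1e-4    # assumed accelerometer noise, m/s^2 (1 sigma)
d = 0.5                # assumed baseline, m
sigma_alpha = np.sqrt(2.0) * sigma_sensor / d
print(f"angular-acceleration noise: {sigma_alpha:.2e} rad/s^2 (RMS)")
print(f"{angular_acceleration(0.1002, 0.1005, d):.4f} rad/s^2")
```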

  5. [Refractive precision and objective quality of vision after toric lens implantation in cataract surgery].

    Science.gov (United States)

    Debois, A; Nochez, Y; Bezo, C; Bellicaud, D; Pisella, P-J

    2012-10-01

    To study efficacy and predictability of toric IOL implantation for correction of preoperative corneal astigmatism by analysing spherocylindrical refractive precision and objective quality of vision. Prospective study of 13 eyes undergoing micro-incisional cataract surgery through a 1.8mm corneal incision with toric IOL implantation (Lentis L313T(®), Oculentis) to treat over one D of preoperative corneal astigmatism. Preoperative evaluation included keratometry, subjective refraction, and total and corneal aberrometry (KR-1(®), Topcon). Six months postoperatively, measurements included slit lamp photography, documenting IOL rotation, tilt or decentration, uncorrected visual acuity, best-corrected visual acuity and objective quality of vision measurement (OQAS(®) Visiometrics, Spain). Postoperatively, mean uncorrected distance visual acuity was 8.33/10 ± 1.91 (0.09 ± 0.11 LogMar). Mean postoperative refractive sphere was 0.13 ± 0.73 diopters. Mean refractive astigmatism was -0.66 ± 0.56 diopters with corneal astigmatism of 2.17 ± 0.68 diopters. Mean IOL rotation was 4.4° ± 3.6° (range 0° to 10°). Mean rotation of this IOL at 6 months was less than 5°, demonstrating stability of the optic within the capsular bag. Objective quality of vision measurements were consistent with subjective uncorrected visual acuity. Implantation of the L313T(®) IOL is safe and effective for correction of corneal astigmatism in 1.8mm micro-incisional cataract surgery. Copyright © 2012 Elsevier Masson SAS. All rights reserved.

  6. Variable precision rough set for multiple decision attribute analysis

    Institute of Scientific and Technical Information of China (English)

Lai, Kin Keung

    2008-01-01

    A variable precision rough set (VPRS) model is used to solve the multi-attribute decision analysis (MADA) problem with multiple conflicting decision attributes and multiple condition attributes. By introducing confidence measures and a β-reduct, the VPRS model can rationally solve the conflicting decision analysis problem with multiple decision attributes and multiple condition attributes. For illustration, a medical diagnosis example is utilized to show the feasibility of the VPRS model in solving the MADA...
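As a rough illustration of the confidence-measure idea (a minimal sketch, not the paper's formulation), the β-lower approximation of a decision class can be computed by thresholding the conditional inclusion degree of each condition-attribute equivalence class:

```python
from collections import defaultdict

def beta_lower_approximation(conditions, decisions, beta=0.8):
    """Condition-attribute equivalence classes whose inclusion degree in
    decision class 1 reaches the precision level beta."""
    classes = defaultdict(list)
    for cond, dec in zip(conditions, decisions):
        classes[tuple(cond)].append(dec)
    return [cond for cond, decs in classes.items()
            if decs.count(1) / len(decs) >= beta]

# Toy decision table: two condition attributes, one binary decision.
conds = [(1, 0), (1, 0), (1, 0), (0, 1), (0, 1)]
decs = [1, 1, 0, 0, 0]
# With beta = 0.6, class (1, 0) qualifies (2 of 3 objects agree) even
# though it is inconsistent in the classical rough set sense.
print(beta_lower_approximation(conds, decs, beta=0.6))   # [(1, 0)]
```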

  7. Selective Attention to Auditory Memory Neurally Enhances Perceptual Precision.

    Science.gov (United States)

    Lim, Sung-Joo; Wöstmann, Malte; Obleser, Jonas

    2015-12-09

    Selective attention to a task-relevant stimulus facilitates encoding of that stimulus into a working memory representation. It is less clear whether selective attention also improves the precision of a stimulus already represented in memory. Here, we investigate the behavioral and neural dynamics of selective attention to representations in auditory working memory (i.e., auditory objects) using psychophysical modeling and model-based analysis of electroencephalographic signals. Human listeners performed a syllable pitch discrimination task where two syllables served as to-be-encoded auditory objects. Valid (vs neutral) retroactive cues were presented during retention to allow listeners to selectively attend to the to-be-probed auditory object in memory. Behaviorally, listeners represented auditory objects in memory more precisely (expressed by steeper slopes of a psychometric curve) and made faster perceptual decisions when valid compared to neutral retrocues were presented. Neurally, valid compared to neutral retrocues elicited a larger frontocentral sustained negativity in the evoked potential as well as enhanced parietal alpha/low-beta oscillatory power (9-18 Hz) during memory retention. Critically, individual magnitudes of alpha oscillatory power (7-11 Hz) modulation predicted the degree to which valid retrocues benefitted individuals' behavior. Our results indicate that selective attention to a specific object in auditory memory does benefit human performance not by simply reducing memory load, but by actively engaging complementary neural resources to sharpen the precision of the task-relevant object in memory. Can selective attention improve the representational precision with which objects are held in memory? And if so, what are the neural mechanisms that support such improvement? These issues have been rarely examined within the auditory modality, in which acoustic signals change and vanish on a milliseconds time scale. Introducing a new auditory memory

  8. An Empirical Study of Precise Interprocedural Array Analysis

    Directory of Open Access Journals (Sweden)

    Michael Hind

    1994-01-01

    Full Text Available In this article we examine the role played by the interprocedural analysis of array accesses in the automatic parallelization of Fortran programs. We use the PTRAN system to provide measurements of several benchmarks to compare different methods of representing interprocedurally accessed arrays. We examine issues concerning the effectiveness of automatic parallelization using these methods and the efficiency of a precise summarization method.

  9. The Use of Object-Oriented Analysis Methods in Surety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Craft, Richard L.; Funkhouser, Donald R.; Wyss, Gregory D.

    1999-05-01

    Object-oriented analysis methods have been used in the computer science arena for a number of years to model the behavior of computer-based systems. This report documents how such methods can be applied to surety analysis. By embodying the causality and behavior of a system in a common object-oriented analysis model, surety analysts can make the assumptions that underlie their models explicit and thus better communicate with system designers. Furthermore, given minor extensions to traditional object-oriented analysis methods, it is possible to automatically derive a wide variety of traditional risk and reliability analysis methods from a single common object model. Automatic model extraction helps ensure consistency among analyses and enables the surety analyst to examine a system from a wider variety of viewpoints in a shorter period of time. Thus it provides a deeper understanding of a system's behaviors and surety requirements. This report documents the underlying philosophy behind the common object model representation, the methods by which such common object models can be constructed, and the rules required to interrogate the common object model for derivation of traditional risk and reliability analysis models. The methodology is demonstrated in an extensive example problem.

  10. High precision spectrophotometric analysis of thorium

    International Nuclear Information System (INIS)

    Palmieri, H.E.L.

    1984-01-01

An accurate and precise determination of thorium is proposed. A precision of about 0.1% is required for the determination of macroquantities of processed thorium. After an extensive literature search on this subject, spectrophotometric titration was chosen, using disodium ethylenediaminetetraacetate (EDTA) solution and alizarin-S as indicator. In order to obtain such precision, a precisely measured amount of 0.025 M EDTA solution was added, and the titration was completed with less than 5 ml of 0.0025 M EDTA solution. The end-point is usually located graphically, by plotting added titrant versus absorbance; here it was determined instead by a non-linear least-squares fit, using Fletcher and Powell's minimization method and a computer program. Besides the equivalence point, other titration parameters were determined: the indicator concentration, the absorbance of the metal-indicator complex, and the stability constants of the metal-indicator and metal-EDTA complexes. (Author) [pt

  11. Solution Method and Precision Analysis of Double-difference Dynamic Precise Orbit Determination of BeiDou Navigation Satellite System

    Directory of Open Access Journals (Sweden)

    LIU Weiping

    2016-02-01

Full Text Available To resolve the high correlation between the transverse element of GEO orbits and the double-difference ambiguities, the classical double-difference dynamic method is improved, and a method to determine precise BeiDou satellite orbits using carrier phase and phase-smoothed pseudo-range is proposed. The feasibility of the method is discussed, and its influence on ambiguity fixing is analyzed. Considering the characteristics of BeiDou, a method to fix the double-difference ambiguities of BeiDou satellites by QIF is derived. Real data analysis shows that the new method, which reduces the correlation while preserving precision, is better than the classical double-difference dynamic method. Ambiguity fixing by QIF performs well, but the overall ambiguity fixing success rate is not high, so the precision of BeiDou orbits cannot be clearly improved after ambiguity fixing.
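For readers unfamiliar with the observable used here, the following sketch shows how a double-difference is formed from raw carrier-phase measurements of two receivers and two satellites (illustrative numbers only; the paper's processing chain is far more involved):

```python
def single_difference(phase_rx_a, phase_rx_b):
    """Between-receiver difference for one satellite: cancels the
    satellite clock error common to both receivers."""
    return phase_rx_a - phase_rx_b

def double_difference(phi_a_i, phi_b_i, phi_a_j, phi_b_j):
    """Between-receiver, between-satellite difference: additionally
    cancels both receiver clock errors, leaving geometry plus an
    integer ambiguity (fixed, e.g., by a QIF-type strategy)."""
    return (single_difference(phi_a_i, phi_b_i)
            - single_difference(phi_a_j, phi_b_j))

# Carrier-phase observations (cycles) of satellites i, j at receivers A, B.
dd = double_difference(12345.25, 12340.75, 23456.50, 23450.00)
print(f"double-difference observable: {dd:.2f} cycles")
```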

  12. An Integrative Object-Based Image Analysis Workflow for Uav Images

    Science.gov (United States)

    Yu, Huai; Yan, Tianheng; Yang, Wen; Zheng, Hong

    2016-06-01

In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya'an earthquake demonstrate the effectiveness and efficiency of our proposed method.

  13. AN INTEGRATIVE OBJECT-BASED IMAGE ANALYSIS WORKFLOW FOR UAV IMAGES

    Directory of Open Access Journals (Sweden)

    H. Yu

    2016-06-01

Full Text Available In this work, we propose an integrative framework to process UAV images. The overall process can be viewed as a pipeline consisting of geometric and radiometric corrections, subsequent panoramic mosaicking and hierarchical image segmentation for later Object Based Image Analysis (OBIA). More precisely, we first introduce an efficient image stitching algorithm after the geometric calibration and radiometric correction, which employs fast feature extraction and matching by combining the local difference binary descriptor and locality-sensitive hashing. We then use a Binary Partition Tree (BPT) representation for the large mosaicked panoramic image, which starts from an initial partition obtained by an over-segmentation algorithm, i.e., simple linear iterative clustering (SLIC). Finally, we build an object-based hierarchical structure by fully considering the spectral and spatial information of the super-pixels and their topological relationships. Moreover, an optimal segmentation is obtained by filtering the complex hierarchies into simpler ones according to criteria such as uniform homogeneity and semantic consistency. Experimental results on processing the post-seismic UAV images of the 2013 Ya’an earthquake demonstrate the effectiveness and efficiency of our proposed method.
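A minimal sketch of the initial-partition step with scikit-image follows (a stand-in image and assumed parameter values; the papers build a full BPT on top of this partition):

```python
import numpy as np
from skimage.color import rgb2gray
from skimage.data import astronaut
from skimage.segmentation import slic

image = astronaut()   # stand-in for a mosaicked UAV panorama

# SLIC over-segmentation: the initial partition from which a Binary
# Partition Tree would be built by iteratively merging regions.
superpixels = slic(image, n_segments=500, compactness=10.0, start_label=0)

# One spectral feature per superpixel (mean intensity), the kind of
# attribute a BPT merge criterion combines with spatial relationships.
gray = rgb2gray(image)
n = superpixels.max() + 1
means = np.array([gray[superpixels == s].mean() for s in range(n)])
print(f"{n} superpixels; mean-intensity range "
      f"{means.min():.2f}-{means.max():.2f}")
```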

  14. An Image Analysis Method for the Precise Selection and Quantitation of Fluorescently Labeled Cellular Constituents

    Science.gov (United States)

    Agley, Chibeza C.; Velloso, Cristiana P.; Lazarus, Norman R.

    2012-01-01

The accurate measurement of the morphological characteristics of cells with nonuniform conformations presents difficulties. We report here a straightforward method using immunofluorescent staining and the commercially available imaging program Adobe Photoshop, which allows objective and precise information to be gathered on irregularly shaped cells. We have applied this measurement technique to the analysis of human muscle cells and their immunologically marked intracellular constituents, as these cells are prone to adopting a highly branched phenotype in culture. This method can overcome many of the long-standing limitations of conventional approaches for quantifying muscle cell size in vitro. In addition, wider applications of Photoshop as a quantitative and semiquantitative tool in immunocytochemistry are explored. PMID:22511600

  15. Object-sensitive Type Analysis of PHP

    NARCIS (Netherlands)

    Van der Hoek, Henk Erik; Hage, J

    2015-01-01

    In this paper we develop an object-sensitive type analysis for PHP, based on an extension of the notion of monotone frameworks to deal with the dynamic aspects of PHP, and following the framework of Smaragdakis et al. for object-sensitive analysis. We consider a number of instantiations of the

  16. Precise Truss Assembly Using Commodity Parts and Low Precision Welding

    Science.gov (United States)

    Komendera, Erik; Reishus, Dustin; Dorsey, John T.; Doggett, W. R.; Correll, Nikolaus

    2014-01-01

Hardware and software design and system integration for an intelligent precision jigging robot (IPJR), which allows high precision assembly using commodity parts and low-precision bonding, are described. Preliminary 2D experiments, motivated by the problem of assembling space telescope optical benches and very large manipulators on orbit using inexpensive stock hardware and low-precision welding, are also described. An IPJR is a robot that acts as the precise "jigging", holding parts of a local structure assembly site in place while an external low-precision assembly agent cuts and welds members. The prototype presented in this paper allows an assembly agent (for this prototype, a human using only low-precision tools) to assemble a 2D truss made of wooden dowels to a precision on the order of millimeters over a span on the order of meters. The analysis of the assembly error and the results of building a square structure and a ring structure are discussed. Options for future work, extending the IPJR paradigm to building 3D structures at micron precision, are also summarized.

  17. Accuracy and precision of oscillometric blood pressure in standing conscious horses

    DEFF Research Database (Denmark)

    Olsen, Emil; Pedersen, Tilde Louise Skovgaard; Robinson, Rebecca

    2016-01-01

from a teaching and research herd. HYPOTHESIS/OBJECTIVE: To evaluate the accuracy and precision of systolic arterial pressure (SAP), diastolic arterial pressure (DAP), and mean arterial pressure (MAP) in conscious horses obtained with an oscillometric NIBP device when compared to invasively measured...... administration. Agreement analysis with replicate measures was utilized to calculate bias (accuracy) and standard deviation (SD) of bias (precision). RESULTS: A total of 252 pairs of invasive arterial BP and NIBP measurements were analyzed. Compared to the direct BP measures, the NIBP MAP had an accuracy of -4 mm Hg and a precision of 10 mm Hg. SAP had an accuracy of -8 mm Hg and a precision of 17 mm Hg, and DAP had an accuracy of -7 mm Hg and a precision of 14 mm Hg. CONCLUSIONS AND CLINICAL RELEVANCE: MAP from the evaluated NIBP monitor is accurate and precise in the adult horse across a range of BP...
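The agreement analysis reported here reduces, in its simplest form, to the mean and standard deviation of paired differences. The sketch below uses invented readings and ignores the replicate-measure structure the authors accounted for:

```python
import numpy as np

def agreement(reference, test):
    """Bias (accuracy) = mean difference; precision = SD of differences."""
    d = np.asarray(test, float) - np.asarray(reference, float)
    return d.mean(), d.std(ddof=1)

# Invented paired MAP readings (mm Hg): invasive reference vs NIBP.
invasive = np.array([92.0, 88.0, 101.0, 95.0, 84.0, 90.0])
nibp = np.array([87.0, 85.0, 96.0, 92.0, 79.0, 86.0])
bias, sd = agreement(invasive, nibp)
print(f"bias {bias:+.1f} mm Hg, precision (SD of bias) {sd:.1f} mm Hg")
```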

  18. Neutron activation analysis of limestone objects

    International Nuclear Information System (INIS)

    Meyers, P.; Van Zelst, L.

    1977-01-01

The elemental composition of samples from limestone objects was determined by neutron activation analysis to investigate whether this technique can be used to distinguish between objects made of limestone from different sources. Samples weighing between 0.2-2 grams were obtained by drilling from a series of ancient Egyptian and medieval Spanish objects. Analysis was performed on aliquots varying in weight from 40-100 milligrams. The following elements were determined quantitatively: Na, K, Rb, Cs, Ba, Sc, La, Ce, Sm, Eu, Hf, Th, Ta, Cr, Mn, Fe, Co and Zn. The data on Egyptian limestones indicate that, because of the inhomogeneous nature of the stone, 0.2-2 gram samples may not be representative of an entire object. Nevertheless, multivariate statistical methods produced a clear distinction between objects originating from the Luxor area (ancient Thebes) and objects found north of Luxor. The Spanish limestone studied appeared to be more homogeneous. Samples from stylistically related objects have similar elemental compositions, while relatively large differences were observed between objects having no relationship other than the common provenance of medieval Spain. (orig.) [de

  19. Automated analysis of art object surfaces using time-averaged digital speckle pattern interferometry

    Science.gov (United States)

    Lukomski, Michal; Krzemien, Leszek

    2013-05-01

Technical development and practical evaluation of a laboratory-built, out-of-plane digital speckle pattern interferometer (DSPI) are reported. The instrument was used for non-invasive, non-contact detection and characterization of early-stage damage, such as fracturing and layer separation, of painted objects of art. A fully automated algorithm was developed for recording and analysis of vibrating objects utilizing continuous-wave laser light. The algorithm uses direct numerical fitting or the Hilbert transformation for an independent, quantitative evaluation of the Bessel function at every point of the investigated surface. The procedure does not require phase modulation and thus can be implemented within any, even the simplest, DSPI apparatus. The proposed deformation analysis is fast and computationally inexpensive. Diagnosis of the physical state of the surface of a panel painting attributed to Nicolaus Haberschrack (a late-mediaeval painter active in Krakow) from the collection of the National Museum in Krakow is presented as an example of an in situ application of the developed methodology. It has allowed the effectiveness of the deformation analysis to be evaluated for the surface of a real painting (heterogeneous colour and texture) in a conservation studio where the vibration level was considerably higher than in the laboratory. It has been established that the methodology, which offers automatic analysis of the interferometric fringe patterns, has considerable potential to facilitate and render more precise the condition surveys of works of art.
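The Bessel-function evaluation at the heart of this method can be illustrated point-wise: time-averaged DSPI fringe brightness falls off as J0 squared of the vibration amplitude, so amplitude can be recovered by inverting that law on the first monotonic branch. This is a simplified sketch with an assumed He-Ne wavelength, not the authors' full-surface fitting or Hilbert-transform algorithm:

```python
import numpy as np
from scipy.optimize import brentq
from scipy.special import j0

WAVELENGTH = 632.8e-9          # assumed He-Ne laser wavelength, m
K = 4.0 * np.pi / WAVELENGTH   # out-of-plane sensitivity factor

def amplitude_from_brightness(b):
    """Invert the time-averaged fringe law b = J0(K*a)**2 on the first
    monotonic branch of J0 to recover the vibration amplitude a."""
    first_zero = 2.404825557695773 / K      # first zero of J0
    return brentq(lambda a: j0(K * a)**2 - b, 0.0, first_zero)

# Normalized fringe brightness at three pixels (1.0 = stationary surface).
for b in (0.9, 0.5, 0.1):
    amp_nm = amplitude_from_brightness(b) * 1e9
    print(f"brightness {b:.1f} -> vibration amplitude {amp_nm:.1f} nm")
```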

  20. Bias, precision and statistical power of analysis of covariance in the analysis of randomized trials with baseline imbalance: a simulation study

    Science.gov (United States)

    2014-01-01

    Background Analysis of variance (ANOVA), change-score analysis (CSA) and analysis of covariance (ANCOVA) respond differently to baseline imbalance in randomized controlled trials. However, no empirical studies appear to have quantified the differential bias and precision of estimates derived from these methods of analysis, and their relative statistical power, in relation to combinations of levels of key trial characteristics. This simulation study therefore examined the relative bias, precision and statistical power of these three analyses using simulated trial data. Methods 126 hypothetical trial scenarios were evaluated (126 000 datasets), each with continuous data simulated by using a combination of levels of: treatment effect; pretest-posttest correlation; direction and magnitude of baseline imbalance. The bias, precision and power of each method of analysis were calculated for each scenario. Results Compared to the unbiased estimates produced by ANCOVA, both ANOVA and CSA are subject to bias, in relation to pretest-posttest correlation and the direction of baseline imbalance. Additionally, ANOVA and CSA are less precise than ANCOVA, especially when pretest-posttest correlation ≥ 0.3. When groups are balanced at baseline, ANCOVA is at least as powerful as the other analyses. Apparently greater power of ANOVA and CSA at certain imbalances is achieved in respect of a biased treatment effect. Conclusions Across a range of correlations between pre- and post-treatment scores and at varying levels and direction of baseline imbalance, ANCOVA remains the optimum statistical method for the analysis of continuous outcomes in RCTs, in terms of bias, precision and statistical power. PMID:24712304
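The core of such a simulation is compact. The sketch below (invented parameter values, statsmodels for the fits) generates one trial with deliberate baseline imbalance and shows the ANOVA estimate absorbing the imbalance while ANCOVA adjusts for it:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n, rho, effect, imbalance = 100, 0.6, 0.5, 0.3   # assumed scenario values

# One simulated trial: baseline imbalance is built in deliberately.
group = np.repeat([0, 1], n)
pre = rng.normal(0.0, 1.0, 2 * n) + imbalance * group
post = (rho * pre + np.sqrt(1 - rho**2) * rng.normal(0.0, 1.0, 2 * n)
        + effect * group)
df = pd.DataFrame({"group": group, "pre": pre, "post": post})

# ANOVA ignores baseline; ANCOVA adjusts for it.
anova = smf.ols("post ~ group", df).fit()
ancova = smf.ols("post ~ group + pre", df).fit()
print(f"true effect {effect:.2f} | "
      f"ANOVA estimate {anova.params['group']:+.2f} | "
      f"ANCOVA estimate {ancova.params['group']:+.2f}")
```

With these settings the ANOVA estimate is biased by roughly rho times the imbalance (about 0.18 here) on average, while the ANCOVA estimate is not.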

  1. High Dynamics and Precision Optical Measurement Using a Position Sensitive Detector (PSD in Reflection-Mode: Application to 2D Object Tracking over a Smart Surface

    Directory of Open Access Journals (Sweden)

    Ioan Alexandru Ivan

    2012-12-01

Full Text Available When related to a single and good-contrast object or a laser spot, position sensing, or sensitive, detectors (PSDs) have a series of advantages over classical camera sensors, including good positioning accuracy for a fast response time and very simple signal conditioning circuits. To test the performance of this kind of sensor for microrobotics, we have made a comparative analysis between a precise but slow video camera and a custom-made fast PSD system applied to the tracking of a diffuse-reflectivity object transported by a pneumatic microconveyor called Smart-Surface. Until now, the fast system dynamics prevented full control of the smart surface by visual servoing, unless a very expensive high frame rate camera was used. We have built and tested a custom and low-cost PSD-based embedded circuit, optically connected with a camera to a single objective by means of a beam splitter. A stroboscopic light source enhanced the resolution. The obtained results showed good linearity and a fast (over 500 frames per second) response time, which will enable future closed-loop control using the PSD.

  2. Numerical Analysis Objects

    Science.gov (United States)

    Henderson, Michael

    1997-08-01

    The Numerical Analysis Objects project (NAO) is a project in the Mathematics Department of IBM's TJ Watson Research Center. While there are plenty of numerical tools available today, it is not an easy task to combine them into a custom application. NAO is directed at the dual problems of building applications from a set of tools, and creating those tools. There are several "reuse" projects, which focus on the problems of identifying and cataloging tools. NAO is directed at the specific context of scientific computing. Because the type of tools is restricted, problems such as tools with incompatible data structures for input and output, and dissimilar interfaces to tools which solve similar problems can be addressed. The approach we've taken is to define interfaces to those objects used in numerical analysis, such as geometries, functions and operators, and to start collecting (and building) a set of tools which use these interfaces. We have written a class library (a set of abstract classes and implementations) in C++ which demonstrates the approach. Besides the classes, the class library includes "stub" routines which allow the library to be used from C or Fortran, and an interface to a Visual Programming Language. The library has been used to build a simulator for petroleum reservoirs, using a set of tools for discretizing nonlinear differential equations that we have written, and includes "wrapped" versions of packages from the Netlib repository. Documentation can be found on the Web at "http://www.research.ibm.com/nao". I will describe the objects and their interfaces, and give examples ranging from mesh generation to solving differential equations.

  3. An Information-Based Approach to Precision Analysis of Indoor WLAN Localization Using Location Fingerprint

    Directory of Open Access Journals (Sweden)

    Mu Zhou

    2015-12-01

Full Text Available In this paper, we propose a novel information-based approach to precision analysis of indoor wireless local area network (WLAN) localization using location fingerprints. First, by using the Fisher information matrix (FIM), we derive the fundamental limit of WLAN fingerprint-based localization precision, considering different signal distributions in characterizing the variation of received signal strengths (RSSs) in the target environment. We then explore the relationship between the localization precision and access point (AP) placement, which can provide valuable suggestions for the design of highly precise localization systems. Second, we adopt the heuristic simulated annealing (SA) algorithm to optimize the AP locations so as to approach the fundamental limit of localization precision. Finally, extensive simulations and experiments are conducted in both regular line-of-sight (LOS) and irregular non-line-of-sight (NLOS) environments to demonstrate that the proposed approach can not only effectively improve the WLAN fingerprint-based localization precision, but also reduce the time overhead.
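A hedged sketch of the FIM computation for RSS-based localization follows, under a standard log-distance path-loss idealization with i.i.d. Gaussian shadowing (the paper considers more general signal distributions; all numbers are assumptions):

```python
import numpy as np

def rss_position_bound(target, aps, path_loss_exp=3.0, sigma_db=4.0):
    """Cramer-Rao bound on 2-D RMS position error for RSS localization
    under a log-distance path-loss model with i.i.d. Gaussian shadowing."""
    c = 10.0 * path_loss_exp / np.log(10.0)
    fim = np.zeros((2, 2))
    for ap in aps:
        diff = target - ap
        d2 = diff @ diff
        grad = -c * diff / d2          # gradient of mean RSS w.r.t. position
        fim += np.outer(grad, grad) / sigma_db**2
    return float(np.sqrt(np.trace(np.linalg.inv(fim))))

# Four APs at the corners of a 10 m x 10 m room (assumed layout).
aps = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([5.0, 5.0])
print(f"localization precision bound at room center: "
      f"{rss_position_bound(target, aps):.2f} m")
```

Evaluating this bound over a grid of target positions, and letting an optimizer such as simulated annealing move the AP coordinates, reproduces the kind of placement study the abstract describes.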

  4. Precision analysis of a multi-slice ultrasound sensor for non-invasive 3D kinematic analysis of knee joints.

    Science.gov (United States)

    Masum, Md Abdullah; Lambert, Andrew J; Pickering, Mark R; Scarvell, J M; Smith, P N

    2012-01-01

    Currently the standard clinical practice for measuring the motion of bones in a knee joint with sufficient precision involves implanting tantalum beads into the bones to act as fiducial markers prior to imaging using X-ray equipment. This procedure is invasive in nature and exposure to ionizing radiation imposes a cancer risk and the patient's movements are confined to a narrow field of view. In this paper, an ultrasound based system for non-invasive kinematic evaluation of knee joints is proposed. The results of an initial analysis show that this system can provide the precision required for non-invasive motion analysis while the patient performs normal physical activities.

  5. Maintaining high precision of isotope ratio analysis over extended periods of time.

    Science.gov (United States)

    Brand, Willi A

    2009-06-01

Stable isotope ratios are reliable and long-lasting process tracers. In order to compare data from different locations or different sampling times at a high level of precision, a measurement strategy must include reliable traceability to an international stable isotope scale via a reference material (RM). Since these international RMs are available only in low quantities, we have developed our own analysis schemes involving laboratory working RMs. In addition, quality assurance RMs are used to control the long-term performance of the delta-value assignments. The analysis schemes allow the construction of quality assurance performance charts over years of operation. In this contribution, the performance of three typical techniques established in IsoLab at the MPI-BGC in Jena is discussed. The techniques are (1) isotope ratio mass spectrometry with an elemental analyser for delta(15)N and delta(13)C analysis of bulk (organic) material, (2) high precision delta(13)C and delta(18)O analysis of CO(2) in clean-air samples, and (3) stable isotope analysis of water samples using a high-temperature reaction with carbon. In addition, reference strategies on a laser ablation system for high spatial resolution delta(13)C analysis in tree rings are exemplified briefly.

  6. Drone-based Object Counting by Spatially Regularized Regional Proposal Network

    OpenAIRE

    Hsieh, Meng-Ru; Lin, Yen-Liang; Hsu, Winston H.

    2017-01-01

Existing counting methods often adopt regression-based approaches and cannot precisely localize the target objects, which hinders further analysis (e.g., high-level understanding and fine-grained classification). In addition, most prior work mainly focuses on counting objects in static environments with fixed cameras. Motivated by the advent of unmanned flying vehicles (i.e., drones), we are interested in detecting and counting objects in such dynamic environments. We propose Layout Prop...

  7. Object-oriented analysis and design

    CERN Document Server

    Deacon, John

    2005-01-01

    John Deacon’s in-depth, highly pragmatic approach to object-oriented analysis and design, demonstrates how to lay the foundations for developing the best possible software. Students will learn how to ensure that analysis and design remain focused and productive. By working through the book, they will gain a solid working knowledge of best practices in software development.

  8. Iso-precision scaling of digitized mammograms to facilitate image analysis

    International Nuclear Information System (INIS)

    Karssmeijer, N.; van Erning, L.

    1991-01-01

This paper reports on a 12 bit CCD camera equipped with a linear sensor of 4096 photodiodes which is used to digitize conventional mammographic films. An iso-precision conversion of the pixel values is performed to transform the image data to a scale on which the image noise is equal at each level. For this purpose, film noise and digitization noise have been determined as functions of optical density and pixel size. It appears that only at high optical densities is digitization noise comparable to or larger than film noise. The quantization error caused by compressing images recorded with 12 bits per pixel to 8 bit images by an iso-precision conversion has been calculated as a function of the number of quantization levels. For mammograms digitized in a 4096 × 4096 matrix, the additional error caused by such a scale transform is only about 1.5 percent. An iso-precision scale transform can be advantageous when automated procedures for quantitative image analysis are developed. Especially when detection of signals in noise is the aim, a constant noise level over the whole pixel value range is very convenient. This is demonstrated by applying local thresholding to detect small microcalcifications. Results are compared to those obtained by using logarithmic or linearized scales.
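The iso-precision conversion amounts to a variance-stabilizing rescaling: if sigma(v) is the noise at pixel value v, the new scale u(v) = integral of dv/sigma(v) has constant noise. A minimal sketch with an assumed linear noise model (not the paper's measured film/digitization noise):

```python
import numpy as np

# Assumed noise model: sigma grows linearly with the 12-bit pixel value.
v = np.arange(4096, dtype=float)
sigma = 2.0 + 0.02 * v

# Iso-precision scale u(v) = integral of dv / sigma(v) (trapezoid rule);
# equal steps in u then carry equal noise at every level.
u = np.concatenate([[0.0], np.cumsum(2.0 / (sigma[1:] + sigma[:-1]))])
lut = np.round(u / u[-1] * 255.0).astype(np.uint8)   # 12-bit -> 8-bit LUT

image12 = np.random.default_rng(2).integers(0, 4096, (64, 64))
image8 = lut[image12]                                # apply the conversion
print(f"8-bit range after conversion: {image8.min()}-{image8.max()}")
```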

  9. Setting Organizational Key Performance Indicators in the Precision Machine Industry

    Directory of Open Access Journals (Sweden)

    Mei-Hsiu Hong

    2015-11-01

Full Text Available The aim of this research is to define (or set) organizational key performance indicators (KPIs) in the precision machine industry using the concept of core competence and the supply chain operations reference (SCOR) model. The research is conducted in three steps. In the first step, a benchmarking study is conducted to collect major items of core competence and to group them into main categories in order to form a foundation for the research. In the second step, a case-company questionnaire and interviews are conducted to identify the key factors of core competence in the precision machine industry. The analysis is conducted along four dimensions, and hence several analysis rounds are completed. Questionnaire data are analyzed with grey relational analysis (GRA), which resulted in 5–6 key factors in each dimension or sub-dimension. Based on the conducted interviews, 13 of these identified key factors are separated into one organization objective, five key factors of core competence and seven key factors of core ability. In the final step, organizational KPIs are defined (or set) for the five identified key factors of core competence. The most competitive core abilities for each of the five key factors are established. After that, organizational KPIs are set based on the core abilities within three main categories of KPIs (departmental, office grade and hierarchical) for each key factor. The developed KPI system, based on organizational objectives, core competences, and core abilities, allows enterprises to handle dynamic market demand and business environments, as well as changes in overall corporate objectives.
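For readers unfamiliar with GRA, the sketch below computes grey relational grades for a small invented score matrix, assuming benefit-type ("larger is better") criteria; the study's actual questionnaire processing is not reproduced here:

```python
import numpy as np

def grey_relational_grades(matrix, zeta=0.5):
    """Grey relational analysis: rank alternatives (rows) against the
    ideal reference formed by the best value of each criterion (column).
    Assumes all criteria are benefit-type."""
    x = (matrix - matrix.min(0)) / (matrix.max(0) - matrix.min(0))
    delta = np.abs(x.max(0) - x)        # distance to the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)           # grey relational grade per row

scores = np.array([[7.0, 8.0, 6.5],
                   [9.0, 6.0, 7.0],
                   [8.0, 7.5, 8.5]])
print(grey_relational_grades(scores))   # higher grade = stronger factor
```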

  10. BEAMGAA. A chance for high precision analysis of big samples

    International Nuclear Information System (INIS)

    Goerner, W.; Berger, A.; Haase, O.; Segebade, Chr.; Alber, D.; Monse, G.

    2005-01-01

In activation analysis of traces in small samples, the non-equivalence of the activating radiation doses of the sample and the calibration material gives rise to systematic errors that are sometimes tolerable. Conversely, analysis of major components usually demands high trueness and precision. To meet this, beam geometry activation analysis (BEAMGAA) procedures have been developed for instrumental photon (IPAA) and neutron activation analysis (INAA) in which the activating neutron/photon beam exhibits broad, flat-topped characteristics. This results in a very low lateral activating flux gradient compared to known radiation facilities, albeit at significantly lower flux density. The axial flux gradient can be accounted for by a monitor-sample-monitor assembly. As a first approach, major components were determined in high purity substances, as well as selenium in a cattle fodder additive. (author)

  11. Binocular optical axis parallelism detection precision analysis based on Monte Carlo method

    Science.gov (United States)

    Ying, Jiaju; Liu, Bingqi

    2018-02-01

According to the working principle of the digital calibration instrument for binocular photoelectric instrument optical axis parallelism, and considering all components of the instrument, the various factors affecting system precision are analyzed and a precision analysis model is established. Based on the error distribution, the Monte Carlo method is used to analyze the relationship between the comprehensive error and the change of the center coordinate of the circle target image. The method can further guide the error distribution and optimize control of the factors which have the greatest influence on the comprehensive error, improving the measurement accuracy of the optical axis parallelism digital calibration instrument.
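Monte Carlo precision analysis of this kind boils down to sampling each error source from its assumed distribution and propagating the samples through the measurement equation. A generic sketch with invented error budgets (not the instrument's actual model):

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000   # Monte Carlo samples

# Assumed 1-sigma error sources (illustrative values only):
focal_len = rng.normal(500.0, 0.5, N)     # collimator focal length, mm
centroid = rng.normal(0.0, 0.002, N)      # circle-target centroid error, mm
alignment = rng.normal(0.0, 5e-6, N)      # mechanical alignment error, rad

# Simplified measurement equation: image offset over focal length,
# plus a direct angular alignment term.
parallelism = centroid / focal_len + alignment

print(f"combined error: {parallelism.std() * 1e6:.1f} microrad (1 sigma)")
```

Sorting the variance contributions of each sampled term then shows which factor dominates the comprehensive error, which is the kind of guidance for error distribution the abstract describes.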

  12. Analytical techniques applied to study cultural heritage objects

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N., E-mail: rizzutto@if.usp.br [Universidade de Sao Paulo (USP), SP (Brazil). Instituto de Fisica

    2015-07-01

The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Further expanding the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. To improve the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup

  13. Analytical techniques applied to study cultural heritage objects

    International Nuclear Information System (INIS)

    Rizzutto, M.A.; Curado, J.F.; Bernardes, S.; Campos, P.H.O.V.; Kajiya, E.A.M.; Silva, T.F.; Rodrigues, C.L.; Moro, M.; Tabacniks, M.; Added, N.

    2015-01-01

The scientific study of artistic and cultural heritage objects has been routinely performed in Europe and the United States for decades. In Brazil this research area is growing, mainly through the use of physical and chemical characterization methods. Since 2003 the Group of Applied Physics with Particle Accelerators of the Physics Institute of the University of Sao Paulo (GFAA-IF) has been working with various methodologies for material characterization and analysis of cultural objects, initially using ion beam analysis performed with Particle Induced X-Ray Emission (PIXE), Rutherford Backscattering (RBS) and, recently, Ion Beam Induced Luminescence (IBIL) for the determination of the elements and chemical compounds in the surface layers. These techniques are widely used in the Laboratory of Materials Analysis with Ion Beams (LAMFI-USP). Recently, the GFAA expanded the studies to other possibilities of analysis enabled by imaging techniques that, coupled with elemental and compositional characterization, provide a better understanding of the materials and techniques used in the creative process in the manufacture of objects. The imaging analyses, mainly used to examine and document artistic and cultural heritage objects, are performed through images with visible light, infrared reflectography (IR), fluorescence with ultraviolet radiation (UV), tangential light and digital radiography. Further expanding the possibilities of analysis, new capabilities were added using portable equipment such as Energy Dispersive X-Ray Fluorescence (ED-XRF) and Raman Spectroscopy that can be used for analysis 'in situ' at the museums. The results of these analyses are providing valuable information on the manufacturing process and have provided new information on objects of different University of Sao Paulo museums. To improve the arsenal of cultural heritage analysis, a 3D robotic stage was recently constructed for the precise positioning of samples in the external beam setup

  14. August Dvorak (1894-1975): Early expressions of applied behavior analysis and precision teaching

    Science.gov (United States)

    Joyce, Bonnie; Moxley, Roy A.

    1988-01-01

    August Dvorak is best known for his development of the Dvorak keyboard. However, Dvorak also adapted and applied many behavioral and scientific management techniques to the field of education. Taken collectively, these techniques are representative of many of the procedures currently used in applied behavior analysis, in general, and especially in precision teaching. The failure to consider Dvorak's instructional methods may explain some of the discrepant findings in studies which compare the efficiency of the Dvorak to the standard keyboard. This article presents a brief background on the development of the standard (QWERTY) and Dvorak keyboards, describes parallels between Dvorak's teaching procedures and those used in precision teaching, reviews some of the comparative research on the Dvorak keyboard, and suggests some implications for further research in applying the principles of behavior analysis. PMID:22477993

  15. High Precision Measurements of W Boson Transverse Momentum at 13 TeV

    CERN Document Server

    Eppler, Drew Galen

    2018-01-01

The precision of the transverse momentum distribution of the W-boson is crucial for reducing the uncertainty of current and future measurements of the W-boson mass. For this kind of analysis, several new recoil algorithms based on particle flow objects have been recently developed. This note studies the reconstruction performance of the different hadronic recoil algorithms.

  16. Analysis and Optimization of Dynamic Measurement Precision of Fiber Optic Gyroscope

    Directory of Open Access Journals (Sweden)

    Hui Li

    2013-01-01

Full Text Available In order to improve the dynamic performance of high precision interferometer fiber optic gyroscopes (IFOG), the influencing factors of the fast response characteristics are analyzed based on a proposed assistant design setup, and a high dynamic detection method is proposed to suppress the adverse effects of the key influencing factors. The assistant design platform is built using virtual instrument technology for the IFOG, which can monitor the closed-loop state variables in real time for analyzing the influence of both the optical components and the detection circuit on the dynamic performance of the IFOG. The analysis results indicate that nonlinearity of the optical Sagnac effect, optical parameter uncertainty, dynamic characteristics of internal modules and time delay of the signal detection circuit are the major causes of dynamic performance deterioration, which can induce potential system instability in practical control systems. By taking all these factors into consideration, we design a robust control algorithm to realize high dynamic closed-loop detection of the IFOG. Finally, experiments show that the improved 0.01 deg/h high precision IFOG with the proposed control algorithm can achieve fast tracking and good dynamic measurement precision.

  17. Precision and within- and between-day variation of bioimpedance parameters in children aged 2-14 years

    DEFF Research Database (Denmark)

    Andersen, Trine B; Jødal, Lars; Arveschoug, Anne

    2011-01-01

    BACKGROUND & AIMS: Bioimpedance spectroscopy (BIS) offers the possibility to perform rapid estimates of fluid distribution and body composition. Few studies, however, have addressed the precision and biological variation in a pediatric population. Our objectives were to evaluate precision.......4-14.9 years) had one series measured on day one (precision population). Forty-four children had a second series on day one (within-day sub-population). Thirty-two children had a series measured on the next day (between-day sub-population). Each measurement series consisted of three repeated measurements....... A linear mixed model was used for statistical analysis. RESULTS: The precision was 0.3-0.8% in children ≥6 years and 0.5-2.4% in children...

  18. Precision surveying the principles and geomatics practice

    CERN Document Server

    Ogundare, John Olusegun

    2016-01-01

    A comprehensive overview of high precision surveying, including recent developments in geomatics and their applications This book covers advanced precision surveying techniques, their proper use in engineering and geoscience projects, and their importance in the detailed analysis and evaluation of surveying projects. The early chapters review the fundamentals of precision surveying: the types of surveys; survey observations; standards and specifications; and accuracy assessments for angle, distance and position difference measurement systems. The book also covers network design and 3-D coordinating systems before discussing specialized topics such as structural and ground deformation monitoring techniques and analysis, mining surveys, tunneling surveys, and alignment surveys. Precision Surveying: The Principles and Geomatics Practice: * Covers structural and ground deformation monitoring analysis, advanced techniques in mining and tunneling surveys, and high precision alignment of engineering structures *...

  19. Object-Oriented Analysis, Structured Analysis, and Jackson System Development

    NARCIS (Netherlands)

    Van Assche, F.; Wieringa, Roelf J.; Moulin, B.; Rolland, C

    1991-01-01

    Conceptual modeling is the activity of producing a conceptual model of an actual or desired version of a universe of discourse (UoD). In this paper, two methods of conceptual modeling are compared, structured analysis (SA) and object-oriented analysis (OOA). This is done by transforming a model

  20. Selection of Objective Function For Imbalanced Classification: An Industrial Case Study

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2017-01-01

In this article we discuss the issue of selecting a suitable objective function for a Genetic Algorithm to solve an imbalanced classification problem. More precisely, first we discuss the need for a specialized objective function to solve a real classification problem from our industrial partner and the... and then we compare the results of our proposed objective function with commonly used candidates serving this purpose. Our comparison is based on the analysis of real data collected during the quality control stages of the manufacturing process....
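The article's proposed objective function is truncated out of this record. As an illustration of why raw accuracy is a poor fitness measure on imbalanced data, a commonly used alternative such as the G-mean can serve as a GA fitness function (this is a generic choice, not necessarily the authors'):

```python
import numpy as np

def gmean_fitness(y_true, y_pred):
    """Geometric mean of sensitivity and specificity; rewards balanced
    performance on both classes instead of raw accuracy."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))
    tn = np.sum((y_true == 0) & (y_pred == 0))
    sens = tp / max(np.sum(y_true == 1), 1)
    spec = tn / max(np.sum(y_true == 0), 1)
    return float(np.sqrt(sens * spec))

# 95:5 imbalance: 'always predict majority' scores 95% accuracy
# yet earns zero fitness, while a perfect classifier earns 1.0.
y = np.array([0] * 95 + [1] * 5)
print(gmean_fitness(y, np.zeros_like(y)))   # 0.0
print(gmean_fitness(y, y))                  # 1.0
```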

  1. Precision analysis for standard deviation measurements of immobile single fluorescent molecule images.

    Science.gov (United States)

    DeSantis, Michael C; DeCenzo, Shawn H; Li, Je-Luen; Wang, Y M

    2010-03-29

Standard deviation measurements of intensity profiles of stationary single fluorescent molecules are useful for studying axial localization, molecular orientation, and a fluorescence imaging system's spatial resolution. Here we report on the analysis of the precision of standard deviation measurements of intensity profiles of single fluorescent molecules imaged using an EMCCD camera. We have developed an analytical expression for the standard deviation measurement error of a single image as a function of the total number of detected photons, the background photon noise, and the camera pixel size. The theoretical results agree well with the experimental, simulation, and numerical integration results. Using this expression, we show that single-molecule standard deviation measurements offer nanometer precision for a large range of experimental parameters.
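The analytical expression itself is not reproduced in this record, but its subject is easy to probe numerically: the sketch below Monte Carlo-samples shot-noise-limited (Poisson) images of a 1-D Gaussian profile and reports the spread of the resulting standard-deviation estimates (assumed photon count and PSF width; no background or EMCCD excess noise is modeled):

```python
import numpy as np

rng = np.random.default_rng(3)
photons, sigma_psf, size = 2000, 1.3, 15      # assumed values (pixels)

# Expected 1-D Gaussian intensity profile scaled to the photon budget.
x = np.arange(size) - size // 2
profile = np.exp(-x**2 / (2 * sigma_psf**2))
profile *= photons / profile.sum()

# Monte Carlo: estimate the profile's standard deviation from many
# Poisson realizations and measure the estimator's precision.
estimates = []
for _ in range(2000):
    counts = rng.poisson(profile)
    mu = (x * counts).sum() / counts.sum()
    var = ((x - mu)**2 * counts).sum() / counts.sum()
    estimates.append(np.sqrt(var))
print(f"std-dev estimate: {np.mean(estimates):.3f} "
      f"+/- {np.std(estimates):.3f} pixels")
```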

  2. A high precision mass spectrometer for hydrogen isotopic analysis of water samples

    International Nuclear Information System (INIS)

    Murthy, M.S.; Prahallada Rao, B.S.; Handu, V.K.; Satam, J.V.

    1979-01-01

A high precision mass spectrometer with two ion collector assemblies and a direct on-line reduction facility (with uranium at 700 °C) for water samples for hydrogen isotopic analysis has been designed and developed. The ion source in particular gives high sensitivity and at the same time limits the H₃⁺ ions to a minimum. A digital ratiometer with an H₂⁺ compensator has also been developed. The overall precision obtained on the spectrometer is 0.07% (2σ₁₀). Typical results on the performance of the spectrometer, which has been working for a year and a half, are given. Possible methods of extending the concentration ranges the spectrometer can handle, on both the lower and higher sides, are discussed. Problems of memory between samples are briefly listed. A multiple inlet system to overcome these problems is suggested. This will also enable faster analysis when samples of highly varying concentrations are to be analyzed. A few probable areas in which the spectrometer will shortly be put to use are given. (auth.)

  3. Efficient Tracking of Moving Objects with Precision Guarantees

    DEFF Research Database (Denmark)

    Civilis, Alminas; Jensen, Christian Søndergaard; Nenortaite, Jovita

    2004-01-01

    Sustained advances in wireless communications, geo-positioning, and consumer electronics pave the way to a kind of location-based service that relies on the tracking of the continuously changing positions of an entire population of service users. This type of service is characterized by large...... an object is moving. Empirical performance studies based on a real road network and GPS logs from cars are reported....

  4. Elevation data fitting and precision analysis of Google Earth in road survey

    Science.gov (United States)

    Wei, Haibin; Luan, Xiaohan; Li, Hanchao; Jia, Jiangkun; Chen, Zhao; Han, Leilei

    2018-05-01

Objective: In order to improve the efficiency of road surveys and save manpower and material resources, this paper applies Google Earth to the feasibility study stage of road survey and design. Because Google Earth elevation data lack precision, this paper focuses on several fitting or interpolation methods to improve the data precision, in an effort to meet the accuracy requirements of road survey and design specifications. Method: On the basis of the elevation differences at a limited number of public points, the elevation difference at any other point can be fitted or interpolated. Thus, the precise elevation can be obtained by subtracting the elevation difference from the Google Earth data. Quadratic polynomial surface fitting, cubic polynomial surface fitting, V4 interpolation in MATLAB and a neural network method are used in this paper to process Google Earth elevation data, and internal conformity, external conformity and the cross correlation coefficient are used as evaluation indexes of the data processing effect. Results: There is no fitting difference at the fitting points when using the V4 interpolation method. Its external conformity is the largest and its accuracy improvement the worst, so the V4 interpolation method is ruled out. The internal and external conformity of the cubic polynomial surface fitting method are both better than those of the quadratic polynomial surface fitting method. The neural network method has a fitting effect similar to the cubic polynomial surface fitting method, but fits better in the case of larger elevation differences. Because the neural network method is a less manageable fitting model, the cubic polynomial surface fitting method should be the main method, with the neural network method as an auxiliary in the case of larger elevation differences. Conclusions: Cubic polynomial surface fitting method can obviously
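A minimal sketch of the cubic-polynomial-surface approach follows (invented control-point data; the ten cubic terms below are one common choice of basis, not necessarily the paper's):

```python
import numpy as np

def fit_cubic_surface(x, y, dz):
    """Least-squares cubic polynomial surface fitted to elevation
    differences dz at control points (x, y); returns a predictor."""
    def terms(x, y):
        return np.column_stack([
            np.ones_like(x), x, y, x * y, x**2, y**2,
            x**2 * y, x * y**2, x**3, y**3])
    coeff, *_ = np.linalg.lstsq(terms(x, y), dz, rcond=None)
    return lambda xq, yq: terms(np.atleast_1d(np.asarray(xq, float)),
                                np.atleast_1d(np.asarray(yq, float))) @ coeff

# Control points where the Google Earth elevation error is known (m).
rng = np.random.default_rng(5)
x, y = rng.uniform(0, 1000, 50), rng.uniform(0, 1000, 50)
dz = 3.0 + 0.004 * x - 0.002 * y + rng.normal(0, 0.3, 50)

predict = fit_cubic_surface(x, y, dz)
# Corrected elevation = Google Earth elevation minus predicted difference.
print(f"predicted correction at (500, 500): {predict(500, 500)[0]:.2f} m")
```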

  5. Super-resolution imaging applied to moving object tracking

    Science.gov (United States)

    Swalaganata, Galandaru; Ratna Sulistyaningrum, Dwi; Setiyono, Budi

    2017-10-01

Moving object tracking in a video is a method used to detect and analyze changes that occur in an object being observed. Visual quality and precise localization of the tracked target are highly desired in modern tracking systems. The fact that the tracked object does not always appear clear makes the tracking result less precise; the reasons include low-quality video, system noise, small object size, and other factors. In order to improve the precision of the tracked object, especially for small objects, we propose a two-step solution that integrates a super-resolution technique into the tracking approach. The first step is super-resolution imaging applied to the frame sequence, done by cropping several frames or all of the frames. The second step is tracking on the super-resolved images. Super-resolution imaging is a technique for obtaining high-resolution images from low-resolution images; in this research, a single-frame super-resolution technique is proposed for the tracking approach, as it has the advantage of fast computation time. The method used for tracking is CamShift, whose advantage is a simple calculation based on an HSV color histogram that tolerates variation in the object's shape and color. The computational complexity and large memory requirements of super-resolution and tracking were reduced, and the precision of the tracked target was good. Experiments showed that integrating super-resolution imaging into the tracking technique can track the object precisely over various backgrounds, through shape changes of the object, and in good lighting conditions.
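The tracking stage alone can be sketched with OpenCV's CamShift (a hypothetical video path and initial window; the paper's pipeline applies super-resolution to the frames before this step):

```python
import cv2

cap = cv2.VideoCapture("tracking.avi")        # hypothetical input video
ok, frame = cap.read()
x, y, w, h = 200, 150, 40, 40                 # assumed initial target window

# Hue histogram of the initial region drives the back-projection.
roi_hsv = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([roi_hsv], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)

criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
window = (x, y, w, h)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    rot_box, window = cv2.CamShift(backproj, window, criteria)
    print("tracked window:", window)          # adapts to scale/orientation
```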

  6. A novel algorithm for a precise analysis of subchondral bone alterations

    Science.gov (United States)

    Gao, Liang; Orth, Patrick; Goebel, Lars K. H.; Cucchiarini, Magali; Madry, Henning

    2016-01-01

    Subchondral bone alterations are emerging as considerable clinical problems associated with articular cartilage repair. Their analysis exposes a pattern of variable changes, including intra-lesional osteophytes, residual microfracture holes, peri-hole bone resorption, and subchondral bone cysts. A precise distinction between them is becoming increasingly important. Here, we present a tailored algorithm based on continuous data to analyse subchondral bone changes using micro-CT images, allowing for a clear definition of each entity. We evaluated this algorithm using data sets originating from two large animal models of osteochondral repair. Intra-lesional osteophytes were detected in 3 of 10 defects in the minipig and in 4 of 5 defects in the sheep model. Peri-hole bone resorption was found in 22 of 30 microfracture holes in the minipig and in 17 of 30 microfracture holes in the sheep model. Subchondral bone cysts appeared in 1 microfracture hole in the minipig and in 5 microfracture holes in the sheep model (n = 30 holes each). Calculation of inter-rater agreement (90% agreement) and Cohen’s kappa (kappa = 0.874) revealed that the novel algorithm is highly reliable, reproducible, and valid. Comparison analysis with the best existing semi-quantitative evaluation method was also performed, supporting the enhanced precision of this algorithm. PMID:27596562

  7. Multi-element analysis of unidentified fallen objects from Tatale in ...

    African Journals Online (AJOL)

    A multi-element analysis has been carried out on two fallen objects, # 01 and # 02, using instrumental neutron activation analysis technique. A total of 17 elements were identified in object # 01 while 21 elements were found in object # 02. The two major elements in object # 01 were Fe and Mg, which together constitute ...

  8. Precision Guidance with Impact Angle Requirements

    National Research Council Canada - National Science Library

    Ford, Jason

    2001-01-01

    This paper examines a weapon system precision guidance problem in which the objective is to guide a weapon onto a non-manoeuvring target so that a particular desired angle of impact is achieved using...

  9. Detailed precision and accuracy analysis of swarm parameters from a pulsed Townsend experiment

    Science.gov (United States)

    Haefliger, P.; Franck, C. M.

    2018-02-01

    A newly built pulsed Townsend experimental setup which allows one to measure both electron and ion currents is presented. The principle of pulsed Townsend measurements itself is well established to obtain swarm parameters such as the effective ionization rate coefficient, the density-reduced mobility, and the density-normalized longitudinal diffusion coefficient. The main novelty of the present contribution is a detailed and comprehensive analysis of the entire measurement and evaluation chain with respect to accuracy, precision, and reproducibility. The influence of the input parameters (gap distance, applied voltage, measured pressure, and temperature) is analyzed in detail. An overall accuracy of ±0.5% in the density-reduced electric field (E/N) is achieved, which is close to the theoretically possible limit using the chosen components. The precision of the experimental results is higher than the accuracy. Through an extensive measurement campaign, the repeatability of our measurements proved to be high and similar to the precision. The reproducibility of results at identical E/N is similar to the precision for different distances but decreases for varying pressures. For benchmark purposes, measurements for Ar, CO2, and N2 are presented and compared with our previous experimental setup, simulations, and other experimental references.

  10. A strategic analysis of Business Objects' portal application

    OpenAIRE

    Kristinsson, Olafur Oskar

    2007-01-01

    Business Objects is the leading software firm producing business intelligence software. Business intelligence is a growing market, and small to medium businesses are increasingly looking at business intelligence. Business Objects' flagship product in the enterprise market is Business Objects XI, and for medium-size companies it has Crystal Decisions. Portals are the front end for the two products. InfoView, Business Objects' portal application, lacks a long-term strategy. This analysis evaluates...

  11. Chemical Shift Imaging (CSI) by precise object displacement

    OpenAIRE

    Leclerc, Sebastien; Trausch, Gregory; Cordier, Benoit; Grandclaude, Denis; Retournard, Alain; Fraissard, Jacques; Canet, Daniel

    2006-01-01

    A mechanical device (NMR lift) has been built for vertically displacing an object (typically an NMR sample tube) inside the NMR probe with an accuracy of 1 µm. A series of single-pulse experiments is performed for incremented vertical positions of the sample. With a sufficiently spatially selective rf field, one obtains chemical shift information along the displacement direction (one-dimensional Chemical Shift Imaging – CSI). Knowing the vertical radio-frequency (rf) f...

  12. Objective - oriented financial analysis introduction

    Directory of Open Access Journals (Sweden)

    Dessislava Kostova-Pickett

    2018-02-01

    The practice of financial analysis has been immeasurably strengthened in recent years thanks to the ongoing evolution of computerized approaches in the form of spreadsheets and computer-based financial models of different types. These tools have not only relieved the analyst's computing task, but also opened up a wide range of analyses and sensitivity studies of alternatives that had not previously been possible. The main potential of object-oriented financial analysis consists in enormously expanding the analyst's capabilities through an online knowledge and information interface that has not yet been achieved by existing methods and software packages.

  13. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and backgrounds can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
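
    As a minimal sketch of the latent semantic analysis building block of LCL (not the full method), one can factor a bag-of-visual-words matrix and read each component as a candidate latent category; the data below are synthetic.

        import numpy as np
        from sklearn.decomposition import TruncatedSVD

        rng = np.random.default_rng(1)

        # Hypothetical bag-of-visual-words counts: one row per image region,
        # one column per codebook word (a stand-in for the paper's semantic
        # object representation).
        X = rng.poisson(1.0, size=(500, 300)).astype(float)

        # Latent semantic analysis: each component is a candidate latent
        # category (object, object part, or background such as "sky").
        lsa = TruncatedSVD(n_components=20, random_state=0)
        Z = lsa.fit_transform(X)                 # region-by-category loadings

        # Assign each region to its strongest latent category; selecting the
        # category that actually contains the target object would additionally
        # use the image-level labels to score each category's discrimination.
        categories = np.argmax(np.abs(Z), axis=1)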

  14. Frame sequences analysis technique of linear objects movement

    Science.gov (United States)

    Oshchepkova, V. Y.; Berg, I. A.; Shchepkin, D. V.; Kopylova, G. V.

    2017-12-01

    Obtaining data by noninvasive methods is often needed in many fields of science and engineering. This is achieved through video recording at various frame rates and in various light spectra. In doing so, quantitative analysis of the movement of the objects being studied becomes an important component of the research. This work discusses the analysis of the motion of linear objects in the two-dimensional plane. The complexity of this problem increases when the frame contains numerous objects whose images may overlap. This study uses a sequence containing 30 frames at a resolution of 62 × 62 pixels and a frame rate of 2 Hz. It was required to determine the average velocity of the objects' motion. This velocity was found as an average over 8-12 objects, with an error of 15%. After processing, dependencies of the average velocity on the control parameters were found. The processing was performed in the software environment GMimPro, with subsequent approximation of the obtained data using the Hill equation.
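
    The final approximation step can be reproduced with a standard nonlinear least-squares fit of the Hill equation; the velocities below are hypothetical stand-ins for the measured averages.

        import numpy as np
        from scipy.optimize import curve_fit

        def hill(x, vmax, k, n):
            # Hill equation: v = vmax * x**n / (k**n + x**n)
            return vmax * x**n / (k**n + x**n)

        # Hypothetical average velocities versus a control parameter.
        x = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])
        v = np.array([0.4, 1.1, 3.4, 6.2, 8.1, 9.4, 9.8])

        popt, pcov = curve_fit(hill, x, v, p0=(10.0, 1.0, 1.5))
        perr = np.sqrt(np.diag(pcov))            # 1-sigma parameter uncertainties
        print("vmax, k, n =", popt, "+/-", perr)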

  15. Precision Nutrition 4.0: A Big Data and Ethics Foresight Analysis--Convergence of Agrigenomics, Nutrigenomics, Nutriproteomics, and Nutrimetabolomics.

    Science.gov (United States)

    Özdemir, Vural; Kolker, Eugene

    2016-02-01

    Nutrition is central to sustenance of good health, not to mention its role as a cultural object that brings together or draws lines among societies. Undoubtedly, understanding the future paths of nutrition science in the current era of Big Data remains firmly on science, technology, and innovation strategy agendas around the world. Nutrigenomics, the confluence of nutrition science with genomics, brought about a new focus on and legitimacy for "variability science" (i.e., the study of mechanisms of person-to-person and population differences in response to food, and the ways in which food variably impacts the host, for example, nutrient-related disease outcomes). Societal expectations, both public and private, and claims over genomics-guided and individually-tailored precision diets continue to proliferate. While the prospects of nutrition science, and nutrigenomics in particular, are established, there is a need to integrate the efforts in four Big Data domains that are naturally allied--agrigenomics, nutrigenomics, nutriproteomics, and nutrimetabolomics--that address complementary variability questions pertaining to individual differences in response to food-related environmental exposures. The joint use of these four omics knowledge domains, coined as Precision Nutrition 4.0 here, has sadly not been realized to date, but the potentials for such integrated knowledge innovation are enormous. Future personalized nutrition practices would benefit from a seamless planning of life sciences funding, research, and practice agendas from "farm to clinic to supermarket to society," and from "genome to proteome to metabolome." Hence, this innovation foresight analysis explains the already existing potentials waiting to be realized, and suggests ways forward for innovation in both technology and ethics foresight frames on precision nutrition. We propose the creation of a new Precision Nutrition Evidence Barometer for periodic, independent, and ongoing retrieval, screening

  16. Precision of neutron activation analysis for environmental biological materials

    International Nuclear Information System (INIS)

    Hamaguchi, Hiroshi; Iwata, Shiro; Koyama, Mutsuo; Sasajima, Kazuhisa; Numata, Yuichi.

    1977-01-01

    Between 1973 and 1974 a special committee, ''Research on the application of neutron activation analysis to the environmental samples'', was organized at the Research Reactor Institute, Kyoto University. Eleven research groups composed mainly of the committee members cooperated in an intercomparison programme of reactor neutron activation analysis of the NBS standard reference materials 1571 Orchard Leaves and 1577 Bovine Liver. Five different types of reactors were used for the neutron irradiation: the KUR reactor of the Research Reactor Institute, Kyoto University; the TRIGA MARK II reactor of the Institute for Atomic Energy, Rikkyo University; and the JRR-2, JRR-3 and JRR-4 reactors of the Japan Atomic Energy Research Institute. Analyses were performed mainly by the instrumental method. The precision of the analysis of 23 elements in Orchard Leaves and 13 elements in Bovine Liver reported by the different research groups is shown in Tables 4 and 5, respectively. The coefficient of variation for these elements ranged from a few percent to about 30 percent. The averages for these elements agreed well with the NBS certified or reference values. Thus, from the practical point of view of routine multielement analysis of environmental samples, the validity of the instrumental neutron activation technique for this purpose has been proved. (auth.)

  17. Object width modulates object-based attentional selection.

    Science.gov (United States)

    Nah, Joseph C; Neppi-Modona, Marco; Strother, Lars; Behrmann, Marlene; Shomstein, Sarah

    2018-04-24

    Visual input typically includes a myriad of objects, some of which are selected for further processing. While these objects vary in shape and size, most evidence supporting object-based guidance of attention is drawn from paradigms employing two identical objects. Importantly, object size is a readily perceived stimulus dimension, and whether it modulates the distribution of attention remains an open question. Across four experiments, the size of the objects in the display was manipulated in a modified version of the two-rectangle paradigm. In Experiment 1, two identical parallel rectangles of two sizes (thin or thick) were presented. Experiments 2-4 employed identical trapezoids (each having a thin and thick end), inverted in orientation. In the experiments, one end of an object was cued and participants performed either a T/L discrimination or a simple target-detection task. Combined results show that, in addition to the standard object-based attentional advantage, there was a further attentional benefit for processing information contained in the thick versus thin end of objects. Additionally, eye-tracking measures demonstrated increased saccade precision towards thick object ends, suggesting that Fitts's Law may play a role in object-based attentional shifts. Taken together, these results suggest that object-based attentional selection is modulated by object width.

  18. First-Class Object Sets

    DEFF Research Database (Denmark)

    Ernst, Erik

    Typically, objects are monolithic entities with a fixed interface. To increase the flexibility in this area, this paper presents first-class object sets as a language construct. An object set offers an interface which is a disjoint union of the interfaces of its member objects. It may also be used ... for a special kind of method invocation involving multiple objects in a dynamic lookup process. With support for feature access and late-bound method calls, object sets are similar to ordinary objects, only more flexible. The approach is made precise by means of a small calculus, and the soundness of its type...
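
    A loose Python analogy of the construct (not the paper's calculus) is an object whose attribute lookup dispatches to the unique member that provides the requested feature:

        class ObjectSet:
            # Attribute lookup dispatches dynamically to the unique member that
            # provides the requested feature, mimicking the disjoint union of
            # the members' interfaces.
            def __init__(self, *members):
                self._members = members

            def __getattr__(self, name):
                providers = [m for m in self._members if hasattr(m, name)]
                if len(providers) != 1:          # enforce disjointness
                    raise AttributeError(f"{name!r} provided by "
                                         f"{len(providers)} members, expected 1")
                return getattr(providers[0], name)

        class Logger:
            def log(self, msg):
                print("log:", msg)

        class Store:
            def save(self, obj):
                print("saving:", obj)

        s = ObjectSet(Logger(), Store())
        s.log("hello")      # late-bound call resolved to the Logger member
        s.save({"a": 1})    # resolved to the Store member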

  19. Precise subtyping for synchronous multiparty sessions

    Directory of Open Access Journals (Sweden)

    Mariangiola Dezani-Ciancaglini

    2016-02-01

    The notion of subtyping has gained an important role both in theoretical and applicative domains: in lambda and concurrent calculi as well as in programming languages. The soundness and the completeness, together referred to as the preciseness of subtyping, can be considered from two different points of view: operational and denotational. The former preciseness has been recently developed with respect to type safety, i.e. the safe replacement of a term of a smaller type when a term of a bigger type is expected. The latter preciseness is based on the denotation of a type which is a mathematical object that describes the meaning of the type in accordance with the denotations of other expressions from the language. The result of this paper is the operational and denotational preciseness of the subtyping for a synchronous multiparty session calculus. The novelty of this paper is the introduction of characteristic global types to prove the operational completeness.

  20. Precision of Carbon-14 analysis in a single laboratory

    International Nuclear Information System (INIS)

    Nashriyah Mat; Misman Sumin; Holland, P.T.

    2009-01-01

    In a single laboratory, one operator used a Biological Material Oxidizer (BMO) unit to prepare (combust) solid samples before analyzing (counting) the radioactivity using various Liquid Scintillation Counters (LSCs). Different batches of a commercially available solid Certified Reference Material (CRM, Amersham, UK) standard were analyzed, depending on the time of analysis, over a period of seven years. The certified radioactivity and accuracy of the C-14 standards, supplied as cellulose tabs and designated as the Certified Reference Material (CRM), was 5000 ± 3% DPM. Each analysis was carried out using triplicate tabs. The counting medium was a commercially available cocktail containing the sorbent solution for the oxidizer gases, although different batches were used depending on the date of analysis. The mean DPM of the solutions was measured after correction for quenching by the LSC internal-standard procedure and subtraction of the mean DPM of the control. The precision of the standard and control counts and of the recovery percentage for the CRM was expressed as the coefficient of variation (CV) for the C-14 determinations over the seven-year period. Results from a recently acquired Sample Oxidizer unit were also included for comparison. (Author)
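
    The precision measure used here, the coefficient of variation, is straightforward to compute; the triplicate counts below are hypothetical:

        import numpy as np

        # Hypothetical triplicate C-14 counts (DPM) for one CRM batch.
        counts = np.array([4950.0, 5020.0, 4987.0])
        control = 35.0                                  # mean control (background) DPM

        net = counts - control
        cv = net.std(ddof=1) / net.mean() * 100.0       # coefficient of variation, %
        recovery = net.mean() / 5000.0 * 100.0          # % of the certified 5000 DPM
        print(f"CV = {cv:.2f}%, recovery = {recovery:.1f}%")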

  1. High-precision positioning of radar scatterers

    NARCIS (Netherlands)

    Dheenathayalan, P.; Small, D.; Schubert, A.; Hanssen, R.F.

    2016-01-01

    Remote sensing radar satellites cover wide areas and provide spatially dense measurements, with millions of scatterers. Knowledge of the precise position of each radar scatterer is essential to identify the corresponding object and interpret the estimated deformation. The absolute position accuracy

  2. Precision Higgs Boson Physics and Implications for Beyond the Standard Model Physics Theories

    International Nuclear Information System (INIS)

    Wells, James

    2015-01-01

    The discovery of the Higgs boson is one of science's most impressive recent achievements. We have taken a leap forward in understanding what is at the heart of elementary particle mass generation. We now have a significant opportunity to develop even deeper understanding of how the fundamental laws of nature are constructed. As such, we need intense focus from the scientific community to put this discovery in its proper context, to realign and narrow our understanding of viable theory based on this positive discovery, and to detail the implications the discovery has for theories that attempt to answer questions beyond what the Standard Model can explain. This project's first main objective is to develop a state-of-the-art analysis of precision Higgs boson physics. This is to be done in the tradition of the electroweak precision measurements of the LEP/SLC era. Indeed, the electroweak precision studies of the past are necessary inputs to the full precision Higgs program. Calculations of Higgs boson observables will be presented to the community that detail just how well various couplings of the Higgs boson can be measured, and more. These will be carried out using state-of-the-art theory computations coupled with the new experimental results coming in from the LHC. The project's second main objective is to utilize the results obtained from LHC Higgs boson experiments and the precision analysis, along with the direct search studies at LHC, to discern viable theories of physics beyond the Standard Model that unify physics to a deeper level. Studies will be performed on supersymmetric theories, theories of extra spatial dimensions (and related theories, such as compositeness), and theories that contain hidden sector states uniquely accessible to the Higgs boson. In addition, if data becomes incompatible with the Standard Model's low-energy effective Lagrangian, new physics theories will be developed that explain the anomaly and put it into a more

  3. Introduction to precise numerical methods

    CERN Document Server

    Aberth, Oliver

    2007-01-01

    Precise numerical analysis may be defined as the study of computer methods for solving mathematical problems either exactly or to prescribed accuracy. This book explains how precise numerical analysis is constructed. The book also provides exercises which illustrate points from the text and references for the methods presented. All disc-based content for this title is now available on the Web. · Clearer, simpler descriptions and explanations of the various numerical methods · Two new types of numerical problems: accurately solving partial differential equations with the included software, and computing line integrals in the complex plane.

  4. Multi-objective optimization in quantum parameter estimation

    Science.gov (United States)

    Gong, BeiLi; Cui, Wei

    2018-04-01

    We investigate quantum parameter estimation based on linear and Kerr-type nonlinear controls in an open quantum system, and consider the dissipation rate as an unknown parameter. We show that while the precision of parameter estimation is improved, it usually introduces a significant deformation to the system state. Moreover, we propose a multi-objective model to optimize the two conflicting objectives: (1) maximizing the Fisher information, improving the parameter estimation precision, and (2) minimizing the deformation of the system state, which maintains its fidelity. Finally, simulations of a simplified ɛ-constrained model demonstrate the feasibility of the Hamiltonian control in improving the precision of the quantum parameter estimation.
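
    A toy sketch of the ε-constrained scalarization mentioned above, with made-up objective functions rather than the paper's model: maximize one objective subject to a budget on the other.

        from scipy.optimize import minimize

        # Toy stand-ins for the two conflicting objectives: f1 (Fisher
        # information, to maximize) and f2 (state deformation, to bound).
        def neg_fisher(u):
            return -(u[0] ** 2 + 0.5 * u[1])     # negated for a minimizer

        def deformation(u):
            return u[0] ** 2 + u[1] ** 2

        eps = 0.5                                 # admissible deformation budget

        res = minimize(neg_fisher, x0=[0.1, 0.1],
                       constraints=[{"type": "ineq",
                                     "fun": lambda u: eps - deformation(u)}])
        print("control:", res.x, "Fisher information:", -res.fun)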

  5. Quantitative analysis of factors affecting intraoperative precision and stability of optoelectronic and electromagnetic tracking systems

    International Nuclear Information System (INIS)

    Wagner, A.; Schicho, K.; Birkfellner, W.; Figl, M.; Seemann, R.; Koenig, F.; Kainberger, Franz; Ewers, R.

    2002-01-01

    This study aims to provide a quantitative analysis of the factors affecting the actual precision and stability of optoelectronic and electromagnetic tracking systems in computer-aided surgery under real clinical/intraoperative conditions. A 'phantom-skull' with five precisely determined reference distances between marker spheres is used for all measurements. Three optoelectronic and one electromagnetic tracking system are included in this study. The experimental design is divided into three parts: (1) evaluation of serial- and multislice-CT (computed tomography) images of the phantom-skull for the precision of distance measurements by means of navigation software without a digitizer, (2) digitizer measurements under realistic intraoperative conditions with the factors OR-lamp (radiating into the field of view of the digitizer) and/or 'handling with ferromagnetic surgical instruments' (in the field of view of the digitizer) and (3) 'point-measurements' to analyze the influence of changes in the angle of inclination of the stylus axis. Deviations between reference distances and measured values are statistically investigated by means of analysis of variance. Computerized measurements of distances based on serial-CT data were more precise than those based on multislice-CT data. All tracking systems included in this study proved to be considerably less precise under realistic OR conditions when compared to the technical specifications in the manuals of the systems. Changes in the angle of inclination of the stylus axis resulted in deviations of up to 3.40 mm (mean deviations for all systems ranging from 0.49 to 1.42 mm, variances ranging from 0.09 to 1.44 mm), indicating a strong need for improvements of stylus design. The electromagnetic tracking system investigated in this study was not significantly affected by small ferromagnetic surgical instruments.

  6. Development of technology on natural flaw fabrication and precise diagnosis for the major components in NPPs

    International Nuclear Information System (INIS)

    Han, Jung Ho; Choi, Myung Sik; Lee, Doek Hyun; Hur, Do Haeng

    2002-01-01

    The objective of this research is to develop a fabrication technology for natural-flaw specimens of major components in NPPs and a technology for precise diagnosis of failure and degradation of components using natural-flaw specimens. 1) Successful development of the natural flaw fabrication technology for SG tubes. 2) Evaluation of ECT signals and development of precise diagnosis using natural flaws. - Determination of the length, depth, width, and multiplicity of fabricated natural flaws. - Information about the detectability and accuracy of ECT evaluation for various kinds of defects is collected when the combination of probe and frequency is changed. - An advanced technology for precise ECT evaluation is established. 3) Application of precise ECT diagnosis to failure analysis of SG tubes in operation. - Fretting wear of KSNP SG. - ODSCC at the tube expanded region of KSNP SG. - Determination of through/non-through wall extent of axial cracks

  7. Oxygen isotope analysis of phosphate: improved precision using TC/EA CF-IRMS.

    Science.gov (United States)

    LaPorte, D F; Holmden, C; Patterson, W P; Prokopiuk, T; Eglington, B M

    2009-06-01

    Oxygen isotope values of biogenic apatite have long demonstrated considerable promise for paleothermometry because of the abundance of material in the fossil record and the greater resistance of apatite to diagenesis compared to carbonate. Unfortunately, this promise has not been fully realized because of the relatively poor precision of isotopic measurements and the exceedingly small size of some substrates for analysis. Building on previous work, we demonstrate that it is possible to improve the precision of delta18O(PO4) measurements using a 'reverse-plumbed' thermal conversion elemental analyzer (TC/EA) coupled to a continuous flow isotope ratio mass spectrometer (CF-IRMS) via a helium stream. This modification to the flow of helium through the TC/EA, and careful location of the packing of glassy carbon fragments relative to the hot spot in the reactor, leads to narrower, more symmetrically distributed CO elution peaks with diminished tailing. In addition, we describe our apatite purification chemistry that uses nitric acid and cation exchange resin. The purification chemistry is optimized for processing small samples, minimizing isotopic fractionation of PO4(3-) and permitting Ca, Sr and Nd to be eluted and purified further for the measurement of delta44Ca and 87Sr/86Sr in modern biogenic apatite and 143Nd/144Nd in fossil apatite. Our methodology yields an external precision of +/- 0.15 per thousand (1sigma) for delta18O(PO4). The uncertainty is related to the preparation of the Ag3PO4 salt, conversion to CO gas in a reverse-plumbed TC/EA, analysis of oxygen isotopes using a CF-IRMS, and uncertainty in constructing calibration lines that convert raw delta18O data to the VSMOW scale. Matrix matching of samples and standards for the purpose of calibration to the VSMOW scale was determined to be unnecessary. Our method requires only slightly modified equipment that is widely available. This fact, and the

  8. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly 'seamlessly' from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation when the original specification and perhaps high-level design is non-object oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions which emphasizes data and control flows followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects) each having its own time-behavior defined by a set of states and state-transition rules and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly-connected models which progress from the object-oriented real-time systems analysis and design system analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.

  9. Accurate and Precise Titriraetric Analysis of Uranium in Nuclear Fuels. RCN Report

    International Nuclear Information System (INIS)

    Tolk, A.; Lingerak, W.A.; Verheul-Klompmaker, T.A.

    1970-09-01

    For the accurate and precise titrimetric analysis of uranium in nuclear fuels, the material is dissolved in orthophosphoric and nitric acids. The nitric acid is fumed off, and the U(VI) present is analysed reductometrically in a CO2 atmosphere with iron(II) ammonium sulfate. For U3O8 test-sample aliquots of 800 and 80 mg, coefficients of variation of 0.012% and 0.11%, respectively, are measured. (author)

  10. Precision and Accuracy of k0-NAA Method for Analysis of Multi Elements in Reference Samples

    International Nuclear Information System (INIS)

    Sri-Wardani

    2004-01-01

    The accuracy and precision of the k 0 -NAA method were determined through the analysis of multiple elements contained in reference samples. The results for multiple elements in the SRM 1633b sample were obtained with biases of up to 20%, but with good accuracy and precision. The results for As, Cd and Zn in the CCQM-P29 rice flour sample were very good, with biases of 0.5-5.6%. (author)

  11. Precision medicine at the crossroads.

    Science.gov (United States)

    Olson, Maynard V

    2017-10-11

    There are bioethical, institutional, economic, legal, and cultural obstacles to creating the robust, precompetitive data resource that will be required to advance the vision of "precision medicine," the ability to use molecular data to target therapies to patients for whom they offer the most benefit at the least risk. Creation of such an "information commons" was the central recommendation of the 2011 report Toward Precision Medicine issued by a committee of the National Research Council of the USA (Committee on a Framework for Development of a New Taxonomy of Disease; National Research Council. Toward precision medicine: building a knowledge network for biomedical research and a new taxonomy of disease. 2011). In this commentary, I review the rationale for creating an information commons and the obstacles to doing so; then, I endorse a path forward based on the dynamic consent of research subjects interacting with researchers through trusted mediators. I assert that the advantages of the proposed system overwhelm those of alternative ways of handling data on the phenotypes, genotypes, and environmental exposures of individual humans; hence, I argue that its creation should be the central policy objective of early efforts to make precision medicine a reality.

  12. Precision Health Economics and Outcomes Research to Support Precision Medicine: Big Data Meets Patient Heterogeneity on the Road to Value

    Directory of Open Access Journals (Sweden)

    Yixi Chen

    2016-11-01

    The “big data” era represents an exciting opportunity to utilize powerful new sources of information to reduce clinical and health economic uncertainty on an individual patient level. In turn, health economic outcomes research (HEOR) practices will need to evolve to accommodate individual patient–level HEOR analyses. We propose the concept of “precision HEOR”, which utilizes a combination of costs and outcomes derived from big data to inform healthcare decision-making that is tailored to highly specific patient clusters or individuals. To explore this concept, we discuss the current and future roles of HEOR in health sector decision-making, big data and predictive analytics, and several key HEOR contexts in which big data and predictive analytics might transform traditional HEOR into precision HEOR. The guidance document addresses issues related to the transition from traditional to precision HEOR practices, the evaluation of patient similarity analysis and its appropriateness for precision HEOR analysis, and future challenges to precision HEOR adoption. Precision HEOR should make precision medicine more realizable by aiding and adapting healthcare resource allocation. The combined hopes for precision medicine and precision HEOR are that individual patients receive the best possible medical care while overall healthcare costs remain manageable or become more cost-efficient.

  13. Sternal instability measured with radiostereometric analysis. A study of method feasibility, accuracy and precision.

    Science.gov (United States)

    Vestergaard, Rikke Falsig; Søballe, Kjeld; Hasenkam, John Michael; Stilling, Maiken

    2018-05-18

    A small, but unstable, saw-gap may hinder bone-bridging and induce development of painful sternal dehiscence. We propose the use of Radiostereometric Analysis (RSA) for evaluation of sternal instability and present a method validation. Four bone analogs (phantoms) were sternotomized and tantalum beads were inserted in each half. The models were reunited with wire cerclage and placed in a radiolucent separation device. Stereoradiographs (n = 48) of the phantoms in 3 positions were recorded at 4 imposed separation points. The accuracy and precision was compared statistically and presented as translations along the 3 orthogonal axes. 7 sternotomized patients were evaluated for clinical RSA precision by double-examination stereoradiographs (n = 28). In the phantom study, we found no systematic error (p > 0.3) between the three phantom positions, and precision for evaluation of sternal separation was 0.02 mm. Phantom accuracy was mean 0.13 mm (SD 0.25). In the clinical study, we found a detection limit of 0.42 mm for sternal separation and of 2 mm for anterior-posterior dislocation of the sternal halves for the individual patient. RSA is a precise and low-dose image modality feasible for clinical evaluation of sternal stability in research. ClinicalTrials.gov Identifier: NCT02738437 , retrospectively registered.

  14. The PPP Precision Analysis Based on BDS Regional Navigation System

    Directory of Open Access Journals (Sweden)

    ZHU Yongxing

    2015-04-01

    The BeiDou navigation satellite system (BDS) has opened service in most of the Asia-Pacific region. It offers the possibility of breaking the technological monopoly of GPS in the field of high-precision applications, so its precise point positioning (PPP) performance has been of great concern. Firstly, the constellation of the BeiDou regional navigation system and the BDS/GPS tracking network are introduced. Secondly, the precise ephemeris and clock offset accuracy of the BeiDou satellites, based on the domestic tracking network, is analyzed. Finally, the static and kinematic PPP accuracy is studied and compared with that of GPS. A measured numerical example shows that static and kinematic PPP based on BDS can achieve centimeter-level and decimeter-level accuracy respectively, reaching the current level of GPS precise point positioning.

  15. Analysis of Hall Probe Precise Positioning with Cylindrical Permanent Magnet

    International Nuclear Information System (INIS)

    Belicev, P.; Vorozhtsov, A.S.; Vorozhtsov, S.B.

    2007-01-01

    Precise positioning of a Hall probe for cyclotron magnetic field mapping, using cylindrical permanent magnets, was analyzed. The permanent magnet parameters necessary to achieve ±20 μm positioning precision were determined. (author)

  16. Precision digital control systems

    Science.gov (United States)

    Vyskub, V. G.; Rozov, B. S.; Savelev, V. I.

    This book is concerned with the characteristics of digital control systems of great accuracy. A classification of such systems is considered along with aspects of stabilization, programmable control applications, digital tracking systems and servomechanisms, and precision systems for the control of a scanning laser beam. Other topics explored are related to systems of proportional control, linear devices and methods for increasing precision, approaches for further decreasing the response time in the case of high-speed operation, possibilities for the implementation of a logical control law, and methods for the study of precision digital control systems. A description is presented of precision automatic control systems which make use of electronic computers, taking into account the existing possibilities for an employment of computers in automatic control systems, approaches and studies required for including a computer in such control systems, and an analysis of the structure of automatic control systems with computers. Attention is also given to functional blocks in the considered systems.

  17. Precision manufacturing

    CERN Document Server

    Dornfeld, David

    2008-01-01

    Today there is a high demand for high-precision products. Manufacturing processes are now highly sophisticated and derive from a specialized genre called precision engineering. Precision Manufacturing provides an introduction to precision engineering and manufacturing, with an emphasis on the design and performance of precision machines and machine tools, metrology, tooling elements, machine structures, sources of error, precision machining processes and precision process planning, as well as the critical role that precision machine design for manufacturing has played in technological developments over the last few hundred years. In addition, the influence of sustainable manufacturing requirements on precision processes is introduced. Drawing upon years of practical experience and using numerous examples and illustrative applications, David Dornfeld and Dae-Eun Lee cover precision manufacturing as it applies to: The importance of measurement and metrology in the context of Precision Manufacturing. Th...

  18. Ionospheric Modeling for Precise GNSS Applications

    NARCIS (Netherlands)

    Memarzadeh, Y.

    2009-01-01

    The main objective of this thesis is to develop a procedure for modeling and predicting ionospheric Total Electron Content (TEC) for high precision differential GNSS applications. As the ionosphere is a highly dynamic medium, we believe that to have a reliable procedure it is necessary to transfer

  19. Change Analysis and Decision Tree Based Detection Model for Residential Objects across Multiple Scales

    Directory of Open Access Journals (Sweden)

    CHEN Liyan

    2018-03-01

    Change analysis and detection play an important role in the updating of multi-scale databases. When overlaying an updated larger-scale dataset and a to-be-updated smaller-scale dataset, people usually focus on temporal changes caused by the evolution of spatial entities; little attention is paid to representation changes introduced by map generalization. Using polygonal building data as an example, this study examines the changes from different perspectives, such as the reasons for their occurrence and the form they take. Based on this knowledge, we employ a decision tree, from the field of machine learning, to establish a change detection model. The aim of the proposed model is to distinguish temporal changes that need to be applied as updates to the smaller-scale dataset from representation changes. The proposed method is validated through tests using real-world building data from Guangzhou city. The experimental results show an overall change detection precision of more than 90%, which indicates that our method is effective in identifying changed objects.
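
    A minimal sketch of such a decision-tree change-detection model, with invented features and synthetic labels standing in for the paper's training data:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(0)
        n = 400

        # Hypothetical features per matched building pair when overlaying the
        # larger-scale (updated) and smaller-scale (to-be-updated) datasets.
        X = np.column_stack([
            rng.uniform(0.0, 1.0, n),    # area ratio
            rng.uniform(0.0, 1.0, n),    # overlap (intersection over union)
            rng.uniform(0.0, 90.0, n),   # orientation difference, degrees
        ])

        # 1 = temporal change (real-world evolution), 0 = representation change
        # from map generalization; synthetic labels for illustration only.
        y = (X[:, 1] < 0.4).astype(int) ^ (rng.random(n) < 0.05).astype(int)

        clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X, y)
        print("training accuracy:", clf.score(X, y))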

  20. Advanced bioanalytics for precision medicine.

    Science.gov (United States)

    Roda, Aldo; Michelini, Elisa; Caliceti, Cristiana; Guardigli, Massimo; Mirasoli, Mara; Simoni, Patrizia

    2018-01-01

    Precision medicine is a new paradigm that combines diagnostic, imaging, and analytical tools to produce accurate diagnoses and therapeutic interventions tailored to the individual patient. This approach stands in contrast to the traditional "one size fits all" concept, according to which researchers develop disease treatments and preventions for an "average" patient without considering individual differences. The "one size fits all" concept has led to many ineffective or inappropriate treatments, especially for pathologies such as Alzheimer's disease and cancer. Now, precision medicine is receiving massive funding in many countries, thanks to its social and economic potential in terms of improved disease prevention, diagnosis, and therapy. Bioanalytical chemistry is critical to precision medicine. This is because identifying an appropriate tailored therapy requires researchers to collect and analyze information on each patient's specific molecular biomarkers (e.g., proteins, nucleic acids, and metabolites). In other words, precision diagnostics is not possible without precise bioanalytical chemistry. This Trend article highlights some of the most recent advances, including massive analysis of multilayer omics, and new imaging technique applications suitable for implementing precision medicine. Graphical abstract Precision medicine combines bioanalytical chemistry, molecular diagnostics, and imaging tools for performing accurate diagnoses and selecting optimal therapies for each patient.

  1. Department of Defense Precise Time and Time Interval program improvement plan

    Science.gov (United States)

    Bowser, J. R.

    1981-01-01

    The United States Naval Observatory is responsible for ensuring uniformity in precise time and time interval operations, including measurements, the establishment of overall DOD requirements for time and time interval, and the accomplishment of objectives requiring precise time and time interval with minimum cost. An overview of the objectives, the approach to the problem, the schedule, and a status report, including significant findings relative to organizational relationships, current directives, principal PTTI users, and future requirements as currently identified by the users, is presented.

  2. Foreign object detection and removal to improve automated analysis of chest radiographs

    International Nuclear Information System (INIS)

    Hogeweg, Laurens; Sánchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van; Story, Alistair; Hayward, Andrew

    2013-01-01

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an Az value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
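
    A compact sketch of the detect-segment-inpaint chain described above, assuming precomputed per-pixel training features; the file names and the one-feature-per-pixel simplification are hypothetical, not the authors' pipeline:

        import cv2
        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        # Hypothetical training data: per-pixel feature vectors (intensity,
        # texture responses, ...) labelled 1 = foreign object, 0 = anatomy.
        X_train = np.load("pixel_features.npy")
        y_train = np.load("pixel_labels.npy")
        knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)

        img = cv2.imread("chest_xray.png", cv2.IMREAD_GRAYSCALE)
        feats = img.reshape(-1, 1).astype(float)        # toy one-feature-per-pixel
        prob = knn.predict_proba(feats)[:, 1].reshape(img.shape)

        # Segment by thresholding the per-pixel probabilities, then grow the
        # mask slightly and replace the objects by inpainting.
        mask = (prob > 0.5).astype(np.uint8)
        mask = cv2.dilate(mask, np.ones((5, 5), np.uint8))
        clean = cv2.inpaint(img, mask, 5, cv2.INPAINT_TELEA)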

  3. A light and faster regional convolutional neural network for object detection in optical remote sensing images

    Science.gov (United States)

    Ding, Peng; Zhang, Ye; Deng, Wei-Jian; Jia, Ping; Kuijper, Arjan

    2018-07-01

    Detection of objects from satellite optical remote sensing images is very important for many commercial and governmental applications. With the development of deep convolutional neural networks (deep CNNs), the field of object detection has seen tremendous advances. Currently, objects in satellite remote sensing images can be detected using deep CNNs. In general, optical remote sensing images contain many dense and small objects, and the use of the original Faster Regional CNN framework does not yield a suitably high precision. Therefore, after careful analysis, we adopt dense convolutional networks, a multi-scale representation and various combinations of improvement schemes to enhance the structure of the base VGG16-Net and improve the precision. We also propose an approach to reduce the test time (detection time) and memory requirements. To validate the effectiveness of our approach, we perform experiments using satellite remote sensing image datasets of aircraft and automobiles. The results show that the improved network structure can detect objects in satellite optical remote sensing images more accurately and efficiently.

  4. A method of precise profile analysis of diffuse scattering for the KENS pulsed neutrons

    International Nuclear Information System (INIS)

    Todate, Y.; Fukumura, T.; Fukazawa, H.

    2001-01-01

    An outline of our profile analysis method, which is now in practical use for the asymmetric KENS pulsed thermal neutrons, is presented. The analysis of the diffuse scattering from a single crystal of D2O is shown as an example. The pulse-shape function is based on the Ikeda-Carpenter function adjusted for the KENS neutron pulses. The convoluted intensity is calculated by a Monte Carlo method and the precision of the calculation is controlled. Fitting parameters in the model cross section can be determined by the built-in nonlinear least-squares fitting procedure. Because this method is a natural extension of the procedure conventionally used for triple-axis data, it is easy to apply with generality and versatility. Most importantly, this method is capable of precisely correcting the time shift of the observed peak position, which is inevitably caused in the case of highly asymmetric pulses and broad scattering functions. It is pointed out that accurate determination of the true time-of-flight is especially important in single-crystal inelastic experiments. (author)

  5. Categorical data processing for real estate objects valuation using statistical analysis

    Science.gov (United States)

    Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.

    2018-05-01

    Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
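
    A minimal sketch of the kind of pipeline this record proposes, coding categorical property descriptors and regressing price; the listing data are invented:

        import pandas as pd
        from sklearn.compose import ColumnTransformer
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import OneHotEncoder

        # Hypothetical listings with categorical property descriptors.
        df = pd.DataFrame({
            "district":  ["center", "north", "center", "south", "north"],
            "wall_type": ["brick", "panel", "brick", "brick", "panel"],
            "area_m2":   [54.0, 38.5, 71.2, 45.0, 60.3],
            "price":     [95000, 52000, 131000, 61000, 83000],
        })

        # One-hot coding of the categorical variables; numeric area passes through.
        pre = ColumnTransformer(
            [("cat", OneHotEncoder(handle_unknown="ignore"),
              ["district", "wall_type"])],
            remainder="passthrough")

        model = make_pipeline(pre, LinearRegression())
        model.fit(df.drop(columns="price"), df["price"])
        print(model.predict(df.drop(columns="price").head(1)))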

  6. Aerodynamic multi-objective integrated optimization based on principal component analysis

    Directory of Open Access Journals (Sweden)

    Jiangtao HUANG

    2017-08-01

    Based on an improved multi-objective particle swarm optimization (MOPSO) algorithm with principal component analysis (PCA) methodology, an efficient high-dimension multi-objective optimization method is proposed, which aims to improve the convergence of the Pareto front in multi-objective optimization design. The mathematical efficiency, the physical reasonableness and the reliability in dealing with redundant objectives of PCA are verified by the typical DTLZ5 test function and a multi-objective correlation analysis of a supercritical airfoil, and the proposed method is integrated into the aircraft multi-disciplinary design (AMDEsign) platform, which contains aerodynamics, stealth and structure weight analysis and optimization modules. The proposed method is then used for the multi-point integrated aerodynamic optimization of a wide-body passenger aircraft, in which the redundant objectives identified by PCA are transformed into optimization constraints, and several design methods are compared. The design results illustrate that the strategy used in this paper is sufficient and the multi-point design requirements of the passenger aircraft are reached. The visualization of the non-dominated Pareto set is improved by effectively reducing the dimension without losing the primary features of the problem.
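
    The PCA step for detecting redundant objectives can be sketched as follows, with a synthetic objective matrix in place of the optimizer's real samples:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # Hypothetical objective matrix: rows = evaluated candidate designs,
        # columns = objective values (e.g. drag at several flight conditions).
        rng = np.random.default_rng(0)
        base = rng.normal(size=(200, 3))
        F = np.column_stack([base, base[:, 0] + 0.01 * rng.normal(size=200)])
        # The 4th objective is almost a copy of the 1st, i.e. redundant.

        pca = PCA().fit(StandardScaler().fit_transform(F))
        cum = np.cumsum(pca.explained_variance_ratio_)
        k = int(np.searchsorted(cum, 0.99) + 1)
        print(f"{F.shape[1]} objectives, about {k} independent; redundant ones "
              "are candidates for treatment as constraints")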

  7. [Precision nutrition in the era of precision medicine].

    Science.gov (United States)

    Chen, P Z; Wang, H

    2016-12-06

    Precision medicine has been increasingly incorporated into clinical practice and is enabling a new era for disease prevention and treatment. As an important constituent of precision medicine, precision nutrition has also been drawing more attention during physical examinations. The main aim of precision nutrition is to provide safe and efficient intervention methods for disease treatment and management, through fully considering the genetics, lifestyle (dietary, exercise and lifestyle choices), metabolic status, gut microbiota and physiological status (nutrient level and disease status) of individuals. Three major components should be considered in precision nutrition, including individual criteria for sufficient nutritional status, biomarker monitoring or techniques for nutrient detection and the applicable therapeutic or intervention methods. It was suggested that, in clinical practice, many inherited and chronic metabolic diseases might be prevented or managed through precision nutritional intervention. For generally healthy populations, because lifestyles, dietary factors, genetic factors and environmental exposures vary among individuals, precision nutrition is warranted to improve their physical activity and reduce disease risks. In summary, research and practice is leading toward precision nutrition becoming an integral constituent of clinical nutrition and disease prevention in the era of precision medicine.

  8. Ten years of Object-Oriented analysis on H1

    International Nuclear Information System (INIS)

    Laycock, Paul

    2012-01-01

    Over a decade ago, the H1 Collaboration decided to embrace the object-oriented paradigm and completely redesign its data analysis model and data storage format. The event data model, based on the ROOT framework, consists of three layers - tracks and calorimeter clusters, identified particles and finally event summary data - with a singleton class providing unified access. This original solution was then augmented with a fourth layer containing user-defined objects. This contribution will summarise the history of the solutions used, from modifications to the original design, to the evolution of the high-level end-user analysis object framework which is used by H1 today. Several important issues are addressed - the portability of expert knowledge to increase the efficiency of data analysis, the flexibility of the framework to incorporate new analyses, the performance and ease of use, and lessons learned for future projects.

  9. Classification of LIDAR Data for Generating a High-Precision Roadway Map

    Science.gov (United States)

    Jeong, J.; Lee, I.

    2016-06-01

    The generation of highly precise maps is growing with the development of autonomous driving vehicles. A highly precise map has centimetre-level precision, unlike existing commercial maps with metre-level precision. Understanding the road environment and making decisions is essential for autonomous driving, since robust localization is one of the critical challenges for the autonomous driving car. One source of data is a Lidar, because it provides highly dense point cloud data with three-dimensional positions, intensities and ranges from the sensor to the target. In this paper, we focus on how to segment point cloud data from a Lidar on a vehicle and classify objects on the road for the highly precise map. In particular, we propose the combination of a feature descriptor and a classification algorithm from machine learning. Objects can be distinguished by geometrical features based on the surface normal of each point. To achieve correct classification using limited point cloud data sets, a Support Vector Machine algorithm is used. The final step is to evaluate the accuracy of the obtained results by comparing them to reference data. The results show sufficient accuracy, and the method will be utilized to generate a highly precise road map.

  10. CLASSIFICATION OF LIDAR DATA FOR GENERATING A HIGH-PRECISION ROADWAY MAP

    Directory of Open Access Journals (Sweden)

    J. Jeong

    2016-06-01

    The generation of highly precise maps is growing with the development of autonomous driving vehicles. A highly precise map has centimetre-level precision, unlike existing commercial maps with metre-level precision. Understanding the road environment and making decisions is essential for autonomous driving, since robust localization is one of the critical challenges for the autonomous driving car. One source of data is a Lidar, because it provides highly dense point cloud data with three-dimensional positions, intensities and ranges from the sensor to the target. In this paper, we focus on how to segment point cloud data from a Lidar on a vehicle and classify objects on the road for the highly precise map. In particular, we propose the combination of a feature descriptor and a classification algorithm from machine learning. Objects can be distinguished by geometrical features based on the surface normal of each point. To achieve correct classification using limited point cloud data sets, a Support Vector Machine algorithm is used. The final step is to evaluate the accuracy of the obtained results by comparing them to reference data. The results show sufficient accuracy, and the method will be utilized to generate a highly precise road map.
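
    A minimal sketch of the normals-plus-SVM idea shared by the two records above, with synthetic features standing in for real Lidar-derived ones:

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        def normal_from_neighbours(pts):
            # Unit normal of a local neighbourhood: the eigenvector of the
            # covariance matrix with the smallest eigenvalue.
            c = pts - pts.mean(axis=0)
            _, vecs = np.linalg.eigh(c.T @ c)
            return vecs[:, 0]

        rng = np.random.default_rng(0)
        print(normal_from_neighbours(rng.normal(size=(20, 3))))  # demo call

        # Hypothetical per-point features derived from such normals, e.g.
        # [normal_z, curvature, height above ground], with class labels
        # (0 = road surface, 1 = other); synthetic stand-ins here.
        X = rng.normal(size=(1000, 3))
        y = (X[:, 0] > 0).astype(int)

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
        clf.fit(X[:800], y[:800])
        print("held-out accuracy:", clf.score(X[800:], y[800:]))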

  11. Enhanced online convolutional neural networks for object tracking

    Science.gov (United States)

    Zhang, Dengzhuo; Gao, Yun; Zhou, Hao; Li, Tianwen

    2018-04-01

    In recent years, object tracking based on convolutional neural networks has gained more and more attention. The initialization and updating of the convolution filters directly affect the precision of object tracking. In this paper, a novel object tracker based on an enhanced online convolutional neural network without offline training is proposed, which initializes the convolution filters with a k-means++ algorithm and updates the filters by error back-propagation. Comparative experiments with 7 trackers on 15 challenging sequences showed that our tracker performs better than the other trackers in terms of AUC and precision.
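
    The k-means++ filter initialization can be sketched as clustering zero-mean image patches; the frame is synthetic and the filter count is hypothetical:

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        frame = rng.random((240, 320))           # stand-in for the first video frame

        # Sample random 5x5 patches and cluster them; k-means++ seeding gives
        # well-spread initial convolution filters without offline training.
        rows = rng.integers(0, 240 - 5, 2000)
        cols = rng.integers(0, 320 - 5, 2000)
        patches = np.stack([frame[i:i + 5, j:j + 5].ravel()
                            for i, j in zip(rows, cols)])
        patches -= patches.mean(axis=1, keepdims=True)   # zero-mean patches

        km = KMeans(n_clusters=32, init="k-means++", n_init=1,
                    random_state=0).fit(patches)
        filters = km.cluster_centers_.reshape(32, 5, 5)  # initial conv filters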

  12. Objectives for Stakeholder Engagement in Global Environmental Assessments

    Directory of Open Access Journals (Sweden)

    Jennifer Garard

    2017-09-01

    Global environmental assessments (GEAs) are among the most large-scale, formalized processes for synthesizing knowledge at the science–policy–society interface. The successful engagement of diverse stakeholders in GEAs is often described as a crucial mechanism for increasing their legitimacy, salience and credibility. However, the diversity of perspectives on the more precise objectives for stakeholder engagement remains largely unclear. The aims of this study are to categorize and characterize the diversity of perspectives on objectives for stakeholder engagement in GEAs; to explore differences in perspectives within and between different stakeholder groups and categories; and to test whether the more practical prioritization and selection of objectives in GEAs can be linked to deliberative policy learning as a higher-level rationale for stakeholder engagement. For these purposes, we conduct a grounded theory analysis and a keyword analysis of interview material and official GEA documents relating to two GEAs: UN Environment’s Fifth Global Environment Outlook and the Working Group III contribution to the Intergovernmental Panel on Climate Change Fifth Assessment Report. Based on the analysis, we identify six categories of objectives and present as hypotheses promising ways forward for prioritizing and characterizing objectives for stakeholder engagement in GEAs, as well as potential reasons for the differences between perspectives on objectives. This study draws attention to the need for future GEA processes to have more explicit discussions on the objectives for stakeholder engagement, as well as the importance of moving towards increasingly deliberative and inclusive assessment processes more broadly.

  13. Comparative analysis of imaging configurations and objectives for Fourier microscopy.

    Science.gov (United States)

    Kurvits, Jonathan A; Jiang, Mingming; Zia, Rashid

    2015-11-01

    Fourier microscopy is becoming an increasingly important tool for the analysis of optical nanostructures and quantum emitters. However, achieving quantitative Fourier space measurements requires a thorough understanding of the impact of aberrations introduced by optical microscopes that have been optimized for conventional real-space imaging. Here we present a detailed framework for analyzing the performance of microscope objectives for several common Fourier imaging configurations. To this end, we model objectives from Nikon, Olympus, and Zeiss using parameters that were inferred from patent literature and confirmed, where possible, by physical disassembly. We then examine the aberrations most relevant to Fourier microscopy, including the alignment tolerances of apodization factors for different objective classes, the effect of magnification on the modulation transfer function, and vignetting-induced reductions of the effective numerical aperture for wide-field measurements. Based on this analysis, we identify an optimal objective class and imaging configuration for Fourier microscopy. In addition, the Zemax files for the objectives and setups used in this analysis have been made publicly available as a resource for future studies.

  14. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  15. Data analysis in an Object Request Broker environment

    International Nuclear Information System (INIS)

    Malon, D.M.; May, E.N.; Grossman, R.L.; Day, C.T.; Quarrie, D.R.

    1995-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study.

  16. Object-Based Image Analysis Beyond Remote Sensing - the Human Perspective

    Science.gov (United States)

    Blaschke, T.; Lang, S.; Tiede, D.; Papadakis, M.; Györi, A.

    2016-06-01

    We introduce a prototypical methodological framework for a place-based GIS-RS system for the spatial delineation of place while incorporating spatial analysis and mapping techniques using methods from different fields such as environmental psychology, geography, and computer science. The methodological lynchpin for this to happen - when aiming to delineate place in terms of objects - is object-based image analysis (OBIA).

  17. Worst-case execution time analysis-driven object cache design

    DEFF Research Database (Denmark)

    Huber, Benedikt; Puffitsch, Wolfgang; Schoeberl, Martin

    2012-01-01

    Hard real‐time systems need a time‐predictable computing platform to enable static worst‐case execution time (WCET) analysis. All performance‐enhancing features need to be WCET analyzable. However, standard data caches containing heap‐allocated data are very hard to analyze statically. In this paper we explore a new object cache design, which is driven by the capabilities of static WCET analysis. Simulations of standard benchmarks estimating the expected average case performance usually drive computer architecture design. The design decisions derived from this methodology do not necessarily result in a WCET analysis‐friendly design. Aiming for a time‐predictable design, we therefore propose to employ WCET analysis techniques for the design space exploration of processor architectures. We evaluated different object cache configurations using static analysis techniques. The number of field…

  18. Present situation and trend of precision guidance technology and its intelligence

    Science.gov (United States)

    Shang, Zhengguo; Liu, Tiandong

    2017-11-01

    This paper first introduces the basic concepts of precision guidance technology and artificial intelligence technology. It then gives a brief introduction to intelligent precision guidance technology and, with the help of foreign deep-learning-based intelligent weapon development projects (the LRASM missile project, the TRACE project, and the BLADE project), an overview of the current state of precision guidance technology abroad. Finally, the future development trend of intelligent precision guidance technology is summarized; it is mainly concentrated in multi-objective engagement, intelligent classification, weak-target detection and recognition, intelligent anti-jamming in complex environments, and multi-source, multi-missile cooperative engagement.

  19. Analysis and experiments of a novel and compact 3-DOF precision positioning platform

    International Nuclear Information System (INIS)

    Huang, Hu; Zhao, Hongwei; Fan, Zunqiang; Zhang, Hui; Ma, Zhichao; Yang, Zhaojun

    2013-01-01

    A novel 3-DOF precision positioning platform with dimensions of 48 mm × 50 mm × 35 mm was designed by integrating piezo actuators and flexure hinges. The platform has a compact structure but it can do high precision positioning in three axes. The dynamic model of the platform in a single direction was established. Stiffness of the flexure hinges and modal characteristics of the flexure hinge mechanism were analyzed by the finite element method. Output displacements of the platform along three axes were forecasted via stiffness analysis. Output performance of the platform in x and y axes with open-loop control as well as the z-axis with closed-loop control was tested and discussed. The preliminary application of the platform in the field of nanoindentation indicates that the designed platform works well during nanoindentation tests, and the closed-loop control ensures the linear displacement output. With suitable control, the platform has the potential to realize different positioning functions under various working conditions.

  20. First-Class Object Sets

    DEFF Research Database (Denmark)

    Ernst, Erik

    2009-01-01

    Typically, an object is a monolithic entity with a fixed interface. To increase flexibility in this area, this paper presents first-class object sets as a language construct. An object set offers an interface which is a disjoint union of the interfaces of its member objects. It may also be used… to a mainstream virtual machine in order to improve on the support for family polymorphism. The approach is made precise by means of a small calculus, and the soundness of its type system has been shown by a mechanically checked proof in Coq.

  1. A task specific uncertainty analysis method for least-squares-based form characterization of ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Ren, M J; Cheung, C F; Kong, L B

    2012-01-01

    In the measurement of ultra-precision freeform surfaces, least-squares-based form characterization methods are widely used to evaluate the form error of the measured surfaces. Although many methodologies have been proposed in recent years to improve the efficiency of the characterization process, relatively little research has been conducted on the analysis of the associated uncertainty in the characterization results that may arise from the characterization methods used. As a result, this paper presents a task specific uncertainty analysis method with application in the least-squares-based form characterization of ultra-precision freeform surfaces. That is, the associated uncertainty in the form characterization results is estimated when the measured data are extracted from a specific surface with a specific sampling strategy. Three factors are considered in this study: measurement error, surface form error and sample size. The task specific uncertainty analysis method has been evaluated through a series of experiments. The results show that the task specific uncertainty analysis method can effectively estimate the uncertainty of the form characterization results for a specific freeform surface measurement.

  2. Trace element analysis by EPMA in geosciences: detection limit, precision and accuracy

    Science.gov (United States)

    Batanova, V. G.; Sobolev, A. V.; Magnin, V.

    2018-01-01

    Use of the electron probe microanalyser (EPMA) for trace element analysis has increased over the last decade, mainly because of improved stability of spectrometers and the electron column when operated at high probe current; development of new large-area crystal monochromators and ultra-high count rate spectrometers; full integration of energy-dispersive / wavelength-dispersive X-ray spectrometry (EDS/WDS) signals; and the development of powerful software packages. For phases that are stable under a dense electron beam, the detection limit and precision can be decreased to the ppm level by using high acceleration voltage and beam current combined with long counting time. Data on 10 elements (Na, Al, P, Ca, Ti, Cr, Mn, Co, Ni, Zn) in olivine obtained on a JEOL JXA-8230 microprobe with tungsten filament show that the detection limit decreases proportionally to the square root of counting time and probe current. For all elements equal to or heavier than phosphorus (Z = 15), the detection limit decreases with increasing accelerating voltage. The analytical precision for minor and trace elements analysed in olivine at 25 kV accelerating voltage and 900 nA beam current is 4 - 18 ppm (2 standard deviations of repeated measurements of the olivine reference sample) and is similar to the detection limit of corresponding elements. To analyse trace elements accurately requires careful estimation of background, and consideration of sample damage under the beam and secondary fluorescence from phase boundaries. The development and use of matrix reference samples with well-characterised trace elements of interest is important for monitoring and improving the accuracy. An evaluation of the accuracy of trace element analyses in olivine has been made by comparing EPMA data for new reference samples with data obtained by different in-situ and bulk analytical methods in six different laboratories worldwide. For all elements, the measured concentrations in the olivine reference sample…
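
    The reported scaling, a detection limit falling with the square root of counting time and probe current, can be written as a one-line formula. The toy function below applies it to made-up reference values.

    ```python
    # Toy illustration of the scaling reported above: the EPMA detection
    # limit falls as the inverse square root of (counting time x current).
    # All numbers are invented for illustration.
    def detection_limit(dl_ref, t_ref, i_ref, t, i):
        """Scale a reference detection limit dl_ref (ppm) measured at
        counting time t_ref (s) and beam current i_ref (nA)."""
        return dl_ref * ((t_ref * i_ref) / (t * i)) ** 0.5

    # Quadrupling the dose (2x time, 2x current) halves the detection limit.
    print(detection_limit(20.0, 60, 100, 120, 200))  # -> 10.0
    ```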

  3. The precision of textural analysis in ¹⁸F-FDG-PET scans of oesophageal cancer

    Energy Technology Data Exchange (ETDEWEB)

    Doumou, Georgia; Siddique, Musib [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Tsoumpas, Charalampos [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); University of Leeds, The Division of Medical Physics, Leeds (United Kingdom); Goh, Vicky [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Guy's and St Thomas' Hospitals NHS Foundation Trust, Radiology Department, London (United Kingdom); Cook, Gary J. [King's College London, Division of Imaging Sciences and Biomedical Engineering, London (United Kingdom); Guy's and St Thomas' Hospitals NHS Foundation Trust, The PET Centre, London (United Kingdom); University of Leeds, The Division of Medical Physics, Leeds (United Kingdom); St Thomas' Hospital, Clinical PET Centre, Division of Imaging Sciences and Biomedical Engineering, Kings College London, London (United Kingdom)

    2015-09-15

    Measuring tumour heterogeneity by textural analysis in ¹⁸F-fluorodeoxyglucose positron emission tomography (¹⁸F-FDG PET) provides predictive and prognostic information, but technical aspects of image processing can influence parameter measurements. We therefore tested the effects of image smoothing, segmentation and quantisation on the precision of heterogeneity measurements. Sixty-four ¹⁸F-FDG PET/CT images of oesophageal cancer were processed using different Gaussian smoothing levels (2.0, 2.5, 3.0, 3.5, 4.0 mm), maximum standardised uptake value (SUVmax) segmentation thresholds (45%, 50%, 55%, 60%) and quantisation (8, 16, 32, 64, 128 bin widths). Heterogeneity parameters included grey-level co-occurrence matrix (GLCM), grey-level run length matrix (GLRL), neighbourhood grey-tone difference matrix (NGTDM), grey-level size zone matrix (GLSZM) and fractal analysis methods. The concordance correlation coefficient (CCC) for the three processing variables was calculated for each heterogeneity parameter. Most parameters showed poor agreement between different bin widths (CCC median 0.08, range 0.004-0.99). Segmentation and smoothing showed smaller effects on precision (segmentation: CCC median 0.82, range 0.33-0.97; smoothing: CCC median 0.99, range 0.58-0.99). Smoothing and segmentation have only a small effect on the precision of heterogeneity measurements in ¹⁸F-FDG PET data. However, quantisation often has larger effects, highlighting a need for further evaluation and standardisation of parameters for multicentre studies. (orig.)
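
    The agreement statistic used here, the concordance correlation coefficient, is straightforward to compute from paired parameter values; the sketch below uses invented numbers, not the study's data.

    ```python
    # Concordance correlation coefficient (Lin's CCC): agreement between
    # paired measurements x and y of the same quantity (illustrative data).
    import numpy as np

    def ccc(x, y):
        x, y = np.asarray(x, float), np.asarray(y, float)
        sxy = np.cov(x, y, bias=True)[0, 1]  # population covariance
        return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

    a = np.array([1.0, 2.0, 3.0, 4.0])   # e.g. parameter at one bin width
    b = np.array([1.1, 1.9, 3.2, 3.9])   # same parameter at another
    print(round(ccc(a, b), 3))
    ```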

  4. Critical Steps in Data Analysis for Precision Casimir Force Measurements with Semiconducting Films

    Science.gov (United States)

    Banishev, A. A.; Chang, Chia-Cheng; Mohideen, U.

    2011-06-01

    Some experimental procedures and corresponding results of the precision measurement of the Casimir force between a low-doped indium tin oxide (ITO) film and a gold sphere are described. Measurements were performed using an atomic force microscope in high vacuum. It is shown that the magnitude of the Casimir force decreases after prolonged UV treatment of the ITO film. Some critical data analysis steps, such as the correction for the mechanical drift of the sphere-plate system and photodiodes, are discussed.

  5. Improvement of precision method of spectrophotometry with inner standardization and its use in plutonium solutions analysis

    International Nuclear Information System (INIS)

    Stepanov, A.V.; Stepanov, D.A.; Nikitina, S.A.; Gogoleva, T.D.; Grigor'eva, M.G.; Bulyanitsa, L.S.; Panteleev, Yu.A.; Pevtsova, E.V.; Domkin, V.D.; Pen'kin, M.V.

    2006-01-01

    A precision spectrophotometric method with internal standardization is used for the analysis of pure Pu solutions. The spectrophotometer and the spectrophotometric analysis procedure were improved to decrease the random component of the method's relative error. The influence of U and Np impurities and of corrosion products on the systematic component of the error, and the effect of fluoride ion on the completeness of Pu oxidation during sample preparation, are studied [ru]

  6. Omics AnalySIs System for PRecision Oncology (OASISPRO): A Web-based Omics Analysis Tool for Clinical Phenotype Prediction.

    Science.gov (United States)

    Yu, Kun-Hsing; Fitzpatrick, Michael R; Pappas, Luke; Chan, Warren; Kung, Jessica; Snyder, Michael

    2017-09-12

    Precision oncology is an approach that accounts for individual differences to guide cancer management. Omics signatures have been shown to predict clinical traits for cancer patients. However, the vast amount of omics information poses an informatics challenge in systematically identifying patterns associated with health outcomes, and no general-purpose data-mining tool exists for physicians, medical researchers, and citizen scientists without significant training in programming and bioinformatics. To bridge this gap, we built the Omics AnalySIs System for PRecision Oncology (OASISPRO), a web-based system to mine the quantitative omics information from The Cancer Genome Atlas (TCGA). This system effectively visualizes patients' clinical profiles, executes machine-learning algorithms of choice on the omics data, and evaluates the prediction performance using held-out test sets. With this tool, we successfully identified genes strongly associated with tumor stage, and accurately predicted patients' survival outcomes in many cancer types, including mesothelioma and adrenocortical carcinoma. By identifying the links between omics and clinical phenotypes, this system will facilitate omics studies on precision cancer medicine and contribute to establishing personalized cancer treatment plans. This web-based tool is available at http://tinyurl.com/oasispro; source code is available at http://tinyurl.com/oasisproSourceCode. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
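
    The evaluation pattern the tool applies, fitting a chosen learner on omics features and scoring it on a held-out test set, looks roughly like the sketch below; the data and model choice are stand-ins, not the OASISPRO codebase.

    ```python
    # Generic sketch of held-out evaluation of an omics-based phenotype
    # predictor (synthetic stand-in data; illustrative model choice).
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))             # e.g. 50 expression features
    y = (X[:, 0] + rng.normal(size=200) > 0)   # e.g. binary clinical label

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)
    model = RandomForestClassifier(n_estimators=200,
                                   random_state=0).fit(X_tr, y_tr)
    # Performance is reported on data the model never saw during training.
    print("held-out AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
    ```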

  7. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.; Queiroz Feitosa, R.; van der Meer, F.D.; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  8. 40 CFR 75.41 - Precision criteria.

    Science.gov (United States)

    2010-07-01

    To demonstrate precision equal to or better than the continuous emission monitoring system, the owner or operator shall conduct an F-test, a correlation analysis, and a t-test for bias as… The owner or operator shall conduct the correlation analysis according to the following procedures: (i) Plot each of the paired emissions readings as…

  9. Evaluation of the prediction precision capability of partial least squares regression approach for analysis of high alloy steel by laser induced breakdown spectroscopy

    Science.gov (United States)

    Sarkar, Arnab; Karki, Vijay; Aggarwal, Suresh K.; Maurya, Gulab S.; Kumar, Rohit; Rai, Awadhesh K.; Mao, Xianglei; Russo, Richard E.

    2015-06-01

    Laser induced breakdown spectroscopy (LIBS) was applied to the elemental characterization of high alloy steel using partial least squares regression (PLSR), with the objective of evaluating the analytical performance of this multivariate approach. The optimization of the number of principal components for minimizing error in the PLSR algorithm was investigated. The effect of different pre-treatment procedures applied to the raw spectral data before PLSR analysis was evaluated based on several statistical parameters (standard error of prediction, percentage relative error of prediction, etc.). The pre-treatment with the "NORM" parameter gave the optimum statistical results. The analytical performance of the PLSR model improved by increasing the number of laser pulses accumulated per spectrum as well as by truncating the spectrum to an appropriate wavelength region. It was found that the statistical benefit of truncating the spectrum can also be accomplished by increasing the number of laser pulses per accumulation without spectral truncation. The constituents (Co and Mo) present at hundreds of ppm were determined with a relative precision of 4-9% (2σ), whereas the major constituents Cr and Ni (present at a few percent levels) were determined with a relative precision of ~2% (2σ).
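
    A minimal version of this workflow, normalising each spectrum and then choosing the number of PLS components by cross-validated prediction error, might look like the following; the spectra and concentrations are synthetic stand-ins.

    ```python
    # Sketch of the PLSR workflow: normalise spectra, then pick the number
    # of latent components by cross-validation (synthetic data throughout).
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    spectra = rng.random((60, 300))   # 60 LIBS spectra, 300 channels
    conc = spectra[:, 40] * 3.0 + rng.normal(0, 0.05, 60)  # stand-in labels

    # "NORM"-style pre-treatment: scale each spectrum to unit total intensity.
    spectra = spectra / spectra.sum(axis=1, keepdims=True)

    scores = []
    for n in range(1, 11):
        pls = PLSRegression(n_components=n)
        mse = -cross_val_score(pls, spectra, conc, cv=5,
                               scoring="neg_mean_squared_error").mean()
        scores.append((mse, n))
    print("best n_components:", min(scores)[1])
    ```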

  10. Robotic Observatory System Design-Specification Considerations for Achieving Long-Term Sustainable Precision Performance

    Science.gov (United States)

    Wray, J. D.

    2003-05-01

    The robotic observatory telescope must point precisely on the target object, and then track autonomously to a fraction of the FWHM of the system PSF for durations of ten to twenty minutes or more. It must retain this precision while continuing to function at rates approaching thousands of observations per night for all its years of useful life. These stringent requirements raise new challenges unique to robotic telescope systems design. Critical design considerations are driven by the applicability of the above requirements to all systems of the robotic observatory, including telescope and instrument systems, telescope-dome enclosure systems, combined electrical and electronics systems, environmental (e.g. seeing) control systems and integrated computer control software systems. Traditional telescope design considerations include the effects of differential thermal strain, elastic flexure, plastic flexure and slack or backlash with respect to focal stability, optical alignment and angular pointing and tracking precision. Robotic observatory design must holistically encapsulate these traditional considerations within the overall objective of maximized long-term sustainable precision performance. This overall objective is accomplished through combining appropriate mechanical and dynamical system characteristics with a full-time real-time telescope mount model feedback computer control system. Important design considerations include: identifying and reducing quasi-zero-backlash; increasing size to increase precision; directly encoding axis shaft rotation; pointing and tracking operation via real-time feedback between precision mount model and axis mounted encoders; use of monolithic construction whenever appropriate for sustainable mechanical integrity; accelerating dome motion to eliminate repetitive shock; ducting internal telescope air to outside dome; and the principal design criteria: maximizing elastic repeatability while minimizing slack, plastic deformation

  11. Data quality objectives lessons learned for tank waste characterization

    International Nuclear Information System (INIS)

    Eberlein, S.J.

    1996-01-01

    The tank waste characterization process is an integral part of the overall effort to control the hazards associated with radioactive wastes stored in underground tanks at the Hanford Reservation. The programs involved in the characterization of the wastes are employing the Data Quality Objective (DQO) process in all information and data collection activities. The DQO process is used by the programs to address an issue or problem rather than a specific sampling event. Practical limits do not always allow for precise characterization of a tank or the full implementation of the DQO process. Because of the flexibility of the DQO process, it can be used as a planning tool for sampling and analysis of the underground waste storage tanks. The iterative nature of the DQO process allows it to be used as additional information is obtained or lessons are learned concerning an issue or problem requiring sampling and analysis of tank waste. In addition, the application of the DQO process forces alternative actions to be considered when precise characterization of a tank or the full implementation of the DQO process is not practical.

  12. Longitudinal interfacility precision in single-energy quantitative CT

    International Nuclear Information System (INIS)

    Morin, R.L.; Gray, J.E.; Wahner, H.W.; Weekes, R.G.

    1987-01-01

    The authors investigated the precision of single-energy quantitative CT measurements between two facilities over 3 months. An anthropomorphic phantom with calcium hydroxyapatite inserts (60, 100, and 160 mg/cc) was used with the Cann-Gennant method to measure bone mineral density. The same model of CT scanner, anthropomorphic phantom, quantitative CT standard and analysis package was utilized at each facility. Acquisition and analysis techniques were identical to those used in patient studies. At one facility, 28 measurements yielded an average precision of 6.1% (5.0%-8.5%). The average precision for 39 measurements at the other facility was 4.3% (3.2%-8.1%). Successive scans with phantom repositioning between scans yielded an average precision of about 3% (1%-4% without repositioning). Despite differences in personnel, scanners, standards, and phantoms, the variation between facilities was about 2%, which was within the intrafacility variation of about 5% at each location.
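
    Precision figures like those above are coefficients of variation over repeated measurements; the snippet below computes one from illustrative numbers rather than the study's data.

    ```python
    # Coefficient of variation of repeated bone-mineral-density measurements
    # (the numbers below are invented for illustration).
    import numpy as np

    def cv_percent(measurements):
        m = np.asarray(measurements, float)
        return 100.0 * m.std(ddof=1) / m.mean()

    bmd = [101.2, 99.8, 103.1, 100.4, 98.9]  # mg/cc, repeated scans of one insert
    print(f"precision (CV): {cv_percent(bmd):.1f}%")
    ```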

  13. Diachronic and Synchronic Analysis - the Case of the Indirect Object in Spanish

    DEFF Research Database (Denmark)

    Dam, Lotte; Dam-Jensen, Helle

    2007-01-01

    The article deals with a monograph on the indirect object in Spanish. The book offers a many-faceted analysis of the indirect object, as it, on the one hand, gives a detailed diachronic analysis of what is known as clitic-doubled constructions and, on the other, a synchronic analysis of both…

  14. Comparison of ATLAS tilecal module No. 8 high-precision metrology measurement results obtained by laser (JINR) and photogrammetric (CERN) methods

    International Nuclear Information System (INIS)

    Batusov, V.; Budagov, Yu.; Gayde, J.C.

    2002-01-01

    The high-precision assembly of large experimental set-ups is of principal necessity for the successful execution of the forthcoming LHC research programme in the TeV beams. The creation of an adequate survey and control metrology method is an essential part of the detector construction scenario. This work contains the dimension measurement data for ATLAS hadron calorimeter MODULE No. 8 (6 m, 22 tons), which were obtained by laser and by photogrammetry methods. The comparative data analysis demonstrates agreement of the measurements within ± 70 μm. This means that these two clearly independent methods can be combined, leading to the rise of a new-generation engineering culture: high-precision metrology for the precision assembly of large-scale massive objects.

  15. Comparison of ATLAS Tilecal MODULE No 8 high-precision metrology measurement results obtained by laser (JINR) and photogrammetric (CERN) methods

    CERN Document Server

    Batusov, V; Gayde, J C; Khubua, J I; Lasseur, C; Lyablin, M V; Miralles-Verge, L; Nessi, Marzio; Rusakovitch, N A; Sissakian, A N; Topilin, N D

    2002-01-01

    The high-precision assembly of large experimental set-ups is of principal necessity for the successful execution of the forthcoming LHC research programme in the TeV beams. The creation of an adequate survey and control metrology method is an essential part of the detector construction scenario. This work contains the dimension measurement data for ATLAS hadron calorimeter MODULE No. 8 (6 m, 22 tons), which were obtained by laser and by photogrammetry methods. The comparative data analysis demonstrates agreement of the measurements within ±70 μm. This means that these two clearly independent methods can be combined, leading to the rise of a new-generation engineering culture: high-precision metrology for the precision assembly of large-scale massive objects. (3 refs).

  16. Collision free pick up and movement of large objects

    International Nuclear Information System (INIS)

    Drotning, W.D.; McKee, G.R.

    1998-01-01

    An automated system is described for the sensor-based precision docking and manipulation of large objects. Past work in the remote handling of large nuclear waste containers is extensible to the problems associated with the handling of large objects such as coils of flat steel in industry. Computer vision and ultrasonic proximity sensing as described here are used to control the precision docking of large objects, and swing damped motion control of overhead cranes is used to control the position of the pick up device and suspended payload during movement. Real-time sensor processing and model-based control are used to accurately position payloads

  17. Plasmonic micropillars for precision cell force measurement across a large field-of-view

    Science.gov (United States)

    Xiao, Fan; Wen, Ximiao; Tan, Xing Haw Marvin; Chiou, Pei-Yu

    2018-01-01

    A plasmonic micropillar platform with self-organized gold nanospheres is reported for the precision cell traction force measurement across a large field-of-view (FOV). Gold nanospheres were implanted into the tips of polymer micropillars by annealing gold microdisks with nanosecond laser pulses. Each gold nanosphere is physically anchored in the center of a pillar tip and serves as a strong, point-source-like light scattering center for each micropillar. This allows a micropillar to be clearly observed and precisely tracked even under a low magnification objective lens for the concurrent and precision measurement across a large FOV. A spatial resolution of 30 nm for the pillar deflection measurement has been accomplished on this platform with a 20× objective lens.

  18. Precision siting of a particle accelerator

    International Nuclear Information System (INIS)

    Cintra, Jorge Pimentel

    1996-01-01

    Precise location is a specific survey job that involves highly skilled work to avoid unrecoverable errors during project installation. Depending on the process stage, different specifications can be applied, calling for different instruments: theodolite, measuring tape, distance meter, invar wire. This paper, based on experience obtained during the installation of particle accelerator equipment, deals with the general principles of precise location: tolerance definitions, techniques for increasing accuracy, scheduling of locations, sensitivity analysis, and quality control methods. (author)

  19. Head First Object-Oriented Analysis and Design

    CERN Document Server

    McLaughlin, Brett D; West, David

    2006-01-01

    "Head First Object Oriented Analysis and Design is a refreshing look at subject of OOAD. What sets this book apart is its focus on learning. The authors have made the content of OOAD accessible, usable for the practitioner." Ivar Jacobson, Ivar Jacobson Consulting "I just finished reading HF OOA&D and I loved it! The thing I liked most about this book was its focus on why we do OOA&D-to write great software!" Kyle Brown, Distinguished Engineer, IBM "Hidden behind the funny pictures and crazy fonts is a serious, intelligent, extremely well-crafted presentation of OO Analysis and Design

  20. Precision of INR measured with a patient operated whole blood coagulometer

    DEFF Research Database (Denmark)

    Attermann, Jørn; Andersen, Niels Trolle; Korsgaard, Helle

    2003-01-01

    INTRODUCTION: The objective of the present study was to evaluate the precision of a portable whole blood coagulometer (CoaguChek S) in the hands of self-managing patients on oral anticoagulant therapy (OAT). MATERIALS AND METHODS: Fifteen patients on self-managed OAT performed measurements of INR… and between patients was 15.0% and 14.7%, respectively. CONCLUSION: The precision of CoaguChek S is satisfactory.

  1. Perspectives of precision agriculture in a broader policy context

    DEFF Research Database (Denmark)

    Lind, Kim Martin Hjorth; Pedersen, Søren Marcus

    2017-01-01

    Agriculture is faced with contrasting requirements from the broader society. On the one hand, agriculture needs to expand production to be able to feed a growing global population. Furthermore, the developing bio-economy requires agriculture to produce for a range of non-food objectives such as bio… that, in a precise and targeted approach, reduce resource use and increase yield. Furthermore, the growing demand for higher-value food products in terms of health and quality requires traceability and information about production processes and resource use, which also corresponds with the possibilities offered by precision agriculture technology. The general movement towards higher integration in food supply chains is a natural extension of the requirements for traceability and product information, which are integral parts of precision agriculture.

  2. BIG DATA ANALYTICS AND PRECISION ANIMAL AGRICULTURE SYMPOSIUM: Machine learning and data mining advance predictive big data analysis in precision animal agriculture.

    Science.gov (United States)

    Morota, Gota; Ventura, Ricardo V; Silva, Fabyano F; Koyama, Masanori; Fernando, Samodha C

    2018-04-14

    Precision animal agriculture is poised to rise to prominence in the livestock enterprise in the domains of management, production, welfare, sustainability, health surveillance, and environmental footprint. Considerable progress has been made in the use of tools to routinely monitor and collect information from animals and farms in a less laborious manner than before. These efforts have enabled the animal sciences to embark on information technology-driven discoveries to improve animal agriculture. However, the growing amount and complexity of data generated by fully automated, high-throughput data recording or phenotyping platforms, including digital images, sensor and sound data, unmanned systems, and information obtained from real-time noninvasive computer vision, pose challenges to the successful implementation of precision animal agriculture. The emerging fields of machine learning and data mining are expected to be instrumental in helping meet the daunting challenges facing global agriculture. Yet, their impact and potential in "big data" analysis have not been adequately appreciated in the animal science community, where this recognition has remained only fragmentary. To address such knowledge gaps, this article outlines a framework for machine learning and data mining and offers a glimpse into how they can be applied to solve pressing problems in animal sciences.

  3. Using beta-binomial regression for high-precision differential methylation analysis in multifactor whole-genome bisulfite sequencing experiments

    Science.gov (United States)

    2014-01-01

    Background Whole-genome bisulfite sequencing currently provides the highest-precision view of the epigenome, with quantitative information about populations of cells down to single nucleotide resolution. Several studies have demonstrated the value of this precision: meaningful features that correlate strongly with biological functions can be found associated with only a few CpG sites. Understanding the role of DNA methylation, and more broadly the role of DNA accessibility, requires that methylation differences between populations of cells are identified with extreme precision and in complex experimental designs. Results In this work we investigated the use of beta-binomial regression as a general approach for modeling whole-genome bisulfite data to identify differentially methylated sites and genomic intervals. Conclusions The regression-based analysis can handle medium- and large-scale experiments where it becomes critical to accurately model variation in methylation levels between replicates and account for influence of various experimental factors like cell types or batch effects. PMID:24962134
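
    As a sketch of the core model, counts at a single CpG site can be fitted by maximum likelihood under a beta-binomial parameterised by a mean methylation level and an overdispersion. The toy code below does this for invented counts; a full differential analysis would additionally regress the mean on group labels and other experimental factors.

    ```python
    # Beta-binomial fit at one CpG site: methylated reads k out of coverage
    # n, with mean mu and overdispersion phi (toy counts, not real data).
    import numpy as np
    from scipy.special import betaln
    from scipy.optimize import minimize

    k = np.array([8, 12, 7, 30, 28, 33])     # methylated reads per replicate
    n = np.array([20, 25, 18, 40, 35, 45])   # total reads per replicate

    def neg_loglik(params):
        mu, phi = params
        a, b = mu * phi, (1 - mu) * phi      # beta parameters
        # Beta-binomial log-likelihood up to a constant binomial coefficient.
        return -np.sum(betaln(k + a, n - k + b) - betaln(a, b))

    res = minimize(neg_loglik, x0=[0.5, 10.0],
                   bounds=[(1e-4, 1 - 1e-4), (1e-2, 1e4)])
    print("estimated methylation level:", round(res.x[0], 3))
    ```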

  4. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  5. Precision surveying system for PEP

    International Nuclear Information System (INIS)

    Gunn, J.; Lauritzen, T.; Sah, R.; Pellisier, P.F.

    1977-01-01

    A semi-automatic precision surveying system is being developed for PEP. Reference elevations for vertical alignment will be provided by a liquid level. The short range surveying will be accomplished using a Laser Surveying System featuring automatic data acquisition and analysis

  6. Capacity and precision in an animal model of visual short-term memory.

    Science.gov (United States)

    Lara, Antonio H; Wallis, Jonathan D

    2012-03-14

    Temporary storage of information in visual short-term memory (VSTM) is a key component of many complex cognitive abilities. However, it is highly limited in capacity. Understanding the neurophysiological nature of this capacity limit will require a valid animal model of VSTM. We used a multiple-item color change detection task to measure macaque monkeys' VSTM capacity. Subjects' performance deteriorated and reaction times increased as a function of the number of items in memory. Additionally, we measured the precision of the memory representations by varying the distance between sample and test colors. In trials with similar sample and test colors, subjects made more errors compared to trials with highly discriminable colors. We modeled the error distribution as a Gaussian function and used this to estimate the precision of VSTM representations. We found that as the number of items in memory increases, the precision of the representations decreases dramatically. Additionally, we found that focusing attention on one of the objects increases the precision with which that object is stored and degrades the precision of the remaining items. These results are in line with recent findings in human psychophysics and provide a solid foundation for understanding the neurophysiological nature of the capacity limit of VSTM.
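
    The precision estimate described, fitting a Gaussian to the distribution of memory errors and reading off its width, can be sketched as follows on simulated errors; a narrower fitted SD corresponds to higher precision.

    ```python
    # Fit a Gaussian to simulated memory errors and report its width;
    # precision is taken as the inverse of the fitted SD (toy data).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(2)
    errors_1_item = rng.normal(0, 12, 500)   # colour error (deg), set size 1
    errors_4_items = rng.normal(0, 28, 500)  # set size 4: broader spread

    for label, errs in [("1 item", errors_1_item),
                        ("4 items", errors_4_items)]:
        mu, sd = norm.fit(errs)              # maximum-likelihood Gaussian fit
        print(f"{label}: SD = {sd:.1f} deg, precision = {1 / sd:.3f}")
    ```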

  7. The forthcoming era of precision medicine.

    Science.gov (United States)

    Gamulin, Stjepan

    2016-11-01

    The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients' groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism ("big data"), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach. Copyright © 2016 by Academy of Sciences and Arts of Bosnia and Herzegovina.

  8. The forthcoming era of precision medicine

    Directory of Open Access Journals (Sweden)

    Stjepan Gamulin

    2016-11-01

    Full Text Available Abstract. The aim of this essay is to present the definition and principles of personalized or precision medicine, the perspective and barriers to its development and clinical application. The implementation of precision medicine in health care requires the coordinated efforts of all health care stakeholders (the biomedical community, government, regulatory bodies, patients’ groups). Particularly, translational research with the integration of genomic and comprehensive data from all levels of the organism (“big data”), development of bioinformatics platforms enabling network analysis of disease etiopathogenesis, development of a legislative framework for handling personal data, and new paradigms of medical education are necessary for successful application of the concept of precision medicine in health care. Conclusion. In the present and future era of precision medicine, the collaboration of all participants in health care is necessary for its realization, resulting in improvement of diagnosis, prevention and therapy, based on a holistic, individually tailored approach.

  9. Creating Objects and Object Categories for Studying Perception and Perceptual Learning

    Science.gov (United States)

    Hauffen, Karin; Bart, Eugene; Brady, Mark; Kersten, Daniel; Hegdé, Jay

    2012-01-01

    In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties [1]. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties [2]. Many innovative and useful methods currently exist for creating novel objects and object categories [3-6] (also see refs. [7,8]). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter [5,9,10], and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects [11-13]. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis [14]. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection [9,12,13]. Objects and object categories created…

  10. Objective Bayesian analysis of neutrino masses and hierarchy

    Science.gov (United States)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still impact noticeably the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data is hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.
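
    For readers unused to odds, posterior odds of 5.1:1 correspond to a posterior probability of roughly 0.84, as the conversion below shows, which is why the evidence is described as inconclusive.

    ```python
    # Convert posterior odds to a posterior probability: p = odds / (1 + odds).
    def posterior_probability(odds):
        return odds / (1.0 + odds)

    print(round(posterior_probability(5.1), 3))  # -> 0.836
    ```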

  11. Geometrical accuracy of metallic objects produced with additive or subtractive manufacturing: A comparative in vitro study.

    Science.gov (United States)

    Braian, Michael; Jönsson, David; Kevci, Mir; Wennerberg, Ann

    2018-04-06

    To evaluate the accuracy and precision of objects produced by additive manufacturing (AM) systems for use in dentistry, and to compare them with subtractive manufacturing (SM) systems. Ten specimens of two geometrical objects were produced by five different AM machines and one SM machine. Object A mimics an inlay-shaped object, while object B imitates a four-unit bridge model. All the objects were measured in different dimensions (x, y, z): linear distances, angles and corner radii. None of the additive or subtractive manufacturing groups presented a perfect match to the CAD file with regard to all parameters included in the present study. Considering linear measurements, the precision of the subtractive manufacturing group was consistent in all axes for object A; the additive manufacturing groups had consistent precision in the x-axis and y-axis but not in the z-axis. With regard to corner radius measurements, the SM group had the best overall accuracy and precision for both objects A and B when compared to the AM groups. Within the limitations of this in vitro study, the conclusion can be made that subtractive manufacturing presented overall precision on all measurements below 0.050 mm. The AM machines also presented fairly good precision; as additive techniques are now being implemented, all these production techniques need to be tested, compared and validated. Copyright © 2018 The Academy of Dental Materials. Published by Elsevier Inc. All rights reserved.
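
    The accuracy/precision distinction used here can be made concrete: for each machine, accuracy is the mean deviation of a measured dimension from its CAD value, and precision is the spread of those deviations. The numbers below are invented for illustration.

    ```python
    # Accuracy (mean deviation from CAD) vs precision (spread of deviations)
    # for repeated measurements of one dimension; toy numbers only.
    import numpy as np

    cad_value = 10.000  # mm, nominal dimension from the CAD file
    measurements = {
        "SM machine": np.array([10.012, 10.008, 10.011, 10.009]),
        "AM machine": np.array([10.051, 9.968, 10.043, 9.975]),
    }
    for name, m in measurements.items():
        dev = m - cad_value
        print(f"{name}: accuracy = {dev.mean():+.3f} mm, "
              f"precision (SD) = {dev.std(ddof=1):.3f} mm")
    ```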

  12. Precision Oncology: Between Vaguely Right and Precisely Wrong.

    Science.gov (United States)

    Brock, Amy; Huang, Sui

    2017-12-01

    Precision Oncology seeks to identify and target the mutation that drives a tumor. Despite its straightforward rationale, concerns about its effectiveness are mounting. What is the biological explanation for the "imprecision?" First, Precision Oncology relies on indiscriminate sequencing of genomes in biopsies that barely represent the heterogeneous mix of tumor cells. Second, findings that defy the orthodoxy of oncogenic "driver mutations" are now accumulating: the ubiquitous presence of oncogenic mutations in silent premalignancies or the dynamic switching without mutations between various cell phenotypes that promote progression. Most troublesome is the observation that cancer cells that survive treatment still will have suffered cytotoxic stress and thereby enter a stem cell-like state, the seeds for recurrence. The benefit of "precision targeting" of mutations is inherently limited by this counterproductive effect. These findings confirm that there is no precise linear causal relationship between tumor genotype and phenotype, a reminder of logician Carveth Read's caution that being vaguely right may be preferable to being precisely wrong. An open-minded embrace of the latest inconvenient findings indicating nongenetic and "imprecise" phenotype dynamics of tumors as summarized in this review will be paramount if Precision Oncology is ultimately to lead to clinical benefits. Cancer Res; 77(23); 6473-9. ©2017 American Association for Cancer Research.

  13. Featureous: infrastructure for feature-centric analysis of object-oriented software

    DEFF Research Database (Denmark)

    Olszak, Andrzej; Jørgensen, Bo Nørregaard

    2010-01-01

    The decentralized nature of collaborations between objects in object-oriented software makes it difficult to understand how user-observable program features are implemented and how their implementations relate to each other. It is worthwhile to improve this situation, since feature-centric program understanding and modification are essential during software evolution and maintenance. In this paper, we present an infrastructure built on top of the NetBeans IDE called Featureous that allows for rapid construction of tools for feature-centric analysis of object-oriented software. Our infrastructure encompasses a lightweight feature location mechanism, a number of analytical views and an API allowing for the addition of third-party extensions. To form a common conceptual framework for future feature-centric extensions, we propose to structure feature-centric analysis along three dimensions: perspective…

  14. Data analysis in an object request broker environment

    International Nuclear Information System (INIS)

    Malon, David M.; May, Edward N.; Grossman, Robert L.; Day, Christopher T.; Quarrie, David R.

    1996-01-01

    Computing for the Next Millennium will require software interoperability in heterogeneous, increasingly object-oriented environments. The Common Object Request Broker Architecture (CORBA) is a software industry effort, under the aegis of the Object Management Group (OMG), to standardize mechanisms for software interaction among disparate applications written in a variety of languages and running on a variety of distributed platforms. In this paper, we describe some of the design and performance implications for software that must function in such a brokered environment in a standards-compliant way. We illustrate these implications with a physics data analysis example as a case study. (author)

  15. Effects of grasp compatibility on long-term memory for objects.

    Science.gov (United States)

    Canits, Ivonne; Pecher, Diane; Zeelenberg, René

    2018-01-01

    Previous studies have shown action potentiation during conceptual processing of manipulable objects. In four experiments, we investigated whether these motor actions also play a role in long-term memory. Participants categorized objects that afforded either a power grasp or a precision grasp as natural or artifact by grasping cylinders with either a power grasp or a precision grasp. In all experiments, responses were faster when the affordance of the object was compatible with the type of grasp response. However, subsequent free recall and recognition memory tasks revealed no better memory for object pictures and object names for which the grasp affordance was compatible with the grasp response. The present results therefore do not support the hypothesis that motor actions play a role in long-term memory. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Precision of MRI-based body composition measurements of postmenopausal women

    Science.gov (United States)

    Romu, Thobias; Thorell, Sofia; Lindblom, Hanna; Berin, Emilia; Holm, Anna-Clara Spetz; Åstrand, Lotta Lindh; Karlsson, Anette; Borga, Magnus; Hammar, Mats; Leinhard, Olof Dahlqvist

    2018-01-01

    Objectives To determine the precision of magnetic resonance imaging (MRI) based fat and muscle quantification in a group of postmenopausal women. Furthermore, to extend the method to individual muscles relevant to upper-body exercise. Materials and methods This was a sub-study of a randomized controlled trial investigating effects of resistance training to decrease hot flushes in postmenopausal women. Thirty-six women were included, mean age 56 ± 6 years. Each subject was scanned twice with a 3.0T MR-scanner using a whole-body Dixon protocol. Water and fat images were calculated using a 6-peak lipid model including R2*-correction. Body composition analyses were performed to measure visceral and subcutaneous fat volumes, lean volumes and muscle fat infiltration (MFI) of the muscle groups (thigh, lower leg, and abdominal muscles), as well as of the three individual muscles pectoralis, latissimus, and rhomboideus. Analysis was performed using a multi-atlas, calibrated water-fat separated quantification method. Liver fat was measured as the average proton density fat-fraction (PDFF) of three regions-of-interest. Precision was determined with Bland-Altman analysis, repeatability, and coefficient of variation. Results All of the 36 included women were successfully scanned and analysed. The coefficient of variation was 1.1% to 1.5% for abdominal fat compartments (visceral and subcutaneous), 0.8% to 1.9% for volumes of muscle groups (thigh, lower leg, and abdomen), and 2.3% to 7.0% for individual muscle volumes (pectoralis, latissimus, and rhomboideus). Limits of agreement for MFI were within ± 2.06% for muscle groups and within ± 5.13% for individual muscles. The limits of agreement for liver PDFF were within ± 1.9%. Conclusion Whole-body Dixon MRI could characterize a range of different fat and muscle compartments with high precision, including individual muscles, in the study group of postmenopausal women. The inclusion of individual muscles, calculated from the…
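
    Precision statistics of the kind reported here, bias, Bland-Altman limits of agreement and coefficient of variation, can be computed from two repeated scans per subject as sketched below, with synthetic volumes standing in for the MRI measurements.

    ```python
    # Test-retest precision from two scans per subject: bias, Bland-Altman
    # limits of agreement, and coefficient of variation (synthetic data).
    import numpy as np

    rng = np.random.default_rng(3)
    scan1 = rng.normal(4.0, 0.5, 36)         # e.g. visceral fat volume, litres
    scan2 = scan1 + rng.normal(0, 0.05, 36)  # repeated scan of each subject

    diff = scan2 - scan1
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)            # Bland-Altman limits of agreement
    # Within-subject SD from paired differences is SD(diff) / sqrt(2).
    cv = 100 * (diff.std(ddof=1) / np.sqrt(2)) / np.mean((scan1 + scan2) / 2)
    print(f"bias {bias:+.3f} L, limits of agreement ±{loa:.3f} L, CV {cv:.1f}%")
    ```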

  17. Ion chromatography for the precise analysis of chloride and sodium in sweat for the diagnosis of cystic fibrosis.

    Science.gov (United States)

    Doorn, J; Storteboom, T T R; Mulder, A M; de Jong, W H A; Rottier, B L; Kema, I P

    2015-07-01

    Measurement of chloride in sweat is an essential part of the diagnostic algorithm for cystic fibrosis. The lack of sensitivity and reproducibility of current methods led us to develop an ion chromatography/high-performance liquid chromatography (IC/HPLC) method suitable for the analysis of both chloride and sodium in small volumes of sweat. Precision, linearity and limit of detection of the in-house developed IC/HPLC method were established. Method comparison between the newly developed IC/HPLC method and the traditional Chlorocounter was performed, and trueness was determined using Passing-Bablok method comparison with external quality assurance material (Royal College of Pathologists of Australasia). Precision and linearity fulfil the criteria established by UK guidelines and are comparable with inductively coupled plasma-mass spectrometry methods. Passing-Bablok analysis demonstrated excellent correlation between IC/HPLC measurements and external quality assessment target values, for both chloride and sodium. With a limit of quantitation of 0.95 mmol/L, our method is suitable for the analysis of small amounts of sweat and can thus be used in combination with the Macroduct collection system. Although a chromatographic application results in a somewhat more expensive test compared to a Chlorocounter test, more accurate measurements are achieved. In addition, simultaneous measurement of sodium concentrations will result in better detection of false positives, less test repeating and thus faster, more accurate and more effective diagnosis. The described IC/HPLC method, therefore, provides a precise, relatively cheap and easy-to-handle application for the analysis of both chloride and sodium in sweat. © The Author(s) 2014. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
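
    The method comparison relies on Passing-Bablok regression; the sketch below computes a simplified robust (Theil-Sen-style) slope and intercept in that spirit on toy chloride values. A full Passing-Bablok implementation additionally applies its offset correction and derives confidence bands.

    ```python
    # Simplified robust method comparison: median of pairwise slopes
    # (Theil-Sen-style), in the spirit of Passing-Bablok. Toy data only.
    import numpy as np
    from itertools import combinations

    def robust_fit(x, y):
        slopes = [(y[j] - y[i]) / (x[j] - x[i])
                  for i, j in combinations(range(len(x)), 2)
                  if x[j] != x[i]]
        slope = np.median(slopes)
        intercept = np.median(y - slope * x)
        return slope, intercept

    chloride_ref = np.array([21.0, 34.0, 50.0, 60.0, 79.0, 104.0])  # mmol/L
    chloride_new = np.array([20.0, 35.0, 48.0, 61.0, 80.0, 102.0])  # mmol/L
    slope, intercept = robust_fit(chloride_ref, chloride_new)
    print(f"slope {slope:.3f}, intercept {intercept:+.2f} mmol/L")
    ```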

  18. Non-precision approach in manual mode

    Directory of Open Access Journals (Sweden)

    М. В. Коршунов

    2013-07-01

    Full Text Available This paper considers a method for the non-precision approach of an aircraft in manual mode with a constant flight-path angle. The advantage of this method is that constructing the approach with a constant flight-path angle provides a stable flight path. A detailed analysis of the feasibility of an approach flown by the above-mentioned method is also presented. The conclusions contain recommendations regarding the use of the described method of non-precision approach during training flights.

  19. Developing web-based data analysis tools for precision farming using R and Shiny

    Science.gov (United States)

    Jahanshiri, Ebrahim; Mohd Shariff, Abdul Rashid

    2014-06-01

    Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data is becoming increasingly cheap to collect and maintain. The bulk of the data collected by sensors and samples needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that can assist farmers and managers. However, until recently, analysing data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that can provide real-time web-based data analysis. In this paper, the development of a prototype web-based application for data analysis using new facilities in the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated, and future directions in web application development for agricultural data are discussed.
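
    As an illustration of the kind of prototype discussed here, a minimal Shiny app that accepts a CSV of field samples and plots a histogram of a chosen column; this is a generic sketch, not the authors' application:

      library(shiny)

      ui <- fluidPage(
        fileInput("file", "Upload field data (CSV)"),
        selectInput("col", "Variable", choices = NULL),
        plotOutput("hist")
      )

      server <- function(input, output, session) {
        dat <- reactive({
          req(input$file)                      # wait for an uploaded file
          read.csv(input$file$datapath)
        })
        # Populate the variable selector from the uploaded columns
        observe(updateSelectInput(session, "col", choices = names(dat())))
        output$hist <- renderPlot({
          req(input$col)
          hist(dat()[[input$col]], main = input$col, xlab = input$col)
        })
      }

      shinyApp(ui, server)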

  20. Developing web-based data analysis tools for precision farming using R and Shiny

    International Nuclear Information System (INIS)

    Jahanshiri, Ebrahim; Shariff, Abdul Rashid Mohd

    2014-01-01

    Technologies that are set to increase the productivity of agricultural practices require more and more data. At the same time, farming data is becoming increasingly cheap to collect and maintain. The bulk of the data collected by sensors and samples needs to be analysed in an efficient and transparent manner. Web technologies have long been used to develop applications that can assist farmers and managers. However, until recently, analysing data in an online environment has not been an easy task, especially in the eyes of data analysts. This barrier is now overcome by the availability of new application programming interfaces that can provide real-time web-based data analysis. In this paper, the development of a prototype web-based application for data analysis using new facilities in the R statistical package and its web development framework, Shiny, is explored. The pros and cons of this type of data analysis environment for precision farming are enumerated, and future directions in web application development for agricultural data are discussed.

  1. Sensor fusion for active vibration isolation in precision equipment

    NARCIS (Netherlands)

    Tjepkema, D.; van Dijk, Johannes; Soemers, Herman

    2012-01-01

    Sensor fusion is a promising control strategy to improve the performance of active vibration isolation systems that are used in precision equipment. Normally, those vibration isolation systems are only capable of realizing a low transmissibility. Additional objectives are to increase the damping

  2. Data quality objectives lessons learned for tank waste characterization

    International Nuclear Information System (INIS)

    Eberlein, S.J.; Banning, D.L.

    1996-01-01

    The tank waste characterization process is an integral part of the overall effort to control the hazards associated with radioactive wastes stored in underground tanks at the Hanford Reservation. The programs involved in the characterization of the waste are employing the Data Quality Objective (DQO) process in all information and data collection activities. The DQO process is used by the programs to address an issue or problem rather than a specific sampling event. Practical limits (e.g., limited number and location of sampling points) do not always allow for precise characterization of a tank or the full implementation of the DQO process. Because of the flexibility of the DQO process, it can be used as a planning tool for sampling and analysis of the underground waste storage tanks. The iterative nature of the DQO process allows it to be used as additional information is obtained or "lessons are learned" concerning an issue or problem requiring sampling and analysis of tank waste. In addition, the application of the DQO process forces alternative actions to be considered when precise characterization of a tank or the full implementation of the DQO process is not practical.

  3. From Pixels to Geographic Objects in Remote Sensing Image Analysis

    NARCIS (Netherlands)

    Addink, E.A.; Van Coillie, Frieke M.B.; Jong, Steven M. de

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received

  4. High-Precision Computation and Mathematical Physics

    International Nuclear Information System (INIS)

    Bailey, David H.; Borwein, Jonathan M.

    2008-01-01

    At the present time, IEEE 64-bit floating-point arithmetic is sufficiently accurate for most scientific applications. However, for a rapidly growing body of important scientific computing applications, a higher level of numeric precision is required. Such calculations are facilitated by high-precision software packages that include high-level language translation modules to minimize the conversion effort. This paper presents a survey of recent applications of these techniques and provides some analysis of their numerical requirements. These applications include supernova simulations, climate modeling, planetary orbit calculations, Coulomb n-body atomic systems, scattering amplitudes of quarks, gluons and bosons, nonlinear oscillator theory, Ising theory, quantum field theory and experimental mathematics. We conclude that high-precision arithmetic facilities are now an indispensable component of a modern large-scale scientific computing environment.
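
    As an illustration of the kind of facility surveyed here, arbitrary-precision arithmetic is available in R through the Rmpfr package (an interface to the MPFR multiple-precision library); this is a minimal sketch and not one of the specific packages discussed in the paper:

      library(Rmpfr)

      # 120 bits of precision (roughly 36 decimal digits)
      x <- mpfr(2, precBits = 120)
      sqrt(x)                      # sqrt(2) to ~36 digits

      # Built-in constants at arbitrary precision
      Const("pi", prec = 200)      # pi to ~60 digits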

  5. Use of objective analysis to estimate winter temperature and ...

    Indian Academy of Sciences (India)

    In the complex terrain of Himalaya, nonavailability of snow and meteorological data of the remote locations ... Precipitation intensity; spatial interpolation; objective analysis. J. Earth Syst. ... This technique needs historical database and unable ...

  6. Accuracy and Precision in Elemental Analysis of Environmental Samples using Inductively Coupled Plasma-Atomic Emission Spectrometry

    International Nuclear Information System (INIS)

    Quraishi, Shamsad Begum; Chung, Yong-Sam; Choi, Kwang Soon

    2005-01-01

    Inductively coupled plasma-atomic emission spectrometry (ICP-AES), preceded by microwave digestion, was performed on different environmental Certified Reference Materials (CRMs). Analytical results show that accuracy and precision in the ICP-AES analysis were acceptable and satisfactory in the case of the soil and hair CRM samples. The relative error of most of the elements in these two CRMs is within 10%, with few exceptions, and the coefficient of variation is also less than 10%. The z-score, as a measure of analytical performance, was also within the acceptable range (±2). ICP-AES was found to be an inadequate method for the air filter CRM due to incomplete dissolution, low concentration of elements, and the very low mass of the sample. However, real air filter samples could be analyzed with high accuracy and precision by increasing the sample mass during collection. (author)
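
    A minimal sketch in R of the three performance measures reported here (relative error, coefficient of variation, and z-score), using hypothetical replicate results for one element; note that proficiency schemes usually compute the z-score against an assigned standard deviation, whereas this sketch uses the replicate SD:

      certified <- 52.0                      # hypothetical certified value (mg/kg)
      measured  <- c(50.8, 53.1, 51.6, 52.9, 51.2)

      rel_error <- (mean(measured) - certified) / certified * 100
      cv        <- sd(measured) / mean(measured) * 100
      z_score   <- (mean(measured) - certified) / sd(measured)

      cat(sprintf("relative error %.1f%%, CV %.1f%%, z-score %.2f\n",
                  rel_error, cv, z_score))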

  7. Fast grasping of unknown objects using principal component analysis

    Science.gov (United States)

    Lei, Qujiang; Chen, Guangming; Wisse, Martijn

    2017-09-01

    Fast grasping of unknown objects has a crucial impact on the efficiency of robot manipulation, especially in unfamiliar environments. To accelerate the grasping of unknown objects, principal component analysis is utilized to direct the grasping process. In particular, a single-view partial point cloud is constructed and grasp candidates are allocated along the principal axis. Force balance optimization is employed to analyze possible graspable areas. The obtained graspable area with the minimal resultant force is the best zone for the final grasping execution. It is shown that an unknown object can be grasped more quickly provided that the principal axis is determined from the single-view partial point cloud. To cope with grasp uncertainty, robot motion is used to obtain a new viewpoint. Virtual exploration and experimental tests are carried out to verify this fast grasping algorithm. Both simulation and experimental tests demonstrated excellent performance in grasping a series of unknown objects. To minimize grasping uncertainty, the robot hardware's two 3D cameras can be utilized to complete the partial point cloud. As a result of utilizing the robot hardware, grasping reliability is greatly enhanced. Therefore, this research demonstrates practical significance for increasing grasping speed and thus robot efficiency in unpredictable environments.
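
    A minimal sketch in R of the core idea: the principal axis of a partial point cloud is the first principal component of its 3-D coordinates. The point cloud is simulated, and the paper's force-balance optimization and viewpoint selection are not reproduced here:

      set.seed(1)
      n <- 500
      axis_dir <- c(0.8, 0.5, 0.33)                  # unknown object axis
      s   <- runif(n, -1, 1)                         # positions along the axis
      pts <- s %o% axis_dir + matrix(rnorm(3 * n, sd = 0.05), n, 3)

      pca  <- prcomp(pts, center = TRUE)
      axis <- pca$rotation[, 1]                      # estimated principal axis
      axis / sqrt(sum(axis^2))                       # unit-length direction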

  8. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Directory of Open Access Journals (Sweden)

    Taylan Ocalan

    Full Text Available Nowadays, the Global Positioning System (GPS) is used effectively in several engineering applications for survey purposes by multiple disciplines. Web-based online services developed by several organizations, which are user-friendly, unlimited, and in most cases free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. When centimeter (cm) or decimeter (dm) level accuracies are desired, they can be obtained easily through these services for engineering applications with different quality requirements. In this paper, a test study was conducted on the ISKI-CORS network, Istanbul, Turkey, in order to carry out an accuracy analysis of the most widely used web-based online services around the world (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS, and magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated using both the online services and the Bernese 5.0 scientific GPS processing software from a 24-hour GPS data set, and then the coordinate differences between the online services and the Bernese processing software were computed. From the evaluations, it was seen that each individual difference was less than 10 mm for the relative online services and less than 20 mm for the precise point positioning services. The accuracy analysis was based on these coordinate differences and the standard deviations of the coordinates obtained from the different techniques, and the online services were then compared to each other. The results show that the position accuracies obtained by the associated online services provide highly accurate solutions that may be used in many engineering applications and geodetic analyses.

  9. Precision medicine: In need of guidance and surveillance.

    Science.gov (United States)

    Lin, Jian-Zhen; Long, Jun-Yu; Wang, An-Qiang; Zheng, Ying; Zhao, Hai-Tao

    2017-07-28

    Precision medicine, currently a hotspot in mainstream medicine, has been strongly promoted in recent years. With rapid technological development, such as next-generation sequencing, and fierce competition in molecular targeted drug development, precision medicine represents an advance in science and technology; it also fulfills needs in public health care. The clinical translation and application of precision medicine - especially in the prevention and treatment of tumors - is far from satisfactory; however, the aims of precision medicine deserve approval. Thus, this medical approach is currently in its infancy; it has promising prospects, but it needs to overcome a number of problems and deficiencies. It is expected that in addition to conventional symptoms and signs, precision medicine will define disease in terms of the underlying molecular characteristics and other environmental susceptibility factors. Those expectations should be realized by constructing a novel data network, integrating clinical data from individual patients and their personal genomic background with existing research on the molecular makeup of diseases. In addition, multi-omics analysis and multi-discipline collaboration will become crucial elements in precision medicine. Precision medicine deserves strong support, and its development demands directed momentum. We propose three kinds of impetus (research, application, and collaboration impetus) for such directed momentum toward promoting precision medicine and accelerating its clinical translation and application.

  10. Precision Joining Center

    Science.gov (United States)

    Powell, J. W.; Westphal, D. A.

    1991-08-01

    A workshop to obtain input from industry on the establishment of the Precision Joining Center (PJC) was held on July 10-12, 1991. The PJC is a center for training Joining Technologists in advanced joining techniques and concepts in order to promote the competitiveness of U.S. industry. The center will be established as part of the DOE Defense Programs Technology Commercialization Initiative, and operated by EG&G Rocky Flats in cooperation with the American Welding Society and the Colorado School of Mines Center for Welding and Joining Research. The overall objectives of the workshop were to validate the need for Joining Technologists to fill the gap between the welding operator and the welding engineer, and to assure that the PJC will train individuals to satisfy that need. The consensus of the workshop participants was that the Joining Technologist is a necessary position in industry, and is currently used, with some variation, by many companies. It was agreed that the PJC core curriculum, as presented, would produce a Joining Technologist of value to industries that use precision joining techniques. The advantage of the PJC would be to train the Joining Technologist much more quickly and more completely. The proposed emphasis of the PJC curriculum on equipment-intensive and hands-on training was judged to be essential.

  11. The Future of Precision Medicine in Oncology.

    Science.gov (United States)

    Millner, Lori M; Strotman, Lindsay N

    2016-09-01

    Precision medicine in oncology focuses on identifying which therapies are most effective for each patient based on genetic characterization of the cancer. Traditional chemotherapy is cytotoxic and destroys all cells that are rapidly dividing. The foundation of precision medicine is targeted therapies and selecting patients who will benefit most from these therapies. One of the newest aspects of precision medicine is liquid biopsy. A liquid biopsy includes analysis of circulating tumor cells, cell-free nucleic acid, or exosomes obtained from a peripheral blood draw. These can be studied individually or in combination and collected serially, providing real-time information as a patient's cancer changes. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. The High Road to Astronomical Photometric Precision : Differential Photometry

    NARCIS (Netherlands)

    Milone, E. F.; Pel, Jan Willem

    2011-01-01

    Differential photometry offers the most precise method for measuring the brightness of astronomical objects. We attempt to demonstrate why this should be the case, and then describe how well it has been done through a review of the application of differential techniques from the earliest visual

  13. How Objective a Neutral Word Is? A Neutrosophic Approach for the Objectivity Degrees of Neutral Words

    Directory of Open Access Journals (Sweden)

    Mihaela Colhon

    2017-11-01

    Full Text Available In the latest studies concerning the sentiment polarity of words, the authors mostly consider the positive and negative constructions, without paying too much attention to the neutral words, which can in fact have significant sentiment degrees. More precisely, not all neutral words have zero positivity or negativity scores, some of them having quite important nonzero scores for these polarities. At this moment, in the literature, a word is considered neutral if its positive and negative scores are equal, which implies two possibilities: (1) zero positive and negative scores; (2) nonzero but equal positive and negative scores. It is obvious that these cases represent two different categories of neutral words that must be treated separately by a sentiment analysis task. In this paper, we present a comprehensive study of the neutral words of English, developed with the aid of SentiWordNet 3.0, the publicly available lexical resource for opinion mining. We designed our study to provide an accurate classification of the so-called "neutral words", described in terms of sentiment scores and using measures from neutrosophy theory. The intended scope is to fill the gap concerning the neutrality aspect by giving precise measurements for the words' objectivity.
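
    A minimal sketch in R of the two-way split described here, using hypothetical SentiWordNet-style scores (the words and values are illustrative, not taken from the lexicon):

      words <- data.frame(
        word = c("table", "change", "balance", "shift"),
        pos  = c(0.000, 0.125, 0.250, 0.000),
        neg  = c(0.000, 0.125, 0.250, 0.000)
      )

      # A word is "neutral" when its positive and negative scores are equal;
      # split that set into the two categories discussed in the paper.
      neutral <- subset(words, pos == neg)
      neutral$type <- ifelse(neutral$pos == 0, "zero-neutral", "balanced-neutral")
      neutral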

  14. Development of High Precision Tsunami Runup Calculation Method Coupled with Structure Analysis

    Science.gov (United States)

    Arikawa, Taro; Seki, Katsumi; Chida, Yu; Takagawa, Tomohiro; Shimosako, Kenichiro

    2017-04-01

    The 2011 Great East Japan Earthquake (GEJE) has shown that tsunami disasters are not limited to inundation damage in a specified region, but may destroy a wide area, causing a major disaster. Evaluating standing land structures and damage to them requires highly precise evaluation of three-dimensional fluid motion - an expensive process. Our research goals were thus to develop a coupled system of STOC-CADMAS (Arikawa and Tomita, 2016) and structural analysis (Arikawa et al., 2009) to efficiently calculate all stages from tsunami source to runup, including the deformation of structures, and to verify its applicability. We also investigated the stability of breakwaters at Kamaishi Bay. Fig. 1 shows the whole calculation system. The STOC-ML simulator approximates pressure by hydrostatic pressure and calculates the wave profiles based on an equation of continuity, thereby lowering calculation cost; it mainly covers the stages from the epicenter to the shallow region. STOC-IC solves pressure based on a Poisson equation to account for a shallower, more complex topography, but reduces computation cost slightly by setting the water surface based on an equation of continuity, and is used to calculate the area near a port. CS3D solves a Navier-Stokes equation and sets the water surface by VOF to deal with the runup area, with its complex surfaces of overflows and bores. STR performs the structural analysis, including the geotechnical analysis, based on Biot's formulation. By coupling these, the system efficiently calculates the tsunami profile from propagation to inundation. The numerical results were compared with the physical experiments of Arikawa et al. (2012), showing good agreement with the experimental ones. Finally, the system was applied to the local situation at Kamaishi Bay. Almost all of the breakwaters were washed away, a situation similar to the actual damage at Kamaishi Bay. REFERENCES T. Arikawa and T. Tomita (2016): "Development of High Precision Tsunami Runup

  15. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The Object-Oriented Turbomachinery Analysis Code (OTAC) is a new meanline/streamline turbomachinery modeling tool being developed at NASA GRC. During the development process, a limitation of the code was discovered in relation to the analysis of choked flow in axial turbines. This paper describes the relevant physics for choked flow as well as the changes made to OTAC to enable analysis in this flow regime.

  16. Per Object statistical analysis

    DEFF Research Database (Denmark)

    2008-01-01

    of a specific class in turn, and uses a pair of PPO stages to derive the statistics and then assign them to the objects' Object Variables. It may be that this could all be done in some other, simpler way, but several other ways that were tried did not succeed. The procedure output has been tested against...

  17. Forest Rent as an Object of Economic Analysis

    Directory of Open Access Journals (Sweden)

    Lisichko Andriyana M.

    2018-01-01

    Full Text Available The article is aimed at researching the concept of forest rent as an object of economic analysis. The essence of the concept of «forest rent» has been researched. It has been defined that forest rent is an object of management of the forest complex of Ukraine as a whole and of forest enterprises in particular. Rent for the special use of forest resources is an object of interest on the part of both the State and the corporate sector, because its value depends on the cost of timber for industry and households. Works of scholars on the classification of rents were studied. It has been determined that the rent for the special use of forest resources is a special kind of natural rent. The structure of constituents in the system of rent relations in the forest sector has been defined in accordance with the provisions of the Tax Code of Ukraine.

  18. Precision force sensing with optically-levitated nanospheres

    Science.gov (United States)

    Geraci, Andrew

    2017-04-01

    In high vacuum, optically-trapped dielectric nanospheres achieve excellent decoupling from their environment and experience minimal friction, making them ideal for precision force sensing. We have shown that 300 nm silica spheres can be used for calibrated zeptonewton force measurements in a standing-wave optical trap. In this optical potential, the known spacing of the standing wave anti-nodes can serve as an independent calibration tool for the displacement spectrum of the trapped particle. I will describe our progress towards using these sensors for tests of the Newtonian gravitational inverse square law at micron length scales. Optically levitated dielectric objects also show promise for a variety of other precision sensing applications, including searches for gravitational waves and other experiments in quantum optomechanics. National Science Foundation PHY-1205994, PHY-1506431, PHY-1509176.

  19. Significant improvement of accuracy and precision in the determination of trace rare earths by fluorescence analysis

    International Nuclear Information System (INIS)

    Ozawa, L.; Hersh, H.N.

    1976-01-01

    Most of the rare earths in yttrium, gadolinium and lanthanum oxides emit characteristic fluorescent line spectra under irradiation with photons, electrons and x rays. The sensitivity and selectivity of the rare earth fluorescences are high enough to determine the trace amounts (0.01 to 100 ppM) of rare earths. The absolute fluorescent intensities of solids, however, are markedly affected by the synthesis procedure, level of contamination and crystal perfection, resulting in poor accuracy and low precision for the method (larger than 50 percent error). Special care in preparation of the samples is required to obtain good accuracy and precision. It is found that the accuracy and precision for the determination of trace (less than 10 ppM) rare earths by fluorescence analysis improved significantly, while still maintaining the sensitivity, when the determination is made by comparing the ratio of the fluorescent intensities of the trace rare earths to that of a deliberately added rare earth as reference. The variation in the absolute fluorescent intensity remains, but is compensated for by measuring the fluorescent line intensity ratio. Consequently, the determination of trace rare earths (with less than 3 percent error) is easily made by a photoluminescence technique in which the rare earths are excited directly by photons. Accuracy is still maintained when the absolute fluorescent intensity is reduced by 50 percent through contamination by Ni, Fe, Mn or Pb (about 100 ppM). Determination accuracy is also improved for fluorescence analysis by electron excitation and x-ray excitation. For some rare earths, however, accuracy by these techniques is reduced because indirect excitation mechanisms are involved. The excitation mechanisms and the interferences between rare earths are also reported

  20. A sensitive, reproducible and objective immunofluorescence analysis method of dystrophin in individual fibers in samples from patients with duchenne muscular dystrophy.

    Directory of Open Access Journals (Sweden)

    Chantal Beekman

    Full Text Available Duchenne muscular dystrophy (DMD) is characterized by the absence or reduced levels of dystrophin expression on the inner surface of the sarcolemmal membrane of muscle fibers. Clinical development of therapeutic approaches aiming to increase dystrophin levels requires sensitive and reproducible measurement of differences in dystrophin expression in muscle biopsies of treated patients with DMD. This, however, poses a technical challenge due to intra- and inter-donor variance in the occurrence of revertant fibers and low trace dystrophin expression throughout the biopsies. We have developed an immunofluorescence and semi-automated image analysis method that measures the sarcolemmal dystrophin intensity per individual fiber for the entire fiber population in a muscle biopsy. Cross-sections of muscle co-stained for dystrophin and spectrin have been imaged by confocal microscopy, and image analysis was performed using Definiens software. Dystrophin intensity has been measured in the sarcolemmal mask of spectrin for each individual muscle fiber and multiple membrane intensity parameters (mean, maximum, quantiles) per fiber were calculated. A histogram can depict the distribution of dystrophin intensities for the fiber population in the biopsy. This method was tested by measuring dystrophin in DMD, Becker muscular dystrophy, and healthy muscle samples. Analysis of duplicate or quadruplicate sections of DMD biopsies on the same or multiple days, by different operators, or using different antibodies, was shown to be objective and reproducible (inter-assay precision, CV 2-17% and intra-assay precision, CV 2-10%). Moreover, the method was sufficiently sensitive to detect consistently small differences in dystrophin between two biopsies from a patient with DMD before and after treatment with an investigational compound.

  1. Moving the Weber fraction: the perceptual precision for moment of inertia increases with exploration force

    NARCIS (Netherlands)

    Debats, N.B.; Kingma, I.; Beek, P.J.; Smeets, J.B.J.

    2012-01-01

    How does the magnitude of the exploration force influence the precision of haptic perceptual estimates? To address this question, we examined the perceptual precision for moment of inertia (i.e., an object's "angular mass") under different force conditions, using the Weber fraction to quantify

  2. Objective analysis of toolmarks in forensics

    Energy Technology Data Exchange (ETDEWEB)

    Grieve, Taylor N. [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    Since the 1993 court case of Daubert v. Merrell Dow Pharmaceuticals, Inc. the subjective nature of toolmark comparison has been questioned by attorneys and law enforcement agencies alike. This has led to an increased drive to establish objective comparison techniques with known error rates, much like those that DNA analysis is able to provide. This push has created research in which the 3-D surface profile of two different marks are characterized and the marks’ cross-sections are run through a comparative statistical algorithm to acquire a value that is intended to indicate the likelihood of a match between the marks. The aforementioned algorithm has been developed and extensively tested through comparison of evenly striated marks made by screwdrivers. However, this algorithm has yet to be applied to quasi-striated marks such as those made by the shear edge of slip-joint pliers. The results of this algorithm’s application to the surface of copper wire will be presented. Objective mark comparison also extends to comparison of toolmarks made by firearms. In an effort to create objective comparisons, microstamping of firing pins and breech faces has been introduced. This process involves placing unique alphanumeric identifiers surrounded by a radial code on the surface of firing pins, which transfer to the cartridge’s primer upon firing. Three different guns equipped with microstamped firing pins were used to fire 3000 cartridges. These cartridges are evaluated based on the clarity of their alphanumeric transfers and the clarity of the radial code surrounding the alphanumerics.

  3. Intermediary object for participative design processes based on the ergonomic work analysis

    DEFF Research Database (Denmark)

    Souza da Conceição, Carolina; Duarte, F.; Broberg, Ole

    2012-01-01

    The objective of this paper is to present and discuss the use of an intermediary object, built from the ergonomic work analysis, in a participative design process. The object was a zoning pattern, developed as a visual representation ‘mapping’ of the interrelations among the functional units of t...

  4. Magnitude, precision, and realism of depth perception in stereoscopic vision.

    Science.gov (United States)

    Hibbard, Paul B; Haines, Alice E; Hornsey, Rebecca L

    2017-01-01

    Our perception of depth is substantially enhanced by the fact that we have binocular vision. This provides us with more precise and accurate estimates of depth and an improved qualitative appreciation of the three-dimensional (3D) shapes and positions of objects. We assessed the link between these quantitative and qualitative aspects of 3D vision. Specifically, we wished to determine whether the realism of apparent depth from binocular cues is associated with the magnitude or precision of perceived depth and the degree of binocular fusion. We presented participants with stereograms containing randomly positioned circles and measured how the magnitude, realism, and precision of depth perception varied with the size of the disparities presented. We found that as the size of the disparity increased, the magnitude of perceived depth increased, while the precision with which observers could make depth discrimination judgments decreased. Beyond an initial increase, depth realism decreased with increasing disparity magnitude. This decrease occurred well below the disparity limit required to ensure comfortable viewing.

  5. A Method for a Retrospective Analysis of Course Objectives: Have Pursued Objectives in Fact Been Attained? Twente Educational Report Number 7.

    Science.gov (United States)

    Plomp, Tjeerd; van der Meer, Adri

    A method pertaining to the identification and analysis of course objectives is discussed. A framework is developed by which post facto objectives can be determined and students' attainment of the objectives can be assessed. The method can also be used for examining the quality of instruction. Using this method, it is possible to determine…

  6. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop-type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
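
    A toy sketch in R of the kind of computation a discrete event simulator performs: a single machine with random arrivals and processing times, reporting throughput and time-average WIP. SIMMEK itself is object oriented and far more general; this is only a minimal illustration of the underlying event logic:

      set.seed(42)
      n_jobs  <- 1000
      arrive  <- cumsum(rexp(n_jobs, rate = 1.0))   # inter-arrival ~ Exp(1)
      service <- rexp(n_jobs, rate = 1.25)          # processing times

      # FIFO single-machine queue: a job starts when it has arrived
      # and the previous job has finished
      start <- finish <- numeric(n_jobs)
      for (i in seq_len(n_jobs)) {
        start[i]  <- max(arrive[i], if (i > 1) finish[i - 1] else 0)
        finish[i] <- start[i] + service[i]
      }

      throughput <- n_jobs / max(finish)
      mean_wip   <- sum(finish - arrive) / max(finish)  # Little's law estimate
      c(throughput = throughput, mean_WIP = mean_wip)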

  7. voom: Precision weights unlock linear model analysis tools for RNA-seq read counts.

    Science.gov (United States)

    Law, Charity W; Chen, Yunshun; Shi, Wei; Smyth, Gordon K

    2014-02-03

    New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods.
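
    A minimal sketch of the voom-limma pipeline described here, using the Bioconductor packages limma and edgeR; the count matrix and group factor are simulated placeholders, not data from the paper:

      library(limma)
      library(edgeR)

      # Hypothetical counts: 1000 genes x 6 samples, two groups of three
      counts <- matrix(rpois(1000 * 6, lambda = 50), nrow = 1000)
      group  <- factor(c("A", "A", "A", "B", "B", "B"))
      design <- model.matrix(~ group)

      dge <- DGEList(counts = counts)
      dge <- calcNormFactors(dge)

      v   <- voom(dge, design)          # precision weights from the mean-variance trend
      fit <- eBayes(lmFit(v, design))   # limma empirical Bayes pipeline
      topTable(fit, coef = 2)           # top differentially expressed genes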

  8. Analysis of micro computed tomography images; a look inside historic enamelled metal objects

    Science.gov (United States)

    van der Linden, Veerle; van de Casteele, Elke; Thomas, Mienke Simon; de Vos, Annemie; Janssen, Elsje; Janssens, Koen

    2010-02-01

    In this study the usefulness of micro-computed tomography (µ-CT) for the in-depth analysis of enamelled metal objects was tested. Usually investigations of enamelled metal artefacts are restricted to non-destructive surface analysis or analysis of cross sections after destructive sampling. Radiography, a commonly used technique in the field of cultural heritage studies, is limited to providing two-dimensional information about a three-dimensional object (Lang and Middleton, Radiography of Cultural Material, pp. 60-61, Elsevier-Butterworth-Heinemann, Amsterdam-Stoneham-London, 2005). Obtaining virtual slices and information about the internal structure of these objects was made possible by CT analysis. With this technique the underlying metal work was studied without removing the decorative enamel layer. Moreover, visible defects such as cracks were measured in both width and depth, and as-yet invisible defects and weaker areas were visualised. All these features are of great interest to restorers and conservators as they allow a view inside these objects without so much as touching them.

  9. [Cotton identification and extraction using near infrared sensor and object-oriented spectral segmentation technique].

    Science.gov (United States)

    Deng, Jin-Song; Shi, Yuan-Yuan; Chen, Li-Su; Wang, Ke; Zhu, Jin-Xia

    2009-07-01

    Real-time, effective, and reliable crop identification is the foundation of scientific crop management in precision agriculture, and is one of the key techniques for precision agriculture. However, this expectation cannot be fulfilled by traditional pixel-based information extraction methods, owing to their complicated image processing and limited accuracy of object identification. In the present study, a visible-near infrared image of cotton was acquired using a high-resolution sensor. Object-oriented segmentation was performed on the image to produce image objects and spatial/spectral features of cotton. Afterwards, a nearest-neighbor classifier integrated the spectral, shape, and topologic information of the image objects to precisely identify cotton according to the various features. Finally, 300 random samples and an error matrix were applied to undertake the accuracy assessment of the identification. Although errors and confusion exist, this method shows satisfactory results, with an overall accuracy of 96.33% and a kappa coefficient of 0.9267, which can meet the demand of automatic management and decision-making in precision agriculture.

  10. Automatic Discovery and Geotagging of Objects from Street View Imagery

    Directory of Open Access Journals (Sweden)

    Vladimir A. Krylov

    2018-04-01

    Full Text Available Many applications, such as autonomous navigation, urban planning, and asset monitoring, rely on the availability of accurate information about objects and their geolocations. In this paper, we propose the automatic detection and computation of the coordinates of recurring stationary objects of interest using street view imagery. Our processing pipeline relies on two fully convolutional neural networks: the first segments objects in the images, while the second estimates their distance from the camera. To geolocate all the detected objects coherently we propose a novel custom Markov random field model to estimate the objects’ geolocation. The novelty of the resulting pipeline is the combined use of monocular depth estimation and triangulation to enable automatic mapping of complex scenes with the simultaneous presence of multiple, visually similar objects of interest. We validate experimentally the effectiveness of our approach on two object classes: traffic lights and telegraph poles. The experiments report high object recall rates and position precision of approximately 2 m, which is approaching the precision of single-frequency GPS receivers.

  11. Negative emotion enhances mnemonic precision and subjective feelings of remembering in visual long-term memory.

    Science.gov (United States)

    Xie, Weizhen; Zhang, Weiwei

    2017-09-01

    Negative emotion sometimes enhances memory (higher accuracy and/or vividness, e.g., flashbulb memories). The present study investigates whether it is the qualitative (precision) or quantitative (the probability of successful retrieval) aspect of memory that drives these effects. In a visual long-term memory task, observers memorized colors (Experiment 1a) or orientations (Experiment 1b) of sequentially presented everyday objects under negative, neutral, or positive emotions induced with International Affective Picture System images. In a subsequent test phase, observers reconstructed objects' colors or orientations using the method of adjustment. We found that mnemonic precision was enhanced under the negative condition relative to the neutral and positive conditions. In contrast, the probability of successful retrieval was comparable across the emotion conditions. Furthermore, the boost in memory precision was associated with elevated subjective feelings of remembering (vividness and confidence) and metacognitive sensitivity in Experiment 2. Altogether, these findings suggest a novel precision-based account for emotional memories. Copyright © 2017 Elsevier B.V. All rights reserved.
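
    The paper separates mnemonic precision from the probability of successful retrieval; a standard way to do this for method-of-adjustment data is a von Mises-plus-uniform mixture model (e.g., Zhang & Luck, 2008). The sketch below, in R with simulated response errors, illustrates that general approach and is not the authors' analysis code:

      # Von Mises density centered at zero; kappa is the precision parameter
      dvm <- function(x, kappa) exp(kappa * cos(x)) / (2 * pi * besselI(kappa, 0))

      # Negative log-likelihood of the mixture: successful retrieval with
      # probability p (von Mises errors) vs. uniform random guessing
      nll <- function(par, err) {
        p <- plogis(par[1])              # probability of successful retrieval
        k <- exp(par[2])                 # precision (concentration)
        -sum(log(p * dvm(err, k) + (1 - p) / (2 * pi)))
      }

      # Simulated errors (radians): 80% remembered (kappa ~ 8), 20% guesses
      set.seed(7)
      mem <- (rnorm(200, 0, 1 / sqrt(8)) + pi) %% (2 * pi) - pi
      err <- ifelse(runif(200) < 0.8, mem, runif(200, -pi, pi))

      fit <- optim(c(1, 1), nll, err = err)
      c(p_mem = plogis(fit$par[1]), kappa = exp(fit$par[2]))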

  12. Mitigation strategies for pandemic influenza A: balancing conflicting policy objectives.

    Directory of Open Access Journals (Sweden)

    T Déirdre Hollingsworth

    Full Text Available Mitigation of a severe influenza pandemic can be achieved using a range of interventions to reduce transmission. Interventions can reduce the impact of an outbreak and buy time until vaccines are developed, but they may have high social and economic costs. The non-linear effect on the epidemic dynamics means that suitable strategies crucially depend on the precise aim of the intervention. National pandemic influenza plans rarely contain clear statements of policy objectives or prioritization of potentially conflicting aims, such as minimizing mortality (depending on the severity of a pandemic) or peak prevalence, or limiting the socio-economic burden of contact-reducing interventions. We use epidemiological models of influenza A to investigate how contact-reducing interventions and availability of antiviral drugs or pre-pandemic vaccines contribute to achieving particular policy objectives. Our analyses show that the ideal strategy depends on the aim of an intervention and that the achievement of one policy objective may preclude success with others, e.g., constraining peak demand for public health resources may lengthen the duration of the epidemic and hence its economic and social impact. Constraining total case numbers can be achieved by a range of strategies, whereas strategies which additionally constrain peak demand for services require a more sophisticated intervention. If, for example, there are multiple objectives which must be achieved prior to the availability of a pandemic vaccine (i.e., a time-limited intervention), our analysis shows that interventions should be implemented several weeks into the epidemic, not at the very start. This observation is shown to be robust across a range of constraints and for uncertainty in estimates of both R(0) and the timing of vaccine availability. These analyses highlight the need for more precise statements of policy objectives and their assumed consequences when planning and implementing strategies

  13. Expertise for upright faces improves the precision but not the capacity of visual working memory.

    Science.gov (United States)

    Lorenc, Elizabeth S; Pratte, Michael S; Angeloni, Christopher F; Tong, Frank

    2014-10-01

    Considerable research has focused on how basic visual features are maintained in working memory, but little is currently known about the precision or capacity of visual working memory for complex objects. How precisely can an object be remembered, and to what extent might familiarity or perceptual expertise contribute to working memory performance? To address these questions, we developed a set of computer-generated face stimuli that varied continuously along the dimensions of age and gender, and we probed participants' memories using a method-of-adjustment reporting procedure. This paradigm allowed us to separately estimate the precision and capacity of working memory for individual faces, on the basis of the assumptions of a discrete capacity model, and to assess the impact of face inversion on memory performance. We found that observers could maintain up to four to five items on average, with equally good memory capacity for upright and upside-down faces. In contrast, memory precision was significantly impaired by face inversion at every set size tested. Our results demonstrate that the precision of visual working memory for a complex stimulus is not strictly fixed but, instead, can be modified by learning and experience. We find that perceptual expertise for upright faces leads to significant improvements in visual precision, without modifying the capacity of working memory.

  14. Geographic Object-Based Image Analysis – Towards a new paradigm

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J.; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the ‘per-pixel paradigm’ and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature, we conclude that GEOBIA is a new and evolving paradigm. PMID:24623958

  15. Is There Space for the Objective Force?

    National Research Council Canada - National Science Library

    Coffin, Timothy

    2003-01-01

    The Army has launched itself on a daring trajectory toward the Objective Force. It will transform the Army forces into a more lethal and devastating force through the combination of precision weapons and knowledge-based warfare...

  16. Stochastic precision analysis of 2D cardiac strain estimation in vivo

    International Nuclear Information System (INIS)

    Bunting, E A; Provost, J; Konofagou, E E

    2014-01-01

    Ultrasonic strain imaging has been applied to echocardiography and carries great potential to be used as a tool in the clinical setting. Two-dimensional (2D) strain estimation may be useful when studying the heart due to the complex, 3D deformation of the cardiac tissue. Increasing the framerate used for motion estimation, i.e. motion estimation rate (MER), has been shown to improve the precision of the strain estimation, although maintaining the spatial resolution necessary to view the entire heart structure in a single heartbeat remains challenging at high MERs. Two previously developed methods, the temporally unequispaced acquisition sequence (TUAS) and the diverging beam sequence (DBS), have been used in the past to successfully estimate in vivo axial strain at high MERs without compromising spatial resolution. In this study, a stochastic assessment of 2D strain estimation precision is performed in vivo for both sequences at varying MERs (65, 272, 544, 815 Hz for TUAS; 250, 500, 1000, 2000 Hz for DBS). 2D incremental strains were estimated during left ventricular contraction in five healthy volunteers using a normalized cross-correlation function and a least-squares strain estimator. Both sequences were shown capable of estimating 2D incremental strains in vivo. The conditional expected value of the elastographic signal-to-noise ratio (E(SNRe|ε)) was used to compare strain estimation precision of both sequences at multiple MERs over a wide range of clinical strain values. The results here indicate that axial strain estimation precision is much more dependent on MER than lateral strain estimation, while lateral estimation is more affected by strain magnitude. MER should be increased at least above 544 Hz to avoid suboptimal axial strain estimation. Radial and circumferential strain estimations were influenced by the axial and lateral strain in different ways. Furthermore, the TUAS and DBS were found to be of comparable precision at similar MERs. (paper)

  17. Design and algorithm research of high precision airborne infrared touch screen

    Science.gov (United States)

    Zhang, Xiao-Bing; Wang, Shuang-Jie; Fu, Yan; Chen, Zhao-Quan

    2016-10-01

    Infrared touch screens suffer from low precision, touch jitter, and a sharp decrease in touch precision when emitting and receiving tubes fail. A high-precision positioning algorithm based on an extended axis is proposed to solve these problems. First, the unimpeded state of the beam between an emitting and a receiving tube is recorded as 0, while the impeded state is recorded as 1. Then, the method of oblique scanning is used, in which the light of one emitting tube is received by five receiving tubes. The impeded-state information of all emitting and receiving tubes is collected as a matrix. Finally, the position of the touch object is calculated as an arithmetic average. The extended-axis positioning algorithm maintains high precision even when individual infrared tubes fail, with only a slight effect on precision. The experimental results show that over 90% of the display area the touch error is less than 0.25D, where D is the distance between adjacent emitting tubes. It is concluded that the algorithm based on the extended axis has the advantages of high precision, little impact when an individual infrared tube fails, and ease of use.

  18. A decision analysis approach for risk management of near-earth objects

    Science.gov (United States)

    Lee, Robert C.; Jones, Thomas D.; Chapman, Clark R.

    2014-10-01

    Risk management of near-Earth objects (NEOs; e.g., asteroids and comets) that can potentially impact Earth is an important issue that took on added urgency with the Chelyabinsk event of February 2013. Thousands of NEOs large enough to cause substantial damage are known to exist, although only a small fraction of these have the potential to impact Earth in the next few centuries. The probability and location of a NEO impact are subject to complex physics and great uncertainty, and consequences can range from minimal to devastating, depending upon the size of the NEO and location of impact. Deflecting a potential NEO impactor would be complex and expensive, and inter-agency and international cooperation would be necessary. Such deflection campaigns may be risky in themselves, and mission failure may result in unintended consequences. The benefits, risks, and costs of different potential NEO risk management strategies have not been compared in a systematic fashion. We present a decision analysis framework addressing this hazard. Decision analysis is the science of informing difficult decisions. It is inherently multi-disciplinary, especially with regard to managing catastrophic risks. Note that risk analysis clarifies the nature and magnitude of risks, whereas decision analysis guides rational risk management. Decision analysis can be used to inform strategic, policy, or resource allocation decisions. First, a problem is defined, including the decision situation and context. Second, objectives are defined, based upon what the different decision-makers and stakeholders (i.e., participants in the decision) value as important. Third, quantitative measures or scales for the objectives are determined. Fourth, alternative choices or strategies are defined. Fifth, the problem is then quantitatively modeled, including probabilistic risk analysis, and the alternatives are ranked in terms of how well they satisfy the objectives. Sixth, sensitivity analyses are performed in

  19. The Analysis of Object-Based Change Detection in Mining Area: a Case Study with Pingshuo Coal Mine

    Science.gov (United States)

    Zhang, M.; Zhou, W.; Li, Y.

    2017-09-01

    Accurate information on mining land use and land cover change is crucial for monitoring and environmental change studies. In this paper, a RapidEye remote sensing image (2012) and a SPOT7 remote sensing image (2015) of the Pingshuo mining area are selected to monitor changes, combining object-based classification with the change vector analysis method. We also used R for mining land classification in high-resolution remote sensing imagery, and found open source software to be feasible and flexible for this purpose. The results show that (1) the classification of reclaimed mining land has higher precision; the overall accuracy and kappa coefficient of the classification of the change region map were 86.67% and 89.44%, respectively. Object-based classification and change vector analysis, which are of great significance for improving monitoring accuracy, can evidently be used to monitor mining land, especially reclaimed mining land; (2) the vegetation area decreased from 46% to 40% of the total area between 2012 and 2015, and most of it was transformed into arable land. The sum of arable land and vegetation area increased from 51% to 70%; meanwhile, built-up land increased to a certain degree, and part of the water area was transformed into arable land, though the extent of these two changes is not obvious. The results illustrate the transformation of the reclaimed mining area; at the same time, some land is still being converted to mining land, which shows that the mine is still operating and that mining land use and land cover are a dynamic process.

  20. [Application of target restoration space quantity and quantitative relation in precise esthetic prosthodontics].

    Science.gov (United States)

    Haiyang, Yu; Tian, Luo

    2016-06-01

    Target restoration space (TRS) is the most precise space required for designing an optimal prosthesis. TRS consists of the internal or external tooth space that determines the esthetics and function of the final restoration. Therefore, assisted by quantitative analysis and transfer, TRS quantitative analysis is a significant improvement for minimal tooth preparation. This article presents TRS quantity-related measurement, analysis, and transfer, and the internal relevance of the three TRS classifications. The results reveal the close bond between precision and minimally invasive treatment. This study can be used to improve the comprehension and execution of precise esthetic prosthodontics.

  1. Using Epistemic Network Analysis to understand core topics as planned learning objectives

    DEFF Research Database (Denmark)

    Allsopp, Benjamin Brink; Dreyøe, Jonas; Misfeldt, Morten

    Epistemic Network Analysis is a tool developed by the epistemic games group at the University of Wisconsin-Madison for tracking the relations between concepts in students' discourse (Shaffer 2017). In our current work we are applying this tool to learning objectives in teachers' digital preparation. The Danish mathematics curriculum is organised into six competencies and three topics. In the recently implemented learning platforms, teachers choose which of the mathematical competencies serve as objectives for a specific lesson or teaching sequence. Hence, learning objectives for lessons and teaching sequences define a network of competencies, where two competencies are closely related if they often are part of the same learning objective or teaching sequence. We are currently using Epistemic Network Analysis to study these networks. In the poster we will include examples of different networks

  2. Tendency for interlaboratory precision in the GMO analysis method based on real-time PCR.

    Science.gov (United States)

    Kodama, Takashi; Kurosawa, Yasunori; Kitta, Kazumi; Naito, Shigehiro

    2010-01-01

    The Horwitz curve estimates interlaboratory precision as a function only of concentration, and is frequently used as a method performance criterion in food analysis with chemical methods. Quantitative biochemical methods based on real-time PCR require an analogous criterion to progressively promote method validation. We analyzed the tendency of precision using a simplex real-time PCR technique in 53 collaborative studies of seven genetically modified (GM) crops. The reproducibility standard deviation (SR) and repeatability standard deviation (Sr) of the genetically modified organism (GMO) amount (%) were more or less independent of the GM crop (i.e., maize, soybean, cotton, oilseed rape, potato, sugar beet, and rice) and of the evaluation procedure steps. Some studies evaluated the whole procedure, consisting of DNA extraction and PCR quantitation, whereas others focused only on the PCR quantitation step by using DNA extraction solutions. Therefore, SR and Sr for the GMO amount (%) are functions only of concentration, similar to the Horwitz curve. We proposed SR = 0.1971 C^0.8685 and Sr = 0.1478 C^0.8424, where C is the GMO amount (%). We also proposed a method performance index for GMO quantitative methods that is analogous to the Horwitz ratio.
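
    A minimal sketch in R of the proposed Horwitz-type curves, evaluating the predicted reproducibility (SR) and repeatability (Sr) standard deviations at a few illustrative GMO amounts:

      SR <- function(C) 0.1971 * C^0.8685   # interlaboratory (reproducibility) SD
      Sr <- function(C) 0.1478 * C^0.8424   # within-laboratory (repeatability) SD

      C <- c(0.1, 0.5, 1, 3, 5)             # GMO amount (%)
      data.frame(C, SR = SR(C), Sr = Sr(C))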

  3. Precision evaluation of dual X-ray absorptiometry (iDXA) measurements

    International Nuclear Information System (INIS)

    Yu Wei; Lin Qiang; Yu Xiaobo; Yao Jinpeng

    2009-01-01

    Objective: To evaluate the precision of iDXA measurements of lumbar spine, proximal femur and whole-body bone density, as well as body composition (lean and fat mass). Methods: The study randomly recruited 30 volunteers. Each subject was scanned by iDXA twice on the same day. Measurement sites included the lumbar spine, proximal femur and whole body. Precision errors were expressed as the root mean square of the CV (RMS-CV). Results: Mean precision errors of bone density measurements at the lumbar spine, femoral neck, Ward's triangle, greater trochanter and total femur ranged from 0.8% to 2.0%, with the lowest (0.8%) at the lumbar spine and total femur and the highest (2.0%) at Ward's triangle. Mean precision errors of bone density measurements for the whole body and its individual sites ranged from 0.7% to 2.0%, with the lowest (0.7%) for the whole-body measurement. Mean precision errors of lean-mass measurements for the whole body and its individual sites ranged from 0.6% to 2.1%, with the lowest (0.6%) for the whole-body lean measurement. Mean precision errors of fat measurements for the whole body and its individual sites ranged from 1.0% to 3.2%, with the lowest (1.0%) for the whole-body fat measurement. Conclusion: The measurement precision of iDXA at the lumbar spine, proximal femur and whole body meets clinical needs; precision values for whole-body and regional composition measurements may be helpful for future clinical use. (authors)
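
    RMS-CV from duplicate scans is a standard precision statistic: the CV of each subject's repeated measurements is computed, and the subject CVs are combined as a root mean square. A minimal sketch, assuming paired same-day scans; the data are hypothetical.

```python
import numpy as np

def rms_cv(scan1, scan2):
    """Root-mean-square CV (%) from duplicate measurements.

    scan1, scan2 : arrays of the two same-day measurements per subject.
    For duplicates, the within-pair SD is |x1 - x2| / sqrt(2).
    """
    scan1, scan2 = np.asarray(scan1, float), np.asarray(scan2, float)
    mean = (scan1 + scan2) / 2.0
    sd = np.abs(scan1 - scan2) / np.sqrt(2.0)
    cv = 100.0 * sd / mean               # per-subject CV in percent
    return np.sqrt(np.mean(cv ** 2))     # RMS across subjects

# Hypothetical lumbar-spine BMD duplicates (g/cm^2) for five subjects.
s1 = [1.02, 0.95, 1.10, 0.88, 1.01]
s2 = [1.03, 0.96, 1.08, 0.89, 1.00]
print(f"RMS-CV = {rms_cv(s1, s2):.2f} %")
```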

  4. High-resolution tree canopy mapping for New York City using LIDAR and object-based image analysis

    Science.gov (United States)

    MacFaden, Sean W.; O'Neil-Dunne, Jarlath P. M.; Royar, Anna R.; Lu, Jacqueline W. T.; Rundle, Andrew G.

    2012-01-01

    Urban tree canopy is widely believed to have myriad environmental, social, and human-health benefits, but a lack of precise canopy estimates has hindered quantification of these benefits in many municipalities. This problem was addressed for New York City using object-based image analysis (OBIA) to develop a comprehensive land-cover map, including tree canopy to the scale of individual trees. Mapping was performed using a rule-based expert system that relied primarily on high-resolution LIDAR, specifically its capacity for evaluating the height and texture of aboveground features. Multispectral imagery was also used, but shadowing and varying temporal conditions limited its utility. Contextual analysis was a key part of classification, distinguishing trees according to their physical and spectral properties as well as their relationships to adjacent, nonvegetated features. The automated product was extensively reviewed and edited via manual interpretation, and overall per-pixel accuracy of the final map was 96%. Although manual editing had only a marginal effect on accuracy despite requiring a majority of project effort, it maximized aesthetic quality and ensured the capture of small, isolated trees. Converting high-resolution LIDAR and imagery into usable information is a nontrivial exercise, requiring significant processing time and labor, but an expert system-based combination of OBIA and manual review was an effective method for fine-scale canopy mapping in a complex urban environment.

  5. Analysis of process parameters in surface grinding using single objective Taguchi and multi-objective grey relational grade

    Directory of Open Access Journals (Sweden)

    Prashant J. Patil

    2016-09-01

    Close tolerances and good surface finish are achieved by means of the grinding process. This study was carried out for multi-objective optimization of MQL grinding process parameters. Water-based Al2O3 and CuO nanofluids of various concentrations were used as lubricants for the MQL system. Grinding experiments were carried out on an instrumented surface grinding machine, using Taguchi's method for the experimental design. Important process parameters that affect the G ratio and surface finish in MQL grinding are depth of cut, type of lubricant, feed rate, grinding wheel speed, coolant flow rate, and nanoparticle size. Grinding performance was evaluated by measuring the G ratio and surface finish. To improve the grinding process, a multi-objective process parameter optimization was performed using Taguchi-based grey relational analysis, and analysis of variance (ANOVA) was used to identify the most significant factors.
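
    Grey relational analysis converts each response into a normalized grey relational coefficient and averages the coefficients into a single grade, turning a multi-objective problem into a single-objective ranking. The sketch below follows the usual textbook formulation (distinguishing coefficient zeta = 0.5), not necessarily the exact variant used in the study; the response values are hypothetical.

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, zeta=0.5):
    """Grey relational grade for a runs-by-responses matrix."""
    x = np.asarray(responses, float)
    norm = np.empty_like(x)
    for j, lib in enumerate(larger_is_better):
        lo, hi = x[:, j].min(), x[:, j].max()
        norm[:, j] = (x[:, j] - lo) / (hi - lo) if lib else (hi - x[:, j]) / (hi - lo)
    delta = 1.0 - norm                       # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                # equal-weight grade per run

# Hypothetical runs: columns are G ratio (maximize) and Ra roughness (minimize).
runs = [[40.0, 0.45], [55.0, 0.52], [48.0, 0.38], [60.0, 0.60]]
grades = grey_relational_grade(runs, larger_is_better=[True, False])
print("best run:", int(np.argmax(grades)) + 1, grades.round(3))
```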

  6. High precision isotopic ratio analysis of volatile metal chelates

    International Nuclear Information System (INIS)

    Hachey, D.L.; Blais, J.C.; Klein, P.D.

    1980-01-01

    High precision isotope ratio measurements have been made for a series of volatile alkaline earth and transition metal chelates using conventional GC/MS instrumentation. Electron ionization was used for the alkaline earth chelates, whereas isobutane chemical ionization was used for the transition metal studies. Natural isotopic abundances were determined for a series of Mg, Ca, Cr, Fe, Ni, Cu, Cd, and Zn chelates. Absolute accuracy ranged between 0.01 and 1.19 at.%. Absolute precision ranged between ±0.01 and ±0.27 at.% (RSD ±0.07-10.26%) for elements with as many as eight natural isotopes. Calibration curves were prepared using natural-abundance metals and their enriched 50Cr, 60Ni, and 65Cu isotopes, covering the range 0.1-1010.7 at.% excess. A separate multiple-isotope calibration curve was similarly prepared using enriched 60Ni (0.02-2.15 at.% excess) and 62Ni (0.23-18.5 at.% excess). The samples were analyzed by GC/CI/MS. Human plasma containing enriched 26Mg and 44Ca was analyzed by EI/MS. 1 figure, 5 tables

  7. DDASAC, Double-Precision Differential or Algebraic Sensitivity Analysis

    International Nuclear Information System (INIS)

    Caracotsios, M.; Stewart, W.E.; Petzold, L.

    1997-01-01

    1 - Description of program or function: DDASAC solves nonlinear initial-value problems involving stiff implicit systems of ordinary differential and algebraic equations. Purely algebraic nonlinear systems can also be solved, given an initial guess within the region of attraction of a solution. Options include automatic reconciliation of inconsistent initial states and derivatives, automatic initial step selection, direct concurrent parametric sensitivity analysis, and stopping at a prescribed value of any user-defined functional of the current solution vector. Local error control (in the max-norm or the 2-norm) is provided for the state vector and can include the sensitivities on request. 2 - Method of solution: Reconciliation of initial conditions is done with a damped Newton algorithm adapted from Bain and Stewart (1991). Initial step selection is done by the first-order algorithm of Shampine (1987), extended here to differential-algebraic equation systems. The solution is continued with the DASSL predictor-corrector algorithm (Petzold 1983, Brenan et al. 1989), with the initial acceleration phase detected and with row scaling of the Jacobian added. The backward-difference formulas for the predictor and corrector are expressed in divided-difference form, and the fixed-leading-coefficient form of the corrector (Jackson and Sacks-Davis 1980, Brenan et al. 1989) is used. Weights for error tests are updated in each step with the user's tolerances at the predicted state. Sensitivity analysis is performed directly on the corrector equations as given by Caracotsios and Stewart (1985) and is extended here to the initialization when needed. 3 - Restrictions on the complexity of the problem: This algorithm, like DASSL, performs well on differential-algebraic systems of index 0 and 1 but not on higher-index systems; see Brenan et al. (1989). The user assigns the work array lengths and the output unit. The machine number range and precision are determined at run time.
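
    DDASAC itself is a Fortran code; as a rough modern analogue (not the DDASAC implementation), a stiff system of the kind it targets can be integrated with a BDF predictor-corrector scheme via SciPy, e.g. on the classic Robertson kinetics test problem:

```python
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Robertson chemical kinetics, a standard stiff test problem."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1e4 * y2 * y3,
            0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2 ** 2,
            3e7 * y2 ** 2]

# BDF is the same family of backward-difference formulas DASSL uses.
sol = solve_ivp(robertson, (0.0, 1e5), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-6, atol=[1e-8, 1e-10, 1e-8])
print(sol.y[:, -1])   # concentrations at t = 1e5
```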

  8. COSMOS: Carnegie Observatories System for MultiObject Spectroscopy

    Science.gov (United States)

    Oemler, A.; Clardy, K.; Kelson, D.; Walth, G.; Villanueva, E.

    2017-05-01

    COSMOS (Carnegie Observatories System for MultiObject Spectroscopy) reduces multislit spectra obtained with the IMACS and LDSS3 spectrographs on the Magellan Telescopes. It can be used for the quick-look analysis of data at the telescope as well as for pipeline reduction of large data sets. COSMOS is based on a precise optical model of the spectrographs, which allows (after alignment and calibration) an accurate prediction of the location of spectra features. This eliminates the line search procedure which is fundamental to many spectral reduction programs, and allows a robust data pipeline to be run in an almost fully automatic mode, allowing large amounts of data to be reduced with minimal intervention.

  9. Ultra-precision bearings

    CERN Document Server

    Wardle, F

    2015-01-01

    Ultra-precision bearings can achieve extreme accuracy of rotation, making them ideal for use in numerous applications across a variety of fields, including hard disk drives, roundness measuring machines and optical scanners. Ultra-precision Bearings provides a detailed review of the different types of bearing and their properties, as well as an analysis of the factors that influence motion error, stiffness and damping. Following an introduction to the basic principles of motion error, each chapter of the book is devoted to the basic principles and properties of a specific type of bearing.

  10. Proton gyromagnetic precision measurement system

    International Nuclear Information System (INIS)

    Zhu Deming

    1991-01-01

    A computerized control and measurement system used in the proton gyromagnetic precision measurement is described. It adopts CAMAC data acquisition equipment, with on-line control and analysis on HP85 and PDP-11/60 computer systems. It runs under the RSX11M operating system, and the control software is written in FORTRAN.

  11. Precision machining commercialization

    International Nuclear Information System (INIS)

    1978-01-01

    To accelerate precision machining development so as to realize more of the potential savings within the next few years of known Department of Defense (DOD) part procurement, the Air Force Materials Laboratory (AFML) is sponsoring the Precision Machining Commercialization Project (PMC). PMC is part of the Tri-Service Precision Machine Tool Program of the DOD Manufacturing Technology Five-Year Plan. The technical resources supporting PMC are provided under sponsorship of the Department of Energy (DOE). The goal of PMC is to minimize precision machining development time and cost risk for interested vendors. PMC will do this by making available the high precision machining technology as developed in two DOE contractor facilities, the Lawrence Livermore Laboratory of the University of California and the Union Carbide Corporation, Nuclear Division, Y-12 Plant, at Oak Ridge, Tennessee

  12. Microhartree precision in density functional theory calculations

    Science.gov (United States)

    Gulans, Andris; Kozhevnikov, Anton; Draxl, Claudia

    2018-04-01

    To address ultimate precision in density functional theory calculations we employ the full-potential linearized augmented plane-wave + local-orbital (LAPW + lo) method and justify its usage as a benchmark method. LAPW + lo and two completely unrelated numerical approaches, the multiresolution analysis (MRA) and the linear combination of atomic orbitals, yield total energies of atoms with mean deviations of 0.9 and 0.2 μHa, respectively. Spectacular agreement with the MRA is reached also for total and atomization energies of the G2-1 set consisting of 55 molecules. With the example of α-iron we demonstrate the capability of LAPW + lo to reach μHa/atom precision also for periodic systems, which allows for the distinction between the numerical precision and the accuracy of a given functional.

  13. Application of INAA to the examination of art objects. Research in Poland

    International Nuclear Information System (INIS)

    Panczyk, E.; Walis, L.; Ligeza, M.

    2000-01-01

    Systematic studies on art objects using instrumental neutron activation analysis and neutron autoradiography have been carried out at the Institute of Nuclear Chemistry and Technology in collaboration with the Faculty of Art Conservation and Restoration of the Academy of Fine Arts in Cracow, as well as with other Academies of Fine Arts and museums in Poland. A substantial body of data on the concentration of trace elements, particularly in chalk grounds and pigments (such as lead white, lead-tin yellow and smalt), Chinese porcelain, Thai ceramics, and the clay fillings of sarcophagi of Egyptian mummies, has been accumulated. Such examination of art objects prior to their conservation helps to determine precisely the materials used in creating them, as well as to identify the approximate place of origin of particular materials. (author)

  14. Wireless Sensor Networks for Heritage Object Deformation Detection and Tracking Algorithm

    Directory of Open Access Journals (Sweden)

    Zhijun Xie

    2014-10-01

    Deformation is the direct cause of heritage object collapse, so it is important to monitor heritage objects and give early warning of their deformation. Traditional monitoring methods, however, only roughly monitor a simple-shaped heritage object as a whole and cannot handle complicated objects, which may have a large number of surfaces inside and outside. Wireless sensor networks, comprising many small, low-cost, low-power intelligent sensor nodes, are better suited to detecting the deformation of every small part of a heritage object. Such networks need an effective mechanism to reduce both communication cost and energy consumption in order to monitor heritage objects in real time. In this paper, we provide an effective heritage object deformation detection and tracking method using wireless sensor networks (EffeHDDT). In EffeHDDT, we discover a connected core set of sensor nodes to reduce the communication cost of transmitting and collecting the network's data, and we propose a boundary detecting and tracking mechanism for the monitored objects. Both theoretical analysis and experimental results demonstrate that EffeHDDT outperforms existing methods in terms of network traffic and the precision of deformation detection.

  15. Sub-cell turning to accomplish micron-level alignment of precision assemblies

    Science.gov (United States)

    Kumler, James J.; Buss, Christian

    2017-08-01

    Higher performance expectations for complex optical systems demand tighter requirements for lens assembly alignment. To meet diffraction-limited imaging performance over wide spectral bands across the UV and visible wavebands, new manufacturing approaches and tools must be developed if optical systems are to be produced consistently in volume. This is especially applicable in the field of precision microscope objectives for life science, semiconductor inspection and laser material processing systems, where we observe a rising need for improvement in the optical imaging performance of objective lenses. The key challenge lies in the micron-level decentration and tilt of each lens element. One solution for the production of high quality lens systems is sub-cell assembly with alignment turning. This process relies on an automatic alignment chuck to align the optical axis of a mounted lens to the spindle axis of the machine; subsequently, the mount is cut with diamond tools on a lathe with respect to the optical axis of the mounted lens. Software-controlled integrated measurement technology ensures the highest precision. Beyond what traditional production processes offer, further dimensions, such as the air gaps between the lenses, can be controlled very precisely. Using alignment turning simplifies subsequent alignment steps and reduces the risk of errors. This paper describes new challenges in microscope objective design and manufacturing, addresses the difficulties of standard production processes, and describes a new measurement and alignment technique, outlining its strengths and limitations.

  16. X-ray fluorescence analysis of archaeological finds and art objects: Recognizing gold and gilding

    International Nuclear Information System (INIS)

    Trojek, Tomáš; Hložek, Martin

    2012-01-01

    Many cultural heritage objects were gilded in the past, and nowadays they can be found in archaeological excavations or in historical buildings dating back to the Middle Ages or the modern period. Old gilded artifacts have been studied using X-ray fluorescence analysis and 2D microanalysis. Several techniques that enable the user to distinguish gold from gilded objects are described and then applied to investigate artifacts. These techniques differ in instrumentation, data analysis and number of measurements. The application of Monte Carlo calculation to the quantitative analysis of gilded objects is also introduced. Highlights: three techniques of gilding identification with XRF analysis are proposed; these techniques are applied to gold and gilded art and archaeological objects; the composition of the substrate material is determined by a Monte Carlo simulation.

  17. Cost and quality effectiveness of objective-based and statistically-based quality control for volatile organic compounds analyses of gases

    International Nuclear Information System (INIS)

    Bennett, J.T.; Crowder, C.A.; Connolly, M.J.

    1994-01-01

    Gas samples from drums of radioactive waste at the Department of Energy (DOE) Idaho National Engineering Laboratory are being characterized for 29 volatile organic compounds to determine the feasibility of storing the waste in DOE's Waste Isolation Pilot Plant (WIPP) in Carlsbad, New Mexico. Quality requirements for the gas chromatography (GC) and GC/mass spectrometry chemical methods used to analyze the waste are specified in the Quality Assurance Program Plan for the WIPP Experimental Waste Characterization Program. Quality requirements consist of both objective criteria (data quality objectives, DQOs) and statistical criteria (process control). The DQOs apply to routine sample analyses, while the statistical criteria serve to determine and monitor the precision and accuracy (P&A) of the analysis methods and are also used to assign upper confidence limits to measurement results close to action levels. After over two years and more than 1000 sample analyses, there are two general conclusions concerning the two approaches to quality control: (1) objective criteria (e.g., ±25% precision, ±30% accuracy) based on customer needs and on the criteria usually prescribed for similar EPA-approved methods are consistently attained during routine analyses; (2) statistical criteria based on short-term method performance are almost an order of magnitude more stringent than the objective criteria and are difficult to satisfy following the same routine laboratory procedures that satisfy the objective criteria. A more cost-effective and representative approach to establishing statistical method performance criteria would be to use a moving average of P&A from control samples over a period of several months, to determine within-sample variation by one-way analysis of variance of several months of replicate sample analysis results, or both. Confidence intervals for results near action levels could also be determined by replicate analysis of the sample...
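
    A moving-average control statistic of the kind proposed here is straightforward to compute. A minimal sketch with hypothetical monthly control-sample recoveries, using pandas' rolling window:

```python
import pandas as pd

# Hypothetical monthly control-sample recoveries (%) for one analyte.
recoveries = pd.Series(
    [98.2, 101.5, 96.8, 103.1, 99.4, 97.9, 102.2, 100.6],
    index=pd.period_range("2023-01", periods=8, freq="M"),
)

# Three-month moving average and moving SD as the control statistic.
window = recoveries.rolling(3)
summary = pd.DataFrame({"mean": window.mean(), "sd": window.std()})
print(summary.dropna())
```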

  18. Error analysis of motion correction method for laser scanning of moving objects

    Science.gov (United States)

    Goel, S.; Lohani, B.

    2014-05-01

    The limitation of conventional laser scanning methods is that the objects being scanned must be static. The need to scan moving objects has led to the development of new methods capable of generating the correct 3D geometry of moving objects. The literature shows only a few methods capable of addressing object motion during scanning, each using its own models or sensors, and studies on error modelling or analysis of any of these motion correction methods are lacking. In this paper, we develop the error budget and present an analysis of one such 'motion correction' method. This method assumes the availability of position and orientation information for the moving object, which in general can be obtained by installing a POS system on board or by using tracking devices. It then uses this information, along with the laser scanner data, to correct the laser data, resulting in correct geometry despite the object being mobile during scanning; a sketch of this pose-based correction appears below. The major applications of this method lie in the shipping industry, to scan ships either moving or parked at sea, and in scanning objects such as hot air balloons or aerostats; the other 'motion correction' methods described in the literature cannot be applied to such objects, making the chosen method quite unique. This paper presents some interesting insights into the functioning of the 'motion correction' method, as well as a detailed account of the behaviour and variation of the error due to different sensor components, alone and in combination. The analysis can be used to gain insight into the optimal utilization of available components for achieving the best results.
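
    The correction itself amounts to moving each scanned point from the measurement frame into the object's fixed frame using the object's time-stamped pose. The sketch below is a hypothetical, simplified version (rotation about the vertical axis only, one known pose per point), not the paper's formulation:

```python
import numpy as np

def yaw_matrix(yaw):
    """Rotation about the vertical axis by `yaw` radians."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def motion_correct(points, yaws, positions):
    """Transform scanned points into the moving object's fixed frame.

    points    : (n, 3) points as measured, one per time stamp
    yaws      : (n,) object heading at each time stamp (from a POS system)
    positions : (n, 3) object position at each time stamp
    The object motion is removed by applying the inverse pose per point.
    """
    corrected = np.empty_like(points)
    for i, (p, yaw, t) in enumerate(zip(points, yaws, positions)):
        corrected[i] = yaw_matrix(-yaw) @ (p - t)
    return corrected

# Hypothetical example: object translating and slowly rotating while scanned.
pts = np.array([[10.0, 0.0, 1.0], [10.2, 0.1, 1.0], [10.4, 0.2, 1.0]])
yaws = np.array([0.00, 0.01, 0.02])
pos = np.array([[0.0, 0.0, 0.0], [0.2, 0.0, 0.0], [0.4, 0.0, 0.0]])
print(motion_correct(pts, yaws, pos))
```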

  19. Evaluation and analysis of real-time precise orbits and clocks products from different IGS analysis centers

    Science.gov (United States)

    Zhang, Liang; Yang, Hongzhou; Gao, Yang; Yao, Yibin; Xu, Chaoqian

    2018-06-01

    To meet the increasing demands of real-time Precise Point Positioning (PPP) users, real-time satellite orbit and clock products are generated by different International GNSS Service (IGS) real-time analysis centers and can be received publicly through the Internet. Because they are based on different data sources and processing strategies, the real-time products from different analysis centers differ in availability and accuracy. The main objective of this paper is to evaluate the availability and accuracy of different real-time products and their effects on real-time PPP. A total of nine commonly used Real-Time Service (RTS) products, namely IGS01, IGS03, CLK01, CLK15, CLK22, CLK52, CLK70, CLK81 and CLK90, are evaluated in this paper; because not all RTS products support multi-GNSS, only the GPS products are analyzed. Firstly, the availability of all RTS products is analyzed on two levels: epoch availability, indicating whether there is an outage for an epoch, and satellite availability, the number of available satellites per epoch. Then the accuracy of the different RTS products is investigated in terms of nominal accuracy and accuracy degradation over time. Results show that the Root Mean Square Error (RMSE) of the satellite orbits ranges from 3.8 cm to 7.5 cm across the RTS products, while the mean Standard Deviation of Errors (STDE) of the satellite clocks ranges from 1.9 cm to 5.6 cm. The modified Signal In Space Range Error (SISRE) lies between 1.3 cm and 5.5 cm for the different RTS products. The accuracy degradation of the orbits shows a linear trend for all RTS products, while the clock degradation depends on the satellite clock type: the Rb clocks on board GPS IIF satellites have the smallest degradation rate, less than 3 cm over 10 min, whereas the Cs clocks on board GPS IIF have the largest, more than 10 cm over 10 min. Finally, the real-time kinematic PPP is...
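
    A commonly used form of the global-average SISRE statistic combines the radial orbit error with the clock error and down-weights the along-track and cross-track components; for GPS the weights are roughly w_R ≈ 0.98 and 1/k ≈ 1/49. The sketch below computes it under that assumption (the abstract does not state the exact weighting the paper uses); the per-epoch errors are simulated.

```python
import numpy as np

def sisre(radial, along, cross, clock, w_r=0.98, k=49.0):
    """Global-average SISRE (same length units for all inputs).

    Combines orbit errors (radial, along-track, cross-track) with the
    clock error using the standard GPS weighting approximation.
    """
    radial, along, cross, clock = map(np.asarray, (radial, along, cross, clock))
    return np.sqrt(np.mean((w_r * radial - clock) ** 2
                           + (along ** 2 + cross ** 2) / k))

# Hypothetical per-epoch errors in centimetres.
rng = np.random.default_rng(1)
r, a, c = rng.normal(0, 3, 100), rng.normal(0, 5, 100), rng.normal(0, 4, 100)
clk = rng.normal(0, 2, 100)
print(f"SISRE = {sisre(r, a, c, clk):.2f} cm")
```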

  20. Analysis and design of the SI-simulator software system for the VHTR-SI process by using the object-oriented analysis and object-oriented design methodology

    International Nuclear Information System (INIS)

    Chang, Jiwoon; Shin, Youngjoon; Kim, Jihwan; Lee, Kiyoung; Lee, Wonjae; Chang, Jonghwa; Youn, Cheung

    2008-01-01

    The SI-simulator is an application software system that simulates the dynamic behavior of the VHTR-SI process by means of mathematical models. Object-oriented analysis (OOA) and object-oriented design (OOD) methodologies were employed for the SI-simulator development. OOA is concerned with developing software engineering requirements and specifications expressed as the system's object model (composed of a population of interacting objects), as opposed to the traditional data or functional views of systems. OOD techniques are useful for the development of large, complex systems, and OOA/OOD methodology is usually employed to maximize the reusability and extensibility of a software system. In this paper, we present the design features of the SI-simulator software system obtained using the OOA and OOD methodologies.

  1. Laser technology for high precision satellite tracking

    Science.gov (United States)

    Plotkin, H. H.

    1974-01-01

    Fixed and mobile laser ranging stations have been developed to track satellites equipped with retro-reflector arrays. These have operated consistently at data rates of once per second with range precision better than 50 cm, using Q-switched ruby lasers with pulse durations of 20 to 40 nanoseconds. Enhancements are being incorporated to improve the precision to 10 cm and to permit ranging to more distant satellites. These include improved reflector array designs, processing and analysis of the received reflection pulses, and the use of sub-nanosecond pulse duration lasers.

  2. Joint Estimation of Multiple Precision Matrices with Common Structures.

    Science.gov (United States)

    Lee, Wonyul; Liu, Yufeng

    Estimation of inverse covariance matrices, known as precision matrices, is important in various areas of statistical analysis. In this article, we consider estimation of multiple precision matrices sharing some common structures. In this setting, estimating each precision matrix separately can be suboptimal as it ignores the potential common structure. This article proposes a new approach that parameterizes each precision matrix as a sum of common and unique components and estimates the multiple precision matrices in a constrained ℓ1 minimization framework. We establish both estimation and selection consistency of the proposed estimator in the high-dimensional setting, where it achieves a faster convergence rate for the common structure in certain cases. Our numerical examples demonstrate that the new estimator can perform better than several existing methods in terms of entropy loss and Frobenius loss. An application to a glioblastoma cancer data set reveals some interesting gene networks across multiple cancer subtypes.
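
    The paper's constrained ℓ1 estimator is not reproduced here. As a naive stand-in that only illustrates the idea of shared versus unique structure, the sketch below fits a sparse precision matrix per group with scikit-learn's graphical lasso and takes the off-diagonal support shared by all groups as a crude "common" component; the data are synthetic.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(2)

# Two hypothetical groups drawn from related 5-variable distributions.
base = rng.normal(size=(200, 5))
groups = [base + rng.normal(0.0, 0.3, size=base.shape) for _ in range(2)]

precisions = []
for x in groups:
    model = GraphicalLasso(alpha=0.1).fit(x)
    precisions.append(model.precision_)

# Crude "common structure": off-diagonal entries nonzero in every group.
supports = [np.abs(p) > 1e-4 for p in precisions]
common = np.logical_and.reduce(supports)
np.fill_diagonal(common, False)
print("shared off-diagonal edges:\n", common.astype(int))
```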

  3. Some new mathematical methods for variational objective analysis

    Science.gov (United States)

    Wahba, Grace; Johnson, Donald R.

    1994-01-01

    Numerous results were obtained relevant to remote sensing, variational objective analysis, and data assimilation. A list of publications relevant in whole or in part is attached. The principal investigator gave many invited lectures, disseminating the results to the meteorological community as well as the statistical community. A list of invited lectures at meetings is attached, as well as a list of departmental colloquia at various universities and institutes.

  4. Theoretical Analysis of Heat Stress Prefabricating the Crack in Precision Cropping

    Directory of Open Access Journals (Sweden)

    Lijun Zhang

    2013-07-01

    The mathematical model of a metal bar in the course of heat treatment is built by regarding the convective heat transfer process of the bar as the heat-conduction boundary condition. By theoretical analysis and numerical simulation, a theoretical expression for the unsteady multidimensional temperature field of the axisymmetric bar model is obtained. The temperature field distribution at the equivalent tip of the bar's V-shaped notch is given by the ANSYS software, and the quantitative relationship between the temperature at key interior points of the bar and time is determined. Through polynomial curve fitting, the relation between ultimate strength and temperature is also given. Based on this, the influences of the width of the adiabatic boundary and the water velocity on the critical temperature gradient for initiating a heat crack at the tip of the V-shaped notch are analyzed. Experimental results in precision cropping show that the expression for the unsteady multidimensional temperature field is feasible for rapid calculation of crack generation.

  5. Statistical precision of delayed-neutron nondestructive assay techniques

    International Nuclear Information System (INIS)

    Bayne, C.K.; McNeany, S.R.

    1979-02-01

    A theoretical analysis of the statistical precision of delayed-neutron nondestructive assay instruments is presented. Such instruments measure the fissile content of nuclear fuel samples by neutron irradiation and delayed-neutron detection. The precision of these techniques is limited by the statistical nature of the nuclear decay process, but it can be optimized by proper selection of the system operating parameters. Our analysis has three parts: we first present differential-difference equations describing the fundamental physics of the measurements; we then derive complete analytical solutions to these equations; finally, the equations governing the expected number and variance of delayed-neutron counts were programmed to calculate the relative statistical precision for specific system operating parameters. Our results show that Poisson statistics do not govern the number of counts accumulated in multiple irradiation-count cycles and that, in general, maximum count precision does not correspond to maximum count, as first expected. Covariance between the counts of individual cycles must be considered in determining the optimum number of irradiation-count cycles and the optimum irradiation-to-count time ratio. For the assay system in use at ORNL, covariance effects are small, but for systems with short irradiation-to-count transition times, covariance effects force the optimum number of irradiation-count cycles to be half that giving maximum count. We conclude that the equations governing the expected value and variance of delayed-neutron counts have been derived in closed form; these have been computerized and can be used to select optimum operating parameters for delayed-neutron assay devices.
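
    The role of the cycle-to-cycle covariance can be seen from the elementary variance identity for a sum of correlated counts; this is a textbook relation, not a formula quoted from the report:

```latex
\operatorname{Var}\!\Big(\sum_{i=1}^{n} N_i\Big)
  = \sum_{i=1}^{n} \operatorname{Var}(N_i)
  + 2 \sum_{i<j} \operatorname{Cov}(N_i, N_j)
```

    Here N_i is the delayed-neutron count of cycle i. Positive covariance between cycles inflates the total variance above the purely Poisson expectation, which is why the optimum number of cycles can differ from the value that simply maximizes the total count.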

  6. Mediman: Object oriented programming approach for medical image analysis

    International Nuclear Information System (INIS)

    Coppens, A.; Sibomana, M.; Bol, A.; Michel, C.

    1993-01-01

    Mediman is a new image analysis package developed for the quantitative analysis of Positron Emission Tomography (PET) data. It is object-oriented, written in C++, and its user interface is based on InterViews, on top of which new classes have been added. Mediman accesses data using external data representation or an import/export mechanism, which avoids data duplication. Multimodality studies are organized in a simple database that includes images, headers, color tables, lists, objects of interest (OOIs) and history files. Stored color table parameters allow the user to focus directly on the interesting portion of the dynamic range. Lists allow the study to be organized according to modality, acquisition protocol, and temporal and spatial properties. OOIs (points, lines and regions) are stored in absolute 3D coordinates, allowing correlation with other co-registered imaging modalities such as MRI or SPECT. OOIs have visualization properties and are organized into groups. Quantitative ROI analysis of anatomic images consists of position, distance and volume calculations on selected OOIs. An image calculator is connected to Mediman. Quantitation of metabolic images is performed via profiles, sectorization, time-activity curves and kinetic modeling. Mediman is menu- and mouse-driven, macro-commands can be recorded and replayed, and its interface is customizable through a configuration file. The benefits of the object-oriented approach are discussed from a development point of view.

  7. Why precision?

    Energy Technology Data Exchange (ETDEWEB)

    Bluemlein, Johannes

    2012-05-15

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  8. Why precision?

    International Nuclear Information System (INIS)

    Bluemlein, Johannes

    2012-05-01

    Precision measurements together with exact theoretical calculations have led to steady progress in fundamental physics. A brief survey is given on recent developments and current achievements in the field of perturbative precision calculations in the Standard Model of the Elementary Particles and their application in current high energy collider data analyses.

  9. Computer-determined assay time based on preset precision

    International Nuclear Information System (INIS)

    Foster, L.A.; Hagan, R.; Martin, E.R.; Wachter, J.R.; Bonner, C.A.; Malcom, J.E.

    1994-01-01

    Most current assay systems for special nuclear materials (SNM) operate on the principle of a fixed assay time, which provides acceptable measurement precision without sacrificing instrument throughput. Waste items to be assayed for SNM content can contain a wide range of nuclear material, and counting all items for the same preset assay time results in a wide range of measurement precision, wasting time at the upper end of the calibration range. A short sample count taken at the beginning of the assay can instead be used to optimize the analysis time on the basis of the required measurement precision. To illustrate this technique of automatically determining the assay time, measurements were made with a segmented gamma scanner at the Plutonium Facility of Los Alamos National Laboratory, with the assay time for each segment determined by the counting statistics in that segment. Segments with very little SNM were quickly determined to be below the lower limit of the measurement range and the measurement was stopped; segments with significant SNM were assayed to the preset precision. With this method the total assay time for each item is determined by the desired preset precision. This report describes the precision-based algorithm and presents the results of measurements made to test its validity.
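
    For a counting measurement, the relative statistical uncertainty of N accumulated counts falls as 1/sqrt(N), so a precision-based stop rule simply counts until N exceeds (1/target)^2. A minimal sketch of such a loop (not the report's algorithm), with a simulated Poisson source standing in for the detector:

```python
import numpy as np

def count_to_precision(rate_cps, target_rsd, dwell_s=1.0, max_s=600.0):
    """Accumulate counts until the relative SD (1/sqrt(N)) meets target_rsd.

    rate_cps   : simulated source count rate (counts per second)
    target_rsd : desired relative standard deviation, e.g. 0.01 for 1 %
    Returns (total counts, elapsed seconds); stops early at max_s.
    """
    rng = np.random.default_rng(3)
    counts, elapsed = 0, 0.0
    needed = 1.0 / target_rsd ** 2        # N at which 1/sqrt(N) = target
    while counts < needed and elapsed < max_s:
        counts += rng.poisson(rate_cps * dwell_s)
        elapsed += dwell_s
    return counts, elapsed

n, t = count_to_precision(rate_cps=250.0, target_rsd=0.01)
print(f"{n} counts in {t:.0f} s -> RSD = {100 / np.sqrt(n):.2f} %")
```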

  10. Precision mechatronics based on high-precision measuring and positioning systems and machines

    Science.gov (United States)

    Jäger, Gerd; Manske, Eberhard; Hausotte, Tino; Mastylo, Rostyslav; Dorozhovets, Natalja; Hofmann, Norbert

    2007-06-01

    Precision mechatronics is defined in this paper as the science and engineering of a new generation of high-precision systems and machines. Nanomeasuring and nanopositioning engineering are important fields of precision mechatronics, and nanometrology is described as today's limit of precision engineering. The problem of how to design nanopositioning machines with the smallest possible uncertainties is discussed. The integration of several optical and tactile nanoprobes makes the 3D nanopositioning machine suitable for various tasks, such as long-range scanning probe microscopy, mask and wafer inspection, nanotribology, nanoindentation, free-form surface measurement, as well as measurement of microoptics, precision molds, microgears, ring gauges and small holes.

  11. A precision nutrient variability study of an experimental plot in ...

    African Journals Online (AJOL)

    DR F O ADEKAYODE

    ... reported (Sadeghi et al., 2006; Shah et al., 2013). The objective of the research was to use the GIS kriging technique to produce precision soil nutrient concentration and fertility maps of a 2.5-ha experimental plot at the Mukono Agricultural Research and Development Institute, Mukono, Uganda.

  12. An Assessment of Imaging Informatics for Precision Medicine in Cancer.

    Science.gov (United States)

    Chennubhotla, C; Clarke, L P; Fedorov, A; Foran, D; Harris, G; Helton, E; Nordstrom, R; Prior, F; Rubin, D; Saltz, J H; Shalley, E; Sharma, A

    2017-08-01

    Objectives: Precision medicine requires the measurement, quantification, and cataloging of medical characteristics to identify the most effective medical intervention. However, the amount of available data exceeds our current capacity to extract meaningful information. We examine the informatics needs for achieving precision medicine from the perspective of quantitative imaging and oncology. Methods: The National Cancer Institute (NCI) organized several workshops on the topic of medical imaging and precision medicine. The observations and recommendations are summarized herein. Results: Recommendations include: use of standards in data collection and clinical correlates to promote interoperability; data sharing and validation of imaging tools; clinicians' feedback in all phases of research and development; use of open-source architecture to encourage reproducibility and reusability; use of challenges that simulate real-world situations to incentivize innovation; partnership with industry to facilitate commercialization; and education of academic communities regarding the challenges involved in translating technology from the research domain to clinical utility, and the benefits of doing so. Conclusions: This article provides a survey of the role and priorities of imaging informatics in advancing quantitative imaging in the era of precision medicine. While these recommendations were drawn from oncology, they are relevant and applicable to other clinical domains where imaging aids precision medicine.

  13. Video-rate or high-precision: a flexible range imaging camera

    Science.gov (United States)

    Dorrington, Adrian A.; Cree, Michael J.; Carnegie, Dale A.; Payne, Andrew D.; Conroy, Richard M.; Godbaz, John P.; Jongenelen, Adrian P. P.

    2008-02-01

    A range imaging camera produces an output similar to a digital photograph, but every pixel in the image contains distance information as well as intensity. This is useful for measuring the shape, size and location of objects in a scene, and hence is well suited to certain machine vision applications. Previously we demonstrated a heterodyne range imaging system operating in a relatively high-resolution (512-by-512 pixels), high-precision (0.4 mm best case) configuration, but with a slow measurement rate (one frame every 10 s). Although such high-precision ranging is useful for some applications, the low acquisition speed is limiting in many situations. The system's frame rate and length of acquisition are fully configurable in software, which means the measurement rate can be increased by compromising precision and image resolution. In this paper we demonstrate the flexibility of our range imaging system by showing examples of high-precision ranging at slow acquisition speeds and video-rate ranging with reduced precision and resolution. We also show that the heterodyne approach, with more than four samples per beat cycle, provides better linearity than the traditional homodyne quadrature detection approach. Finally, we comment on practical issues of frame rate and beat signal frequency selection.
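
    In this class of camera, range is recovered from the phase of a sampled beat (or modulation) signal. As a generic illustration (not the authors' processing chain), the phase of an N-sample-per-cycle signal can be taken from the first DFT bin; the classic four-sample quadrature estimate is the special case N = 4. The modulation frequency below is hypothetical.

```python
import numpy as np

def phase_from_samples(samples):
    """Phase (radians) of a sampled periodic signal via its first DFT bin.

    For exactly four samples per cycle this reduces to the classic
    quadrature estimate atan2(s3 - s1, s0 - s2).
    """
    s = np.asarray(samples, float)
    n = len(s)
    bin1 = np.sum(s * np.exp(-2j * np.pi * np.arange(n) / n))
    return np.angle(bin1)

def phase_to_range(phase, mod_freq_hz, c=299_792_458.0):
    """Convert modulation phase to distance (one phase-wrap ambiguity)."""
    return (phase % (2 * np.pi)) * c / (4 * np.pi * mod_freq_hz)

# Hypothetical example: 8 samples per beat cycle, injected phase 1.0 rad.
t = np.arange(8) / 8
samples = 1.0 + 0.5 * np.cos(2 * np.pi * t + 1.0)
ph = phase_from_samples(samples)
print(f"estimated phase = {ph:.3f} rad")       # recovers the 1.0 rad offset
print(f"range at 30 MHz = {phase_to_range(ph, 30e6):.3f} m")
```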

  14. Improvement in precision and trueness of quantitative XRF analysis with glass-bead method. 1

    International Nuclear Information System (INIS)

    Yamamoto, Yasuyuki; Ogasawara, Noriko; Yuhara, Yoshitaroh; Yokoyama, Yuichi

    1995-01-01

    The factors that lower the precision of a simultaneous X-ray fluorescence (XRF) spectrometer were investigated. In quantitative analyses of oxide powders with the glass-bead method in particular, the X-ray optical characteristics of the equipment affect the precision of the X-ray intensities. In focused (curved) crystal spectrometers, the precision depends on the deviation of the actual size and position of the crystals from the theoretical design, and thus differs for each crystal and each element. When the deviation is large, the dispersion of the measured X-ray intensities is larger than the statistical dispersion, even though the intensity itself is unchanged. Moreover, waviness of the glass-bead surface shifts the height of the analyzed surface from the designed one; this difference changes the amount of X-rays incident on the analyzing crystal and increases the dispersion of the X-ray intensity. Considering these factors, the level of waviness must be regulated to improve the precision on existing XRF equipment. In this study, the measurement precision of four simultaneous XRF spectrometers was evaluated, and the element lead (Pb-Lβ1) was found to have the lowest precision. The relative standard deviation (RSD) of measurements of 10 glass beads prepared from the same powder sample was 0.3% without regulation of the waviness of the analytical surface. With mechanical flattening of the glass-bead surface, the level of waviness, defined as the maximum difference of heights within a glass bead, was regulated to under 30 μm, and the RSD was 0.038%, almost comparable to the statistical RSD of 0.033%. (author)

  15. Weak gravitational lensing towards high-precision cosmology

    International Nuclear Information System (INIS)

    Berge, Joel

    2007-01-01

    This thesis studies weak gravitational lensing as a tool for high-precision cosmology. We first present the development and validation of a precise and accurate tool for measuring gravitational shear, based on the shapelets formalism, and then use shapelets on real images for the first time: we analyze CFHTLS images and combine them with XMM-LSS data. We measure the normalisation of the density fluctuation power spectrum, σ8, and that of the mass-temperature relation for galaxy clusters. The analysis of the Hubble Space Telescope COSMOS field confirms our σ8 measurement and introduces tomography. Finally, aiming at optimizing future surveys, we compare the individual and combined merits of cluster counts and power spectrum tomography. Our results demonstrate that next-generation surveys will allow weak lensing to reach its full potential in the high-precision cosmology era. (author)

  16. Scout: orbit analysis and hazard assessment for NEOCP objects

    Science.gov (United States)

    Farnocchia, Davide; Chesley, Steven R.; Chamberlin, Alan B.

    2016-10-01

    It typically takes a few days for a newly discovered asteroid to be officially recognized as a real object. During this time, the tentative discovery is published on the Minor Planet Center's Near-Earth Object Confirmation Page (NEOCP), until additional observations confirm that the object is a real asteroid rather than an observational artifact or an artificial object. NEOCP objects may also have a limited observability window and yet be scientifically interesting, e.g., radar and lightcurve targets, mini-moons (temporary Earth captures), mission-accessible targets, close approachers, or even impactors. For instance, the only two asteroids discovered before an impact, 2008 TC3 and 2014 AA, both reached the Earth less than a day after discovery. For these reasons we developed Scout, an automated system that provides an orbital and hazard assessment for NEOCP objects within minutes after observations become available. Scout's rapid analysis increases the chances of securing the trajectory of interesting NEOCP objects before the ephemeris uncertainty grows too large or the observing geometry becomes unfavorable. The generally short observation arcs, perhaps only a few hours or even less, lead to severe degeneracies in the orbit estimation process. To overcome these degeneracies Scout relies on systematic ranging, a technique that derives possible orbits by scanning a grid in the poorly constrained space of topocentric range and range rate, while the plane-of-sky position and motion are tied directly to the recorded observations. This scan yields a distribution of the possible orbits, which in turn identifies the NEOCP objects of most interest, so that follow-up efforts can be prioritized. In particular, Scout ranks objects according to the likelihood of an impact, estimates the close approach distance, the Earth-relative minimum orbit intersection distance and v-infinity, and computes scores identifying objects more likely to be an NEO, a km-sized NEO, or a Potentially Hazardous Asteroid.
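
    Systematic ranging can be pictured as a grid scan over the two unconstrained quantities. The toy Python sketch below only shows that structure: it enumerates a range/range-rate grid and scores each grid point with a dummy function. In the real system each grid point, combined with the observed plane-of-sky position and motion, fixes a full orbit that is scored against the astrometry; none of Scout's actual scoring is reproduced here.

```python
import numpy as np

# Toy systematic-ranging grid: topocentric range (au) and range rate (au/day).
ranges = np.geomspace(1e-3, 1.0, 40)        # log-spaced: nearby to distant
range_rates = np.linspace(-0.05, 0.05, 21)

candidates = []
for rho in ranges:
    for rho_dot in range_rates:
        # Placeholder for the orbit fit quality at this grid point.
        score = np.exp(-((rho - 0.01) ** 2) / 1e-4)   # hypothetical
        candidates.append((rho, rho_dot, score))

best = max(candidates, key=lambda t: t[2])
print(f"best grid point: range {best[0]:.4f} au, rate {best[1]:+.3f} au/day")
```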

  17. Precision Rescue Behavior in North American Ants

    Directory of Open Access Journals (Sweden)

    Katherine Taylor

    2013-07-01

    Altruistic behavior, in which one individual provides aid to another at some cost to itself, is well documented. However, some species engage in a form of altruism, called rescue, that places the altruist in immediate danger. Here we investigate one such example, namely rescuing victims captured by predators. In a field experiment with two North American ant species, Tetramorium sp. E and Prenolepis imparis, individuals were held in artificial snares simulating capture. T. sp. E, but not P. imparis, exhibited digging, pulling, and snare biting, the latter precisely targeted at the object binding the victim. These results are the first to document precision rescue in a North American ant species; moreover, unlike rescue in other ants, T. sp. E rescues conspecifics from different colonies, mirroring their atypical social behavior, namely the lack of aggression between non-nestmate (heterocolonial) conspecifics. In a second, observational study designed to demonstrate rescue from an actual predator, T. sp. E victims were dropped into an antlion's pit and the behavior of a single rescuer was observed. The results showed that T. sp. E not only attempted to release the victim but also risked attacking the predator, suggesting that precision rescue may play an important role in this species' antipredator behavior.

  18. Conflicting Multi-Objective Compatible Optimization Control

    OpenAIRE

    Xu, Lihong; Hu, Qingsong; Hu, Haigen; Goodman, Erik

    2010-01-01

    Based on ideas developed in addressing practical greenhouse environmental control, we propose a new multi-objective compatible control (MOCC) method. Several detailed algorithms are proposed to meet the requirements of different kinds of problem: 1) a two-layer MOCC framework is presented for problems with a precise model; 2) to deal with situations...

  19. Fast and precise method of contingency ranking in modern power system

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2011-01-01

    Contingency analysis is one of the most important aspects of power system security analysis, and this paper presents a fast and precise method of contingency ranking for effective power system security analysis. The method proposed in this research work takes due consideration of both apparent power... It is based on a realistic approach taking practical situations into account and, besides taking real situations into consideration, is fast enough to be considered for on-line security analysis.

  20. Precision-analog fiber-optic transmission system

    International Nuclear Information System (INIS)

    Stover, G.

    1981-06-01

    This article describes the design, experimental development, and construction of a DC-coupled precision analog fiber optic link. Topics to be covered include overall electrical and mechanical system parameters, basic circuit organization, modulation format, optical system design, optical receiver circuit analysis, and the experimental verification of the major design parameters

  1. Objective image analysis of the meibomian gland area.

    Science.gov (United States)

    Arita, Reiko; Suehiro, Jun; Haraguchi, Tsuyoshi; Shirakawa, Rika; Tokoro, Hideaki; Amano, Shiro

    2014-06-01

    To evaluate the meibomian gland area objectively using newly developed software for non-invasive meibography. Eighty eyelids of 42 patients without meibomian gland loss (meiboscore = 0), 105 eyelids of 57 patients with loss of less than one-third of the total meibomian gland area (meiboscore = 1), 13 eyelids of 11 patients with loss of between one-third and two-thirds (meiboscore = 2) and 20 eyelids of 14 patients with loss of more than two-thirds (meiboscore = 3) were studied. Lid borders were determined automatically. The software evaluated the distribution of the luminance and, by enhancing the contrast and reducing image noise, automatically discriminated the meibomian gland area, calculating the ratio of the total meibomian gland area to the total analysis area in all subjects. The repeatability of the software was also evaluated. The mean ratio of the meibomian gland area to the total analysis area in the upper/lower eyelids was 51.9±5.7%/54.7±5.4% in subjects with a meiboscore of 0, 47.7±6.0%/51.5±5.4% with a meiboscore of 1, 32.0±4.4%/37.2±3.5% with a meiboscore of 2 and 16.7±6.4%/19.5±5.8% with a meiboscore of 3. The meibomian gland area was evaluated objectively using the developed software, and this system could be useful for objectively evaluating the effect of treatment on meibomian gland dysfunction.
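
    The processing pipeline described (contrast enhancement, noise reduction, then discrimination of the gland area and an area ratio) maps onto standard image operations. The following is a hypothetical sketch using scikit-image, not the authors' software; the choice of CLAHE, Gaussian smoothing and Otsu thresholding is an assumption.

```python
import numpy as np
from skimage import exposure, filters

def gland_area_ratio(image, analysis_mask):
    """Ratio (%) of detected gland area to the analysis area.

    image         : 2D grayscale meibography image, floats in [0, 1]
    analysis_mask : boolean mask of the tarsal analysis region
    """
    # Contrast enhancement and noise reduction, as described in the record.
    enhanced = exposure.equalize_adapthist(image)
    smoothed = filters.gaussian(enhanced, sigma=1.0)
    # Automatic discrimination of the bright gland pixels (Otsu threshold).
    level = filters.threshold_otsu(smoothed[analysis_mask])
    glands = (smoothed > level) & analysis_mask
    return 100.0 * glands.sum() / analysis_mask.sum()

# Synthetic example: bright vertical "glands" on a darker background.
img = np.tile(np.sin(np.linspace(0, 20 * np.pi, 200)) * 0.25 + 0.5, (100, 1))
mask = np.ones_like(img, dtype=bool)
print(f"gland area ratio = {gland_area_ratio(img, mask):.1f} %")
```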

  2. Virtual learning object and environment: a concept analysis.

    Science.gov (United States)

    Salvador, Pétala Tuani Candido de Oliveira; Bezerril, Manacés Dos Santos; Mariz, Camila Maria Santos; Fernandes, Maria Isabel Domingues; Martins, José Carlos Amado; Santos, Viviane Euzébia Pereira

    2017-01-01

    To analyze the concepts of virtual learning object and virtual learning environment according to Rodgers' evolutionary perspective. Descriptive study with a mixed approach, based on the stages proposed by Rodgers in his concept analysis method. Data collection occurred in August 2015 with a search of dissertations and theses in the Bank of Theses of the Coordination for the Improvement of Higher Education Personnel. Quantitative data were analyzed with simple descriptive statistics, and the concepts through lexicographic analysis with the support of the IRAMUTEQ software. The sample was made up of 161 studies. The concept of "virtual learning environment" was presented in 99 (61.5%) studies, whereas the concept of "virtual learning object" was presented in only 15 (9.3%). It is concluded that a virtual learning environment brings together several different types of virtual learning objects in a common pedagogical context.

  3. Analysis on the precision of the dimensions of self-ligating brackets.

    Science.gov (United States)

    Erduran, Rackel Hatice Milhomens Gualberto; Maeda, Fernando Akio; Ortiz, Sandra Regina Mota; Triviño, Tarcila; Fuziy, Acácio; Carvalho, Paulo Eduardo Guedes

    2016-12-01

    The present study aimed to evaluate the precision of the torque applied by 0.022" self-ligating brackets of different brands, the precision of the parallelism between the inner walls of their slots, and the precision of their slot height. Eighty brackets for upper central incisors from eight trademarked models were selected: Abzil, GAC, American Orthodontics, Morelli, Orthometric, Ormco, Forestadent, and Ortho Organizers. Images of the brackets were obtained using a scanning electron microscope (SEM) and measured using the AutoCAD 2011 software, with the tolerance parameters stated in the ISO 27020 standard used as references. The results showed that only the Orthometric, Morelli, and Ormco groups had results inconsistent with the ISO standard for torque. Regarding the parallelism of the internal slot walls, most of the models studied had results in line with the ISO prescription, except the Morelli group. In the assessment of bracket slot height, only the Forestadent, GAC, American Orthodontics, and Ormco groups presented results in accordance with the ISO standard. The GAC, Forestadent, and American Orthodontics groups did not differ in relation to the three factors of the ISO 27020 standard. Great variability of results is observed for all the variables.

  4. Accuracy and precision of 3 intraoral scanners and accuracy of conventional impressions: A novel in vivo analysis method.

    Science.gov (United States)

    Nedelcu, R; Olsson, P; Nyström, I; Rydén, J; Thor, A

    2018-02-01

    To evaluate a novel methodology using industrial scanners as a reference, to assess the in vivo accuracy of three intraoral scanners (IOS) and conventional impressions, and to evaluate IOS precision in vivo. Four reference bodies were bonded to the buccal surfaces of the upper premolars and incisors in five subjects. After three reference scans with ATOS Core 80 (ATOS), subjects were scanned three times with each of three IOS systems: 3M True Definition (3M), CEREC Omnicam (OMNI) and Trios 3 (TRIOS). One conventional impression (IMPR) was taken with 3M Impregum Penta Soft, and the poured models were digitized with the laboratory scanner 3shape D1000 (D1000). Best-fit alignment of the reference bodies and 3D compare analysis were performed. The precision of ATOS and D1000 was assessed for quantitative evaluation and comparison; the accuracy of the IOS systems and IMPR was analyzed using ATOS as the reference, and IOS precision was evaluated through intra-system comparison. The precision of the ATOS reference scanner (mean 0.6 μm) and of D1000 (mean 0.5 μm) was high. Pairwise multiple comparisons of reference bodies in different tooth positions displayed a statistically significant difference in accuracy between two scanner groups, 3M and TRIOS, over OMNI (p values from 0.0001 to 0.0006). IMPR did not show a statistically significant difference from the IOS systems; however, deviations of IOS and IMPR were of a similar magnitude. No statistical difference was found for IOS precision. The methodology can be used for assessing the accuracy of IOS and IMPR in vivo for up to five units bilaterally from the midline. 3M and TRIOS had higher accuracy than OMNI, with IMPR overlapping both groups. Intraoral scanners can be used as a replacement for conventional impressions when restoring up to ten units without extended edentulous spans.

  5. Practical precision measurement

    International Nuclear Information System (INIS)

    Kwak, Ho Chan; Lee, Hui Jun

    1999-01-01

    This book introduces the basics of precision measurement: measurement of length, precision measurement of minor diameters, measurement of angles, measurement of surface roughness, three-dimensional measurement, measurement of locations and shapes, measurement of screws, gear testing, cutting tool testing, rolling bearing testing, and digital measurement. It covers the height gauge, surface roughness testing, measurement of flatness and straightness, external and internal thread testing, gear tooth measurement, milling cutters, taps, rotation precision measurement, and optical transducers.

  6. A high precision method for quantitative measurements of reactive oxygen species in frozen biopsies.

    Directory of Open Access Journals (Sweden)

    Kirsti Berg

    OBJECTIVE: An electron paramagnetic resonance (EPR) technique using the spin probe cyclic hydroxylamine 1-hydroxy-3-methoxycarbonyl-2,2,5,5-tetramethylpyrrolidine (CMH) was introduced as a versatile method for high-precision quantification of reactive oxygen species, including the superoxide radical, in frozen biological samples such as cell suspensions, blood or biopsies. MATERIALS AND METHODS: Loss of measurement precision and accuracy due to variations in sample size and shape was minimized by assembling the sample in a well-defined volume. Measurement was carried out at low temperature (150 K) using a nitrogen-flow Dewar. The signal intensity was measured from the EPR first-derivative amplitude and related to a sample of 3-carboxy-proxyl (CP•) with known spin concentration. RESULTS: The absolute spin concentration could be quantified with a precision and accuracy better than ±10 µM (k = 1). The spin concentration of samples stored at -80°C could be reproduced after 6 months of storage, well within the same error estimate. CONCLUSION: The absolute spin concentration in wet biological samples such as biopsies, water solutions and cell cultures could be quantified with higher precision and accuracy than is normally achievable using common techniques such as flat cells, tissue cells and various capillary tubes. In addition, biological samples can be collected and stored for future incubation with the spin probe, and further stored for up to at least six months before EPR analysis, without loss of signal intensity. This opens up the possibility of storing and transporting incubated biological samples with known accuracy of the spin concentration over time.
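
    The quantification step described above reduces, in schematic form, to a ratio against the CP• reference (a sketch assuming identical sample geometry and acquisition settings; the record does not give the exact calibration protocol):

        \[
          C_{\text{sample}} \;\approx\; C_{\text{ref}}\,
          \frac{I_{\text{sample}}}{I_{\text{ref}}}
        \]

    where I denotes the EPR first-derivative amplitude and C_ref the known concentration of the CP• standard.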

  7. Shape Analysis of Planar Multiply-Connected Objects Using Conformal Welding.

    Science.gov (United States)

    Lok Ming Lui; Wei Zeng; Shing-Tung Yau; Xianfeng Gu

    2014-07-01

    Shape analysis is a central problem in the field of computer vision. In 2D shape analysis, classification and recognition of objects from their observed silhouettes are crucial but difficult. It usually involves an efficient representation of 2D shape space with a metric, so that its mathematical structure can be used for further analysis. Although the study of 2D simply-connected shapes has been the subject of an extensive literature, the analysis of multiply-connected shapes is comparatively less studied. In this work, we propose a representation for general 2D multiply-connected domains with arbitrary topologies using conformal welding. A metric can be defined on the proposed representation space, which gives a means to measure dissimilarities between objects. The main idea is to map the exterior and interior of the domain conformally to unit disks and circle domains (unit disks with several inner disks removed), using holomorphic 1-forms. A set of diffeomorphisms of the unit circle S^1 can be obtained, which together with the conformal modules are used to define the shape signature. A shape distance between shape signatures can be defined to measure dissimilarities between shapes. We prove theoretically that the proposed shape signature uniquely determines multiply-connected objects under suitable normalization. We also introduce a reconstruction algorithm to obtain shapes from their signatures. This completes our framework and allows us to move back and forth between shapes and signatures. With that, a morphing algorithm between shapes can be developed through interpolation of the Beltrami coefficients associated with the signatures. Experiments have been carried out on shapes extracted from real images. Results demonstrate the efficacy of our proposed algorithm as a stable shape representation scheme.

  8. Performance Analysis of Several GPS/Galileo Precise Point Positioning Models.

    Science.gov (United States)

    Afifi, Akram; El-Rabbany, Ahmed

    2015-06-19

    This paper examines the performance of several precise point positioning (PPP) models, which combine dual-frequency GPS/Galileo observations in the un-differenced and between-satellite single-difference (BSSD) modes. These include the traditional un-differenced model, the decoupled clock model, the semi-decoupled clock model, and the between-satellite single-difference model. We take advantage of the IGS-MGEX network products to correct for the satellite differential code biases and the orbital and satellite clock errors. Natural Resources Canada's GPSPace PPP software is modified to handle the various GPS/Galileo PPP models. A total of six data sets of GPS and Galileo observations at six IGS stations are processed to examine the performance of the various PPP models. It is shown that the traditional un-differenced GPS/Galileo PPP model, the GPS decoupled clock model, and the semi-decoupled clock GPS/Galileo PPP model improve the convergence time by about 25% in comparison with the un-differenced GPS-only model. In addition, the semi-decoupled GPS/Galileo PPP model improves the solution precision by about 25% compared to the traditional un-differenced GPS/Galileo PPP model. Moreover, the BSSD GPS/Galileo PPP model improves the solution convergence time by about 50%, in comparison with the un-differenced GPS PPP model, regardless of the type of BSSD combination used. As well, the BSSD model improves the precision of the estimated parameters by about 50% and 25% when the loose and the tight combinations are used, respectively, in comparison with the un-differenced GPS-only model. Comparable results are obtained through the tight combination when either a GPS or a Galileo satellite is selected as a reference.
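
    For orientation, the core of the BSSD model named above can be written in one line (a standard textbook form on a single frequency; the paper's full model also carries code biases and other terms): differencing the carrier-phase observable between satellites j and k at the same receiver r cancels the receiver clock error,

        \[
          \Phi_r^{jk} \;=\; \Phi_r^{j}-\Phi_r^{k}
          \;=\; \rho_r^{jk} \;-\; c\,(dt^{j}-dt^{k}) \;+\; \lambda\,N_r^{jk} \;+\; \varepsilon_r^{jk}
        \]

    where ρ is the geometric range, dt^j the satellite clock error (corrected with the IGS-MGEX products), N the ambiguity and ε the noise; eliminating the receiver clock is what helps shorten convergence.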

  9. Memory color of natural familiar objects: effects of surface texture and 3-D shape.

    Science.gov (United States)

    Vurro, Milena; Ling, Yazhu; Hurlbert, Anya C

    2013-06-28

    Natural objects typically possess characteristic contours, chromatic surface textures, and three-dimensional shapes. These diagnostic features aid object recognition, as does memory color, the color most associated in memory with a particular object. Here we aim to determine whether polychromatic surface texture, 3-D shape, and contour diagnosticity improve memory color for familiar objects, separately and in combination. We use solid three-dimensional familiar objects rendered with their natural texture, which participants adjust in real time to match their memory color for the object. We analyze mean, accuracy, and precision of the memory color settings relative to the natural color of the objects under the same conditions. We find that in all conditions, memory colors deviate slightly but significantly in the same direction from the natural color. Surface polychromaticity, shape diagnosticity, and three dimensionality each improve memory color accuracy, relative to uniformly colored, generic, or two-dimensional shapes, respectively. Shape diagnosticity improves the precision of memory color also, and there is a trend for polychromaticity to do so as well. Unlike other studies, we find that the object contour alone also improves memory color. Thus, enhancing the naturalness of the stimulus, in terms of either surface or shape properties, enhances the accuracy and precision of memory color. The results support the hypothesis that memory color representations are polychromatic and are synergistically linked with diagnostic shape representations.

  10. Real-time GPS seismology using a single receiver: method comparison, error analysis and precision validation

    Science.gov (United States)

    Li, Xingxing

    2014-05-01

    Earthquake monitoring and early warning systems for hazard assessment and mitigation have traditionally been based on seismic instruments. However, for large seismic events, it is difficult for traditional seismic instruments to produce accurate and reliable displacements because of the saturation of broadband seismometers and the problematic integration of strong-motion data. Compared with traditional seismic instruments, GPS can measure arbitrarily large dynamic displacements without saturation, making it particularly valuable in the case of large earthquakes and tsunamis. The GPS relative positioning approach is usually adopted to estimate seismic displacements, since centimeter-level accuracy can be achieved in real time by processing double-differenced carrier-phase observables. However, the relative positioning method requires a local reference station, which might itself be displaced during a large seismic event, resulting in misleading GPS analysis results. Meanwhile, the relative/network approach is time-consuming, and particularly difficult for the simultaneous, real-time analysis of GPS data from hundreds or thousands of ground stations. In recent years, several single-receiver approaches for real-time GPS seismology, which can overcome the reference-station problem of the relative positioning approach, have been successfully developed and applied to GPS seismology. One available method is real-time precise point positioning (PPP), which relies on precise satellite orbit and clock products. However, real-time PPP needs a long (re)convergence period, of about thirty minutes, to resolve integer phase ambiguities and achieve centimeter-level accuracy. In comparison with PPP, Colosimo et al. (2011) proposed a variometric approach to determine the change of position between two adjacent epochs; displacements are then obtained by a single integration of the delta positions. This approach does not suffer from a convergence process, but the single integration from delta positions to
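
    In schematic form (a simplified statement of the variometric idea, not the paper's full observation model), epoch-wise position changes are estimated from time-differenced carrier phases and summed into a displacement:

        \[
          \delta\mathbf{x}_k \;\approx\; \mathbf{x}(t_{k+1})-\mathbf{x}(t_k),
          \qquad
          \mathbf{d}(t_n)\;=\;\sum_{k=0}^{n-1}\delta\mathbf{x}_k .
        \]

    Because each estimated delta position carries a small error, the single integration can accumulate drift over time, which is the known trade-off of this approach.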

  11. 3D object-oriented image analysis in 3D geophysical modelling

    DEFF Research Database (Denmark)

    Fadel, I.; van der Meijde, M.; Kerle, N.

    2015-01-01

    Non-uniqueness of satellite gravity interpretation has traditionally been reduced by using a priori information from seismic tomography models. This reduction in the non-uniqueness has been based on velocity-density conversion formulas or user interpretation of the 3D subsurface structures (objects) based on the seismic tomography models and then forward modelling these objects. However, this form of object-based approach has been done without a standardized methodology on how to extract the subsurface structures from the 3D models. In this research, a 3D object-oriented image analysis (3D OOA) approach was implemented to extract the 3D subsurface structures from geophysical data. The approach was applied on a 3D shear wave seismic tomography model of the central part of the East African Rift System. Subsequently, the extracted 3D objects from the tomography model were reconstructed in the 3D...

  12. Evaluation of precision and accuracy of neutron activation analysis method of environmental samples analysis

    International Nuclear Information System (INIS)

    Wardani, Sri; Rina M, Th.; L, Dyah

    2000-01-01

    Evaluation of the precision and accuracy of the Neutron Activation Analysis (NAA) method used by P2TRR was performed by analyzing standard reference samples from the National Institute for Environmental Studies of Japan (NIES CRM No.10, rice flour) and the National Bureau of Standards of the USA (NBS SRM 1573a, tomato leaves). Qualitative NAA could identify multiple elements, namely: Br, Ca, Co, Cl, Cs, Gd, I, K, La, Mg, Mn, Na, Pa, Sb, Sm, Sr, Ta, Th, and Zn (19 elements) for SRM 1573a; As, Br, Cr, Cl, Ce, Co, Cs, Fe, Ga, Hg, K, Mn, Mg, Mo, Na, Ni, Pb, Rb, Sr, Se, Sc, Sb, Ti, and Zn (25 elements) for CRM No.10a; Ag, As, Br, Cr, Cl, Ce, Cd, Co, Cs, Eu, Fe, Ga, Hg, K, Mg, Mn, Mo, Na, Nb, Pb, Rb, Sb, Sc, Th, Tl, and Zn (26 elements) for CRM No.10b; and As, Br, Co, Cl, Ce, Cd, Ga, Hg, K, Mn, Mg, Mo, Na, Nb, Pb, Rb, Sb, Se, Tl, and Zn (20 elements) for CRM No.10c. In the quantitative analysis, only some elements of the sample contents could be determined, namely: As, Co, Cd, Mo, Mn, and Zn. Compared with the NIES or NBS values, the results agreed within a deviation of 3% ∼ 15%. Overall, the results show that the method and facilities have good capability, but the irradiation facility and the gamma-ray spectrometry software need further development.

  13. The Analysis of Dimensionality Reduction Techniques in Cryptographic Object Code Classification

    Energy Technology Data Exchange (ETDEWEB)

    Jason L. Wright; Milos Manic

    2010-05-01

    This paper compares the application of three different dimension reduction techniques to the problem of locating cryptography in compiled object code. A simple classifier is used to compare dimension reduction via sorted covariance, principal component analysis, and correlation-based feature subset selection. The analysis concentrates on the classification accuracy as the number of dimensions is increased.
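
    As a rough illustration of the experiment's shape (a sketch under assumptions: synthetic data stands in for the object-code feature vectors, and Gaussian naive Bayes stands in for the unspecified simple classifier), one can sweep the number of retained dimensions and track classification accuracy:

        from sklearn.datasets import make_classification
        from sklearn.decomposition import PCA
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_score

        # Synthetic stand-in for feature vectors extracted from compiled object code.
        X, y = make_classification(n_samples=500, n_features=64,
                                   n_informative=12, random_state=0)

        # Classification accuracy as a function of retained PCA dimensions.
        for n_dims in (2, 4, 8, 16, 32):
            Z = PCA(n_components=n_dims, random_state=0).fit_transform(X)
            acc = cross_val_score(GaussianNB(), Z, y, cv=5).mean()
            print(f"{n_dims:2d} dimensions: accuracy = {acc:.3f}")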

  14. Design principles for six degrees-of-freedom MEMS-based precision manipulators

    NARCIS (Netherlands)

    Brouwer, Dannis Michel

    2007-01-01

    In the future, the precision manipulation of small objects will become more and more important for appliances such as data storage, micro assembly, sample manipulation in microscopes, cell manipulation, and manipulation of beam paths by micro mirrors. At the same time, there is a drive towards

  15. A Retrospective Analysis of Precision Medicine Outcomes in Patients With Advanced Cancer Reveals Improved Progression-Free Survival Without Increased Health Care Costs.

    Science.gov (United States)

    Haslem, Derrick S; Van Norman, S Burke; Fulde, Gail; Knighton, Andrew J; Belnap, Tom; Butler, Allison M; Rhagunath, Sharanya; Newman, David; Gilbert, Heather; Tudor, Brian P; Lin, Karen; Stone, Gary R; Loughmiller, David L; Mishra, Pravin J; Srivastava, Rajendu; Ford, James M; Nadauld, Lincoln D

    2017-02-01

    The advent of genomic diagnostic technologies such as next-generation sequencing has recently enabled the use of genomic information to guide targeted treatment in patients with cancer, an approach known as precision medicine. However, clinical outcomes, including survival and the cost of health care associated with precision cancer medicine, have been challenging to measure and remain largely unreported. We conducted a matched cohort study of 72 patients with metastatic cancer of diverse subtypes in the setting of a large, integrated health care delivery system. We analyzed the outcomes of 36 patients who received genomic testing and targeted therapy (precision cancer medicine) between July 1, 2013, and January 31, 2015, compared with 36 historical control patients who received standard chemotherapy (n = 29) or best supportive care (n = 7). The average progression-free survival was 22.9 weeks for the precision medicine group and 12.0 weeks for the control group (P = .002) with a hazard ratio of 0.47 (95% CI, 0.29 to 0.75) when matching on age, sex, histologic diagnosis, and previous lines of treatment. In a subset analysis of patients who received all care within the Intermountain Healthcare system (n = 44), per patient charges per week were $4,665 in the precision treatment group and $5,000 in the control group (P = .126). These findings suggest that precision cancer medicine may improve survival for patients with refractory cancer without increasing health care costs. Although the results of this study warrant further validation, this precision medicine approach may be a viable option for patients with advanced cancer.

  16. Panel 3: Genetics and Precision Medicine of Otitis Media.

    Science.gov (United States)

    Lin, Jizhen; Hafrén, Hena; Kerschner, Joseph; Li, Jian-Dong; Brown, Steve; Zheng, Qing Y; Preciado, Diego; Nakamura, Yoshihisa; Huang, Qiuhong; Zhang, Yan

    2017-04-01

    Objective: To perform a comprehensive review of the literature up to 2015 on the genetics and precision medicine relevant to otitis media. Data Sources: PubMed database of the National Library of Medicine. Review Methods: Two subpanels were formed comprising experts in the genetics and precision medicine of otitis media. Each of the panels reviewed the literature in their respective fields and wrote draft reviews. The reviews were shared with all panel members, and a merged draft was created. The entire panel met at the 18th International Symposium on Recent Advances in Otitis Media in June 2015, discussed the review and refined the content. A final draft was made, circulated, and approved by the panel members. Conclusion: Many genes relevant to otitis media have been identified in the last 4 years, advancing our knowledge regarding the predisposition of the middle ear mucosa to commensals and pathogens. Advances include mutant animal models and clinical studies. Many signaling pathways are involved in the predisposition to otitis media. Implications for Practice: New knowledge on the genetic background relevant to otitis media forms a basis for novel potential interventions, including potential new ways to treat otitis media.

  17. Human genomics projects and precision medicine.

    Science.gov (United States)

    Carrasco-Ramiro, F; Peiró-Pastor, R; Aguado, B

    2017-09-01

    The completion of the Human Genome Project (HGP) in 2001 opened the floodgates to a deeper understanding of medicine. There are dozens of HGP-like projects which involve from a few tens to several million genomes currently in progress, which vary from having specialized goals or a more general approach. However, data generation, storage, management and analysis in public and private cloud computing platforms have raised concerns about privacy and security. The knowledge gained from further research has changed the field of genomics and is now slowly permeating into clinical medicine. The new precision (personalized) medicine, where genome sequencing and data analysis are essential components, allows tailored diagnosis and treatment according to the information from the patient's own genome and specific environmental factors. P4 (predictive, preventive, personalized and participatory) medicine is introducing new concepts, challenges and opportunities. This review summarizes current sequencing technologies, concentrates on ongoing human genomics projects, and provides some examples in which precision medicine has already demonstrated clinical impact in diagnosis and/or treatment.

  18. Simulation analysis of photometric data for attitude estimation of unresolved space objects

    Science.gov (United States)

    Du, Xiaoping; Gou, Ruixin; Liu, Hao; Hu, Heng; Wang, Yang

    2017-10-01

    The acquisition of attitude information for unresolved space objects, such as micro/nano satellites and GEO objects observed with ground-based optical telescopes, is a challenge for space surveillance. In this paper, a method is proposed to estimate the space object (SO) attitude state from simulation analysis of photometric data in different attitude states. An object shape model was established and the parameters of the bidirectional reflectance distribution function (BRDF) model were determined; the space object photometric model was then built. Furthermore, the photometric data of space objects in different states were analyzed by simulation, and the regular characteristics of the photometric curves were summarized. The simulation results show that the photometric characteristics are useful for attitude inversion in a unique way. Thus, this paper provides a new approach to space object identification.

  19. Duality and calculus of convex objects (theory and applications)

    International Nuclear Information System (INIS)

    Brinkhuis, Ya; Tikhomirov, V M

    2007-01-01

    A new approach to convex calculus is presented, which allows one to treat from a single point of view duality and calculus for various convex objects. This approach is based on the possibility of associating with each convex object (a convex set or a convex function) a certain convex cone without loss of information about the object. From the duality theorem for cones duality theorems for other convex objects are deduced as consequences. The theme 'Duality formulae and the calculus of convex objects' is exhausted (from a certain precisely formulated point of view). Bibliography: 5 titles.

  20. Sleep deprivation accelerates delay-related loss of visual short-term memories without affecting precision.

    Science.gov (United States)

    Wee, Natalie; Asplund, Christopher L; Chee, Michael W L

    2013-06-01

    Visual short-term memory (VSTM) is an important measure of information processing capacity and supports many higher-order cognitive processes. We examined how sleep deprivation (SD) and maintenance duration interact to influence the number and precision of items in VSTM, using an experimental design that limits the contribution of lapses at encoding. For each trial, participants attempted to maintain the location and color of three stimuli over a delay. After a retention interval of either 1 or 10 seconds, participants reported the color of the item at the cued location by selecting it on a color wheel. The probability of reporting the probed item, the precision of report, and the probability of reporting a nonprobed item were determined using a mixture-modeling analysis. Design: Participants were studied twice in counterbalanced order, once after a night of normal sleep and once following a night of sleep deprivation. Setting: Sleep laboratory. Participants: Nineteen healthy college-age volunteers (seven females) with regular sleep patterns. Intervention: Approximately 24 hours of total SD. Results: SD selectively reduced the number of integrated representations that can be retrieved after a delay, while leaving the precision of object information in the stored representations intact. Delay interacted with SD to lower the rate of successful recall. Conclusions: Visual short-term memory is compromised during sleep deprivation, an effect compounded by delay. However, when memories are retrieved, they tend to be intact.
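
    The mixture-modeling analysis referred to above is commonly formulated as a three-component density over the report error (an assumed standard form in the Zhang-Luck/Bays tradition; the record does not spell out the exact variant used):

        \[
          p(\hat\theta) \;=\; p_t\,\phi_\kappa\!\left(\hat\theta-\theta\right)
          \;+\; p_{nt}\,\frac{1}{m}\sum_{i=1}^{m}\phi_\kappa\!\left(\hat\theta-\theta_i\right)
          \;+\; \left(1-p_t-p_{nt}\right)\frac{1}{2\pi}
        \]

    where θ is the probed item's color, θ_i are the m nonprobed colors, φ_κ is a von Mises density whose concentration κ indexes precision, and p_t and p_nt are the probabilities of reporting the probed and a nonprobed item, respectively.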

  1. Analysis of residual stress in subsurface layers after precision hard machining of forging tools

    Directory of Open Access Journals (Sweden)

    Czan Andrej

    2018-01-01

    This paper focuses on the analysis of residual stress in functional surfaces and subsurface layers created by precision hard-machining technologies for advanced structural materials of forging tools. The experiments were oriented toward monitoring residual stress in surfaces created by hard turning (roughing and finishing operations). Subsequently, these surfaces were etched away in thin layers by electro-chemical polishing, and the residual stress was monitored in each etched layer. The measurements were performed with a portable X-ray diffractometer for the detection of residual stress and structural phases. The results clearly indicate the rise and distribution of residual stress in surface and subsurface layers and their impact on the functional properties of surface integrity.
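
    Portable X-ray diffractometers typically evaluate residual stress with the sin²ψ technique (an assumption here; the record does not name the evaluation method): the lattice strain measured at tilt angle ψ varies linearly with sin²ψ, and the in-plane stress σ_φ follows from the slope,

        \[
          \varepsilon_{\phi\psi} \;=\; \frac{1+\nu}{E}\,\sigma_\phi\,\sin^2\psi
          \;-\; \frac{\nu}{E}\left(\sigma_1+\sigma_2\right)
        \]

    with E the Young's modulus and ν the Poisson's ratio of the measured phase.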

  2. Topological situational analysis and synthesis of strategies of object management in the conditions of conflict, uncertainty of behaviour and varible amount of the observed objects

    Directory of Open Access Journals (Sweden)

    Віктор Володимирович Семко

    2016-09-01

    The conflict of interacting objects in the observation space is considered as an integral phenomenon with a certain variety of types of connections between its elements, objects, systems and the environment, built into a single theoretical conception that comprehensively and deeply determines the real features of the object of research. The methodology of system-structural analysis is used to study the conflict as a whole phenomenon, and system-functional analysis to determine all of its basic interconnections with the environment.

  3. Precision lens assembly with alignment turning system

    Science.gov (United States)

    Ho, Cheng-Fang; Huang, Chien-Yao; Lin, Yi-Hao; Kuo, Hui-Jean; Kuo, Ching-Hsiang; Hsu, Wei-Yao; Chen, Fong-Zhi

    2017-10-01

    The poker-chip assembly of high-precision lens barrels is widely applied in ultra-high-performance optical systems. ITRC applies poker-chip assembly technology to high-numerical-aperture objective lenses and lithography projection lenses because of its highly efficient assembly process. In order to achieve the high-precision lens cells needed for poker-chip assembly, an alignment turning system (ATS) was developed. The ATS includes measurement, alignment and turning modules. The measurement module is equipped with a non-contact displacement sensor (NCDS) and an autocollimator (ACM), which are used to measure the centration errors of the top and bottom surfaces of a lens, respectively; the required adjustment in displacement and tilt with respect to the rotational axis of the turning machine can then be determined for the alignment module. After the measurement, alignment and turning processes on the ATS, the centration error of a lens cell 200 mm in diameter can be controlled within 10 arcsec. Furthermore, a poker-chip assembly lens cell with three sub-cells is demonstrated; each sub-cell was measured and completed through the alignment and turning processes. The lens assembly was tested five times by each of three technicians; the average transmission centration error of the assembled lens was 12.45 arcsec. The results show that the ATS can achieve highly efficient assembly for precision optical systems.

  4. Chromatographic speciation of Cr(III)-species, inter-species equilibrium isotope fractionation and improved chemical purification strategies for high-precision isotope analysis

    DEFF Research Database (Denmark)

    Larsen, Kirsten Kolbjørn; Wielandt, Daniel Kim Peel; Schiller, Martin

    2016-01-01

    Chromatographic purification of chromium (Cr), which is required for high-precision isotope analysis, is complicated by the presence of multiple Cr-species with different effective charges in the acid digested sample aliquots. The differing ion exchange selectivity and sluggish reaction rates of ...

  5. Feasibility study for objective oriented design of system thermal hydraulic analysis program

    International Nuclear Information System (INIS)

    Chung, Bub Dong; Jeong, Jae Jun; Hwang, Moon Kyu

    2008-01-01

    System safety analysis codes, such as RELAP5, TRAC and CATHARE, have been developed in the Fortran language over the past few decades. Refactoring of these conventional codes has also been performed to improve code readability and maintenance. However, the programming paradigm in software technology has shifted to object-oriented programming (OOP), which is based on several techniques, including encapsulation, modularity, polymorphism, and inheritance. In this work, object-oriented programming of a system safety analysis code has been attempted utilizing a modernized C language. The analysis, design, implementation and verification steps for OOP system-code development are described with some implementation examples. The system code SYSTF, based on a three-fluid thermal-hydraulic solver, has been developed following the OOP design. Verification of feasibility was performed with simple fundamental problems and plant models. (author)

  6. Laser-induced breakdown spectroscopy (LIBS) analysis of calcium ions dissolved in water using filter paper substrates: an ideal internal standard for precision improvement.

    Science.gov (United States)

    Choi, Daewoong; Gong, Yongdeuk; Nam, Sang-Ho; Han, Song-Hee; Yoo, Jonghyun; Lee, Yonghoon

    2014-01-01

    We report an approach for selecting an internal standard to improve the precision of laser-induced breakdown spectroscopy (LIBS) analysis for determining the calcium (Ca) concentration in water. The dissolved Ca(2+) ions were pre-concentrated on filter paper by evaporating the water. The filter paper was dried and analyzed using LIBS. By adding strontium chloride to the sample solutions and using the Sr II line at 407.771 nm for intensity normalization of the Ca II lines at 393.366 or 396.847 nm, the analysis precision could be significantly improved. The Ca II and Sr II line intensities were mapped across the filter paper, and they showed a strong positive shot-to-shot correlation with the same spatial distribution on the filter paper surface. We applied this approach to the measurement of Ca(2+) in tap, bottled, and ground water samples. The Ca(2+) concentrations determined using LIBS are in good agreement with those obtained from flame atomic absorption spectrometry. Finally, we suggest a homologous relation among the strongest emission lines of period 4 and 5 elements in groups IA and IIA, based on their similar electronic structures. Our results indicate that LIBS can be effectively applied to liquid analysis at the sub-parts-per-million level with high precision, using a simple drying of liquid solutions on filter paper and the correct internal standard element with a valence electronic structure similar to that of the analyte of interest.
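
    The internal-standard step reduces shot-to-shot variation by ratioing the analyte line against the reference line before calibration (a schematic of the standard practice; the record does not give the calibration curve actually fitted):

        \[
          I_{\text{norm}} \;=\; \frac{I_{\text{Ca II},\,393.366\,\text{nm}}}{I_{\text{Sr II},\,407.771\,\text{nm}}},
          \qquad
          I_{\text{norm}} \;=\; a\,C_{\text{Ca}} + b
        \]

    Because both lines fluctuate together from shot to shot, the ratio cancels much of the common variation in ablation and plasma conditions.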

  7. Stereological analysis of spatial structures

    DEFF Research Database (Denmark)

    Hansen, Linda Vadgård

    The thesis deals with stereological analysis of spatial structures. One area of focus has been to improve the precision of well-known stereological estimators by including information that is available via automatic image analysis. Furthermore, the thesis presents a stochastic model for star-shaped three-dimensional objects using the radial function. The model is highly flexible in the sense that it can be used to describe objects with arbitrarily irregular surfaces. Results on the distribution of well-known local stereological volume estimators are provided...

  8. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    Science.gov (United States)

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial component. The raw spectrum consists of signals from the sample and scattering and other random disturbances that can critically influence the quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
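
    A minimal sketch of the DE-based selection idea (assumptions: synthetic spectra, a plain linear calibration model, and a 0.5 threshold mapping continuous genes to a wavelength mask; the paper's actual objective function and model are not reproduced here):

        import numpy as np
        from scipy.optimize import differential_evolution
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        n_samples, n_wavelengths = 60, 30
        X = rng.normal(size=(n_samples, n_wavelengths))   # synthetic absorbance spectra
        y = X[:, [3, 7, 12]] @ np.array([0.8, 1.5, 0.6]) + 0.05 * rng.normal(size=n_samples)

        def fitness(weights):
            mask = weights > 0.5                          # threshold genes into a wavelength mask
            if not mask.any():
                return 1e6                                # penalize empty selections
            # Cross-validated RMSE of a linear calibration on the selected wavelengths.
            return -cross_val_score(LinearRegression(), X[:, mask], y,
                                    scoring="neg_root_mean_squared_error", cv=5).mean()

        result = differential_evolution(fitness, bounds=[(0.0, 1.0)] * n_wavelengths,
                                        popsize=8, maxiter=10, seed=0, polish=False)
        print("selected wavelengths:", np.flatnonzero(result.x > 0.5))

    Each candidate solution is scored by cross-validated prediction error on its selected wavelength subset, so DE searches directly for subsets that predict concentration well.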

  9. Moving the Weber Fraction: The Perceptual Precision for Moment of Inertia Increases with Exploration Force

    Science.gov (United States)

    Debats, Nienke B.; Kingma, Idsart; Beek, Peter J.; Smeets, Jeroen B. J.

    2012-01-01

    How does the magnitude of the exploration force influence the precision of haptic perceptual estimates? To address this question, we examined the perceptual precision for moment of inertia (i.e., an object's “angular mass”) under different force conditions, using the Weber fraction to quantify perceptual precision. Participants rotated a rod around a fixed axis and judged its moment of inertia in a two-alternative forced-choice task. We instructed different levels of exploration force, thereby manipulating the magnitude of both the exploration force and the angular acceleration. These are the two signals that are needed by the nervous system to estimate moment of inertia. Importantly, one can assume that the absolute noise on both signals increases with an increase in the signals' magnitudes, while the relative noise (i.e., noise/signal) decreases with an increase in signal magnitude. We examined how the perceptual precision for moment of inertia was affected by this neural noise. In a first experiment we found that a low exploration force caused a higher Weber fraction (22%) than a high exploration force (13%), which suggested that the perceptual precision was constrained by the relative noise. This hypothesis was supported by the result of a second experiment, in which we found that the relationship between exploration force and Weber fraction had a similar shape as the theoretical relationship between signal magnitude and relative noise. The present study thus demonstrated that the amount of force used to explore an object can profoundly influence the precision by which its properties are perceived. PMID:23028437
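
    For reference, the Weber fraction quoted above is the just-noticeable relative difference (the standard definition; the 22% and 13% figures are the paper's own):

        \[
          W \;=\; \frac{\Delta I}{I}
        \]

    where I is the reference moment of inertia and ΔI the smallest increment that can be reliably discriminated; W = 0.13 means two rods must differ by roughly 13% in moment of inertia before participants can tell them apart.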

  10. Art, historical and cultural heritage objects studied with different non-destructive analysis

    International Nuclear Information System (INIS)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M.

    2012-01-01

    Since 2003, the analysis of art, historical and cultural heritage objects has been performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated for better characterization of the objects, and the examinations were expanded to other non-destructive techniques such as portable X-ray fluorescence (XRF), digitized radiography, high-resolution photography with visible and UV (ultraviolet) light, and reflectography in the infrared region. These non-destructive analytical techniques, applied systematically, are helping us to better understand the objects: their main components, their conservation status, and the creative process of the artist; in easel paintings in particular they allow new discoveries to be made. The external beam setup at the LAMFI laboratory is configured to allow different simultaneous analyses by PIXE/PIGE (Particle Induced X-ray Emission / Particle Induced Gamma-ray Emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence), and to expand the archaeometric results using ion beams. PIXE and XRF analyses are important to characterize the elements present in the objects, pigments and other materials. Digitized radiography has provided important information about the internal structure of the objects, the manufacturing process, and internal particles; in the case of easel paintings it can reveal features of the artist's creative process, showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  11. Art, historical and cultural heritage objects studied with different non-destructive analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rizzutto, Marcia A.; Tabacniks, Manfredo H.; Added, Nemitala; Campos, Pedro H.O.V.; Curado, Jessica F.; Kajiya, Elizabeth A.M. [Universidade de Sao Paulo (IF/USP), SP (Brazil). Inst. de Fisica

    2012-07-01

    Since 2003, the analysis of art, historical and cultural heritage objects has been performed at the Laboratorio de Analise de Materiais of the Instituto de Fisica of the Universidade de Sao Paulo (LAMFI-USP). Initially the studies were restricted to non-destructive methods using ion beams to characterize the chemical elements present in the objects. Recently, new analytical techniques and procedures have been incorporated for better characterization of the objects, and the examinations were expanded to other non-destructive techniques such as portable X-ray fluorescence (XRF), digitized radiography, high-resolution photography with visible and UV (ultraviolet) light, and reflectography in the infrared region. These non-destructive analytical techniques, applied systematically, are helping us to better understand the objects: their main components, their conservation status, and the creative process of the artist; in easel paintings in particular they allow new discoveries to be made. The external beam setup at the LAMFI laboratory is configured to allow different simultaneous analyses by PIXE/PIGE (Particle Induced X-ray Emission / Particle Induced Gamma-ray Emission), RBS (Rutherford Backscattering) and IBL (Ion Beam Luminescence), and to expand the archaeometric results using ion beams. PIXE and XRF analyses are important to characterize the elements present in the objects, pigments and other materials. Digitized radiography has provided important information about the internal structure of the objects, the manufacturing process, and internal particles; in the case of easel paintings it can reveal features of the artist's creative process, showing hidden images and the first paintings done by the artist in the background. Some Brazilian paintings studied by IR imaging revealed underlying drawings, which allowed us to discover the process of creation and also some

  12. N-of-1-pathways MixEnrich: advancing precision medicine via single-subject analysis in discovering dynamic changes of transcriptomes.

    Science.gov (United States)

    Li, Qike; Schissler, A Grant; Gardeux, Vincent; Achour, Ikbel; Kenost, Colleen; Berghout, Joanne; Li, Haiquan; Zhang, Hao Helen; Lussier, Yves A

    2017-05-24

    Transcriptome analytic tools are commonly used across patient cohorts to develop drugs and predict clinical outcomes. However, as precision medicine pursues more accurate and individualized treatment decisions, these methods are not designed to address single-patient transcriptome analyses. We previously developed and validated the N-of-1-pathways framework using two methods, Wilcoxon and Mahalanobis Distance (MD), for personal transcriptome analysis derived from a pair of samples of a single patient. Although both methods uncover concordantly dysregulated pathways, they are not designed to detect dysregulated pathways with both up- and down-regulated genes (bidirectional dysregulation), which are ubiquitous in biological systems. We developed N-of-1-pathways MixEnrich, a mixture model followed by a gene set enrichment test, to uncover bidirectionally and concordantly dysregulated pathways one patient at a time. We assess its accuracy in a comprehensive simulation study and in an RNA-Seq data analysis of head and neck squamous cell carcinomas (HNSCCs). In the presence of bidirectionally dysregulated genes in the pathway or of high background noise, MixEnrich substantially outperforms previous single-subject transcriptome analysis methods, both in the simulation study and in the HNSCC data analysis (ROC curves; higher true-positive rates; lower false-positive rates). Bidirectional and concordant dysregulated pathways uncovered by MixEnrich in each patient largely overlapped with the quasi-gold standard, compared to other single-subject and cohort-based transcriptome analyses. The greater performance of MixEnrich presents an advantage over previous methods to meet the promise of providing accurate personal transcriptome analysis to support precision medicine at the point of care.
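
    A minimal sketch of the enrichment stage only (the mixture-model call of dysregulated genes is simplified here to a |z| > 2 threshold, which is not MixEnrich's actual model; pathway membership and scores are synthetic):

        import numpy as np
        from scipy.stats import fisher_exact

        rng = np.random.default_rng(1)
        n_genes = 5000
        z = rng.normal(size=n_genes)                 # paired-sample expression change scores
        dysregulated = np.abs(z) > 2.0               # stand-in for a mixture-model posterior call
        pathway = np.zeros(n_genes, dtype=bool)
        pathway[:80] = True                          # a hypothetical 80-gene pathway

        # 2x2 table: (in pathway vs not) x (dysregulated vs not).
        table = [[np.sum(pathway & dysregulated), np.sum(pathway & ~dysregulated)],
                 [np.sum(~pathway & dysregulated), np.sum(~pathway & ~dysregulated)]]
        odds, p = fisher_exact(table, alternative="greater")
        print(f"odds ratio = {odds:.2f}, enrichment p = {p:.3g}")

    Because the gene-level call counts both up- and down-regulated genes, this style of test can flag bidirectionally dysregulated pathways that direction-sensitive statistics miss.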

  13. Exergoeconomic multi objective optimization and sensitivity analysis of a regenerative Brayton cycle

    International Nuclear Information System (INIS)

    Naserian, Mohammad Mahdi; Farahat, Said; Sarhaddi, Faramarz

    2016-01-01

    Highlights:
    • Finite-time exergoeconomic multi-objective optimization of a Brayton cycle.
    • Comparing the exergoeconomic and the ecological-function optimization results.
    • Inserting the cost-of-fluid-streams concept into finite-time thermodynamics.
    • Exergoeconomic sensitivity analysis of a regenerative Brayton cycle.
    • Suggesting the drawing and utilization of the cycle performance curve.
    Abstract: In this study, the optimal performance of a regenerative Brayton cycle is sought through power maximization and then exergoeconomic optimization, using the finite-time thermodynamic concept and finite-size components. Optimizations are performed using a genetic algorithm. In order to take the finite-time and finite-size concepts into account in the current problem, a dimensionless mass-flow parameter deploying time variations is used. The decision variables at the optimum state of the multi-objective exergoeconomic optimization are compared to those at the maximum power state. The multi-objective exergoeconomic optimization results in better performance than the maximum power state: the optimum point yields 71% of the maximum power, but with exergy destruction only 24% of the amount produced at the maximum power state and a 67% lower total cost rate. To assess the impact of variations of the decision variables on the objective functions, a sensitivity analysis is conducted. Finally, the drawing and utilization of the cycle performance curve according to the exergoeconomic multi-objective optimization results are suggested.

  14. Extending Track Analysis from Animals in the Lab to Moving Objects Anywhere

    NARCIS (Netherlands)

    Dommelen, W. van; Laar, P.J.L.J. van de; Noldus, L.P.J.J.

    2013-01-01

    In this chapter we compare two application domains in which the tracking of objects and the analysis of their movements are core activities, viz. animal tracking and vessel tracking. More specifically, we investigate whether EthoVision XT, a research tool for video tracking and analysis of the

  15. Precision medicine, an approach for development of the future medicine technologies

    Directory of Open Access Journals (Sweden)

    Iraj Nabipour

    2016-04-01

    Precision medicine is an approach in medicine that takes into account individual differences in people's genes, environments, and lifestyles. This field redefines our understanding of disease onset and progression, treatment response, and health outcomes through more precise measurement of the molecular, environmental, and behavioral factors that contribute to health and disease. Undoubtedly, the advances over the last decade in omics technologies (including genomics), data collection and storage, computational analysis, and mobile health applications have produced significant progress for precision medicine. In fact, precision medicine is a platform for the growth of personalized medicine, wearable biosensors, mobile health, computational sciences, genomic singularity, and other omics technologies. On the pathway to precision medicine, mathematics and the computational sciences will be revolutionized to overcome the challenges of Big Data. With the birth of precision medicine, novel therapeutic strategies for chronic complex diseases such as cardiovascular disease and cancer will be designed within systems medicine.

  16. Artificial intelligence, physiological genomics, and precision medicine.

    Science.gov (United States)

    Williams, Anna Marie; Liu, Yong; Regner, Kevin R; Jotterand, Fabrice; Liu, Pengyuan; Liang, Mingyu

    2018-04-01

    Big data are a major driver in the development of precision medicine. Efficient analysis methods are needed to transform big data into clinically-actionable knowledge. To accomplish this, many researchers are turning toward machine learning (ML), an approach of artificial intelligence (AI) that utilizes modern algorithms to give computers the ability to learn. Much of the effort to advance ML for precision medicine has been focused on the development and implementation of algorithms and the generation of ever larger quantities of genomic sequence data and electronic health records. However, relevance and accuracy of the data are as important as quantity of data in the advancement of ML for precision medicine. For common diseases, physiological genomic readouts in disease-applicable tissues may be an effective surrogate to measure the effect of genetic and environmental factors and their interactions that underlie disease development and progression. Disease-applicable tissue may be difficult to obtain, but there are important exceptions such as kidney needle biopsy specimens. As AI continues to advance, new analytical approaches, including those that go beyond data correlation, need to be developed and ethical issues of AI need to be addressed. Physiological genomic readouts in disease-relevant tissues, combined with advanced AI, can be a powerful approach for precision medicine for common diseases.

  17. -Omic and Electronic Health Records Big Data Analytics for Precision Medicine

    Science.gov (United States)

    Wu, Po-Yen; Cheng, Chih-Wen; Kaddi, Chanchala D.; Venugopalan, Janani; Hoffman, Ryan; Wang, May D.

    2017-01-01

    Objective: Rapid advances of high-throughput technologies and wide adoption of electronic health records (EHRs) have led to fast accumulation of -omic and EHR data. These voluminous, complex data contain abundant information for precision medicine, and big data analytics can extract such knowledge to improve the quality of health care. Methods: In this article, we present -omic and EHR data characteristics, associated challenges, and data analytics including data pre-processing, mining, and modeling. Results: To demonstrate how big data analytics enables precision medicine, we provide two case studies, including identifying disease biomarkers from multi-omic data and incorporating -omic information into EHRs. Conclusion: Big data analytics is able to address -omic and EHR data challenges for a paradigm shift towards precision medicine. Significance: Big data analytics makes sense of -omic and EHR data to improve healthcare outcomes. It has a long-lasting societal impact. PMID:27740470

  18. Joint Tensor Feature Analysis For Visual Object Recognition.

    Science.gov (United States)

    Wong, Wai Keung; Lai, Zhihui; Xu, Yong; Wen, Jiajun; Ho, Chu Po

    2015-11-01

    Tensor-based object recognition has been widely studied in the past several years. This paper focuses on the issue of joint feature selection from tensor data and proposes a novel method called joint tensor feature analysis (JTFA) for tensor feature extraction and recognition. In order to obtain a set of jointly sparse projections for tensor feature extraction, we define the modified within-class tensor scatter value and the modified between-class tensor scatter value for regression. The k-mode optimization technique and the L(2,1)-norm jointly sparse regression are combined to compute the optimal solutions. The convergence analysis, computational complexity analysis and the essence of the proposed method/model are also presented. It is interesting that the proposed method is very similar to singular value decomposition on the scatter matrix, but with a sparsity constraint on the right singular value matrix, or to an eigen-decomposition of the scatter matrix performed in a sparse manner. Experimental results on some tensor datasets indicate that JTFA outperforms some well-known tensor feature extraction and selection algorithms.
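
    The L(2,1)-norm named above is, in its usual definition (assumed here to match the paper's usage), a sum of row-wise Euclidean norms of the projection matrix:

        \[
          \lVert A\rVert_{2,1} \;=\; \sum_{i=1}^{n}\lVert a^{i}\rVert_{2}
          \;=\; \sum_{i=1}^{n}\sqrt{\sum_{j} a_{ij}^{2}}
        \]

    Penalizing this quantity drives entire rows of A to zero, which is what yields jointly sparse projections, i.e., feature selection.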

  19. Examination of tumor diameter measurement precision by RECIST

    International Nuclear Information System (INIS)

    Goto, Masami; Ino, Kenji; Akahane, Masaaki

    2007-01-01

    Image evaluation with Response Evaluation Criteria in Solid Tumors (RECIST) assesses the change in a measurable lesion as determined by ruler or micrometer caliper. However, the conditions thought to influence measurement precision are not defined. We therefore examined the effects on measurement precision of changing image magnification, window width (WW), window level (WL), and time phase. Moreover, to determine response rate, the one-dimensional evaluation of RECIST was compared with the two-dimensional evaluation of the World Health Organization (WHO) for a hepatocellular carcinoma. The variation in measured values for the target lesion was as follows: (4.92±1.94)% at 1x magnification, WW 150/WL 100; (4.42±1.70)% at 1x magnification, WW 350/WL 75; (2.52±0.82)% at 4x magnification, WW 150/WL 100; and (2.83±1.10)% at 4x magnification, WW 350/WL 75. When an image was enlarged to 4 times, precision doubled. There was no difference between RECIST and WHO in terms of response rate. Thus, RECIST was considered the best method because of its convenience. (author)

  20. Object-oriented analysis and design for information systems Modeling with UML, OCL, IFML

    CERN Document Server

    Wazlawick, Raul Sidnei

    2014-01-01

    Object-Oriented Analysis and Design for Information Systems clearly explains real object-oriented programming in practice. Expert author Raul Sidnei Wazlawick explains concepts such as object responsibility, visibility and the real need for delegation in detail. The object-oriented code generated by using these concepts in a systematic way is concise, organized and reusable. The patterns and solutions presented in this book are based on research and industrial applications. You will come away with clarity regarding processes and use cases and a clear understanding of how to expand a use case.

  1. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1986-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are presented. (author)

  2. A functional analysis of photo-object matching skills of severely retarded adolescents.

    Science.gov (United States)

    Dixon, L S

    1981-01-01

    Matching-to-sample procedures were used to assess picture representation skills of severely retarded, nonverbal adolescents. Identity matching within the classes of objects and life-size, full-color photos of the objects was first used to assess visual discrimination, a necessary condition for picture representation. Picture representation was then assessed through photo-object matching tasks. Five students demonstrated visual discrimination (identity matching) within the two classes of photos and the objects. Only one student demonstrated photo-object matching. The results of the four students who failed to demonstrate photo-object matching suggested that physical properties of photos (flat, rectangular) and depth dimensions of objects may exert more control over matching than the similarities of the objects and images within the photos. An analysis of figure-ground variables was conducted to provide an empirical basis for program development in the use of pictures. In one series of tests, rectangular shape and background were removed by cutting out the figures in the photos. The edge shape of the photo and the edge shape of the image were then identical. The results suggest that photo-object matching may be facilitated by using cut-out figures rather than the complete rectangular photo.

  3. An Objective Comparison of Applied Behavior Analysis and Organizational Behavior Management Research

    Science.gov (United States)

    Culig, Kathryn M.; Dickinson, Alyce M.; McGee, Heather M.; Austin, John

    2005-01-01

    This paper presents an objective review, analysis, and comparison of empirical studies targeting the behavior of adults published in Journal of Applied Behavior Analysis (JABA) and Journal of Organizational Behavior Management (JOBM) between 1997 and 2001. The purpose of the comparisons was to identify similarities and differences with respect to…

  4. A study of reduced numerical precision to make superparameterization more competitive using a hardware emulator in the OpenIFS model

    Science.gov (United States)

    Düben, Peter D.; Subramanian, Aneesh; Dawson, Andrew; Palmer, T. N.

    2017-03-01

    The use of reduced numerical precision to reduce computing costs for the cloud resolving model of superparameterized simulations of the atmosphere is investigated. An approach to identify the optimal level of precision for many different model components is presented, and a detailed analysis of precision is performed. This is nontrivial for a complex model that shows chaotic behavior such as the cloud resolving model in this paper. It is shown not only that numerical precision can be reduced significantly but also that the results of the reduced precision analysis provide valuable information for the quantification of model uncertainty for individual model components. The precision analysis is also used to identify model parts that are of less importance thus enabling a reduction of model complexity. It is shown that the precision analysis can be used to improve model efficiency for both simulations in double precision and in reduced precision. Model simulations are performed with a superparameterized single-column model version of the OpenIFS model that is forced by observational data sets. A software emulator was used to mimic the use of reduced precision floating point arithmetic in simulations.
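
    A minimal sketch of how reduced floating-point precision can be emulated in software (an illustration of the general technique; this is not the emulator used in the study): round each value to a chosen number of significand bits and observe how a derived quantity degrades.

        import numpy as np

        def reduce_precision(x, significand_bits):
            """Round x to the given number of significand bits (sign and exponent kept)."""
            m, e = np.frexp(x)                       # x = m * 2**e with 0.5 <= |m| < 1
            m = np.round(m * 2.0**significand_bits) / 2.0**significand_bits
            return np.ldexp(m, e)

        x = np.linspace(0.0, 1.0, 1001)
        y = np.exp(np.sin(2.0 * np.pi * x))          # a smooth test signal
        exact = y.mean()                             # double-precision reference
        for bits in (23, 10, 5):                     # single-like, half-like, aggressive
            approx = reduce_precision(y, bits).mean()
            print(f"{bits:2d}-bit significand: |error| = {abs(approx - exact):.2e}")

    Sweeping the bit count per model component, as the paper does at much larger scale, reveals which parts of the model tolerate aggressive rounding and which do not.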

  5. Precision farming - Technology assessment of site-specific input application in cereals

    DEFF Research Database (Denmark)

    Pedersen, Søren Marcus

    ... economic and socio-economic analysis. The current status of precision farming in Denmark is as follows:
    • The technology is primarily applicable for large farm holdings
    • Economic viability depends on site-specific yield variation
    • So far, the business economic benefits from most PF practices are modest, but it seems possible to obtain socio-economic benefits from lime, variable-rate herbicide and possibly nitrogen application
    • The technology may improve farm logistics, planning and crop quality (e.g. protein content), but
    • The costs of implementing PF practices are high, and
    • Technical functionality ...
    It may be several years before the next generation of precision farming systems will be available in practice. Meanwhile, those farmers who already have invested in yield monitors and soil analysis for precision farming should be able to use the current technology in the best possible way.

  6. Examination of objective analysis precision using wind profiler and radiosonde network data

    Energy Technology Data Exchange (ETDEWEB)

    Mace, G.G.; Ackerman, T.P. [Penn State Univ., University Park, PA (United States)

    1996-04-01

    One of the principal research strategies that has emerged from the science team of the Atmospheric Radiation Measurement (ARM) Program is the use of a single column model (SCM). The basic assumption behind the SCM approach is that a cloud and radiation parameterization embedded in a general circulation model can be effectively tested and improved by extracting that column parameterization from the general circulation model and then driving the single column at its lateral boundaries with diagnosed large-scale atmospheric forcing. A second and related assumption is that the large-scale atmospheric state, and hence the associated forcing, can be characterized directly from observations. One of the primary reasons that the Southern Great Plains (SGP) site is located in Lamont, Oklahoma, is that Lamont is at the approximate center of the NOAA Wind Profiler Demonstration Array (WPDA). The third assumption was that hourly average wind profiles provided by the 7 wind profilers (one at Lamont and six surrounding it in a hexagon), coupled with radiosonde launches every three hours at 5 sites (Lamont plus four of the six profiler locations forming the hexagon), would be sufficient to characterize accurately the large-scale forcing at the site and thereby provide the required forcing for the SCM. The goal of this study was to examine these three assumptions.

  7. [Implementation of precision control to achieve the goal of schistosomiasis elimination in China].

    Science.gov (United States)

    Zhou, Xiao-nong

    2016-02-01

    The integrated strategy for schistosomiasis control with a focus on infectious source control, implemented since 2004, accelerated the progress of schistosomiasis control in China: transmission control of the disease was achieved across the country by the end of 2015, fulfilling on schedule the overall objective of the Mid- and Long-term National Plan for Prevention and Control of Schistosomiasis (2004-2015). In 2014, China then set the goal of schistosomiasis elimination by 2025. To achieve this new goal on schedule, we have to address the key issues and implement precision control measures with more precise identification of control targets, so that the potential factors leading to a resurgence of schistosomiasis transmission can be completely eradicated and elimination achieved on schedule. Precision schistosomiasis control, a theoretical innovation of precision medicine applied to schistosomiasis control, will provide new insights into schistosomiasis control based on the conception of precision medicine. This paper describes the definition, interventions and role of precision schistosomiasis control in the elimination of schistosomiasis in China. It argues that sustainable improvement of professionals and of integrated control capability at the grass-roots level is a prerequisite to the implementation of schistosomiasis control; that precision schistosomiasis control is key to the further implementation of the integrated strategy with its focus on infectious source control; and that precision schistosomiasis control is a guarantee of curing schistosomiasis patients and implementing the schistosomiasis control program and interventions.

  8. The Density of Mid-sized Kuiper Belt Objects from ALMA Thermal Observations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Michael E. [California Institute of Technology, 1200 E California Blvd., Pasadena CA 91125 (United States); Butler, Bryan J. [National Radio Astronomy Observatory, 1003 Lopezville Rd., Socorro NM 87801 (United States)

    2017-07-01

    The densities of mid-sized Kuiper Belt objects (KBOs) are a key constraint in understanding the assembly of objects in the outer solar system. These objects are critical for understanding the currently unexplained transition from the smallest KBOs, with densities lower than that of water, to the largest objects, with significant rock content. Mapping this transition is made difficult by the uncertainties in the diameters of these objects, which map into an even larger uncertainty in volume and thus density. The substantial collecting area of the Atacama Large Millimeter Array allows significantly more precise measurements of thermal emission from outer solar system objects and could potentially greatly improve the density measurements. Here we use new thermal observations of four objects with satellites to explore the improvements possible with millimeter data. We find that effects due to effective emissivity at millimeter wavelengths make it difficult to use the millimeter data directly to find diameters and thus volumes for these bodies. In addition, we find that when including the effects of model uncertainty, the true uncertainties on the sizes of outer solar system objects measured with radiometry are likely larger than those previously published. Substantial improvement in object sizes will likely require precise occultation measurements.
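
    The leverage of size errors on density follows from first-order error propagation: volume scales as diameter cubed, so a relative diameter error enters the density error threefold. A short illustration with hypothetical numbers (not measurements from this paper):

```python
import numpy as np

M, sigma_M = 3.0e20, 0.03e20   # mass from satellite orbit, ~1% (hypothetical)
D, sigma_D = 8.0e5, 0.8e5      # diameter from radiometry, ~10% (hypothetical)

volume = (np.pi / 6.0) * D**3  # sphere
rho = M / volume

# rho ~ M / D^3, so sigma_rho/rho = sqrt((sigma_M/M)^2 + (3 sigma_D/D)^2):
rel_err = np.hypot(sigma_M / M, 3 * sigma_D / D)
print(f"rho = {rho:.0f} kg/m^3 +/- {100 * rel_err:.0f}%")   # ~30%, dominated by D
```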

  9. The precision of circadian clocks : Assessment and analysis in Syrian hamsters

    NARCIS (Netherlands)

    Daan, S; Oklejewicz, M

    2003-01-01

    Locomotor activity recordings of Syrian hamsters were systematically analyzed to estimate the precision of the overt circadian activity rhythm in constant darkness. Phase variation, i.e., the standard deviation of phase markers around the regression line, varied with the definition of phase.

  10. Getting nowhere fast: trade-off between speed and precision in training to execute image-guided hand-tool movements

    Directory of Open Access Journals (Sweden)

    Anil Ufuk Batmaz

    2016-11-01

    Background: The speed and precision with which objects are moved by hand or hand-tool interaction under image guidance depend on a specific type of visual and spatial sensorimotor learning. Novices have to learn to optimally control what their hands are doing in a real-world environment while looking at an image representation of the scene on a video monitor. Previous research has shown slower task execution times and lower performance scores under image guidance compared with situations of direct action viewing. The cognitive processes for overcoming this drawback by training are not yet understood. Methods: We investigated the effects of training on the time and precision of direct-view versus image-guided object positioning on targets of a Real-world Action Field (RAF). Two men and two women had to learn to perform the task as swiftly and as precisely as possible with their dominant hand, using a tool or not and wearing a glove or not. Individuals were trained in sessions of mixed trial blocks with no feedback. Results: As predicted, image guidance produced significantly slower times and lower precision in all trainees and sessions compared with direct viewing. With training, all trainees got faster in all conditions, but only one of them became reliably more precise in the image-guided conditions. Speed-accuracy trade-offs in the individual performance data show that the highest precision scores and the steepest learning curves, for time and precision, were produced by the slowest starter. Fast starters produced consistently poorer precision scores in all sessions, and the fastest starter showed no sign of stable precision learning, even after extended training. Conclusions: Performance evolution towards optimal precision is compromised when novices start by going as fast as they can. The findings have direct implications for individual skill monitoring in training programmes for image-guided technology applications with human operators.

  11. Efficacy comparison of precise and traditional liver resection in treatment of intrahepatic bile duct stones

    Directory of Open Access Journals (Sweden)

    ZHANG Shengjun

    2015-10-01

    Objective: To compare the efficacy of precise and traditional liver resection in the treatment of intrahepatic bile duct stones. Methods: One hundred and twenty-seven patients with intrahepatic bile duct stones who were treated with surgery in our hospital from December 2008 to December 2014 were selected and divided into a precise liver resection group (n=72) and a traditional liver resection group (n=55) based on the type of surgery. The operation time, intraoperative blood loss, amount of postoperative drainage, postoperative time to recovery, postoperative complications (incision infection, biliary fistula, lung infection, and pleural effusion), hospitalization cost, postoperative residual calculi, and postoperative calculus recurrence were compared between the two groups. Between-group comparison of continuous data was made by t test, and between-group comparison of categorical data was made by χ2 test. Survival data were analyzed using the survival function. Results: There were significant differences in operation time, intraoperative blood loss, amount of postoperative drainage, postoperative time to recovery, and hospitalization cost between the precise liver resection group and the traditional liver resection group (t=3.720, 58.681, 19.169, 5.990, and 6.944; all P<0.05). There were no significant differences in postoperative complications including incision infection, biliary fistula, lung infection, and pleural effusion between the two groups (all P>0.05). There were also no significant differences in the incidence rates of postoperative residual calculi and calculus recurrence between the two groups (all P>0.05). The survival analysis of postoperative calculus recurrence time showed that there was no significant difference in calculus recurrence time between the two groups (P>0.05). Conclusion: Compared with traditional liver resection, precise liver resection has the advantages of shorter operation time, less intraoperative bleeding, less

  12. [Precision and personalized medicine].

    Science.gov (United States)

    Sipka, Sándor

    2016-10-01

    The author describes the concept of "personalized medicine" and the newly introduced "precision medicine". "Precision medicine" applies the terms "phenotype", "endotype" and "biomarker" in order to characterize the various diseases more precisely. Using "biomarkers", a homogeneous type of a disease (a "phenotype") can be divided into subgroups called "endotypes" requiring different forms of treatment and financing. The good results of "precision medicine" have become especially apparent in relation to allergic and autoimmune diseases. The application of this new way of thinking is going to be necessary in Hungary, too, in the near future for the participants, controllers and financing boards of healthcare. Orv. Hetil., 2016, 157(44), 1739-1741.

  13. The newest precision measurement

    International Nuclear Information System (INIS)

    Lee, Jing Gu; Lee, Jong Dae

    1974-05-01

    This book introduces the basics of precision measurement: measurement of length, limit gauges, measurement of angles, measurement of surface roughness, measurement of shapes and locations, measurement of outlines, measurement of external and internal threads, gear testing, accuracy inspection of machine tools, three-dimensional coordinate measuring machines, digitalisation of precision measurement, automation of precision measurement, measurement of cutting tools, measurement using lasers, and points to consider in choosing a length-measuring instrument.

  14. Feasibility analysis of CNP 1000 computerized I and C system design objectives

    International Nuclear Information System (INIS)

    Zhang Mingguang; Xu Jijun; Zhang Qinshen

    2000-01-01

    The author states the design objectives of the computerized I and C (CIC) system and advanced main control room (AMCR) that could and should be achieved in CNP 1000, based on national 1E computer production technology, including software and hardware, and on current nuclear power plant instrumentation and control design practice. A feasibility analysis of these design objectives is given, together with the reasons for, and the necessity of, the design research projects. The objectives of the design research on the CIC and AMCR, as well as the self-design proficiency expected after the design research, are also described.

  15. Precision medicine and traditional chinese medicine of dialogue

    Directory of Open Access Journals (Sweden)

    Lou Xin

    2017-01-01

    Precision medicine is a more precise form of individualized medicine: it draws on the patient's genetic or physiological profile to formulate a specific treatment plan, providing valuable information for the individualized treatment of various diseases. With the progress of modern science and technology, however, modern medicine has become seriously dependent on medical instruments, and traditional approaches are gradually being forgotten. Depending too heavily on instrument results without combining them with the actual clinical picture leads to misdiagnosis; we should therefore pay attention to the holistic analysis of disease and to systematic diagnosis and examination, trace the cause using the holistic treatment concept of Traditional Chinese Medicine, and finally select the best treatment plan. We should view precision medicine dialectically: Traditional Chinese Medicine should not be required to blindly follow the road of precision medicine, but should shine in its own field and keep its own characteristics, while learning some advantages of the precision concept, taking the good and rejecting the bad, so that Traditional Chinese Medicine can go further in the modern environment.

  16. Reliability of Pressure Ulcer Rates: How Precisely Can We Differentiate Among Hospital Units, and Does the Standard Signal‐Noise Reliability Measure Reflect This Precision?

    Science.gov (United States)

    Cramer, Emily

    2016-01-01

    Abstract Hospital performance reports often include rankings of unit pressure ulcer rates. Differentiating among units on the basis of quality requires reliable measurement. Our objectives were to describe and apply methods for assessing reliability of hospital‐acquired pressure ulcer rates and evaluate a standard signal‐noise reliability measure as an indicator of precision of differentiation among units. Quarterly pressure ulcer data from 8,199 critical care, step‐down, medical, surgical, and medical‐surgical nursing units from 1,299 US hospitals were analyzed. Using beta‐binomial models, we estimated between‐unit variability (signal) and within‐unit variability (noise) in annual unit pressure ulcer rates. Signal‐noise reliability was computed as the ratio of between‐unit variability to the total of between‐ and within‐unit variability. To assess precision of differentiation among units based on ranked pressure ulcer rates, we simulated data to estimate the probabilities of a unit's observed pressure ulcer rate rank in a given sample falling within five and ten percentiles of its true rank, and the probabilities of units with ulcer rates in the highest quartile and highest decile being identified as such. We assessed the signal‐noise measure as an indicator of differentiation precision by computing its correlations with these probabilities. Pressure ulcer rates based on a single year of quarterly or weekly prevalence surveys were too susceptible to noise to allow for precise differentiation among units, and signal‐noise reliability was a poor indicator of precision of differentiation. To ensure precise differentiation on the basis of true differences, alternative methods of assessing reliability should be applied to measures purported to differentiate among providers or units based on quality. © 2016 The Authors. Research in Nursing & Health published by Wiley Periodicals, Inc. PMID:27223598
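
    The signal-noise measure under study is the fraction of observed between-unit variance that is true signal. A simplified moment-based stand-in for the beta-binomial estimate, run on synthetic data (not the study's), shows the computation:

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 200
patients = rng.integers(100, 400, size=n_units)     # unit sizes
true_rates = rng.beta(2, 60, size=n_units)          # between-unit variation
events = rng.binomial(patients, true_rates)
p_hat = events / patients

# Split observed variance into noise (binomial sampling within units)
# and signal (between-unit); reliability = signal / (signal + noise).
noise = np.mean(p_hat * (1 - p_hat) / patients)
signal = max(np.var(p_hat, ddof=1) - noise, 0.0)
print(f"signal-noise reliability ~ {signal / (signal + noise):.2f}")
```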

  17. Method of high precision interval measurement in pulse laser ranging system

    Science.gov (United States)

    Wang, Zhen; Lv, Xin-yuan; Mao, Jin-jin; Liu, Wei; Yang, Dong

    2013-09-01

    Laser ranging has the advantages of high measuring precision, fast measuring speed, no need for cooperative targets, and strong resistance to electromagnetic interference; the time-interval measurement is the key parameter affecting the performance of the whole system. The precision of a pulsed laser ranging system is determined by the precision of the time interval measurement. The principle and structure of the laser ranging system are introduced, and a method of high-precision time interval measurement for a pulse laser ranging system is established in this paper. Based on an analysis of the factors affecting range precision, a pulse rising-edge discriminator was adopted to produce the timing mark for start-stop time discrimination, and a TDC-GP2 high-precision interval measurement system based on a TMS320F2812 DSP was designed to improve the measurement precision. Experimental results indicate that the time interval measurement method in this paper obtains higher range accuracy. Compared with the traditional time interval measurement system, the method simplifies the system design and reduces the influence of bad weather conditions; furthermore, it satisfies the requirements of low cost and miniaturization.
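
    The link from timing precision to range precision is the two-way time of flight, R = c·Δt/2, so each picosecond of timing error costs about 0.15 mm of range. A worked sketch (the 65 ps figure is assumed here as the order of a TDC-GP2 least significant bit):

```python
C = 299_792_458.0   # speed of light, m/s

def range_from_interval(delta_t):
    """One-way range from a two-way time of flight, R = c * dt / 2."""
    return C * delta_t / 2.0

for dt in (1e-9, 65e-12):   # 1 ns, and ~65 ps (assumed TDC-GP2 LSB)
    print(f"dt = {dt:.0e} s -> range resolution ~ {100 * range_from_interval(dt):.1f} cm")
```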

  18. Design of an Object-Oriented Turbomachinery Analysis Code: Initial Results

    Science.gov (United States)

    Jones, Scott M.

    2015-01-01

    Performance prediction of turbomachines is a significant part of aircraft propulsion design. In the conceptual design stage, there is an important need to quantify compressor and turbine aerodynamic performance and develop initial geometry parameters at the 2-D level prior to more extensive Computational Fluid Dynamics (CFD) analyses. The Object-oriented Turbomachinery Analysis Code (OTAC) is being developed to perform 2-D meridional flowthrough analysis of turbomachines using an implicit formulation of the governing equations to solve for the conditions at the exit of each blade row. OTAC is designed to perform meanline or streamline calculations; for streamline analyses simple radial equilibrium is used as a governing equation to solve for spanwise property variations. While the goal for OTAC is to allow simulation of physical effects and architectural features unavailable in other existing codes, it must first prove capable of performing calculations for conventional turbomachines. OTAC is being developed using the interpreted language features available in the Numerical Propulsion System Simulation (NPSS) code described by Claus et al (1991). Using the NPSS framework came with several distinct advantages, including access to the pre-existing NPSS thermodynamic property packages and the NPSS Newton-Raphson solver. The remaining objects necessary for OTAC were written in the NPSS framework interpreted language. These new objects form the core of OTAC and are the BladeRow, BladeSegment, TransitionSection, Expander, Reducer, and OTACstart Elements. The BladeRow and BladeSegment consumed the initial bulk of the development effort and required determining the equations applicable to flow through turbomachinery blade rows given specific assumptions about the nature of that flow. Once these objects were completed, OTAC was tested and found to agree with existing solutions from other codes; these tests included various meanline and streamline comparisons of axial

  19. Fast and objective detection and analysis of structures in downhole images

    Science.gov (United States)

    Wedge, Daniel; Holden, Eun-Jung; Dentith, Mike; Spadaccini, Nick

    2017-09-01

    Downhole acoustic and optical televiewer images, and formation microimager (FMI) logs are important datasets for structural and geotechnical analyses for the mineral and petroleum industries. Within these data, dipping planar structures appear as sinusoids, often in incomplete form and in abundance. Their detection is a labour intensive and hence expensive task and as such is a significant bottleneck in data processing as companies may have hundreds of kilometres of logs to process each year. We present an image analysis system that harnesses the power of automated image analysis and provides an interactive user interface to support the analysis of televiewer images by users with different objectives. Our algorithm rapidly produces repeatable, objective results. We have embedded it in an interactive workflow to complement geologists' intuition and experience in interpreting data to improve efficiency and assist, rather than replace the geologist. The main contributions include a new image quality assessment technique for highlighting image areas most suited to automated structure detection and for detecting boundaries of geological zones, and a novel sinusoid detection algorithm for detecting and selecting sinusoids with given confidence levels. Further tools are provided to perform rapid analysis of and further detection of structures e.g. as limited to specific orientations.
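
    The sinusoids being detected have a simple geometric origin: a plane dipping at angle δ through a borehole of radius r traces, in the unrolled image, a sinusoid of amplitude r·tan δ phased by the dip direction. A sketch of that forward model and a least-squares fit, as an illustration of the geometry rather than the paper's detection algorithm:

```python
import numpy as np

def plane_trace(azimuth_deg, depth0, dip_deg, dip_dir_deg, radius):
    """Depth of a dipping plane around the borehole wall (unrolled image)."""
    amp = radius * np.tan(np.radians(dip_deg))
    return depth0 + amp * np.cos(np.radians(azimuth_deg - dip_dir_deg))

r = 0.05                                    # borehole radius, m (hypothetical)
az = np.arange(0.0, 360.0, 2.0)
depth = plane_trace(az, 100.0, 30.0, 45.0, r)
depth += np.random.default_rng(1).normal(0.0, 1e-3, az.size)   # picking noise

# Fit d = c0 + c1*cos(az) + c2*sin(az); the amplitude gives the dip angle.
A = np.c_[np.ones_like(az), np.cos(np.radians(az)), np.sin(np.radians(az))]
c0, c1, c2 = np.linalg.lstsq(A, depth, rcond=None)[0]
print("dip ~", np.degrees(np.arctan(np.hypot(c1, c2) / r)), "deg")   # ~30
```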

  20. Precision measurements with LPCTrap at GANIL

    Energy Technology Data Exchange (ETDEWEB)

    Liénard, E., E-mail: lienard@lpccaen.in2p3.fr; Ban, G. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Couratin, C. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Delahaye, P. [GANIL, CEA/DSM-CNRS/IN2P3 (France); Durand, D.; Fabian, X. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Fabre, B. [CELIA, Université Bordeaux, CNRS, CEA (France); Fléchard, X. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Finlay, P. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Mauger, F. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Méry, A. [CIMAP, CEA/CNRS/ENSICAEN, Université de Caen (France); Naviliat-Cuncic, O. [NSCL and Department of Physics and Astronomy, MSU (United States); Pons, B. [CELIA, Université Bordeaux, CNRS, CEA (France); Porobic, T. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Quéméner, G. [LPC CAEN, ENSICAEN, Université de Caen, CNRS/IN2P3 (France); Severijns, N. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium); Thomas, J. C. [GANIL, CEA/DSM-CNRS/IN2P3 (France); Velten, Ph. [Instituut voor Kern- en Stralingsfysica, KU Leuven (Belgium)

    2015-11-15

    The experimental achievements and the results obtained so far with the LPCTrap device installed at GANIL are presented. The apparatus is dedicated to the study of the weak interaction at low energy by means of precise measurements of the β − ν angular correlation parameter in nuclear β decays. So far, the data collected with three isotopes have made it possible to determine, for the first time, the charge state distributions of the recoiling ions induced by the shakeoff process. The analysis is presently being refined to deduce the correlation parameters, with the potential of improving both the constraint deduced at low energy on exotic tensor currents ({sup 6}He{sup 1+}) and the precision on the V{sub ud} element of the quark-mixing matrix ({sup 35}Ar{sup 1+} and {sup 19}Ne{sup 1+}) deduced from the mirror transitions dataset.

  1. Development and validation of an automated and marker-free CT-based spatial analysis method (CTSA) for assessment of femoral hip implant migration: In vitro accuracy and precision comparable to that of radiostereometric analysis (RSA).

    Science.gov (United States)

    Scheerlinck, Thierry; Polfliet, Mathias; Deklerck, Rudi; Van Gompel, Gert; Buls, Nico; Vandemeulebroucke, Jef

    2016-01-01

    We developed a marker-free automated CT-based spatial analysis (CTSA) method to detect stem-bone migration in consecutive CT datasets and assessed the accuracy and precision in vitro. Our aim was to demonstrate that in vitro accuracy and precision of CTSA is comparable to that of radiostereometric analysis (RSA). Stem and bone were segmented in 2 CT datasets and both were registered pairwise. The resulting rigid transformations were compared and transferred to an anatomically sound coordinate system, taking the stem as reference. This resulted in 3 translation parameters and 3 rotation parameters describing the relative amount of stem-bone displacement, and it allowed calculation of the point of maximal stem migration. Accuracy was evaluated in 39 comparisons by imposing known stem migration on a stem-bone model. Precision was estimated in 20 comparisons based on a zero-migration model, and in 5 patients without stem loosening. Limits of the 95% tolerance intervals (TIs) for accuracy did not exceed 0.28 mm for translations and 0.20° for rotations (largest standard deviation of the signed error (SD(SE)): 0.081 mm and 0.057°). In vitro, limits of the 95% TI for precision in a clinically relevant setting (8 comparisons) were below 0.09 mm and 0.14° (largest SD(SE): 0.012 mm and 0.020°). In patients, the precision was lower, but acceptable, and dependent on CT scan resolution. CTSA allows detection of stem-bone migration with an accuracy and precision comparable to that of RSA. It could be valuable for evaluation of subtle stem loosening in clinical practice.
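
    The heart of the described comparison step is composing the two pairwise registrations into one relative stem-bone transform. A minimal sketch under the usual 4x4 homogeneous-transform conventions (the matrices here are hypothetical, and the authors' anatomical coordinate transfer is omitted):

```python
import numpy as np

def relative_motion(T_stem, T_bone):
    """Express the bone registration in the stem's frame and report the
    residual translation (m) and rotation angle (deg) = apparent migration."""
    T_rel = np.linalg.inv(T_stem) @ T_bone
    translation = T_rel[:3, 3]
    cos_angle = np.clip((np.trace(T_rel[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    return translation, np.degrees(np.arccos(cos_angle))

# Hypothetical case: stem unchanged, bone shifted 0.2 mm along z.
T_stem = np.eye(4)
T_bone = np.eye(4)
T_bone[:3, 3] = [0.0, 0.0, 0.2e-3]
print(relative_motion(T_stem, T_bone))
```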

  2. Experimental assessment of precision and accuracy of radiostereometric analysis for the determination of polyethylene wear in a total hip replacement model.

    Science.gov (United States)

    Bragdon, Charles R; Malchau, Henrik; Yuan, Xunhua; Perinchief, Rebecca; Kärrholm, Johan; Börlin, Niclas; Estok, Daniel M; Harris, William H

    2002-07-01

    The purpose of this study was to develop and test a phantom model based on actual total hip replacement (THR) components to simulate the true penetration of the femoral head resulting from polyethylene wear. This model was used to study both the accuracy and the precision of radiostereometric analysis (RSA) in measuring wear. We also used this model to evaluate the optimum tantalum bead configuration for this particular cup design when used in a clinical setting. A physical model of a total hip replacement (a phantom) was constructed which could simulate progressive, three-dimensional (3-D) penetration of the femoral head into the polyethylene component of a THR. Using a coordinate measuring machine (CMM), the positioning of the femoral head by the phantom was shown to be accurate to within 7 microm. The accuracy and precision of an RSA analysis system were determined from five repeat examinations of the phantom using various experimental set-ups. The accuracy of the radiostereometric analysis in the optimal experimental set-up studied was 33 microm for the medial direction, 22 microm for the superior direction, 86 microm for the posterior direction and 55 microm for the resultant 3-D vector length. The corresponding precision at the 95% confidence interval of the test results for repositioning the phantom five times measured 8.4 microm for the medial direction, 5.5 microm for the superior direction, 16.0 microm for the posterior direction, and 13.5 microm for the resultant 3-D vector length. This in vitro model is proposed as a useful tool for developing a standard for the evaluation of radiostereometric and other radiographic methods used to measure in vivo wear.

  3. EXAMINATION ABOUT INFLUENCE FOR PRECISION OF 3D IMAGE MEASUREMENT FROM THE GROUND CONTROL POINT MEASUREMENT AND SURFACE MATCHING

    Directory of Open Access Journals (Sweden)

    T. Anai

    2015-05-01

    As 3D image measurement software has come into wide use with the recent development of computer-vision technology, 3D measurement from images has found applications ranging from desktop objects to topographic surveys of large geographical areas. In particular the orientation, which used to be a complicated process in image measurement, can now be performed automatically, simply by taking many pictures around the object. For a fully textured object, the 3D measurement of surface features is carried out fully automatically from the oriented images, which has greatly facilitated the acquisition of dense, high-precision 3D point clouds from images. Against this background, for small and middle-sized objects we now provide all-around 3D measurement with a single digital camera sold on the market, and we have also developed a technology for topographic measurement with airborne images taken by a small UAV [1~5]. In the present study, for small objects we examine the accuracy of surface measurement (matching) using experimental data, and for topographic measurement we examine the influence of the GCP distribution on accuracy. We also examine the differences among the analytical results of the various 3D image measurement software packages. This document reviews the processing flow of orientation and 3D measurement in each package and explains the features of each. To verify the precision of stereo matching, we measured a test plane and a test sphere of known form and assessed the results. For the topographic measurement, we used airborne image data photographed at the test field in Yadorigi, Matsuda City, Kanagawa Prefecture, Japan, where we constructed ground control points measured by RTK-GPS and total station. And we show the results

  4. Examination about Influence for Precision of 3d Image Measurement from the Ground Control Point Measurement and Surface Matching

    Science.gov (United States)

    Anai, T.; Kochi, N.; Yamada, M.; Sasaki, T.; Otani, H.; Sasaki, D.; Nishimura, S.; Kimoto, K.; Yasui, N.

    2015-05-01

    As 3D image measurement software has come into wide use with the recent development of computer-vision technology, 3D measurement from images has found applications ranging from desktop objects to topographic surveys of large geographical areas. In particular the orientation, which used to be a complicated process in image measurement, can now be performed automatically, simply by taking many pictures around the object. For a fully textured object, the 3D measurement of surface features is carried out fully automatically from the oriented images, which has greatly facilitated the acquisition of dense, high-precision 3D point clouds from images. Against this background, for small and middle-sized objects we now provide all-around 3D measurement with a single digital camera sold on the market, and we have also developed a technology for topographic measurement with airborne images taken by a small UAV [1~5]. In the present study, for small objects we examine the accuracy of surface measurement (matching) using experimental data, and for topographic measurement we examine the influence of the GCP distribution on accuracy. We also examine the differences among the analytical results of the various 3D image measurement software packages. This document reviews the processing flow of orientation and 3D measurement in each package and explains the features of each. To verify the precision of stereo matching, we measured a test plane and a test sphere of known form and assessed the results. For the topographic measurement, we used airborne image data photographed at the test field in Yadorigi, Matsuda City, Kanagawa Prefecture, Japan, where we constructed ground control points measured by RTK-GPS and total station. And we show the results of analysis made

  5. Determination of the elemental composition of copper and bronze objects by neutron activation analysis

    International Nuclear Information System (INIS)

    Hoelttae, P.; Rosenberg, R.J.

    1987-01-01

    A method for the elemental analysis of copper and bronze objects is described. Na, Co, Ni, Cu, Zn, As, Ag, Sn, Sb, W, Ir and Au are determined through instrumental neutron activation analysis. Mg, Al, V, Ti and Mn are determined after chemical separation using anionic exchange. The detection limits for a number of other elements are also given. Results for NBS standard reference materials are presented and the results are compared with the recommended values. The agreement is good. The results of the analysis of five ancient bronze and two copper objects are also presented. (author) 3 refs.; 4 tabs

  6. Meanline Analysis of Turbines with Choked Flow in the Object-Oriented Turbomachinery Analysis Code

    Science.gov (United States)

    Hendricks, Eric S.

    2016-01-01

    The prediction of turbomachinery performance characteristics is an important part of the conceptual aircraft engine design process. During this phase, the designer must examine the effects of a large number of turbomachinery design parameters to determine their impact on overall engine performance and weight. The lack of detailed design information available in this phase necessitates the use of simpler meanline and streamline methods to determine the turbomachinery geometry characteristics and provide performance estimates prior to more detailed CFD (Computational Fluid Dynamics) analyses. While a number of analysis codes have been developed for this purpose, most are written in outdated software languages and may be difficult or impossible to apply to new, unconventional designs. The Object-Oriented Turbomachinery Analysis Code (OTAC) is currently being developed at NASA Glenn Research Center to provide a flexible meanline and streamline analysis capability in a modern object-oriented language. During the development and validation of OTAC, a limitation was identified in the code's ability to analyze and converge turbines as the flow approached choking. This paper describes a series of changes which can be made to typical OTAC turbine meanline models to enable the assessment of choked flow up to limit load conditions. Results produced with this revised model setup are provided in the form of turbine performance maps and are compared to published maps.

  7. Impacts of the precision agricultural technologies in Iran: An analysis experts' perception & their determinants

    Directory of Open Access Journals (Sweden)

    Somayeh Tohidyan Far

    2018-03-01

    Agricultural development methods that are productive and economically, environmentally and socially sustainable are urgently required. The concept of precision agriculture is becoming an attractive idea for managing natural resources and realizing modern sustainable agricultural development. The purpose of this study was to investigate the factors influencing the impacts of precision agriculture from the viewpoint of experts in Boushehr Province. The research method was a cross-sectional survey, and multi-stage random sampling was used to collect data from 115 experts in Boushehr Province. According to the results, the experts regarded the conservation of groundwater and surface water, the development of rural areas, increased productivity and increased income as the most important impacts of precision agricultural technologies. The experts' attitudes indicate a positive view of these impacts, and behavioral attitude had the strongest effect on perceived impacts.

  8. Accuracy and precision of protein–ligand interaction kinetics determined from chemical shift titrations

    International Nuclear Information System (INIS)

    Markin, Craig J.; Spyracopoulos, Leo

    2012-01-01

    NMR-monitored chemical shift titrations for the study of weak protein–ligand interactions represent a rich source of information regarding thermodynamic parameters such as dissociation constants (K_D) in the micro- to millimolar range, populations for the free and ligand-bound states, and the kinetics of interconversion between states, which are typically within the fast exchange regime on the NMR timescale. We recently developed two chemical shift titration methods wherein co-variation of the total protein and ligand concentrations gives increased precision for the K_D value of a 1:1 protein–ligand interaction (Markin and Spyracopoulos in J Biomol NMR 53: 125–138, 2012). In this study, we demonstrate that classical line shape analysis applied to a single set of 1H–15N 2D HSQC NMR spectra acquired using the precise protein–ligand chemical shift titration methods we developed produces accurate and precise kinetic parameters such as the off-rate (k_off). For experimentally determined kinetics in the fast exchange regime on the NMR timescale, k_off ~ 3,000 s^-1 in this work, the accuracy of classical line shape analysis was determined to be better than 5% by conducting quantum mechanical NMR simulations of the chemical shift titration methods with the magnetic resonance toolkit GAMMA. Using Monte Carlo simulations, the experimental precision for k_off from line shape analysis of NMR spectra was determined to be 13%, in agreement with the theoretical precision of 12% from line shape analysis of the GAMMA simulations in the presence of noise and protein concentration errors. In addition, GAMMA simulations were employed to demonstrate that line shape analysis has the potential to provide reasonably accurate and precise k_off values over a wide range, from 100 to 15,000 s^-1. The validity of line shape analysis for k_off values approaching intermediate exchange (~100 s^-1) may be facilitated by more accurate K_D measurements from NMR

  9. Accuracy and precision of protein-ligand interaction kinetics determined from chemical shift titrations.

    Science.gov (United States)

    Markin, Craig J; Spyracopoulos, Leo

    2012-12-01

    NMR-monitored chemical shift titrations for the study of weak protein-ligand interactions represent a rich source of information regarding thermodynamic parameters such as dissociation constants (K_D) in the micro- to millimolar range, populations for the free and ligand-bound states, and the kinetics of interconversion between states, which are typically within the fast exchange regime on the NMR timescale. We recently developed two chemical shift titration methods wherein co-variation of the total protein and ligand concentrations gives increased precision for the K_D value of a 1:1 protein-ligand interaction (Markin and Spyracopoulos in J Biomol NMR 53: 125-138, 2012). In this study, we demonstrate that classical line shape analysis applied to a single set of 1H-15N 2D HSQC NMR spectra acquired using the precise protein-ligand chemical shift titration methods we developed produces accurate and precise kinetic parameters such as the off-rate (k_off). For experimentally determined kinetics in the fast exchange regime on the NMR timescale, k_off ~ 3,000 s^-1 in this work, the accuracy of classical line shape analysis was determined to be better than 5% by conducting quantum mechanical NMR simulations of the chemical shift titration methods with the magnetic resonance toolkit GAMMA. Using Monte Carlo simulations, the experimental precision for k_off from line shape analysis of NMR spectra was determined to be 13%, in agreement with the theoretical precision of 12% from line shape analysis of the GAMMA simulations in the presence of noise and protein concentration errors. In addition, GAMMA simulations were employed to demonstrate that line shape analysis has the potential to provide reasonably accurate and precise k_off values over a wide range, from 100 to 15,000 s^-1. The validity of line shape analysis for k_off values approaching intermediate exchange (~100 s^-1) may be facilitated by more accurate K_D measurements
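
    In the fast-exchange regime these studies work in, the observed shift is the population-weighted average of the free and bound shifts, which is what makes K_D accessible from a titration (line shape analysis for k_off then operates on the same spectra). A minimal sketch of that 1:1 binding model on synthetic values, illustrative rather than the authors' fitting code:

```python
import numpy as np
from scipy.optimize import curve_fit

def delta_obs(L_tot, K_D, delta_free, delta_bound, P_tot=1e-4):
    """Fast-exchange observed shift for 1:1 binding (concentrations in M)."""
    b = P_tot + L_tot + K_D
    PL = (b - np.sqrt(b**2 - 4.0 * P_tot * L_tot)) / 2.0   # bound complex
    p_bound = PL / P_tot
    return (1.0 - p_bound) * delta_free + p_bound * delta_bound

L = np.linspace(0.0, 1e-3, 15)               # ligand titration points, M
shifts = delta_obs(L, 50e-6, 8.10, 8.45)     # synthetic data, K_D = 50 uM
popt, _ = curve_fit(delta_obs, L, shifts, p0=(1e-4, 8.0, 8.5))
print(f"fitted K_D ~ {1e6 * popt[0]:.0f} uM")
```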

  10. High Precision Testbed to Evaluate Ethernet Performance for In-Car Networks

    DEFF Research Database (Denmark)

    Revsbech, Kasper; Madsen, Tatiana Kozlova; Schiøler, Henrik

    2012-01-01

    Validating safety-critical real-time systems such as in-car networks often involves a model-based performance analysis of the network. An important issue in performing such analysis is to provide precise model parameters, matching the actual equipment. One way to obtain such parameters is to derive...... them by measurements of the equipment. In this work we describe the design of a testbed enabling active measurements on up to 1 [Gb/sec] copper-based Ethernet switches. By use of the testbed itself, we conduct a series of tests where the precision of the testbed is estimated. We find a maximum error...

  11. System Would Detect Foreign-Object Damage in Turbofan Engine

    Science.gov (United States)

    Torso, James A.; Litt, Jonathan S.

    2006-01-01

    A proposed data-fusion system, to be implemented mostly in software, would further process the digitized and preprocessed outputs of sensors in a turbofan engine to detect foreign-object damage (FOD) [more precisely, damage caused by impingement of such foreign objects as birds, pieces of ice, and runway debris]. The proposed system could help a flight crew to decide what, if any, response is necessary to complete a flight safely, and could aid mechanics in deciding what post-flight maintenance action might be needed. The sensory information to be utilized by the proposed system would consist of (1) the output of an accelerometer in an engine-vibration-monitoring subsystem and (2) features extracted from a gas path analysis. ["Gas path analysis" (GPA) is a term of art that denotes comprehensive analysis of engine performance derived from readings of fuel-flow meters, shaft-speed sensors, temperature sensors, and the like.] The acceleration signal would first be processed by a wavelet-transform-based algorithm, using a wavelet created for the specific purpose of finding abrupt FOD-induced changes in noisy accelerometer signals. Two additional features extracted would be the amplitude of vibration (determined via a single- frequency Fourier transform calculated at the rotational speed of the engine), and the rate of change in amplitude due to an FOD-induced rotor imbalance. This system would utilize two GPA features: the fan efficiency and the rate of change of fan efficiency with time. The selected GPA and vibrational features would be assessed by two fuzzy-logic inference engines, denoted the "Gas Path Expert" and the "Vibration Expert," respectively (see Figure 1). Each of these inference engines would generate a "possibility" distribution for occurrence of an FOD event: Each inference engine would assign, to its input information, degrees of membership, which would subsequently be transformed into basic probability assignments for the gas path and vibration
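
    The "basic probability assignments" in this description point to Dempster-Shafer-style evidence combination as one way the two experts' outputs could be fused. A minimal sketch of Dempster's rule over a two-hypothesis frame (the proposed system's exact fusion logic is not specified here):

```python
def dempster_combine(m1, m2):
    """Combine two basic probability assignments over {'FOD', 'noFOD'},
    with 'either' carrying the uncommitted (ignorance) mass."""
    sets = {'FOD': {'FOD'}, 'noFOD': {'noFOD'}, 'either': {'FOD', 'noFOD'}}
    combined = {k: 0.0 for k in sets}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = sets[a] & sets[b]
            if not inter:
                conflict += wa * wb            # contradictory evidence
            elif inter == {'FOD', 'noFOD'}:
                combined['either'] += wa * wb
            else:
                combined[next(iter(inter))] += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# e.g. vibration expert fairly confident, gas-path expert noncommittal:
vibration = {'FOD': 0.7, 'noFOD': 0.1, 'either': 0.2}
gas_path = {'FOD': 0.4, 'noFOD': 0.2, 'either': 0.4}
print(dempster_combine(vibration, gas_path))
```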

  12. Smart indoor climate control in precision livestock farming

    DEFF Research Database (Denmark)

    Zhang, Guoqiang; Bjerg, Bjarne Schmidt; Wang, Xiaoshuai

    2016-01-01

    One of the major objectives of precision livestock farming (PLF) is to provide optimal thermal climate control in the animal occupant zones (AOZ) for promoting animal production and wellbeing. To achieve this goal, smart climate models that reflect the needs of different animal species and ages or f...... condition in AOZ. In addition, the paper presents a fundamental principle of the development of an integrated indoor climate sensor to reflect animal thermal wellbeing, and techniques that could be used for smart system design and control are discussed.

  13. Precision Medicine in Cardiovascular Diseases

    Directory of Open Access Journals (Sweden)

    Yan Liu

    2017-02-01

    Since President Obama announced the Precision Medicine Initiative in the United States, more and more attention has been paid to precision medicine; clinicians, however, have already used it to treat conditions such as cancer. Many cardiovascular diseases have a familial presentation, and genetic variants are associated with the prevention, diagnosis, and treatment of cardiovascular diseases; these are the basis for providing precise care to patients with cardiovascular diseases. Large-scale cohorts and multi-omics are critical components of precision medicine. Here we summarize the application of precision medicine to cardiovascular diseases based on cohort and omics studies, and we hope to elicit discussion about future health care.

  14. Precision medicine in breast cancer: reality or utopia?

    Science.gov (United States)

    Bettaieb, Ali; Paul, Catherine; Plenchette, Stéphanie; Shan, Jingxuan; Chouchane, Lotfi; Ghiringhelli, François

    2017-06-17

    Many cancers, including breast cancer, have seen improved prognosis and management thanks to the discovery of targeted therapies. The advent of these new approaches marked the rise of precision medicine, which improves the diagnosis, prognosis and treatment of cancer. Precision medicine takes into account the molecular and biological specificities of patients and their tumors that influence the treatment chosen by physicians. This new era of medicine has become accessible through molecular genetics platforms, the development of high-throughput sequencers, and the means to analyse these data. Despite the spectacular results in the treatment of cancers, including breast cancer, described in this review, not all patients can benefit from this new strategy. This seems to be related to the many genetic mutations, which may differ from one patient to another or within the same patient. It gives new impetus to research, from both a technological and a biological point of view, to make the hope of precision medicine accessible to all.

  15. An Emerging Role for Polystores in Precision Medicine

    Energy Technology Data Exchange (ETDEWEB)

    Begoli, Edmon [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Christian, J. Blair [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gadepally, Vijay [Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States); Papadopoulos, Stavros [TileDB, Inc., Cambridge, MA (United States)

    2017-08-26

    Medical data is organically heterogeneous, and it usually varies significantly in both size and composition. Yet this data is also key to the recent and promising field of precision medicine, which focuses on identifying and tailoring appropriate medical treatments to the needs of individual patients, based on their specific conditions, medical history, lifestyle, genetics, and other individual factors. As we, and the database community at large, recognize that no "one size fits all" solution can work with such data, we present in this paper our observations based on our experiences and applications in the field of precision medicine. We make the case for the use of a polystore architecture and how it applies to precision medicine; we discuss the reference architecture, describe some of its critical components (such as an array database), and discuss the specific types of analysis that directly benefit from this database architecture and the ways it serves the data.

  16. Analysis of precision in chemical oscillators: implications for circadian clocks

    International Nuclear Information System (INIS)

    D'Eysmond, Thomas; De Simone, Alessandro; Naef, Felix

    2013-01-01

    Biochemical reaction networks often exhibit spontaneous self-sustained oscillations. An example is the circadian oscillator that lies at the heart of daily rhythms in behavior and physiology in most organisms including humans. While the period of these oscillators evolved so that it resonates with the 24 h daily environmental cycles, the precision of the oscillator (quantified via the Q factor) is another relevant property of these cell-autonomous oscillators. Since this quantity can be measured in individual cells, it is of interest to better understand how this property behaves across mathematical models of these oscillators. Current theoretical schemes for computing the Q factors show limitations for both high-dimensional models and in the vicinity of Hopf bifurcations. Here, we derive low-noise approximations that lead to numerically stable schemes also in high-dimensional models. In addition, we generalize normal form reductions that are appropriate near Hopf bifurcations. Applying our approximations to two models of circadian clocks, we show that while the low-noise regime is faithfully recapitulated, increasing the level of noise leads to species-dependent precision. We emphasize that subcomponents of the oscillator gradually decouple from the core oscillator as noise increases, which allows us to identify the subnetworks responsible for robust rhythms. (paper)
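
    One common way to quantify such precision is the phase-diffusion picture: for dφ = ω₀ dt + √(2D) dW, the spectral line at ω₀ has width 2D, so Q = ω₀/(2D). The sketch below recovers Q from simulated cells this way; it is a simplified illustration, not the low-noise and normal-form machinery the paper develops:

```python
import numpy as np

w0 = 2.0 * np.pi / 24.0     # rad/h, a 24-h oscillator
D_true = 1e-3               # rad^2/h, phase diffusion coefficient
dt, steps, cells = 0.1, 5000, 400

rng = np.random.default_rng(0)
dphi = w0 * dt + np.sqrt(2.0 * D_true * dt) * rng.standard_normal((cells, steps))
phi = np.cumsum(dphi, axis=1)

# Across cells, Var(phi) grows as 2*D*t; the slope estimates D, hence Q.
t = dt * np.arange(1, steps + 1)
D_est = np.polyfit(t, phi.var(axis=0), 1)[0] / 2.0
print(f"Q ~ {w0 / (2.0 * D_est):.0f}  (true {w0 / (2.0 * D_true):.0f})")
```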

  17. Yale High Energy Physics Research: Precision Studies of Reactor Antineutrinos

    International Nuclear Information System (INIS)

    Heeger, Karsten M.

    2014-01-01

    This report presents experimental research at the intensity frontier of particle physics with particular focus on the study of reactor antineutrinos and the precision measurement of neutrino oscillations. The experimental neutrino physics group of Professor Heeger and Senior Scientist Band at Yale University has had leading responsibilities in the construction and operation of the Daya Bay Reactor Antineutrino Experiment and made critical contributions to the discovery of non-zero $\theta_{13}$. Heeger and Band led the Daya Bay detector management team and are now overseeing the operations of the antineutrino detectors. Postdoctoral researchers and students in this group have made leading contributions to the Daya Bay analysis including the prediction of the reactor antineutrino flux and spectrum, the analysis of the oscillation signal, and the precision determination of the target mass yielding unprecedented precision in the relative detector uncertainty. Heeger's group is now leading an R&D effort towards a short-baseline oscillation experiment, called PROSPECT, at a US research reactor and the development of antineutrino detectors with advanced background discrimination.

  18. Yale High Energy Physics Research: Precision Studies of Reactor Antineutrinos

    Energy Technology Data Exchange (ETDEWEB)

    Heeger, Karsten M. [Yale Univ., New Haven, CT (United States)

    2014-09-13

    This report presents experimental research at the intensity frontier of particle physics with particular focus on the study of reactor antineutrinos and the precision measurement of neutrino oscillations. The experimental neutrino physics group of Professor Heeger and Senior Scientist Band at Yale University has had leading responsibilities in the construction and operation of the Daya Bay Reactor Antineutrino Experiment and made critical contributions to the discovery of non-zero $\theta_{13}$. Heeger and Band led the Daya Bay detector management team and are now overseeing the operations of the antineutrino detectors. Postdoctoral researchers and students in this group have made leading contributions to the Daya Bay analysis including the prediction of the reactor antineutrino flux and spectrum, the analysis of the oscillation signal, and the precision determination of the target mass yielding unprecedented precision in the relative detector uncertainty. Heeger's group is now leading an R&D effort towards a short-baseline oscillation experiment, called PROSPECT, at a US research reactor and the development of antineutrino detectors with advanced background discrimination.

  19. Multi-objective experimental design for (13)C-based metabolic flux analysis.

    Science.gov (United States)

    Bouvin, Jeroen; Cajot, Simon; D'Huys, Pieter-Jan; Ampofo-Asiama, Jerry; Anné, Jozef; Van Impe, Jan; Geeraerd, Annemie; Bernaerts, Kristel

    2015-10-01

    (13)C-based metabolic flux analysis is an excellent technique to resolve fluxes in the central carbon metabolism but costs can be significant when using specialized tracers. This work presents a framework for cost-effective design of (13)C-tracer experiments, illustrated on two different networks. Linear and non-linear optimal input mixtures are computed for networks for Streptomyces lividans and a carcinoma cell line. If only glucose tracers are considered as labeled substrate for a carcinoma cell line or S. lividans, the best parameter estimation accuracy is obtained by mixtures containing high amounts of 1,2-(13)C2 glucose combined with uniformly labeled glucose. Experimental designs are evaluated based on a linear (D-criterion) and non-linear approach (S-criterion). Both approaches generate almost the same input mixture; however, the linear approach is favored due to its low computational effort. The high amount of 1,2-(13)C2 glucose in the optimal designs coincides with a high experimental cost, which is further enhanced when labeling is introduced in glutamine and aspartate tracers. Multi-objective optimization gives the possibility to assess experimental quality and cost at the same time and can reveal excellent compromise experiments. For example, the combination of 100% 1,2-(13)C2 glucose with 100% position-one labeled glutamine and the combination of 100% 1,2-(13)C2 glucose with 100% uniformly labeled glutamine perform equally well for the carcinoma cell line, but the first mixture offers a decrease in cost of $120 per ml-scale cell culture experiment. We demonstrated the validity of a multi-objective linear approach to perform optimal experimental designs for the non-linear problem of (13)C-metabolic flux analysis. Tools and a workflow are provided to perform multi-objective design. The effortless calculation of the D-criterion can be exploited to perform high-throughput screening of possible (13)C-tracers, while the illustrated benefit of multi-objective
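
    After local linearization of the measurement model, the D-criterion praised above for its low computational effort is just a determinant of the Fisher information, which makes the tracer screening loop cheap. A schematic with made-up sensitivity matrices (in practice each J would come from the isotopomer model of the network):

```python
import numpy as np

def d_criterion(J):
    """log det(J^T J): larger is better (tighter joint confidence region)."""
    sign, logdet = np.linalg.slogdet(J.T @ J)
    return -np.inf if sign <= 0 else logdet

rng = np.random.default_rng(3)
# Hypothetical: 20 labeling measurements, 5 free fluxes, 4 candidate mixtures.
candidates = {f"mixture {i}": rng.normal(size=(20, 5)) for i in range(4)}
ranked = sorted(candidates, key=lambda k: d_criterion(candidates[k]), reverse=True)
print("D-optimal candidate:", ranked[0])
```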

  20. SEGMENT OF FINANCIAL CORPORATIONS AS AN OBJECT OF FINANCIAL AND STATISTICAL ANALYSIS

    OpenAIRE

    Marat F. Mazitov

    2013-01-01

    The article is devoted to the study of the specific features of the formation and change of economic assets of financial corporations as an object of management and financial analysis. The author identifies the features and gives a classification of institutional units belonging to the sector of financial corporations from the viewpoint of assessment and financial analysis of the flows reflecting changes in their assets.

  1. Context based Coding of Binary Shapes by Object Boundary Straightness Analysis

    DEFF Research Database (Denmark)

    Aghito, Shankar Manuel; Forchhammer, Søren

    2004-01-01

    A new lossless compression scheme for bilevel images targeted at binary shapes of image and video objects is presented. The scheme is based on a local analysis of the digital straightness of the causal part of the object boundary, which is used in the context definition for arithmetic encoding....... Tested on individual images of binary shapes and binary layers of digital maps the algorithm outperforms PWC, JBIG and MPEG-4 CAE. On the binary shapes the code lengths are reduced by 21%, 25%, and 42%, respectively. On the maps the reductions are 34%, 32%, and 59%, respectively. The algorithm is also...

  2. Combination of optically measured coordinates and displacements for quantitative investigation of complex objects

    Science.gov (United States)

    Andrae, Peter; Beeck, Manfred-Andreas; Jueptner, Werner P. O.; Nadeborn, Werner; Osten, Wolfgang

    1996-09-01

    Holographic interferometry makes it possible to measure high precision displacement data in the range of the wavelength of the used laser light. However, the determination of 3D- displacement vectors of objects with complex surfaces requires the measurement of 3D-object coordinates not only to consider local sensitivities but to distinguish between in-plane deformation, i.e. strains, and out-of-plane components, i.e. shears, too. To this purpose both the surface displacement and coordinates have to be combined and it is advantageous to make the data available for CAE- systems. The object surface has to be approximated analytically from the measured point cloud to generate a surface mesh. The displacement vectors can be assigned to the nodes of this surface mesh for visualization of the deformation of the object under test. They also can be compared to the results of FEM-calculations or can be used as boundary conditions for further numerical investigations. Here the 3D-object coordinates are measured in a separate topometric set-up using a modified fringe projection technique to acquire absolute phase values and a sophisticated geometrical model to map these phase data onto coordinates precisely. The determination of 3D-displacement vectors requires the measurement of several interference phase distributions for at least three independent sensitivity directions depending on the observation and illumination directions as well as the 3D-position of each measuring point. These geometric quantities have to be transformed into a reference coordinate system of the interferometric set-up in order to calculate the geometric matrix. The necessary transformation can be realized by means of a detection of object features in both data sets and a subsequent determination of the external camera orientation. This paper presents a consistent solution for the measurement and combination of shape and displacement data including their transformation into simulation systems. The
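
    The step from "at least three independent sensitivity directions" to a 3D displacement vector is a small linear solve per surface point: each unwrapped interference phase measures the projection of the displacement onto a sensitivity vector s_i = k_obs,i − k_ill,i, giving (2π/λ)·s_i·u = φ_i. A sketch with placeholder geometry (the wavelength and vectors are assumptions, not values from the paper):

```python
import numpy as np

lam = 532e-9                          # laser wavelength, m (assumed)
S = np.array([[ 0.2,  0.1, 1.9],      # hypothetical sensitivity vectors,
              [-0.3,  0.2, 1.8],      # one row per illumination direction
              [ 0.1, -0.4, 1.7]])
phi = np.array([12.4, 10.9, 9.8])     # unwrapped phases at one point, rad

# (2*pi/lam) * S @ u = phi  ->  solve for the displacement vector u.
u = np.linalg.solve(S, phi * lam / (2.0 * np.pi))
print("displacement u [m]:", u)
```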

  3. Modeling Self-Occlusions/Disocclusions in Dynamic Shape and Appearance Tracking for Obtaining Precise Shape

    KAUST Repository

    Yang, Yanchao

    2013-05-01

    We present a method to determine the precise shape of a dynamic object from video. This problem is fundamental to computer vision, and has a number of applications, for example, 3D video/cinema post-production, activity recognition and augmented reality. Current tracking algorithms that determine precise shape can be roughly divided into two categories: 1) Global statistics partitioning methods, where the shape of the object is determined by discriminating global image statistics, and 2) Joint shape and appearance matching methods, where a template of the object from the previous frame is matched to the next image. The former is limited in cases of complex object appearance and cluttered background, where global statistics cannot distinguish between the object and background. The latter is able to cope with complex appearance and a cluttered background, but is limited in cases of camera viewpoint change and object articulation, which induce self-occlusions and self-disocclusions of the object of interest. The purpose of this thesis is to model self-occlusion/disocclusion phenomena in a joint shape and appearance tracking framework. We derive a non-linear dynamic model of the object shape and appearance taking into account occlusion phenomena, which is then used to infer self-occlusions/disocclusions, shape and appearance of the object in a variational optimization framework. To ensure robustness to other unmodeled phenomena that are present in real-video sequences, the Kalman filter is used for appearance updating. Experiments show that our method, which incorporates the modeling of self-occlusion/disocclusion, increases the accuracy of shape estimation in situations of viewpoint change and articulation, and out-performs current state-of-the-art methods for shape tracking.

  4. Characterisation of surface roughness for ultra-precision freeform surfaces

    International Nuclear Information System (INIS)

    Li Huifen; Cheung, C F; Lee, W B; To, S; Jiang, X Q

    2005-01-01

    Ultra-precision freeform surfaces are widely used in many advanced optics applications which demand surface roughness down to the nanometer range. Although a lot of research work has been reported on surface generation, reconstruction and surface characterization, such as MOTIF and fractal analysis, most of it has focused on axially symmetric surfaces such as aspheric surfaces. Relatively little research work has been found on the characterization of surface roughness in ultra-precision freeform surfaces. In this paper, a novel Robust Gaussian Filtering (RGF) method is proposed for the characterisation of surface roughness of ultra-precision freeform surfaces with a known mathematical model or a cloud of discrete points. A series of computer simulations and measurement experiments were conducted to verify the capability of the proposed method. The experimental results were found to agree well with the theoretical results.
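    A minimal sketch of the robust-Gaussian idea: Gaussian smoothing whose weights are iteratively reduced on outliers, so that the roughness reference surface is not distorted by spikes or deep valleys. The iteratively reweighted scheme with a Tukey biweight and the cutoff constant below are assumptions of this sketch, not the paper's exact formulation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def robust_gaussian_filter(z, sigma, n_iter=5):
    """Robust Gaussian reference line for a 1D roughness profile.

    Weighted Gaussian smoothing: mean = G(w*z)/G(w), with weights w
    recomputed each iteration from the residuals (Tukey biweight).
    """
    w = np.ones_like(z, dtype=float)
    for _ in range(n_iter):
        mean = gaussian_filter1d(w * z, sigma) / gaussian_filter1d(w, sigma)
        r = z - mean
        # Robust scale estimate from the median absolute residual.
        c = 4.4 * np.median(np.abs(r)) + 1e-12
        u = np.clip(r / c, -1.0, 1.0)
        w = (1.0 - u**2) ** 2  # Tukey biweight: outliers get weight ~0
    return mean

x = np.linspace(0, 1, 500)
profile = 0.1 * np.sin(40 * x)
profile[100] += 5.0            # an outlier spike
ref = robust_gaussian_filter(profile, sigma=10)
roughness = profile - ref      # roughness = profile minus reference
```

    The same weighting scheme extends to 2D height maps by replacing the 1D convolution with its 2D counterpart.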

  5. Environmental Testing for Precision Parts and Instruments

    International Nuclear Information System (INIS)

    Choi, Man Yong; Park, Jeong Hak; Yun, Kyu Tek

    2001-01-01

    Precision parts and instruments are tested to evaluate performance during the development process and at the product stage, to prevent potential defects due to design failures. In this paper, environmental test technology, which is the basis of reliability analysis, is introduced with examples of test criteria and test methods for products (an encoder and a traffic signal controller) and measuring instruments. Recently, as the importance of environmental test technology has been recognised, it is proposed that the training of test technicians and the technology of jig design and failure analysis are essential.

  6. Comparative study of 2-DOF micromirrors for precision light manipulation

    Science.gov (United States)

    Young, Johanna I.; Shkel, Andrei M.

    2001-08-01

    Many industry experts predict that the future of fiber optic telecommunications depends on the development of all-optical components for switching of photonic signals from fiber to fiber throughout the networks. MEMS is a promising technology for providing all-optical switching at high speeds with significant cost reductions. This paper reports on the analysis of two designs for 2-DOF electrostatically actuated MEMS micromirrors for precisely controllable large optical switching arrays. The behavior of the micromirror designs is predicted by coupled-field electrostatic and modal analysis using finite element analysis (FEA) multi-physics modeling software. The analysis indicates that the commonly used gimbal-type mirror design experiences electrostatic interference and would therefore be difficult to control precisely in 2-DOF motion. We propose a new design approach which preserves 2-DOF actuation while minimizing electrostatic interference between the drive electrodes and the mirror. Instead of using two torsional axes, we use one actuator which combines torsional and flexural DOFs. A comparative analysis of the conventional gimbal design and the one proposed in this paper is performed.

  7. Doublet Pulse Coherent Laser Radar for Tracking of Resident Space Objects

    Science.gov (United States)

    2014-09-01

    ... the point from various compass headings. Tracking 10 cm² cross-section targets in LEO, as well as tracking near-Earth objects (NEOs) such as meteoroids and asteroids, may well be possible. Using a short-pulsewidth doublet pulse coherent ladar technique offers a means for precision tracking; the technique offers the best of both worlds: precise ...

  8. Optical Aperture Synthesis Object's Information Extracting Based on Wavelet Denoising

    International Nuclear Information System (INIS)

    Fan, W J; Lu, Y

    2006-01-01

    Wavelet denoising is studied to improve the extraction of an OAS (optical aperture synthesis) object's Fourier information. Translation-invariant wavelet denoising, based on Donoho's wavelet soft-threshold denoising, is investigated to remove pseudo-Gibbs artifacts from soft-thresholded images. OAS object information extraction based on translation-invariant wavelet denoising is then studied. The study shows that wavelet threshold denoising can improve the precision and repeatability of extracting object information from an interferogram, and that translation-invariant wavelet denoising extracts information better than plain soft-threshold wavelet denoising.
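    A minimal sketch of translation-invariant soft-threshold denoising via cycle spinning: denoise shifted copies of the signal and average the unshifted results, which suppresses the pseudo-Gibbs artifacts of plain soft thresholding. The wavelet choice, decomposition level, and universal threshold are illustrative assumptions.

```python
import numpy as np
import pywt

def soft_threshold_denoise(x, wavelet="db4", level=4):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    # Universal threshold; noise sigma estimated from finest details.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(x)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(x)]

def cycle_spin_denoise(x, n_shifts=16, **kw):
    """Translation-invariant denoising: average over circular shifts."""
    acc = np.zeros(len(x), dtype=float)
    for s in range(n_shifts):
        shifted = np.roll(x, s)
        acc += np.roll(soft_threshold_denoise(shifted, **kw), -s)
    return acc / n_shifts

t = np.linspace(0, 1, 1024)
clean = np.sign(np.sin(6 * np.pi * t))          # blocky test signal
noisy = clean + 0.2 * np.random.randn(t.size)
denoised = cycle_spin_denoise(noisy)
```

    Averaging over shifts is what removes the shift dependence of the wavelet transform, which is the source of the pseudo-Gibbs oscillations near discontinuities.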

  9. Precision of coherence analysis to detect cerebral autoregulation by near-infrared spectroscopy in preterm infants

    DEFF Research Database (Denmark)

    Hahn, GH; Christensen, KB; Leung, TS

    2010-01-01

    Coherence between spontaneous fluctuations in arterial blood pressure (ABP) and the cerebral near-infrared spectroscopy signal can detect cerebral autoregulation. Because reliable measurement depends on signals with a high signal-to-noise ratio, we hypothesized that coherence is more precisely determined when fluctuations in ABP are large rather than small. Therefore, we investigated whether adjusting for variability in ABP (variabilityABP) improves precision. We examined the impact of variabilityABP within the power spectrum in each measurement and between repeated measurements in preterm infants. We also examined the total monitoring time required to discriminate among infants with a simulation study. We studied 22 preterm infants. Adjusting for variabilityABP within the power spectrum did not improve the precision. However, adjusting ...
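    The underlying measurement, magnitude-squared coherence between slow ABP and NIRS fluctuations, can be computed with Welch-averaged spectra as in this minimal sketch; the sampling rate, segment length, and the low-frequency band of interest are illustrative assumptions.

```python
import numpy as np
from scipy.signal import coherence

fs = 2.0                        # sampling rate in Hz (assumed)
t = np.arange(0, 600, 1 / fs)   # 10 minutes of data
abp = np.sin(2 * np.pi * 0.05 * t) + 0.5 * np.random.randn(t.size)
nirs = 0.4 * np.sin(2 * np.pi * 0.05 * t + 0.6) + 0.5 * np.random.randn(t.size)

f, Cxy = coherence(abp, nirs, fs=fs, nperseg=256)

# Mean coherence in a very-low-frequency band (0.003-0.1 Hz, assumed),
# where cerebral autoregulation operates.
band = (f >= 0.003) & (f <= 0.1)
print(f"mean LF coherence: {Cxy[band].mean():.2f}")
```

    High coherence in this band indicates pressure-passive cerebral perfusion, i.e. impaired autoregulation; the paper's question is how the precision of this estimate depends on the size of the ABP fluctuations.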

  10. Assessing the readiness of precision medicine interoperability: An exploratory study of the National Institutes of Health genetic testing registry

    Directory of Open Access Journals (Sweden)

    Jay G Ronquillo

    2017-11-01

    Background: Precision medicine involves three major innovations currently taking place in healthcare: electronic health records, genomics, and big data. A major challenge for healthcare providers, however, is understanding the readiness for practical application of initiatives like precision medicine. Objective: To better understand the current state and challenges of precision medicine interoperability using a national genetic testing registry as a starting point, placed in the context of established interoperability formats. Methods: We performed an exploratory analysis of the National Institutes of Health Genetic Testing Registry. Relevant standards included the Health Level Seven International Version 3 Implementation Guide for Family History, the Human Genome Organization Gene Nomenclature Committee (HGNC) database, and the Systematized Nomenclature of Medicine – Clinical Terms (SNOMED CT). We analyzed the distribution of genetic testing laboratories, genetic test characteristics, and standardized genome/clinical code mappings, stratified by laboratory setting. Results: There were a total of 25,472 genetic tests from 240 laboratories testing for approximately 3,632 distinct genes. Most tests focused on diagnosis, mutation confirmation, and/or risk assessment of germline mutations that could be passed to offspring. Genes were successfully mapped to all HGNC identifiers, but less than half of the tests mapped to SNOMED CT codes, highlighting significant gaps when linking genetic tests to standardized clinical codes that explain the medical motivations behind test ordering. Conclusion: While precision medicine could potentially transform healthcare, successful practical and clinical application will first require the comprehensive and responsible adoption of interoperable standards, terminologies, and formats across all aspects of the precision medicine pipeline.
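    A sketch of the kind of coverage analysis described: given a table of registry tests, compute the share mapping to SNOMED CT codes, stratified by laboratory setting. The column names and toy rows are hypothetical; the real registry export has its own schema.

```python
import pandas as pd

# Hypothetical extract of registry records.
tests = pd.DataFrame({
    "test_id":     [1, 2, 3, 4, 5],
    "lab_setting": ["academic", "academic", "commercial",
                    "commercial", "hospital"],
    "hgnc_id":     ["HGNC:1100", "HGNC:795", "HGNC:26144",
                    "HGNC:1101", "HGNC:583"],
    "snomed_code": ["254843006", None, None, "399068003", None],
})

coverage = (tests.assign(has_snomed=tests["snomed_code"].notna())
                 .groupby("lab_setting")["has_snomed"]
                 .mean())
print(coverage)                              # coverage per lab setting
print(tests["snomed_code"].notna().mean())   # overall SNOMED coverage
```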

  11. Non-destructive analysis of museum objects by fibre-optic Raman spectroscopy.

    Science.gov (United States)

    Vandenabeele, Peter; Tate, Jim; Moens, Luc

    2007-02-01

    Raman spectroscopy is a versatile technique that has frequently been applied for the investigation of art objects. By using mobile Raman instrumentation it is possible to investigate the artworks without the need for sampling. This work evaluates the use of a dedicated mobile spectrometer for the investigation of a range of museum objects in museums in Scotland, including antique Egyptian sarcophagi, a panel painting, painted surfaces on paper and textile, and the painted lid and soundboard of an early keyboard instrument. The investigations of these artefacts illustrate some analytical challenges that arise when analysing museum objects, including fluorescing varnish layers, ambient sunlight, large dimensions of artefacts and the need to handle fragile objects with care. Analysis of the musical instrument (the Mar virginals) was undertaken in the exhibition gallery, while on display, which meant that interaction with the public and health and safety issues had to be taken into account. [Figure: experimental set-up for the non-destructive Raman spectroscopic investigation of a textile banner in the National Museums of Scotland.]

  12. Optimizing top precision performance measure of content-based image retrieval by learning similarity function

    KAUST Repository

    Liang, Ru-Ze

    2017-04-24

    In this paper we study the problem of content-based image retrieval. In this problem, the most popular performance measure is the top precision measure, and the most important component of a retrieval system is the similarity function used to compare a query image against a database image. However, up to now, there is no existing similarity learning method proposed to optimize the top precision measure. To fill this gap, in this paper, we propose a novel similarity learning method to maximize the top precision measure. We model this problem as a minimization problem with an objective function as the combination of the losses of the relevant images ranked behind the top-ranked irrelevant image, and the squared Frobenius norm of the similarity function parameter. This minimization problem is solved as a quadratic programming problem. The experiments over two benchmark data sets show the advantages of the proposed method over other similarity learning methods when the top precision is used as the performance measure.
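    Written out, one plausible reading of the objective described above is the following, where $s_W(q,x) = q^{\top} W x$ is the learned similarity, $\mathcal{R}$ and $\mathcal{I}$ are the relevant and irrelevant database images for query $q$, and $[\cdot]_+$ is the hinge; the unit margin is an assumption of this sketch.

```latex
\min_{W}\;
\sum_{j \in \mathcal{R}}
\Bigl[\, \max_{k \in \mathcal{I}} s_W(q, x_k) \;-\; s_W(q, x_j) \;+\; 1 \Bigr]_{+}
\;+\; \frac{\lambda}{2}\,\lVert W \rVert_F^{2},
\qquad s_W(q, x) = q^{\top} W x .
```

    Since the hinge terms are piecewise linear and the regularizer is quadratic, introducing slack variables turns this into the quadratic program the abstract refers to.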

  14. Precision Airdrop (Largage de precision)

    Science.gov (United States)

    2005-12-01

    ... the point from various compass headings. As the tests are conducted, the resultant ... rate. This approach avoids including a magnetic compass for the heading reference, which has difficulties due to local changes in the magnetic field.

  15. Precision medicine for nurses: 101.

    Science.gov (United States)

    Lemoine, Colleen

    2014-05-01

    To introduce the key concepts and terms associated with precision medicine and support understanding of future developments in the field by providing an overview and history of precision medicine, related ethical considerations, and nursing implications. Data sources: current nursing, medical, and basic science literature. Rapid progress in understanding the oncogenic drivers associated with cancer is leading to a shift toward precision medicine, where treatment is based on targeting specific genetic and epigenetic alterations associated with a particular cancer. Nurses will need to embrace the paradigm shift to precision medicine, expend the effort necessary to learn the essential terminology, concepts and principles, and work collaboratively with physician colleagues to best position our patients to maximize the potential that precision medicine can offer. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Scalable and Precise Abstraction of Programs for Trustworthy Software

    Science.gov (United States)

    2017-01-01

    A mobile contract infrastructure for Java, with corresponding higher-order generalizations of relational abstract domains ... precision of a static analysis. These parameters induce an analytic framework that spans a continuum from the null analysis up to the concrete semantics ... inevitable merging that creates false positives. Global monotonicity still guarantees termination.

  17. Modeling Self-Occlusions/Disocclusions in Dynamic Shape and Appearance Tracking for Obtaining Precise Shape

    KAUST Repository

    Yang, Yanchao

    2013-01-01

    We present a method to determine the precise shape of a dynamic object from video. This problem is fundamental to computer vision, and has a number of applications, for example, 3D video/cinema post-production, activity recognition and augmented

  18. Precision genome editing

    DEFF Research Database (Denmark)

    Steentoft, Catharina; Bennett, Eric P; Schjoldager, Katrine Ter-Borch Gram

    2014-01-01

    Precise and stable gene editing in mammalian cell lines has until recently been hampered by the lack of efficient targeting methods. While different gene silencing strategies have had tremendous impact on many biological fields, they have generally not been applied with wide success in the field of glycobiology, primarily due to their low efficiencies, with resultant failure to impose substantial phenotypic consequences upon the final glycosylation products. Here, we review novel nuclease-based precision genome editing techniques enabling efficient and stable gene editing, including gene disruption by introducing single or double-stranded breaks at a defined genomic sequence. We compare and contrast the different techniques and summarize their current applications, highlighting cases from the field of glycobiology as well as pointing to future opportunities and the emerging potential of precision gene editing.

  19. Automation of Precise Time Reference Stations (PTRS)

    Science.gov (United States)

    Wheeler, P. J.

    1985-04-01

    The U.S. Naval Observatory is presently engaged in a program of automating precise time stations (PTS) and precise time reference stations (PTRS) by using a versatile minicomputer-controlled data acquisition system (DAS). The data acquisition system is configured to monitor locally available PTTI signals such as LORAN-C, OMEGA, and/or the Global Positioning System. In addition, the DAS performs local standard intercomparison. Computer telephone communications provide automatic data transfer to the Naval Observatory. Subsequently, after analysis of the data, results and information can be sent back to the precise time reference station to provide automatic control of remote station timing. The DAS configuration is designed around state-of-the-art, high-reliability industrial modules. The system integration and software are standardized but allow considerable flexibility to satisfy special local requirements such as stability measurements, performance evaluation, and the printing of messages and certificates. The DAS operates completely independently and may be queried or controlled at any time with a computer or terminal device (control is protected for use by authorized personnel only). Such DAS-equipped PTS are operational in Hawaii, California, Texas and Florida.

  20. Autonomous Space Object Catalogue Construction and Upkeep Using Sensor Control Theory

    Science.gov (United States)

    Moretti, N.; Rutten, M.; Bessell, T.; Morreale, B.

    The capability to track objects in space is critical to safeguard domestic and international space assets. Infrequent measurement opportunities, complex dynamics and partial observability of the orbital state make the tracking of resident space objects nontrivial. It is not uncommon for human operators to intervene with space tracking systems, particularly in scheduling sensors. This paper details the development of a system that maintains a catalogue of geostationary objects by dynamically tasking sensors in real time to manage the uncertainty of object states. As the number of objects in space grows, the potential for collision grows exponentially. Being able to provide accurate assessments to operators regarding costly collision avoidance manoeuvres is paramount, and the accuracy of these assessments is highly dependent on how object states are estimated. The system represents object state and uncertainty using particles and utilises a particle filter for state estimation. Particle filters capture the model and measurement uncertainty accurately, allowing for a more comprehensive representation of the state's probability density function. Additionally, the number of objects in space is growing disproportionately to the number of sensors used to track them. Maintaining precise positions for all objects places large loads on sensors, limiting the time available to search for new objects or track high-priority objects. Rather than precisely tracking all objects, our system manages the uncertainty in orbital state for each object independently. The uncertainty is allowed to grow, and sensor data is only requested when the uncertainty must be reduced, for example when object uncertainties overlap, leading to data association issues, or when the uncertainty grows beyond a field of view. These control laws are formulated into a cost function, which is optimised in real time to task sensors. By controlling an optical telescope, the system has been able to construct and maintain a catalogue of geostationary objects.
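    A minimal sketch of the uncertainty-driven tasking rule described above: each object's state uncertainty, summarized from its particle cloud, is allowed to grow, and an observation is requested only when a threshold (here, a fraction of the sensor field of view) is exceeded. The threshold, the field of view, and the single-angle state are illustrative assumptions.

```python
import numpy as np

FOV = 0.05             # sensor field of view in radians (assumed)
THRESHOLD = 0.5 * FOV  # request data before uncertainty fills the FOV

def angular_uncertainty(particles):
    """1-sigma spread of a particle cloud in a single angle (rad)."""
    return float(np.std(particles))

def select_tasks(catalogue):
    """Return objects whose uncertainty demands a measurement,
    most uncertain first, so the scheduler services them first."""
    due = [(angular_uncertainty(p), name)
           for name, p in catalogue.items()
           if angular_uncertainty(p) > THRESHOLD]
    return [name for _, name in sorted(due, reverse=True)]

rng = np.random.default_rng(0)
catalogue = {
    "GEO-1": rng.normal(1.20, 0.001, 1000),   # tight cloud: no task
    "GEO-2": rng.normal(0.80, 0.040, 1000),   # diffuse cloud: task it
}
print(select_tasks(catalogue))  # -> ['GEO-2']
```

    A production system would fold such trigger conditions (overlapping uncertainties, field-of-view limits) into a single cost function optimised in real time, as the abstract describes.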

  1. Precision Cosmology

    Science.gov (United States)

    Jones, Bernard J. T.

    2017-04-01

    Preface; Notation and conventions; Part I. 100 Years of Cosmology: 1. Emerging cosmology; 2. The cosmic expansion; 3. The cosmic microwave background; 4. Recent cosmology; Part II. Newtonian Cosmology: 5. Newtonian cosmology; 6. Dark energy cosmological models; 7. The early universe; 8. The inhomogeneous universe; 9. The inflationary universe; Part III. Relativistic Cosmology: 10. Minkowski space; 11. The energy momentum tensor; 12. General relativity; 13. Space-time geometry and calculus; 14. The Einstein field equations; 15. Solutions of the Einstein equations; 16. The Robertson-Walker solution; 17. Congruences, curvature and Raychaudhuri; 18. Observing and measuring the universe; Part IV. The Physics of Matter and Radiation: 19. Physics of the CMB radiation; 20. Recombination of the primeval plasma; 21. CMB polarisation; 22. CMB anisotropy; Part V. Precision Tools for Precision Cosmology: 23. Likelihood; 24. Frequentist hypothesis testing; 25. Statistical inference: Bayesian; 26. CMB data processing; 27. Parametrising the universe; 28. Precision cosmology; 29. Epilogue; Appendix A. SI, CGS and Planck units; Appendix B. Magnitudes and distances; Appendix C. Representing vectors and tensors; Appendix D. The electromagnetic field; Appendix E. Statistical distributions; Appendix F. Functions on a sphere; Appendix G. Acknowledgements; References; Index.

  2. Flight-Test Evaluation of Kinematic Precise Point Positioning of Small UAVs

    Directory of Open Access Journals (Sweden)

    Jason N. Gross

    2016-01-01

    An experimental analysis of Global Positioning System (GPS) flight data collected onboard a Small Unmanned Aerial Vehicle (SUAV) is conducted in order to demonstrate that postprocessed kinematic Precise Point Positioning (PPP) solutions with precisions of approximately 6 cm 3D Residual Sum of Squares (RSOS) can be obtained on SUAVs that have short-duration flights with limited observational periods (i.e., only about 5 minutes of data). This is a significant result for the UAV flight testing community, because an important and relevant benefit of the PPP technique over traditional Differential GPS (DGPS) techniques, such as Real-Time Kinematic (RTK), is that there is no requirement for maintaining a short baseline separation to a differential GNSS reference station. Because SUAVs are an attractive platform for applications such as aerial surveying, precision agriculture, and remote sensing, this paper offers an experimental evaluation of kinematic PPP estimation strategies using SUAV platform data. In particular, an analysis is presented in which the position solutions obtained from postprocessing recorded UAV flight data with various PPP software and strategies are compared to solutions obtained using traditional double-differenced ambiguity-fixed carrier-phase Differential GPS (CP-DGPS). This offers valuable insight to assist designers of SUAV navigation systems whose applications require precise positioning.
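    For clarity, the 3D RSOS precision metric quoted above can be computed from per-epoch position residuals against the reference solution roughly as follows; combining per-axis RMS values into a single 3D figure is our reading of the metric, not the paper's verbatim definition.

```python
import numpy as np

def rsos_3d(residuals_neu):
    """3D residual-sum-of-squares precision (m).

    residuals_neu : (N, 3) array of north/east/up position residuals
    of the PPP solution relative to the CP-DGPS reference, per epoch.
    """
    rms_per_axis = np.sqrt(np.mean(residuals_neu**2, axis=0))
    return float(np.sqrt(np.sum(rms_per_axis**2)))

res = 0.03 * np.random.randn(300, 3)     # synthetic 3 cm/axis residuals
print(f"3D RSOS: {rsos_3d(res):.3f} m")  # ~0.05 m for this toy case
```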

  3. Electroweak precision data and gravitino dark matter

    Indian Academy of Sciences (India)

    We analyze the precision observables in the context of the GDM, focusing on parameter combinations that fulfill 0.094 < Ω_CDM h² < 0.129 [7]. In order to simplify the analysis in a motivated manner, we ... m_1/2 discussed above maps into an analogous preference for moderate tan β (see ref. [2]). It can be shown that, at the ...

  4. Accuracy and precision of protein-ligand interaction kinetics determined from chemical shift titrations

    Energy Technology Data Exchange (ETDEWEB)

    Markin, Craig J.; Spyracopoulos, Leo, E-mail: leo.spyracopoulos@ualberta.ca [University of Alberta, Department of Biochemistry (Canada)

    2012-12-15

    NMR-monitored chemical shift titrations for the study of weak protein-ligand interactions represent a rich source of information regarding thermodynamic parameters such as dissociation constants (K_D) in the micro- to millimolar range, populations for the free and ligand-bound states, and the kinetics of interconversion between states, which are typically within the fast exchange regime on the NMR timescale. We recently developed two chemical shift titration methods wherein co-variation of the total protein and ligand concentrations gives increased precision for the K_D value of a 1:1 protein-ligand interaction (Markin and Spyracopoulos in J Biomol NMR 53:125-138, 2012). In this study, we demonstrate that classical line shape analysis, applied to a single set of ¹H-¹⁵N 2D HSQC NMR spectra acquired using the precise protein-ligand chemical shift titration methods we developed, produces accurate and precise kinetic parameters such as the off-rate (k_off). For experimentally determined kinetics in the fast exchange regime on the NMR timescale, k_off ≈ 3,000 s⁻¹ in this work, the accuracy of classical line shape analysis was determined to be better than 5% by conducting quantum mechanical NMR simulations of the chemical shift titration methods with the magnetic resonance toolkit GAMMA. Using Monte Carlo simulations, the experimental precision for k_off from line shape analysis of NMR spectra was determined to be 13%, in agreement with the theoretical precision of 12% from line shape analysis of the GAMMA simulations in the presence of noise and protein concentration errors. In addition, GAMMA simulations were employed to demonstrate that line shape analysis has the potential to provide reasonably accurate and precise k_off values over a wide range, from 100 to 15,000 s⁻¹. The validity of line shape analysis for k_off values approaching intermediate exchange (≈100 s⁻¹) may be facilitated by ...

  5. Isolation and genetic analysis of pure cells from forensic biological mixtures: The precision of a digital approach.

    Science.gov (United States)

    Fontana, F; Rapone, C; Bregola, G; Aversa, R; de Meo, A; Signorini, G; Sergio, M; Ferrarini, A; Lanzellotto, R; Medoro, G; Giorgini, G; Manaresi, N; Berti, A

    2017-07-01

    The latest genotyping technologies make it possible to achieve a reliable genetic profile for offender identification even from extremely minute biological evidence. The ultimate challenge occurs when genetic profiles need to be retrieved from a mixture, which is composed of biological material from two or more individuals. In this case, DNA profiling will often result in a complex genetic profile, which is then subject matter for statistical analysis. In principle, when several individuals contribute to a mixture with different biological fluids, their single genetic profiles can be obtained by separating the distinct cell types (e.g. epithelial cells, blood cells, sperm) prior to genotyping. Different approaches have been investigated for this purpose, such as fluorescence-activated cell sorting (FACS) or laser capture microdissection (LCM), but currently none of these methods can guarantee the complete separation of the different types of cells present in a mixture. In other fields of application, such as oncology, DEPArray™ technology, an image-based, microfluidic digital sorter, has been widely proven to enable the separation of pure cells with single-cell precision. This study investigates the applicability of DEPArray™ technology to forensic sample analysis, focusing on the resolution of the forensic mixture problem. For the first time, we report here the development of an application-specific DEPArray™ workflow enabling the detection and recovery of pure homogeneous cell pools from simulated blood/saliva and semen/saliva mixtures, providing a full genetic match with the genetic profiles of the corresponding donors. In addition, we assess the performance of standard forensic methods for DNA quantitation and genotyping on low-count, DEPArray™-isolated cells, showing that pure, almost complete profiles can be obtained from as few as ten haploid cells. Finally, we explore the applicability to real casework samples, demonstrating that the described approach provides complete ...

  6. Electroweak precision measurements in CMS

    CERN Document Server

    Dordevic, Milos

    2017-01-01

    An overview of recent results on electroweak precision measurements from the CMS Collaboration is presented. Studies of the weak boson differential transverse momentum spectra, Z boson angular coefficients, forward-backward asymmetry of Drell-Yan lepton pairs and charge asymmetry of W boson production are made in comparison to the state-of-the-art Monte Carlo generators and theoretical predictions. The results show a good agreement with the Standard Model. As a proof of principle for future W mass measurements, a W-like analysis of the Z boson mass is performed.

  7. Metastability in reversible diffusion processes II. Precise asymptotics for small eigenvalues

    CERN Document Server

    Bovier, A; Klein, M

    2002-01-01

    We continue the analysis of the problem of metastability for reversible diffusion processes, initiated in [BEGK3], with a precise analysis of the low-lying spectrum of the generator. Recall that we are considering processes with generators of the form $-\epsilon \Delta + \nabla F \cdot \nabla$ ...

  8. Multi-objective Analysis for a Sequencing Planning of Mixed-model Assembly Line

    Science.gov (United States)

    Shimizu, Yoshiaki; Waki, Toshiya; Yoo, Jae Kyu

    Diversified customer demands are raising the importance of just-in-time and agile manufacturing much more than before. Accordingly, the introduction of mixed-model assembly lines has become popular as a way to realize small-lot, multi-product manufacturing. Since such a line produces various models on the same assembly line, rational management is of special importance. With this point of view, this study focuses on a sequencing problem for a mixed-model assembly line that includes a paint line as its preceding process. By taking the paint line into account as well, reducing work-in-process (WIP) inventory between these heterogeneous lines becomes a major concern of the sequencing problem, besides improving production efficiency. We formulate the sequencing problem as a bi-objective optimization problem that simultaneously prevents various line stoppages and reduces the volume of WIP inventory, and we propose a practical method for the multi-objective analysis. For this purpose, we applied the weighting method to derive the Pareto front; the resulting scalarized problem is solved by a meta-heuristic method, SA (simulated annealing). Through numerical experiments, we verified the validity of the proposed approach and discussed the significance of trade-off analysis between the conflicting objectives.
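    A minimal sketch of weighted-sum scalarization with simulated annealing over production sequences. The two toy objective functions stand in for the paper's line-stoppage and WIP measures, which are assumptions here; only the overall scheme (weighting method plus SA, swept over weights to trace the Pareto front) follows the abstract.

```python
import math
import random

def stoppage_proxy(seq):
    """Toy objective 1: penalize identical models back to back."""
    return sum(a == b for a, b in zip(seq, seq[1:]))

def wip_proxy(seq):
    """Toy objective 2: penalize model changes (paint-line WIP proxy)."""
    return sum(a != b for a, b in zip(seq, seq[1:]))

def weighted_cost(seq, w):
    return w * stoppage_proxy(seq) + (1 - w) * wip_proxy(seq)

def anneal(seq, w, t0=5.0, cooling=0.995, steps=20000,
           rng=random.Random(1)):
    cur, t = list(seq), t0
    best = list(cur)
    for _ in range(steps):
        i, j = rng.sample(range(len(cur)), 2)   # swap two positions
        cand = list(cur)
        cand[i], cand[j] = cand[j], cand[i]
        d = weighted_cost(cand, w) - weighted_cost(cur, w)
        if d <= 0 or rng.random() < math.exp(-d / t):
            cur = cand
            if weighted_cost(cur, w) < weighted_cost(best, w):
                best = list(cur)
        t *= cooling
    return best

demand = ["A"] * 6 + ["B"] * 4 + ["C"] * 2   # models to sequence
for w in (0.1, 0.5, 0.9):                    # sweep weights -> Pareto set
    s = anneal(demand, w)
    print(w, stoppage_proxy(s), wip_proxy(s))
```

    Because the two proxies conflict (one rewards alternation, the other rewards runs of the same model), sweeping the weight w traces out the trade-off curve between them, mirroring the Pareto-front derivation in the paper.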

  9. Ultra-wideband ranging precision and accuracy

    International Nuclear Information System (INIS)

    MacGougan, Glenn; O'Keefe, Kyle; Klukas, Richard

    2009-01-01

    This paper provides an overview of ultra-wideband (UWB) in the context of ranging applications and assesses the precision and accuracy of UWB ranging from both a theoretical perspective and a practical perspective using real data. The paper begins with a brief history of UWB technology and the most current definition of what constitutes a UWB signal. The potential precision of UWB ranging is assessed using Cramér–Rao lower bound analysis. UWB ranging methods are described and potential error sources are discussed. Two types of commercially available UWB ranging radios, which are used in testing, are introduced. Actual ranging accuracy is assessed from line-of-sight testing under benign signal conditions by comparison to high-accuracy electronic distance measurements and to ranges derived from GPS real-time kinematic positioning. Range measurements obtained in outdoor testing with line-of-sight obstructions and strong reflection sources are compared to ranges derived from classically surveyed positions. The paper concludes with a discussion of the potential applications for UWB ranging.
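    For reference, Cramér–Rao lower bound analysis of ranging precision is typically built on the standard time-of-arrival bound, in which precision improves with signal bandwidth and SNR; this is the textbook form, not a formula quoted from the paper.

```latex
\operatorname{var}(\hat{\tau}) \;\ge\; \frac{1}{8\pi^{2}\beta^{2}\,\mathrm{SNR}},
\qquad
\sigma_{d} \;=\; c\,\sigma_{\tau} \;\ge\; \frac{c}{2\sqrt{2}\,\pi\,\beta\,\sqrt{\mathrm{SNR}}},
```

    where β is the effective (RMS) signal bandwidth and c the speed of light. The gigahertz-scale bandwidth of UWB signals is what makes centimetre-level ranging precision possible in principle.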

  10. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Purpose. The development of complicated techniques for production and management processes, information systems, computer science, and applied objects of systems theory requires improved mathematical methods and new approaches for the investigation of application systems. The variety and diversity of subject systems make it necessary to develop a model that generalizes classical sets and their development, sets of sets. Multiple objects, unlike sets, are constructed from multiple structures and are represented by structure and content. The aim of the work is the analysis of multiple structures generating multiple objects, and the further development of operations on these objects in application systems. Methodology. To achieve the objectives of the research, the structure of a multiple object is represented as a constructive trio consisting of medium, signature and axiomatics. A multiple object is determined by its structure and content, and is represented by a hybrid superposition composed of sets, multisets, ordered sets (lists) and heterogeneous sets (sequences, tuples). Findings. In this paper we study the properties and characteristics of the components of hybrid multiple objects of complex systems, propose assessments of their complexity, and show the rules of internal and external operations on the objects of implementation. We introduce relations of arbitrary order over multiple objects, and define descriptions of functions and mappings on objects of multiple structures. Originality. In this paper we consider the development of multiple structures generating multiple objects. Practical value. The transition from abstract to subject-specific multiple structures requires the transformation of the system and of multiple objects. The transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites) and particularization (goals). The proposed approach to describing systems is based on hybrid sets ...

  11. Comparing the Precision of Information Retrieval of MeSH-Controlled Vocabulary Search Method and a Visual Method in the Medline Medical Database.

    Science.gov (United States)

    Hariri, Nadjla; Ravandi, Somayyeh Nadi

    2014-01-01

    Medline is one of the most important databases in the biomedical field. One of the most important hosts for Medline is Elton B. Stephens CO. (EBSCO), which offers different search methods that can be used based on the needs of the users. Visual search and MeSH-controlled search methods are among the most common. The goal of this research was to compare the precision of the retrieved sources in the EBSCO Medline base using the MeSH-controlled and visual search methods. This research was a semi-empirical study. By holding training workshops, 70 students of higher education in different educational departments of Kashan University of Medical Sciences were taught the MeSH-controlled and visual search methods in 2012. Then, the precision of 300 searches made by these students was calculated based on the Best Precision, Useful Precision, and Objective Precision formulas and analyzed in SPSS software using the independent-sample t-test, and the three precisions obtained with the three precision formulas were studied for the two search methods. The mean precision of the visual method was greater than that of the MeSH-controlled search for all three types of precision, i.e. Best Precision, Useful Precision, and Objective Precision, and their mean precisions were significantly different. Fifty-three percent of the participants in the research also mentioned that the use of the combination of the two methods produced better results. For users, it is more appropriate to use a natural-language-based method, such as the visual method, in the EBSCO Medline host than to use the controlled method, which requires users to use special keywords. The potential reason for their preference was that the visual method allowed them more freedom of action.

  12. Understanding and Optimizing Asynchronous Low-Precision Stochastic Gradient Descent

    Science.gov (United States)

    De Sa, Christopher; Feldman, Matthew; Ré, Christopher; Olukotun, Kunle

    2018-01-01

    Stochastic gradient descent (SGD) is one of the most popular numerical algorithms used in machine learning and other domains. Since this is likely to continue for the foreseeable future, it is important to study techniques that can make it run fast on parallel hardware. In this paper, we provide the first analysis of a technique called Buckwild! that uses both asynchronous execution and low-precision computation. We introduce the DMGC model, the first conceptualization of the parameter space that exists when implementing low-precision SGD, and show that it provides a way to both classify these algorithms and model their performance. We leverage this insight to propose and analyze techniques to improve the speed of low-precision SGD. First, we propose software optimizations that can increase throughput on existing CPUs by up to 11×. Second, we propose architectural changes, including a new cache technique we call an obstinate cache, that increase throughput beyond the limits of current-generation hardware. We also implement and analyze low-precision SGD on the FPGA, which is a promising alternative to the CPU for future SGD systems. PMID:29391770
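    A minimal sketch of the low-precision idea analyzed in the paper: gradients are quantized to 8-bit integers with a shared scale before being applied, as one simple instance of the design space the DMGC model classifies. The quantization scheme and the learning task here are illustrative assumptions, not Buckwild!'s exact implementation.

```python
import numpy as np

def quantize_int8(g):
    """Symmetric 8-bit quantization with a shared scale factor."""
    scale = np.max(np.abs(g)) / 127.0 + 1e-12
    q = np.clip(np.round(g / scale), -127, 127).astype(np.int8)
    return q, scale

def low_precision_sgd(X, y, lr=0.1, epochs=50):
    """Linear regression by SGD with int8-quantized gradients."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for i in np.random.permutation(len(X)):
            g = (X[i] @ w - y[i]) * X[i]         # full-precision gradient
            q, s = quantize_int8(g)              # quantize ...
            w -= lr * s * q.astype(np.float64)   # ... apply dequantized
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.normal(size=200)
print(low_precision_sgd(X, y))  # close to w_true despite int8 gradients
```

    The asynchronous half of Buckwild! then runs such updates from many threads without locks; the paper's contribution is modeling how the quantization and the resulting update races jointly affect throughput and convergence.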

  13. Software for precise tracking of cell proliferation

    International Nuclear Information System (INIS)

    Kurokawa, Hiroshi; Noda, Hisayori; Sugiyama, Mayu; Sakaue-Sawano, Asako; Fukami, Kiyoko; Miyawaki, Atsushi

    2012-01-01

    Highlights: ► We developed software for analyzing cultured cells that divide as well as migrate. ► The active contour model (Snakes) was used as the core algorithm. ► Time-backward analysis was also used for efficient detection of cell division. ► With user-interactive correction functions, the software enables precise tracking. ► The software was successfully applied to cells with fluorescently labeled nuclei. Abstract: We have developed a multi-target cell tracking program, TADOR, which we applied to a series of fluorescence images. TADOR is based on an active contour model that is modified to be free of the problem of locally optimal solutions, and is thus resistant to signal fluctuation and morphological changes. Due to the adoption of backward tracing and the addition of user-interactive correction functions, TADOR is used in an off-line and semi-automated mode, but enables precise tracking of cell division. By applying TADOR to the analysis of cultured cells whose nuclei had been fluorescently labeled, we tracked cell division and cell-cycle progression on coverslips over an extended period of time.

  14. High-speed scanning stroboscopic fringe-pattern projection technology for three-dimensional shape precision measurement.

    Science.gov (United States)

    Yang, Guowei; Sun, Changku; Wang, Peng; Xu, Yixin

    2014-01-10

    A high-speed scanning stroboscopic fringe-pattern projection system is designed. A high-speed rotating polygon mirror and a line-structured laser cooperate to produce stable and unambiguous stroboscopic fringe patterns. The system combines the rapidity of grating projection with the high accuracy of a line-structured laser light source. The fringe patterns have a fast frame rate, high density, high precision, and high brightness, and their brightness, frequency, linewidth, and amount of phase shift can be adjusted conveniently and accurately. The characteristics and stability of this system are verified by experiments. Experimental results show that the finest linewidth can reach 40 μm and that the minimum fringe cycle is 80 μm. Circuit modulation makes the light source system flexibly adjustable, easy to control in real time, and convenient for projecting various fringe patterns. Combined with different light-intensity adjustment algorithms and 3D computation models, high-accuracy 3D topography can be obtained for objects measured in different environments or for objects with different sizes, morphologies, and optical properties. The proposed system shows broad application prospects for fast 3D shape precision measurements, particularly in the industrial field of 3D online detection for precision devices.

  15. GNSS global real-time augmentation positioning: Real-time precise satellite clock estimation, prototype system construction and performance analysis

    Science.gov (United States)

    Chen, Liang; Zhao, Qile; Hu, Zhigang; Jiang, Xinyuan; Geng, Changjiang; Ge, Maorong; Shi, Chuang

    2018-01-01

    The large number of ambiguities in the un-differenced (UD) model leads to lower computational efficiency, which is not appropriate for high-frequency real-time GNSS clock estimation, such as 1 Hz. A mixed differenced model fusing UD pseudo-range and epoch-differenced (ED) phase observations has been introduced into real-time clock estimation. In this contribution, we extend the mixed differenced model to realize multi-GNSS real-time high-frequency clock updating, and a rigorous comparison and analysis under the same conditions is performed to achieve the best real-time clock estimation performance, taking efficiency, accuracy, consistency and reliability into consideration. Based on the multi-GNSS real-time data streams provided by the multi-GNSS Experiment (MGEX) and Wuhan University, a GPS + BeiDou + Galileo global real-time augmentation positioning prototype system is designed and constructed, including real-time precise orbit determination, real-time precise clock estimation, real-time Precise Point Positioning (RT-PPP) and real-time Standard Point Positioning (RT-SPP). The statistical analysis of the 6 h-predicted real-time orbits shows that the root mean square (RMS) in the radial direction is about 1-5 cm for GPS, BeiDou MEO and Galileo satellites, and about 10 cm for BeiDou GEO and IGSO satellites. Using the mixed differenced estimation model, the prototype system can realize highly efficient real-time satellite absolute clock estimation with no constant clock bias and can be used for high-frequency augmentation message updating (such as 1 Hz). The real-time augmentation message signal-in-space ranging error (SISRE), a comprehensive accuracy measure of orbit and clock that affects the users' actual positioning performance, is introduced to evaluate and analyze the performance of the GPS + BeiDou + Galileo global real-time augmentation positioning system. The statistical analysis of the real-time augmentation message SISRE is about 4-7 cm for GPS, while it is about 10 cm for BeiDou IGSO/MEO and Galileo, and about 30 cm ...
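    The SISRE metric referred to above combines orbit and clock errors into one range-domain figure; a commonly used global-average form is shown below, with radial/along-track/cross-track orbit errors ΔR, ΔA, ΔC and clock error Δt, and constellation-dependent weights (for GPS-like MEO orbits roughly w_R ≈ 0.98 and w_{A,C}² ≈ 1/49). The weight values are standard literature figures, not numbers from this paper.

```latex
\mathrm{SISRE} \;=\; \sqrt{\bigl(w_{R}\,\Delta R \;-\; c\,\Delta t\bigr)^{2}
\;+\; w_{A,C}^{2}\bigl(\Delta A^{2} + \Delta C^{2}\bigr)}
```

    The radial error and the clock error partially cancel for a user at the centre of the visible Earth disk, which is why the radial component is weighted close to one while the along- and cross-track components contribute only weakly.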

  16. High precision ray tracing in cylindrically symmetric electrostatics

    Energy Technology Data Exchange (ETDEWEB)

    Edwards Jr, David, E-mail: dej122842@gmail.com

    2015-11-15

    Highlights: • High-precision ray tracing is formulated using power series techniques. • Ray tracing is possible for fields generated by solutions to Laplace's equation. • Spatial and temporal orders of 4-10 are included. • Precisions of ~10⁻²⁰ have been obtained in test geometries of the hemispherical deflector analyzer. • This solution offers a considerable extension of ray tracing accuracy over the current state of the art. - Abstract: With the recent availability of a high-order FDM solution to the curved boundary value problem, it is now possible to determine potentials in such geometries with considerably greater accuracy than had been available with the FDM method. In order for the algorithms used in the accurate potential calculations to be useful in ray tracing, an integration of those algorithms needs to be placed into the ray trace process itself. The object of this paper is to incorporate these algorithms into a solution of the equations of motion of the ray and, having done this, to demonstrate its efficacy. The algorithm incorporation has been accomplished by using power series techniques, and the solution constructed has been tested by tracing the medial ray through concentric sphere geometries. The testing has indicated that precisions of ray calculations of 10⁻²⁰ are now possible. This solution offers a considerable extension of ray tracing accuracy over the current state of the art.
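    To illustrate the power-series idea in miniature: a trajectory step can be advanced by a truncated Taylor series whose coefficients are generated recursively from the equation of motion, with arbitrary-precision arithmetic (here via mpmath) keeping truncation and rounding far below double precision. This toy uses the 1D oscillator x'' = -x as a stand-in for the electrostatic field equations; it is not the paper's algorithm.

```python
from mpmath import mp, mpf, cos

mp.dps = 30  # work with 30 significant digits

def taylor_step(x, v, h, order=10):
    """Advance x'' = -x by one step of size h via a Taylor series.

    Coefficients follow from differentiating the ODE repeatedly:
    c[k+2] = -c[k] / ((k+1)(k+2)), with c[0] = x, c[1] = v.
    """
    c = [x, v]
    for k in range(order - 1):
        c.append(-c[k] / ((k + 1) * (k + 2)))
    xn = sum(ck * h**k for k, ck in enumerate(c))
    # Velocity from the term-by-term derivative of the series.
    vn = sum(k * ck * h**(k - 1) for k, ck in enumerate(c) if k > 0)
    return xn, vn

x, v, h = mpf(1), mpf(0), mpf(1) / 64
for _ in range(64):
    x, v = taylor_step(x, v, h)
print(x - cos(mpf(1)))  # error well below 10^-20 at order 10, h = 1/64
```

    In the electrostatic case the recursion for the series coefficients is driven by the local power-series expansion of the potential, which is where the high-order FDM solution enters.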

  17. THE PRISM MULTI-OBJECT SURVEY (PRIMUS). II. DATA REDUCTION AND REDSHIFT FITTING

    Energy Technology Data Exchange (ETDEWEB)

    Cool, Richard J. [MMT Observatory, Tucson, AZ 85721 (United States); Moustakas, John [Department of Physics, Siena College, 515 Loudon Rd., Loudonville, NY 12211 (United States); Blanton, Michael R.; Hogg, David W. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States); Burles, Scott M. [D.E. Shaw and Co. L.P, 20400 Stevens Creek Blvd., Suite 850, Cupertino, CA 95014 (United States); Coil, Alison L.; Aird, James; Mendez, Alexander J. [Department of Physics, Center for Astrophysics and Space Sciences, University of California, 9500 Gilman Dr., La Jolla, San Diego, CA 92093 (United States); Eisenstein, Daniel J. [Harvard-Smithsonian Center for Astrophysics, 60 Garden St, MS 20, Cambridge, MA 02138 (United States); Wong, Kenneth C. [Steward Observatory, The University of Arizona, 933 N. Cherry Ave., Tucson, AZ 85721 (United States); Zhu, Guangtun [Center for Astrophysical Sciences, Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Bernstein, Rebecca A. [Department of Astronomy and Astrophysics, UCA/Lick Observatory, University of California, 1156 High Street, Santa Cruz, CA 95064 (United States); Bolton, Adam S. [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States)

    2013-04-20

    The PRIsm MUlti-object Survey (PRIMUS) is a spectroscopic galaxy redshift survey to z ~ 1 completed with a low-dispersion prism and slitmasks allowing for simultaneous observations of ~2500 objects over 0.18 deg². The final PRIMUS catalog includes ~130,000 robust redshifts over 9.1 deg². In this paper, we summarize the PRIMUS observational strategy and present the data reduction details used to measure redshifts, redshift precision, and survey completeness. The survey motivation, observational techniques, fields, target selection, slitmask design, and observations are presented in Coil et al. Comparisons to existing higher-resolution spectroscopic measurements show a typical precision of σ_z/(1+z) = 0.005. PRIMUS, both in area and number of redshifts, is the largest faint galaxy redshift survey completed to date and is allowing for precise measurements of the relationship between active galactic nuclei and their hosts, the effects of environment on galaxy evolution, and the build-up of galactic systems over the latter half of cosmic history.

  18. Object-Based Dense Matching Method for Maintaining Structure Characteristics of Linear Buildings.

    Science.gov (United States)

    Su, Nan; Yan, Yiming; Qiu, Mingjie; Zhao, Chunhui; Wang, Liguo

    2018-03-29

    In this paper, we propose a novel object-based dense matching method designed specifically for high-precision disparity maps of building objects in urban areas, which can maintain accurate object structure characteristics. The proposed framework mainly includes three stages. Firstly, an improved edge line extraction method is proposed so that the edge segments fit closely to building outlines. Secondly, a fusion method is proposed for the outlines under the constraint of straight lines, which can maintain the building structural attribute of parallel or vertical edges; this is very useful for the dense matching method. Finally, we propose an edge constraint and outline compensation (ECAOC) dense matching method to maintain building object structural characteristics in the disparity map. In the proposed method, the improved edge lines are used to optimize the matching search scope and matching template window, and the high-precision building outlines are used to compensate the shape features of building objects. Our method can greatly increase the matching accuracy of building objects in urban areas, especially at building edges. For the outline extraction experiments, our fusion method verifies its superiority and robustness on panchromatic images from different satellites with different resolutions. For the dense matching experiments, our ECAOC method shows great advantages in matching accuracy for building objects in urban areas compared with three other methods.

  20. Object-oriented Method of Hierarchical Urban Building Extraction from High-resolution Remote-Sensing Imagery

    Directory of Open Access Journals (Sweden)

    TAO Chao

    2016-02-01

    An automatic urban building extraction method for high-resolution remote-sensing imagery, which combines building segmentation based on neighbor total variations with object-oriented analysis, is presented in this paper. Aimed at the differing extraction complexity of various buildings in the segmented image, a hierarchical building extraction strategy with multi-feature fusion is adopted. Firstly, we extract rectangular buildings which remain intact after segmentation through shape analysis. Secondly, in order to ensure that each candidate building target is independent, a multidirectional morphological road-filtering algorithm is designed to separate buildings from neighboring roads with similar spectra. Finally, we take the extracted buildings and the excluded non-buildings as samples to establish probability models, and a Bayesian discriminating classifier is used to judge the remaining candidate building objects and obtain the final extraction result. The experimental results show that the approach is able to detect buildings with different structural and spectral features in the same image. The results of the performance evaluation also support the robustness and precision of the approach.

  1. Laser precision microfabrication

    CERN Document Server

    Sugioka, Koji; Pique, Alberto

    2010-01-01

    Miniaturization and high precision are rapidly becoming requirements for many industrial processes and products. As a result, there is growing interest in the use of laser microfabrication technology to achieve these goals. This book, composed of 16 chapters, covers all the topics of laser precision processing, from fundamental aspects to industrial applications, for both inorganic and biological materials. It reviews the state of the art of research and technological development in the area of laser processing.

  2. Precision Measurements of $H \\rightarrow b\\bar{b}$ Coupling from $e^{+}e^{-}$ Collisions

    CERN Document Server

    Lai, Laura

    2015-01-01

    The proposed 80 to 100 km Future Circular Collider will be an ideal setup for studying the properties of the newly discovered Higgs boson with much greater precision. Physics with $e^{+}e^{-}$ collisions is accessible at a center-of-mass energy of 90 to 350 GeV and high luminosity. The lepton collider, FCC-ee, may be used as an intermediate stage before pp collisions to study decay channels of the Higgs at a center-of-mass energy of 240 GeV. The objective of this project is to estimate the uncertainty on the $H \rightarrow b\bar{b}$ coupling via event generation and simulation. Pythia8 was utilized for event generation, Delphes for CMS detector simulation, and ROOT for data analysis. After improving the b-tagging efficiency formula in the CMS detector simulation, a C++/ROOT analysis was performed on the simulated data. Through this method, the uncertainty on the $H \rightarrow b\bar{b}$ coupling was measured to be 6\% for an integrated luminosity of $500 fb^{-1}$ in the $Z \rightarrow \mu^{+}\mu^{-}$, $H \rightarrow b\bar{b}$ decay channel.

  3. Analysis of 14C and 13C in teeth provides precise birth dating and clues to geographical origin.

    Science.gov (United States)

    Alkass, K; Buchholz, B A; Druid, H; Spalding, K L

    2011-06-15

    The identification of human bodies in situations when there are no clues as to the person's identity from circumstantial data poses a difficult problem for investigators. The determination of the age and sex of the body can be crucial in order to limit the search to individuals that are a possible match. We analyzed the proportion of bomb-pulse-derived carbon-14 ((14)C) incorporated in the enamel of teeth from individuals from different geographical locations. The 'bomb pulse' refers to a significant increase in (14)C levels in the atmosphere caused by above-ground test detonations of nuclear weapons during the cold war (1955-1963). By comparing (14)C levels in enamel with (14)C atmospheric levels systematically recorded over time, high-precision birth dating of modern biological material is possible. Above-ground nuclear bomb testing was largely restricted to a couple of locations in the northern hemisphere, producing differences in atmospheric (14)C levels at various geographical regions, particularly in the early phase. Therefore, we examined the precision of (14)C birth dating of enamel as a function of time of formation and geographical location. We also investigated the use of the stable isotope (13)C as an indicator of the geographical origin of an individual. Dental enamel was isolated from 95 teeth extracted from 84 individuals to study the precision of the (14)C method along the bomb spike. For teeth formed before 1955 (N=17), all but one tooth showed negative Δ(14)C values. Analysis of enamel from teeth formed during the rising part of the bomb spike (1955-1963, N=12) and after the peak (>1963, N=66) resulted in an average absolute date-of-birth estimation error of 1.9±1.4 and 1.3±1.0 years, respectively. The geographical location of an individual had no adverse effect on the precision of year-of-birth estimation using radiocarbon dating. In 46 teeth, measurement of (13)C was also performed. Scandinavian teeth showed a substantially greater depression in ...
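    A sketch of the dating step described: a measured enamel Δ(14)C value is matched against the atmospheric bomb-pulse record, restricted here to the falling side of the curve (post-1963) so the inversion is single-valued. The curve values below are coarse illustrative points, not a real calibration dataset.

```python
import numpy as np

# Coarse, illustrative atmospheric D14C points (per mil), falling limb.
years = np.array([1965, 1970, 1975, 1980, 1985, 1990,
                  1995, 2000, 2005, 2010], dtype=float)
d14c = np.array([700, 550, 380, 280, 210, 155,
                 115, 85, 60, 40], dtype=float)

def enamel_formation_year(sample_d14c):
    """Invert the (monotonically falling) curve by interpolation."""
    # np.interp needs ascending x, so interpolate on reversed arrays.
    return float(np.interp(sample_d14c, d14c[::-1], years[::-1]))

# Enamel laid down when atmospheric D14C was ~200 per mil:
t_formation = enamel_formation_year(200.0)
# Date of birth then follows by subtracting the tooth-specific enamel
# formation age (e.g., ~3 years for a first molar, an assumed value).
print(t_formation - 3.0)
```

    On the rising limb (1955-1963) the same Δ(14)C value maps to two possible years, which is one reason the paper reports lower precision for teeth formed during that interval.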

  4. Objective Audio Quality Assessment Based on Spectro-Temporal Modulation Analysis

    OpenAIRE

    Guo, Ziyuan

    2011-01-01

    Objective audio quality assessment is an interdisciplinary research area that incorporates audiology and machine learning. Although much work has been done on the machine learning aspect, the audiology aspect also deserves investigation. This thesis proposes a non-intrusive audio quality assessment algorithm, which is based on an auditory model that simulates the human auditory system. The auditory model is based on spectro-temporal modulation analysis of the spectrogram, which has been proven to be ...

  5. The landscape of precision cancer medicine clinical trials in the United States.

    Science.gov (United States)

    Roper, Nitin; Stensland, Kristian D; Hendricks, Ryan; Galsky, Matthew D

    2015-05-01

    Advances in tumor biology and multiplex genomic analysis have ushered in the era of precision cancer medicine. Little is currently known, however, about the landscape of prospective "precision cancer medicine" clinical trials in the U.S. We identified all adult interventional cancer trials registered on ClinicalTrials.gov between September 2005 and May 2013. Trials were classified as "precision cancer medicine" if a genomic alteration in a predefined set of 88 genes was required for enrollment. Baseline characteristics were ascertained for each trial. Of the initial 18,797 trials identified, 9,094 (48%) were eligible for inclusion: 684 (8%) were classified as precision cancer medicine trials and 8,410 (92%) were non-precision cancer medicine trials. Compared with non-precision cancer medicine trials, precision cancer medicine trials were significantly more likely to be phase II [RR 1.19 (1.10-1.29)]. Precision medicine trials required 38 unique genomic alterations for enrollment. The proportion of precision cancer medicine trials compared to the total number of trials increased from 3% in 2006 to 16% in 2013. The proportion of adult cancer clinical trials in the U.S. requiring a genomic alteration for enrollment has increased substantially over the past several years. However, such trials still represent a small minority of studies performed within the cancer clinical trials enterprise and include a small subset of putatively "actionable" alterations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. Multi-object segmentation framework using deformable models for medical imaging analysis.

    Science.gov (United States)

    Namías, Rafael; D'Amato, Juan Pablo; Del Fresno, Mariana; Vénere, Marcelo; Pirró, Nicola; Bellemare, Marc-Emmanuel

    2016-08-01

    Segmenting structures of interest in medical images is an important step in different tasks such as visualization, quantitative analysis, simulation, and image-guided surgery, among several other clinical applications. Numerous segmentation methods have been developed in the past three decades for the extraction of anatomical or functional structures in medical imaging. Deformable models, which include the active contour models or snakes, are among the most popular methods for image segmentation, combining several desirable features such as inherent connectivity and smoothness. Even though different approaches have been proposed and significant work has been dedicated to the improvement of such algorithms, there are still challenging research directions, such as the simultaneous extraction of multiple objects and the integration of individual techniques. This paper presents a novel open-source framework called deformable model array (DMA) for the segmentation of multiple and complex structures of interest in different imaging modalities. While most active contour algorithms can extract one region at a time, DMA allows integrating several deformable models to deal with multiple segmentation scenarios. Moreover, it is possible to consider any existing explicit deformable model formulation and even to incorporate new active contour methods, allowing a suitable combination to be selected for different conditions. The framework also introduces a control module that coordinates the cooperative evolution of the snakes and is able to solve interaction issues toward the segmentation goal. Thus, DMA can implement complex object and multi-object segmentations in both 2D and 3D using the contextual information derived from the model interaction. These are important features for several medical image analysis tasks in which different but related objects need to be simultaneously extracted. Experimental results on both computed tomography and magnetic resonance imaging show that the proposed ...

  7. About the problems and perspectives of making precision compressor blades

    Directory of Open Access Journals (Sweden)

    V. E. Galiev

    2014-01-01

    Full Text Available The problems of manufacturing blades with high-precision profile geometry are considered in the article. The variant of the technology under development rules out the use of mechanical processing methods for the blade airfoil. The article consists of an introduction and six small sections. The introduction sets out the requirements for modern aircraft engines, lists the problems arising in the process of their manufacture, and notes the relevance of the work. The first section analyzes the existing technology for precision blades, with an illustration reflecting the stages of the process and noting their advantages and disadvantages. The second section provides an illustration showing the basing schemes used for blades in the manufacturing process and a model of the workpiece for the technology being developed; an analysis of each basing scheme is presented. The third section lists the existing methods for controlling the geometrical parameters of the blade airfoil and presents the measurement error data of the devices, with special attention paid to the impossibility of controlling the accuracy of the geometrical parameters of precision blades. The fourth section presents the advantages of the electrochemical machining method, with consistent vibration of the tool-electrode and pulsed process current, over the traditional method. It gives the accuracy and surface roughness of the blade airfoil achieved owing to precision electrochemical machining, and illustrates machines that implement the given method of processing and components manufactured on them. The fifth section describes the steps of the developed process with justification for the use of the proposed operations. Based on the analysis, the author argues that applying the proposed process to manufacture precision compressor blades ensures the production of items that meet the requirements of the drawing.

  8. In pursuit of precision: the calibration of minds and machines in late nineteenth-century psychology.

    Science.gov (United States)

    Benschop, R; Draaisma, D

    2000-01-01

    A prominent feature of late nineteenth-century psychology was its intense preoccupation with precision. Precision was at once an ideal and an argument: the quest for precision helped psychology to establish its status as a mature science, sharing a characteristic concern with the natural sciences. We will analyse how psychologists set out to produce precision in 'mental chronometry', the measurement of the duration of psychological processes. In his Leipzig laboratory, Wundt inaugurated an elaborate research programme on mental chronometry. We will look at the problem of calibration of experimental apparatus and will describe the intricate material, literary, and social technologies involved in the manufacture of precision. First, we shall discuss some of the technical problems involved in the measurement of ever shorter time-spans. Next, the Cattell-Berger experiments will help us to argue against the received view that all the precision went into the hardware, and practically none into the social organization of experimentation. Experimenters made deliberate efforts to bring themselves and their subjects under a regime of control and calibration similar to that which reigned over the experimental machinery. In Leipzig psychology, the particular blend of material and social technology resulted in a specific object of study: the generalized mind. We will then show that the distribution of precision in experimental psychology outside Leipzig demanded a concerted effort of instruments, texts, and people. It will appear that the forceful attempts to produce precision and uniformity had some rather paradoxical consequences.

  9. The Making of Paranormal Belief: History, Discourse Analysis and the Object of Belief

    OpenAIRE

    White, Lewis

    2013-01-01

    The present study comprises a discursive analysis of a cognitive phenomenon, paranormal beliefs. A discursive psychological approach to belief highlights that an important component of the cognitivist work has been how the object of paranormal belief has been defined in formal study. Using discourse analysis, as developed as a method in the history of psychology, this problem is explored through analysis of published scales. The findings highlight three rhetorical themes that are deployed in ...

  10. Mechanism and experimental research on ultra-precision grinding of ferrite

    Science.gov (United States)

    Ban, Xinxing; Zhao, Huiying; Dong, Longchao; Zhu, Xueliang; Zhang, Chupeng; Gu, Yawen

    2017-02-01

    Ultra-precision grinding of ferrite is conducted to investigate the removal mechanism. The effect of the accuracy of key machine tool components on grinding surface quality is analyzed, and a surface generation model for ultra-precision grinding of ferrite is established. Furthermore, to reveal the surface formation mechanism of ferrite in the process of ultra-precision grinding, the validity and accuracy of the proposed model for predicting grinding surface roughness are verified. An orthogonal experiment is designed for ferrite, a typical hard and brittle material, using a high-precision aerostatic turntable and aerostatic spindle. Based on the experimental results, the factors influencing the ultra-precision ground surface of ferrite and their governing laws are discussed through analysis of the surface roughness. The results show that grinding surface quality is optimal at a wheel speed of 20000 r/min, a feed rate of 10 mm/min, a grinding depth of 0.005 mm, and a turntable rotary speed of 5 r/min, where the surface roughness Ra reaches 75 nm.

  11. Correlated cryo-fluorescence and cryo-electron microscopy with high spatial precision and improved sensitivity

    International Nuclear Information System (INIS)

    Schorb, Martin; Briggs, John A.G.

    2014-01-01

    Performing fluorescence microscopy and electron microscopy on the same sample allows fluorescent signals to be used to identify and locate features of interest for subsequent imaging by electron microscopy. To carry out such correlative microscopy on vitrified samples appropriate for structural cryo-electron microscopy it is necessary to perform fluorescence microscopy at liquid-nitrogen temperatures. Here we describe an adaptation of a cryo-light microscopy stage to permit use of high-numerical aperture objectives. This allows high-sensitivity and high-resolution fluorescence microscopy of vitrified samples. We describe and apply a correlative cryo-fluorescence and cryo-electron microscopy workflow together with a fiducial bead-based image correlation procedure. This procedure allows us to locate fluorescent bacteriophages in cryo-electron microscopy images with an accuracy on the order of 50 nm, based on their fluorescent signal. It will allow the user to precisely and unambiguously identify and locate objects and events for subsequent high-resolution structural study, based on fluorescent signals. - Highlights: • Workflow for correlated cryo-fluorescence and cryo-electron microscopy. • Cryo-fluorescence microscopy setup incorporating a high numerical aperture objective. • Fluorescent signals located in cryo-electron micrographs with 50 nm spatial precision

  12. Correlated cryo-fluorescence and cryo-electron microscopy with high spatial precision and improved sensitivity

    Energy Technology Data Exchange (ETDEWEB)

    Schorb, Martin [Structural and Computational Biology Unit, European Molecular Biology Laboratory, D-69117 Heidelberg (Germany); Briggs, John A.G., E-mail: john.briggs@embl.de [Structural and Computational Biology Unit, European Molecular Biology Laboratory, D-69117 Heidelberg (Germany); Cell Biology and Biophysics Unit, European Molecular Biology Laboratory, D-69117 Heidelberg (Germany)

    2014-08-01

    Performing fluorescence microscopy and electron microscopy on the same sample allows fluorescent signals to be used to identify and locate features of interest for subsequent imaging by electron microscopy. To carry out such correlative microscopy on vitrified samples appropriate for structural cryo-electron microscopy it is necessary to perform fluorescence microscopy at liquid-nitrogen temperatures. Here we describe an adaptation of a cryo-light microscopy stage to permit use of high-numerical aperture objectives. This allows high-sensitivity and high-resolution fluorescence microscopy of vitrified samples. We describe and apply a correlative cryo-fluorescence and cryo-electron microscopy workflow together with a fiducial bead-based image correlation procedure. This procedure allows us to locate fluorescent bacteriophages in cryo-electron microscopy images with an accuracy on the order of 50 nm, based on their fluorescent signal. It will allow the user to precisely and unambiguously identify and locate objects and events for subsequent high-resolution structural study, based on fluorescent signals. - Highlights: • Workflow for correlated cryo-fluorescence and cryo-electron microscopy. • Cryo-fluorescence microscopy setup incorporating a high numerical aperture objective. • Fluorescent signals located in cryo-electron micrographs with 50 nm spatial precision.

  13. Ageing influence for the evaluation of DXA precision in female subjects

    International Nuclear Information System (INIS)

    Lin Qiang; Yu Wei; Qin Mingwei; Shang Wei; Tian Junping; Han Shaomei

    2006-01-01

    Objective: To investigate whether age influences the precision of DXA measurement at the lumbar spine in females. Methods: A total of 90 female subjects were recruited and divided into three age groups, i.e. 45-55 years, 56-65 years and 66-75 years, with 30 subjects in each group. Each subject was scanned twice on the same day. Mean BMD values from L2 to L4 were collected, and precision errors were expressed as the root mean square (RMS). Results: Mean BMD values from L2 to L4 for the three age groups were (...) g/cm2, (0.992±0.010) g/cm2 and (0.910±0.010) g/cm2, respectively, decreasing with increasing age group. The RMS was lower in the 45-55 age group and was the same in the 56-65 and 66-75 age groups. There were significant differences in the BMD standard deviation between the groups (F=5.213, P<0.05; q values: I vs II 0.035, II vs III 0.500, I vs III 0.035). Conclusion: Age can influence the precision of DXA measurement at the lumbar spine in females. Therefore, attention should be paid to the age of female subjects recruited for the evaluation of the precision of DXA measurement in clinical trials. (authors)
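
    As a concrete illustration of the precision metric used above, the short Python sketch below computes the RMS precision error from duplicate same-day scans; the BMD numbers are made up for the example and do not come from the study.

        import numpy as np

        # Hypothetical duplicate BMD scans (g/cm2): one row per subject,
        # columns are the two same-day measurements.
        scans = np.array([[0.995, 0.989],
                          [0.912, 0.918],
                          [1.044, 1.039]])

        # For duplicate measurements the within-subject standard deviation
        # is |difference| / sqrt(2); the group precision error is its RMS.
        within_sd = np.abs(scans[:, 0] - scans[:, 1]) / np.sqrt(2)
        rms_error = np.sqrt(np.mean(within_sd ** 2))

        # Often also expressed as a coefficient of variation (%CV)
        cv_percent = 100 * rms_error / scans.mean()
        print(rms_error, cv_percent)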

  14. Data evaluation and CNGS beam localization with the precision tracker of the OPERA detector

    International Nuclear Information System (INIS)

    Bick, D.

    2007-04-01

    In this diploma thesis, the data evaluation for the OPERA precision tracker is presented. Furthermore, investigations of a precise CNGS beam localization with the precision tracker are performed. After an overview of past and present developments in neutrino physics, the OPERA detector is presented, with emphasis on the precision tracker, which was partly commissioned at the end of last year. A first analysis of its functionality with cosmic muons has been performed, as well as the inclusion of data in the OPERA software framework. Some useful tools developed within this thesis are also presented. Finally, divergence effects from the nominal beam line of the CNGS neutrino beam and their possible detection with the precision tracker are studied. (orig.)

  15. Data evaluation and CNGS beam localization with the precision tracker of the OPERA detector

    Energy Technology Data Exchange (ETDEWEB)

    Bick, D.

    2007-04-15

    In this diploma thesis, the data evaluation for the OPERA precision tracker is presented. Furthermore, investigations of a precise CNGS beam localization with the precision tracker are performed. After an overview of past and present developments in neutrino physics, the OPERA detector is presented, with emphasis on the precision tracker, which was partly commissioned at the end of last year. A first analysis of its functionality with cosmic muons has been performed, as well as the inclusion of data in the OPERA software framework. Some useful tools developed within this thesis are also presented. Finally, divergence effects from the nominal beam line of the CNGS neutrino beam and their possible detection with the precision tracker are studied. (orig.)

  16. Precision requirements for single-layer feed-forward neural networks

    NARCIS (Netherlands)

    Annema, Anne J.; Hoen, K.; Hoen, Klaas; Wallinga, Hans

    1994-01-01

    This paper presents a mathematical analysis of the effect of limited-precision analog hardware for weight adaptation to be used in on-chip learning feedforward neural networks. Easy-to-read equations and simple worst-case estimations for the maximum tolerable imprecision are presented. As an ...

  17. Decoupling of the leading contribution in the discrete BFKL analysis of high-precision HERA data

    Energy Technology Data Exchange (ETDEWEB)

    Kowalski, H. [Deutsches Elektronen-Synchrotron DESY, Hamburg (Germany); Lipatov, L.N. [St. Petersburg State University, St. Petersburg (Russian Federation); Petersburg Nuclear Physics Institute, Gatchina (Russian Federation); Ross, D.A. [University of Southampton, School of Physics and Astronomy, Southampton (United Kingdom); Schulz, O. [Max Planck Institute for Physics, Munich (Germany)

    2017-11-15

    We analyse, in NLO, the physical properties of the discrete eigenvalue solution for the BFKL equation. We show that a set of eigenfunctions with positive eigenvalues, ω, together with a small contribution from a continuum of eigenfunctions with negative ω, provide an excellent description of high-precision HERA F_2 data in the region x < 0.001, Q^2 > 6 GeV^2. The phases of the eigenfunctions can be obtained from a simple parametrisation of the pomeron spectrum, which has a natural motivation within BFKL. The data analysis shows that the first eigenfunction decouples completely or almost completely from the proton. This suggests that there exists an additional ground state, which is naturally saturated and may have the properties of the soft pomeron. (orig.)

  18. Knock-Outs, Stick-Outs, Cut-Outs: Clipping Paths Separate Objects from Background.

    Science.gov (United States)

    Wilson, Bradley

    1998-01-01

    Outlines a six-step process that allows computer operators, using Photoshop software, to create "knock-outs" to precisely define the path that will serve to separate the object from the background. (SR)

  19. Precise Orbit Determination of GPS Satellites Using Phase Observables

    Directory of Open Access Journals (Sweden)

    Myung-Kook Jee

    1997-12-01

    Full Text Available The accuracy of user position by GPS is heavily dependent upon the accuracy of the satellite positions that are transmitted to GPS users in radio signals. The real-time satellite position information directly obtained from broadcast ephemerides has an accuracy of 3 x 10 meters, which is insufficient for measuring a 100 km baseline to an accuracy of a few millimeters. At present there are seven orbit analysis centers worldwide capable of generating precise GPS ephemerides, with an orbit quality of the order of 10 cm. Therefore, a precise orbit model and phase processing technique were reviewed, and precise GPS ephemerides were produced by processing the phase observables of 28 global GPS stations for one day. The initial 6 orbit parameters and 2 solar radiation coefficients were estimated using a batch least-squares algorithm, and the final results were compared with the orbit of the IGS, the International GPS Service for Geodynamics.
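
    The batch least-squares step mentioned above can be illustrated with a toy Gauss-Newton fit: all observations are processed in one batch and the parameter vector is updated through the normal equations. The model function below is a simple stand-in, not the orbital dynamics actually integrated in precise orbit determination.

        import numpy as np

        def model(t, p):
            a, b = p
            return a * np.sin(0.1 * t) + b * t     # stand-in for the orbit model

        def jacobian(t, p):
            return np.column_stack([np.sin(0.1 * t), t])

        t_obs = np.linspace(0.0, 100.0, 200)
        rng = np.random.default_rng(0)
        y_obs = model(t_obs, [2.0, 0.05]) + 0.01 * rng.normal(size=t_obs.size)

        p = np.array([1.0, 0.0])                    # initial parameter guess
        for _ in range(5):                          # Gauss-Newton iterations
            r = y_obs - model(t_obs, p)             # residuals for the whole batch
            J = jacobian(t_obs, p)
            p = p + np.linalg.solve(J.T @ J, J.T @ r)   # normal equations update
        print(p)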

  20. Using Bayesian Population Viability Analysis to Define Relevant Conservation Objectives.

    Directory of Open Access Journals (Sweden)

    Adam W Green

    Full Text Available Adaptive management provides a useful framework for managing natural resources in the face of uncertainty. An important component of adaptive management is identifying clear, measurable conservation objectives that reflect the desired outcomes of stakeholders. A common objective is to have a sustainable population, or metapopulation, but it can be difficult to quantify a threshold above which such a population is likely to persist. We performed a Bayesian metapopulation viability analysis (BMPVA) using a dynamic occupancy model to quantify the characteristics of two wood frog (Lithobates sylvatica) metapopulations resulting in sustainable populations, and we demonstrate how the results could be used to define meaningful objectives that serve as the basis of adaptive management. We explored scenarios involving metapopulations with different numbers of patches (pools), using estimates of breeding occurrence and successful metamorphosis from two study areas to estimate the probability of quasi-extinction and calculate the proportion of vernal pools producing metamorphs. Our results suggest that ≥50 pools are required to ensure long-term persistence, with approximately 16% of pools producing metamorphs in stable metapopulations. We demonstrate one way to incorporate the BMPVA results into a utility function that balances the trade-offs between ecological and financial objectives, which can be used in an adaptive management framework to make optimal, transparent decisions. Our approach provides a framework for using a standard method (i.e., PVA) and available information to inform a formal decision process to determine optimal and timely management policies.
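
    To make the quasi-extinction notion concrete, the following Python sketch estimates it by Monte Carlo for metapopulations of different sizes. It is a deliberately crude stand-in: pools produce metamorphs independently with a fixed probability, whereas the study used a dynamic occupancy model with estimated parameters.

        import numpy as np

        rng = np.random.default_rng(1)

        def quasi_extinction_prob(n_pools, p_produce=0.16, years=50, n_sims=2000):
            """Probability that metamorph production drops to zero in at
            least one year of the simulated horizon."""
            extinct = 0
            for _ in range(n_sims):
                produced = rng.random((years, n_pools)) < p_produce
                if (produced.sum(axis=1) == 0).any():
                    extinct += 1
            return extinct / n_sims

        for n in (10, 25, 50, 100):
            print(n, quasi_extinction_prob(n))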

  1. Airport object extraction based on visual attention mechanism and parallel line detection

    Science.gov (United States)

    Lv, Jing; Lv, Wen; Zhang, Libao

    2017-10-01

    Target extraction is one of the important aspects of remote sensing image analysis and processing, with wide applications in image compression, target tracking, target recognition and change detection. Among different targets, airports have attracted more and more attention due to their military and civilian significance. In this paper, we propose a novel and reliable airport object extraction model combining a visual attention mechanism and a parallel line detection algorithm. First, a novel saliency analysis model for remote sensing images containing airport regions is proposed to perform statistical saliency feature analysis. The proposed model can precisely extract the most salient region and effectively suppress background interference. Then, prior geometric knowledge is analyzed, and airport runways, which contain two parallel lines of similar length, are detected efficiently. Finally, we use the improved Otsu threshold segmentation method to segment and extract the airport regions from the saliency map of remote sensing images. The experimental results demonstrate that the proposed model outperforms existing saliency analysis models and shows good performance in airport detection.
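
    The final segmentation step relies on Otsu thresholding of the saliency map. The sketch below shows a plain NumPy version of the classic (unimproved) Otsu criterion applied to a stand-in saliency map; the paper's improved variant is not reproduced here.

        import numpy as np

        def otsu_threshold(values, n_bins=256):
            """Threshold maximizing the between-class variance of the
            grey-level histogram (classic Otsu)."""
            hist, edges = np.histogram(values, bins=n_bins)
            p = hist / hist.sum()
            omega = np.cumsum(p)                    # class-0 probability
            mu = np.cumsum(p * np.arange(n_bins))   # cumulative mean
            with np.errstate(divide="ignore", invalid="ignore"):
                sigma_b = (mu[-1] * omega - mu) ** 2 / (omega * (1.0 - omega))
            return edges[np.nanargmax(sigma_b) + 1]

        saliency = np.random.default_rng(0).random((64, 64))  # stand-in map
        airport_mask = saliency > otsu_threshold(saliency)
        print(airport_mask.mean())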

  2. Object-Based Image Analysis in Wetland Research: A Review

    Directory of Open Access Journals (Sweden)

    Iryna Dronova

    2015-05-01

    Full Text Available The applications of object-based image analysis (OBIA) in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of local noise, incorporating meaningful non-spectral features for class separation, and accounting for the landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether the unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scales and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications, and the main research needs and directions to expand OBIA capacity in future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales, together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring, will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  3. Precision Medicine and Men's Health.

    Science.gov (United States)

    Mata, Douglas A; Katchi, Farhan M; Ramasamy, Ranjith

    2017-07-01

    Precision medicine can greatly benefit men's health by helping to prevent, diagnose, and treat prostate cancer, benign prostatic hyperplasia, infertility, hypogonadism, and erectile dysfunction. For example, precision medicine can facilitate the selection of men at high risk for prostate cancer for targeted prostate-specific antigen screening and chemoprevention administration, as well as assist in identifying men who are resistant to medical therapy for prostatic hyperplasia, who may instead require surgery. Precision medicine-trained clinicians can also let couples know whether their specific cause of infertility should be bypassed by sperm extraction and in vitro fertilization to prevent abnormalities in their offspring. Though precision medicine's role in the management of hypogonadism has yet to be defined, it could be used to identify biomarkers associated with individual patients' responses to treatment so that appropriate therapy can be prescribed. Last, precision medicine can improve erectile dysfunction treatment by identifying genetic polymorphisms that regulate response to medical therapies and by aiding in the selection of patients for further cardiovascular disease screening.

  4. Evaluating fuzzy operators of an object-based image analysis for detecting landslides and their changes

    Science.gov (United States)

    Feizizadeh, Bakhtiar; Blaschke, Thomas; Tiede, Dirk; Moghaddam, Mohammad Hossein Rezaei

    2017-09-01

    This article presents a method of object-based image analysis (OBIA) for landslide delineation and landslide-related change detection from multi-temporal satellite images. It uses both spatial and spectral information on landslides, through spectral analysis, shape analysis, textural measurements using a gray-level co-occurrence matrix (GLCM), and fuzzy logic membership functionality. Following an initial segmentation step, particular combinations of various information layers were investigated to generate objects. This was achieved by applying multi-resolution segmentation to IRS-1D, SPOT-5, and ALOS satellite imagery in sequential steps of feature selection and object classification, and using slope and flow direction derivatives from a digital elevation model together with topographically-oriented gray-level co-occurrence matrices. Fuzzy membership values were calculated for 11 different membership functions using 20 landslide objects from a landslide training dataset. Six fuzzy operators were used for the final classification and the accuracies of the resulting landslide maps were compared. A Fuzzy Synthetic Evaluation (FSE) approach was adapted for validation of the results and for an accuracy assessment using the landslide inventory database. The FSE approach revealed that the AND operator performed best, with an accuracy of 93.87% for 2005 and 94.74% for 2011, closely followed by the MEAN Arithmetic operator, while the OR and AND (*) operators yielded relatively low accuracies. An object-based change detection was then applied to monitor landslide-related changes that occurred in northern Iran between 2005 and 2011. Knowledge rules to detect possible landslide-related changes were developed by evaluating all possible landslide-related objects for both time steps.
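
    The fuzzy operators compared in the study combine the membership values that one image object obtains under several criteria. A minimal sketch of the standard definitions is given below; the operator names follow common fuzzy-logic usage and the membership values are invented.

        import numpy as np

        # Membership of one image object under four fuzzy criteria
        memberships = np.array([0.9, 0.75, 0.8, 0.6])

        fuzzy_and  = memberships.min()    # AND: the weakest criterion decides
        fuzzy_or   = memberships.max()    # OR: the strongest criterion decides
        fuzzy_mean = memberships.mean()   # MEAN arithmetic
        fuzzy_prod = memberships.prod()   # AND(*): product operator

        print(fuzzy_and, fuzzy_or, fuzzy_mean, fuzzy_prod)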

  5. Precision muonium spectroscopy

    International Nuclear Information System (INIS)

    Jungmann, Klaus P.

    2016-01-01

    The muonium atom is the purely leptonic bound state of a positive muon and an electron. It has a lifetime of 2.2 µs. The absence of any known internal structure provides for precision experiments to test fundamental physics theories and to determine accurate values of fundamental constants. In particular ground state hyperfine structure transitions can be measured by microwave spectroscopy to deliver the muon magnetic moment. The frequency of the 1s–2s transition in the hydrogen-like atom can be determined with laser spectroscopy to obtain the muon mass. With such measurements fundamental physical interactions, in particular quantum electrodynamics, can also be tested at highest precision. The results are important input parameters for experiments on the muon magnetic anomaly. The simplicity of the atom enables further precise experiments, such as a search for muonium–antimuonium conversion for testing charged lepton number conservation and searches for possible antigravity of muons and dark matter. (author)

  6. Very high precision and accuracy analysis of triple isotopic ratios of water. A critical instrumentation comparison study.

    Science.gov (United States)

    Gkinis, Vasileios; Holme, Christian; Morris, Valerie; Thayer, Abigail Grace; Vaughn, Bruce; Kjaer, Helle Astrid; Vallelonga, Paul; Simonsen, Marius; Jensen, Camilla Marie; Svensson, Anders; Maffrezzoli, Niccolo; Vinther, Bo; Dallmayr, Remi

    2017-04-01

    We present a performance comparison study between two state-of-the-art Cavity Ring Down Spectrometers (Picarro L2310-i, L2140-i). The comparison took place during the Continuous Flow Analysis (CFA) campaign for the measurement of the Renland ice core, over a period of three months. Instant and complete vaporisation of the ice core melt stream, as well as of in-house water reference materials, is achieved by accurate control of microflows of liquid into a homemade calibration system, following simple principles of the Hagen-Poiseuille law. Both instruments share the same vaporisation unit in a configuration that minimises sample preparation discrepancies between the two analyses. We describe our SMOW-SLAP calibration and measurement protocols for such a CFA application and present quality control metrics acquired daily during the full period of the campaign. The results indicate an unprecedented performance for all three isotopic ratios (δ2H, δ17O, δ18O) in terms of precision, accuracy and resolution. We also comment on the precision and accuracy of the second-order excess parameters of HD16O and H217O over H218O (Dxs, Δ17O). To our knowledge these are the first reported CFA measurements at this level of precision and accuracy for all three isotopic ratios. Differences in the performance of the two instruments were carefully assessed during the measurement and are reported here. Our quality control protocols extend to the area of low water mixing ratios, a regime in which atmospheric vapour measurements often take place and in which Cavity Ring Down Analysers show poorer performance due to lower signal-to-noise ratios. We address such issues and propose calibration protocols from which water vapour isotopic analyses can benefit.
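
    The SMOW-SLAP protocol mentioned above is, at its core, a two-point linear normalization of measured delta values onto the reference scale. The sketch below shows that mapping with invented standard values; operational protocols add drift and memory corrections not shown here.

        # Two-point normalization using two in-house reference waters whose
        # values on the VSMOW-SLAP scale are known. All numbers are
        # illustrative, not actual reference assignments.
        assigned = {"std_hi": -8.0, "std_lo": -40.0}   # assigned d18O (permil)
        measured = {"std_hi": -7.6, "std_lo": -39.1}   # instrument readings

        slope = (assigned["std_hi"] - assigned["std_lo"]) / \
                (measured["std_hi"] - measured["std_lo"])
        intercept = assigned["std_hi"] - slope * measured["std_hi"]

        def calibrate(delta_measured):
            """Map a measured delta value onto the VSMOW-SLAP scale."""
            return slope * delta_measured + intercept

        print(calibrate(-20.3))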

  7. Constraining supersymmetry with precision data

    International Nuclear Information System (INIS)

    Pierce, D.M.; Erler, J.

    1997-01-01

    We discuss the results of a global fit to precision data in supersymmetric models. We consider both gravity- and gauge-mediated models. As the superpartner spectrum becomes light, the global fit to the data typically results in larger values of χ^2. We indicate the regions of parameter space which are excluded by the data. We discuss the additional effect of the B(B→X_s γ) measurement. Our analysis excludes chargino masses below M_Z in the simplest gauge-mediated model with μ>0, with stronger constraints for larger values of tanβ. copyright 1997 American Institute of Physics

  8. Precision engineering: an evolutionary perspective.

    Science.gov (United States)

    Evans, Chris J

    2012-08-28

    Precision engineering is a relatively new name for a technology with roots going back over a thousand years; those roots span astronomy, metrology, fundamental standards, manufacturing and money-making (literally). Throughout that history, precision engineers have created links across disparate disciplines to generate innovative responses to society's needs and wants. This review combines historical and technological perspectives to illuminate precision engineering's current character and directions. It first provides a working definition of precision engineering and then reviews the subject's roots. Examples are given showing the contributions of the technology to society, while simultaneously showing the creative tension between the technological convergence that spurs new directions and the vertical disintegration that optimizes manufacturing economics.

  9. High Precision Fast Projective Synchronization for Chaotic Systems with Unknown Parameters

    Science.gov (United States)

    Nian, Fuzhong; Wang, Xingyuan; Lin, Da; Niu, Yujun

    2013-08-01

    A high-precision fast projective synchronization method for chaotic systems with unknown parameters is proposed by introducing an optimal matrix. Numerical simulations indicate that the precision is improved by about three orders of magnitude compared with other common methods under the same software and hardware conditions. Moreover, when the average error is less than 10^-3, the synchronization is about 6500 times faster than common methods, requiring only 4 iterations. The unknown parameters are also identified rapidly. Theoretical analysis and proofs are also given.

  10. Toward precision medicine in Alzheimer's disease.

    Science.gov (United States)

    Reitz, Christiane

    2016-03-01

    In Western societies, Alzheimer's disease (AD) is the most common form of dementia and the sixth leading cause of death. In recent years, the concept of precision medicine, an approach for disease prevention and treatment that is personalized to an individual's specific pattern of genetic variability, environment and lifestyle factors, has emerged. While for some diseases, in particular select cancers and a few monogenetic disorders such as cystic fibrosis, significant advances in precision medicine have been made over the past years, for most other diseases precision medicine is only in its beginning. To advance the application of precision medicine to a wider spectrum of disorders, governments around the world are starting to launch Precision Medicine Initiatives, major efforts to generate the extensive scientific knowledge needed to integrate the model of precision medicine into every day clinical practice. In this article we summarize the state of precision medicine in AD, review major obstacles in its development, and discuss its benefits in this highly prevalent, clinically and pathologically complex disease.

  11. FROM PERSONALIZED TO PRECISION MEDICINE

    Directory of Open Access Journals (Sweden)

    K. V. Raskina

    2017-01-01

    Full Text Available The need to maintain a high quality of life against a backdrop of its inevitably increasing duration is one of the main problems of modern health care. The concept of "the right drug for the right patient at the right time", which initially bore the name "personalized medicine", is currently unanimously endorsed by the international scientific community as "precision medicine". Precision medicine takes all individual characteristics into account: genetic diversity, environment, lifestyle, and even bacterial microflora, and also involves the use of the latest technological developments to ensure that each patient receives the assistance best fitting his or her condition. In the United States, Canada and France, national precision medicine programs have already been presented and implemented. The aim of this review is to describe the dynamic integration of precision medicine methods into routine medical practice and the life of modern society. The description of the new paradigm's prospects is complemented by figures proving the success already achieved in the application of precision methods, for example the targeted therapy of cancer. All in all, the existence of real-life examples proving the regularity of the transition to a new paradigm, together with the wide and constantly evolving range of available technical and diagnostic capabilities, makes an all-round transition to precision medicine almost inevitable.

  12. The multi-filter rotating shadowband radiometer (MFRSR) - precision infrared radiometer (PIR) platform in Fairbanks: Scientific objectives

    Energy Technology Data Exchange (ETDEWEB)

    Stamnes, K.; Leontieva, E. [Univ. of Alaska, Fairbanks (United States)

    1996-04-01

    The multi-filter rotating shadowband radiometer (MFRSR) and precision infrared radiometer (PIR) have been employed at the Geophysical Institute in Fairbanks to check their performance under arctic conditions. Drawing on the experience of previous measurements in the Arctic, the PIR was equipped with a ventilator to prevent frost and moisture build-up. We adopted the Solar Infrared Observing System (SIROS) concept from the Southern Great Plains Cloud and Radiation Testbed (CART) to allow implementation of the same data processing software for a set of radiation and meteorological instruments. To validate the level of performance of the whole SIROS prior to its incorporation into the North Slope of Alaska (NSA) Cloud and Radiation Testbed Site instrument suite for radiation flux measurements, a comparison between measurements and model predictions will be undertaken to assess the quality of the MFRSR-PIR Arctic data.

  13. The value of precision for image-based decision support in weed management

    DEFF Research Database (Denmark)

    Franco de los Ríos, Camilo; Pedersen, Søren Marcus; Papaharalampos, Haris

    2017-01-01

    Decision support methodologies in precision agriculture should integrate the different dimensions composing the added complexity of operational decision problems. Special attention has to be given to adequate knowledge extraction techniques for making sense of the collected data, processing the information to assist decision makers and farmers in the efficient and sustainable management of the field. Focusing on weed management, the integration of operational aspects of weed spraying is an open challenge for modeling the farmers' decision problem and identifying satisfactory solutions for the implementation of automatic weed recognition procedures. The objective of this paper is to develop a decision support methodology for detecting undesired weeds in aerial images, building an image-based viewpoint consisting of relevant operational knowledge for applying precision spraying. In this way...

  14. The precision problem in conservation and restoration

    Science.gov (United States)

    Hiers, J. Kevin; Jackson, Stephen T.; Hobbs, Richard J.; Bernhardt, Emily S.; Valentine, Leonie E.

    2016-01-01

    Within the varied contexts of environmental policy, conservation of imperilled species populations, and restoration of damaged habitats, an emphasis on idealized optimal conditions has led to increasingly specific targets for management. Overly-precise conservation targets can reduce habitat variability at multiple scales, with unintended consequences for future ecological resilience. We describe this dilemma in the context of endangered species management, stream restoration, and climate-change adaptation. Inappropriate application of conservation targets can be expensive, with marginal conservation benefit. Reduced habitat variability can limit options for managers trying to balance competing objectives with limited resources. Conservation policies should embrace habitat variability, expand decision-space appropriately, and support adaptation to local circumstances to increase ecological resilience in a rapidly changing world.

  15. Spatial analysis of NDVI readings with different sampling density

    Science.gov (United States)

    Advanced remote sensing technologies provide research an innovative way of collecting spatial data for use in precision agriculture. Sensor information and spatial analysis together allow for a complete understanding of the spatial complexity of a field and its crop. The objective of the study was...

  16. HARNESSING BIG DATA FOR PRECISION MEDICINE: INFRASTRUCTURES AND APPLICATIONS.

    Science.gov (United States)

    Yu, Kun-Hsing; Hart, Steven N; Goldfeder, Rachel; Zhang, Qiangfeng Cliff; Parker, Stephen C J; Snyder, Michael

    2017-01-01

    Precision medicine is a health management approach that accounts for individual differences in genetic backgrounds and environmental exposures. With the recent advancements in high-throughput omics profiling technologies, the collection of large study cohorts, and the development of data mining algorithms, big data in biomedicine is expected to provide novel insights into health and disease states, which can be translated into personalized disease prevention and treatment plans. However, the petabytes of biomedical data generated by multiple measurement modalities pose a significant challenge for data analysis, integration, storage, and result interpretation. In addition, patient privacy preservation, coordination between participating medical centers and data analysis working groups, as well as discrepancies in data sharing policies remain important topics of discussion. In this workshop, we invite experts in omics integration, biobank research, and data management to share their perspectives on leveraging big data to enable precision medicine. Workshop website: http://tinyurl.com/PSB17BigData; HashTag: #PSB17BigData.

  17. Dynamical Coordination of Hand Intrinsic Muscles for Precision Grip in Diabetes Mellitus.

    Science.gov (United States)

    Li, Ke; Wei, Na; Cheng, Mei; Hou, Xingguo; Song, Jun

    2018-03-12

    This study investigated the effects of diabetes mellitus (DM) on the dynamical coordination of hand intrinsic muscles during precision grip. Precision grip was tested using a custom-designed apparatus with stable and unstable loads, during which the surface electromyographic (sEMG) signals of the abductor pollicis brevis (APB) and first dorsal interosseous (FDI) were recorded simultaneously. Recurrence quantification analysis (RQA) was applied to quantify the dynamical structure of the sEMG signals of the APB and FDI, and cross recurrence quantification analysis (CRQA) was used to assess the intermuscular coupling between the two intrinsic muscles. This study revealed that DM altered the dynamical structure of muscle activation for the FDI and the dynamical intermuscular coordination between the APB and FDI during precision grip. A reinforced feedforward mechanism that compensates for the loss of sensory feedback in DM may be responsible for the stronger intermuscular coupling between the APB and FDI muscles. Sensory deficits in DM remarkably decreased the capacity for online motor adjustment based on sensory feedback, rendering a lower adaptability to the uncertainty of the environment. This study sheds light on the inherent dynamical properties underlying intrinsic muscle activation and intermuscular coordination for precision grip and the effects of DM on hand sensorimotor function.
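
    Recurrence quantification starts from a recurrence matrix marking which pairs of samples are closer than a threshold. The following sketch computes the simplest RQA measure, the recurrence rate, on a synthetic signal; a full analysis of sEMG would first delay-embed the series, which is omitted here.

        import numpy as np

        def recurrence_rate(signal, threshold):
            """Fraction of sample pairs closer than the threshold
            (cross-RQA uses two different signals in the distance)."""
            d = np.abs(signal[:, None] - signal[None, :])
            return (d < threshold).mean()

        rng = np.random.default_rng(0)
        semg = np.sin(np.linspace(0.0, 20.0, 500)) + 0.2 * rng.normal(size=500)
        print(recurrence_rate(semg, threshold=0.1))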

  18. Transferability of Object-Oriented Image Analysis Methods for Slum Identification

    Directory of Open Access Journals (Sweden)

    Alfred Stein

    2013-08-01

    Full Text Available Updated spatial information on the dynamics of slums can be helpful to measure and evaluate progress of policies. Earlier studies have shown that semi-automatic detection of slums using remote sensing can be challenging considering the large variability in definition and appearance. In this study, we explored the potential of an object-oriented image analysis (OOA) method to detect slums, using very high resolution (VHR) imagery. This method integrated expert knowledge in the form of a local slum ontology. A set of image-based parameters was identified that was used for differentiating slums from non-slum areas in an OOA environment. The method was implemented on three subsets of the city of Ahmedabad, India. Results show that textural features such as entropy and contrast derived from a grey-level co-occurrence matrix (GLCM) and the size of image segments are stable parameters for classification of built-up areas and the identification of slums. Relations with classified slum objects, in terms of being enclosed by slums and sharing a relative border with slums, were used to refine the classification. The analysis of the three subsets showed final accuracies ranging from 47% to 68%. We conclude that our method produces useful results as it allows location-specific adaptation, whereas generically applicable rulesets for slums are still to be developed.
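
    The GLCM texture features named above (entropy and contrast) can be computed from the co-occurrence counts of quantized grey levels. A compact sketch for a single horizontal offset is shown below; production OBIA software computes these per image segment and over several offsets.

        import numpy as np

        def glcm_features(img, levels=8):
            """Contrast and entropy of the grey-level co-occurrence matrix
            for the horizontal neighbour offset."""
            q = (img * levels).clip(0, levels - 1).astype(int)  # quantize
            glcm = np.zeros((levels, levels))
            for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
                glcm[a, b] += 1
            p = glcm / glcm.sum()
            i, j = np.indices(p.shape)
            contrast = (p * (i - j) ** 2).sum()
            entropy = -(p[p > 0] * np.log2(p[p > 0])).sum()
            return contrast, entropy

        tile = np.random.default_rng(0).random((32, 32))  # stand-in image tile
        print(glcm_features(tile))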

  19. Learning Objectives for Master's theses at DTU Management Engineering

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Rasmussen, Birgitte; Hinz, Hector Nøhr

    2010-01-01

    ..., different. The DTU Study Handbook states that: "Learning objectives are an integrated part of the supervision", which provides you with the opportunity – naturally in cooperation with your supervisor – to formulate learning objectives for your Master's thesis. There are at least three good reasons for ensuring that you formulate precise and useful learning objectives for your Master's thesis, and these notes of inspiration have been written to help you do exactly this. The notes discuss the requirements for the learning objectives, examples of learning objectives and the assessment criteria defined by DTU Management Engineering, as well as, not least, some useful things to remember concerning the submission and assessment of the Master's thesis.

  20. Benchmarking the Applicability of Ontology in Geographic Object-Based Image Analysis

    Directory of Open Access Journals (Sweden)

    Sachit Rajbhandari

    2017-11-01

    Full Text Available In Geographic Object-based Image Analysis (GEOBIA), identification of image objects is normally achieved using rule-based classification techniques supported by appropriate domain knowledge. However, GEOBIA currently lacks a systematic method to formalise the domain knowledge required for image object identification. Ontology provides a representation vocabulary for characterising domain-specific classes. This study proposes an ontological framework that conceptualises domain knowledge in order to support the application of rule-based classifications. The proposed ontological framework is tested with a landslide case study. The Web Ontology Language (OWL) is used to construct an ontology in the landslide domain. The segmented image objects with extracted features are incorporated into the ontology as instances. The classification rules are written in the Semantic Web Rule Language (SWRL) and executed using a semantic reasoner to assign instances to appropriate landslide classes. Machine learning techniques are used to predict new threshold values for feature attributes in the rules. Our framework is compared with published work on landslide detection where ontology was not used for the image classification. Our results demonstrate that a classification derived from the ontological framework accords with non-ontological methods. This study benchmarks the ontological method, providing an alternative approach for image classification in the case study of landslides.

  1. IMPLEMENTATION OF OBJECT TRACKING ALGORITHMS ON THE BASIS OF CUDA TECHNOLOGY

    Directory of Open Access Journals (Sweden)

    B. A. Zalesky

    2014-01-01

    Full Text Available A fast version of a correlation algorithm for tracking objects in video sequences recorded by a non-stabilized camcorder is presented. The algorithm is based on comparing local correlations of the object image with regions of the video frames. It is implemented using the CUDA programming technology, which makes real-time execution of the algorithm attainable. To improve precision and stability, a robust version of the Kalman filter has been incorporated into the flowchart. Tests showed the applicability of the algorithm to practical object tracking.
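
    The core of such a tracker is a normalized correlation search in a window around the object's previous position; the GPU merely evaluates the candidate positions in parallel. A serial Python sketch of the search is given below (the CUDA kernel itself and the Kalman filtering stage are not reproduced).

        import numpy as np

        def ncc(patch, template):
            """Normalized cross-correlation of two equally sized patches."""
            p, t = patch - patch.mean(), template - template.mean()
            denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
            return 0.0 if denom == 0 else (p * t).sum() / denom

        def track(frame, template, prev_xy, search=10):
            """Exhaustively search a small window around the previous
            position for the best-correlating patch."""
            h, w = template.shape
            (x0, y0), best, best_xy = prev_xy, -1.0, prev_xy
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    y, x = y0 + dy, x0 + dx
                    if 0 <= y <= frame.shape[0] - h and 0 <= x <= frame.shape[1] - w:
                        score = ncc(frame[y:y + h, x:x + w], template)
                        if score > best:
                            best, best_xy = score, (x, y)
            return best_xy, best

        rng = np.random.default_rng(0)
        frame = rng.random((120, 160))
        template = frame[40:56, 60:76].copy()      # 16x16 object patch
        print(track(frame, template, prev_xy=(58, 38)))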

  2. Cognitive-motor interference while grasping, lifting and holding objects.

    Directory of Open Access Journals (Sweden)

    Erwan Guillery

    Full Text Available In daily life, object manipulation is usually performed concurrently with the execution of cognitive tasks. The aim of the present study was to determine which aspects of precision grip require cognitive resources using a motor-cognitive dual-task paradigm. Eighteen healthy participants took part in the experiment, which comprised two conditions. In the first condition, participants performed a motor task without any concomitant cognitive task. They were instructed to grip, lift and hold an apparatus incorporating strain gauges allowing a continuous measurement of the force perpendicular to each contact surface (grip force, GF) as well as the total tangential force applied on the object (load force, LF). In the second condition, participants performed the same motor task while concurrently performing a cognitive task consisting of a complex visual search combined with counting. In the dual-task condition, we found a significant increase in the duration of the preload phase (the time between initial contact of the fingers with the apparatus and the onset of the load force), as well as a significant increase of the grip force during the holding phase, indicating that the cognitive task interfered with the initial force scaling performed during the preload phase and the fine-tuning of grip force during the hold phase. These findings indicate that these aspects of precision grip require cognitive resources. In contrast, other aspects of the precision grip, such as the temporal coupling between grip and load forces, were not affected by the cognitive task, suggesting that they reflect more automatic processes. Taken together, our results suggest that assessing the dynamic and temporal parameters of precision grip in the context of a concurrent cognitive task may constitute a more ecological and better-suited tool to characterize motor dysfunction in patients.

  3. Precision medicine in pediatric oncology: Lessons learned and next steps

    Science.gov (United States)

    Mody, Rajen J.; Prensner, John R.; Everett, Jessica; Parsons, D. Williams; Chinnaiyan, Arul M.

    2017-01-01

    The maturation of genomic technologies has enabled new discoveries in disease pathogenesis as well as new approaches to patient care. In pediatric oncology, patients may now receive individualized genomic analysis to identify molecular aberrations of relevance for diagnosis and/or treatment. In this context, several recent clinical studies have begun to explore the feasibility and utility of genomics-driven precision medicine. Here, we review the major developments in this field, discuss current limitations, and explore aspects of the clinical implementation of precision medicine, which lack consensus. Lastly, we discuss ongoing scientific efforts in this arena, which may yield future clinical applications. PMID:27748023

  4. Development of sensor guided precision sprayers

    NARCIS (Netherlands)

    Nieuwenhuizen, A.T.; Zande, van de J.C.

    2013-01-01

    Sensor guided precision sprayers were developed to automate the spray process with a focus on emission reduction and identical or increased efficacy, with the precision agriculture concept in mind. Within the project "Innovations2", sensor guided precision sprayers were introduced to leek, ...

  5. Autocalibration of high precision drift tubes

    International Nuclear Information System (INIS)

    Bacci, C.; Bini, C.; Ciapetti, G.; De Zorzi, G.; Gauzzi, P.; Lacava, F.; Nisati, A.; Pontecorvo, L.; Rosati, S.; Veneziano, S.; Cambiaghi, M.; Casellotti, G.; Conta, C.; Fraternali, M.; Lanza, A.; Livan, M.; Polesello, G.; Rimoldi, A.; Vercesi, V.

    1997-01-01

    We present the results on MDT (monitored drift tubes) autocalibration studies obtained from the analysis of the data collected in Summer 1995 on the H8B Muon Test Beam. In particular we studied the possibility of autocalibration of the MDT using four or three layers of tubes, and we compared the calibration obtained using a precise external tracker with the output of the autocalibration procedure. Results show the feasibility of autocalibration with four and three tubes and the good accuracy of the autocalibration procedure. (orig.)

  6. Equivalence and precision of knee cartilage morphometry between different segmentation teams, cartilage regions, and MR acquisitions

    Science.gov (United States)

    Schneider, E; Nevitt, M; McCulloch, C; Cicuttini, FM; Duryea, J; Eckstein, F; Tamez-Pena, J

    2012-01-01

    Objective To compare precision and evaluate equivalence of femorotibial cartilage volume (VC) and mean cartilage thickness (ThCtAB.Me) from independent segmentation teams using identical MR images from three series: sagittal 3D Dual Echo in the Steady State (DESS), coronal multi-planar reformat (DESS-MPR) of DESS and coronal 3D Fast Low Angle SHot (FLASH). Design 19 subjects underwent test-retest MR imaging at 3 Tesla. Four teams segmented the cartilage using prospectively defined plate regions and rules. Mixed-models analysis of the pooled data was used to evaluate the effect of acquisition, team and plate on precision, and Pearson correlations and mixed models were used to evaluate equivalence. Results Segmentation team differences dominated measurement variability in most cartilage regions for all image series. Precision of VC and ThCtAB.Me differed significantly by team and cartilage plate, but not between FLASH and DESS. Mean values of VC and ThCtAB.Me differed by team (P<0.05) for DESS, FLASH and DESS-MPR, FLASH VC was 4–6% larger than DESS in the medial tibia and lateral central femur, and FLASH ThCtAB.Me was 5–6% larger in the medial tibia, but 4–8% smaller in the medial central femur. Correlations between DESS and FLASH for VC and ThCtAB.Me were high (r=0.90–0.97), except for DESS versus FLASH medial central femur ThCtAB.Me (r=0.81–0.83). Conclusions Cartilage morphology metrics from different image contrasts had similar precision, were generally equivalent, and may be combined for cross-sectional analyses if potential systematic offsets are accounted for. Data from different teams should not be pooled unless equivalence is demonstrated for cartilage metrics of interest. PMID:22521758

  7. Designing concept maps for a precise and objective description of pharmaceutical innovations

    Directory of Open Access Journals (Sweden)

    Iordatii Maia

    2013-01-01

    Full Text Available Abstract Background When a new drug is launched onto the market, information about the new manufactured product is contained in its monograph and evaluation report published by national drug agencies. Health professionals need to be able to determine rapidly and easily whether the new manufactured product is potentially useful for their practice. There is therefore a need to identify the best way to group together and visualize the main items of information describing the nature and potential impact of the new drug. The objective of this study was to identify these items of information and to bring them together in a model that could serve as the standard for presenting the main features of a new manufactured product. Methods We developed a preliminary conceptual model of pharmaceutical innovations, based on the knowledge of the authors. We then refined this model, using a random sample of 40 new manufactured drugs recently approved by the national drug regulatory authorities in France and covering a broad spectrum of innovations and therapeutic areas. Finally, we used another sample of 20 new manufactured drugs to determine whether the model was sufficiently comprehensive. Results The results of our modeling led to three sub-models described as conceptual maps representing (i) the medical context for use of the new drug (indications, type of effect, therapeutic arsenal for the same indications), (ii) the nature of the novelty of the new drug (new molecule, new mechanism of action, new combination, new dosage, etc.), and (iii) the impact of the drug in terms of efficacy, safety and ease of use, compared with other drugs with the same indications. Conclusions Our model can help to standardize information about new drugs released onto the market. It is potentially useful to the pharmaceutical industry, medical journals, editors of drug databases and medical software, and national or international drug regulation agencies, as a means of describing the main ...

  8. Adobe Boxes: Locating Object Proposals Using Object Adobes.

    Science.gov (United States)

    Fang, Zhiwen; Cao, Zhiguo; Xiao, Yang; Zhu, Lei; Yuan, Junsong

    2016-09-01

    Despite previous efforts on object proposals, the detection rates of existing approaches are still not satisfactory. To address this, we propose Adobe Boxes to efficiently locate potential objects with fewer proposals, in terms of searching for object adobes, which are the salient object parts that are easy to perceive. Because of the visual difference between an object and its surroundings, an object adobe obtained from a local region has a high probability of being part of an object, and is thus capable of depicting the locative information of the proto-object. Our approach comprises three main procedures. First, coarse object proposals are acquired by employing randomly sampled windows. Then, based on local-contrast analysis, the object adobes are identified within the enlarged bounding boxes that correspond to the coarse proposals. The final object proposals are obtained by converging the bounding boxes to tightly surround the object adobes. Meanwhile, our object adobes can also be used to refine the detection rate of most state-of-the-art methods. The extensive experiments on four challenging datasets (PASCAL VOC2007, VOC2010, VOC2012, and ILSVRC2014) demonstrate that the detection rate of our approach generally outperforms the state-of-the-art methods, especially with a relatively small number of proposals. The average time consumed on one image is about 48 ms, which nearly meets the real-time requirement.

  9. High-precision shape representation using a neuromorphic vision sensor with synchronous address-event communication interface

    Science.gov (United States)

    Belbachir, A. N.; Hofstätter, M.; Litzenberger, M.; Schön, P.

    2009-10-01

    A synchronous communication interface for neuromorphic temporal contrast vision sensors is described and evaluated in this paper. The interface has been designed for ultra-high-speed synchronous arbitration of a temporal contrast image sensor's pixel data. By enabling high-precision timestamping, the system demonstrates its uniqueness in handling peak data rates while preserving the main advantage of neuromorphic electronic systems, namely high and accurate temporal resolution. Based on a synchronous arbitration concept, the timestamping has a resolution of 100 ns. Both synchronous and (state-of-the-art) asynchronous arbiters have been implemented in a neuromorphic dual-line vision sensor chip in a standard 0.35 µm CMOS process. The performance analysis of both arbiters and the advantages of synchronous over asynchronous arbitration in capturing high-speed objects are discussed in detail.
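
    To make the timestamping concrete, here is a minimal sketch of a timestamped address-event record, assuming a free-running counter with the 100 ns tick mentioned above. The structure and field names are illustrative, not the chip's actual interface.

    ```python
    from dataclasses import dataclass

    TICK_NS = 100  # timestamp resolution of the synchronous arbiter

    @dataclass
    class AddressEvent:
        pixel: int     # pixel address on the dual-line sensor
        polarity: int  # +1 / -1 temporal-contrast event
        tick: int      # counter value latched when the event is arbitrated

        @property
        def time_ns(self) -> int:
            return self.tick * TICK_NS

    # Events arbitrated within the same tick share a timestamp; their ordering
    # is resolved by the arbiter, not by time.
    ev = AddressEvent(pixel=42, polarity=+1, tick=12345)
    print(ev.time_ns)  # 1234500 ns
    ```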

  10. High-precision shape representation using a neuromorphic vision sensor with synchronous address-event communication interface

    International Nuclear Information System (INIS)

    Belbachir, A N; Hofstätter, M; Litzenberger, M; Schön, P

    2009-01-01

    A synchronous communication interface for neuromorphic temporal contrast vision sensors is described and evaluated in this paper. The interface has been designed for ultra-high-speed synchronous arbitration of a temporal contrast image sensor's pixel data. By enabling high-precision timestamping, the system demonstrates its uniqueness in handling peak data rates while preserving the main advantage of neuromorphic electronic systems, namely high and accurate temporal resolution. Based on a synchronous arbitration concept, the timestamping has a resolution of 100 ns. Both synchronous and (state-of-the-art) asynchronous arbiters have been implemented in a neuromorphic dual-line vision sensor chip in a standard 0.35 µm CMOS process. The performance analysis of both arbiters and the advantages of synchronous over asynchronous arbitration in capturing high-speed objects are discussed in detail.

  11. Precision of DVC approaches for strain analysis in bone imaged with μCT at different dimensional levels.

    Science.gov (United States)

    Dall'Ara, Enrico; Peña-Fernández, Marta; Palanca, Marco; Giorgi, Mario; Cristofolini, Luca; Tozzi, Gianluca

    2017-11-01

    Accurate measurement of local strain in heterogeneous and anisotropic bone tissue is fundamental to understand the pathophysiology of musculoskeletal diseases, to evaluate the effect of interventions in preclinical studies, and to optimize the design and delivery of biomaterials. Digital volume correlation (DVC) can be used to measure the three-dimensional displacement and strain fields from micro-Computed Tomography (µCT) images of loaded specimens. However, this approach is affected by the quality of the input images, by the morphology and density of the tissue under investigation, by the correlation scheme, and by the operational parameters used in the computation. Therefore, for each application the precision of the method should be evaluated. In this paper we present results collected from datasets analyzed in previous studies, as well as new data from a recent experimental campaign, to characterize the relationship between the precision of two different DVC approaches and the spatial resolution of the outputs. Different bone structures scanned with laboratory source µCT or Synchrotron light µCT (SRµCT) were processed in zero-strain tests to evaluate the precision of the DVC methods as a function of the subvolume size, which ranged from 8 to 2500 micrometers. The results confirmed that for every microstructure the precision of DVC improves for larger subvolume sizes, following power laws. However, for the first time, large differences in the precision of both local and global DVC approaches have been highlighted when SRµCT or in vivo µCT images were used instead of conventional ex vivo µCT. These findings suggest that in situ mechanical testing protocols applied in SRµCT facilities should be optimized in order to allow DVC analyses of localized strain measurements. Moreover, for in vivo µCT applications DVC analyses should be performed only with relatively coarse spatial resolution to achieve a reasonable precision of the method. In conclusion
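
    The power-law relation between precision and subvolume size can be fitted as a straight line in log-log space. A minimal sketch with made-up zero-strain numbers (sizes in micrometres, precision as strain SD in microstrain; the values are illustrative, not the paper's data):

    ```python
    import numpy as np

    size = np.array([8, 16, 32, 64, 128, 256, 512, 1000, 2500], dtype=float)
    sd_strain = np.array([4000, 2100, 1100, 600, 310, 170, 90, 50, 25], dtype=float)

    # Fit log10(sd) = log10(a) + b * log10(size): a power law sd = a * size**b.
    b, log_a = np.polyfit(np.log10(size), np.log10(sd_strain), 1)
    print(f"precision ~ {10**log_a:.1f} * size^{b:.2f}")  # b < 0: larger subvolumes give better precision
    ```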

  12. The study of the precision and accuracy of quality control in DXA bone mineral densitometry

    International Nuclear Information System (INIS)

    Gong Jian; Xu Hao

    2005-01-01

    Objective: To study the precision and accuracy of quality control (QC) in dual-energy X-ray absorptiometry (DXA) bone mineral densitometry, so as to improve the reliability of the results. Methods: 1) Short-term precision trial: 30 people and 30 male SD rats were chosen and a precision trial was performed. Each person was scanned twice, with repositioning between scans. The precision and the least significant change (LSC) of each examined region were calculated. The short-term precision trial of the rats was performed in a similar way. 2) Accuracy trial: the body phantom supplied by the manufacturer was measured daily and the results were compared with the true value, from which the accuracy and correction factor were calculated. A Shewhart chart was set up based on the average values. Results: 1) The people's coefficient of variation (CV) and LSC in the lumbar spine and proximal femur were 0.7%-2.2% and 0.018-0.048 g/cm², respectively. The rats' whole-body short-term precision was 0.9%. 2) The average accuracy of the DXA densitometer was -0.81% and the correction factor was 0.992. The average bone mineral density measured over 25 successive days was 1.244 g/cm², with a standard deviation (SD) of 0.008. Conclusion: The precision and accuracy trials help to obtain information about the working state of the instrument and to analyze the measured results, and can effectively improve the reliability of the measurements. (authors)
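
    For context, a minimal sketch of the short-term precision quantities used above: the root-mean-square precision error over repeat scans and the least significant change, using the common convention LSC = 2.77 × precision error (95% confidence, two measurements). The scan values below are illustrative, not the study's data.

    ```python
    import numpy as np

    # Two BMD scans (g/cm^2) per subject, with repositioning between scans.
    scans = np.array([[1.021, 1.034],
                      [0.987, 0.979],
                      [1.105, 1.118]])

    means = scans.mean(axis=1)
    sds = scans.std(axis=1, ddof=1)
    rms_sd = np.sqrt(np.mean(sds**2))            # precision error in g/cm^2
    rms_cv = np.sqrt(np.mean((sds / means)**2))  # precision error as a CV

    print(f"precision: {rms_sd:.3f} g/cm^2 ({100 * rms_cv:.1f}%)")
    print(f"LSC: {2.77 * rms_sd:.3f} g/cm^2")    # smallest change that is real at 95% confidence
    ```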

  13. Genetic Particle Swarm Optimization–Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection

    Science.gov (United States)

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-01-01

    In the field of multiple-feature Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features, and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple-feature OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO, and apply the proposed algorithm to the object-based hybrid multivariate alteration detection model. Two experimental cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence and effectively avoid premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO achieves higher overall accuracy (84.17% and 83.59%) and Kappa coefficients (0.6771 and 0.6314) than the other algorithms. Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but the number of features to be selected and the size of the particle swarm do affect the algorithm. The comparison experiments reveal that RMV is more suitable than other functions as the fitness function of a GPSO-based feature selection algorithm. PMID:27483285

  14. Genetic Particle Swarm Optimization-Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection.

    Science.gov (United States)

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-07-30

    In the field of multiple-feature Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features, and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple-feature OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO, and apply the proposed algorithm to the object-based hybrid multivariate alteration detection model. Two experimental cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence and effectively avoid premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO achieves higher overall accuracy (84.17% and 83.59%) and Kappa coefficients (0.6771 and 0.6314) than the other algorithms. Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but the number of features to be selected and the size of the particle swarm do affect the algorithm. The comparison experiments reveal that RMV is more suitable than other functions as the fitness function of a GPSO-based feature selection algorithm.
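
    A minimal sketch of binary particle-swarm feature selection with a genetic-style mutation step, scored by a mean-to-variance ratio over the selected feature columns. The inertia/acceleration constants, mutation rate, and the exact RMV formulation are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def rmv_fitness(X, mask):
        """Ratio of mean to variance over the selected feature columns (illustrative)."""
        mask = mask.astype(bool)
        if not mask.any():
            return -np.inf
        sel = X[:, mask]
        return float(np.mean(np.abs(sel.mean(axis=0)) / (sel.var(axis=0) + 1e-9)))

    def gpso_select(X, n_particles=20, n_iter=50, p_mutate=0.02):
        n_feat = X.shape[1]
        pos = (rng.random((n_particles, n_feat)) < 0.5).astype(int)  # binary positions
        vel = np.zeros((n_particles, n_feat))
        pbest = pos.copy()
        pbest_fit = np.array([rmv_fitness(X, p) for p in pos])
        gbest = pbest[pbest_fit.argmax()].copy()
        for _ in range(n_iter):
            r1, r2 = rng.random(vel.shape), rng.random(vel.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = (rng.random(vel.shape) < 1.0 / (1.0 + np.exp(-vel))).astype(int)
            flip = rng.random(pos.shape) < p_mutate      # genetic mutation step
            pos = np.where(flip, 1 - pos, pos)
            fit = np.array([rmv_fitness(X, p) for p in pos])
            better = fit > pbest_fit
            pbest[better], pbest_fit[better] = pos[better], fit[better]
            gbest = pbest[pbest_fit.argmax()].copy()
        return gbest.astype(bool)  # boolean mask of selected features

    selected = gpso_select(rng.normal(1.0, 0.5, size=(100, 30)))
    ```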

  15. The International GPS Service (IGS) as a Continuous Reference System for Precise GPS Positioning

    Science.gov (United States)

    Neilan, Ruth; Heflin, Michael; Watkins, Michael; Zumberge, James

    1996-01-01

    The International GPS Service for Geodynamics (IGS) is an organization which operates under the auspices of the International Association of Geodesy (IAG) and has been operational since January 1994. The primary objective of the IGS is to provide precise GPS data and data products to support geodetic and geophysical research activities.

  16. Analysis of Camera Parameters Value in Various Object Distances Calibration

    International Nuclear Information System (INIS)

    Yusoff, Ahmad Razali; Ariff, Mohd Farid Mohd; Idris, Khairulnizam M; Majid, Zulkepli; Setan, Halim; Chong, Albert K

    2014-01-01

    In photogrammetric applications, good camera parameters are needed for mapping purposes, for example with an Unmanned Aerial Vehicle (UAV) equipped with a non-metric camera. Simple camera calibration is a common laboratory procedure for obtaining camera parameter values. In aerial mapping, interior camera parameter values from close-range camera calibration are used to correct image errors. However, the causes and effects of the calibration steps used to obtain accurate mapping need to be analyzed. Therefore, this research contributes an analysis of camera parameters obtained with a portable calibration frame measuring 1.5 × 1 m. Object distances of two, three, four, five, and six meters are the research focus. Results are analyzed to find the changes in image and camera parameter values. The camera calibration parameters are found to differ depending on the type of calibration parameter and the object distance.
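
    A minimal sketch of close-range calibration with a planar target in OpenCV, repeated per object distance as in the study. The chessboard pattern, square size, and file names are illustrative assumptions; the paper used a 1.5 × 1 m portable frame rather than a chessboard.

    ```python
    import cv2
    import numpy as np

    pattern = (9, 6)   # inner corners of a chessboard target (assumption)
    square = 0.025     # square size in metres (assumption)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

    obj_pts, img_pts = [], []
    for fname in ["d2m_01.jpg", "d2m_02.jpg"]:  # images taken at one object distance (placeholders)
        gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)

    # Interior parameters: focal length, principal point, and distortion terms.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, gray.shape[::-1], None, None)
    print(rms, K, dist.ravel())  # compare these across the 2-6 m object distances
    ```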

  17. Precision tests of the standard model at LEP

    International Nuclear Information System (INIS)

    Mele, Barbara; Universita La Sapienza, Rome

    1994-01-01

    Recent LEP results on electroweak precision measurements are reviewed. The analysis of the line shape and asymmetries at the Z0 peak is described. Then, the consistency of the Standard Model predictions with experimental data and the consequent limits on the top mass are discussed. Finally, the possibility of extracting information and constraints on new theoretical models from present data is examined. (author). 20 refs., 5 tabs

  18. McDonald Observatory Planetary Search - A high precision stellar radial velocity survey for other planetary systems

    Science.gov (United States)

    Cochran, William D.; Hatzes, Artie P.

    1993-01-01

    The McDonald Observatory Planetary Search program has surveyed a sample of 33 nearby F, G, and K stars since September 1987 to search for substellar companion objects. Measurements of stellar radial velocity variations to a precision of better than 10 m/s were performed as routine observations to detect Jovian planets in orbit around solar-type stars. Results confirm the detection of a companion object to HD114762.
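
    For context, the 10 m/s threshold can be related to the reflex motion a Jovian planet induces on its host star. The standard two-body expression for the radial-velocity semi-amplitude (a textbook result, not taken from the abstract) is:

    ```latex
    % Radial-velocity semi-amplitude of a star of mass M_* orbited by a planet
    % of mass m_p with period P, eccentricity e and orbital inclination i.
    K = \left(\frac{2\pi G}{P}\right)^{1/3}
        \frac{m_p \sin i}{\left(M_\star + m_p\right)^{2/3}}
        \frac{1}{\sqrt{1 - e^2}}
    ```

    For Jupiter orbiting the Sun (P ≈ 11.86 yr, e ≈ 0.05), K is roughly 12.5 m/s, which is why a precision better than 10 m/s is needed to detect Jovian companions of solar-type stars.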

  19. Evaluation of mechanical properties and microstructural characterization of consolidated Cobalt-Chromium-Molybdenum obtained by selective laser melting and precision casting

    International Nuclear Information System (INIS)

    Mergulhão, Marcello Vertamatti

    2017-01-01

    The objective of this work was to study the mechanical properties and microstructural characteristics of specimens of the Co-Cr-Mo alloy obtained by additive manufacturing - selective laser melting (SLM) - and by precision casting, aiming at the manufacture of dental prostheses. The following steps were carried out on gas-atomized Co-Cr-Mo powders: 1) investigation of the physical, chemical and thermal properties of the atomized powders in different particle size fractions (denominated D1 < 15 μm, D2 20-50 μm and D3 > 75 μm); 2) consolidation of standard specimens via both consolidation techniques; 3) characterization of the consolidated specimens by cytotoxicity, porosity, X-ray diffraction and dilatometry analyses; 4) mechanical characterization by tensile, three-point bending and hardness (macro and micro Vickers) tests, and microstructural characterization by optical and scanning electron microscopy. In general, the results showed that the D2 particle size fraction (20-50 μm) gives the best packing behaviour for consolidation by SLM; the biocompatibility of the samples was positive for both processing techniques; the mechanical evaluation of the specimens shows that the SLM technique provides superior mechanical properties (yield stress, rupture stress, maximum stress, elongation and hardness) compared with those obtained by the precision casting technique; and the SLM process yields an ultrafine-grained microstructure with high chemical homogeneity, in contrast to the coarse dendritic microstructure produced by the casting process. The present study thus evidenced superior quality in manufacturing customized dental components (copings) by the SLM technique compared with precision casting. (author)

  20. GPR Detection of Buried Symmetrically Shaped Mine-like Objects using Selective Independent Component Analysis

    DEFF Research Database (Denmark)

    Karlsen, Brian; Sørensen, Helge Bjarup Dissing; Larsen, Jan

    2003-01-01

    This paper addresses the detection of mine-like objects in stepped-frequency ground penetrating radar (SF-GPR) data as a function of object size, object content, and burial depth. The detection approach is based on Selective Independent Component Analysis (SICA), which provides an automatic ranking of components, enabling the suppression of clutter and hence the extraction of components carrying mine information. The goal of the investigation is to evaluate various time and frequency domain ICA approaches based on SICA. Performance comparison is based on a series of mine-like objects, ranging from small-scale anti-personnel (AP) mines to large-scale anti-tank (AT) mines, that were designed for the purpose. Large-scale SF-GPR measurements on this series of mine-like objects buried in soil were performed. The SF-GPR data was acquired using a wideband monostatic bow-tie antenna operating in the frequency range 750…
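
    A minimal sketch of ICA-based clutter suppression in the spirit of SICA: decompose the B-scan traces into independent components, rank them with a simple saliency score, and reconstruct from the top-ranked components only. The peak-to-mean-energy ranking is an illustrative stand-in for the paper's selection rule.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    def sica_denoise(bscan, n_components=10, n_keep=3):
        """bscan: (n_traces, n_samples) SF-GPR data in the time domain."""
        ica = FastICA(n_components=n_components, random_state=0)
        sources = ica.fit_transform(bscan.T)   # (n_samples, n_components)
        mixing = ica.mixing_                   # (n_traces, n_components)
        # Rank components by peak-to-mean energy, a crude saliency measure
        # for a localized buried-target response.
        score = sources.max(axis=0) ** 2 / (np.mean(sources ** 2, axis=0) + 1e-12)
        keep = np.argsort(score)[::-1][:n_keep]
        # Reconstruct using only the components assumed to carry mine information.
        cleaned = sources[:, keep] @ mixing[:, keep].T + ica.mean_
        return cleaned.T
    ```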

  1. Joint Conditional Random Field Filter for Multi-Object Tracking

    Directory of Open Access Journals (Sweden)

    Luo Ronghua

    2011-03-01

    Object tracking can improve the performance of mobile robots, especially in populated dynamic environments. A novel joint conditional random field filter (JCRFF), based on a conditional random field with hierarchical structure, is proposed for multi-object tracking by abstracting the data associations between objects and measurements into a sequence of labels. Since the conditional random field makes no assumptions about the dependency structure between the observations and allows non-local dependencies between the state and the observations, the proposed method can not only fuse multiple cues, including shape and motion information, to improve the stability of tracking, but also integrate moving object detection and object tracking well. The implementation of multi-object tracking based on JCRFF with measurements from a laser range finder on a mobile robot is also studied. Experimental results with the mobile robot developed in our lab show that the proposed method has higher precision and better stability than the joint probabilistic data association filter (JPDAF).

  2. Compendium of Neutron Beam Facilities for High Precision Nuclear Data Measurements

    International Nuclear Information System (INIS)

    2014-07-01

    Recent advances in nuclear science and technology, driven by the globally growing economy, require highly accurate, powerful simulations and precise analysis of experimental results. Confidence in these results is still determined by the accuracy of the atomic and nuclear input data. For studying material response, neutron beams produced from accelerators and research reactors in broad energy spectra are reliable and indispensable tools for obtaining high-accuracy experimental results for neutron-induced reactions. The IAEA supports the production of high precision nuclear data using nuclear facilities, in particular those based on particle accelerators and research reactors, around the world. Such data are essential for numerous applications in various industries and research institutions, including the safe and economical operation of nuclear power plants, future fusion reactors, nuclear medicine and non-destructive testing technologies. The IAEA organized and coordinated the technical meeting 'Use of Neutron Beams for High Precision Nuclear Data Measurements' in Budapest, Hungary, 10–14 December 2012. The meeting was attended by participants from 25 Member States and three international organizations — the European Organization for Nuclear Research (CERN), the Joint Research Centre (JRC) and the Organisation for Economic Co-operation and Development (OECD) Nuclear Energy Agency (OECD/NEA). The objectives of the meeting were to provide a forum to exchange existing know-how and to share practical experiences of neutron beam facilities and associated instrumentation, with regard to the measurement of high precision nuclear data using both accelerators and research reactors. Furthermore, the present status and future developments of worldwide accelerator and research reactor based neutron beam facilities were discussed. This publication is a summary of the technical meeting and additional materials supplied by the international

  3. What is precision medicine?

    Science.gov (United States)

    König, Inke R; Fuchs, Oliver; Hansen, Gesine; von Mutius, Erika; Kopp, Matthias V

    2017-10-01

    The term "precision medicine" has become very popular over recent years, fuelled by scientific as well as political perspectives. Despite its popularity, its exact meaning, and how it is different from other popular terms such as "stratified medicine", "targeted therapy" or "deep phenotyping" remains unclear. Commonly applied definitions focus on the stratification of patients, sometimes referred to as a novel taxonomy, and this is derived using large-scale data including clinical, lifestyle, genetic and further biomarker information, thus going beyond the classical "signs-and-symptoms" approach.While these aspects are relevant, this description leaves open a number of questions. For example, when does precision medicine begin? In which way does the stratification of patients translate into better healthcare? And can precision medicine be viewed as the end-point of a novel stratification of patients, as implied, or is it rather a greater whole?To clarify this, the aim of this paper is to provide a more comprehensive definition that focuses on precision medicine as a process. It will be shown that this proposed framework incorporates the derivation of novel taxonomies and their role in healthcare as part of the cycle, but also covers related terms. Copyright ©ERS 2017.

  4. Study of nanometer-level precise phase-shift system used in electronic speckle shearography and phase-shift pattern interferometry

    Science.gov (United States)

    Jing, Chao; Liu, Zhongling; Zhou, Ge; Zhang, Yimo

    2011-11-01

    The nanometer-level precise phase-shift system is designed to realize phase-shift interferometry in electronic speckle shearography pattern interferometry. A PZT is used as the driving component of the phase-shift system, and a flexure-hinge translation component is developed to realize friction-free, clearance-free micro displacement. A closed-loop control system is designed for high-precision micro displacement, in which an embedded digital controller executes the control algorithm and a capacitive sensor provides real-time displacement feedback. The dynamic and control models of the nanometer-level precise phase-shift system are analyzed, and on this basis high-precision micro displacement is realized with a digital PID control algorithm. Experiments show that the positioning error of the precise phase-shift system is less than 2 nm for step displacement signals and less than 5 nm for continuous displacement signals, which satisfies the requirements of electronic speckle shearography and phase-shift pattern interferometry. Fringe images from four-step phase-shift interferometry and the final phase-distribution image correlated with object deformation are presented in this paper to demonstrate the validity of the nanometer-level precise phase-shift system.
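
    A minimal sketch of the digital PID position loop described above: the PZT drive is updated from the capacitive-sensor error each sample. The gains, the sample rate, and the first-order plant stand-in are illustrative assumptions, not the system's identified model.

    ```python
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, setpoint_nm, measured_nm):
            err = setpoint_nm - measured_nm
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = PID(kp=0.4, ki=120.0, kd=1e-4, dt=1e-4)  # 10 kHz loop rate (assumption)
    pos = 0.0
    for _ in range(2000):
        drive = pid.update(setpoint_nm=79.1, measured_nm=pos)  # one phase-shift step (illustrative)
        pos += 0.05 * drive  # crude first-order PZT response as the plant
    ```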

  5. Object Classification Based on Analysis of Spectral Characteristics of Seismic Signal Envelopes

    Science.gov (United States)

    Morozov, Yu. V.; Spektor, A. A.

    2017-11-01

    A method for classifying moving objects having a seismic effect on the ground surface is proposed which is based on statistical analysis of the envelopes of received signals. The values of the components of the amplitude spectrum of the envelopes obtained applying Hilbert and Fourier transforms are used as classification criteria. Examples illustrating the statistical properties of spectra and the operation of the seismic classifier are given for an ensemble of objects of four classes (person, group of people, large animal, vehicle). It is shown that the computational procedures for processing seismic signals are quite simple and can therefore be used in real-time systems with modest requirements for computational resources.
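
    A minimal sketch of the feature extraction described above: take the signal envelope via the Hilbert transform, then use low-order Fourier magnitudes of the envelope as classification features. The sampling assumptions and feature count are illustrative.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def envelope_spectrum_features(signal, n_features=16):
        envelope = np.abs(hilbert(signal))  # magnitude of the analytic signal
        spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
        return spectrum[1:n_features + 1]   # skip DC, keep low-order components

    # Such feature vectors can be fed to any classifier over the four target
    # classes (person, group of people, large animal, vehicle).
    x = envelope_spectrum_features(np.random.default_rng(0).normal(size=4096))
    ```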

  6. Installing and Executing Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) and Its Dependencies

    Science.gov (United States)

    2017-02-01

    Information Object Analysis, Intent, Dissemination, and Enhancement (IOAIDE) is a novel information framework developed for rapid prototyping. It supports dynamic plugin of analysis modules, for either research or analysis tasks, and integrates multiple image processing libraries. The remainder of the record lists the report's installation steps: system requirements, installing the software for IOAIDE (loading the ARL software and ARL applications), loading the DSPro software, and updating Java.

  7. Assessment of precision and concordance of quantitative mitochondrial DNA assays: a collaborative international quality assurance study

    NARCIS (Netherlands)

    Hammond, Emma L.; Sayer, David; Nolan, David; Walker, Ulrich A.; Ronde, Anthony de; Montaner, Julio S. G.; Cote, Helene C. F.; Gahan, Michelle E.; Cherry, Catherine L.; Wesselingh, Steven L.; Reiss, Peter; Mallal, Simon

    2003-01-01

    Background: A number of international research groups have developed DNA quantitation assays in order to investigate the role of mitochondrial DNA depletion in anti-retroviral therapy-induced toxicities. Objectives: A collaborative study was undertaken to evaluate intra-assay precision and between

  8. Precision medicine driven by cancer systems biology.

    Science.gov (United States)

    Filipp, Fabian V

    2017-03-01

    Molecular insights from genome and systems biology are influencing how cancer is diagnosed and treated. We critically evaluate big data challenges in precision medicine. The melanoma research community has identified distinct subtypes involving chronic sun-induced damage and the mitogen-activated protein kinase driver pathway. In addition, despite a low mutation burden, non-genomic mitogen-activated protein kinase melanoma drivers are found in membrane receptors, metabolism, or epigenetic signaling, with the ability to bypass central mitogen-activated protein kinase molecules and to activate a similar program of mitogenic effectors. Mutation hotspots, structural modeling, UV signature, and genomic as well as non-genomic mechanisms of disease initiation and progression are taken into consideration to identify resistance mutations and novel drug targets. A comprehensive precision medicine profile of a malignant melanoma patient illustrates future rational drug-targeting strategies. Network analysis emphasizes an important role of epigenetic and metabolic master regulators in oncogenesis. Co-occurrence of driver mutations in signaling, metabolic, and epigenetic factors highlights how cumulative alterations of our genomes and epigenomes progressively lead to uncontrolled cell proliferation. Precision insights can identify independent molecular pathways suitable for drug targeting. Synergistic treatment combinations of orthogonal modalities including immunotherapy, mitogen-activated protein kinase inhibitors, epigenetic inhibitors, and metabolic inhibitors have the potential to overcome immune evasion, side effects, and drug resistance.

  9. Observing exoplanet populations with high-precision astrometry

    Science.gov (United States)

    Sahlmann, Johannes

    2012-06-01

    This thesis deals with the application of the astrometry technique, which consists in measuring the position of a star in the plane of the sky, to the discovery and characterisation of extra-solar planets. It is feasible only with very high measurement precision, which motivates the use of space observatories, the development of new ground-based astronomical instrumentation and of innovative data analysis methods: the study of Sun-like stars with substellar companions using CORALIE radial velocities and HIPPARCOS astrometry leads to the determination of the frequency of close brown dwarf companions and to the discovery of a dividing line between massive planets and brown dwarf companions; an observation campaign employing optical imaging with a very large telescope demonstrates sufficient astrometric precision to detect planets around ultra-cool dwarf stars, and the first results of the survey are presented; finally, the design and initial astrometric performance of PRIMA, a new dual-feed near-infrared interferometric observing facility for relative astrometry, is presented.

  10. Precision Medicine in Cancer Treatment

    Science.gov (United States)

    Precision medicine helps doctors select cancer treatments that are most likely to help patients based on a genetic understanding of their disease. Learn about the promise of precision medicine and the role it plays in cancer treatment.

  11. Precision electron polarimetry

    International Nuclear Information System (INIS)

    Chudakov, E.

    2013-01-01

    A new generation of precise parity-violating experiments will require sub-percent accuracy in electron beam polarimetry. Compton polarimetry can provide such accuracy at high energies, but at a few hundred MeV the small analyzing power limits the sensitivity. Møller polarimetry provides a high analyzing power independent of the beam energy, but is limited by the properties of the polarized targets commonly used. Options for precision polarimetry at 300 MeV will be discussed, in particular a proposal to use ultra-cold atomic hydrogen traps to provide a 100%-polarized electron target for Møller polarimetry.

  12. The migration of femoral components after total hip replacement surgery: accuracy and precision of software-aided measurements

    International Nuclear Information System (INIS)

    Decking, J.; Schuetz, U.; Decking, R.; Puhl, W.

    2003-01-01

    Objective: To assess the accuracy and precision of a software-aided system for measuring the migration of femoral components after total hip replacement (THR) on digitised radiographs. Design and patients: Subsidence and varus-valgus tilt of THR stems within the femur were measured on digitised anteroposterior pelvic radiographs. The measuring software (UMA, GEMED, Germany) relies on bony landmarks and comparability parameters of two consecutive radiographs. Its accuracy and precision were calculated by comparing it with the gold standard in migration measurement, radiostereometric analysis (RSA). Radiographs and corresponding RSA measurements were performed in 60 patients (38-69 years) following cementless THR surgery. Results and conclusions: The UMA software measured the subsidence of the stems with an accuracy of ±2.5 mm and varus-valgus tilt with an accuracy of ±1.8° (95% confidence interval). Good interobserver and intraobserver reliability was found, with Cronbach's alpha ranging from 0.86 to 0.97. The subsidence of THR stems within the femur is an important parameter in the diagnosis of implant loosening. Software systems such as UMA improve the accuracy of migration measurements and are easy to use on routinely performed radiographs of operated hip joints. (orig.)

  13. The objective assessment of experts' and novices' suturing skills using an image analysis program.

    Science.gov (United States)

    Frischknecht, Adam C; Kasten, Steven J; Hamstra, Stanley J; Perkins, Noel C; Gillespie, R Brent; Armstrong, Thomas J; Minter, Rebecca M

    2013-02-01

    To objectively assess suturing performance using an image analysis program and to provide validity evidence for this assessment method by comparing experts' and novices' performance. In 2009, the authors used an image analysis program to extract objective variables from digital images of suturing end products obtained during a previous study involving third-year medical students (novices) and surgical faculty and residents (experts). Variables included number of stitches, stitch length, total bite size, travel, stitch orientation, total bite-size-to-travel ratio, and symmetry across the incision ratio. The authors compared all variables between groups to detect significant differences, and two variables (total bite-size-to-travel ratio and symmetry across the incision ratio) to ideal values. Five experts and 15 novices participated. Experts' and novices' performances differed significantly (P < .05), with large effect sizes (d > 0.8), for total bite size (P = .009, d = 1.5), travel (P = .045, d = 1.1), and total bite-size-to-travel ratio (P < .05). The algorithm can extract variables from digital images of a running suture and rapidly provide quantitative summative assessment feedback. The significant differences found between groups confirm that this system can discriminate between skill levels. This image analysis program represents a viable training tool for objectively assessing trainees' suturing, a foundational skill for many medical specialties.

  14. Can precision agriculture increase the profitability and sustainability of the production of potatoes and olives?

    NARCIS (Netherlands)

    Evert, van Frits K.; Gaitán-Cremaschi, Daniel; Fountas, Spyros; Kempenaar, Corné

    2017-01-01

    For farmers, the application of Precision Agriculture (PA) technology is expected to lead to an increase in profitability. For society, PA is expected to lead to increased sustainability. The objective of this paper is to determine for a number of common PA practices how much they increase

  15. Objective Bayesian Analysis of Skew- t Distributions

    KAUST Repository

    BRANCO, MARCIA D'ELIA

    2012-02-27

    We study the Jeffreys prior and its properties for the shape parameter of univariate skew-t distributions with linear and nonlinear Student's t skewing functions. In both cases, we show that the resulting priors for the shape parameter are symmetric around zero and proper. Moreover, we propose a Student's t approximation of the Jeffreys prior that makes an objective Bayesian analysis easy to perform. We carry out a Monte Carlo simulation study that demonstrates an overall better behaviour of the maximum a posteriori estimator compared with the maximum likelihood estimator. We also compare the frequentist coverage of the credible intervals based on the Jeffreys prior and its approximation and show that they are similar. We further discuss location-scale models under scale mixtures of skew-normal distributions and show some conditions for the existence of the posterior distribution and its moments. Finally, we present three numerical examples to illustrate the implications of our results on inference for skew-t distributions. © 2012 Board of the Foundation of the Scandinavian Journal of Statistics.

  16. Precision Medicine in Gastrointestinal Pathology.

    Science.gov (United States)

    Wang, David H; Park, Jason Y

    2016-05-01

    Precision medicine is the promise of individualized therapy and management of patients based on their personal biology. There are now multiple global initiatives to perform whole-genome sequencing on millions of individuals. In the United States, an early program was the Million Veteran Program, and a more recent proposal, made in 2015 by the president of the United States, is the Precision Medicine Initiative. To implement precision medicine in routine oncology care, genetic variants present in tumors need to be matched with effective clinical therapeutics. When we focus on the current state of precision medicine for gastrointestinal malignancies, it becomes apparent that there is a mixed history of success and failure. This review presents the current state of precision medicine using gastrointestinal oncology as a model, covering currently available targeted therapeutics, promising new findings in clinical genomic oncology, remaining quality issues in genomic testing, and emerging oncology clinical trial designs. It draws on clinical genomic studies on gastrointestinal malignancies, clinical oncology trials on therapeutics targeted to molecular alterations, and emerging clinical oncology study designs. Translating our ability to sequence thousands of genes into meaningful improvements in patient survival will be the challenge for the next decade.

  17. The Precision Problem in Conservation and Restoration.

    Science.gov (United States)

    Hiers, J Kevin; Jackson, Stephen T; Hobbs, Richard J; Bernhardt, Emily S; Valentine, Leonie E

    2016-11-01

    Within the varied contexts of environmental policy, conservation of imperilled species populations, and restoration of damaged habitats, an emphasis on idealized optimal conditions has led to increasingly specific targets for management. Overly-precise conservation targets can reduce habitat variability at multiple scales, with unintended consequences for future ecological resilience. We describe this dilemma in the context of endangered species management, stream restoration, and climate-change adaptation. Inappropriate application of conservation targets can be expensive, with marginal conservation benefit. Reduced habitat variability can limit options for managers trying to balance competing objectives with limited resources. Conservation policies should embrace habitat variability, expand decision-space appropriately, and support adaptation to local circumstances to increase ecological resilience in a rapidly changing world. Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Precision Photometry and Astrometry from Pan-STARRS

    Science.gov (United States)

    Magnier, Eugene A.; Pan-STARRS Team

    2018-01-01

    The Pan-STARRS 3pi Survey has been calibrated with excellent precision for both astrometry and photometry. The Pan-STARRS Data Release 1, opened to the public on 2016 Dec 16, provides photometry in 5 well-calibrated, well-defined bandpasses (grizy) astrometrically registered to the Gaia frame. Comparisons with other surveys illustrate the high quality of the calibration and provide tests of remaining systematic errors in both Pan-STARRS and those external surveys. With photometry and astrometry of roughly 3 billion astronomical objects, the Pan-STARRS DR1 has substantial overlap with Gaia, SDSS, 2MASS and other surveys. I will discuss the astrometric tie between Pan-STARRS DR1 and Gaia and show comparisons between Pan-STARRS and other large-scale surveys.

  19. Real-time analysis of δ13C- and δD-CH4 by high precision laser spectroscopy

    Science.gov (United States)

    Eyer, Simon; Emmenegger, Lukas; Tuzson, Béla; Fischer, Hubertus; Mohn, Joachim

    2014-05-01

    Methane (CH4) is the most important non-CO2 greenhouse gas (GHG), contributing 18% to total radiative forcing. Anthropogenic sources (e.g. ruminants, landfills) contribute 60% of total emissions and led to an increase in its atmospheric mixing ratio from 700 ppb in pre-industrial times to 1819 ± 1 ppb in 2012 [1]. Analysis of the most abundant methane isotopologues 12CH4, 13CH4 and 12CH3D can be used to disentangle the various source/sink processes [2] and to develop target-oriented reduction strategies. High precision isotopic analysis of CH4 can be accomplished by isotope-ratio mass spectrometry (IRMS) [2] and, more recently, by mid-infrared laser-based spectroscopic techniques. For high precision measurements in ambient air, however, both techniques rely on preconcentration of the target gas [3]. In an ongoing project, we developed a fully automated, field-deployable CH4 preconcentration unit coupled to a dual quantum cascade laser absorption spectrometer (QCLAS) for real-time analysis of CH4 isotopologues. The core part of the rack-mounted (19 inch) device is a highly efficient adsorbent trap attached to a motorized linear drive system and enclosed in a vacuum chamber. The adsorbent trap can thereby be decoupled from the Stirling cooler, allowing fast desorption and optimal heat management. A wide variety of adsorbents, including HayeSep D and molecular sieves as well as the novel metal-organic frameworks and carbon nanotubes, were characterized regarding their surface area, isosteric enthalpy of adsorption and selectivity for methane over nitrogen. The most promising candidates were tested on the preconcentration device and a preconcentration factor > 500 was obtained. Furthermore, analytical interferents (e.g. N2O, CO2) are separated by step-wise desorption of trace gases. A QCL absorption spectrometer previously described by Tuzson et al. (2010) for CH4 flux measurements was modified to obtain a platform for high precision and simultaneous

  20. Current evidence on hospital antimicrobial stewardship objectives : A systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E J L; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W T Cohen; Overdiek, Hans W P M; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M P M; Wolfs, Tom F W; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  1. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, E.C.; Hulscher, M.E.J.L.; Mouton, J.W.; Verduin, C.M.; Stuart, J.W.; Overdiek, H.W.; Linden, P.D. van der; Natsch, S.S.; Hertogh, C.M.; Wolfs, T.F.; Schouten, J.A.; Kullberg, B.J.; Prins, J.M.

    2016-01-01

    BACKGROUND: Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes:

  2. [Progress in precision medicine: a scientific perspective].

    Science.gov (United States)

    Wang, B; Li, L M

    2017-01-10

    Precision medicine is a new strategy for disease prevention and treatment that takes into account individual differences in genetics, environment and lifestyle to make precise disease classification and diagnosis, and can provide patients with personalized, targeted prevention and treatment. Large-scale population cohort studies are fundamental for precision medicine research, and could produce the best evidence for precision medicine practices. Current criticisms of precision medicine mainly focus on the very small proportion of patients who benefit, the neglect of social determinants of health, and the possible waste of limited medical resources. In spite of this, precision medicine is still a most promising research area, and may become a health care practice model in the future.

  3. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture

    Directory of Open Access Journals (Sweden)

    Francisco Javier Ferrández-Pastor

    2016-07-01

    The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, several barriers have delayed its wide adoption. Among the main barriers are expensive equipment, difficulty of operation and maintenance, and sensor-network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes for Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture, and demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.

  4. Developing Ubiquitous Sensor Network Platform Using Internet of Things: Application in Precision Agriculture.

    Science.gov (United States)

    Ferrández-Pastor, Francisco Javier; García-Chamizo, Juan Manuel; Nieto-Hidalgo, Mario; Mora-Pascual, Jerónimo; Mora-Martínez, José

    2016-07-22

    The application of Information Technologies to Precision Agriculture methods has clear benefits. Precision Agriculture optimises production efficiency, increases quality, minimises environmental impact and reduces the use of resources (energy, water); however, several barriers have delayed its wide adoption. Among the main barriers are expensive equipment, difficulty of operation and maintenance, and sensor-network standards that are still under development. Nowadays, new technological developments in embedded devices (hardware and communication protocols), the evolution of Internet technologies (Internet of Things) and ubiquitous computing (Ubiquitous Sensor Networks) allow the development of less expensive systems that are easier to control, install and maintain, using standard protocols with low power consumption. This work develops and tests a low-cost sensor/actuator network platform, based on the Internet of Things, integrating machine-to-machine and human-machine-interface protocols. Edge computing uses this multi-protocol approach to develop control processes for Precision Agriculture scenarios. A greenhouse with hydroponic crop production was developed and tested using Ubiquitous Sensor Network monitoring and edge control under the Internet of Things paradigm. The experimental results showed that Internet technologies and Smart Object Communication Patterns can be combined to encourage the development of Precision Agriculture, and demonstrated added benefits (cost, energy, smart development, acceptance by agricultural specialists) when a project is launched.
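
    The abstract does not name a specific protocol stack, but MQTT is a common choice for such machine-to-machine telemetry. A minimal sketch of a greenhouse sensor node publishing readings with paho-mqtt; the broker address, topic layout, and payload fields are illustrative assumptions, not the paper's platform.

    ```python
    import json
    import time
    import paho.mqtt.client as mqtt

    # paho-mqtt 1.x constructor; paho-mqtt 2.x additionally requires a
    # mqtt.CallbackAPIVersion argument as the first parameter.
    client = mqtt.Client()
    client.connect("broker.local", 1883)  # hypothetical edge broker
    client.loop_start()

    while True:
        reading = {"node": "hydroponic-01", "ec_mS_cm": 1.8, "ph": 5.9, "t_air_c": 24.3}
        client.publish("greenhouse/zone1/telemetry", json.dumps(reading), qos=1)
        time.sleep(60)  # one sample per minute
    ```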

  5. Modeling and control of precision actuators

    CERN Document Server

    Kiong, Tan Kok

    2013-01-01

    Contents: Introduction; Growing Interest in Precise Actuators; Types of Precise Actuators; Applications of Precise Actuators; Nonlinear Dynamics and Modeling; Hysteresis; Creep; Friction; Force Ripples; Identification and Compensation of Preisach Hysteresis in Piezoelectric Actuators; SVD-Based Identification and Compensation of Preisach Hysteresis; High-Bandwidth Identification and Compensation of Hysteretic Dynamics in Piezoelectric Actuators; Concluding Remarks; Identification and Compensation of Frict…

  6. The Structure of the Proton in the LHC Precision Era

    NARCIS (Netherlands)

    Gao, Jun; Harland-Lang, Lucian; Rojo, Juan

    2017-01-01

    We review recent progress in the determination of the parton distribution functions (PDFs) of the proton, with emphasis on the applications for precision phenomenology at the Large Hadron Collider (LHC). First of all, we introduce the general theoretical framework underlying the global QCD analysis

  7. Data Sharing For Precision Medicine: Policy Lessons And Future Directions.

    Science.gov (United States)

    Blasimme, Alessandro; Fadda, Marta; Schneider, Manuel; Vayena, Effy

    2018-05-01

    Data sharing is a precondition of precision medicine. Numerous organizations have produced abundant guidance on data sharing. Despite such efforts, data are not being shared to a degree that can trigger the expected data-driven revolution in precision medicine. We set out to explore why. Here we report the results of a comprehensive analysis of data-sharing guidelines issued over the past two decades by multiple organizations. We found that the guidelines overlap on a restricted set of policy themes. However, we observed substantial fragmentation in the policy landscape across specific organizations and data types. This may have contributed to the current stalemate in data sharing. To move toward a more efficient data-sharing ecosystem for precision medicine, policy makers should explore innovative ways to cope with central policy themes such as privacy, consent, and data quality; focus guidance on interoperability, attribution, and public engagement; and promote data-sharing policies that can be adapted to multiple data types.

  8. A Psychoacoustic-Based Multiple Audio Object Coding Approach via Intra-Object Sparsity

    Directory of Open Access Journals (Sweden)

    Maoshen Jia

    2017-12-01

    Rendering spatial sound scenes via audio objects has become popular in recent years, since it can provide more flexibility for different auditory scenarios, such as 3D movies, spatial audio communication and virtual classrooms. To facilitate high-quality bitrate-efficient distribution for spatial audio objects, an encoding scheme based on intra-object sparsity (approximate k-sparsity of the audio object itself) is proposed in this paper. A statistical analysis is presented to validate the notion that an audio object has stronger sparseness in the Modified Discrete Cosine Transform (MDCT) domain than in the Short Time Fourier Transform (STFT) domain. By exploiting intra-object sparsity in the MDCT domain, multiple simultaneously occurring audio objects are compressed into a mono downmix signal with side information. To ensure a balanced perception quality of the audio objects, a psychoacoustic-based time-frequency instants sorting algorithm and an energy-equalized Number of Preserved Time-Frequency Bins (NPTF) allocation strategy are proposed and employed in the underlying compression framework. The downmix signal can be further encoded via the Scalar Quantized Vector Huffman Coding (SQVH) technique at a desirable bitrate, and the side information is transmitted in a lossless manner. Both objective and subjective evaluations show that the proposed encoding scheme outperforms the Sparsity Analysis (SPA) approach and Spatial Audio Object Coding (SAOC) in cases where eight objects were jointly encoded.
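
    To illustrate intra-object sparsity in a lapped transform domain, here is a minimal sketch that keeps only the strongest time-frequency bins of a framed object signal, using scipy's DCT as a stand-in for the MDCT. The frame length and the fraction of preserved bins are illustrative assumptions, not the paper's NPTF allocation.

    ```python
    import numpy as np
    from scipy.fft import dct, idct

    def sparsify(frames, keep_ratio=0.1):
        """frames: (n_frames, frame_len) windowed object signal."""
        coeffs = dct(frames, norm="ortho", axis=1)
        k = max(1, int(keep_ratio * coeffs.size))
        # Keep the global top-k magnitudes, zero out everything else.
        thresh = np.sort(np.abs(coeffs), axis=None)[-k]
        kept = np.where(np.abs(coeffs) >= thresh, coeffs, 0.0)
        return idct(kept, norm="ortho", axis=1)
    ```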

  9. Mapping landslide source and transport areas in VHR images with Object-Based Analysis and Support Vector Machines

    Science.gov (United States)

    Heleno, Sandra; Matias, Magda; Pina, Pedro

    2015-04-01

    Visual interpretation of satellite imagery remains extremely demanding in terms of resources and time, especially when dealing with numerous multi-scale landslides affecting wide areas, as is the case for rainfall-induced shallow landslides. Applying automated methods can contribute to more efficient landslide mapping and updating of existing inventories, and in recent years the number and variety of approaches has been rapidly increasing. Very High Resolution (VHR) images, acquired by space-borne sensors with sub-metric precision, such as Ikonos, Quickbird, Geoeye and Worldview, are increasingly being considered the best option for landslide mapping, but these new levels of spatial detail also present new challenges to state-of-the-art image analysis tools, calling for automated methods specifically suited to mapping landslide events in VHR optical images. In this work we develop and test a methodology for semi-automatic landslide recognition and mapping of landslide source and transport areas. The method combines object-based image analysis and a Support Vector Machine supervised learning algorithm, and was tested using a GeoEye-1 multispectral image, sensed 3 days after a damaging landslide event on Madeira Island, together with a pre-event LiDAR DEM. Our approach proved successful in the recognition of landslides over a 15 km²-wide study area, with 81 out of 85 landslides detected in its validation regions. The classifier also showed reasonable performance (true positive rate above 60% and false positive rate below 36% in both validation regions) in the internal mapping of landslide source and transport areas, in particular on the sunnier east-facing slopes. In the less illuminated areas the classifier is still able to accurately map the source areas, but performs poorly in the mapping of landslide transport areas.
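
    A minimal sketch of the object-based classification step: each segmented image object becomes a feature vector (e.g. mean band values, texture, slope from the pre-event DEM), and an SVM separates source and transport areas from stable terrain. The feature layout, class coding, and training data below are illustrative, not the study's.

    ```python
    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(300, 6))     # per-object features from segmentation (placeholder)
    y_train = rng.integers(0, 3, size=300)  # 0 stable, 1 source area, 2 transport area

    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
    clf.fit(X_train, y_train)
    labels = clf.predict(rng.normal(size=(50, 6)))  # classify new image objects
    ```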

  10. Clinical proteomics-driven precision medicine for targeted cancer therapy: current overview and future perspectives.

    Science.gov (United States)

    Zhou, Li; Wang, Kui; Li, Qifu; Nice, Edouard C; Zhang, Haiyuan; Huang, Canhua

    2016-01-01

    Cancer is a common disease that is a leading cause of death worldwide. Currently, early detection and novel therapeutic strategies are urgently needed for more effective management of cancer. Importantly, protein profiling using clinical proteomic strategies, with spectacular sensitivity and precision, offer excellent promise for the identification of potential biomarkers that would direct the development of targeted therapeutic anticancer drugs for precision medicine. In particular, clinical sample sources, including tumor tissues and body fluids (blood, feces, urine and saliva), have been widely investigated using modern high-throughput mass spectrometry-based proteomic approaches combined with bioinformatic analysis, to pursue the possibilities of precision medicine for targeted cancer therapy. Discussed in this review are the current advantages and limitations of clinical proteomics, the available strategies of clinical proteomics for the management of precision medicine, as well as the challenges and future perspectives of clinical proteomics-driven precision medicine for targeted cancer therapy.

  11. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2013-01-01

    Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish among the feasible solutions (involving different criteria) obtained for block layout and to identify a solution's suitability according to the set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to solve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices, and showed that a minimum number of indices covering the overall quality criteria may be used when selecting alternative solutions.

  12. Introduction to the GEOBIA 2010 special issue: From pixels to geographic objects in remote sensing image analysis

    Science.gov (United States)

    Addink, Elisabeth A.; Van Coillie, Frieke M. B.; De Jong, Steven M.

    2012-04-01

    Traditional image analysis methods are mostly pixel-based and use the spectral differences of landscape elements at the Earth surface to classify these elements or to extract element properties from the Earth Observation image. Geographic object-based image analysis (GEOBIA) has received considerable attention over the past 15 years for analyzing and interpreting remote sensing imagery. In contrast to traditional image analysis, GEOBIA works more like the human eye-brain combination does. The latter uses an object's color (spectral information), size, texture, shape and position relative to other image objects to interpret and analyze what we see. GEOBIA starts by segmenting the image, grouping pixels together into objects, and then uses a wide range of object properties to classify the objects or to extract object properties from the image. Significant advances and improvements in image analysis and interpretation have been made thanks to GEOBIA. In June 2010 the third conference on GEOBIA took place at Ghent University, after successful previous meetings in Calgary (2008) and Salzburg (2006). This special issue presents a selection of the 2010 conference papers that have been worked out as full research papers for JAG. The papers cover GEOBIA applications as well as innovative methods and techniques. The topics range from vegetation mapping, forest parameter estimation, tree crown identification, urban mapping and land cover change to feature selection methods and the effects of image compression on segmentation. From the original 94 conference papers, 26 full research manuscripts were submitted; nine papers were selected on the basis of quality and topic and are presented in this special issue. The next GEOBIA conference will take place in Rio de Janeiro from 7 to 9 May 2012, where we hope to welcome even more scientists working in the field of GEOBIA.

  13. Current evidence on hospital antimicrobial stewardship objectives: a systematic review and meta-analysis

    NARCIS (Netherlands)

    Schuts, Emelie C.; Hulscher, Marlies E. J. L.; Mouton, Johan W.; Verduin, Cees M.; Stuart, James W. T. Cohen; Overdiek, Hans W. P. M.; van der Linden, Paul D.; Natsch, Stephanie; Hertogh, Cees M. P. M.; Wolfs, Tom F. W.; Schouten, Jeroen A.; Kullberg, Bart Jan; Prins, Jan M.

    2016-01-01

    Antimicrobial stewardship is advocated to improve the quality of antimicrobial use. We did a systematic review and meta-analysis to assess whether antimicrobial stewardship objectives had any effects in hospitals and long-term care facilities on four predefined patients' outcomes: clinical outcomes,

  14. Commercial objectives, technology transfer, and systems analysis for fusion power development

    Science.gov (United States)

    Dean, Stephen O.

    1988-09-01

    Fusion is an inexhaustible source of energy that has the potential for economic commercial applications with excellent safety and environmental characteristics. The primary focus for the fusion energy development program is the generation of central station electricity. Fusion has the potential, however, for many other applications. The fact that a large fraction of the energy released in a DT fusion reaction is carried by high energy neutrons suggests potentially unique applications. In addition, fusion R and D will lead to new products and new markets. Each fusion application must meet certain standards of economic, safety and environmental attractiveness. For this reason, economics on the one hand, and safety, environment and licensing on the other, are the two primary criteria for setting long-range commercial fusion objectives. A major function of systems analysis is to evaluate the potential of fusion against these objectives and to help guide the fusion R and D program toward practical applications. The transfer of fusion technology and skills from the national labs and universities to industry is the key to achieving the long-range objective of commercial fusion applications.

  15. A comparison of accuracy and precision of 5 gait-event detection algorithms from motion capture in horses during over ground walk

    DEFF Research Database (Denmark)

    Olsen, Emil; Boye, Jenny Katrine; Pfau, Thilo

    2012-01-01

    and use robust and validated algorithms. It is the objective of this study to compare accuracy (bias) and precision (SD) for five published human and equine motion capture foot-on/off and stance phase detection algorithms during walk. Six horses were walked over 8 seamlessly embedded force plates...... of mass generally provides the most accurate and precise results in walk....

  16. Image Processing Methods Usable for Object Detection on the Chessboard

    Directory of Open Access Journals (Sweden)

    Beran Ladislav

    2016-01-01

    Full Text Available Image segmentation and object detection are challenging problems in many areas of research. Although many algorithms for image segmentation have been invented, there is no single simple algorithm that solves both image segmentation and object detection. Our research is based on a combination of several methods for object detection. The first method suitable for image segmentation and object detection is colour detection. This method is very simple, but it struggles with varying colours: the colour of the segmented object must be determined precisely before any calculations, and in many cases it must be determined manually. An alternative simple method is based on background removal, i.e. on the difference between a reference image and the detected image. In this paper several methods suitable for object detection are described. This research focuses on coloured object detection on a chessboard. The results of this research, combined with neural networks, will be applied to a user-computer game of checkers.
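
    A hedged sketch of the background-removal idea described above (not the authors' implementation): difference a reference image of the empty board against the current frame, then extract piece contours with OpenCV. The file names are hypothetical.

```python
# Background subtraction for object detection: pixels that differ from
# the empty-board reference are candidate pieces.
import cv2

reference = cv2.imread("empty_board.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file
current = cv2.imread("board_with_pieces.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file

diff = cv2.absdiff(reference, current)
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)  # threshold is a tuning choice

# Clean up noise, then find connected object outlines.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    if cv2.contourArea(c) > 100:  # ignore small speckles
        x, y, w, h = cv2.boundingRect(c)
        print(f"piece candidate at x={x}, y={y}, size={w}x{h}")
```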

  17. Precision Diagnosis, Monitoring and Control of Structural Component Degradation in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Han, J. H.; Choi, M. S.; Lee, D. H.; Hur, D. H.; Na, J. W.; Kim, K. M.; Hong, J. H.; Kim, H. S.

    2007-06-01

    The occurrence of structural material degradation in NPPs and its progress during operation are directly related to the safety and integrity of NPPs. The various kinds of material degradation are usually examined by material integrity evaluation and non-destructive evaluation (NDE). Material integrity evaluation is the classical method for interpreting the cause and mechanism of degradation and failure; however, it is limited in detecting and diagnosing the actual condition of flaws and defects occurring during plant operation, particularly in their early stages of formation. NDE, widely used for detecting defects in structural materials, provides much information for safety regulation, plant management and repair; however, its reliability suffers from a generic problem of low detectability and limited signal-analysis capability. The objective of this research project is to develop advanced technologies that ensure precision diagnosis of the various kinds of defects in NPP structural materials and high performance in material degradation evaluation. Many of these advanced technologies were developed in the first phase of this project. They have contributed to interpreting the root causes of degradation and failure more precisely and to establishing proper measures for the safety and integrity of NPPs. The comprehensive technology, once accomplished as planned, will be applied in the nuclear industry and contribute to improving the safety and integrity of NPPs.

  18. The structure of the proton in the LHC precision era

    Science.gov (United States)

    Gao, Jun; Harland-Lang, Lucian; Rojo, Juan

    2018-05-01

    We review recent progress in the determination of the parton distribution functions (PDFs) of the proton, with emphasis on the applications for precision phenomenology at the Large Hadron Collider (LHC). First of all, we introduce the general theoretical framework underlying the global QCD analysis of the quark and gluon internal structure of protons. We then present a detailed overview of the hard-scattering measurements, and the corresponding theory predictions, that are used in state-of-the-art PDF fits. We emphasize here the role that higher-order QCD and electroweak corrections play in the description of recent high-precision collider data. We present the methodology used to extract PDFs in global analyses, including the PDF parametrization strategy and the definition and propagation of PDF uncertainties. Then we review and compare the most recent releases from the various PDF fitting collaborations, highlighting their differences and similarities. We discuss the role that QED corrections and photon-initiated contributions play in modern PDF analysis. We provide representative examples of the implications of PDF fits for high-precision LHC phenomenological applications, such as Higgs coupling measurements and searches for high-mass New Physics resonances. We conclude this report by discussing some selected topics relevant for the future of PDF determinations, including the treatment of theoretical uncertainties, the connection with lattice QCD calculations, and the role of PDFs at future high-energy colliders beyond the LHC.
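
    To make the uncertainty-propagation step concrete, the sketch below applies the symmetric Hessian "master formula" commonly used with eigenvector PDF sets. The numbers are mock values; a real analysis would evaluate the observable with each member of a released PDF set (for example via the LHAPDF library).

```python
# Symmetric Hessian PDF uncertainty: Delta X = (1/2) * sqrt( sum_i (X_i+ - X_i-)^2 )
import numpy as np

x_central = 52.3                         # observable with the central PDF member (mock)
x_plus = np.array([52.5, 52.1, 52.9])    # "+" eigenvector variations (mock)
x_minus = np.array([52.0, 52.4, 51.8])   # "-" eigenvector variations (mock)

delta = 0.5 * np.sqrt(np.sum((x_plus - x_minus) ** 2))
print(f"X = {x_central:.1f} +/- {delta:.1f}")
```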

  19. Precision wildlife medicine: applications of the human-centred precision medicine revolution to species conservation.

    Science.gov (United States)

    Whilde, Jenny; Martindale, Mark Q; Duffy, David J

    2017-05-01

    The current species extinction crisis is being exacerbated by an increased rate of emergence of epizootic disease. Human-induced factors including habitat degradation, loss of biodiversity and wildlife population reductions resulting in reduced genetic variation are accelerating disease emergence. Novel, efficient and effective approaches are required to combat these epizootic events. Here, we present the case for the application of human precision medicine approaches to wildlife medicine in order to enhance species conservation efforts. We consider how the precision medicine revolution, coupled with the advances made in genomics, may provide a powerful and feasible approach to identifying and treating wildlife diseases in a targeted, effective and streamlined manner. A number of case studies of threatened species are presented which demonstrate the applicability of precision medicine to wildlife conservation, including sea turtles, amphibians and Tasmanian devils. These examples show how species conservation could be improved by using precision medicine techniques to determine novel treatments and management strategies for the specific medical conditions hampering efforts to restore population levels. Additionally, a precision medicine approach to wildlife health has in turn the potential to provide deeper insights into human health and the possibility of stemming and alleviating the impacts of zoonotic diseases. The integration of the currently emerging Precision Medicine Initiative with the concepts of EcoHealth (aiming for sustainable health of people, animals and ecosystems through transdisciplinary action research) and One Health (recognizing the intimate connection of humans, animal and ecosystem health and addressing a wide range of risks at the animal-human-ecosystem interface through a coordinated, collaborative, interdisciplinary approach) has great potential to deliver a deeper and broader interdisciplinary-based understanding of both wildlife and human health.

  20. Nanomaterials for Cancer Precision Medicine.

    Science.gov (United States)

    Wang, Yilong; Sun, Shuyang; Zhang, Zhiyuan; Shi, Donglu

    2018-04-01

    Medical science has recently advanced to the point where diagnosis and therapeutics can be carried out with high precision, even at the molecular level. A new field of "precision medicine" has consequently emerged with specific clinical implications and challenges that can be well-addressed by newly developed nanomaterials. Here, a nanoscience approach to precision medicine is provided, with a focus on cancer therapy, based on a new concept of "molecularly-defined cancers." "Next-generation sequencing" is introduced to identify the oncogene that is responsible for a class of cancers. This new approach is fundamentally different from all conventional cancer therapies that rely on diagnosis of the anatomic origins where the tumors are found. To treat cancers at molecular level, a recently developed "microRNA replacement therapy" is applied, utilizing nanocarriers, in order to regulate the driver oncogene, which is the core of cancer precision therapeutics. Furthermore, the outcome of the nanomediated oncogenic regulation has to be accurately assessed by the genetically characterized, patient-derived xenograft models. Cancer therapy in this fashion is a quintessential example of precision medicine, presenting many challenges to the materials communities with new issues in structural design, surface functionalization, gene/drug storage and delivery, cell targeting, and medical imaging. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. High Precision Edge Detection Algorithm for Mechanical Parts

    Science.gov (United States)

    Duan, Zhenyun; Wang, Ning; Fu, Jingshun; Zhao, Wenhui; Duan, Boqiang; Zhao, Jungui

    2018-04-01

    High-precision and high-efficiency measurement is becoming an imperative requirement for many mechanical parts. In this study, a subpixel-level edge detection algorithm based on the Gaussian integral model is therefore proposed. For this purpose, the step-edge normal section line Gaussian integral model of the backlight image is constructed, combining the point spread function and the single step model. The gray values of discrete points on the normal section line of the pixel edge are then calculated by surface interpolation, and the coordinate and gray information affected by noise are fitted in accordance with the Gaussian integral model. A precise subpixel edge location is thus determined by searching for the mean point. Finally, a gear tooth was measured with an M&M3525 gear measurement center to verify the proposed algorithm. The theoretical analysis and experimental results show that local edge fluctuation is reduced effectively by the proposed method in comparison with existing subpixel edge detection algorithms, and that subpixel edge location accuracy and computation speed are improved. The maximum error of the gear tooth profile total deviation is 1.9 μm compared with the measurement result of the gear measurement center, indicating that the method is sufficiently reliable to meet the requirements of high-precision measurement.
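
    An illustrative sketch of a Gaussian-integral (error-function) step-edge fit, the general idea behind the model described above; this is not the authors' exact algorithm. Gray values sampled along the edge normal are fitted and the edge is located at the profile's mean point x0.

```python
# Subpixel edge localization by fitting an ideal step blurred by a
# Gaussian PSF: I(x) = a + b * erf((x - x0) / (sqrt(2) * sigma)).
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def step_edge(x, a, b, x0, sigma):
    return a + b * erf((x - x0) / (np.sqrt(2) * sigma))

# Synthetic profile: true edge at x = 10.37 px, plus noise.
x = np.arange(21, dtype=float)
rng = np.random.default_rng(0)
y = step_edge(x, 120.0, 80.0, 10.37, 1.2) + rng.normal(0, 2.0, x.size)

p0 = [y.mean(), (y.max() - y.min()) / 2, x.mean(), 1.0]  # rough initial guess
(a, b, x0, sigma), _ = curve_fit(step_edge, x, y, p0=p0)
print(f"subpixel edge location: {x0:.3f} px")
```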

  2. Optimization of deformation monitoring networks using finite element strain analysis

    Science.gov (United States)

    Alizadeh-Khameneh, M. Amin; Eshagh, Mehdi; Jensen, Anna B. O.

    2018-04-01

    An optimal design of a geodetic network can fulfill the requested precision and reliability of the network, and decrease the expenses of its execution by removing unnecessary observations. The role of optimal design is highlighted in deformation monitoring networks due to their repeatability. The core design problem is how to define precision and reliability criteria. This paper proposes a solution, where the precision criterion is defined based on the precision of deformation parameters, i.e. the precision of strain and differential rotations. A strain analysis can be performed to obtain some information about the possible deformation of a deformable object. In this study, we split an area into a number of three-dimensional finite elements with the help of the Delaunay triangulation and performed the strain analysis on each element. According to the obtained precision of deformation parameters in each element, the precision criterion of displacement detection at each network point is then determined. The developed criterion is implemented to optimize the observations from the Global Positioning System (GPS) in the Skåne monitoring network in Sweden. The network was established in 1989 and straddles the Tornquist zone, which is one of the most active faults in southern Sweden. The numerical results show that 17 out of all 21 possible GPS baseline observations are sufficient to detect a minimum displacement of 3 mm at each network point.
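
    A simplified 2D sketch of per-element strain from a Delaunay triangulation of monitoring points (the study uses three-dimensional elements; coordinates and displacements below are mock values): solve for the displacement gradient G in each triangle, then take its symmetric part as the strain tensor.

```python
import numpy as np
from scipy.spatial import Delaunay

points = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])   # epoch 1 positions
disp = np.array([[0.0, 0.0], [0.002, 0.0], [0.0, 0.001], [0.003, 0.001]])  # epoch 2 - epoch 1

tri = Delaunay(points)
for simplex in tri.simplices:
    p = points[simplex]
    u = disp[simplex]
    # Edge vectors relative to the first vertex satisfy  dP @ G.T = dU.
    dP = p[1:] - p[0]
    dU = u[1:] - u[0]
    G = np.linalg.solve(dP, dU).T        # displacement gradient
    strain = 0.5 * (G + G.T)             # infinitesimal strain tensor
    print(f"element {simplex}: strain =\n{strain}")
```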

  3. An aberrant precision account of autism.

    Directory of Open Access Journals (Sweden)

    Rebecca P Lawson

    2014-05-01

    Full Text Available Autism is a neurodevelopmental disorder characterised by problems with social-communication, restricted interests and repetitive behaviour. A recent and controversial article presented a compelling normative explanation for the perceptual symptoms of autism in terms of a failure of Bayesian inference (Pellicano and Burr, 2012). In response, we suggested that when Bayesian inference is grounded in its neural instantiation – namely, predictive coding – many features of autistic perception can be attributed to aberrant precision (or beliefs about precision) within the context of hierarchical message passing in the brain (Friston et al., 2013). Here, we unpack the aberrant precision account of autism. Specifically, we consider how empirical findings – that speak directly or indirectly to neurobiological mechanisms – are consistent with the aberrant encoding of precision in autism; in particular, an imbalance of the precision ascribed to sensory evidence relative to prior beliefs.
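
    A toy illustration (our sketch, not the cited models) of what precision weighting means computationally: in a Gaussian belief update, the prediction error is weighted by the ratio of sensory precision to total precision, so ascribing too much precision to sensory evidence relative to priors makes every observation drive a large update.

```python
def precision_weighted_update(prior_mu, prior_pi, obs, sensory_pi):
    """Gaussian belief update; pi denotes precision (1/variance)."""
    gain = sensory_pi / (prior_pi + sensory_pi)   # weight on the prediction error
    posterior_mu = prior_mu + gain * (obs - prior_mu)
    posterior_pi = prior_pi + sensory_pi
    return posterior_mu, posterior_pi

obs = 2.0
print(precision_weighted_update(0.0, 4.0, obs, sensory_pi=1.0))   # balanced: small update
print(precision_weighted_update(0.0, 0.1, obs, sensory_pi=10.0))  # aberrant: observation dominates
```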

  4. On selecting a prior for the precision parameter of Dirichlet process mixture models

    Science.gov (United States)

    Dorazio, R.M.

    2009-01-01

    In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
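
    As our illustration (not the paper's method), one way to judge a candidate prior on α is to simulate the prior it induces on the number of clusters K: draw α from the prior, then run the Chinese restaurant process for n observations and record how many clusters appear.

```python
# Induced prior on the number of clusters K under a Gamma prior on the
# Dirichlet process precision parameter alpha, via CRP simulation.
import numpy as np

rng = np.random.default_rng(1)

def crp_num_clusters(alpha: float, n: int) -> int:
    """Number of clusters after seating n customers in a CRP(alpha)."""
    counts = []
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)  # existing tables + new table
        choice = rng.choice(len(probs), p=probs / probs.sum())
        if choice == len(counts):
            counts.append(1)      # open a new table (cluster)
        else:
            counts[choice] += 1
    return len(counts)

n = 50
ks = [crp_num_clusters(rng.gamma(shape=2.0, scale=0.5), n) for _ in range(2000)]
print(f"induced prior on K for n={n}: mean={np.mean(ks):.1f}, sd={np.std(ks):.1f}")
```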

  5. Cold rolling precision forming of shaft parts theory and technologies

    CERN Document Server

    Song, Jianli; Li, Yongtang

    2017-01-01

    This book presents in detail the theory, processes and equipment involved in cold rolling precision forming technologies, focusing on spline and thread shaft parts. The main topics discussed include the status quo of research on cold rolling precision forming technologies; the design and calculation of process parameters; the numerical simulation of cold rolling forming processes; and the equipment used in cold rolling forming. The mechanism of cold rolling forming is extremely complex, and research on the processes, theory and mechanical analysis of spline cold rolling forming has remained very limited to date. In practice, the forming processes and production methods used are mainly chosen on the basis of individual experience. As such, there is a marked lack of both systematic, theory-based guidelines, and of specialized books covering theoretical analysis, numerical simulation, experiments and equipment used in spline cold rolling forming processes – all key points that are included in this book and ill...

  6. [Precision Oncology and "Molecular Tumor Boards" - Concepts, Chances and Challenges].

    Science.gov (United States)

    Holch, Julian Walter; Westphalen, Christoph Benedikt; Hiddemann, Wolfgang; Heinemann, Volker; Jung, Andreas; Metzeler, Klaus Hans

    2017-11-01

    Recent developments in genomics allow an increasingly comprehensive genetic analysis of human malignancies and have sparked hopes that this will contribute to the development of novel targeted, effective and well-tolerated therapies. While targeted therapies have improved the prognosis of many cancer patients with certain tumor types, "precision oncology" also brings new challenges. Highly personalized treatment strategies require new strategies for clinical trials and translation into routine clinical practice. We review the current technical approaches for "universal genetic testing" in cancer and potential pitfalls in the interpretation of such data. We then provide an overview of the available evidence supporting treatment strategies based on extended genetic analysis. Based on the available data, we conclude that "precision oncology" approaches that go beyond the current standard of care should be pursued within the framework of an interdisciplinary "molecular tumor board", and preferably within clinical trials. © Georg Thieme Verlag KG Stuttgart · New York.

  7. NCI and the Precision Medicine Initiative®

    Science.gov (United States)

    NCI's activities related to precision medicine focus on new and expanded precision medicine clinical trials, mechanisms to overcome drug resistance to cancer treatments, and the development of a shared digital repository of precision medicine trials data.

  8. Deficits in Coordinative Bimanual Timing Precision in Children With Specific Language Impairment.

    Science.gov (United States)

    Vuolo, Janet; Goffman, Lisa; Zelaznik, Howard N

    2017-02-01

    Our objective was to delineate components of motor performance in specific language impairment (SLI); specifically, whether deficits in timing precision in one effector (unimanual tapping) and in two effectors (bimanual clapping) are observed in young children with SLI. Twenty-seven 4- to 5-year-old children with SLI and 21 age-matched peers with typical language development participated. All children engaged in a unimanual tapping and a bimanual clapping timing task. Standard measures of language and motor performance were also obtained. No group differences in timing variability were observed in the unimanual tapping task. However, compared with typically developing peers, children with SLI were more variable in their timing precision in the bimanual clapping task. Nine of the children with SLI performed greater than 1 SD below the mean on a standardized motor assessment. The children with low motor performance showed the same profile as observed across all children with SLI, with unaffected unimanual and impaired bimanual timing precision. Although unimanual timing is unaffected, children with SLI show a deficit in timing that requires bimanual coordination. We propose that the timing deficits observed in children with SLI are associated with the increased demands inherent in bimanual performance.

  9. From prospective biobanking to precision medicine: BIO-RAIDs – an EU study protocol in cervical cancer

    International Nuclear Information System (INIS)

    Ngo, Charlotte; Samuels, Sanne; Bagrintseva, Ksenia; Slocker, Andrea; Hupé, Philippe; Kenter, Gemma; Popovic, Marina; Samet, Nina; Tresca, Patricia; Leyen, Heiko von der; Deutsch, Eric; Rouzier, Roman; Belin, Lisa; Kamal, Maud; Scholl, Suzy

    2015-01-01

    Cervical cancer (CC) is, second to breast cancer, a dominant cause of gynecological cancer-related deaths worldwide. CC tumor biopsies and blood samples are easy to access and vital for the development of future precision medicine strategies. BIO-RAIDs is a prospective multicenter European study, presently recruiting patients in 6 EU countries. Tumor and liquid biopsies from patients with previously untreated cervical cancer (stages IB2-IV) are collected at defined time points. Patients receive standard primary treatment according to the stage of their disease. 700 patients are planned to be enrolled. The main objectives are the discovery of dominant molecular alterations, signalling pathway activation, and tumor micro-environment patterns that may predict response or resistance to treatment. An exhaustive molecular analysis is performed using (1) next-generation sequencing, (2) reverse-phase protein arrays and (3) immunohistochemistry. The clinical study BIO-RAIDs has been activated in all planned countries, and 170 patients have been recruited to date. This study will make an important contribution towards precision medicine treatments in cervical cancer. The results will support the development of clinical practice guidelines for cervical cancer patients to improve their prognosis and quality of life. Clinicaltrials.gov: NCT02428842, registered 10 February 2015.

  10. MRI histogram analysis enables objective and continuous classification of intervertebral disc degeneration.

    Science.gov (United States)

    Waldenberg, Christian; Hebelka, Hanna; Brisby, Helena; Lagerstrand, Kerstin Magdalena

    2018-05-01

    Magnetic resonance imaging (MRI) is the best diagnostic imaging method for low back pain. However, the technique is currently not utilized in its full capacity, often failing to depict painful intervertebral discs (IVDs), potentially due to the rough degeneration classification system used clinically today. MR image histograms, which reflect the IVD heterogeneity, may offer sensitive imaging biomarkers for IVD degeneration classification. This study investigates the feasibility of using histogram analysis as a means of objective and continuous grading of IVD degeneration. Forty-nine IVDs in ten low back pain patients (six males, 25-69 years) were examined with MRI (T2-weighted images and T2-maps). Each IVD was semi-automatically segmented on three mid-sagittal slices. Histogram features of the IVD were extracted from the defined regions of interest and correlated to Pfirrmann grade. Both T2-weighted images and T2-maps displayed similar histogram features. Histograms of well-hydrated IVDs displayed two separate peaks, representing annulus fibrosus and nucleus pulposus. Degenerated IVDs displayed decreased peak separation, and the separation was shown to correlate strongly with Pfirrmann grade. Histogram features correlated well with IVD degeneration, suggesting that IVD histogram analysis is a suitable tool for objective and continuous IVD degeneration classification. As histogram analysis revealed IVD heterogeneity, it may serve as a clinical tool for characterizing regional IVD degeneration effects. To elucidate the usefulness of histogram analysis in patient management, IVD histogram features of asymptomatic and symptomatic individuals need to be compared.
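
    A hedged sketch of the peak-separation idea (our assumed details, with synthetic data): build a histogram of T2 values inside the segmented disc ROI, smooth it, locate the two dominant peaks (annulus vs. nucleus) and report their distance.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.signal import find_peaks

rng = np.random.default_rng(2)
# Synthetic ROI: a well-hydrated disc with two tissue populations.
t2_values = np.concatenate([rng.normal(60, 8, 2000),     # annulus fibrosus
                            rng.normal(120, 12, 1500)])  # nucleus pulposus

hist, edges = np.histogram(t2_values, bins=64)
smoothed = gaussian_filter1d(hist.astype(float), sigma=2)

peaks, _ = find_peaks(smoothed, prominence=smoothed.max() * 0.1)
centers = 0.5 * (edges[:-1] + edges[1:])
if len(peaks) >= 2:
    top2 = peaks[np.argsort(smoothed[peaks])[-2:]]
    print(f"peak separation: {abs(centers[top2[0]] - centers[top2[1]]):.1f} ms")
else:
    print("single peak: consistent with a degenerated, homogeneous disc")
```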

  11. Multiscale change-point analysis of inhomogeneous Poisson processes using unbalanced wavelet decompositions

    NARCIS (Netherlands)

    Jansen, M.H.; Di Bucchianico, A.; Mattheij, R.M.M.; Peletier, M.A.

    2006-01-01

    We present a continuous wavelet analysis of count data with time-varying intensities. The objective is to extract intervals with significant intensities from background intervals. This includes the precise starting point of the significant interval, its exact duration and the (average) level of intensity.
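
    The record is truncated, so as a stand-in illustration we sketch a much simpler multiscale scan than the paper's unbalanced wavelet decomposition: slide dyadic windows over event counts and flag windows whose counts are improbable under a homogeneous Poisson background.

```python
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(3)
background_rate = 2.0
counts = rng.poisson(background_rate, size=128)
counts[40:56] += rng.poisson(4.0, size=16)   # an injected burst

for width in (4, 8, 16, 32):                 # dyadic window widths
    for start in range(0, counts.size - width + 1, width // 2):
        window_sum = counts[start:start + width].sum()
        # Survival function: P(X >= window_sum) under the background rate.
        p = poisson.sf(window_sum - 1, background_rate * width)
        if p < 1e-4:
            print(f"significant interval [{start}, {start + width}), p={p:.1e}")
```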

  12. [Precision Nursing: Individual-Based Knowledge Translation].

    Science.gov (United States)

    Chiang, Li-Chi; Yeh, Mei-Ling; Su, Sui-Lung

    2016-12-01

    U.S. President Obama announced a new era of precision medicine in the Precision Medicine Initiative (PMI). This initiative aims to accelerate the progress of personalized medicine in light of individual requirements for prevention and treatment in order to improve the state of individual and public health. The recent and dramatic development of large-scale biologic databases (such as the human genome sequence), powerful methods for characterizing patients (such as genomics, microbiome, diverse biomarkers, and even pharmacogenomics), and computational tools for analyzing big data are maximizing the potential benefits of precision medicine. Nursing science should follow and keep pace with this trend in order to develop empirical knowledge and expertise in the area of personalized nursing care. Nursing scientists must encourage, examine, and put into practice innovative research on precision nursing in order to provide evidence-based guidance to clinical practice. The applications in personalized precision nursing care include: explanations of personalized information such as the results of genetic testing; patient advocacy and support; anticipation of results and treatment; ongoing chronic monitoring; and support for shared decision-making throughout the disease trajectory. Further, attention must focus on the family and the ethical implications of taking a personalized approach to care. Nurses will need to embrace the paradigm shift to precision nursing and work collaboratively across disciplines to provide the optimal personalized care to patients. If realized, the full potential of precision nursing will provide the best chance for good health for all.

  13. Precision production: enabling deterministic throughput for precision aspheres with MRF

    Science.gov (United States)

    Maloney, Chris; Entezarian, Navid; Dumas, Paul

    2017-10-01

    Aspherical lenses offer advantages over spherical optics by improving image quality or reducing the number of elements necessary in an optical system. Aspheres are no longer used exclusively by high-end optical systems but are now replacing spherical optics in many applications. The need for a method of production-manufacturing of precision aspheres has emerged and is part of the reason that the optics industry is shifting away from artisan-based techniques towards more deterministic methods. Not only does Magnetorheological Finishing (MRF) empower deterministic figure correction for the most demanding aspheres, but it also enables deterministic and efficient throughput for series production of aspheres. The Q-flex MRF platform is designed to support batch production in a simple and user-friendly manner. Thorlabs routinely utilizes the advancements of this platform and has provided results from using MRF to finish a batch of aspheres as a case study. We have developed an analysis notebook to evaluate the necessary specifications for implementing quality control metrics. MRF brings confidence to optical manufacturing by ensuring high throughput for batch processing of aspheres.

  14. Automated optical testing of LWIR objective lenses using focal plane array sensors

    Science.gov (United States)

    Winters, Daniel; Erichsen, Patrik; Domagalski, Christian; Peter, Frank; Heinisch, Josef; Dumitrescu, Eugen

    2012-10-01

    The image quality of today's state-of-the-art IR objective lenses is constantly improving, while at the same time the market for thermography and vision grows strongly. Because of increasing demands on the quality of IR optics and increasing production volumes, the standards for image quality testing increase and tests need to be performed in a shorter time. Most high-precision MTF testing equipment for the IR spectral bands in use today relies on the scanning slit method, which scans a 1D detector over a pattern in the image generated by the lens under test, followed by image analysis to extract performance parameters. The disadvantages of this approach are that it is relatively slow, it requires highly trained operators to align the sample, and the number of parameters that can be extracted is limited. In this paper we present lessons learned from the R and D process on using focal plane array (FPA) sensors for testing of long-wave IR (LWIR, 8-12 μm) optics. Factors that need to be taken into account when switching from scanning slit to FPAs include: the thermal background from the environment, the low scene contrast in the LWIR, the need for advanced image processing algorithms to pre-process camera images for analysis, and camera artifacts. Finally, we discuss 2 measurement systems for LWIR lens characterization that we recently developed with different target applications: 1) A fully automated system suitable for production testing and metrology that uses uncooled microbolometer cameras to automatically measure MTF (on-axis and at several off-axis positions) and parameters like EFL, FFL, autofocus curves, image plane tilt, etc. for LWIR objectives with an EFL between 1 and 12 mm. The measurement cycle time for one sample is typically between 6 and 8 s. 2) A high-precision research-grade system, again using an uncooled LWIR camera as detector, that is very simple to align and operate. A wide range of lens parameters (MTF, EFL, astigmatism, distortion, etc.) can be measured with this system.
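
    A simplified sketch of how MTF can be derived from an imaged edge on a focal plane array: the generic edge-spread-function chain (ESF, differentiate to LSF, Fourier transform to MTF), not the cited commercial systems' algorithms. The edge profile below is synthetic.

```python
import numpy as np

# Synthetic edge spread function (ESF) sampled across the edge, in pixels.
x = np.linspace(-8, 8, 256)
esf = 0.5 * (1 + np.tanh(x / 1.5))           # stand-in for a measured edge profile

lsf = np.gradient(esf, x)                    # line spread function
lsf /= lsf.sum()                             # normalize area to 1

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                                # MTF(0) = 1 by convention
freqs = np.fft.rfftfreq(lsf.size, d=x[1] - x[0])  # cycles per pixel

# Report MTF at a few spatial frequencies.
for f_target in (0.1, 0.25, 0.5):
    i = np.argmin(np.abs(freqs - f_target))
    print(f"MTF @ {freqs[i]:.2f} cy/px = {mtf[i]:.2f}")
```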

  15. Surface characterization protocol for precision aspheric optics

    Science.gov (United States)

    Sarepaka, RamaGopal V.; Sakthibalan, Siva; Doodala, Somaiah; Panwar, Rakesh S.; Kotaria, Rajendra

    2017-10-01

    In Advanced Optical Instrumentation, aspherics provide an effective performance alternative. Aspheric fabrication and surface metrology, followed by aspheric design, are complementary iterative processes for precision asphere development. As in fabrication, a holistic approach to aspheric surface characterization is adopted to evaluate the actual surface error and to aim at the delivery of aspheric optics with the desired surface quality. Precision optical surfaces are characterized by profilometry or by interferometry. Aspheric profiles are characterized by contact profilometers, through linear surface scans, to analyze their Form, Figure and Finish errors. One must ensure that the surface characterization procedure does not add to the resident profile errors (generated during aspheric surface fabrication). This presentation examines the errors introduced after surface generation and during profilometry of aspheric profiles, with the aim of identifying sources of error and optimizing the metrology process. The sources of error during profilometry may include: profilometer settings, work-piece placement on the profilometer stage, selection of zenith/nadir points of aspheric profiles, metrology protocols, clear-aperture diameter analysis, computational limitations of the profiler, software issues, etc. At OPTICA, a PGI 1200 FTS contact profilometer (Taylor-Hobson make) is used for this study. Precision optics of various profiles are studied, with due attention to possible sources of errors during characterization, using a multi-directional scan approach for uniformity and repeatability of error estimation. This study provides an insight into aspheric surface characterization and helps establish an optimal methodology for aspheric surface production.
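
    For context, the standard even-asphere sag equation is what profilometer scans of such surfaces are typically compared against; form/figure error is then the residual between the measured and nominal sag. This is the generic textbook formula, not OPTICA's protocol, and the scan below is a hypothetical stand-in.

```python
import numpy as np

def asphere_sag(r, R, k, coeffs):
    """z(r) = r^2 / (R * (1 + sqrt(1 - (1+k) r^2 / R^2))) + sum A_{2i} r^{2i}."""
    z = r**2 / (R * (1 + np.sqrt(1 - (1 + k) * r**2 / R**2)))
    for i, a in enumerate(coeffs, start=2):   # A4*r^4 + A6*r^6 + ...
        z += a * r**(2 * i)
    return z

r = np.linspace(0, 20.0, 401)                        # radial coordinate, mm
nominal = asphere_sag(r, R=50.0, k=-0.8, coeffs=[1e-7, -2e-11])
measured = nominal + 0.0002 * np.sin(r)              # hypothetical scan with figure error

residual = measured - nominal                        # form/figure error profile
print(f"PV figure error: {np.ptp(residual) * 1e3:.2f} um")
```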

  16. Context-based object-of-interest detection for a generic traffic surveillance analysis system

    NARCIS (Netherlands)

    Bao, X.; Javanbakhti, S.; Zinger, S.; Wijnhoven, R.G.J.; With, de P.H.N.

    2014-01-01

    We present a new traffic surveillance video analysis system, focusing on building a framework with robust and generic techniques, based on both scene understanding and moving object-of-interest detection. Since traffic surveillance is widely applied, we want to design a single system that can be

  17. A comparison of manual anthropometric measurements with Kinect-based scanned measurements in terms of precision and reliability.

    Science.gov (United States)

    Bragança, Sara; Arezes, Pedro; Carvalho, Miguel; Ashdown, Susan P; Castellucci, Ignacio; Leão, Celina

    2018-01-01

    Collecting anthropometric data for real-life applications demands a high degree of precision and reliability, so it is important to test new equipment that will be used for data collection. OBJECTIVE: To compare two anthropometric data-gathering techniques - manual methods and a Kinect-based 3D body scanner - to understand which of them gives more precise and reliable results. The data were collected using a measuring tape and a Kinect-based 3D body scanner. They were evaluated in terms of precision by considering the regular and relative Technical Error of Measurement, and in terms of reliability by using the Intraclass Correlation Coefficient, Reliability Coefficient, Standard Error of Measurement and Coefficient of Variation. The results showed that both methods presented better results for reliability than for precision. Both methods showed relatively good results for these two variables; however, manual methods gave better results for some body measurements. Despite being considered sufficiently precise and reliable for certain applications (e.g. the apparel industry), the 3D scanner tested showed, for almost every anthropometric measurement, a different result from the manual technique. Many companies design their products based on data obtained from 3D scanners; hence, understanding the precision and reliability of the equipment used is essential to obtain feasible results.
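
    A sketch of the precision statistic named above, using the standard definitions for paired repeated measurements: Technical Error of Measurement TEM = sqrt( sum(d^2) / (2n) ) and relative TEM (%TEM) = 100 * TEM / grand mean. The measurement values are mock numbers.

```python
import numpy as np

trial1 = np.array([82.1, 95.4, 70.3, 88.0, 77.6])  # e.g. waist girth, cm (mock)
trial2 = np.array([82.4, 95.0, 70.9, 87.6, 77.9])  # repeated measurements (mock)

d = trial1 - trial2
tem = np.sqrt(np.sum(d**2) / (2 * d.size))                    # absolute TEM
rel_tem = 100 * tem / np.concatenate([trial1, trial2]).mean() # relative TEM, %

print(f"TEM = {tem:.2f} cm, relative TEM = {rel_tem:.2f}%")
```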

  18. New methods for precision Moeller polarimetry*

    International Nuclear Information System (INIS)

    Gaskell, D.; Meekins, D.G.; Yan, C.

    2007-01-01

    Precision electron beam polarimetry is becoming increasingly important as parity violation experiments attempt to probe the frontiers of the standard model. In the few-GeV regime, Moeller polarimetry is well suited to high-precision measurements; however, it is generally limited to use at relatively low beam currents (<10 μA). We present a novel technique that will enable precision Moeller polarimetry at very large currents, up to 100 μA. (orig.)

  19. Age-Related Decline of Precision and Binding in Visual Working Memory

    Science.gov (United States)

    2013-01-01

    Working memory declines with normal aging, but the nature of this impairment is debated. Studies based on detecting changes to arrays of visual objects have identified two possible components to age-related decline: a reduction in the number of items that can be stored, or a deficit in maintaining the associations (bindings) between individual object features. However, some investigations have reported intact binding with aging, and specific deficits arising only in Alzheimer’s disease. Here, using a recently developed continuous measure of recall fidelity, we tested the precision with which adults of different ages could reproduce from memory the orientation and color of a probed array item. The results reveal a further component of cognitive decline: an age-related decrease in the resolution with which visual information can be maintained in working memory. This increase in recall variability with age was strongest under conditions of greater memory load. Moreover, analysis of the distribution of errors revealed that older participants were more likely to incorrectly report one of the unprobed items in memory, consistent with an age-related increase in misbinding. These results indicate a systematic decline with age in working memory resources that can be recruited to store visual information. The paradigm presented here provides a sensitive index of both memory resolution and feature binding, with the potential for assessing their modulation by interventions. The findings have implications for understanding the mechanisms underpinning working memory deficits in both health and disease. PMID:23978008
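
    A hedged sketch of the continuous-report error analysis described above (our assumed details, with mock data): recall error is the circular distance between the reported and true feature, and responses clustering around unprobed items indicate misbinding (swap errors).

```python
import numpy as np

def circ_dist(a, b):
    """Signed circular distance on (-pi, pi]."""
    return np.angle(np.exp(1j * (a - b)))

rng = np.random.default_rng(4)
n = 300
targets = rng.uniform(-np.pi, np.pi, n)
nontargets = rng.uniform(-np.pi, np.pi, n)   # one unprobed item per trial

# Mock responses: mostly near the target, some near the non-target (swaps).
swap = rng.random(n) < 0.15
responses = np.where(swap, nontargets, targets) + rng.normal(0, 0.3, n)

err_target = circ_dist(responses, targets)
err_nontarget = circ_dist(responses, nontargets)

precision_sd = np.std(err_target[~swap])     # resolution of correctly bound items
swap_like = np.mean(np.abs(err_nontarget) < np.abs(err_target))
print(f"recall SD = {precision_sd:.2f} rad, responses nearer a non-target: {swap_like:.0%}")
```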

  20. Impact of PET/CT system, reconstruction protocol, data analysis method, and repositioning on PET/CT precision: An experimental evaluation using an oncology and brain phantom.

    Science.gov (United States)

    Mansor, Syahir; Pfaehler, Elisabeth; Heijtel, Dennis; Lodge, Martin A; Boellaard, Ronald; Yaqub, Maqsood

    2017-12-01

    In longitudinal oncological and brain PET/CT studies, it is important to understand the repeatability of quantitative PET metrics in order to assess change in tracer uptake. The present studies were performed in order to assess precision as a function of PET/CT system, reconstruction protocol, analysis method, scan duration (or image noise), and repositioning in the field of view. Multiple (repeated) scans were performed on two systems using a NEMA image quality (IQ) phantom and a 3D Hoffman brain phantom filled with 18F solutions. Studies were performed with and without random repositioning of the phantom in the field of view. Precision was worse for smaller spheres, and the precision of PET metrics depended on the combination of reconstruction protocol, data analysis method and scan duration (scan statistics). Moreover, precision was also affected by phantom repositioning, but its impact depended on the data analysis method in combination with the reconstructed voxel size (tissue fraction effect). This study suggests that for oncological PET studies the use of SUVpeak may be preferred over SUVmax, because SUVpeak is less sensitive to patient repositioning/tumor sampling. © 2017 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
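
    A sketch contrasting the two uptake metrics on a toy volume, using their generic definitions rather than the study's exact pipeline: SUVmax is the hottest single voxel, while SUVpeak averages over a roughly 1 cm^3 neighbourhood around it, which is why it is less sensitive to noise and repositioning.

```python
import numpy as np
from scipy.ndimage import uniform_filter

rng = np.random.default_rng(5)
vol = rng.normal(1.0, 0.1, (40, 40, 40))   # background SUV ~ 1 (mock volume)
vol[18:23, 18:23, 18:23] += 4.0            # a small "tumor"

suv_max = vol.max()                        # hottest single voxel

# Mean-filter with a 3-voxel cube (~1 cm^3 if voxels are ~3.3 mm) and take
# the maximum local mean as SUVpeak.
local_mean = uniform_filter(vol, size=3)
suv_peak = local_mean.max()

print(f"SUVmax = {suv_max:.2f}, SUVpeak = {suv_peak:.2f}")
```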