WorldWideScience

Sample records for point target analysis

  1. Advancing prevention of sexually transmitted infections through point-of-care testing: target product profiles and landscape analysis.

    Science.gov (United States)

    Toskin, Igor; Murtagh, Maurine; Peeling, Rosanna W; Blondeel, Karel; Cordero, Joanna; Kiarie, James

    2017-12-01

    Advancing the field of point-of-care testing (POCT) for STIs can rapidly and substantially improve STI control and prevention by providing targeted, essential STI services (case detection and screening). POCT enables definitive diagnosis and appropriate treatment in a single visit, as well as home- and community-based testing. Since 2014, the WHO Department of Reproductive Health and Research, in collaboration with technical partners, has completed four landscape analyses of promising diagnostics for use at or near the point of patient care to detect syphilis, Neisseria gonorrhoeae, Chlamydia trachomatis, Trichomonas vaginalis and the human papillomavirus. The analyses comprised a literature review and interviews. Two International Technical Consultations on STI POCTs (2014 and 2015) resulted in the development of target product profiles (TPPs). Experts in STI microbiology, laboratory diagnostics, clinical management, public health and epidemiology participated in the consultations, with representation from all WHO regions. The landscape analysis identified diagnostic tests that are either available on the market, to be released in the near future or in the pipeline. The TPPs specify 28 analytical and operational characteristics of POCTs for use in different populations for surveillance, screening and case management. None of the tests identified in the landscape analysis met all of the targets of the TPPs. Greater efforts by the global health community are needed to accelerate access to affordable quality-assured STI POCTs, particularly in low- and middle-income countries, by supporting the development of new diagnostic platforms as well as strengthening the validation and implementation of existing diagnostics according to internationally endorsed standards and the best available evidence. © World Health Organization 2017. Licensee BMJ Publishing Group Limited. This is an open access article distributed under the terms of the Creative Commons Attribution IGO

  2. Function Point Analysis Depot

    Science.gov (United States)

    Muniz, R.; Martinez, E.; Szafran, J.; Dalton, A.

    2011-01-01

    The Function Point Analysis (FPA) Depot is a web application originally designed by one of the NE-C3 branch's engineers, Jamie Szafran, and created specifically for the Software Development team of the Launch Control Systems (LCS) project. The application evaluates the work of each developer in order to produce a realistic estimate of the hours to be assigned to a specific development task. The Architect Team had made design change requests for the Depot that changed the schema of the application's information; the information changed in the database then needed to be changed in the graphical user interface (GUI), written in Ruby on Rails (RoR), and in the web service/server side, written in Java, to match the database changes. These changes were made by two interns from NE-C: Ricardo Muniz, from NE-C3, who made all the schema changes for the GUI in RoR, and Edwin Martinez, from NE-C2, who made all the changes on the Java side.
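As an illustration of the kind of estimate the Depot automates, here is a minimal sketch of an unadjusted function point count in Python. The weights are the standard IFPUG average-complexity values; the Depot's actual schema and weighting are not described in the abstract, so the function name and the sample task are invented.

```python
# Standard IFPUG average-complexity weights; the Depot's actual weighting
# scheme is not described in the abstract, so these are assumptions.
AVERAGE_WEIGHTS = {
    "EI": 4,    # external inputs
    "EO": 5,    # external outputs
    "EQ": 4,    # external inquiries
    "ILF": 10,  # internal logical files
    "EIF": 7,   # external interface files
}

def unadjusted_function_points(counts):
    """Sum each function type's count times its average complexity weight."""
    return sum(AVERAGE_WEIGHTS[ftype] * n for ftype, n in counts.items())

# A hypothetical development task with a handful of counted functions.
task = {"EI": 3, "EO": 2, "EQ": 1, "ILF": 1, "EIF": 0}
ufp = unadjusted_function_points(task)
print(ufp)  # 3*4 + 2*5 + 1*4 + 1*10 = 36
```

An effort estimate then follows by multiplying the (adjusted) count by a team-specific hours-per-function-point rate.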

  3. Dim point target detection against bright background

    Science.gov (United States)

    Zhang, Yao; Zhang, Qiheng; Xu, Zhiyong; Xu, Junping

    2010-05-01

    For target detection within a large-field cluttered background from a long distance, several difficulties, involving low contrast between target and background, small target occupancy, illumination nonuniformity caused by vignetting of the lens, and system noise, make it a challenging problem. The existing approaches to dim target detection can be roughly divided into two categories: detection before tracking (DBT) and tracking before detection (TBD). The DBT-based scheme has been widely used in practical applications due to its simplicity, but it often requires working in situations with a higher signal-to-noise ratio (SNR). In contrast, the TBD-based methods can provide impressive detection results even in cases of very low SNR; unfortunately, the large memory requirement and high computational load prevent these methods from being used in real-time tasks. In this paper, we propose a new method for dim target detection. We address this problem by combining the advantages of the DBT-based scheme in computational efficiency and of the TBD-based scheme in detection capability. Our method first predicts the local background, and then employs energy accumulation and median filtering to remove background clutter. The dim target is finally located by double-window filtering together with an improved high-order correlation which speeds up the convergence. The proposed method is implemented on a hardware platform and performs well in field experiments.
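The background-prediction and clutter-removal steps described above can be sketched as follows. This is a simplified stand-in (frame averaging for energy accumulation, a plain local-median background estimate, a k-sigma threshold) rather than the authors' pipeline, which additionally uses double-window filtering and high-order correlation; the synthetic scene is invented.

```python
import numpy as np

def median_background(img, size=5):
    """Local-median background estimate from shifted copies of the image
    (a simple stand-in for the paper's background prediction step)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    shifts = [padded[r:r + img.shape[0], c:c + img.shape[1]]
              for r in range(size) for c in range(size)]
    return np.median(shifts, axis=0)

def detect_dim_targets(frames, k=6):
    """Average frames (energy accumulation), subtract the predicted
    background, and threshold the residual at k standard deviations."""
    acc = frames.mean(axis=0)
    residual = acc - median_background(acc)
    thresh = residual.mean() + k * residual.std()
    return np.argwhere(residual > thresh)

# Synthetic scene: a noisy illumination ramp with one dim point target.
rng = np.random.default_rng(0)
ramp = np.tile(np.linspace(0.0, 10.0, 64), (64, 1))
frames = np.stack([ramp + rng.normal(0, 0.3, (64, 64)) for _ in range(10)])
frames[:, 20, 30] += 1.5                # dim target at row 20, column 30
detections = detect_dim_targets(frames)
print(detections)                       # contains [20, 30]
```

Averaging the ten frames raises the per-pixel SNR by roughly √10, which is what lets a target well below the single-frame noise floor clear the threshold.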

  4. Target reference points of social policy

    Directory of Open Access Journals (Sweden)

    V. P. Vasiliev

    2015-01-01

    Full Text Available This article examines the problems of developing an integrated approach to the formation of interconnected factors of social policy, identifying the need to integrate economic and social factors. As a socio-economic system, policy includes the phenomena of human and social capital. International rankings of economic and social development are analysed from the perspective of the key problems of social dynamics. Directions of Russian social policy in the long and short term are presented, and the social risks of devaluation of the national currency are named.

  5. Change point analysis and assessment

    DEFF Research Database (Denmark)

    Müller, Sabine; Neergaard, Helle; Ulhøi, John Parm

    2011-01-01

    The aim of this article is to develop an analytical framework for studying processes such as continuous innovation and business development in high-tech SME clusters that transcends the traditional qualitative-quantitative divide. It integrates four existing and well-recognized approaches to studying events, processes and change, namely change-point analysis, event-history analysis, critical-incident technique and sequence analysis.
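Of the four approaches the framework integrates, change-point analysis is the most readily illustrated. A minimal CUSUM-style locator for a single mean shift, with invented data, might look like:

```python
def cusum_change_point(xs):
    """Most likely single change point: the index where the cumulative sum
    of deviations from the overall mean is largest in magnitude."""
    mean = sum(xs) / len(xs)
    s, best_k, best_mag = 0.0, 0, 0.0
    for k, x in enumerate(xs, start=1):
        s += x - mean
        if abs(s) > best_mag:
            best_mag, best_k = abs(s), k
    return best_k  # the shift occurs after the best_k-th observation

series = [1.0, 1.2, 0.9, 1.1, 1.0, 3.1, 2.9, 3.0, 3.2, 3.1]
print(cusum_change_point(series))  # 5: the mean shifts after the 5th point
```

Real analyses add a significance test (e.g. a permutation test on the CUSUM magnitude) before accepting the located point as a genuine change.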

  6. Parametric statistical change point analysis

    CERN Document Server

    Chen, Jie

    2000-01-01

    This work is an in-depth study of the change point problem from a general point of view and a further examination of change point analysis of the most commonly used statistical models. Change point problems are encountered in such disciplines as economics, finance, medicine, psychology, signal processing, and geology, to mention only several. The exposition is clear and systematic, with a great deal of introductory material included. Different models are presented in each chapter, including gamma and exponential models, rarely examined thus far in the literature. Other models covered in detail are the multivariate normal, univariate normal, regression, and discrete models. Extensive examples throughout the text emphasize key concepts, and different methodologies are used, namely the likelihood ratio criterion and the Bayesian and information criterion approaches. A comprehensive bibliography and two indices complete the study.

  7. Point target detection using super-resolution reconstruction

    NARCIS (Netherlands)

    Lange, D.J.J. de; Dijk, J.; Eekeren, A.W.M. van; Schutte, K.

    2007-01-01

    Surveillance applications are primarily concerned with detection of targets. In electro-optical surveillance systems, missiles or other weapons coming towards you are observed as moving points. Typically, such moving targets need to be detected in a very short time. One of the problems is that the

  8. High precision target center determination from a point cloud

    Directory of Open Access Journals (Sweden)

    K. Kregar

    2013-10-01

    Full Text Available Many applications of terrestrial laser scanners (TLS) require the determination of a specific point from a point cloud. In this paper, a procedure for high-precision planar target center acquisition from a point cloud is presented. The process is based on an image-matching algorithm, but before we can deal with the raster image to fit a target on it, we need to properly determine the best-fitting plane and project the points onto it. The main emphasis of this paper is on the precision estimation and its propagation through the whole procedure, which allows us to obtain a precision assessment of the final results (target center coordinates). Theoretical precision estimates obtained through the procedure were rather high, so we compared them with empirical precision estimates obtained as standard deviations of the results of 60 independently scanned targets. A χ²-test confirmed that the theoretical precisions are overestimated. The problem most probably lies in the overestimated precisions of the plane parameters due to the vast redundancy of points. However, the empirical precisions also confirmed that the proposed procedure can ensure a submillimeter precision level. The algorithm can automatically detect grossly erroneous results to some extent. It can operate when the incidence angle of the laser beam is as high as 80°, which is a desirable property if one is going to use planar targets as tie points in scan registration. The proposed algorithm will also contribute to improving TLS calibration procedures.
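The plane-fitting and projection steps that precede the image matching can be sketched as follows. The SVD-based least-squares fit is a standard technique and the sample plane is invented, so this is illustrative rather than the authors' exact procedure.

```python
import numpy as np

def best_fit_plane(points):
    """Least-squares plane through a point cloud: the normal is the right
    singular vector with the smallest singular value."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def project_to_plane(points, centroid, normal):
    """Orthogonal projection of each point onto the fitted plane."""
    dist = (points - centroid) @ normal
    return points - np.outer(dist, normal)

# Noisy scan of the plane z = 0.1x + 0.2y + 1 (an invented example target).
rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, (200, 2))
z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + 1.0 + rng.normal(0, 0.01, 200)
pts = np.column_stack([xy, z])
centroid, normal = best_fit_plane(pts)
projected = project_to_plane(pts, centroid, normal)
```

The projected points are then rasterized in the plane's 2-D coordinate frame before template matching; the covariance of the fitted parameters is what feeds the paper's precision propagation.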

  9. Inertial fusion energy target injection, tracking, and beam pointing

    International Nuclear Information System (INIS)

    Petzoldt, R.W.

    1995-01-01

    Several cryogenic targets must be injected each second into a reaction chamber. Required target speed is about 100 m/s. Required accuracy of the driver beams on target is a few hundred micrometers. Fuel strength is calculated to allow acceleration in excess of 10,000 m/s² if the fuel temperature is less than 17 K. A 0.1 μm thick dual membrane will allow nearly 2,000 m/s² acceleration. Acceleration is gradually increased and decreased over a few membrane oscillation periods (a few ms) to avoid added stress from vibrations, which could otherwise cause a factor of two decrease in allowed acceleration. Movable shielding allows multiple targets to be in flight toward the reaction chamber at once while minimizing neutron heating of subsequent targets. The use of multiple injectors is recommended for redundancy, which increases availability and allows a higher pulse rate. Gas gun, rail gun, induction accelerator, and electrostatic accelerator target injection devices are studied and compared. A gas gun is the preferred device for indirect-drive targets due to its simplicity and proven reliability. With the gas gun, the amount of gas required for each target (about 10 to 100 mg) is acceptable. A revolver loading mechanism is recommended with a cam-operated poppet valve to control the gas flow. Cutting vents near the muzzle of the gas gun barrel is recommended to improve accuracy and aid gas pumping. If a railgun is used, we recommend an externally applied magnetic field to reduce the required current by an order of magnitude. Optical target tracking is recommended. Up/down counters are suggested to predict target arrival time. Target steering is shown to be feasible and would avoid the need to actively point the beams. Calculations show that induced tumble from electrostatically steering the target is not excessive
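The quoted figures are easy to sanity-check with constant-acceleration kinematics; the 5 m chamber radius below is an assumed value for illustration, not one from the abstract.

```python
def barrel_length(v, a):
    """Distance needed to reach speed v at constant acceleration a: v^2/(2a)."""
    return v ** 2 / (2 * a)

def flight_time(distance, v):
    """Coast time to cross the given distance at constant speed v."""
    return distance / v

# Figures from the abstract: 100 m/s injection speed, 10,000 m/s^2 for strong
# fuel and 2,000 m/s^2 for the 0.1 um dual membrane; the 5 m chamber radius
# below is an assumed value for illustration.
print(barrel_length(100, 10_000))  # 0.5 m of acceleration length
print(barrel_length(100, 2_000))   # 2.5 m
print(flight_time(5, 100))         # 0.05 s time of flight
```

The factor-of-five difference in barrel length between the two acceleration limits is one reason the membrane strength figure matters so much to injector design.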

  10. SU-E-T-310: Targeting Safety Improvements Through Analysis of Near-Miss Error Detection Points in An Incident Learning Database

    Energy Technology Data Exchange (ETDEWEB)

    Novak, A; Nyflot, M; Sponseller, P; Howard, J; Logan, W; Holland, L; Jordan, L; Carlson, J; Ermoian, R; Kane, G; Ford, E; Zeng, J [University of Washington, Seattle, WA (United States)

    2014-06-01

    Purpose: Radiation treatment planning involves a complex workflow that can make safety improvement efforts challenging. This study utilizes an incident reporting system to identify detection points of near-miss errors, in order to guide our departmental safety improvement efforts. Previous studies have examined where errors arise, but not where they are detected or their patterns. Methods: 1377 incidents were analyzed from a departmental near-miss error reporting system from 3/2012–10/2013. All incidents were prospectively reviewed weekly by a multi-disciplinary team, and assigned a near-miss severity score ranging from 0 to 4 reflecting potential harm (no harm to critical). A 98-step consensus workflow was used to determine origination and detection points of near-miss errors, categorized into 7 major steps (patient assessment/orders, simulation, contouring/treatment planning, pre-treatment plan checks, therapist/on-treatment review, post-treatment checks, and equipment issues). Categories were compared using ANOVA. Results: In the 7-step workflow, 23% of near-miss errors were detected within the same step in the workflow, while an additional 37% were detected by the next step in the workflow, and 23% were detected two steps downstream. Errors detected further from origination were more severe (p<.001; Figure 1). The most common source of near-miss errors was treatment planning/contouring, with 476 near misses (35%). Of those 476, only 72 (15%) were found before leaving treatment planning, 213 (45%) were found at physics plan checks, and 191 (40%) were caught at the therapist pre-treatment chart review or on portal imaging. Errors that passed through physics plan checks and were detected by therapists were more severe than other errors originating in contouring/treatment planning (1.81 vs 1.33, p<0.001). Conclusion: Errors caught by radiation treatment therapists tend to be more severe than errors caught earlier in the workflow, highlighting the importance of safety

  11. Analysis of irregularly distributed points

    DEFF Research Database (Denmark)

    Hartelius, Karsten

    1996-01-01

    , but the usage of simple kriging may lead to ill-conditioned matrices when applied to highly irregularly distributed points. Adaptive Kalman filter schemes are investigated. A new parallel Kalman filter algorithm based on a windowing technique gives good results in a case study on the Igallico satellite scene...... and represents an interesting contextual classifier. Extended Kalman filtering on the other hand seems to be well suited for interpolation in gradually changing environments. Bayesian restoration is applied to a point matching problem, which consists of matching a grid to an image of (irregularly) distributed...

  12. Targeting Entry Points for Ethics in Chemistry Teaching and Learning

    Science.gov (United States)

    Coppola, Brian P.

    2000-11-01

    In 1994, faculty in the University of Michigan department of chemistry began targeting entry points in the undergraduate and graduate curricula for the formal consideration of ethical reasoning. For students, many professional development issues occur naturally, especially in laboratory courses and participation in research. A formal education in ethical reasoning seeks to provide the tools by which individuals may anticipate situations and analyze behavioral options and their consequences. Students need to see that faculty value their moral development as seriously as their abilities to get good grades or generate results through research, and faculty need to think about their obligations as educators for the student as a whole individual. Like other high-stakes issues, consideration of questions of ethical practices requires that students have a relatively safe and supervised environment outside of the laboratory.

  13. Measuring coseismic displacements with point-like targets offset tracking

    KAUST Repository

    Hu, Xie

    2014-01-01

    Offset tracking is an important complement to measure large ground displacements in both azimuth and range dimensions where synthetic aperture radar (SAR) interferometry is unfeasible. Subpixel offsets can be obtained by searching for the cross-correlation peak calculated from matching patches uniformly distributed on two SAR images. However, it has its limitations, including redundant computation and incorrect estimations on decorrelated patches. In this letter, we propose a simple strategy that performs offset tracking on detected point-like targets (PT). We first detect image patches within bright PT by using a sinc-like template from a single SAR image and then perform offset tracking on them to obtain the pixel shifts. Compared with the standard method, the application on the 2010 M 7.2 El Mayor-Cucapah earthquake shows that the proposed PT offset tracking can significantly increase the cross-correlation and thus result in both efficiency and reliability improvements. © 2013 IEEE.
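The core of offset tracking, locating the cross-correlation peak between two patches, can be sketched as follows. The FFT-based correlation, the synthetic point-like scatterer, and the integer-only peak (real trackers refine to subpixel precision by oversampling or fitting around the peak) are illustrative, not the authors' implementation.

```python
import numpy as np

def patch_offset(ref, sec):
    """Integer pixel offset of `sec` relative to `ref`, from the peak of
    the FFT-based circular cross-correlation of two same-sized patches."""
    xcorr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(sec)).real
    peak = np.unravel_index(np.argmax(xcorr), xcorr.shape)
    # Offsets beyond half the patch size wrap around to negative shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, xcorr.shape))

# A patch containing a bright point-like target, shifted by a known amount.
rng = np.random.default_rng(2)
ref = rng.normal(size=(64, 64))
ref[32, 32] += 20.0                      # strong point-like scatterer
sec = np.roll(ref, (3, -5), axis=(0, 1))
offset = patch_offset(ref, sec)
print(offset)                            # (3, -5)
```

Restricting the search to bright point-like targets, as the letter proposes, keeps the correlation peak sharp and skips the decorrelated patches that defeat uniformly distributed windows.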

  14. Effects of target height and width on 2D pointing movement duration and kinematics.

    Science.gov (United States)

    Bohan, Michael; Longstaff, Mitchell G; Van Gemmert, Arend W A; Rand, Miya K; Stelmach, George E

    2003-07-01

    This study examined the impact of target geometry on the trajectories of rapid pointing movements. Participants performed a graphic point-to-point task using a pen on a digitizer tablet with targets and real time trajectories displayed on a computer screen. Circular- and elliptical-shaped targets were used in order to systematically vary the accuracy constraints along two dimensions. Consistent with Fitts' law, movement time increased as target difficulty increased. Analysis of movement kinematics revealed different patterns for targets constrained by height (H) and width (W). When W was the constraining factor, movements of greater precision were characterized by a lower peak velocity and a longer deceleration phase, with trajectories that were aimed relatively farther away from the center of the target and were more variable across trials. This indicates an emphasis on reactive, sensory-based control. When H was the constraining factor, however, movements of greater precision were characterized by a longer acceleration phase, a lower peak velocity, and a longer deceleration phase. The initial trajectory was aimed closer to the center of the target, and the trajectory path across trials was more constrained. This suggests a greater reliance on both predictive and reactive control.
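The Fitts' law relationship referred to above can be made concrete with the Shannon formulation of the index of difficulty; the regression coefficients below are invented for illustration, not values fitted in this study.

```python
import math

def fitts_id(distance, width):
    """Shannon formulation of the index of difficulty, in bits."""
    return math.log2(distance / width + 1)

def fitts_mt(distance, width, a=0.1, b=0.15):
    """Predicted movement time in seconds; a and b are illustrative
    regression coefficients, not values fitted in this study."""
    return a + b * fitts_id(distance, width)

# Halving the constraining target dimension adds roughly one bit of difficulty.
print(fitts_id(256, 32))   # log2(9)  ~ 3.17 bits
print(fitts_id(256, 16))   # log2(17) ~ 4.09 bits
print(fitts_mt(256, 16) - fitts_mt(256, 32))  # the predicted extra time
```

For the elliptical targets in the study, the "width" entering the formula is the target extent along the constraining dimension (W or H), which is what lets the two constraint types be compared at matched difficulty.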

  15. Point Information Gain and Multidimensional Data Analysis

    Directory of Open Access Journals (Sweden)

    Renata Rychtáriková

    2016-10-01

    Full Text Available We generalize the point information gain (PIG and derived quantities, i.e., point information gain entropy (PIE and point information gain entropy density (PIED, for the case of the Rényi entropy and simulate the behavior of PIG for typical distributions. We also use these methods for the analysis of multidimensional datasets. We demonstrate the main properties of PIE/PIED spectra for the real data with the examples of several images and discuss further possible utilizations in other fields of data processing.
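One plausible reading of the PIG definition, the change in Rényi entropy when a single occurrence is removed from the observed distribution, can be sketched as follows; the sign convention, binning, and sample histogram are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def renyi_entropy(p, alpha=2.0):
    """Rényi entropy in bits; alpha -> 1 recovers the Shannon entropy."""
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return float(-np.sum(p * np.log2(p)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

def point_information_gain(counts, i, alpha=2.0):
    """PIG of one occurrence in bin i: entropy of the histogram with that
    occurrence removed, minus entropy of the full histogram."""
    full = counts / counts.sum()
    reduced = counts.astype(float).copy()
    reduced[i] -= 1
    reduced /= reduced.sum()
    return renyi_entropy(reduced, alpha) - renyi_entropy(full, alpha)

# Removing a point from the most common bin flattens the distribution
# (positive gain); removing one from the rarest bin sharpens it (negative).
counts = np.array([50, 30, 15, 5])
print([round(point_information_gain(counts, i), 4) for i in range(4)])
```

Evaluating the PIG at every pixel of an image, and sweeping the Rényi order α, is what produces the PIE/PIED spectra the paper analyses.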

  16. Linear covariance analysis for gimbaled pointing systems

    Science.gov (United States)

    Christensen, Randall S.

    Linear covariance analysis has been utilized in a wide variety of applications. Historically, the theory has made significant contributions to navigation system design and analysis. More recently, the theory has been extended to capture the combined effect of navigation errors and closed-loop control on the performance of the system. These advancements have made possible rapid analysis and comprehensive trade studies of complicated systems ranging from autonomous rendezvous to vehicle ascent trajectory analysis. Comprehensive trade studies are also needed in the area of gimbaled pointing systems, where the information needs are different from previous applications. It is therefore the objective of this research to extend the capabilities of linear covariance theory to analyze the closed-loop navigation and control of a gimbaled pointing system. The extensions developed in this research include modifying the linear covariance equations to accommodate a wider variety of controllers. This enables the analysis of controllers common to gimbaled pointing systems, with internal states and associated dynamics as well as actuator command filtering and auxiliary controller measurements. The second extension is the extraction of power spectral density estimates from information available in linear covariance analysis. This information is especially important to gimbaled pointing systems, where not just the variance but also the spectrum of the pointing error impacts the performance. The extended theory is applied to a model of a gimbaled pointing system which includes both flexible and rigid body elements as well as input disturbances, sensor errors, and actuator errors. The results of the analysis are validated by direct comparison to a Monte Carlo-based analysis approach. Once the developed linear covariance theory is validated, analysis techniques that are often prohibitive with Monte Carlo analysis are used to gain further insight into the system. These include the creation
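The core recursion that linear covariance analysis repeats, covariance propagation plus a measurement update, can be sketched on a toy two-state (angle, rate) pointing model; the model, noise values, and measurement setup below are invented for illustration, not taken from this work.

```python
import numpy as np

def propagate(P, F, Q):
    """Discrete-time covariance propagation: P' = F P F^T + Q."""
    return F @ P @ F.T + Q

def measurement_update(P, H, R):
    """Kalman covariance update: P' = (I - K H) P with the optimal gain K."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return (np.eye(P.shape[0]) - K @ H) @ P

# Toy two-state pointing model (angle, rate) with a noisy angle measurement.
dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])     # constant-rate kinematics
Q = np.diag([0.0, 1e-6])                  # process noise on the rate
H = np.array([[1.0, 0.0]])                # angle sensor
R = np.array([[1e-4]])                    # sensor noise variance
P = np.diag([1e-2, 1e-2])                 # initial uncertainty
for _ in range(1000):
    P = measurement_update(propagate(P, F, Q), H, R)
print(np.sqrt(np.diag(P)))                # steady-state 1-sigma pointing errors
```

Because the covariance is computed deterministically in one pass, trade studies over gains, sensors, and disturbances run orders of magnitude faster than the equivalent Monte Carlo campaign, which is the appeal the abstract describes.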

  17. Non-Targeted Analysis Challenge (Non-targeted screening workshop)

    Science.gov (United States)

    This brief presentation is intended to motivate discussion of the "Non-Targeted Analysis Challenge" at the Advancing Non-Targeted Analyses of Xenobiotics in Environmental and Biological Media workshop held at the EPA RTP campus.

  18. Evaluating gaze-based interface tools to facilitate point-and-select tasks with small targets

    DEFF Research Database (Denmark)

    Skovsgaard, Henrik; Mateo, Julio C.; Hansen, John Paulin

    2011-01-01

    Gaze interaction affords hands-free control of computers. Pointing to and selecting small targets using gaze alone is difficult because of the limited accuracy of gaze pointing. This is the first experimental comparison of gaze-based interface tools for small-target (e.g. <12 × 12 pixels) point-a...

  19. Enhancing RGI lyase thermostability by targeted single point mutations

    DEFF Research Database (Denmark)

    Silva, Inês R.; Larsen, Dorte Møller; Jers, Carsten

    2013-01-01

    Rhamnogalacturonan I lyase (RGI lyase) (EC 4.2.2.-) catalyzes the cleavage of rhamnogalacturonan I in pectins by β-elimination. In this study the thermal stability of a RGI lyase (PL 11) originating from Bacillus licheniformis DSM 13/ATCC14580 was increased by a targeted protein engineering...

  20. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  21. PROPOSAL FOR AN USE CASE POINT ANALYSIS

    Directory of Open Access Journals (Sweden)

    Claudio G. Bernardo

    2011-12-01

    Full Text Available The aim of this paper is to present Use Case Point Analysis, which is used for specifying requirements in different systems. This tool is important for software development, providing cost-versus-time estimates that help in planning any activity. A proposal is presented to solve a case of calculations in a lawyers' association, which has the priority of mapping all its processes and creating systems that can improve customer service while remaining competitive in its market.
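The standard Use Case Point computation (Karner's method) that such an analysis builds on can be sketched as follows; the weights are the conventional ones, and the sample actors, use cases, and adjustment factors are invented, not taken from the lawyers'-association case.

```python
# Conventional Karner weights; the paper's case-study figures are not given
# in the abstract, so the sample inputs below are invented.
ACTOR_WEIGHTS = {"simple": 1, "average": 2, "complex": 3}
USE_CASE_WEIGHTS = {"simple": 5, "average": 10, "complex": 15}

def use_case_points(actors, use_cases, tcf=1.0, ef=1.0):
    """UCP = (UAW + UUCW) * TCF * EF, where TCF and EF are the technical
    and environmental complexity factors."""
    uaw = sum(ACTOR_WEIGHTS[a] for a in actors)
    uucw = sum(USE_CASE_WEIGHTS[u] for u in use_cases)
    return (uaw + uucw) * tcf * ef

ucp = use_case_points(
    actors=["simple", "complex", "average"],
    use_cases=["average", "average", "complex"],
    tcf=0.95,
    ef=0.9,
)
print(ucp)  # (6 + 35) * 0.95 * 0.9 ~ 35.055
```

An effort estimate then follows by multiplying the UCP total by an organization-specific hours-per-point productivity rate.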

  22. The effect of target precuing on pointing with mouse and touchpad

    DEFF Research Database (Denmark)

    Hertzum, Morten; Hornbæk, Kasper

    2013-01-01

    it only at the onset of pointing trials. We investigate this for young, adult, and elderly participants pointing with mouse and touchpad. Target precuing affects the trial completion time, the reaction time, the sheer movement time, and multiple movement kinematics. In addition, target precuing interacts...

  23. Target validation for FCV technology development in Japan from energy competition point of view

    International Nuclear Information System (INIS)

    ENDO Eiichi

    2006-01-01

    The objective of this work is to validate the technical targets in the governmental hydrogen energy roadmap of Japan by analyzing the market penetration of fuel cell vehicles (FCVs) and the effects of fuel price and carbon tax on it from a technology competition point of view. In this analysis, an energy system model of Japan based on MARKAL is used. The results of the analysis show that hydrogen FCVs could not achieve cost-competitiveness until 2030 without a carbon tax, including under the government's actual carbon tax plan. However, as the carbon tax rate increases, hydrogen FCVs penetrate the market earlier and in greater numbers, displacing conventional vehicles including gasoline hybrid electric vehicles. Assuming a higher fuel price and a more severe carbon tax rate, the market share of hydrogen FCVs approaches the governmental goal. This suggests that a cheaper vehicle cost and/or hydrogen price than those targeted in the roadmap is required. At the same time, achievement of the technical targets in the roadmap also allows the market penetration target for hydrogen FCVs to be attained under some possible conditions. (authors)

  24. Tipping point analysis of ocean acoustic noise

    Science.gov (United States)

    Livina, Valerie N.; Brouwer, Albert; Harris, Peter; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2018-02-01

    We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.
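The potential-analysis step, reconstructing a potential from the histogram of fluctuations via U(x) ∝ -ln p(x), can be sketched with a simulated single-well (Ornstein-Uhlenbeck) surrogate; the simulation parameters are invented and the noise scale is folded into the units, so this illustrates the idea rather than the authors' processing chain.

```python
import numpy as np

def empirical_potential(samples, bins=50):
    """Reconstruct a potential from observed fluctuations via
    U(x) proportional to -ln p(x), with p(x) estimated by a histogram."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    centers = 0.5 * (edges[:-1] + edges[1:])
    mask = hist > 0
    return centers[mask], -np.log(hist[mask])

# Ornstein-Uhlenbeck surrogate for detrended fluctuations: a single-well
# system, so the reconstructed potential should have one minimum near 0.
rng = np.random.default_rng(3)
x, dt, xs = 0.0, 0.05, []
for _ in range(200_000):
    x += -x * dt + np.sqrt(dt) * rng.normal()
    xs.append(x)
centers, U = empirical_potential(np.array(xs))
print(centers[np.argmin(U)])  # near 0, the bottom of the single well
```

In tipping point analysis, a change in the number of minima of the reconstructed potential over time (one well splitting into two) is the signature of a bifurcation in the underlying system state.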

  25. Tipping point analysis of ocean acoustic noise

    Directory of Open Access Journals (Sweden)

    V. N. Livina

    2018-02-01

    Full Text Available We apply tipping point analysis to a large record of ocean acoustic data to identify the main components of the acoustic dynamical system and study possible bifurcations and transitions of the system. The analysis is based on a statistical physics framework with stochastic modelling, where we represent the observed data as a composition of deterministic and stochastic components estimated from the data using time-series techniques. We analyse long-term and seasonal trends, system states and acoustic fluctuations to reconstruct a one-dimensional stochastic equation to approximate the acoustic dynamical system. We apply potential analysis to acoustic fluctuations and detect several changes in the system states in the past 14 years. These are most likely caused by climatic phenomena. We analyse trends in sound pressure level within different frequency bands and hypothesize a possible anthropogenic impact on the acoustic environment. The tipping point analysis framework provides insight into the structure of the acoustic data and helps identify its dynamic phenomena, correctly reproducing the probability distribution and scaling properties (power-law correlations) of the time series.

  26. Segmentation of foreground apple targets by fusing visual attention mechanism and growth rules of seed points

    International Nuclear Information System (INIS)

    Qu, W.; Shang, W.; Shao, Y.; Wang, D.; Yu, X.; Song, H.

    2015-01-01

    Accurate segmentation of apple targets is one of the most important problems to be solved in the vision system of apple picking robots. This work aimed to solve the difficulties that background targets often bring to foreground targets segmentation, by fusing the visual attention mechanism and the growth rule of seed points. Background targets could be eliminated by extracting the ROI (region of interest) of apple targets; the ROI was roughly segmented on the HSV color space, and then each of the pixels was used as a seed growing point. The growth rule of the seed points was adopted to obtain the whole area of apple targets from seed growing points. The proposed method was tested with 20 images captured in a natural scene, including 54 foreground apple targets and approximately 84 background apple targets. Experimental results showed that the proposed method can remove background targets and focus on foreground targets, while the k-means algorithm and the chromatic aberration algorithm cannot. Additionally, its average segmentation error rate was 13.23%, which is 2.71% higher than that of the k-means algorithm and 2.95% lower than that of the chromatic aberration algorithm. In conclusion, the proposed method contributes to the vision system of apple-picking robots to locate foreground apple targets quickly and accurately under a natural scene. (Author)
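The seeded-growth step can be sketched as follows; the hue-tolerance growth rule, the synthetic scene, and the single seed are simplifications for illustration, not the exact rule or data used in the paper (which seeds every pixel of the rough HSV segmentation).

```python
import numpy as np
from collections import deque

def seeded_growth(hue, seeds, tol=10):
    """Grow regions from seed pixels: a 4-connected neighbour is absorbed
    when its hue is within `tol` of the seed's hue (a simplified stand-in
    for the growth rule used in the paper)."""
    h, w = hue.shape
    grown = np.zeros((h, w), dtype=bool)
    queue = deque((r, c, int(hue[r, c])) for r, c in seeds)
    for r, c, _ in queue:
        grown[r, c] = True
    while queue:
        r, c, ref = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < h and 0 <= cc < w and not grown[rr, cc]
                    and abs(int(hue[rr, cc]) - ref) <= tol):
                grown[rr, cc] = True
                queue.append((rr, cc, ref))
    return grown

# Synthetic scene: a red-ish "apple" disk (hue ~2) on green foliage (hue 60);
# the rough HSV segmentation is assumed to have produced one in-apple seed.
hue = np.full((40, 40), 60, dtype=np.uint8)
rows, cols = np.ogrid[:40, :40]
disk = (rows - 20) ** 2 + (cols - 20) ** 2 <= 100
hue[disk] = 2
mask = seeded_growth(hue, [(20, 20)])
print(mask.sum() == disk.sum())  # the grown region recovers the apple disk
```

Restricting the seeds to an ROI found by a visual attention mechanism, as the paper does, is what keeps background apples from seeding regions of their own.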

  27. Segmentation of foreground apple targets by fusing visual attention mechanism and growth rules of seed points

    Energy Technology Data Exchange (ETDEWEB)

    Qu, W.; Shang, W.; Shao, Y.; Wang, D.; Yu, X.; Song, H.

    2015-07-01

    Accurate segmentation of apple targets is one of the most important problems to be solved in the vision system of apple picking robots. This work aimed to solve the difficulties that background targets often bring to foreground targets segmentation, by fusing the visual attention mechanism and the growth rule of seed points. Background targets could be eliminated by extracting the ROI (region of interest) of apple targets; the ROI was roughly segmented on the HSV color space, and then each of the pixels was used as a seed growing point. The growth rule of the seed points was adopted to obtain the whole area of apple targets from seed growing points. The proposed method was tested with 20 images captured in a natural scene, including 54 foreground apple targets and approximately 84 background apple targets. Experimental results showed that the proposed method can remove background targets and focus on foreground targets, while the k-means algorithm and the chromatic aberration algorithm cannot. Additionally, its average segmentation error rate was 13.23%, which is 2.71% higher than that of the k-means algorithm and 2.95% lower than that of the chromatic aberration algorithm. In conclusion, the proposed method contributes to the vision system of apple-picking robots to locate foreground apple targets quickly and accurately under a natural scene. (Author)

  28. Segmentation of foreground apple targets by fusing visual attention mechanism and growth rules of seed points

    Directory of Open Access Journals (Sweden)

    Weifeng Qu

    2015-09-01

    Full Text Available Accurate segmentation of apple targets is one of the most important problems to be solved in the vision system of apple-picking robots. This work aimed to overcome the difficulties that background targets pose for foreground target segmentation by fusing the visual attention mechanism and the growth rule of seed points. Background targets could be eliminated by extracting the ROI (region of interest) of apple targets; the ROI was roughly segmented in the HSV color space, and then each of its pixels was used as a seed growing point. The growth rule of the seed points was adopted to obtain the whole area of apple targets from the seed growing points. The proposed method was tested on 20 images captured in a natural scene, including 54 foreground apple targets and approximately 84 background apple targets. Experimental results showed that the proposed method can remove background targets and focus on foreground targets, whereas the k-means algorithm and the chromatic aberration algorithm cannot. Additionally, its average segmentation error rate was 13.23%, which is 2.71% higher than that of the k-means algorithm and 2.95% lower than that of the chromatic aberration algorithm. In conclusion, the proposed method helps the vision system of apple-picking robots locate foreground apple targets quickly and accurately in a natural scene.
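
The seeded region-growing idea described in this abstract can be sketched in a few lines. This is a minimal illustration on a toy single-channel "hue" image with invented thresholds, not the authors' implementation:

```python
from collections import deque
import numpy as np

def grow_from_seeds(channel, seed_mask, tol):
    """Grow a region outward from seed pixels: a 4-neighbor joins the
    region if its value is within `tol` of the seed pixels' mean."""
    h, w = channel.shape
    grown = seed_mask.copy()
    mean = channel[seed_mask].mean()
    queue = deque(zip(*np.nonzero(seed_mask)))
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < h and 0 <= cc < w and not grown[rr, cc]:
                if abs(channel[rr, cc] - mean) <= tol:
                    grown[rr, cc] = True
                    queue.append((rr, cc))
    return grown

# Toy "hue" image: an apple-like blob (~0.9) on background (~0.2)
img = np.full((10, 10), 0.2)
img[3:8, 3:8] = 0.9
img[5, 5] = 0.95                      # one pixel survives a strict threshold
seeds = img > 0.93                    # rough ROI threshold picks seed pixels
mask = grow_from_seeds(img, seeds, tol=0.1)
print(int(mask.sum()))                # grown region recovers the whole blob
```

A strict color threshold finds only the most saturated pixels; region growing then recovers the rest of the target without pulling in the dissimilar background.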

  9. Performance measures for parameter extraction of sensor array point targets using the discrete chirp Fourier transform

    Science.gov (United States)

    Santiago, Nayda; Aceros Moreno, Cesar A.; Rodriguez, Domingo

    2006-05-01

    This work presents a new methodology for the formulation of discrete chirp Fourier transform (DCFT) algorithms and discusses performance measures pertaining to the mapping of these algorithms to hardware computational structures (HCS), as well as the extraction of chirp rate estimation parameters of multicomponent nonstationary signals arriving from point targets. The methodology centers on the use of Kronecker products algebra, a branch of finite-dimensional multilinear algebra, as a language to present a canonical formulation of the DCFT algorithm and its associated properties. The methodology also explains how to search for variants of this canonical formulation that enhance the mapping process to a target HCS. The parameter extraction technique uses time-frequency properties of the DCFT in a modeled delay-Doppler synthetic aperture radar (SAR) remote sensing and surveillance environment to treat multicomponent return signals of prime length, with additive Gaussian noise as background clutter, and extract associated chirp rate parameters. The fusion of time-frequency information, acquired from transformed chirp or linear frequency modulated (FM) signals using the DCFT, with information obtained when the signals are treated using the discrete ambiguity function acting as point target response, point spread function, or impulse response, is used to further enhance the estimation process. For the case of very long signals, parallel algorithm implementations have been obtained on cluster computers. A theoretical computer performance analysis was conducted on the cluster implementation based on a methodology that applies well-defined design-of-experiments methods to the identification of relations among different levels in the process of mapping computational operations to high-performance computing systems. The use of statistics for identification of relationships among factors has formalized the search for solutions to the mapping problem and this
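
The DCFT at the heart of this methodology has a compact definition that can be sketched directly. The signal length, chirp rate and frequency below are illustrative:

```python
import numpy as np

def dcft(x):
    """Discrete chirp Fourier transform of a length-N signal:
    X[m, k] = (1/sqrt(N)) * sum_n x[n] * exp(-2j*pi*(m*n**2 + k*n)/N).
    For prime N, a single chirp concentrates into one (m, k) bin."""
    N = len(x)
    n = np.arange(N)
    out = np.empty((N, N), dtype=complex)
    for m in range(N):
        base = x * np.exp(-2j * np.pi * m * n**2 / N)
        out[m] = np.fft.fft(base) / np.sqrt(N)   # the FFT handles the k sum
    return out

N = 17                                   # prime length, as in the abstract
n = np.arange(N)
m_true, k_true = 5, 3                    # chirp rate and frequency to recover
x = np.exp(2j * np.pi * (m_true * n**2 + k_true * n) / N)
X = dcft(x)
m_est, k_est = np.unravel_index(np.argmax(np.abs(X)), X.shape)
print(m_est, k_est)                      # peak location gives the estimates
```

Prime length matters: for prime N the sidelobe magnitudes of a pure chirp are uniform (a Gauss-sum property), so the single peak at the true (chirp rate, frequency) pair is unambiguous.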

  10. Novel Spatiotemporal Filter for Dim Point Targets Detection in Infrared Image Sequences

    Directory of Open Access Journals (Sweden)

    Zhaohui Li

    2015-01-01

    Full Text Available Dim point target detection is of great importance in both civil and military fields. In this paper a novel spatiotemporal filter is proposed to incorporate both the spatial and temporal features of moving dim point targets. Since targets are expected to be detected from as far away as possible, in this situation they have no texture features in the spatial dimensions, appearing as isolated points. Based on these attributes, potential targets are extracted by searching for the local maximum point in a sliding window. The potential targets are then correlated based on target moving patterns. After combining local maximum points and target moving patterns, structured background in the infrared scene is removed. Next, the temporal profiles of the infrared scene are reviewed and examined. By a new max-median filter performed on the temporal profiles, the intensity of the target pulse signal is extracted. Finally, each temporal profile is divided into several pieces to estimate the variance of the temporal profiles, which leads to a new detection metric. The proposed approach is tested on several infrared image sequences. The results show that our proposed method can significantly reduce the complex background in aerial infrared image sequences and has good detection performance.
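
The temporal-profile idea can be illustrated with a simplified sliding-median background estimator, shown here only as a stand-in for the paper's max-median filter; the window size and intensity values are invented:

```python
import numpy as np

def temporal_median_residual(profile, win=5):
    """Estimate the slowly varying background of one pixel's temporal
    profile with a sliding median, then subtract it so that a brief
    target pulse stands out while the steady background cancels."""
    pad = win // 2
    padded = np.pad(profile, pad, mode="edge")
    background = np.array([np.median(padded[i:i + win])
                           for i in range(len(profile))])
    return profile - background

# Temporal profile of one pixel: flat background plus a 1-frame pulse
profile = np.full(30, 100.0)
profile[14] += 25.0                     # a dim moving target crosses the pixel
residual = temporal_median_residual(profile)
print(int(np.argmax(residual)), round(float(residual.max()), 1))
```

The median is robust to a short-lived pulse, so the background estimate stays flat and the residual isolates the target's contribution in time.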

  11. The phylogenomic analysis of the anaphase promoting complex and its targets points to complex and modern-like control of the cell cycle in the last common ancestor of eukaryotes

    Directory of Open Access Journals (Sweden)

    Brochier-Armanet Céline

    2011-09-01

    Full Text Available Abstract Background The Anaphase Promoting Complex or Cyclosome (APC/C) is the largest member of the ubiquitin ligase [E3] family. It plays a crucial role in the control of the cell cycle and cell proliferation by mediating the proteolysis of key components by the proteasome. APC/C is made of a dozen subunits that assemble into a large complex of ~1.5 MDa, which interacts with various cofactors and targets. Results Using comparative genomic and phylogenetic approaches, we showed that 24 out of 37 known APC/C subunits, adaptors/co-activators and main targets were already present in the Last Eukaryotic Common Ancestor (LECA) and were well conserved, with a few exceptions, in all present-day eukaryotic lineages. The phylogenetic analysis of the 24 components inferred to be present in LECA showed that they contain a reliable phylogenetic signal to reconstruct the phylogeny of the domain Eucarya. Conclusions Taken together, our analyses indicated that LECA had a complex and highly controlled modern-like cell cycle. Moreover, we showed that, despite what is generally assumed, proteins involved in housekeeping cellular functions may be a good complement to informational genes to study the phylogeny of eukaryotes.

  12. Preliminary Design and Analysis of the GIFTS Instrument Pointing System

    Science.gov (United States)

    Zomkowski, Paul P.

    2003-01-01

    The Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) instrument is the next-generation spectrometer for remote sensing weather satellites. The GIFTS instrument will be used to perform scans of the Earth's atmosphere by assembling a series of fields of view (FOVs) into a larger pattern. Realization of this process is achieved by step-scanning the instrument FOV in a contiguous fashion across any desired portion of the visible Earth. A 2.3 arc-second pointing stability, with respect to the scanning instrument, must be maintained for the duration of the FOV scan. A star tracker producing attitude data at a 100 Hz rate will be used by the autonomous pointing algorithm to precisely track target FOVs on the surface of the Earth. The main objective is to validate the pointing algorithm in the presence of spacecraft disturbances and determine acceptable disturbance limits from expected noise sources. Proof-of-concept validation of the pointing system algorithm is carried out with a full system simulation developed using Matlab Simulink. Models for the following components function within the full system simulation: inertial reference unit (IRU), attitude control system (ACS), reaction wheels, star tracker, and mirror controller. With the spacecraft orbital position and attitude maintained within specified limits, the pointing algorithm receives quaternion, ephemeris, and initialization data that are used to construct the required mirror pointing commands at a 100 Hz rate. This comprehensive simulation will also aid in obtaining a thorough understanding of spacecraft disturbances and other sources of pointing system errors. Parameter sensitivity studies and disturbance analysis will be used to obtain limits of operability for the GIFTS instrument. The culmination of this simulation development and analysis will be used to validate the specified performance requirements outlined for this instrument.

  13. Satellite Video Point-target Tracking in Combination with Motion Smoothness Constraint and Grayscale Feature

    OpenAIRE

    WU Jiaqi; ZHANG Guo; WANG Taoyang; JIANG Yonghua

    2017-01-01

    In view of the problem of satellite video point-target tracking, a method of Bayesian classification for tracking under the constraint of motion smoothness, named Bayesian MoST, is proposed. The idea of naive Bayesian classification without relying on any prior probability of the target is introduced. Under the constraint of motion smoothness, the gray-level similarity feature is used to describe the likelihood of the target. And then, the simplified conditional probability correction model o...

  14. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....

  15. Impedance analysis of acupuncture points and pathways

    International Nuclear Information System (INIS)

    Teplan, Michal; Kukucka, Marek; Ondrejkovicová, Alena

    2011-01-01

    Investigation of the impedance characteristics of acupuncture points from the acoustic to the radio-frequency range is addressed. Discernment and localization of acupuncture points in an initial single-subject study were unsuccessfully attempted by the impedance map technique. Vector impedance analyses determined possible resonant zones in the MHz region.

  16. Impedance analysis of acupuncture points and pathways

    Science.gov (United States)

    Teplan, Michal; Kukučka, Marek; Ondrejkovičová, Alena

    2011-12-01

    Investigation of the impedance characteristics of acupuncture points from the acoustic to the radio-frequency range is addressed. Discernment and localization of acupuncture points in an initial single-subject study were unsuccessfully attempted by the impedance map technique. Vector impedance analyses determined possible resonant zones in the MHz region.

  17. Point-of-care detection of extracellular vesicles: Sensitivity optimization and multiple-target detection.

    Science.gov (United States)

    Oliveira-Rodríguez, Myriam; Serrano-Pertierra, Esther; García, Agustín Costa; López-Martín, Soraya; Yañez-Mo, María; Cernuda-Morollón, Eva; Blanco-López, M C

    2017-01-15

    Extracellular vesicles (EVs) are membrane-bound nanovesicles delivered by different cellular lineages under physiological and pathological conditions. Although these vesicles have shown relevance as biomarkers for a number of diseases, their isolation and detection still have several technical drawbacks, mainly related to problems of sensitivity and time consumption. Here, we report a rapid and multiple-targeted lateral flow immunoassay (LFIA) system for the detection of EVs isolated from human plasma. A range of different labels (colloidal gold, carbon black and magnetic nanoparticles) was compared as detection probes in LFIA, with gold nanoparticles showing the best results. Using this platform, we demonstrated that improvements may be achieved by incorporating additional capture lines with different antibodies. The device exhibited a limit of detection (LOD) of 3.4×10⁶ EVs/µL when anti-CD81 and anti-CD9 were selected as capture antibodies in a multiple-targeted format and anti-CD63 labeled with gold nanoparticles was used as the detection probe. This LFIA, coupled to EV isolation kits, could become a rapid and useful tool for the point-of-care detection of EVs, with a total analysis time of two hours. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    process. Residuals are ascribed to locations in the empty background, as well as to data points of the point pattern. We obtain variance formulae, and study standardised residuals. There is also an analogy between our spatial residuals and the usual residuals for (non-spatial) generalised linear models...... or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases....

  19. Calculation of target-specific point distribution for 2D mobile laser scanners.

    Science.gov (United States)

    Cahalane, Conor; McElhinney, Conor P; Lewis, Paul; McCarthy, Tim

    2014-05-27

    The current generation of Mobile Mapping Systems (MMSs) capture high-density spatial data in a short time-frame. The quantity of data is difficult to predict, as there is no concrete understanding of the point density that different scanner configurations and hardware settings will exhibit for objects at specific distances. Obtaining the required point density impacts survey time, processing time, and data storage, and is also the underlying limit of automated algorithms. This paper details a novel method for calculating point and profile information for terrestrial MMSs, which are required for any point density calculation. Through application of algorithms utilising 3D surface normals and 2D geometric formulae, the theoretically optimal profile spacing and point spacing are calculated on targets. Both of these elements are a major factor in calculating point density on arbitrary objects, such as road signs, poles or buildings, all important features in asset management surveys.

  20. Calculation of Target-Specific Point Distribution for 2D Mobile Laser Scanners

    Directory of Open Access Journals (Sweden)

    Conor Cahalane

    2014-05-01

    Full Text Available The current generation of Mobile Mapping Systems (MMSs) capture high-density spatial data in a short time-frame. The quantity of data is difficult to predict, as there is no concrete understanding of the point density that different scanner configurations and hardware settings will exhibit for objects at specific distances. Obtaining the required point density impacts survey time, processing time, and data storage, and is also the underlying limit of automated algorithms. This paper details a novel method for calculating point and profile information for terrestrial MMSs, which are required for any point density calculation. Through application of algorithms utilising 3D surface normals and 2D geometric formulae, the theoretically optimal profile spacing and point spacing are calculated on targets. Both of these elements are a major factor in calculating point density on arbitrary objects, such as road signs, poles or buildings, all important features in asset management surveys.
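
The two quantities named in the abstract can be approximated with elementary geometry. This is a simplified sketch (perpendicular flat target, no incidence-angle correction), not necessarily the paper's exact algorithm, and the survey parameters are invented:

```python
import math

def profile_spacing(speed_mps, scan_rate_hz):
    """Distance travelled by the vehicle between successive scan profiles."""
    return speed_mps / scan_rate_hz

def point_spacing(range_m, angular_step_deg):
    """Approximate spacing of adjacent points on a flat target that is
    perpendicular to the beam, for a given range and angular increment."""
    return 2.0 * range_m * math.tan(math.radians(angular_step_deg) / 2.0)

# Hypothetical survey: van at 15 m/s, 2D scanner at 100 profiles/s with
# 0.1 degree angular resolution, road sign at 20 m range.
print(round(profile_spacing(15.0, 100.0), 3))    # metres between profiles
print(round(point_spacing(20.0, 0.1), 4))        # metres between points
```

Together the two spacings bound the achievable point density on a target: halving vehicle speed or doubling the scan rate tightens the profile spacing, while point spacing grows linearly with range.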

  1. A steady-state target calculation method based on "point" model for integrating processes.

    Science.gov (United States)

    Pang, Qiang; Zou, Tao; Zhang, Yanyan; Cong, Qiumei

    2015-05-01

    Aiming to eliminate the influences of model uncertainty on the steady-state target calculation for integrating processes, this paper presented an optimization method based on "point" model and a method determining whether or not there is a feasible solution of steady-state target. The optimization method resolves the steady-state optimization problem of integrating processes under the framework of two-stage structure, which builds a simple "point" model for the steady-state prediction, and compensates the error between "point" model and real process in each sampling interval. Simulation results illustrate that the outputs of integrating variables can be restricted within the constraints, and the calculation errors between actual outputs and optimal set-points are small, which indicate that the steady-state prediction model can predict the future outputs of integrating variables accurately. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.
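
The bias-compensation step of a two-stage "point" model can be sketched as follows. The gains, input and horizon are illustrative, and this is a generic additive-bias scheme rather than the paper's controller:

```python
def steady_state_demo():
    """Each sampling interval: predict with the simple 'point' model,
    measure the true integrating process, and update an additive bias
    so the model tracks the plant despite a gain mismatch."""
    model_gain, true_gain, u = 1.0, 1.2, 0.5
    y, bias, errors = 0.0, 0.0, []
    for _ in range(4):
        raw = y + model_gain * u          # model prediction, no correction
        y = y + true_gain * u             # actual integrating process output
        errors.append(y - (raw + bias))   # corrected one-step prediction error
        bias = y - raw                    # compensate in the next interval
    return errors

errors = steady_state_demo()
# The model/plant mismatch shows up only in the first interval
print(round(errors[0], 3), all(abs(e) < 1e-9 for e in errors[1:]))
```

Because the process is integrating, an uncompensated gain error would accumulate without bound; the per-interval bias update keeps the steady-state target calculation consistent with the measured outputs.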

  2. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from...... different fingers. Our model achieves good performance on an NIST-FBI fingerprint database of 258 matched fingerprint pairs....

  3. Music analysis and point-set compression

    DEFF Research Database (Denmark)

    Meredith, David

    2015-01-01

    COSIATEC, SIATECCompress and Forth’s algorithm are point-set compression algorithms developed for discovering repeated patterns in music, such as themes and motives that would be of interest to a music analyst. To investigate their effectiveness and versatility, these algorithms were evaluated...... on three analytical tasks that depend on the discovery of repeated patterns: classifying folk song melodies into tune families, discovering themes and sections in polyphonic music, and discovering subject and countersubject entries in fugues. Each algorithm computes a compressed encoding of a point......-set representation of a musical object in the form of a list of compact patterns, each pattern being given with a set of vectors indicating its occurrences. However, the algorithms adopt different strategies in their attempts to discover encodings that maximize compression.The best-performing algorithm on the folk...
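
The pattern-discovery core shared by these algorithms can be illustrated with a minimal SIA-style computation of maximal translatable patterns; the (onset, pitch) points below are invented:

```python
from collections import defaultdict

def maximal_translatable_patterns(points):
    """SIA-style sketch: for every translation vector v, collect the
    points p such that p + v is also in the set. Each such collection
    is a maximal translatable pattern (MTP); large MTPs are repeats."""
    mtps = defaultdict(list)
    for p in points:
        for q in points:
            if q != p:
                v = (q[0] - p[0], q[1] - p[1])
                mtps[v].append(p)
    # keep only vectors whose pattern contains more than a single point
    return {v: pts for v, pts in mtps.items() if len(pts) > 1}

# Toy score: a 3-note motif (onset, pitch) repeated 4 time-steps later
motif = [(0, 60), (1, 62), (2, 64)]
points = motif + [(t + 4, p) for t, p in motif]
mtps = maximal_translatable_patterns(points)
print(sorted(mtps[(4, 0)]))              # the motif maps onto its repeat
```

A compressed encoding then keeps one copy of each discovered pattern plus its translation vectors, which is where the compression ratio mentioned in the abstract comes from.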

  4. Target Tracking Based Scene Analysis

    Science.gov (United States)

    1984-08-01

    NATO Advanced Study Institute, Braunlage/Harz, FRG, June 21 - July 2, 1982. Springer, Berlin, 1983, pp. 493-501. [4] B. Bhanu, "Recognition of..." NATO Advanced Study Institute, Braunlage/Harz, FRG, June 21 - July 2, 1982. Springer, Berlin, 1983, pp. 101-124. [8] R.B. Cate, T.B. Dennis, J.T. Mallin, K.S. Nedelman, N. Trenchard, and... "Image Sequence Processing and Dynamic Scene Analysis", Proceedings of NATO Advanced Study Institute, Braunlage/Harz, FRG, June 21 - July 2, 1982

  5. Point reflector model for the simulation of radar target glint and Doppler phenomena

    Science.gov (United States)

    Grubeck, H.

    1995-02-01

    This report describes the mathematics of a radar model for the simulation of glint and Doppler phenomena. Glint is an unwanted phenomenon, which deteriorates radar tracking performance. The term denotes a fluctuation of the target direction, experienced by a radar, tracking a complex target. Doppler shift refers to the frequency change of a radar wave, due to the reflection against a moving target. It can be used by modern so-called coherent radar systems for velocity determination. A three-dimensional space is modeled, containing a rigid body of point scattering reflectors (the target) and a point of measuring (the radar). The target and the radar can move freely and independently in the space. The movement of the target and the radar is described with a number of coordinate systems, which are presented in this report. Some simple simulations are also presented in this report. A simulation tool is available for interested users and the purpose of this report is to announce its existence. The program is written in MATLAB Simulink.
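
Two of the modeled phenomena can be reproduced in a few lines. The carrier frequency, velocity and geometry are illustrative, and this sketch is in Python rather than the report's MATLAB Simulink implementation:

```python
import numpy as np

C = 3e8                                  # speed of light, m/s
F0 = 10e9                                # X-band carrier, Hz (assumed)

def doppler_shift(radial_velocity):
    """Two-way Doppler shift of a target closing at radial_velocity."""
    return 2.0 * radial_velocity * F0 / C

def composite_echo(ranges, amplitudes):
    """Coherent sum of point-scatterer returns; interference between
    scatterers is what distorts the apparent target direction (glint)."""
    wavelength = C / F0
    phases = 4.0 * np.pi * np.asarray(ranges) / wavelength
    return np.sum(np.asarray(amplitudes) * np.exp(1j * phases))

print(round(doppler_shift(150.0), 1))    # Hz, target closing at 150 m/s
# Two equal scatterers a quarter-wavelength apart in range cancel out
echo = composite_echo([1000.0, 1000.0 + C / F0 / 4.0], [1.0, 1.0])
print(round(abs(echo), 6))
```

The near-total cancellation in the second case shows why small geometry changes of a rigid multi-scatterer body produce large fluctuations in the composite return, which the tracker experiences as glint.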

  6. A starting-point strategy for interior-point algorithms for shakedown analysis of engineering structures

    Science.gov (United States)

    Simon, Jaan-Willem; Höwer, Daniel; Weichert, Dieter

    2014-05-01

    Lower-bound shakedown analysis leads to nonlinear convex optimization problems with large numbers of unknowns and constraints, the solution of which can be obtained efficiently by interior-point algorithms. The performance of these algorithms strongly depends on the choice of the starting point. In general, starting points should be located inside the feasible region. In addition, they should also be well centred. Although there exist several heuristics for the construction of suitable starting points, these are restricted, as long as only the mathematical procedure is considered without taking into account the nature of the underlying mechanical problem. Thus, in this article, a strategy is proposed for choosing appropriate starting points for interior-point algorithms applied to shakedown analysis. This strategy is based on both the mathematical characteristics and the physical meaning of the variables involved. The efficiency of the new method is illustrated by numerical examples from the field of power plant engineering.

  7. 33 CFR 334.200 - Chesapeake Bay, Point Lookout to Cedar Point; aerial and surface firing range and target area, U...

    Science.gov (United States)

    2010-07-01

    Chesapeake Bay, Point Lookout to Cedar Point; aerial and surface firing range and target area, U.S. Naval Air... of Chesapeake Bay within an area described as follows: Beginning at the easternmost extremity of...

  8. Fixed point theory, variational analysis, and optimization

    CERN Document Server

    Al-Mezel, Saleh Abdullah R; Ansari, Qamrul Hasan

    2015-01-01

    "There is a real need for this book. It is useful for people who work in areas of nonlinear analysis, optimization theory, variational inequalities, and mathematical economics." -Nan-Jing Huang, Sichuan University, Chengdu, People's Republic of China

  9. Sensory Integration during Vibration of Postural Muscle Tendons When Pointing to a Memorized Target.

    Science.gov (United States)

    Teasdale, Normand; Furmanek, Mariusz P; Germain Robitaille, Mathieu; de Oliveira, Fabio Carlos Lucas; Simoneau, Martin

    2016-01-01

    Vibrating ankle muscles in freely standing persons elicits a spatially oriented postural response. For instance, vibrating the Achilles tendons induces a backward displacement of the body, while vibrating the tibialis anterior muscle tendons induces a forward displacement. These displacements have been called vibration-induced falling (VIF) responses and they presumably are automatic. Because of the long delay between the onset of the vibration and the onset of the VIF (about 700 ms), and the widespread cortical activation following vibration, there is a possibility that the sensory signals available before the VIF can be used by the central nervous system to plan a hand-pointing action. This study examined this suggestion. Ten healthy young participants stood on a force platform and initially were trained to point, with and without vision, to a target located in front of them. Then, they were exposed to conditions with vibration of the Achilles tendons or tibialis anterior muscle tendons and pointed at the target without vision. The vibration stopped between each trial. Trials with vision (without vibration) were given every five trials to maintain an accurate perception of the target's spatial location. Ankle vibrations did not have an effect on the position of the center of foot pressure (COP) before the onset of the pointing actions. Furthermore, reaction and movement times of the pointing actions were unaffected by the vibration. The hypotheses were that if proprioceptive information evoked by ankle vibrations alters the planning of a pointing action, the amplitude of the movement should scale according to the muscle tendons that are vibrated. For Achilles tendon vibration, participants undershot the target, indicating that the planning of the pointing action was influenced by the vibration-evoked proprioceptive information (forward displacement of the body). When the tibialis anterior were vibrated (backward displacement of the body), however, shorter movements were

  10. NEWTONIAN IMPERIALIST COMPETITIVE APPROACH TO OPTIMIZING OBSERVATION OF MULTIPLE TARGET POINTS IN MULTISENSOR SURVEILLANCE SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. Afghan-Toloee

    2013-09-01

    Full Text Available The problem of specifying the minimum number of sensors to deploy in a certain area to observe multiple targets has been widely studied in the literature. In this paper, we address the multi-sensor deployment problem (MDP). The multi-sensor placement problem can be stated as minimizing the cost required to cover the multiple target points in the area. We propose a more feasible method for the multi-sensor placement problem. Our method provides the high coverage of grid-based placements while minimizing the cost, as found in perimeter placement techniques. The NICA algorithm, an improved ICA (Imperialist Competitive Algorithm), is used to decrease the computation time needed to find an adequate solution compared to other meta-heuristic schemes such as GA, PSO and ICA. A three-dimensional area is used to represent the multiple target and placement points, providing x, y, and z computations in the observation algorithm. A model structure for the multi-sensor placement problem is proposed: the problem is formulated as an optimization problem with the objective of minimizing the cost while covering all multiple target points subject to a given probability of observation tolerance.
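
The coverage objective can be illustrated with a greedy heuristic, shown here only as a simple stand-in for the NICA optimizer described in the abstract; the candidate sites, target points and sensing radius are invented:

```python
import math

def covered(sensor, target, radius):
    """A sensor observes a 3D target point within its sensing radius."""
    return math.dist(sensor, target) <= radius

def greedy_placement(candidates, targets, radius):
    """Repeatedly pick the candidate site that covers the most
    still-uncovered targets, until every target point is observed."""
    remaining, chosen = set(targets), []
    while remaining:
        best = max(candidates,
                   key=lambda s: sum(covered(s, t, radius) for t in remaining))
        gained = {t for t in remaining if covered(best, t, radius)}
        if not gained:
            raise ValueError("targets unreachable with this radius")
        chosen.append(best)
        remaining -= gained
    return chosen

# 3D target points and a coarse grid of candidate sensor sites
targets = [(0, 0, 0), (1, 1, 0), (9, 9, 9), (8, 9, 9)]
grid = [(x, y, z) for x in (0, 5, 9) for y in (0, 5, 9) for z in (0, 5, 9)]
sensors = greedy_placement(grid, targets, radius=2.5)
print(len(sensors))                      # two sites cover all four targets
```

A meta-heuristic such as NICA explores the same search space but can escape the local optima a greedy pass gets stuck in; the cost function (number of sensors covering all targets) is identical.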

  11. Re-Entry Point Targeting for LEO Spacecraft using Aerodynamic Drag

    Science.gov (United States)

    Omar, Sanny; Bevilacqua, Riccardo; Fineberg, Laurence; Treptow, Justin; Johnson, Yusef; Clark, Scott

    2016-01-01

    Most Low Earth Orbit (LEO) spacecraft do not have thrusters and re-enter the atmosphere at random locations and uncertain times. These objects pose a risk to persons, property, and other satellites, a concern that has grown with the recent increase in small satellites. We are working on a NASA-funded project to design a retractable drag device to expedite de-orbit and target a re-entry location through modulation of the drag area. The re-entry point targeting algorithm is discussed here.

  12. Dynamic analysis: a new point of view

    Science.gov (United States)

    Chaves, Eduardo W. V.

    2016-05-01

    In this article, an alternative to the classical dynamic equation formulation is presented. To achieve this goal, we derive the reciprocal theorem in rates and the principle of virtual work in rates, in a small-deformation regime, with which we are able to obtain an expression for the damping force. In this new formulation, some terms appear that are not commonly considered in the classical formulation, e.g., a term that is a function of jerk (the rate of change of acceleration). Moreover, in this formulation the term that characterizes material nonlinearity in dynamic analysis appears naturally.

  13. Point Cluster Analysis Using a 3D Voronoi Diagram with Applications in Point Cloud Segmentation

    Directory of Open Access Journals (Sweden)

    Shen Ying

    2015-08-01

    Full Text Available Three-dimensional (3D) point analysis and visualization is one of the most effective methods of point cluster detection and segmentation in geospatial datasets. However, serious scattering and clotting characteristics interfere with the visual detection of 3D point clusters. To overcome this problem, this study proposes the use of 3D Voronoi diagrams to analyze and visualize 3D points instead of the original data items. The proposed algorithm computes the cluster of 3D points by applying a set of 3D Voronoi cells to describe and quantify 3D points. The decompositions of the point cloud of 3D models are guided by the 3D Voronoi cell parameters. The parameter values are mapped from the Voronoi cells to 3D points to show the spatial pattern and relationships; thus, a 3D point cluster pattern can be highlighted and easily recognized. To capture different cluster patterns, continuous progressive clusters and segmentations are tested. The 3D spatial relationship is shown to facilitate cluster detection. Furthermore, the generated segmentations of real 3D data cases are exploited to demonstrate the feasibility of our approach in detecting different spatial clusters for continuous point cloud segmentation.

  14. Principal component analysis for fermionic critical points

    Science.gov (United States)

    Costa, Natanael C.; Hu, Wenjian; Bai, Z. J.; Scalettar, Richard T.; Singh, Rajiv R. P.

    2017-11-01

    We use determinant quantum Monte Carlo (DQMC), in combination with the principal component analysis (PCA) approach to unsupervised learning, to extract information about phase transitions in several of the most fundamental Hamiltonians describing strongly correlated materials. We first explore the zero-temperature antiferromagnet to singlet transition in the periodic Anderson model, the Mott insulating transition in the Hubbard model on a honeycomb lattice, and the magnetic transition in the 1/6-filled Lieb lattice. We then discuss the prospects for learning finite temperature superconducting transitions in the attractive Hubbard model, for which there is no sign problem. Finally, we investigate finite temperature charge density wave (CDW) transitions in the Holstein model, where the electrons are coupled to phonon degrees of freedom, and carry out a finite size scaling analysis to determine Tc. We examine the different behaviors associated with Hubbard-Stratonovich auxiliary field configurations on both the entire space-time lattice and on a single imaginary time slice, or other quantities, such as equal-time Green's and pair-pair correlation functions.
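
The core PCA step can be illustrated on synthetic spin-like configurations, a toy stand-in for the DQMC auxiliary-field data in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_first_component(data):
    """Project centered samples onto the leading principal component
    (via SVD), as in PCA-based phase-transition detection."""
    centered = data - data.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[0]

# Synthetic 'configurations': ordered phase = fully aligned spins,
# disordered phase = random +/-1 spins (invented toy data).
n_sites = 64
ordered = np.array([s * np.ones(n_sites) for s in (1, -1) for _ in range(20)])
disordered = rng.choice([-1.0, 1.0], size=(40, n_sites))
proj = pca_first_component(np.vstack([ordered, disordered]))

# Ordered samples land at the extremes of PC1, disordered ones near zero,
# so the leading component acts as an unsupervised order parameter
print(np.abs(proj[:40]).min() > np.abs(proj[40:]).max())
```

The leading principal component here recovers the magnetization direction without being told about it, which is the sense in which unsupervised learning "finds" the order parameter across a transition.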

  15. Two-point anchoring of a lanthanide-binding peptide to a target protein enhances the paramagnetic anisotropic effect

    International Nuclear Information System (INIS)

    Saio, Tomohide; Ogura, Kenji; Yokochi, Masashi; Kobashigawa, Yoshihiro; Inagaki, Fuyuhiko

    2009-01-01

    Paramagnetic lanthanide ions fixed in a protein frame induce several paramagnetic effects such as pseudo-contact shifts and residual dipolar couplings. These effects provide long-range distance and angular information for proteins and, therefore, are valuable in protein structural analysis. However, until recently this approach had been restricted to metal-binding proteins, but now it has become applicable to non-metalloproteins through the use of a lanthanide-binding tag. Here we report a lanthanide-binding peptide tag anchored via two points to the target proteins. Compared to conventional single-point attached tags, the two-point linked tag provides two- to threefold stronger anisotropic effects. Though there is slight residual mobility of the lanthanide-binding tag, the present tag provides a higher anisotropic paramagnetic effect.

  16. Effects of aging and Tai Chi on finger-pointing toward stationary and moving visual targets.

    Science.gov (United States)

    Kwok, Jasmine C; Hui-Chan, Christina W; Tsang, William W

    2010-01-01

    To examine the aging effect on speed and accuracy in finger pointing toward stationary and moving visual targets between young and older healthy subjects, and whether Tai Chi practitioners perform better than healthy older controls in these tasks. Cross-sectional study. University-based rehabilitation center. University students (n=30) (aged 24.2+/-3.1y) were compared with healthy older control subjects (n=30) (aged 72.3+/-7.2y) and experienced Tai Chi practitioners (n=31) (aged 70.3+/-5.9y; mean years of practice, 7.1+/-6.5y). Not applicable. Subjects pointed with the index finger of their dominant hand from a fixed starting position on a desk to a visual signal (1.2cm diameter dot) appearing on a display unit, as quickly and as accurately as possible. Outcome measures included (1) reaction time: the time from the appearance of the dot to the onset of the anterior deltoid electromyographic response; (2) movement time: the time from onset of the electromyographic response to touching of the dot; and (3) accuracy: the absolute deviation of the subject's finger-pointing location from the center of the dot. Young subjects achieved significantly faster reaction and movement times with significantly better accuracy than older control subjects in all finger-pointing tasks. Tai Chi practitioners attained significantly better accuracy than older controls in pointing to stationary visual signals appearing contralaterally and centrally to their pointing hand. They also demonstrated significantly better accuracy when the target was moving. Accuracy in Tai Chi practitioners was similar to young controls. Eye-hand coordination in finger-pointing declines with age in time and accuracy domains. However, Tai Chi practitioners attained significantly better accuracy than control subjects similar in age, sex, and physical activity level. Copyright (c) 2010

  17. Investigating Spatial Patterns of Persistent Scatterer Interferometry Point Targets and Landslide Occurrences in the Arno River Basin

    Directory of Open Access Journals (Sweden)

    Ping Lu

    2014-07-01

    Full Text Available Persistent Scatterer Interferometry (PSI) has been widely used for landslide studies in recent years. This paper investigates the spatial patterns of PSI point targets and landslide occurrences in the Arno River basin in Central Italy. The main purpose is to analyze whether the spatial patterns of Persistent Scatterers (PS) can be recognized as indicators of landslide occurrences throughout the whole basin. The bivariate K-function was employed to assess spatial relationships between PS and landslides. The PSI point targets were acquired from almost 4 years (March 2003 to January 2007) of RADARSAT-1 images. The landslide inventory was compiled from 15 years (1992-2007) of surveying and mapping data, mainly comprising remote sensing data, topographic maps and field investigations. The proposed approach is able to assess spatial patterns between a variety of PS and landslides and, in particular, to determine whether PSI point targets are spatially clustered (spatial attraction) or randomly distributed (spatial independence) on various types of landslides across the basin. Additionally, the degree and scale distances of PS clustering on a variety of landslides can be characterized. The results rejected the null hypothesis that PSI point targets cluster similarly on the four types of landslides (slides, flows, falls and creeps) in the Arno River basin. A significant influence of PS velocities and acquisition orbits on detecting landslides with different states of activity can be noticed. Although the assessment may be influenced by the quality of the landslide inventory and the Synthetic Aperture Radar (SAR) images, the proposed approach is expected to provide guidelines for studies aiming to detect and investigate landslide occurrences at a regional scale through spatial statistical analysis of PS, for which an advanced understanding of the impact of scale distances on landslide clustering is fundamental.
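
The bivariate (cross) K-function the abstract relies on can be sketched in a few lines. This version omits edge correction, and the point sets, window and radii below are illustrative stand-ins, not the Arno data:

```python
import numpy as np

def cross_k(pts_a, pts_b, radii, area):
    """Bivariate (cross) K-function, no edge correction:
    K_ab(r) = area / (n_a * n_b) * #{pairs (i, j) with |a_i - b_j| <= r}."""
    d = np.linalg.norm(pts_a[:, None, :] - pts_b[None, :, :], axis=2)
    n_a, n_b = len(pts_a), len(pts_b)
    return np.array([area * np.count_nonzero(d <= r) / (n_a * n_b) for r in radii])

rng = np.random.default_rng(0)
ps_points = rng.uniform(0.0, 1.0, (200, 2))    # stand-in for PSI point targets
landslides = rng.uniform(0.0, 1.0, (50, 2))    # stand-in for landslide locations
radii = np.linspace(0.05, 0.5, 10)
k_ab = cross_k(ps_points, landslides, radii, area=1.0)
# Under spatial independence K_ab(r) tracks pi * r**2; a systematic excess
# at some scale distance indicates attraction (PS clustering on landslides).
```

In practice the decision between attraction and independence is made by comparing the empirical curve against simulation envelopes of the independence baseline, which is what motivates the hypothesis test described in the abstract.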

  18. Change Analysis in Structural Laser Scanning Point Clouds: The Baseline Method.

    Science.gov (United States)

    Shen, Yueqian; Lindenbergh, Roderik; Wang, Jinhu

    2016-12-24

    A method is introduced for detecting changes from point clouds that avoids registration. For many applications, changes are detected between two scans of the same scene obtained at different times. Traditionally, these scans are aligned to a common coordinate system, which has the disadvantage that the registration step introduces additional errors. In addition, registration requires stable targets or features. To avoid these issues, we propose a change detection method based on so-called baselines. Baselines connect feature points within one scan. To analyze changes, baselines connecting corresponding points in two scans are compared. As feature points, either targets or virtual points corresponding to some reconstructable feature in the scene are used. The new method is implemented on two scans sampling a masonry laboratory building before and after seismic testing, which resulted in damage on the order of several centimeters. The centres of the bricks of the laboratory building are automatically extracted to serve as virtual points. Baselines connecting virtual points and/or target points are extracted and compared with respect to a suitable structural coordinate system. Changes detected from the baseline analysis are compared to a traditional cloud-to-cloud change analysis, demonstrating the potential of the new method for structural analysis.
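
The core idea — baseline lengths are invariant under rigid motion, so structural change shows up without any registration — can be sketched on a toy set of virtual points. The coordinates and the 5 cm displacement below are illustrative, not the laboratory data:

```python
import numpy as np
from itertools import combinations

def baseline_lengths(points):
    """Lengths of all baselines (pairwise segments) within one scan."""
    idx = list(combinations(range(len(points)), 2))
    return idx, np.array([np.linalg.norm(points[i] - points[j]) for i, j in idx])

def baseline_changes(scan1, scan2):
    """Length differences of corresponding baselines from two epochs.
    Rigid motion between the scans cancels because lengths are invariant,
    so no registration is needed and only structural change remains."""
    idx, l1 = baseline_lengths(scan1)
    _, l2 = baseline_lengths(scan2)
    return idx, l2 - l1

# epoch 1: four virtual points (e.g. brick centres), metres
p1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
ang = 0.3                                   # epoch 2 sits in an arbitrary other frame
rot = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                [np.sin(ang),  np.cos(ang), 0.0],
                [0.0, 0.0, 1.0]])
p2 = p1 @ rot.T + np.array([10.0, -3.0, 2.0])
p2[3] += np.array([0.0, 0.0, 0.05])         # 5 cm structural change at point 3
idx, dl = baseline_changes(p1, p2)          # only baselines touching point 3 change
```

Despite the two epochs being expressed in completely different coordinate systems, only the baselines that touch the displaced point show a non-zero length change.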

  20. Satellite Video Point-target Tracking in Combination with Motion Smoothness Constraint and Grayscale Feature

    Directory of Open Access Journals (Sweden)

    WU Jiaqi

    2017-09-01

    Full Text Available For the problem of satellite video point-target tracking, a Bayesian classification method under a motion smoothness constraint, named Bayesian MoST, is proposed. The idea of naive Bayesian classification, which does not rely on any prior probability of the target, is introduced. Under the motion smoothness constraint, gray-level similarity is used to describe the likelihood of the target. A simplified conditional-probability correction model of the classifier is then created according to the independence assumption of Bayes' theorem, and the tracking position is determined by estimating the target's posterior probability from this model. Meanwhile, a Kalman filter is used as an assistance and optimization step to enhance the robustness of the tracking process. The proposed method is validated in six experiments using SkySat and JL1H videos, each with two segments. The experimental results show that the proposed Bayesian MoST method performs well: the tracking precision is about 90% and the tracking trajectory is smooth. The method can satisfy the needs of subsequent advanced processing of satellite video.
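
The Kalman-filter assistance mentioned in the abstract can be illustrated with a generic constant-velocity filter over a noisy 2-D pixel track. This is a textbook sketch, not the paper's formulation; the noise levels and gains below are assumed:

```python
import numpy as np

def kalman_track(meas, dt=1.0, q=1e-3, r=4.0):
    """Constant-velocity Kalman filter over 2-D pixel measurements.
    State [x, y, vx, vy]; q = process noise variance, r = measurement
    noise variance (px^2). Returns the filtered positions."""
    F = np.eye(4); F[0, 2] = F[1, 3] = dt
    H = np.zeros((2, 4)); H[0, 0] = H[1, 1] = 1.0
    Q, R = q * np.eye(4), r * np.eye(2)
    x = np.array([meas[0][0], meas[0][1], 0.0, 0.0])
    P = 10.0 * np.eye(4)
    out = []
    for z in meas:
        x = F @ x                                       # predict
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)    # update
        x = x + K @ (z - H @ x)
        P = (np.eye(4) - K @ H) @ P
        out.append(x[:2].copy())
    return np.array(out)

rng = np.random.default_rng(1)
truth = np.stack([np.linspace(0, 50, 51), np.linspace(0, 25, 51)], axis=1)
noisy = truth + rng.normal(0.0, 2.0, truth.shape)   # detector jitter, std 2 px
smooth = kalman_track(noisy)
```

After a few frames of convergence the filtered trajectory is markedly closer to the true track than the raw detections, which is exactly the smoothing role the abstract assigns to the filter.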

  1. Thermal-hydraulic analysis of PDS-XADS spallation target

    International Nuclear Information System (INIS)

    Ai Nisai; Yu Jiyang; Yang Yongwei

    2012-01-01

    This paper presents a thermal-hydraulic analysis of the PDS-XADS spallation target for the large (80 MW) core concept. PDS-XADS is a small-scale experimental accelerator-driven sub-critical system (ADS). The analysis presented in this paper is based on the lead-bismuth eutectic (LBE) cooled XADS-type experimental reactors designed within the European experimental (PDS-XADS) project. The spallation target is a very important component of an accelerator-driven sub-critical system (ADS) because it is responsible for keeping the reactor power at the required level through spallation reactions. The high rate of neutron production by spallation reactions creates a decay-heat cooling problem. The LBE flow is properly cooled, but the window is not, because of the stagnation point at the pole of the window. It would be very difficult to keep the window temperature below the design limit, which is an important design challenge. The thermal-hydraulic analysis of the LBE spallation target was carried out using ANSYS CFX 11.0. A detailed CFD analysis, which reveals the thermal and hydraulic conditions in the window and spallation region, was carried out for different spallation target designs. Finally, the spallation target design limit was used to choose the best design. (authors)

  2. Development of Hazard Analysis Critical Control Points (HACCP ...

    African Journals Online (AJOL)

    Development of Hazard Analysis Critical Control Points (HACCP) and Enhancement of Microbial Safety Quality during Production of Fermented Legume Based ... Nigerian Food Journal ... Critical control points during production of iru and okpehe, two fermented condiments, were identified in four processors in Nigeria.

  3. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations...

  4. Point Cloud Based Visibility Analysis : first experimental results

    NARCIS (Netherlands)

    Zhang, G.; van Oosterom, P.J.M.; Verbree, E.; Bregt, Arnold; Sarjakoski, Tapani; Lammeren, Ron van; Rip, Frans

    2017-01-01

    Visibility computed from a LiDAR point cloud offers several advantages compared to using a gridded digital height-model. With a higher resolution and detailed information, point cloud data can provide precise analysis as well as an opportunity to avoid the process of generating a surface

  5. CFD analysis of the HYPER spallation target

    International Nuclear Information System (INIS)

    Cho, Chungho; Tak, Nam-il; Choi, Jae-Hyuk; Lee, Yong-Bum

    2008-01-01

    KAERI (Korea Atomic Energy Research Institute) is developing an accelerator driven system (ADS) named HYPER (HYbrid Power Extraction Reactor) for a transmutation of long-lived nuclear wastes. One of the challenging tasks for the HYPER system is to design a large spallation target with a beam power of 15-25 MW. The paper focuses on a thermal-hydraulic analysis of the active part of the HYPER target. Computational fluid dynamics (CFD) analysis was performed by using a commercial code CFX 5.7.1. Several advanced turbulence models with different grid structures were applied. The CFX results reveal a significant impact of the turbulence model on the window temperature. Particularly, the k-ε model predicts the lowest window temperature among the five investigated turbulence models

  6. Multidimensional digital filters for point-target detection in cluttered infrared scenes

    Science.gov (United States)

    Kennedy, Hugh L.

    2014-11-01

    A three-dimensional (3-D) spatiotemporal prediction-error filter (PEF) is used to enhance foreground/background contrast in (real and simulated) sensor image sequences. Relative velocity is utilized to extract point targets that would otherwise be indistinguishable with spatial frequency alone. An optical-flow field is generated using local estimates of the 3-D autocorrelation function via the application of the fast Fourier transform (FFT) and inverse FFT. Velocity estimates are then used to tune in a background-whitening PEF that is matched to the motion and texture of the local background. Finite impulse response (FIR) filters are designed and implemented in the frequency domain. An analytical expression for the frequency response of velocity-tuned FIR filters, of odd or even dimension with an arbitrary delay in each dimension, is derived.
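
Velocity tuning can be caricatured in the time domain as shift-and-sum integration along a velocity hypothesis: a point target moving at that velocity adds coherently across frames while clutter and noise do not. The paper designs frequency-domain FIR filters; this toy version, on synthetic frames with an assumed injected target, only conveys why tuning to the target's velocity raises its contrast:

```python
import numpy as np

def shift_sum(frames, v):
    """Integrate frames along a velocity hypothesis v = (vy, vx) in px/frame.
    Each frame is rolled back along the hypothesized motion before averaging,
    so a matched moving point target stacks onto a single pixel."""
    acc = np.zeros_like(frames[0], dtype=float)
    for k, fr in enumerate(frames):
        acc += np.roll(fr, shift=(-v[0] * k, -v[1] * k), axis=(0, 1))
    return acc / len(frames)

rng = np.random.default_rng(3)
frames = rng.normal(0.0, 1.0, (8, 64, 64))          # unit-variance clutter/noise
y0, x0, v = 20, 10, (1, 2)                          # dim target, 1 px down / 2 px right per frame
for k in range(8):
    frames[k, y0 + v[0] * k, x0 + v[1] * k] += 3.0  # SNR 3 in a single frame

out = shift_sum(frames, v)                          # target now stands ~sqrt(8)x clearer
```

Averaging 8 frames leaves the matched target at its full amplitude while the background level drops by roughly the square root of the number of frames, so the target pixel dominates the output map.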

  7. Congruence analysis of point clouds from unstable stereo image sequences

    Directory of Open Access Journals (Sweden)

    C. Jepping

    2014-06-01

    Full Text Available This paper deals with the correction of exterior orientation parameters of stereo image sequences over deformed free-form surfaces without control points. Such an imaging situation can occur, for example, during photogrammetric car crash test recordings, where onboard high-speed stereo cameras are used to measure 3D surfaces. As a result of such measurements, 3D point clouds of deformed surfaces are generated for a complete stereo sequence. The first objective of this research focusses on the development and investigation of methods for the detection of corresponding spatial and temporal tie points within the stereo image sequences (by stereo image matching and 3D point tracking) that are robust enough for reliable handling of occlusions and other disturbances that may occur. The second objective of this research is the analysis of object deformations in order to detect stable areas (congruence analysis). For this purpose, a RANSAC-based method for congruence analysis has been developed. This process is based on the sequential transformation of randomly selected point groups from one epoch to another using a 3D similarity transformation. The paper gives a detailed description of the congruence analysis. The approach has been tested successfully on synthetic and real image data.
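
The RANSAC congruence step can be sketched as follows: repeatedly fit a 3D similarity transform (here the standard Umeyama closed form) to a random point triple and keep the largest set of points that transform maps consistently. The point configuration, deformation magnitudes and thresholds below are assumptions for illustration:

```python
import numpy as np

def similarity_fit(src, dst):
    """Closed-form (Umeyama) 3D similarity transform src -> dst:
    returns scale c, rotation R, translation t with dst ~ c*R@src + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    S, D = src - mu_s, dst - mu_d
    U, sv, Vt = np.linalg.svd(D.T @ S / len(src))
    E = np.eye(3)
    E[2, 2] = np.sign(np.linalg.det(U) * np.linalg.det(Vt))   # keep det(R) = +1
    R = U @ E @ Vt
    c = np.trace(np.diag(sv) @ E) / (S ** 2).sum(axis=1).mean()
    return c, R, mu_d - c * R @ mu_s

def congruent_points(p1, p2, trials=200, tol=0.01, seed=0):
    """RANSAC congruence analysis: fit a similarity transform to random
    point triples; the largest set mapped within tol is the stable area."""
    rng = np.random.default_rng(seed)
    best = np.zeros(len(p1), dtype=bool)
    for _ in range(trials):
        i = rng.choice(len(p1), size=3, replace=False)
        c, R, t = similarity_fit(p1[i], p2[i])
        ok = np.linalg.norm(c * p1 @ R.T + t - p2, axis=1) < tol
        if ok.sum() > best.sum():
            best = ok
    return best

rng = np.random.default_rng(1)
p1 = rng.uniform(0.0, 5.0, (10, 3))                # epoch 1 tie points
ang = 0.4
rot = np.array([[np.cos(ang), -np.sin(ang), 0.0],
                [np.sin(ang),  np.cos(ang), 0.0],
                [0.0, 0.0, 1.0]])
p2 = p1 @ rot.T + np.array([1.0, 2.0, -0.5])       # epoch 2: rigid motion only ...
p2[7:] += rng.normal(0.0, 0.2, (3, 3))             # ... except points 7-9 deform
stable_mask = congruent_points(p1, p2)             # True for the stable area
```

Points belonging to the stable area come out as inliers of the best transform, while the deformed points are flagged as changed, which is the congruence decision the paper formalizes.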

  8. Numerical Investigation of the Time Discretization Impact on the Accuracy of a Point Target Localization by UWB Radar

    Science.gov (United States)

    Buša, Ján; Kocur, Dušan; Švecová, Mária

    2018-02-01

    UWB radar technologies enable the localization of moving persons (targets) situated behind nonmetallic obstacles. Given exact knowledge of the propagation times of the radar-emitted electromagnetic wave from the transmitting to the receiving antennas (times of arrival, TOA), highly accurate target localization can be achieved. Since only TOA estimates are available, their use for target localization may result in a sizeable localization error. In this paper, we study the influence of TOA quantization on point-target localization accuracy using numerical simulation methods.
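
The effect under study can be reproduced with a small simulation: convert exact ranges to TOAs, snap them to a sampling grid, and localize by linearized least-squares multilateration. The antenna geometry, target position and sampling periods below are assumptions, not the paper's setup:

```python
import numpy as np

C = 0.299792458                        # propagation speed in m/ns

def localize(anchors, dists):
    """Linearized least-squares multilateration from range estimates:
    subtracting the first range equation from the others gives a linear system."""
    a0, d0 = anchors[0], dists[0]
    A = 2.0 * (anchors[1:] - a0)
    b = np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2) - dists[1:]**2 + d0**2
    return np.linalg.lstsq(A, b, rcond=None)[0]

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
target = np.array([3.7, 6.2])
toa = np.linalg.norm(anchors - target, axis=1) / C    # exact TOAs in ns

def error_with_sampling(ts_ns):
    """Quantize TOA to the sampling grid (period ts_ns), then localize."""
    d = np.round(toa / ts_ns) * ts_ns * C
    return np.linalg.norm(localize(anchors, d) - target)

err_exact = np.linalg.norm(localize(anchors, toa * C) - target)
err_fine = error_with_sampling(0.1)    # 10 GS/s sampling
err_coarse = error_with_sampling(5.0)  # 200 MS/s sampling
```

With exact TOAs the linear solve recovers the target essentially perfectly; quantizing the TOAs to a coarser sampling grid inflates the position error by more than an order of magnitude in this configuration, which is the sensitivity the paper quantifies.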

  9. Colocalization coefficients evaluating the distribution of molecular targets in microscopy methods based on pointed patterns

    Czech Academy of Sciences Publication Activity Database

    Pastorek, Lukáš; Sobol, Margaryta; Hozák, Pavel

    2016-01-01

    Roč. 146, č. 4 (2016), s. 391-406 ISSN 0948-6143 R&D Projects: GA TA ČR(CZ) TE01020118; GA ČR GA15-08738S; GA MŠk(CZ) ED1.1.00/02.0109; GA MŠk(CZ) LM2015062 Grant - others:Human Frontier Science Program(FR) RGP0017/2013 Institutional support: RVO:68378050 Keywords : Colocalization * Quantitative analysis * Pointed patterns * Transmission electron microscopy * Manders' coefficients * Immunohistochemistry Subject RIV: EB - Genetics ; Molecular Biology Impact factor: 2.553, year: 2016

  10. Nuclear Security: Target Analysis-rev

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Surinder Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Gibbs, Philip W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bultz, Garl A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-03-01

    The objectives of this presentation are to understand target identification, including roll-up and protracted theft; evaluate target identification in the SNRI; recognize the target characteristics and consequence levels; and understand graded safeguards.

  11. Effect of target color and scanning geometry on terrestrial LiDAR point-cloud noise and plane fitting

    Science.gov (United States)

    Bolkas, Dimitrios; Martinez, Aaron

    2018-01-01

    Point-cloud coordinate information derived from terrestrial Light Detection And Ranging (LiDAR) is important for several applications in surveying and civil engineering. Plane fitting and segmentation of target-surfaces is an important step in several applications such as in the monitoring of structures. Reliable parametric modeling and segmentation relies on the underlying quality of the point-cloud. Therefore, understanding how point-cloud errors affect fitting of planes and segmentation is important. Point-cloud intensity, which accompanies the point-cloud data, often goes hand-in-hand with point-cloud noise. This study uses industrial particle boards painted with eight different colors (black, white, grey, red, green, blue, brown, and yellow) and two different sheens (flat and semi-gloss) to explore how noise and plane residuals vary with scanning geometry (i.e., distance and incidence angle) and target-color. Results show that darker colors, such as black and brown, can produce point clouds that are several times noisier than bright targets, such as white. In addition, semi-gloss targets manage to reduce noise in dark targets by about 2-3 times. The study of plane residuals with scanning geometry reveals that, in many of the cases tested, residuals decrease with increasing incidence angles, which can assist in understanding the distribution of plane residuals in a dataset. Finally, a scheme is developed to derive survey guidelines based on the data collected in this experiment. Three examples demonstrate that users should consider instrument specification, required precision of plane residuals, required point-spacing, target-color, and target-sheen, when selecting scanning locations. Outcomes of this study can aid users to select appropriate instrumentation and improve planning of terrestrial LiDAR data-acquisition.
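
The plane-fitting step the study builds on can be sketched with an SVD least-squares fit on synthetic boards, comparing residual noise for a "bright" versus a "dark" target. The noise magnitudes below are assumed for illustration and are not the paper's measured values:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a point cloud via SVD.
    Returns the unit normal, the centroid and the orthogonal residuals."""
    c = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - c, full_matrices=False)
    n = vt[-1]                    # direction of least variance = plane normal
    return n, c, (points - c) @ n

rng = np.random.default_rng(2)
xy = rng.uniform(0.0, 1.0, (500, 2))          # scan footprint on the board, metres

def board(range_noise_m):
    """Synthetic planar target z = 0.2x - 0.1y plus Gaussian range noise."""
    z = 0.2 * xy[:, 0] - 0.1 * xy[:, 1] + rng.normal(0.0, range_noise_m, len(xy))
    return np.column_stack([xy, z])

n_white, _, res_white = fit_plane(board(0.001))   # bright target: low range noise
n_black, _, res_black = fit_plane(board(0.004))   # dark target: ~4x noisier
```

Both fits recover the plane orientation well, but the standard deviation of the residuals scales with the range noise, mirroring the paper's observation that darker targets yield point clouds several times noisier than bright ones.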

  12. Super-resolution imaging using multi- electrode CMUTs: theoretical design and simulation using point targets.

    Science.gov (United States)

    You, Wei; Cretu, Edmond; Rohling, Robert

    2013-11-01

    This paper investigates a low computational cost, super-resolution ultrasound imaging method that leverages the asymmetric vibration mode of CMUTs. Instead of focusing on the broadband received signal on the entire CMUT membrane, we utilize the differential signal received on the left and right part of the membrane obtained by a multi-electrode CMUT structure. The differential signal reflects the asymmetric vibration mode of the CMUT cell excited by the nonuniform acoustic pressure field impinging on the membrane, and has a resonant component in immersion. To improve the resolution, we propose an imaging method as follows: a set of manifold matrices of CMUT responses for multiple focal directions are constructed off-line with a grid of hypothetical point targets. During the subsequent imaging process, the array sequentially steers to multiple angles, and the amplitudes (weights) of all hypothetical targets at each angle are estimated in a maximum a posteriori (MAP) process with the manifold matrix corresponding to that angle. Then, the weight vector undergoes a directional pruning process to remove the false estimation at other angles caused by the side lobe energy. Ultrasound imaging simulation is performed on ring and linear arrays with a simulation program adapted with a multi-electrode CMUT structure capable of obtaining both average and differential received signals. Because the differential signals from all receiving channels form a more distinctive temporal pattern than the average signals, better MAP estimation results are expected than using the average signals. The imaging simulation shows that using differential signals alone or in combination with the average signals produces better lateral resolution than the traditional phased array or using the average signals alone. This study is an exploration into the potential benefits of asymmetric CMUT responses for super-resolution imaging.
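
Under a Gaussian prior on the hypothetical target amplitudes, a MAP estimate of the weights against a manifold matrix reduces to ridge regression. The sketch below uses a random matrix as a stand-in for the precomputed CMUT responses and two assumed point targets; it illustrates the estimation step only, not the paper's directional pruning or CMUT physics:

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.normal(size=(120, 40))          # manifold matrix: 120 samples x 40 hypothetical targets
w_true = np.zeros(40)
w_true[[5, 17]] = [1.0, 0.6]            # two point targets actually present
y = A @ w_true + rng.normal(0.0, 0.01, 120)   # received data with small noise

# MAP with a Gaussian prior == ridge regression:
#   w_map = argmin ||A w - y||^2 + lam ||w||^2 = (A^T A + lam I)^-1 A^T y
lam = 1e-2
w_map = np.linalg.solve(A.T @ A + lam * np.eye(40), A.T @ y)
```

The two true targets dominate the recovered weight vector; in the paper's pipeline such per-angle estimates are then pruned across steering angles to suppress side-lobe leakage.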

  13. Reconstruction analysis of the IRAS Point Source Catalog Redshift Survey

    NARCIS (Netherlands)

    Narayanan, VK; Weinberg, DH; Branchini, E; Frenk, CS; Maddox, S; Oliver, S; Rowan-Robinson, M; Saunders, W

    We present the results of reconstruction analysis of the galaxy distribution in a spherical region of radius 50 h(-1) Mpc centered on the Local Group, as mapped by the IRAS Point Source Catalog Redshift Survey (PSCz). We reconstruct this galaxy distribution using 15 different models for structure

  14. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    Keywords. Climate change; temperature; precipitation; trend analysis; change point detection; southwest Iran. J. Earth Syst. Sci. 123, No. 2, March 2014, pp. 281–295.

  15. Stability Analysis of Periodic Systems by Truncated Point Mappings

    Science.gov (United States)

    Guttalu, R. S.; Flashner, H.

    1996-01-01

    An approach is presented for deriving analytical stability and bifurcation conditions for systems with periodically varying coefficients. The method is based on a point mapping (period-to-period mapping) representation of the system's dynamics. An algorithm is employed to obtain an analytical expression for the point mapping and its dependence on the system's parameters. The algorithm is devised to derive the coefficients of a multinomial expansion of the point mapping up to an arbitrary order in terms of the state variables and of the parameters. Analytical stability and bifurcation conditions are then formulated and expressed as functional relations between the parameters. To demonstrate the application of the method, the parametric stability of Mathieu's equation and of a two-degree-of-freedom system is investigated. The results obtained by the proposed approach are compared to those obtained by perturbation analysis and by direct integration, which we consider the "exact solution". It is shown that, unlike perturbation analysis, the proposed method provides a very accurate solution even for large values of the parameters. If an expansion of the point mapping in terms of a small parameter is performed, the method is equivalent to perturbation analysis. Moreover, it is demonstrated that the method can be easily applied to multiple-degree-of-freedom systems using the same framework. This feature is an important advantage since most of the existing analysis methods apply mainly to single-degree-of-freedom systems and their extension to higher dimensions is difficult and computationally cumbersome.
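
A numerical counterpart of the period-to-period mapping is the monodromy matrix: integrate the two fundamental solutions of Mathieu's equation over one period and read stability off the Floquet multipliers (|trace| <= 2 for a 2x2 map of a Hamiltonian system). This is a direct-integration sketch of the "exact solution" the abstract compares against, not the paper's analytical expansion:

```python
import numpy as np

def monodromy(delta, eps, steps=2000):
    """Point mapping (period map) of Mathieu's equation
    x'' + (delta + eps*cos t) x = 0, built by RK4 integration of the
    two fundamental solutions over one period T = 2*pi."""
    def f(t, y):
        return np.array([y[1], -(delta + eps * np.cos(t)) * y[0]])
    h = 2.0 * np.pi / steps
    cols = []
    for y in (np.array([1.0, 0.0]), np.array([0.0, 1.0])):
        t = 0.0
        for _ in range(steps):
            k1 = f(t, y)
            k2 = f(t + h / 2, y + h / 2 * k1)
            k3 = f(t + h / 2, y + h / 2 * k2)
            k4 = f(t + h, y + h * k3)
            y = y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
            t += h
        cols.append(y)
    return np.column_stack(cols)

def stable(delta, eps):
    """Floquet multipliers lie on the unit circle iff |trace M| <= 2."""
    return abs(np.trace(monodromy(delta, eps))) <= 2.0
```

For example, delta = 0.5 with eps = 0.1 lies between the resonance tongues emanating from delta = 1/4 and delta = 1 and is stable, while delta = 0.25 with the same eps sits inside the principal tongue and is unstable.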

  16. Environmental Impact and Hazards Analysis Critical Control Point ...

    African Journals Online (AJOL)

    Tsire is a local meat delicacy (kebab) in northern Nigeria, which has become popular and widely acceptable throughout the country and even beyond. Three production sites of tsire were evaluated for the environmental impact and hazard analysis critical control point (HACCP) on the microbiological and chemical qualities ...

  17. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    Temperature and precipitation series have been investigated by many researchers throughout the world (Serra et al. 2001; Turkes and Sumer 2004; Zer Lin et al. 2005; Partal and Kahya 2006). Keywords. Climate change; temperature; precipitation; trend analysis; change point detection; southwest Iran. J. Earth Syst. Sci.

  18. Trend analysis and change point detection of annual and seasonal ...

    Indian Academy of Sciences (India)

    Home; Journals; Journal of Earth System Science; Volume 123; Issue 2. Trend analysis and change point detection of annual and seasonal precipitation and ... Department of Geography, University of Pune, Pune 411 007, India. Centre for Advanced Training, Indian Institute of Tropical Meteorology, Pune 411 008, India.

  19. A systematic analysis of the Braitenberg vehicle 2b for point-like stimulus sources

    International Nuclear Information System (INIS)

    Rañó, Iñaki

    2012-01-01

    Braitenberg vehicles have been used experimentally in robotics for decades, with limited empirical understanding. This paper presents the first mathematical model of vehicle 2b, which displays so-called aggression behaviour, and analyses the possible trajectories for point-like smooth stimulus sources. This sensory-motor steering control mechanism is used to implement biologically grounded target-approach, target-seeking or obstacle-avoidance behaviour. However, the analysis of the resulting model reveals that complex and unexpected trajectories can result even for point-like stimuli. We also prove how the implementation of the controller and the vehicle morphology interact to affect the behaviour of the vehicle. This work provides a better understanding of Braitenberg vehicle 2b, explains experimental results and paves the way for a formally grounded application in robotics as well as for a new way of understanding target seeking in biology. (paper)

  20. A mathematical analysis of multiple-target SELEX.

    Science.gov (United States)

    Seo, Yeon-Jung; Chen, Shiliang; Nilsen-Hamilton, Marit; Levine, Howard A

    2010-10-01

    SELEX (Systematic Evolution of Ligands by Exponential Enrichment) is a procedure by which a mixture of nucleic acids can be fractionated with the goal of identifying those with specific biochemical activities. One combines the mixture with a specific target molecule and then separates the target-NA complexes from the reaction mixture. The target-NA complexes are separated from the unbound NA by mechanical means (such as by filtration), the NA is eluted from the complexes, amplified by PCR (polymerase chain reaction), and the process is repeated. After several rounds, one should be left with the nucleic acids that best bind to the target. The problem was first formulated mathematically in Irvine et al. (J. Mol. Biol. 222:739-761, 1991). In Levine and Nilsen-Hamilton (Comput. Biol. Chem. 31:11-25, 2007), a mathematical analysis of the process was given. In Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998), multiple-target SELEX was considered; it was assumed that each target has a single nucleic acid binding site that permits occupation by no more than one nucleic acid. Here, we revisit Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998) using the same assumptions. The iteration scheme is shown to be convergent and a simplified algorithm is given. Our interest here is in the behavior of the multiple-target SELEX process as a discrete "time" dynamical system. Our goal is to characterize the limiting states and their dependence on the initial distribution of nucleic acid and target fraction components. (In multiple-target SELEX, we vary the target component fractions, but their concentrations are held fixed, and the initial pool of nucleic acids is a variable starting condition). Given N nucleic acids and a target consisting of M subtarget component species, there is an M × N matrix of affinities, the (i,j) entry corresponding to the affinity of the jth nucleic acid for the ith subtarget. We give a structure condition on this matrix that is equivalent to the following
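
The discrete dynamical-system viewpoint can be illustrated with a deliberately simplified selection round: each nucleic acid is retained in proportion to its pool fraction times an affinity-weighted average over the target components, then the pool is renormalized. This is a caricature of the mass-action model, not the paper's full iteration scheme, and the affinity matrix below is invented:

```python
import numpy as np

def selex_rounds(K, f, p0, rounds=25):
    """Toy multiple-target SELEX iteration. K is the M x N affinity matrix
    (targets x nucleic acids), f the target component fractions, p0 the
    initial pool distribution. Each round retains NA j in proportion to
    p_j * sum_i f_i K_ij and renormalizes the pool."""
    p = p0.copy()
    for _ in range(rounds):
        p = p * (f @ K)
        p = p / p.sum()
    return p

K = np.array([[1.0, 5.0, 2.0],      # hypothetical affinities (2 subtargets, 3 NAs)
              [2.0, 1.0, 6.0]])
f = np.array([0.5, 0.5])            # target component fractions
p0 = np.full(3, 1.0 / 3.0)          # uniform starting pool
p = selex_rounds(K, f, p0)          # limit concentrates on the best average binder
```

In this toy map the limiting state is a point mass on the nucleic acid with the largest affinity weighted by the target fractions, which conveys (in drastically simplified form) why the limiting pool depends on both the initial distribution and the target fraction components.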

  1. Design of thermostable rhamnogalacturonan lyase mutants from Bacillus licheniformis by combination of targeted single point mutations

    DEFF Research Database (Denmark)

    da Silva, Ines Isabel Cardoso Rodrigues; Jers, Carsten; Otten, Harm

    2014-01-01

    Rhamnogalacturonan I lyases (RGI lyases) (EC 4.2.2.-) catalyze cleavage of α-1,4 bonds between rhamnose and galacturonic acid in the backbone of pectins by β-elimination. In the present study, targeted improvement of the thermostability of a PL family 11 RGI lyase from Bacillus licheniformis (DSM...... the wild-type RGI lyase in Bacillus subtilis as opposed to in Pichia pastoris; this effect is suggested to be a negative result of glycosylation of the P. pastoris expressed enzyme. A ~ twofold improvement in thermal stability at 60 °C, accompanied by less significant increases in Tm of the enzyme mutants......, were obtained due to additive stabilizing effects of single amino acid mutations (E434L, G55V, and G326E) compared to the wild type. The crystal structure of the B. licheniformis wild-type RGI lyase was also determined; the structural analysis corroborated that especially mutation of charged amino...

  2. Scattering analysis of point processes and random measures

    International Nuclear Information System (INIS)

    Hanisch, K.H.

    1984-01-01

    In the present paper scattering analysis of point processes and random measures is studied. Known formulae which connect the scattering intensity with the pair distribution function of the studied structures are proved in a rigorous manner with tools of the theory of point processes and random measures. For some special fibre processes the scattering intensity is computed. For a class of random measures, namely for 'grain-germ-models', a new formula is proved which yields the pair distribution function of the 'grain-germ-model' in terms of the pair distribution function of the underlying point process (the 'germs') and of the mean structure factor and the mean squared structure factor of the particles (the 'grains'). (author)

  3. POINT CLOUD ANALYSIS FOR CONSERVATION AND ENHANCEMENT OF MODERNIST ARCHITECTURE

    Directory of Open Access Journals (Sweden)

    M. Balzani

    2017-02-01

    Applied research focused on the analysis of surface specifications and material properties by means of a 3D laser scanner survey has been developed within the project for the digital preservation of the FAUUSP building, Faculdade de Arquitetura e Urbanismo da Universidade de São Paulo, Brazil. The integrated 3D survey was performed by the DIAPReM Center of the Department of Architecture of the University of Ferrara in cooperation with the FAUUSP. The 3D survey allowed the realization of a point cloud model of the external surfaces as the basis for investigating in detail the formal characteristics, geometric textures and surface features. The digital geometric model was also the basis for processing the intensity values acquired by the laser scanning instrument; this method of analysis was an essential complement to the macroscopic investigations, making it possible to manage additional information on surface characteristics displayable on the point cloud.

  4. Process for structural geologic analysis of topography and point data

    Science.gov (United States)

    Eliason, Jay R.; Eliason, Valerie L. C.

    1987-01-01

    A quantitative method of geologic structural analysis of digital terrain data is described for implementation on a computer. Assuming selected valley segments are controlled by the underlying geologic structure, topographic lows in the terrain data, defining valley bottoms, are detected, filtered and accumulated into a series of line segments defining contiguous valleys. The line segments are then vectorized to produce vector segments, defining valley segments, which may be indicative of the underlying geologic structure. Coplanar analysis is performed on vector segment pairs to determine which vectors produce planes that represent underlying geologic structure. Point data, such as fracture phenomena that can be related to fracture planes in 3-dimensional space, can be analyzed to define common plane orientations and locations. The vectors, points, and planes are displayed in various formats for interpretation.

  5. Spectrography analysis of stainless steel by the point to point technique

    International Nuclear Information System (INIS)

    Bona, A.

    1986-01-01

    A method is presented for the determination of the elements Ni, Cr, Mn, Si, Mo, Nb, Cu, Co and V in stainless steel by emission spectrographic analysis using high-voltage spark sources, employing the 'point-to-point' technique. The experimental parameters were optimized taking into account a compromise between detection sensitivity and measurement precision. The parameters investigated were the high-voltage capacitance, the inductance, the analytical and auxiliary gaps, the pre-burn spark period and the exposure time. The edge shape of the counter electrodes and the type of polishing and diameter variation of the stainless steel electrodes were evaluated in preliminary assays. In addition, the degradation of the chemical power of the developer was also investigated. Counter electrodes of graphite, copper, aluminium and iron were employed, with the counter electrode itself used as an internal standard; in the case of graphite counter electrodes, the iron lines were employed as the internal standard. The relative errors were the criteria for evaluating these experiments. Certified reference stainless steel standards from the National Bureau of Standards and Eletrometal Acos Finos S.A. samples (certified by the supplier) were employed for constructing the calibration systems and analytical curves. The best results were obtained using the conventional graphite counter electrodes. The inaccuracy and imprecision of the proposed method varied from 2% to 15% and from 1% to 9%, respectively. The technique was compared to other instrumental techniques such as inductively coupled plasma, X-ray fluorescence and neutron activation analysis, and the advantages and disadvantages of each case were discussed. (author) [pt

  6. SPATIAL ANALYSIS TO SUPPORT GEOGRAPHIC TARGETING OF GENOTYPES TO ENVIRONMENTS

    Directory of Open Access Journals (Sweden)

    Glenn eHyman

    2013-03-01

    Crop improvement efforts have benefited greatly from advances in available data, computing technology and methods for targeting genotypes to environments. These advances support the analysis of genotype-by-environment interactions to understand how well a genotype adapts to environmental conditions. This paper reviews the use of spatial analysis to support crop improvement research aimed at matching genotypes to their most appropriate environmental niches. Better data sets are now available on soils, weather and climate, elevation, vegetation, crop distribution and local conditions where genotypes are tested in experimental trial sites. The improved data are now combined with spatial analysis methods to compare environmental conditions across sites, create agro-ecological region maps and assess environment change. Climate, elevation and vegetation data sets are now widely available, supporting analyses that were much more difficult even five or ten years ago. While detailed soil data for many parts of the world remain difficult to acquire for crop improvement studies, new advances in digital soil mapping are likely to improve this capacity. Site analysis and matching, and regional targeting methods, have advanced in parallel with data and technology improvements. All these developments have increased our capacity to link genotype to phenotype and point to a vast potential to improve crop adaptation efforts.

  7. Spectral analysis of heart rate variability during trigger point acupuncture.

    Science.gov (United States)

    Kitagawa, Yoji; Kimura, Kenichi; Yoshida, Sohei

    2014-06-01

    To clarify changes in cardiovascular autonomic nervous system function due to trigger point acupuncture, we evaluated differences in responses between acupuncture at trigger points and acupuncture at other sites using spectral analysis of heart rate variability. Subjects were 35 healthy men. Before measurements began, the subjects were assigned to a trigger point acupuncture or control group based on the presence or absence of referred pain on applying pressure to a taut band within the right tibialis anterior muscle. The measurements were conducted in a room at a temperature of 25°C, with subjects in a long sitting position after 10 min of rest. Acupuncture needles were retained for 10 min at two sites on the right tibialis anterior muscle. ECG was recorded simultaneously with measurements of blood pressure and the respiratory cycle. Based on the R-R interval of the ECG, frequency analysis was performed; low-frequency (LF) and high-frequency (HF) components were extracted, and the ratio of LF to HF components (LF/HF) was evaluated. The trigger point acupuncture group showed a transient decrease in heart rate and an increase in the HF component but no significant changes in LF/HF. In the control group, no significant changes were observed in heart rate, the HF component or LF/HF. There were no consistent changes in systolic or diastolic blood pressure in either group. These data suggest that acupuncture stimulation of trigger points of the tibialis anterior muscle transiently increases parasympathetic nerve activity. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
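
    The frequency-domain procedure described (resampling the R-R tachogram, extracting LF and HF band power, forming LF/HF) can be sketched as follows. The 4 Hz resampling rate and the plain-periodogram estimator are conventional choices rather than details taken from the study, and `lf_hf_ratio` with its synthetic tachogram is purely illustrative.

```python
import numpy as np

def lf_hf_ratio(rr_ms, fs=4.0):
    """Estimate LF (0.04-0.15 Hz) and HF (0.15-0.4 Hz) power from a
    series of R-R intervals (ms): resample the tachogram on a uniform
    grid, then integrate a periodogram over each band."""
    t = np.cumsum(rr_ms) / 1000.0             # beat times, seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)   # uniform time grid
    tach = np.interp(grid, t, rr_ms)          # resampled tachogram
    tach = tach - tach.mean()                 # remove the DC component
    psd = np.abs(np.fft.rfft(tach)) ** 2 / len(tach)
    freq = np.fft.rfftfreq(len(tach), d=1.0 / fs)
    lf = psd[(freq >= 0.04) & (freq < 0.15)].sum()
    hf = psd[(freq >= 0.15) & (freq < 0.40)].sum()
    return lf, hf, lf / hf

# Synthetic tachogram: ~75 bpm with a 0.3 Hz (respiratory, HF-band)
# modulation, so HF power should dominate, as in vagal activation.
beats = np.arange(300)
rr = 800.0 + 50.0 * np.sin(2 * np.pi * 0.3 * beats * 0.8)
lf, hf, ratio = lf_hf_ratio(rr)
```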

  8. Growth Curve Analysis and Change-Points Detection in Extremes

    KAUST Repository

    Meng, Rui

    2016-05-15

    The thesis consists of two coherent projects. The first project presents the results of evaluating salinity tolerance in barley using growth curve analysis, where different growth trajectories are observed within barley families. The study of salinity tolerance in plants is crucial to understanding plant growth and productivity. Because fully-automated smarthouses with conveyor systems allow non-destructive and high-throughput phenotyping of large numbers of plants, it is now possible to apply advanced statistical tools to analyze daily measurements and to study salinity tolerance. To compare different growth patterns of barley varieties, we use functional data analysis techniques to analyze the daily projected shoot areas. In particular, we apply the curve registration method to align all the curves from the same barley family in order to summarize family-wise features. We also illustrate how to use statistical modeling to account for spatial variation in microclimate in smarthouses and for temporal variation across runs, which is crucial for identifying traits of the barley varieties. In our analysis, we show that the concentrations of sodium and potassium in leaves are negatively correlated, and their interactions are associated with the degree of salinity tolerance. The second project studies change-point detection methods in extremes when multiple time series data are available. Motivated by the scientific question of whether the chances of experiencing extreme weather differ between seasons of the year, we develop a change-point detection model to study changes in extremes or in the tail of a distribution. Most existing models identify seasons from multiple yearly time series assuming a season or a change-point location remains exactly the same across years. In this work, we propose a random effect model that allows the change-point to vary from year to year, following a given distribution. Both parametric and nonparametric methods are developed.

  9. Point Cloud Analysis for Conservation and Enhancement of Modernist Architecture

    Science.gov (United States)

    Balzani, M.; Maietti, F.; Mugayar Kühl, B.

    2017-02-01

    Documentation of cultural assets through improved acquisition processes for advanced 3D modelling is one of the main challenges to be faced in order to support, through digital representation, advanced analysis of the shape, appearance and conservation condition of cultural heritage. 3D modelling can open new avenues in the way tangible cultural heritage is studied, visualized, curated, displayed and monitored, improving key features such as the analysis and visualization of material degradation and state of conservation. Applied research focused on the analysis of surface specifications and material properties by means of a 3D laser scanner survey has been developed within the project of Digital Preservation of the FAUUSP building, Faculdade de Arquitetura e Urbanismo da Universidade de São Paulo, Brazil. The integrated 3D survey has been performed by the DIAPReM Center of the Department of Architecture of the University of Ferrara in cooperation with the FAUUSP. The 3D survey has allowed the realization of a point cloud model of the external surfaces, as the basis to investigate in detail the formal characteristics, geometric textures and surface features. The digital geometric model was also the basis for processing the intensity values acquired by the laser scanning instrument; this method of analysis was an essential complement to the macroscopic investigations, providing additional information related to surface characteristics displayable on the point cloud.

  10. Target Audience of Live Opera Transmissions to Cinema Theatres from the Marketing Point of View

    Directory of Open Access Journals (Sweden)

    Radek Tahal

    2016-03-01

    Opera has a famous history, and even the present-day repertoire in opera houses mostly consists of classical and well-known works. Marketers are trying to find new ways to enable opera lovers all over the world to enjoy top-quality performances. One of the most successful models is real-time transmission of operas to geographically remote cinemas. Cinemas from all around the world participate in the project. In this paper, the authors analyze the spectators' profile and point out differences between North America and the Czech Republic, focusing on transmissions of performances by the Metropolitan Opera in New York. The authors submit a detailed analysis of the socio-demographic characteristics of the spectators and the attendance frequency. Special attention is paid to the marketing profile of Czech spectators, based on primary data gathered in the research. The paper is a combination of research report and business case study. The study reveals that female visitors prevail. Elderly people are also represented in high percentages. The spectators are characterized by refined taste in their lifestyles and familiarity with modern technology.

  11. 21 CFR 123.6 - Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    21 CFR 123.6 (revised as of April 1, 2010): Hazard analysis and Hazard Analysis Critical Control Point (HACCP) plan. (a) Hazard analysis. Every processor shall conduct, or have conducted for it, a hazard analysis to determine whether...

  12. A computational network analysis based on targets of antipsychotic agents.

    Science.gov (United States)

    Gao, Lei; Feng, Shuo; Liu, Zhao-Yuan; Wang, Jiu-Qiang; Qi, Ke-Ke; Wang, Kai

    2018-03-01

    Currently, numerous antipsychotic agents have been developed for the pharmacological treatment of schizophrenia. However, the molecular mechanisms underlying the multiple targets of antipsychotics have yet to be explored. In this study we performed a computational network analysis based on targets of antipsychotic agents. We retrieved a total of 96 targets from 56 antipsychotic agents. By expression enrichment analysis, we identified that the expression of antipsychotic target genes was significantly enriched in liver, brain, blood and corpus striatum. By protein-protein interaction (PPI) network analysis, a PPI network with 77 significantly interconnected target genes was generated. By historeceptomics analysis, significant brain-region-specific target-drug interactions were identified in targets of dopamine receptors (DRD1-Olanzapine in the caudate nucleus and pons; P-value ...). In summary, our study provides a network view of antipsychotic targets and insights into the molecular mechanism of antipsychotic agents. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Tumble Graphs: Avoiding Misleading End Point Extrapolation When Graphing Interactions From a Moderated Multiple Regression Analysis

    Science.gov (United States)

    Bodner, Todd E.

    2016-01-01

    This article revisits how the end points of plotted line segments should be selected when graphing interactions involving a continuous target predictor variable. Under the standard approach, end points are chosen at ±1 or 2 standard deviations from the target predictor mean. However, when the target predictor and moderator are correlated or the…

  14. Assisting People with Multiple Disabilities by Improving Their Computer Pointing Efficiency with an Automatic Target Acquisition Program

    Science.gov (United States)

    Shih, Ching-Hsiang; Shih, Ching-Tien; Peng, Chin-Ling

    2011-01-01

    This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance through an Automatic Target Acquisition Program (ATAP) and a newly developed mouse driver (i.e. a new mouse driver replaces standard mouse driver, and is able to monitor mouse movement and intercept click action). Initially, both…

  15. Percolation analysis for cosmic web with discrete points

    Science.gov (United States)

    Zhang, Jiajun; Cheng, Dalong; Chu, Ming-Chung

    2018-01-01

    Percolation analysis has long been used to quantify the connectivity of the cosmic web. Most of the previous work is based on density fields on grids. By smoothing into fields, we lose information about galaxy properties like shape or luminosity. The lack of mathematical modeling also limits our understanding of the percolation analysis. To overcome these difficulties, we have studied percolation analysis based on discrete points. Using a friends-of-friends (FoF) algorithm, we generate the S-b_b relation between the fractional mass of the largest connected group (S) and the FoF linking length (b_b). We propose a new model, the probability cloud cluster expansion theory, to relate the S-b_b relation to correlation functions. We show that the S-b_b relation reflects a combination of all orders of correlation functions. Using N-body simulation, we find that the S-b_b relation is robust against redshift distortion and incompleteness in observation. From the Bolshoi simulation, with halo abundance matching (HAM), we have generated a mock galaxy catalog. Good matching of the projected two-point correlation function with observation is confirmed. However, comparing the mock catalog with the latest galaxy catalog from the Sloan Digital Sky Survey (SDSS) Data Release 12 (DR12), we have found significant differences in their S-b_b relations. This indicates that the mock galaxy catalog cannot accurately retain correlation functions of higher order than the two-point correlation function, which reveals a limit of the HAM method. As a new measurement, the S-b_b relation is applicable to a wide range of data types, is fast to compute, is robust against redshift distortion and incompleteness, and contains information on all orders of correlation functions.
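
    A minimal version of the S-b_b computation can be written as a friends-of-friends pass over discrete points. The union-find grouping below is a generic sketch; the function name and the O(N^2) pairwise linking are illustrative assumptions, not the authors' code.

```python
import numpy as np

def largest_group_fraction(points, b):
    """Friends-of-friends: link every pair of points closer than b and
    return S, the fraction of points in the largest connected group."""
    n = len(points)
    parent = list(range(n))

    def find(i):                          # union-find with path halving
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    for i in range(n):
        for j in range(i + 1, n):
            if dist[i, j] < b:
                parent[find(i)] = find(j)
    roots = [find(i) for i in range(n)]
    return max(np.bincount(roots)) / n

rng = np.random.default_rng(0)
pts = rng.random((200, 3))                    # 200 points in a unit box
s_small = largest_group_fraction(pts, 0.01)   # almost nothing links
s_large = largest_group_fraction(pts, 0.5)    # nearly everything links
```

    Sweeping b_b from small to large and recording S at each step traces out the S-b_b curve; for real catalogs, a tree-based neighbor search would replace the O(N^2) distance matrix.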

  16. Neutron performance analysis for ESS target proposal

    International Nuclear Information System (INIS)

    Magán, M.; Terrón, S.; Thomsen, K.; Sordo, F.; Perlado, J.M.; Bermejo, F.J.

    2012-01-01

    In the course of discussing different target types for their suitability for the European Spallation Source (ESS), one main focus was neutronic performance. Diverse concepts were assessed, baselining some preliminary engineering and geometrical details and including some optimization. Given the restrictions and resulting uncertainty imposed by the lack of detailed design optimizations at the time of compiling this paper, the conclusion drawn is that there is little difference in the neutronic yield of the investigated targets. Other criteria, such as safety, environmental compatibility, reliability and cost, will thus dominate the choice of an ESS target.

  17. Molecular Composition Analysis of Distant Targets

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose a system capable of probing the molecular composition of cold solar system targets such as asteroids, comets, planets and moons from a distant vantage....

  18. Lidar point density analysis: implications for identifying water bodies

    Science.gov (United States)

    Worstell, Bruce B.; Poppenga, Sandra K.; Evans, Gayla A.; Prince, Sandra

    2014-01-01

    Most airborne topographic light detection and ranging (lidar) systems operate within the near-infrared spectrum. Laser pulses from these systems frequently are absorbed by water and therefore do not generate reflected returns on water bodies in the resulting void regions within the lidar point cloud. Thus, an analysis of lidar voids has implications for identifying water bodies. Data analysis techniques to detect reduced lidar return densities were evaluated for test sites in Blackhawk County, Iowa, and Beltrami County, Minnesota, to delineate contiguous areas that have few or no lidar returns. Results from this study indicated a 5-meter radius moving window with fewer than 23 returns (28 percent of the moving window) was sufficient for delineating void regions. Techniques to provide elevation values for void regions to flatten water features and to force channel flow in the downstream direction also are presented.
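
    The moving-window density test can be sketched as follows. Only the 5 m radius and the 23-return threshold come from the study; the grid layout, synthetic point density and rectangular "lake" are invented for illustration, and `void_mask` is a hypothetical helper.

```python
import numpy as np

def void_mask(x, y, extent, radius=5.0, min_returns=23, cell=1.0):
    """Flag raster cells whose surrounding circular window (radius, m)
    contains fewer than min_returns lidar returns."""
    centers_1d = np.arange(0.0, extent, cell) + cell / 2.0
    gx, gy = np.meshgrid(centers_1d, centers_1d)
    centers = np.column_stack([gx.ravel(), gy.ravel()])
    pts = np.column_stack([x, y])
    d2 = ((centers[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
    counts = (d2 <= radius ** 2).sum(axis=1)    # returns per window
    return (counts < min_returns).reshape(gx.shape)

rng = np.random.default_rng(1)
pts = rng.random((8000, 2)) * 100.0      # ~0.8 returns per square metre
lake = ((pts[:, 0] > 40) & (pts[:, 0] < 60) &
        (pts[:, 1] > 40) & (pts[:, 1] < 60))
pts = pts[~lake]                         # water absorbed these pulses
mask = void_mask(pts[:, 0], pts[:, 1], extent=100.0, cell=10.0)
```

    Cells inside the synthetic lake produce empty windows and are flagged as voids, while cells on "land" comfortably exceed the return threshold.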

  19. Tipping point analysis of a large ocean ambient sound record

    Science.gov (United States)

    Livina, Valerie N.; Harris, Peter; Brower, Albert; Wang, Lian; Sotirakopoulos, Kostas; Robinson, Stephen

    2017-04-01

    We study a long (2003-2015), high-resolution (250 Hz) sound pressure record provided by the Comprehensive Nuclear-Test-Ban Treaty Organisation (CTBTO) from the hydro-acoustic station Cape Leeuwin (Australia). We transform the hydrophone waveforms into five bands of 10-min-average sound pressure levels (including the third-octave band) and apply tipping point analysis techniques [1-3]. We report the results of the analysis of fluctuations and trends in the data and discuss the big-data challenges in processing this record, including handling data segments of large size and possible HPC solutions. References: [1] Livina et al, GRL 2007; [2] Livina et al, Climate of the Past 2010; [3] Livina et al, Chaos 2015.

  20. Global point signature for shape analysis of carpal bones

    International Nuclear Information System (INIS)

    Chaudhari, Abhijit J; Badawi, Ramsey D; Leahy, Richard M; Joshi, Anand A; Wise, Barton L; Lane, Nancy E

    2014-01-01

    We present a method based on spectral theory for the shape analysis of carpal bones of the human wrist. We represent the cortical surface of the carpal bone in a coordinate system based on the eigensystem of the two-dimensional Helmholtz equation. We employ a metric, the global point signature (GPS), that exploits the scale and isometric invariance of eigenfunctions to quantify overall bone shape. We use a fast finite-element method to compute the GPS metric. We capitalize upon the properties of the GPS representation, such as stability, a standard Euclidean (ℓ2) metric definition, and invariance to scaling, translation and rotation, to perform shape analysis of the carpal bones of ten women and ten men from a publicly-available database. We demonstrate the utility of the proposed GPS representation to provide a means for comparing shapes of the carpal bones across populations. (paper)
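
    The core idea of a GPS embedding, eigenfunctions scaled by inverse square-root eigenvalues, can be illustrated on a point cloud with a Gaussian-kernel graph Laplacian standing in for the finite-element Helmholtz solver the paper uses. Everything in this sketch (kernel width, cloud, function name) is assumed for demonstration; only the scaling and the invariance property mirror the GPS construction.

```python
import numpy as np

def gps_embedding(pts, k=5, sigma=0.5):
    """Toy global point signature: eigenvectors of a Gaussian-kernel
    graph Laplacian, each scaled by 1/sqrt(eigenvalue). The kernel uses
    only pairwise distances, so the signature is unchanged by rotating
    or translating the input cloud."""
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=-1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(w, 0.0)
    lap = np.diag(w.sum(axis=1)) - w              # combinatorial Laplacian
    lam, phi = np.linalg.eigh(lap)                # ascending eigenvalues
    return phi[:, 1:k + 1] / np.sqrt(lam[1:k + 1])  # drop constant mode

rng = np.random.default_rng(2)
cloud = rng.random((40, 3))
c, s = np.cos(0.7), np.sin(0.7)
rot = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
g1 = gps_embedding(cloud)
g2 = gps_embedding(cloud @ rot.T + 1.0)   # rotated and translated copy
```

    Because eigenvectors are defined only up to sign, shapes are compared through distances in GPS space rather than through raw signature coordinates.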

  1. Uncertainty analysis of point by point sampling complex surfaces using touch probe CMMs

    DEFF Research Database (Denmark)

    Barini, Emanuele; Tosello, Guido; De Chiffre, Leonardo

    2007-01-01

    The paper describes a study concerning point-by-point scanning of complex surfaces using tactile CMMs. A four-factor, two-level full factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, combined in a single...

  2. Unevenness Point Descriptor for Terrain Analysis in Mobile Robot Applications

    Directory of Open Access Journals (Sweden)

    Mauro Bellone

    2013-07-01

    In recent years, the use of imaging sensors that produce a three-dimensional representation of the environment has become an efficient solution to increase the degree of perception of autonomous mobile robots. Accurate and dense 3D point clouds can be generated from traditional stereo systems and laser scanners or from the new generation of RGB-D cameras, representing a versatile, reliable and cost-effective solution that is rapidly gaining interest within the robotics community. For autonomous mobile robots, it is critical to assess the traversability of the surrounding environment, especially when driving across natural terrain. In this paper, a novel approach to detect traversable and non-traversable regions of the environment from a depth image is presented that could enhance mobility and safety through integration with localization, control and planning methods. The proposed algorithm is based on the analysis of the normal vector of a surface obtained through Principal Component Analysis, and it leads to the definition of a novel descriptor, the Unevenness Point Descriptor. Experimental results, obtained with vehicles operating in indoor and outdoor environments, are presented to validate this approach.
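
    A much-simplified stand-in for this kind of PCA-based terrain analysis is sketched below: per-cell normals from the smallest-eigenvalue eigenvector of the local covariance, and a scalar "unevenness" from how much those normals disagree. This is not the paper's UPD definition, only the underlying idea; the grid, surfaces and function names are assumptions.

```python
import numpy as np

def patch_normal(pts):
    """Unit normal of a point neighbourhood: the eigenvector of the
    covariance matrix with the smallest eigenvalue (PCA)."""
    c = pts - pts.mean(axis=0)
    lam, vec = np.linalg.eigh(c.T @ c)   # eigenvalues sorted ascending
    n = vec[:, 0]
    return n if n[2] >= 0 else -n        # orient all normals upward

def unevenness(pts, cell_ids):
    """Dispersion of per-cell normals: ~0 on a plane, larger on
    uneven ground."""
    normals = np.array([patch_normal(pts[cell_ids == cid])
                        for cid in np.unique(cell_ids)])
    return 1.0 - np.linalg.norm(normals.mean(axis=0))

rng = np.random.default_rng(3)
xy = rng.random((400, 2)) * 4.0
cells = xy[:, 0].astype(int) * 4 + xy[:, 1].astype(int)  # 4x4 grid
flat = np.column_stack([xy, np.zeros(len(xy))])
wavy = np.column_stack([xy, np.sin(3 * xy[:, 0]) * np.sin(3 * xy[:, 1])])
u_flat = unevenness(flat, cells)
u_wavy = unevenness(wavy, cells)
```

    A traversability rule would then threshold such a descriptor per cell, passing smooth regions and flagging uneven ones.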

  3. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel D.

    2015-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described.

  4. Spatially explicit spectral analysis of point clouds and geospatial data

    Science.gov (United States)

    Buscombe, Daniel

    2016-01-01

    The increasing use of spatially explicit analyses of high-resolution spatially distributed data (imagery and point clouds) for the purposes of characterising spatial heterogeneity in geophysical phenomena necessitates the development of custom analytical and computational tools. In recent years, such analyses have become the basis of, for example, automated texture characterisation and segmentation, roughness and grain size calculation, and feature detection and classification, from a variety of data types. In this work, much use has been made of statistical descriptors of localised spatial variations in amplitude variance (roughness); however, the horizontal scale (wavelength) and spacing of roughness elements are rarely considered. This is despite the fact that the ratio of characteristic vertical to horizontal scales is not constant and can yield important information about physical scaling relationships. Spectral analysis is a hitherto under-utilised but powerful means to acquire statistical information about relevant amplitude and wavelength scales, simultaneously and with computational efficiency. Further, quantifying spatially distributed data in the frequency domain lends itself to the development of stochastic models for probing the underlying mechanisms which govern the spatial distribution of geological and geophysical phenomena. The software package PySESA (Python program for Spatially Explicit Spectral Analysis) has been developed for generic analyses of spatially distributed data in both the spatial and frequency domains. Developed predominantly in Python, it accesses libraries written in Cython and C++ for efficiency. It is open source and modular, therefore readily incorporated into, and combined with, other data analysis tools and frameworks, with particular utility for supporting research in the fields of geomorphology, geophysics, hydrography, photogrammetry and remote sensing. The analytical and computational structure of the toolbox is described.
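
    The core of such a spectral analysis, recovering amplitude and wavelength scales simultaneously from the frequency domain, can be shown in one dimension with plain NumPy. This illustrates the idea only, not PySESA's API; the synthetic transect and its parameters are assumptions.

```python
import numpy as np

# Synthetic 1-D elevation transect: a 6.4 m wavelength, 0.4 m amplitude
# roughness signal plus small measurement noise.
n, dx = 1024, 0.1                        # 1024 samples at 0.1 m spacing
x = np.arange(n) * dx
profile = 0.4 * np.sin(2.0 * np.pi * x / 6.4)
profile += 0.02 * np.random.default_rng(4).standard_normal(n)

# One-sided amplitude spectrum; the spectral peak gives both scales.
spec = np.abs(np.fft.rfft(profile)) / n
freq = np.fft.rfftfreq(n, d=dx)
peak = np.argmax(spec[1:]) + 1           # skip the DC bin
wavelength = 1.0 / freq[peak]            # horizontal scale, metres
amplitude = 2.0 * spec[peak]             # vertical scale, metres
```

    A spatially explicit analysis repeats this estimate within a moving window over a 2-D point cloud or grid, mapping how the characteristic scales vary across the surface.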

  5. Amplification-refractory mutation system (ARMS) analysis of point mutations.

    Science.gov (United States)

    Little, S

    2001-05-01

    The amplification-refractory mutation system (ARMS) is a simple method for detecting any mutation involving single base changes or small deletions. ARMS is based on the use of sequence-specific PCR primers that allow amplification of test DNA only when the target allele is contained within the sample. Following an ARMS reaction, the presence or absence of a PCR product is diagnostic for the presence or absence of the target allele. The protocols detailed here outline methods that can be used to analyze human genomic DNA for one or more mutations. The Basic Protocol describes the development and application of an ARMS test for a single mutation; the Alternate Protocol extends this to multiplex ARMS for the analysis of two or more mutations. The Support Protocol describes a rapid DNA extraction method from blood or mouthwash samples that yields DNA compatible with the type of tests described.
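
    Reading out an ARMS test reduces to a presence/absence decision table over two allele-specific reactions. The helper below is hypothetical, not part of any published protocol; real scoring would also verify the internal control amplicon in each reaction.

```python
def arms_genotype(normal_band, mutant_band):
    """Interpret two allele-specific ARMS reactions: each PCR product
    (band) simply reports that the matching allele is in the sample."""
    if normal_band and mutant_band:
        return "heterozygous"
    if mutant_band:
        return "homozygous mutant"
    if normal_band:
        return "homozygous normal"
    return "assay failure"   # neither band: re-check control amplicon

# Example read-out: both reactions produced a band.
call = arms_genotype(normal_band=True, mutant_band=True)
```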

  6. ANALYSIS OF PULSE OPTICAL TARGET SEEKER STATIC CHARACTERISTICS AT TARGET AIRCRAFTS EXPOSURE

    Directory of Open Access Journals (Sweden)

    K. V. Trifonov,

    2016-01-01

    Subject of Research. The paper deals with the operating principles of optical pulse target seekers based on a quadrant photodiode when targets are located in the short-range field region. Method. Target image shape and light intensity distribution can affect static characteristics and cause the appearance of extra image energy maxima when targets are located in the short-range field region. Physical modeling of the static-characteristic plotting process was carried out. The main idea of the proposed method lies in counting sums of image pixel intensities in every virtual area of the sensor while a virtual frame of the whole photodetector moves over the target image. Main Results. The most probable target illumination directions were analyzed. Critical distances at which the first extra image energy maximum appears were calculated for every target illumination direction. The time of uncontrollable missile flight at a near-miss distance was also estimated. Practical Relevance. The research results point out that proper control-loop logic is required to provide reliable target engagement for active and semi-active laser homing systems, and that such systems should be disabled when targets are located in the short-range field region.

  7. Activation analysis utilizing byproduct neutrons of cyclotron internal target runs

    International Nuclear Information System (INIS)

    Koh, K.; Finn, R.; Smith, P.; Tavano, E.; Dwyer, J.; Sheh, H.

    1985-01-01

    The neutron flux generated by the CS-30 cyclotron at Mt Sinai Medical Center during routine internal target runs was characterized by employing various elements as neutron monitors. The characteristic (p,xn) nuclear reactions from internal targets bombarded by 26.5 MeV protons, and from the cyclotron inner wall bombarded by stray protons, produce a neutron flux of approximately 2 x 10^9 cm^-2 s^-1 at energies up to 22 MeV at a point immediately outside the cyclotron vacuum chamber. Samples exposed to neutron fluences up to 5 x 10^14 cm^-2 were analyzed with a Ge(Li) detector. Although the detection limits are relatively high (i.e., Au, 0.2 μg; In, 1 μg; Na, 50 μg), this mode of neutron activation analysis is ancillary to other irradiations and allows a large number of samples to be monitored. This approach may provide an alternative to a neutron generator for research activation applications. (orig.)

  8. Evaluating Acupuncture Point and Nonacupuncture Point Stimulation with EEG: A High-Frequency Power Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Kwang-Ho Choi

    2016-01-01

    To identify physical and sensory responses to acupuncture point stimulation (APS), nonacupuncture point stimulation (NAPS) and no stimulation (NS), changes in the high-frequency power spectrum before and after stimulation were evaluated with electroencephalography (EEG). A total of 37 healthy subjects received APS at the LI4 point, NAPS, or NS with their eyes closed. Background brain waves were measured before, during, and after stimulation using 8 channels. Changes in the power spectra of gamma waves and high beta waves before, during, and after stimulation were comparatively analyzed. After NAPS, absolute high beta power (AHBP), relative high beta power (RHBP), absolute gamma power (AGP), and relative gamma power (RGP) tended to increase in all channels, but no consistent notable changes were found for APS and NS. NAPS is believed to cause temporary reactions to stress, tension, and sensory responses of the human body, whereas the response to APS is stable compared with stimulation of other parts of the body.

  9. Automatic detection of the unknown number point targets in FMICW radar signals

    Czech Academy of Sciences Publication Activity Database

    Rejfek, L.; Mošna, Zbyšek; Beran, L.; Fišer, O.; Dobrovolný, M.

    2017-01-01

    Roč. 4, č. 11 (2017), s. 116-120 ISSN 2313-626X R&D Projects: GA ČR(CZ) GA15-24688S Institutional support: RVO:68378289 Keywords : FMICW radar * 2D FFT * signal filtration * target detection * target parameter estimation Subject RIV: DG - Atmosphere Sciences, Meteorology OBOR OECD: Meteorology and atmospheric sciences http://science-gate.com/IJAAS/Articles/2017-4-11/18%202017-4-11-pp.116-120.pdf

  10. A critical analysis of the tender points in fibromyalgia.

    Science.gov (United States)

    Harden, R Norman; Revivo, Gadi; Song, Sharon; Nampiaparampil, Devi; Golden, Gary; Kirincic, Marie; Houle, Timothy T

    2007-03-01

    To pilot methodologies designed to critically assess the American College of Rheumatology's (ACR) diagnostic criteria for fibromyalgia. Prospective, psychophysical testing. An urban teaching hospital. Twenty-five patients with fibromyalgia and 31 healthy controls (convenience sample). Pressure pain threshold was determined at the 18 ACR tender points and five sham points using an algometer (dolorimeter). The patients' "algometric total scores" (sums of the patients' average pain thresholds at the 18 tender points) were derived, as well as pain thresholds across sham points. The "algometric total score" could differentiate patients with fibromyalgia from normals with an accuracy of 85.7% (P ...). Since patients also reported pain across sham points, not only across ACR tender points, sham points also could be used for diagnosis (85.7%; Ps ...), although this was not tested vs other painful conditions. The points specified by the ACR were only modestly superior to sham points in making the diagnosis. Most importantly, this pilot suggests single points, smaller groups of points, or sham points may be as effective in diagnosing fibromyalgia as the use of all 18 points, and suggests methodologies to definitively test that hypothesis.

  11. Simulations of HXR Foot-point Source Sizes for Modified Thick-target Models

    Czech Academy of Sciences Publication Activity Database

    Moravec, Z.; Varady, Michal; Karlický, Marian; Kašparová, Jana

    2013-01-01

    Roč. 37, č. 2 (2013), s. 535-540 ISSN 1845-8319. [Hvar Astrophysical Colloquium /12./. Hvar, 03.09.2012-07.09.2012] R&D Projects: GA ČR GAP209/10/1680; GA ČR GAP209/12/0103 Institutional support: RVO:67985815 Keywords : solar flares * hard X-rays * foot-point sources Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics

  12. Material-Point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    2007-01-01

The aim of this paper is to test different kinds of spatial interpolation for the material-point method.

  13. Slope failure analysis using the random material point method

    NARCIS (Netherlands)

    Wang, B.; Hicks, M.A.; Vardon, P.J.

    2016-01-01

The random material point method (RMPM), which combines random field theory and the material point method (MPM), is proposed. It differs from the random finite-element method (RFEM), by assigning random field (cell) values to material points that are free to move relative to the computational grid.

  14. Point process analysis of noise in early invertebrate vision.

    Directory of Open Access Journals (Sweden)

    Kris V Parag

    2017-10-01

Noise is a prevalent and sometimes even dominant aspect of many biological processes. While many natural systems have adapted to attenuate or even usefully integrate noise, the variability it introduces often still delimits the achievable precision across biological functions. This is particularly so for visual phototransduction, the process responsible for converting photons of light into usable electrical signals (quantum bumps). Here, randomness of both the photon inputs (regarded as extrinsic noise) and the conversion process (intrinsic noise) are seen as two distinct, independent and significant limitations on visual reliability. Past research has attempted to quantify the relative effects of these noise sources by using approximate methods that do not fully account for the discrete, point process and time ordered nature of the problem. As a result the conclusions drawn from these different approaches have led to inconsistent expositions of phototransduction noise performance. This paper provides a fresh and complete analysis of the relative impact of intrinsic and extrinsic noise in invertebrate phototransduction using minimum mean squared error reconstruction techniques based on Bayesian point process (Snyder) filters. An integrate-fire based algorithm is developed to reliably estimate photon times from quantum bumps and Snyder filters are then used to causally estimate random light intensities both at the front and back end of the phototransduction cascade. Comparison of these estimates reveals that the dominant noise source transitions from extrinsic to intrinsic as light intensity increases. By extending the filtering techniques to account for delays, it is further found that among the intrinsic noise components, which include bump latency (mean delay and jitter) and shape (amplitude and width) variance, it is the mean delay that is critical to noise performance.
As the timeliness of visual information is important for real-time action, this

  15. Point process analysis of noise in early invertebrate vision.

    Science.gov (United States)

    Parag, Kris V; Vinnicombe, Glenn

    2017-10-01

    Noise is a prevalent and sometimes even dominant aspect of many biological processes. While many natural systems have adapted to attenuate or even usefully integrate noise, the variability it introduces often still delimits the achievable precision across biological functions. This is particularly so for visual phototransduction, the process responsible for converting photons of light into usable electrical signals (quantum bumps). Here, randomness of both the photon inputs (regarded as extrinsic noise) and the conversion process (intrinsic noise) are seen as two distinct, independent and significant limitations on visual reliability. Past research has attempted to quantify the relative effects of these noise sources by using approximate methods that do not fully account for the discrete, point process and time ordered nature of the problem. As a result the conclusions drawn from these different approaches have led to inconsistent expositions of phototransduction noise performance. This paper provides a fresh and complete analysis of the relative impact of intrinsic and extrinsic noise in invertebrate phototransduction using minimum mean squared error reconstruction techniques based on Bayesian point process (Snyder) filters. An integrate-fire based algorithm is developed to reliably estimate photon times from quantum bumps and Snyder filters are then used to causally estimate random light intensities both at the front and back end of the phototransduction cascade. Comparison of these estimates reveals that the dominant noise source transitions from extrinsic to intrinsic as light intensity increases. By extending the filtering techniques to account for delays, it is further found that among the intrinsic noise components, which include bump latency (mean delay and jitter) and shape (amplitude and width) variance, it is the mean delay that is critical to noise performance. 
As the timeliness of visual information is important for real-time action, this delay could

  16. Throughput analysis of point-to-multi-point hybrid FSO/RF network

    KAUST Repository

    Rakia, Tamer

    2017-07-31

This paper presents and analyzes a point-to-multi-point (P2MP) network that uses a number of free-space optical (FSO) links for data transmission from the central node to the different remote nodes. A common backup radio-frequency (RF) link is used by the central node for data transmission to any remote node in case of the failure of any one of the FSO links. We develop a cross-layer Markov chain model to study the throughput from the central node to a tagged remote node. Numerical examples are presented to compare the performance of the proposed P2MP hybrid FSO/RF network with that of a P2MP FSO-only network and show that the hybrid network achieves considerable performance improvement over the FSO-only network.
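The paper's actual cross-layer model is not reproduced in the abstract; as a rough illustration of the idea, a two-state Markov chain (FSO up vs. FSO down with RF backup) with made-up transition probabilities and link rates yields a throughput estimate as follows.

```python
# Hypothetical sketch only: model one remote node's link as a two-state
# Markov chain, find the stationary distribution by power iteration, and
# weight each state's data rate by it. All numbers below are illustrative.

def stationary(P, iters=1000):
    """Stationary distribution of a row-stochastic matrix (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

P = [[0.99, 0.01],   # FSO up: small chance of failing this slot
     [0.30, 0.70]]   # FSO down: RF carries traffic until FSO recovers
rates = [10.0, 1.0]  # assumed Gbit/s on FSO vs. shared RF backup

pi = stationary(P)
throughput = sum(p * r for p, r in zip(pi, rates))
print(pi, throughput)
```

The backup link keeps the average throughput close to the FSO rate because the chain spends most of its time in the FSO-up state.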

  17. A TARGETED SEARCH FOR POINT SOURCES OF EeV NEUTRONS

    Energy Technology Data Exchange (ETDEWEB)

    Aab, A. [Universität Siegen, Siegen (Germany); Abreu, P.; Andringa, S. [Laboratório de Instrumentação e Física Experimental de Partículas-LIP and Instituto Superior Técnico-IST, Universidade de Lisboa-UL (Portugal); Aglietta, M. [Osservatorio Astrofisico di Torino (INAF), Università di Torino and Sezione INFN, Torino (Italy); Ahlers, M. [University of Wisconsin, Madison, WI (United States); Ahn, E. J. [Fermilab, Batavia, IL (United States); Al Samarai, I. [Institut de Physique Nucléaire d' Orsay (IPNO), Université Paris 11, CNRS-IN2P3, Orsay (France); Albuquerque, I. F. M. [Universidade de São Paulo, Instituto de Física, São Paulo, SP (Brazil); Allekotte, I. [Centro Atómico Bariloche and Instituto Balseiro (CNEA-UNCuyo-CONICET), San Carlos de Bariloche (Argentina); Allen, J. [New York University, New York, NY (United States); Allison, P. [Ohio State University, Columbus, OH (United States); Almela, A. [Universidad Tecnológica Nacional-Facultad Regional Buenos Aires, Buenos Aires (Argentina); Castillo, J. Alvarez [Universidad Nacional Autonoma de Mexico, Mexico, D. F. (Mexico); Alvarez-Muñiz, J. [Universidad de Santiago de Compostela (Spain); Batista, R. Alves [Universität Hamburg, Hamburg (Germany); Ambrosio, M.; Aramo, C. [Università di Napoli " Federico II" and Sezione INFN, Napoli (Italy); Aminaei, A. [IMAPP, Radboud University Nijmegen (Netherlands); Anchordoqui, L. [University of Wisconsin, Milwaukee, WI (United States); Arqueros, F. [Universidad Complutense de Madrid, Madrid (Spain); Collaboration: Pierre Auger Collaboration101; and others

    2014-07-10

A flux of neutrons from an astrophysical source in the Galaxy can be detected in the Pierre Auger Observatory as an excess of cosmic-ray air showers arriving from the direction of the source. To avoid the statistical penalty for making many trials, classes of objects are tested in combinations as nine "target sets", in addition to the search for a neutron flux from the Galactic center or from the Galactic plane. Within a target set, each candidate source is weighted in proportion to its electromagnetic flux, its exposure to the Auger Observatory, and its flux attenuation factor due to neutron decay. These searches do not find evidence for a neutron flux from any class of candidate sources. Tabulated results give the combined p-value for each class, with and without the weights, and also the flux upper limit for the most significant candidate source within each class. These limits on fluxes of neutrons significantly constrain models of EeV proton emission from non-transient discrete sources in the Galaxy.

  18. A Targeted Search for Point Sources of EeV Photons with the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Aab, A. [Institute for Mathematics, Astrophysics and Particle Physics (IMAPP), Radboud Universiteit, Nijmegen (Netherlands); Abreu, P. [Laboratório de Instrumentação e Física Experimental de Partículas—LIP and Instituto Superior Técnico—IST, Universidade de Lisboa—UL, Lisbon (Portugal); Aglietta, M. [INFN, Sezione di Torino, Torino (Italy); Samarai, I. Al [Laboratoire de Physique Nucléaire et de Hautes Energies (LPNHE), Universités Paris 6 et Paris 7, CNRS-IN2P3, Paris (France); Albuquerque, I. F. M. [Universidade de São Paulo, Inst. de Física, São Paulo (Brazil); Allekotte, I. [Centro Atómico Bariloche and Instituto Balseiro (CNEA-UNCuyo-CONICET), San Carlos de Bariloche (Argentina); Almela, A. [Instituto de Tecnologías en Detección y Astropartículas (CNEA, CONICET, UNSAM), Centro Atómico Constituyentes, Comisión Nacional de Energía Atómica, Buenos Aires (Argentina); Castillo, J. Alvarez [Universidad Nacional Autónoma de México, México, D. F., México (Mexico); Alvarez-Muñiz, J. [Universidad de Santiago de Compostela, La Coruña (Spain); Anastasi, G. A. [Gran Sasso Science Institute (INFN), L’Aquila (Italy); and others

    2017-03-10

Simultaneous measurements of air showers with the fluorescence and surface detectors of the Pierre Auger Observatory allow a sensitive search for EeV photon point sources. Several Galactic and extragalactic candidate objects are grouped in classes to reduce the statistical penalty of many trials from that of a blind search and are analyzed for a significant excess above the background expectation. The presented search does not find any evidence for photon emission at candidate sources, and combined p-values for every class are reported. Particle and energy flux upper limits are given for selected candidate sources. These limits significantly constrain predictions of EeV proton emission models from non-transient Galactic and nearby extragalactic sources, as illustrated for the particular case of the Galactic center region.

  19. Change point analysis of mean annual air temperature in Iran

    Science.gov (United States)

    Shirvani, A.

    2015-06-01

    The existence of change point in the mean of air temperature is an important indicator of climate change. In this study, Student's t parametric and Mann-Whitney nonparametric Change Point Models (CPMs) were applied to test whether a change point has occurred in the mean of annual Air Temperature Anomalies Time Series (ATATS) of 27 synoptic stations in different regions of Iran for the period 1956-2010. The Likelihood Ratio Test (LRT) was also applied to evaluate the detected change points. The ATATS of all stations except Bandar Anzali and Gorgan stations, which were serially correlated, were transformed to produce an uncorrelated pre-whitened time series as an input file for the CPMs and LRT. Both the Student's t and Mann-Whitney CPMs detected the change point in the ATATS of (a) Tehran Mehrabad, Abadan, Kermanshah, Khoramabad and Yazd in 1992, (b) Mashhad and Tabriz in 1993, (c) Bandar Anzali, Babolsar and Ramsar in 1994, (d) Kerman and Zahedan in 1996 at 5% significance level. The likelihood ratio test shows that the ATATS before and after detected change points in these 12 stations are normally distributed with different means. The Student's t and Mann-Whitney CPMs suggested different change points for individual stations in Bushehr, Bam, Shahroud, and Gorgan. However, the LRT confirmed the change points in these four stations as 1997, 1996, 1993, and 1996, respectively. No change points were detected in the remaining 11 stations.
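The Student's-t change point idea can be sketched as a scan over candidate split points, keeping the split with the largest two-sample t statistic; the series below is synthetic, and the sketch omits the pre-whitening step the study applies to serially correlated stations.

```python
# Minimal change point scan: for each candidate split k, compute the pooled
# two-sample t statistic for a difference in means before/after k, and pick
# the split with the largest |t|. The anomaly series below is synthetic.
import math

def t_statistic(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (mb - ma) / (sp * math.sqrt(1 / na + 1 / nb))

def best_change_point(series):
    # keep at least two points on each side of the split
    return max(range(2, len(series) - 1),
               key=lambda k: abs(t_statistic(series[:k], series[k:])))

# Synthetic anomaly series whose mean shifts upward at index 8.
series = [0.1, -0.2, 0.0, 0.2, -0.1, 0.1, 0.0, -0.1,
          0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9, 1.1]
print(best_change_point(series))  # -> 8
```

A full analysis would also assess the significance of the maximal statistic, as the study does with the likelihood ratio test.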

  20. Thermal Hydraulic and Structural Analysis of Liquid Metal Target System

    International Nuclear Information System (INIS)

    Lee, Yong Suk; Chung, Chang Hyun

    2002-01-01

Research on a subcritical transmutation reactor for the treatment of spent fuel is in progress. The subcritical transmutation reactor needs a target system to produce high-energy neutrons. In the target system, the beam window is subject to a high thermal load because it interacts with the high-energy proton beam. In this study, the target was designed based on thermal-hydraulic analysis, and a thermal-structural analysis of the window was performed. Preliminary design and mechanical analysis of a liquid Pb-Bi target and a 9Cr-2WVTa window were carried out. The target was designed to decrease the window temperature; installation of a diffuser plate with higher porosity in the central zone was considered. Temperature and stress of the window were analyzed while varying the minimum window thickness, beam power, and coolant flow rate. Thermal-bending stress was generated in the window because of the temperature gradient through its thickness, whereas the coolant flow rate had an insignificant effect on window stresses. It can be concluded that the target and window can be used under transmutation reactor operating conditions (1 GeV, 6.78 mA). In this study, only a static analysis has been made; however, beam trips can occur frequently during accelerator operation, so dynamic stress analyses of the window and target container will be needed, as will studies of the corrosion and irradiation characteristics of the window. (authors)

  1. Gene Expression Profile of Glioblastoma Multiforme Invasive Phenotype Points to New Therapeutic Targets

    Directory of Open Access Journals (Sweden)

    Dominique B. Hoelzinger

    2005-01-01

The invasive phenotype of glioblastoma multiforme (GBM) is a hallmark of the malignant process, yet the molecular mechanisms that dictate this locally invasive behavior remain poorly understood. Gene expression profiles of human glioma cells were assessed from laser capture-microdissected GBM cells collected from paired patient tumor cores and white matter-invading cell populations. Changes in gene expression in invading GBM cells were validated by quantitative reverse transcription polymerase chain reaction (QRT-PCR) and immunohistochemistry in an independent sample set. QRT-PCR confirmed the differential expression in 19 of 21 genes tested. Immunohistochemical analyses of autotaxin (ATX), ephrin-B3, B-cell lymphoma-w (BCLW), and protein tyrosine kinase 2 beta showed them to be expressed in invasive glioma cells. The known GBM markers, insulin-like growth factor binding protein 2 and vimentin, were robustly expressed in the tumor core. A glioma invasion tissue microarray confirmed the expression of ATX and BCLW in invasive cells of tumors of various grades. GBM phenotypic and genotypic heterogeneity is well documented. In this study, we show an additional layer of complexity: transcriptional differences between cells of the tumor core and invasive cells located in the brain parenchyma. Gene products supporting invasion may be novel targets for manipulation of brain tumor behavior with consequences on treatment outcome.

  2. psRNATarget: a plant small RNA target analysis server.

    Science.gov (United States)

    Dai, Xinbin; Zhao, Patrick Xuechun

    2011-07-01

Plant endogenous non-coding short small RNAs (20-24 nt), including microRNAs (miRNAs) and a subset of small interfering RNAs (ta-siRNAs), play important roles in gene expression regulatory networks (GRNs). For example, many transcription factors and development-related genes have been reported as targets of these regulatory small RNAs. Although a number of miRNA target prediction algorithms and programs have been developed, most of them were designed for animal miRNAs, which are significantly different from plant miRNAs in the target recognition process. These differences demand the development of separate plant miRNA (and ta-siRNA) target analysis tool(s). We present psRNATarget, a plant small RNA target analysis server, which features two important analysis functions: (i) reverse complementary matching between small RNA and target transcript using a proven scoring schema, and (ii) target-site accessibility evaluation by calculating unpaired energy (UPE) required to 'open' secondary structure around the small RNA's target site on the mRNA. The psRNATarget incorporates recent discoveries in plant miRNA target recognition, e.g. it distinguishes translational and post-transcriptional inhibition, and it reports the number of small RNA/target site pairs that may affect small RNA binding activity to the target transcript. The psRNATarget server is designed for high-throughput analysis of next-generation data with an efficient distributed computing back-end pipeline that runs on a Linux cluster. The server front-end integrates three simplified user-friendly interfaces to accept user-submitted or preloaded small RNAs and transcript sequences, and outputs a comprehensive list of small RNA/target pairs along with online tools for batch downloading, keyword searching and results sorting. The psRNATarget server is freely available at http://plantgrn.noble.org/psRNATarget/.
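The first of the two psRNATarget functions, reverse complementary matching, can be illustrated with a toy scorer; the server's real scoring schema (G:U wobble penalties, position weighting) is not reproduced here, and the sequences are made up.

```python
# Toy version of reverse-complementary matching: reverse-complement the
# small RNA, slide it along the transcript, and count mismatches per
# window. psRNATarget's actual schema weights mismatch types/positions.

COMP = str.maketrans("AUGC", "UACG")

def revcomp(rna):
    return rna.translate(COMP)[::-1]

def best_target_site(small_rna, transcript):
    probe = revcomp(small_rna)
    n = len(probe)
    best = None
    for i in range(len(transcript) - n + 1):
        window = transcript[i:i + n]
        mismatches = sum(a != b for a, b in zip(probe, window))
        if best is None or mismatches < best[1]:
            best = (i, mismatches)
    return best  # (start position on transcript, mismatch count)

srna = "UGGAGCUCC"            # toy small RNA (real ones are 20-24 nt)
mrna = "AAACGGAGCUCCAAAA"
print(best_target_site(srna, mrna))  # -> (4, 0)
```

A perfect target site is simply a transcript window identical to the small RNA's reverse complement; the second psRNATarget function would then check that this window is accessible in the mRNA's secondary structure.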

  3. Pressure Points in Reading Comprehension: A Quantile Multiple Regression Analysis

    Science.gov (United States)

    Logan, Jessica

    2017-01-01

    The goal of this study was to examine how selected pressure points or areas of vulnerability are related to individual differences in reading comprehension and whether the importance of these pressure points varies as a function of the level of children's reading comprehension. A sample of 245 third-grade children were given an assessment battery…

  4. A targeted search for point sources of eev photons with the Pierre Auger Observatory

    Czech Academy of Sciences Publication Activity Database

    Aab, A.; Abreu, P.; Aglietta, M.; Blažek, Jiří; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Mandát, Dušan; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovánek, Petr; Trávníček, Petr; Vícha, Jakub

    2017-01-01

    Roč. 837, č. 2 (2017), 1-7, č. článku L25. ISSN 2041-8205 R&D Projects: GA MŠk LM2015038; GA MŠk LG15014; GA ČR(CZ) GA14-17501S Institutional support: RVO:68378271 Keywords : astroparticle physics * cosmic rays * methods * data analysis Subject RIV: BF - Elementary Particles and High Energy Physics OBOR OECD: Particles and field physics Impact factor: 5.522, year: 2016

  5. Pharmacological receptors of nematoda as target points for action of antiparasitic drugs

    Directory of Open Access Journals (Sweden)

    Trailović Saša M.

    2010-01-01

Cholinergic receptors of parasitic nematodes are among the most important sites of action of antiparasitic drugs. This paper presents some of our own results from electrophysiological and pharmacological examinations of nicotinic and muscarinic receptors of nematodes, as well as data from the literature on a new class of anthelmintics that act precisely on cholinergic receptors. The nicotinic acetylcholine receptor (nAChR) is located on somatic muscle cells of nematodes and is responsible for the coordination of parasite movement. Cholinomimetic anthelmintics act on this receptor, as does acetylcholine, the endogenous neurotransmitter, but they are not sensitive to the enzyme acetylcholinesterase, which breaks down acetylcholine. As opposed to the vertebrate nicotinic receptor, whose structure has been examined thoroughly, the stoichiometry of the nematode nicotinic receptor is not completely known. However, on the grounds of knowledge acquired so far, a model has recently been constructed of the potential composition of one type of nematode nicotinic receptor as a site of action of anthelmintics. Based on earlier investigations, it is supposed that a conventional muscarinic receptor exists in nematodes as well, so that it too can be a new pharmacological target for the development of antinematode drugs. The latest class of synthesized anthelmintics, the amino-acetonitrile derivatives (AADs), act via the nicotinic receptor. Monepantel is the first drug from the AAD group and the most significant candidate for registration in veterinary medicine. Even though several groups of cholinomimetic anthelmintics (imidazothiazoles, tetrahydropyrimidines, organophosphate anthelmintics) have been in use in veterinary practice for many years now, it is evident that cholinergic receptors of nematodes still present an attractive focus for the development of new antinematode drugs.

  6. A targeted search for point sources of EeV neutrons

    Czech Academy of Sciences Publication Activity Database

    Aab, A.; Abreu, P.; Aglietta, M.; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Mandát, Dušan; Nečesal, Petr; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovánek, Petr; Trávníček, Petr; Vícha, Jakub

    2014-01-01

    Roč. 789, č. 2 (2014), s. 1-7 ISSN 2041-8205 R&D Projects: GA ČR(CZ) GA14-17501S; GA MŠk(CZ) 7AMB14AR005; GA MŠk(CZ) LG13007 Institutional support: RVO:68378271 Keywords : cosmic rays * Galaxy * disk * methods * data analysis Subject RIV: BF - Elementary Particles and High Energy Physics Impact factor: 5.339, year: 2014 http://iopscience.iop.org/2041-8205/789/2/L34/pdf/2041-8205_789_2_L34.pdf

  7. Fast Change Point Detection for Electricity Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Berkeley, UC; Gu, William; Choi, Jaesik; Gu, Ming; Simon, Horst; Wu, Kesheng

    2013-08-25

Electricity is a vital part of our daily life; therefore it is important to avoid irregularities such as the California Electricity Crisis of 2000 and 2001. In this work, we seek to predict anomalies using advanced machine learning algorithms. These algorithms are effective, but computationally expensive, especially if we plan to apply them on hourly electricity market data covering a number of years. To address this challenge, we significantly accelerate the computation of the Gaussian Process (GP) for time series data. In the context of a Change Point Detection (CPD) algorithm, we reduce its computational complexity from O(n^5) to O(n^2). Our efficient algorithm makes it possible to compute the Change Points using the hourly price data from the California Electricity Crisis. By comparing the detected Change Points with known events, we show that the Change Point Detection algorithm is indeed effective in detecting signals preceding major events.

  8. Assessment of Response Surface Models using Independent Confirmation Point Analysis

    Science.gov (United States)

    DeLoach, Richard

    2010-01-01

    This paper highlights various advantages that confirmation-point residuals have over conventional model design-point residuals in assessing the adequacy of a response surface model fitted by regression techniques to a sample of experimental data. Particular advantages are highlighted for the case of design matrices that may be ill-conditioned for a given sample of data. The impact of both aleatory and epistemological uncertainty in response model adequacy assessments is considered.

  9. 21 CFR 120.8 - Hazard Analysis and Critical Control Point (HACCP) plan.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Hazard Analysis and Critical Control Point (HACCP... SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS General Provisions § 120.8 Hazard Analysis and Critical Control Point (HACCP) plan. (a) HACCP plan. Each...

  10. Endoscopic Third Ventriculostomy: Outcome Analysis of an Anterior Entry Point.

    Science.gov (United States)

    Aref, Mohammed; Martyniuk, Amanda; Nath, Siddharth; Koziarz, Alex; Badhiwala, Jetan; Algird, Almunder; Farrokhyar, Forough; Almenawer, Saleh A; Reddy, Kesava

    2017-08-01

    Endoscopic third ventriculostomy (ETV) is a safe and effective treatment for hydrocephalus. An entry point located 4 cm anterior to the coronal suture, 3 cm anterior to Kocher point, and approximately 9 cm from the pupil at the midpupillary line has been used successfully for the last 20 years in our center. We aimed to evaluate this alternative anterior entry point routinely used for ETV, with or without concurrent endoscopic biopsy. Patients undergoing this proposed entry point were examined to evaluate its safety and efficacy. Factors such as patients' age, sex, hydrocephalus etiology, tumor location and pathology, and complication rate were examined through regression analyses to evaluate their impact on tumor biopsy and ETV success rates, and the need for subsequent ventricular shunting. A total of 131 patients were included in the study. ETV was successful in 125 (95.4%) patients. Of these, 26 (19.8%) patients required a biopsy, which was successful in 21 (80.8%) cases. A complication was observed in 10 (7.6%) patients, with a trend toward complications occurring after ETV failure. There was no association between ETV success rate and patients' age (P = 0.5) or sex (P = 0.99). The anterior entry point is a safe and effective method for ETV, especially when considering concurrent ventricular tumor biopsy. This entry point may be considered as a more minimally invasive procedure when using rigid endoscopy and may also eliminate the need for a flexible scope. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Risk-analysis of global climate tipping points

    Energy Technology Data Exchange (ETDEWEB)

    Frieler, Katja; Meinshausen, Malte; Braun, N. [Potsdam Institute for Climate Impact Research e.V., Potsdam (Germany). PRIMAP Research Group] [and others

    2012-09-15

    There are many elements of the Earth system that are expected to change gradually with increasing global warming. Changes might prove to be reversible after global warming returns to lower levels. But there are others that have the potential of showing a threshold behavior. This means that these changes would imply a transition between qualitatively disparate states which can be triggered by only small shifts in background climate (2). These changes are often expected not to be reversible by returning to the current level of warming. The reason for that is, that many of them are characterized by self-amplifying processes that could lead to a new internally stable state which is qualitatively different from before. There are different elements of the climate system that are already identified as potential tipping elements. This group contains the mass losses of the Greenland and the West-Antarctic Ice Sheet, the decline of the Arctic summer sea ice, different monsoon systems, the degradation of coral reefs, the dieback of the Amazon rainforest, the thawing of the permafrost regions as well as the release of methane hydrates (3). Crucially, these tipping elements have regional to global scale effects on human society, biodiversity and/or ecosystem services. Several examples may have a discernable effect on global climate through a large-scale positive feedback. This means they would further amplify the human induced climate change. These tipping elements pose risks comparable to risks found in other fields of human activity: high-impact events that have at least a few percent chance to occur classify as high-risk events. In many of these examples adaptation options are limited and prevention of occurrence may be a more viable strategy. Therefore, a better understanding of the processes driving tipping points is essential. There might be other tipping elements even more critical but not yet identified. These may also lie within our socio-economic systems that are

  12. Material-Point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    2007-01-01

The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown that the smoother field representation using the cubic splines yields a physically more realistic behaviour for impact problems than the traditional linear interpolation.

  13. Analysis of multiple end points in consumer research in support of switching drugs from prescription to over-the-counter status: the concept of end-point hierarchies.

    Science.gov (United States)

    Brass, E P; Shay, L E; Leonard-Segal, A

    2009-04-01

    Clinical and regulatory decision making concerning over-the-counter (OTC) drugs requires research designed to understand how consumers will self-manage treatment using the candidate OTC drug. Consumer research for an OTC drug may include studies of label comprehension, self-selection, and actual use. Definition and analysis of end points for these trials have varied in the absence of consensus on optimal approaches. Research programs should prospectively prioritize the importance of label messages based on their roles in the safe and effective use of the drug. The assessment of messages for which failure to heed warnings will expose the consumer to increased risk or clinically relevant treatment failure should receive the highest priority as study end points. Based on the consequences of unheeded warnings, message-specific targets for appropriate response rates can be predefined. This prospective, hierarchical approach to end-point definition, combined with prespecification of targeted correct-response rates, has the potential to increase the scientific rigor and regulatory utility of these important research studies.

  14. Washing and chilling as critical control points in pork slaughter hazard analysis and critical control point (HACCP) systems.

    Science.gov (United States)

    Bolton, D J; Pearce, R A; Sheridan, J J; Blair, I S; McDowell, D A; Harrington, D

    2002-01-01

The aim of this research was to examine the effects of preslaughter washing, pre-evisceration washing, final carcass washing and chilling on final carcass quality and to evaluate these operations as possible critical control points (CCPs) within a pork slaughter hazard analysis and critical control point (HACCP) system. This study estimated bacterial numbers (total viable counts) and the incidence of Salmonella at three surface locations (ham, belly and neck) on 60 animals/carcasses processed through a small commercial pork abattoir (80 pigs d(-1)). Significant reductions in bacterial numbers were observed at these stages, supporting the use of washing and chilling as CCPs within HACCP in pork slaughter plants. This research will provide a sound scientific basis on which to develop and implement effective HACCP in pork abattoirs.
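Reductions of this kind are conventionally reported as log10 reductions in total viable counts; a minimal sketch with made-up counts:

```python
# Log10 reduction in total viable counts (TVC, cfu/cm^2) achieved by a
# processing step such as carcass washing or chilling. Counts are made up.
import math

def log_reduction(before_cfu, after_cfu):
    return math.log10(before_cfu) - math.log10(after_cfu)

print(round(log_reduction(1.0e4, 2.0e2), 2))  # -> 1.7
```

A 1.7-log reduction corresponds to roughly a 50-fold drop in viable counts; comparing such figures before and after each operation is how a step is judged as a candidate CCP.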

  15. Accurate mitochondrial DNA sequencing using off-target reads provides a single test to identify pathogenic point mutations.

    Science.gov (United States)

    Griffin, Helen R; Pyle, Angela; Blakely, Emma L; Alston, Charlotte L; Duff, Jennifer; Hudson, Gavin; Horvath, Rita; Wilson, Ian J; Santibanez-Koref, Mauro; Taylor, Robert W; Chinnery, Patrick F

    2014-12-01

Mitochondrial disorders are a common cause of inherited metabolic disease and can be due to mutations affecting mitochondrial DNA or nuclear DNA. The current diagnostic approach involves the targeted resequencing of mitochondrial DNA and candidate nuclear genes, usually proceeds step by step, and is time consuming and costly. Recent evidence suggests that variations in mitochondrial DNA sequence can be obtained from whole-exome sequence data, raising the possibility of a comprehensive single diagnostic test to detect pathogenic point mutations. We compared the mitochondrial DNA sequence derived from off-target exome reads with conventional mitochondrial DNA Sanger sequencing in 46 subjects. Mitochondrial DNA sequences can be reliably obtained using three different whole-exome sequence capture kits. Coverage correlates with the relative amount of mitochondrial DNA in the original genomic DNA sample, heteroplasmy levels can be determined using variant and total read depths, and, provided there is a minimum read depth of 20-fold, rare sequencing errors occur at a rate similar to that observed with conventional Sanger sequencing. This offers the prospect of using whole-exome sequence in a diagnostic setting to screen not only all protein coding nuclear genes but also all mitochondrial DNA genes for pathogenic mutations. Off-target mitochondrial DNA reads can also be used to assess quality control and maternal ancestry, inform on ethnic origin, and allow genetic disease association studies not previously anticipated with existing whole-exome data sets.
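The heteroplasmy estimate described above (variant read depth over total read depth, gated on the 20-fold minimum coverage) can be sketched as follows; the depths are hypothetical:

```python
# Heteroplasmy as the fraction of reads at an mtDNA position carrying the
# variant allele, refusing to call positions below the 20-fold minimum
# depth the authors require. The read depths below are hypothetical.

MIN_DEPTH = 20

def heteroplasmy(variant_reads, total_reads):
    if total_reads < MIN_DEPTH:
        return None  # too shallow to call reliably
    return variant_reads / total_reads

print(heteroplasmy(35, 100))   # -> 0.35
print(heteroplasmy(3, 10))     # -> None
```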

  16. The 21 point vision analysis without a phoropter.

    Science.gov (United States)

    Kasperek, E L; Hatfield, C

    1975-10-01

    This paper offers a sequence of valid, objective tests to measure vision performance and to provide an appropriate prescription for an individual who is unable to meet the demands of a 21 point analytical refraction through a phoropter because of age, intelligence or communication difficulty.

  17. Material-Point Analysis of Large-Strain Problems

    DEFF Research Database (Denmark)

    Andersen, Søren

    , it is possible to predict if a certain slope is stable using commercial finite element or finite difference software such as PLAXIS, ABAQUS or FLAC. However, the dynamics during a landslide are less explored. The material-point method (MPM) is a novel numerical method aimed at analysing problems involving...

  18. Material-point Method Analysis of Bending in Elastic Beams

    DEFF Research Database (Denmark)

    Andersen, Søren Mikkel; Andersen, Lars

    The aim of this paper is to test different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines. A brief introduction to the material-point method is given. Simple linear-elastic problems are tested, including the classical...... cantilevered beam problem. As shown in the paper, the use of negative shape functions is not consistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations of field quantities. It is shown...... that the smoother field representation using the cubic splines yields a physically more realistic behaviour for impact problems than the traditional linear interpolation....

  19. Analysis of Spatial Interpolation in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2010-01-01

    This paper analyses different types of spatial interpolation for the material-point method. The interpolations include quadratic elements and cubic splines in addition to the standard linear shape functions usually applied. For the small-strain problem of a vibrating bar, the best results...... are obtained using quadratic elements. It is shown that for more complex problems, the use of partially negative shape functions is inconsistent with the material-point method in its current form, necessitating other types of interpolation such as cubic splines in order to obtain smoother representations...... of field quantities. The properties of different interpolation functions are analysed using numerical examples, including the classical cantilevered beam problem....

  20. Unified analysis of preconditioning methods for saddle point matrices

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe

    2015-01-01

    Vol. 22, No. 2 (2015), pp. 233-253 ISSN 1070-5325 R&D Projects: GA MŠk ED1.1.00/02.0070 Institutional support: RVO:68145535 Keywords: saddle point problems * preconditioning * spectral properties Subject RIV: BA - General Mathematics Impact factor: 1.431, year: 2015 http://onlinelibrary.wiley.com/doi/10.1002/nla.1947/pdf

  1. Hazard Analysis and Critical Control Point Program for Foodservice Establishments.

    Science.gov (United States)

    Control Point (HACCP) inspections in foodservice operations throughout the state. The HACCP system, which first emerged in the late 1960s, is a rational...has been adopted for use in the foodservice industry. The HACCP system consists of three main components which are the: (1) Assessment of the hazards...operations. This manual was developed to assist local sanitarians in conducting HACCP inspections and in educating foodservice operators and employees

  2. [Segment analysis of the target market of physiotherapeutic services].

    Science.gov (United States)

    Babaskin, D V

    2010-01-01

    The objective of the present study was to demonstrate the possibilities for analysing selected segments of the target market of physiotherapeutic services provided by medical and preventive facilities of two major types. The main features of a target segment, such as provision of therapeutic massage, are illustrated in terms of two characteristics, namely attractiveness to the users and the ability of a given medical facility to satisfy their requirements. Based on the analysis of the portfolio of available target segments, the most promising ones (winner segments) were selected for further marketing studies. This choice does not exclude the possibility of involvement of other segments of medical services in marketing activities.

  3. Crossing thresholds: Analysis of hazardous tipping points in alpine catchments

    Science.gov (United States)

    Lutzmann, Silke; Sass, Oliver

    2016-04-01

    Steep mountain channels or torrents in small alpine catchments are characterized by high geomorphic activity with sediment dynamics being inherently nonlinear and threshold-mediated. Localized, high intensity rainstorms can drive torrential systems past a tipping point resulting in a sudden onset of hazardous events like (flash-) flooding, heavy bedload transport or debris flows. Such responses exhibit an abrupt switch in the fluvial system's mode (e.g. transport / supply limited). Changes in functional connectivity may persist beyond the tipping point. Torrential hazards cause costly damage in the densely populated Alpine Region. Thus, there is a rising interest in potential effects of climate change on torrential sediment dynamics. Understanding critical conditions close to tipping points is important to reduce uncertainty in predicting sediment fluxes. In this study we aim at (i) establishing threshold precipitation characteristics for the Eastern Alps of Austria. Precipitation is hypothesized to be the main forcing factor of torrential events. (ii) How do thresholds vary in space and time? (iii) The effect of external triggers is strongly mediated by the internal disposition of catchments to respond. Which internal conditions are critical for susceptibility? (iv) Is there a change in magnitude or frequency in the recent past and what can be expected for the future? The 71 km² catchment of the river Schöttlbach in the East Alpine Region of Styria (Austria) has been monitored since a heavy precipitation event resulted in a catastrophic flood in July 2011. Sediment mobilization from slopes as well as within-channel storage and bedload transport are regularly measured using photogrammetric methods and sediment impact sensors. Thus, detailed knowledge exists on magnitude and spatial propagation of sediment waves through the catchment. The associated hydro-meteorological (pre-) conditions can be inferred from a dense station network. Changing bedload transport rates and

  4. Mitochondrially targeted ZFNs for selective degradation of pathogenic mitochondrial genomes bearing large-scale deletions or point mutations

    Science.gov (United States)

    Gammage, Payam A; Rorbach, Joanna; Vincent, Anna I; Rebar, Edward J; Minczuk, Michal

    2014-01-01

    We designed and engineered mitochondrially targeted obligate heterodimeric zinc finger nucleases (mtZFNs) for site-specific elimination of pathogenic human mitochondrial DNA (mtDNA). We used mtZFNs to target and cleave mtDNA harbouring the m.8993T>G point mutation associated with neuropathy, ataxia, retinitis pigmentosa (NARP) and the “common deletion” (CD), a 4977-bp repeat-flanked deletion associated with adult-onset chronic progressive external ophthalmoplegia and, less frequently, Kearns-Sayre and Pearson's marrow pancreas syndromes. Expression of mtZFNs led to a reduction in mutant mtDNA haplotype load, and subsequent repopulation of wild-type mtDNA restored mitochondrial respiratory function in a CD cybrid cell model. This study constitutes proof-of-principle that, through heteroplasmy manipulation, delivery of site-specific nuclease activity to mitochondria can alleviate a severe biochemical phenotype in primary mitochondrial disease arising from deleted mtDNA species. PMID:24567072

  5. Analysis of target implosion irradiated by proton beam, (1)

    International Nuclear Information System (INIS)

    Tamba, Moritake; Nagata, Norimasa; Kawata, Shigeo; Niu, Keishiro.

    1982-10-01

    Numerical simulation and analysis were performed for the implosion of a hollow shell target driven by a proton beam. The target consists of three layers of Pb, Al and DT. As the Al layer is heated by the proton beam, the layer expands and pushes the DT layer toward the target center. To obtain the optimal velocity of DT implosion, the optimal target size and optimal layer thickness were determined. The target size is determined by, for example, the instability of the implosion or beam focusing on the target surface. The Rayleigh-Taylor instability and the unstable implosion due to inhomogeneity were investigated. Dissipation, nonlinear effects and the density gradient at the boundary were expected to reduce the growth rate of the Rayleigh-Taylor instability during the implosion. For the deviation of the boundary surface during the implosion to remain smaller than the fuel thickness, the inhomogeneity of the temperature and the density of the target should be less than ten percent. The amplitude of the boundary surface roughness is required to be less than 4 micrometers. (Kato, T.)

  6. The Analysis of Forming Forces in Single Point Incremental Forming

    Directory of Open Access Journals (Sweden)

    Koh Kyung Hee

    2016-01-01

    Full Text Available Incremental forming is a process for producing sheet metal parts quickly. Because it needs no dedicated dies or molds, the process saves both cost and time. The purpose of this study is to investigate forming forces in single point incremental forming. Forming forces were measured while producing an aluminium cone frustum, using a dynamometer to collect the forces for analysis. These forces are compared with the cutting forces that arise when machining parts of the same geometry. The forming forces in the Z direction are 40 times larger than the machining forces, so the spindle and axis of a forming machine should be designed to withstand the forming forces.

  7. Fixed-point error analysis of Winograd Fourier transform algorithms

    Science.gov (United States)

    Patterson, R. W.; Mcclellan, J. H.

    1978-01-01

    The quantization error introduced by the Winograd Fourier transform algorithm (WFTA) when implemented in fixed-point arithmetic is studied and compared with that of the fast Fourier transform (FFT). The effect of ordering the computational modules and the relative contributions of data quantization error and coefficient quantization error are determined. In addition, the quantization error introduced by the Good-Winograd (GW) algorithm, which uses Good's prime-factor decomposition for the discrete Fourier transform (DFT) together with Winograd's short length DFT algorithms, is studied. Error introduced by the WFTA is, in all cases, worse than that of the FFT. In general, the WFTA requires one or two more bits for data representation to give an error similar to that of the FFT. Error introduced by the GW algorithm is approximately the same as that of the FFT.
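    The data-quantization effect compared in the study can be illustrated by rounding the input samples to a given number of fractional bits and measuring the transform error against a double-precision reference. This sketch uses a naive O(N²) DFT rather than the WFTA or FFT, so it isolates the effect of data quantization only; it is a stand-in, not the paper's analysis:

```python
import cmath
import math

def dft(x):
    """Naive O(N^2) discrete Fourier transform, double precision (reference)."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def quantize(x, bits):
    """Round each sample to `bits` fractional bits (fixed-point data format)."""
    scale = 2 ** bits
    return [round(v * scale) / scale for v in x]

def quantization_snr_db(x, bits):
    """Signal-to-quantization-noise ratio (dB) of the DFT of quantized data."""
    ref = dft(x)
    quant = dft(quantize(x, bits))
    signal = sum(abs(r) ** 2 for r in ref)
    noise = sum(abs(r - q) ** 2 for r, q in zip(ref, quant))
    return math.inf if noise == 0 else 10 * math.log10(signal / noise)

# More fractional bits -> higher SNR; the paper's finding is that the WFTA
# needs one or two extra bits of data representation to match the FFT's error.
```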

  8. Flow pattern analysis for a well defined by point sinks

    NARCIS (Netherlands)

    Nieuwenhuizen, R.; Zijl, W.; Veldhuizen, M. van

    1995-01-01

    This paper deals with the analytical description of single-phase flow caused by abstraction wells and governed by Darcy's law. Since we are mainly interested in the velocity field upon which an analysis of transport phenomena can be based, we may assume that the flow is quasi steady. A well may be

  9. Analysis of the approach to the convection instability point

    NARCIS (Netherlands)

    Boon, J.P.; Lekkerkerker, H.N.W.

    1974-01-01

    A spectral analysis is presented of the fluctuations in a horizontal fluid layer subject to a downward directed temperature gradient which, at a critical value, drives the system into a convective instability state. It is found that the external force resulting from the combination of the

  10. Stability analysis of the carbuncle phenomenon and the sonic point ...

    Indian Academy of Sciences (India)

    Aayush Agrawal

    instability, we perform a thorough stability analysis and extend previous studies by analysing the pseudo-spectra and hence the ... viour in Riemann solvers and thus help in the design of better solvers for high-Reynolds-number flows. Keywords. ..... the domain and random perturbations being added to all flow variables.

  11. Measuring onion cultivars with image analysis using inflection points

    NARCIS (Netherlands)

    Heijden, van der G.W.A.M.; Vossepoel, A.M.; Polder, G.

    1996-01-01

    The suitability of image analysis was studied to measure bulb characteristics for varietal testing of onions (Allium cepa L.). Eighteen genotypes were used, which covered a whole range of onion shapes, including some quite identical ones. The characteristic height and diameter were measured both by

  12. Real Time Intelligent Target Detection and Analysis with Machine Vision

    Science.gov (United States)

    Howard, Ayanna; Padgett, Curtis; Brown, Kenneth

    2000-01-01

    We present an algorithm for detecting a specified set of targets for an Automatic Target Recognition (ATR) application. ATR involves processing images for detecting, classifying, and tracking targets embedded in a background scene. We address the problem of discriminating between targets and nontarget objects in a scene by evaluating 40x40 image blocks belonging to an image. Each image block is first projected onto a set of templates specifically designed to separate images of targets embedded in a typical background scene from those background images without targets. These filters are found using directed principal component analysis which maximally separates the two groups. The projected images are then clustered into one of n classes based on a minimum distance to a set of n cluster prototypes. These cluster prototypes have previously been identified using a modified clustering algorithm based on prior sensed data. Each projected image pattern is then fed into the associated cluster's trained neural network for classification. A detailed description of our algorithm will be given in this paper. We outline our methodology for designing the templates, describe our modified clustering algorithm, and provide details on the neural network classifiers. Evaluation of the overall algorithm demonstrates that our detection rates approach 96% with a false positive rate of less than 0.03%.
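    The minimum-distance cluster assignment step described above can be sketched as follows; the short feature vectors and prototypes here are toy stand-ins for the projected 40x40 image blocks and the n cluster prototypes learned from prior sensed data:

```python
import math

def nearest_prototype(features, prototypes):
    """Return the index of the cluster prototype closest to `features` in
    Euclidean distance -- the minimum-distance rule used to route each
    projected image block to its cluster's trained neural network."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(range(len(prototypes)), key=lambda i: dist(features, prototypes[i]))
```

    Any list of equal-length numeric tuples works as prototypes; in the full algorithm these come from the modified clustering step, and the returned index selects which per-cluster classifier to invoke.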

  13. Kick-Off Point (KOP) and End of Buildup (EOB) Data Analysis in Trajectory Design

    Directory of Open Access Journals (Sweden)

    Novrianti Novrianti

    2017-06-01

    Full Text Available Well X is a development well that is directionally drilled. Directional drilling was chosen because the target coordinates of Well X lie above a buffer zone. The directional track plan needs accurate survey calculations in order to produce the right track for directional drilling. There are many survey calculation methods in directional drilling, such as the tangential, balanced tangential, average angle, radius of curvature and mercury methods. The minimum curvature method is used in this directional track plan calculation because it gives less error than the other methods. Kick-Off Point (KOP) and End of Buildup (EOB) analysis is done at 200 ft, 400 ft and 600 ft depth to determine the trajectory design and optimal inclination, and potential hole problems are assessed for each design. The optimal trajectory design has its KOP at 200 ft, because the maximum inclination of 18.87° stays below 35° and the well reaches the target at 1632.28 ft TVD and 408.16 ft AHD. Hole problems would occur if the trajectory were designed with a KOP at 600 ft: stuck pipe, and casing or tubing unable to bend.
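    The minimum curvature method named above can be sketched with its standard formulas (dogleg angle plus ratio factor); the station values in the test below are illustrative, not survey data from Well X:

```python
import math

def minimum_curvature(md, inc1, azi1, inc2, azi2):
    """Position change between two survey stations by the minimum curvature
    method. `md` is the measured depth between stations; inclinations and
    azimuths are in degrees. Returns (delta_north, delta_east, delta_tvd)
    in the same length unit as `md`.
    """
    i1, a1, i2, a2 = map(math.radians, (inc1, azi1, inc2, azi2))
    # Dogleg angle between the two station direction vectors.
    cos_dl = math.cos(i2 - i1) - math.sin(i1) * math.sin(i2) * (1 - math.cos(a2 - a1))
    dl = math.acos(max(-1.0, min(1.0, cos_dl)))
    # Ratio factor for the circular-arc assumption (1 for a straight hole).
    rf = 1.0 if dl < 1e-12 else (2 / dl) * math.tan(dl / 2)
    dn = md / 2 * (math.sin(i1) * math.cos(a1) + math.sin(i2) * math.cos(a2)) * rf
    de = md / 2 * (math.sin(i1) * math.sin(a1) + math.sin(i2) * math.sin(a2)) * rf
    dtvd = md / 2 * (math.cos(i1) + math.cos(i2)) * rf
    return dn, de, dtvd
```

    For a vertical section (both inclinations zero) the ratio factor is 1 and all measured depth converts to TVD; in a buildup section the TVD gained is less than the measured depth drilled.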

  14. Editorial - Wikipedia popularity from a citation analysis point of view

    OpenAIRE

    Alireza Noruzi

    2009-01-01

    This study aims to provide an overview of the citation rate of Wikipedia since its launch in 2004. It is worth noting that since its inception Wikipedia, the free international multi-lingual encyclopedia, has been subject to criticism (Fasoldt, 2004; Orlowski, 2005; Lipczynska, 2005). Wikipedia as a popular web resource appears in response to every keyword search on Google. One way to test the popularity of a web resource is to use the citation analysis method to predict to what extent it is cite...

  15. Penetration analysis of projectile with inclined concrete target

    Directory of Open Access Journals (Sweden)

    Kim S.B.

    2015-01-01

    Full Text Available This paper presents numerical analysis results of projectile penetration into a concrete target. We applied dynamic material properties of 4340 steel, aluminium and explosive for the projectile body. Dynamic material properties were measured with a static tensile testing machine and Hopkinson pressure bar tests. Moreover, we used three concrete damage models included in LS-DYNA 3D: SOIL_CONCRETE, CSCM (cap model with smooth interaction) and CONCRETE_DAMAGE (K&C concrete). The strain rate effect for concrete material is important for predicting the fracture deformation and shape of the concrete and the penetration depth of projectiles. The CONCRETE_DAMAGE model with strain rate effect was also applied to the penetration analysis. The analysis result with the CSCM model shows good agreement with penetration experimental data. The projectile trace and fracture shapes of the concrete target were compared with experimental data.

  16. Penetration analysis of projectile with inclined concrete target

    Science.gov (United States)

    Kim, S. B.; Kim, H. W.; Yoo, Y. H.

    2015-09-01

    This paper presents numerical analysis results of projectile penetration into a concrete target. We applied dynamic material properties of 4340 steel, aluminium and explosive for the projectile body. Dynamic material properties were measured with a static tensile testing machine and Hopkinson pressure bar tests. Moreover, we used three concrete damage models included in LS-DYNA 3D: SOIL_CONCRETE, CSCM (cap model with smooth interaction) and CONCRETE_DAMAGE (K&C concrete). The strain rate effect for concrete material is important for predicting the fracture deformation and shape of the concrete and the penetration depth of projectiles. The CONCRETE_DAMAGE model with strain rate effect was also applied to the penetration analysis. The analysis result with the CSCM model shows good agreement with penetration experimental data. The projectile trace and fracture shapes of the concrete target were compared with experimental data.

  17. POINT CLOUD REFINEMENT WITH A TARGET-FREE INTRINSIC CALIBRATION OF A MOBILE MULTI-BEAM LIDAR SYSTEM

    Directory of Open Access Journals (Sweden)

    H. Nouiraa

    2016-06-01

    Full Text Available LIDAR sensors are widely used in mobile mapping systems. Mobile mapping platforms allow fast acquisition, in cities for example, which would take much longer with static mapping systems. LIDAR sensors provide reliable and precise 3D information, which can be used in various applications: mapping of the environment; localization of objects; detection of changes. Also, with recent developments, multi-beam LIDAR sensors have appeared and are able to provide a high amount of data with a high level of detail. A mono-beam LIDAR sensor mounted on a mobile platform requires an extrinsic calibration, so that the data acquired and registered in the sensor reference frame can be represented in the body reference frame modeling the mobile system. For a multi-beam LIDAR sensor, the calibration can be separated into two distinct parts: on one hand, an extrinsic calibration, in common with mono-beam LIDAR sensors, which gives the transformation between the sensor Cartesian reference frame and the body reference frame; on the other hand, an intrinsic calibration, which gives the relations between the beams of the multi-beam sensor. This calibration depends on a model given by the constructor, but the model can be non-optimal, which brings errors and noise into the acquired point clouds. In the literature, some optimizations of the calibration parameters have been proposed, but they need a specific routine or environment, which can be constraining and time-consuming. In this article, we present an automatic method for improving the intrinsic calibration of a multi-beam LIDAR sensor, the Velodyne HDL-32E. The proposed approach does not need any calibration target and only uses information from the acquired point clouds, which makes it simple and fast to use. Also, a corrected model for the Velodyne sensor is proposed. An energy function which penalizes points far from local planar surfaces is used to optimize the different

  18. Standard hazard analysis, critical control point and hotel management

    Directory of Open Access Journals (Sweden)

    Vujačić Vesna

    2017-01-01

    Full Text Available Tourism is a dynamic category which is continuously evolving in the world. The specificities that have to be respected in catering, as compared with the food industry, stem from its distinctive food-serving procedures, numerous complex recipes and production technologies, staff fluctuation and old equipment. For an effective and permanent implementation, the HACCP concept needs a solid base, which in this case is the people handling the food. This paper presents the international ISO standards, the concept of HACCP and the importance of its application in the tourism and hospitality industry. The HACCP concept is a food safety management system based on the analysis and control of biological, chemical and physical hazards throughout the entire process, from raw material production, procurement and handling to manufacturing, distribution and consumption of the finished product. The aim of this paper is to present the importance of applying the HACCP concept in tourism and hotel management as a recognizable international standard.

  19. Visibility Analysis in a Point Cloud Based on the Medial Axis Transform

    NARCIS (Netherlands)

    Peters, R.; Ledoux, H.; Biljecki, F.

    2015-01-01

    Visibility analysis is an important application of 3D GIS data. Current approaches require 3D city models that are often derived from detailed aerial point clouds. We present an approach to visibility analysis that does not require a city model but works directly on the point cloud. Our approach is

  20. Integrative analysis to select cancer candidate biomarkers to targeted validation

    Science.gov (United States)

    Heberle, Henry; Domingues, Romênia R.; Granato, Daniela C.; Yokoo, Sami; Canevarolo, Rafael R.; Winck, Flavia V.; Ribeiro, Ana Carolina P.; Brandão, Thaís Bianca; Filgueiras, Paulo R.; Cruz, Karen S. P.; Barbuto, José Alexandre; Poppi, Ronei J.; Minghim, Rosane; Telles, Guilherme P.; Fonseca, Felipe Paiva; Fox, Jay W.; Santos-Silva, Alan R.; Coletta, Ricardo D.; Sherman, Nicholas E.; Paes Leme, Adriana F.

    2015-01-01

    Targeted proteomics has flourished as the method of choice for prospecting for and validating potential candidate biomarkers in many diseases. However, challenges still remain due to the lack of standardized routines that can prioritize a limited number of proteins to be further validated in human samples. To help researchers identify candidate biomarkers that best characterize their samples under study, a well-designed integrative analysis pipeline, comprising MS-based discovery, feature selection methods, clustering techniques, bioinformatic analyses and targeted approaches was performed using discovery-based proteomic data from the secretomes of three classes of human cell lines (carcinoma, melanoma and non-cancerous). Three feature selection algorithms, namely, Beta-binomial, Nearest Shrunken Centroids (NSC), and Support Vector Machine-Recursive Features Elimination (SVM-RFE), indicated a panel of 137 candidate biomarkers for carcinoma and 271 for melanoma, which were differentially abundant between the tumor classes. We further tested the strength of the pipeline in selecting candidate biomarkers by immunoblotting, human tissue microarrays, label-free targeted MS and functional experiments. In conclusion, the proposed integrative analysis was able to pre-qualify and prioritize candidate biomarkers from discovery-based proteomics to targeted MS. PMID:26540631
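    Of the three feature-selection algorithms named, Nearest Shrunken Centroids is the simplest to illustrate: each class centroid is soft-thresholded toward the overall centroid, and only features where some class survives the shrinkage are kept as candidates. A simplified sketch of that selection step (the full NSC method also standardizes centroid differences by within-class scatter, which is omitted here):

```python
def shrunken_selected_features(class_means, overall_mean, delta):
    """Soft-threshold each class-vs-overall centroid difference by `delta`;
    keep the indices of features where at least one class centroid still
    differs from the overall mean after shrinkage."""
    def soft(d):
        # Soft thresholding: shrink |d| by delta, zeroing small differences.
        return (abs(d) - delta) * (1 if d > 0 else -1) if abs(d) > delta else 0.0
    selected = []
    for j in range(len(overall_mean)):
        if any(soft(m[j] - overall_mean[j]) != 0.0 for m in class_means):
            selected.append(j)
    return selected
```

    Raising `delta` shrinks more features to zero, which is how NSC trades panel size against discriminative power when prioritizing candidate biomarkers.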

  1. Economic gains from targeted measures related to non-point pollution in agriculture based on detailed nitrate reduction maps.

    Science.gov (United States)

    Jacobsen, Brian H; Hansen, Anne Lausten

    2016-06-15

    From 1990 to 2003, Denmark reduced N-leaching from the root zone by 50%. However, more measures are required, and in recent years the focus has been on how to differentiate measures in order to ensure that they are implemented where the effect on N-loss reductions per ha is the greatest. The purpose of the NiCA project has been to estimate the natural nitrate reduction in the groundwater more precisely than before, using a plot size down to 1 ha. This article builds on these findings and presents the possible economic gains for the farmer when using this information to reach a given N-loss level. Targeted measures are especially relevant where the subsurface N-reduction varies significantly within the same farm, and national analyses have shown that a cost reduction of around 20-25% using targeted measures is likely. The analyses show an increasing potential with increasing variation in N-reduction in the catchment. In this analysis, the knowledge of spatial variation in N-reduction potential is used to place measures like catch crops or set-aside at the locations with the greatest effect on 10 case farms in the Norsminde Catchment, Denmark. The findings suggest that the gains range from 0 to 32 €/ha and the average farm would gain approximately 14-21 €/ha/year from the targeted measures approach. The analysis indicates that the economic gain is greater than the cost of providing the detailed maps, 5-10 €/ha/year. When N-loss reduction requirements are increased, the economic gains are greater. When combined with new measures like mini-wetlands and early sowing, the economic advantage increases further. The paper also shows that not all farms can use the detailed information on N-reduction, and there is not a clear link between spatial variation in N-reduction at the farm level and possible economic gains for all of these 10 farms. Copyright © 2016 Elsevier B.V. All rights reserved.
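    The targeting logic, placing measures on fields where the subsurface removes the least nitrate so that each hectare of catch crop saves the most N at the stream, can be sketched as a greedy allocation. All field numbers and the assumed 30% leaching effect of a catch crop below are hypothetical, for illustration only; they are not values from the NiCA project:

```python
def allocate_catch_crops(fields, target_reduction_kg):
    """Greedy placement: fields where the least nitrate is removed in the
    subsurface deliver the largest N-load saving per ha of catch crop, so
    they are treated first.

    `fields` is a list of (name, area_ha, leaching_kg_per_ha, subsurface_reduction)
    where subsurface_reduction is the fraction of leached N removed underground.
    Returns a list of (name, treated_area_ha) allocations meeting the target.
    """
    CATCH_CROP_EFFECT = 0.30  # assumed cut in root-zone leaching, illustrative
    # Saving per ha at the stream = leaching * effect * fraction NOT removed.
    ranked = sorted(fields,
                    key=lambda f: f[2] * CATCH_CROP_EFFECT * (1 - f[3]),
                    reverse=True)
    allocations, remaining = [], target_reduction_kg
    for name, area, leach, reduction in ranked:
        if remaining <= 0:
            break
        per_ha = leach * CATCH_CROP_EFFECT * (1 - reduction)
        if per_ha <= 0:
            break
        use = min(area, remaining / per_ha)
        allocations.append((name, use))
        remaining -= use * per_ha
    return allocations
```

    The same greedy ordering explains why the gains grow with spatial variation: when all fields have similar subsurface N-reduction, targeted and uniform placement cost nearly the same.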

  2. TARGET - TASK ANALYSIS REPORT GENERATION TOOL, VERSION 1.0

    Science.gov (United States)

    Ortiz, C. J.

    1994-01-01

    The Task Analysis Report Generation Tool, TARGET, is a graphical interface tool used to capture procedural knowledge and translate that knowledge into a hierarchical report. TARGET is based on VISTA, a knowledge acquisition tool developed by the Naval Systems Training Center. TARGET assists a programmer and/or task expert organize and understand the steps involved in accomplishing a task. The user can label individual steps in the task through a dialogue-box and get immediate graphical feedback for analysis. TARGET users can decompose tasks into basic action kernels or minimal steps to provide a clear picture of all basic actions needed to accomplish a job. This method allows the user to go back and critically examine the overall flow and makeup of the process. The user can switch between graphics (box flow diagrams) and text (task hierarchy) versions to more easily study the process being documented. As the practice of decomposition continues, tasks and their subtasks can be continually modified to more accurately reflect the user's procedures and rationale. This program is designed to help a programmer document an expert's task thus allowing the programmer to build an expert system which can help others perform the task. Flexibility is a key element of the system design and of the knowledge acquisition session. If the expert is not able to find time to work on the knowledge acquisition process with the program developer, the developer and subject matter expert may work in iterative sessions. TARGET is easy to use and is tailored to accommodate users ranging from the novice to the experienced expert systems builder. TARGET is written in C-language for IBM PC series and compatible computers running MS-DOS and Microsoft Windows version 3.0 or 3.1. No source code is supplied. The executable also requires 2Mb of RAM, a Microsoft compatible mouse, a VGA display and an 80286, 386 or 486 processor machine. 
The standard distribution medium for TARGET is one 5.25 inch 360K

  3. Extended Fitts' model of pointing time in eye-gaze input system - Incorporating effects of target shape and movement direction into modeling.

    Science.gov (United States)

    Murata, Atsuo; Fukunaga, Daichi

    2018-04-01

    This study attempted to investigate the effects of the target shape and the movement direction on the pointing time using an eye-gaze input system and extend Fitts' model so that these factors are incorporated into the model and the predictive power of Fitts' model is enhanced. The target shape, the target size, the movement distance, and the direction of target presentation were set as within-subject experimental variables. The target shape included: a circle, and rectangles with an aspect ratio of 1:1, 1:2, 1:3, and 1:4. The movement direction included eight directions: upper, lower, left, right, upper left, upper right, lower left, and lower right. On the basis of the data for identifying the effects of the target shape and the movement direction on the pointing time, an attempt was made to develop a generalized and extended Fitts' model that took into account the movement direction and the target shape. As a result, the generalized and extended model was found to fit better to the experimental data, and be more effective for predicting the pointing time for a variety of human-computer interaction (HCI) tasks using an eye-gaze input system. Copyright © 2017. Published by Elsevier Ltd.
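    The baseline being extended here is Fitts' law in its Shannon formulation, MT = a + b·log2(D/W + 1); the study adds terms for target shape and movement direction on top of it. A sketch of the baseline only, with placeholder regression coefficients (the values of a and b must be fitted to observed pointing times and are not from the study):

```python
import math

def fitts_pointing_time(distance, width, a=0.2, b=0.15):
    """Predicted pointing time (s) from Fitts' law, Shannon formulation.

    `distance` is the movement amplitude and `width` the target size along
    the movement axis (same units); a and b are empirically fitted constants
    (placeholder values here, purely illustrative).
    """
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty
```

    In the extended model, a and b would additionally depend on the target's aspect ratio and the movement direction, which is what improves the fit reported above.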

  4. Bioinformatics analysis of Brucella vaccines and vaccine targets using VIOLIN.

    Science.gov (United States)

    He, Yongqun; Xiang, Zuoshuang

    2010-09-27

  5. Bioinformatics analysis of Brucella vaccines and vaccine targets using VIOLIN

    Science.gov (United States)

    2010-01-01

    Background: Brucella spp. are Gram-negative, facultative intracellular bacteria that cause brucellosis, one of the commonest zoonotic diseases found worldwide in humans and a variety of animal species. While several animal vaccines are available, there is no effective and safe vaccine for the prevention of brucellosis in humans. VIOLIN (http://www.violinet.org) is a web-based vaccine database and analysis system that curates, stores, and analyzes published data on commercialized vaccines and vaccines in clinical trials or in research. VIOLIN contains information for 454 vaccines or vaccine candidates for 73 pathogens. VIOLIN also contains many bioinformatics tools for vaccine data analysis, data integration, and vaccine target prediction. To demonstrate the applicability of VIOLIN for vaccine research, VIOLIN was used for bioinformatics analysis of existing Brucella vaccines and prediction of new Brucella vaccine targets. Results: VIOLIN contains many literature mining programs (e.g., Vaxmesh) that provide in-depth analysis of the Brucella vaccine literature. As a result of manual literature curation, VIOLIN contains information for 38 Brucella vaccines or vaccine candidates, 14 protective Brucella antigens, and 68 host response studies to Brucella vaccines from 97 peer-reviewed articles. These Brucella vaccines are classified in the Vaccine Ontology (VO) system and used for different ontological applications. The web-based VIOLIN vaccine target prediction program Vaxign was used to predict new Brucella vaccine targets. Vaxign identified 14 outer membrane proteins that are conserved in six virulent strains from B. abortus, B. melitensis, and B. suis that are pathogenic in humans. Of the 14 membrane proteins, two (Omp2b and Omp31-1) are not present in B. ovis, a Brucella species that is not pathogenic in humans. Brucella vaccine data stored in VIOLIN were compared and analyzed using the VIOLIN query system. Conclusions: Bioinformatics curation and ontological representation of Brucella vaccines

  6. Applications of time-frequency signature analysis to target identification

    Science.gov (United States)

    Gaunaurd, Guillermo C.; Strifors, Hans C.

    1999-03-01

    The overlapping subjects of target identification, inverse scattering and active classification have many applications that differ depending on the specific sensors. Many useful techniques for these subjects have been developed in the frequency and the time domains. A more recent approach views target signatures in the combined, or coupled, time-frequency domain. These joint time-frequency techniques are particularly advantageous for either ultra-wideband (UWB) projectors or UWB processing. Such analysis requires the use of some of the scores of nonlinear distributions that have been proposed and studied over the years. Basic ones, such as the Wigner distribution and its many relatives, have been shown to belong to the well-studied "Cohen class." We will select half a dozen of these distributions to study applications that we have addressed and solved in several areas, such as: (1) active sonar, (2) underwater mine classification using pulses from explosive sources, (3) identification of submerged shells having different fillers using dolphin bio-sonar "clicks," and (4) broadband radar pulses to identify aircraft, other targets covered with dielectric absorbing layers, and also (land) mine-like objects buried underground, using a ground penetrating radar. These examples illustrate how the informative identifying features required for accurate target identification are extracted and displayed in this general time-frequency domain.
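
    The Wigner distribution at the heart of these methods can be sketched numerically. Below is a minimal discrete pseudo-Wigner-Ville implementation (NumPy); the chirp test signal and all parameters are invented for illustration, and on this lag-FFT frequency axis the ridge appears at twice the instantaneous frequency (the usual factor 2 of the Wigner lag product).

```python
import numpy as np

def pseudo_wigner(x: np.ndarray) -> np.ndarray:
    """Discrete pseudo-Wigner-Ville distribution of a complex signal:
    for each time index n, FFT over the lag m of x[n+m] * conj(x[n-m])."""
    N = len(x)
    W = np.zeros((N, N))
    for n in range(N):
        L = min(n, N - 1 - n)              # largest usable symmetric lag
        kernel = np.zeros(N, dtype=complex)
        for m in range(-L, L + 1):
            kernel[m % N] = x[n + m] * np.conj(x[n - m])
        W[n] = np.fft.fft(kernel).real     # conjugate-symmetric in lag -> real
    return W

# Linear chirp: the ridge of the distribution should rise with time.
N = 128
t = np.arange(N)
x = np.exp(2j * np.pi * (0.05 * t + 0.25 * t**2 / N))
W = pseudo_wigner(x)
ridge = [int(np.argmax(W[n])) for n in (32, 64, 96)]  # rises roughly linearly
```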

  7. Identifying radiotherapy target volumes in brain cancer by image analysis.

    Science.gov (United States)

    Cheng, Kun; Montgomery, Dean; Feng, Yang; Steel, Robin; Liao, Hanqing; McLaren, Duncan B; Erridge, Sara C; McLaughlin, Stephen; Nailon, William H

    2015-10-01

    To establish the optimal radiotherapy fields for treating brain cancer patients, the tumour volume is often outlined on magnetic resonance (MR) images, where the tumour is clearly visible, and mapped onto the computerised tomography images used for radiotherapy planning. This process requires considerable clinical experience and is time consuming, and will become more so as more complex image sequences come into use. Here, the potential of image analysis techniques for automatically identifying the radiation target volume on MR images, and thereby assisting clinicians with this difficult task, was investigated. A gradient-based level set approach was applied to the MR images of five patients with grade II, III and IV malignant cerebral glioma. The contours produced by image analysis were compared with the contours produced by a radiation oncologist and used for treatment. In 93% of cases, the Dice similarity coefficient was found to be between 60 and 80%. This feasibility study demonstrates that image analysis has the potential for automatic outlining in the management of brain cancer patients; however, more testing and validation on a much larger patient cohort are required.
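
    The Dice similarity coefficient used here to compare automatic and clinician contours is straightforward to compute; a minimal sketch on toy binary masks (not the study's data):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient: DSC = 2|A intersect B| / (|A| + |B|)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # two empty masks: identical by convention
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy target volumes on a 10x10 slice: two overlapping 5x5 squares.
auto = np.zeros((10, 10), dtype=bool)
manual = np.zeros((10, 10), dtype=bool)
auto[2:7, 2:7] = True     # 25 voxels
manual[3:8, 3:8] = True   # 25 voxels, overlapping auto in 16 voxels
dsc = dice_coefficient(auto, manual)   # 2*16 / (25 + 25) = 0.64
```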

  8. SU-F-T-36: Dosimetric Comparison of Point Based Vs. Target Based Prescription for Intracavitary Brachytherapy in Cancer of the Cervix

    Energy Technology Data Exchange (ETDEWEB)

    Ashenafi, M; McDonald, D; Peng, J; Mart, C; Koch, N; Cooper, L; Vanek, K [Medical University of South Carolina, Charleston, SC (United States)

    2016-06-15

    Purpose: Improved patient imaging used for planning the treatment of cervical cancer with tandem and ovoid (T&O) intracavitary high-dose-rate (HDR) brachytherapy now allows for 3D delineation of target volumes and organs-at-risk. However, historical data rely on the conventional point A-based planning technique. A comparative dosimetric study was performed by generating both target-based (TBP) and point-based (PBP) plans for ten clinical patients. Methods: Treatment plans created using Elekta Oncentra v. 4.3 for ten consecutive cervical cancer patients were analyzed. All patients were treated with HDR brachytherapy using the Utrecht T&O applicator. Both CT and MRI were used to delineate the clinical target volume (CTV) and organs-at-risk (rectum, sigmoid, bladder, and small bowel). Point A (left and right), vaginal mucosa, and ICRU rectum and bladder points were defined on CT. Two plans were generated for each patient, one per prescription method (PBP and TBP). 7 Gy was prescribed to each point A for each PBP plan and to the target D90% for each TBP plan. Target V90%, V100%, and V200% were evaluated. In addition, D0.1cc and D2cc were analyzed for each organ-at-risk. Differences were assessed for statistical significance (p<0.05) by use of Student's t-test. Results: Target coverage was adequate and comparable for both planning methods. TBP showed lower absolute dose to the target volume than PBP (D90% = 7.0 Gy vs. 7.4 Gy, p=0.028; V200% = 10.9 cc vs. 12.8 cc, p=0.014; A-left = 6.4 Gy vs. 7.0 Gy, p=0.009; A-right = 6.4 Gy vs. 7.0 Gy, p=0.013). TBP also showed a statistically significant reduction in bladder, rectum, small bowel, and sigmoid doses compared to PBP. There was no statistically significant difference in vaginal mucosa or ICRU-defined rectum and bladder dose. Conclusion: Target-based prescription resulted in substantially lower dose to delineated organs-at-risk compared to point-based prescription, while
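
    Paired per-patient dosimetric comparisons like the D90% difference above are commonly assessed with a paired t-test; the sketch below uses invented doses and a hard-coded critical value, not the study's data:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(a, b):
    """Paired t statistic for two matched samples of equal length."""
    d = [x - y for x, y in zip(a, b)]
    return mean(d) / (stdev(d) / sqrt(len(d)))

# Invented per-patient D90% doses (Gy) for the two prescription methods.
pbp = [7.4, 7.5, 7.3, 7.6, 7.2, 7.5, 7.4, 7.3, 7.6, 7.4]
tbp = [7.0, 7.1, 6.9, 7.0, 6.9, 7.1, 7.0, 7.0, 7.1, 7.0]

t = paired_t(pbp, tbp)
T_CRIT = 2.262   # two-sided 5% critical value of the t-distribution, df = 9
significant = abs(t) > T_CRIT   # True here: the made-up difference is large
```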

  9. Thermodynamic analysis and experimental study of the effect of atmospheric pressure on the ice point

    International Nuclear Information System (INIS)

    Harvey, A. H.; McLinden, M. O.; Tew, W. L.

    2013-01-01

    We present a detailed thermodynamic analysis of the temperature of the ice point as a function of atmospheric pressure. This analysis makes use of accurate international standards for the properties of water and ice, and of available high-accuracy data for the Henry's constants of atmospheric gases in liquid water. The result is an ice point of 273.150 019(5) K at standard atmospheric pressure, with higher ice-point temperatures (varying nearly linearly with pressure) at lower pressures. The effect of varying ambient CO2 concentration is analyzed and found to be significant in comparison to other uncertainties in the model. The thermodynamic analysis is compared with experimental measurements of the temperature difference between the ice point and the triple point of water performed at elevations ranging from 145 m to 4302 m, with atmospheric pressures from 101 kPa to 60 kPa.

  10. A spatial point pattern analysis in Drosophila blastoderm embryos evaluating the potential inheritance of transcriptional states.

    Directory of Open Access Journals (Sweden)

    Feng He

    The Drosophila blastoderm embryo undergoes rapid cycles of nuclear division. This poses a challenge to genes that need to reliably sense the concentrations of morphogen molecules to form the desired expression patterns. Here we investigate whether the transcriptional state of hunchback (hb), a target gene directly activated by the morphogenetic protein Bicoid (Bcd), exhibits properties indicative of inheritance between mitotic cycles. To achieve this, we build a dataset of hb transcriptional states at the resolution of individual nuclei in embryos at early cycle 14. We perform a spatial point pattern (SPP) analysis to evaluate the spatial relationships among the nuclei that have distinct numbers of hb gene copies undergoing active transcription in snapshots of embryos. Our statistical tests and simulation studies reveal properties of dispersed clustering for nuclei with both or neither copies of hb undergoing active transcription. Modeling of nuclear lineages from cycle 11 to cycle 14 suggests that these two types of nuclei can achieve spatial clustering when, and only when, the transcriptional states are allowed to propagate between mitotic cycles. Our results are consistent with the possibility that the positional information encoded by the Bcd morphogen gradient may not need to be decoded de novo at all mitotic cycles in the Drosophila blastoderm embryo.
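
    The abstract does not spell out its exact statistics, but a common way to test for spatial clustering of labelled nuclei is a Monte Carlo permutation test on nearest-neighbour distances; a sketch with simulated nuclei (all coordinates and labels below are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_nn_distance(points):
    """Mean distance from each point to its nearest neighbour."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1).mean()

def clustering_pvalue(coords, labels, n_perm=999):
    """Are labelled nuclei closer together than expected if the label
    were assigned at random?  Small p -> spatial clustering."""
    observed = mean_nn_distance(coords[labels])
    null = np.array([mean_nn_distance(coords[rng.permutation(labels)])
                     for _ in range(n_perm)])
    return (1 + (null <= observed).sum()) / (n_perm + 1)

# Toy embryo snapshot: 200 nuclei, with the 30 "active" ones packed
# into one corner, so clustering should be detected.
coords = rng.uniform(0.0, 1.0, size=(200, 2))
labels = np.zeros(200, dtype=bool)
labels[np.argsort(coords.sum(axis=1))[:30]] = True
p = clustering_pvalue(coords, labels)   # expected well below 0.05
```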

  11. Seafood safety: economics of hazard analysis and Critical Control Point (HACCP) programmes

    National Research Council Canada - National Science Library

    Cato, James C

    1998-01-01

    .... This document on economic issues associated with seafood safety was prepared to complement the work of the Service in seafood technology, plant sanitation and Hazard Analysis Critical Control Point (HACCP) implementation...

  12. Study of the Microfocus X-Ray Tube Based on a Point-Like Target Used for Micro-Computed Tomography.

    Science.gov (United States)

    Zhou, Rifeng; Zhou, Xiaojian; Li, Xiaobin; Cai, Yufang; Liu, Fenglin

    2016-01-01

    For a micro-computed tomography (micro-CT) system, the microfocus X-ray tube is an essential component because the spatial resolution of CT images is, in theory, mainly determined by the size and stability of the X-ray focal spot of the tube. However, many factors, including voltage fluctuations, mechanical vibrations, and temperature changes, can degrade the size and stability of the X-ray focal spot. A new microfocus X-ray tube based on a point-like micro-target, in which the X-ray target is irradiated with an unfocused electron beam, was investigated. The EGS4 Monte Carlo simulation code was employed to calculate the X-ray intensity produced from the point-like micro-target and from the substrate. The effects of several arrangements of target material, target size, and beam size were studied. The simulation results demonstrated that if the intensity of X-rays generated at the point-like target is greater than half of the X-ray intensity produced on the substrate, the X-ray focal spot is determined in part by the point-like target rather than by the electron beam, as it is in a conventional X-ray tube. In theory, since this design reduces unfavorable effects such as electron-beam trajectory swing and beam-size change in the microfocus X-ray tube, it could alleviate CT image artifacts caused by X-ray focal spot shift and size change.

  13. Molecular analysis of point mutations in a barley genome exposed to MNU and gamma rays

    Energy Technology Data Exchange (ETDEWEB)

    Kurowska, Marzena, E-mail: mkurowsk@us.edu.pl [Department of Genetics, Faculty of Biology and Environmental Protection, University of Silesia, Jagiellonska 28, 40-032 Katowice (Poland); Labocha-Pawlowska, Anna; Gnizda, Dominika; Maluszynski, Miroslaw; Szarejko, Iwona [Department of Genetics, Faculty of Biology and Environmental Protection, University of Silesia, Jagiellonska 28, 40-032 Katowice (Poland)

    2012-10-15

    We present studies aimed at determining the types and frequencies of mutations induced in the barley genome after treatment with a chemical (N-methyl-N-nitrosourea, MNU) and a physical (gamma rays) mutagen. We created M2 populations of a doubled haploid line and used them for the analysis of mutations in targeted DNA sequences and over the entire barley genome using the TILLING (Targeting Induced Local Lesions in Genomes) and AFLP (Amplified Fragment Length Polymorphism) techniques, respectively. Based on the TILLING analysis of a total DNA sequence of 4,537,117 bp in the MNU population, the average mutation density was estimated as 1/504 kb. Only one nucleotide change was found after an analysis of 3,207,444 bp derived from the highest dose of gamma rays applied. MNU was clearly a more efficient mutagen than gamma rays in inducing point mutations in barley. The majority (63.6%) of the MNU-induced nucleotide changes were transitions, with a similar number of G>A and C>T substitutions. The similar share of G>A and C>T transitions indicates a lack of bias in the repair of O6-methylguanine lesions between DNA strands. There was, however, a strong specificity of the nucleotide surrounding the O6-meG at the -1 position: purines formed 81% of the nucleotides observed at the -1 site. Scanning the barley genome with AFLP markers revealed an approximately three times higher level of AFLP polymorphism in the MNU-treated than in the gamma-irradiated population. In order to check whether AFLP markers can really scan the whole barley genome for mutagen-induced polymorphism, 114 different AFLP products were cloned and sequenced; 94% of the bands were heterogeneous, with some bands containing up to 8 different amplicons. The polymorphic AFLP products were characterised in terms of their similarity to records deposited in the GenBank database. The types of sequences present in the polymorphic bands reflected the organisation of the barley genome.
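
    The reported mutation density follows directly from the screened sequence length. In the sketch below, the mutation count (9) is a back-calculated assumption consistent with the stated 1/504 kb, not a figure taken from the paper:

```python
# Average mutation density as defined in TILLING studies:
# one mutation per (screened bp / mutations found).
screened_bp = 4_537_117   # total sequence screened in the MNU population
mutations_found = 9       # assumed count, back-calculated from 1/504 kb

density_kb = screened_bp / mutations_found / 1000   # kb per mutation, ~504
```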

  14. Stakeholder analysis and mapping as targeted communication strategy.

    Science.gov (United States)

    Shirey, Maria R

    2012-09-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author highlights the importance of stakeholder theory and discusses how to apply the theory to conduct a stakeholder analysis. This article also provides an explanation of how to use related stakeholder mapping techniques with targeted communication strategies.

  15. [A new calibration transfer method based on target factor analysis].

    Science.gov (United States)

    Wang, Yan-bin; Yuan, Hong-fu; Lu, Wan-zhen

    2005-03-01

    A new calibration transfer method based on target factor analysis is proposed, and its performance is compared with that of the piecewise direct standardization (PDS) method. The method was applied to two data sets: a simulation data set, and an NIR data set composed of benzene, toluene, xylene and isooctane. The results obtained with the new method are at least as good as those obtained by PDS, with the biggest improvement occurring when the spectra have some non-linear responses.

  16. Ideal MHD stability analysis of KSTAR target AT mode

    International Nuclear Information System (INIS)

    Yi, S.M.; Kim, J.H.; You, K.I.; Kim, J.Y.

    2009-01-01

    A main research objective of the KSTAR (Korea Superconducting Tokamak Advanced Research) device is to demonstrate the steady-state operation capability of a high-performance AT (advanced tokamak) mode. To meet this goal, it is critical for KSTAR to have a good MHD stability boundary, particularly against high-beta ideal instabilities such as the external kink and ballooning modes. To support this MHD stability, KSTAR has been designed to have a strong plasma shape and a close interval between the plasma and the passive-plate wall. During the conceptual design phase of KSTAR, a preliminary study estimated the high-beta MHD stability limit of the KSTAR target AT mode using the PEST and VACUUM codes, and it was shown that the target AT mode can be stable up to βN ∼ 5 with well-defined plasma pressure and current profiles. Recently, a new calculation has been performed to estimate the ideal stability limit in various KSTAR operating conditions using the DCON code, and some difference has been observed between the new and old calculation results, particularly in the dependence of the maximum βN value on the toroidal mode number. Here, we thus present a more detailed analysis of the ideal MHD stability limit of the KSTAR target AT mode, comparing calculation results among three codes: GATO as well as PEST and DCON. (author)

  17. Towards semi-automatic rock mass discontinuity orientation and set analysis from 3D point clouds

    Science.gov (United States)

    Guo, Jiateng; Liu, Shanjun; Zhang, Peina; Wu, Lixin; Zhou, Wenhui; Yu, Yinan

    2017-06-01

    Obtaining accurate information on rock mass discontinuities is important for deformation analysis and the evaluation of rock mass stability, but obtaining measurements for high and steep zones with the traditional compass method is difficult. Photogrammetry, three-dimensional (3D) laser scanning and other remote sensing methods have therefore gradually become mainstream. In this study, a method based on a 3D point cloud is proposed to semi-automatically extract rock mass structural plane information. The original data are pre-treated prior to segmentation by removing outlier points. The point cloud is then segmented into point subsets. Various parameters, such as the normal, dip direction and dip, can be calculated for each point subset after obtaining the equation of its best-fit plane. A cluster analysis, in which point subsets satisfying given conditions form a cluster, is performed on the normal vectors by combining the firefly algorithm (FA) with the fuzzy c-means (FCM) algorithm. Finally, clusters that belong to the same discontinuity set are merged and coloured for visualization purposes. A prototype system was developed based on this method to extract rock discontinuity points from a 3D point cloud. A comparison with existing software shows that the method is feasible. It can provide a reference for rock mechanics, 3D geological modelling and other related fields.
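
    The per-subset plane fitting and dip/dip-direction computation described above can be sketched as follows (SVD-based best-fit plane; the synthetic discontinuity and the axis convention x = east, y = north, z = up are assumptions made here, not taken from the paper):

```python
import numpy as np

def fit_plane_dip(points):
    """Best-fit plane through a point subset via SVD; returns
    (dip direction, dip) in degrees, assuming x = east, y = north, z = up."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]                     # normal = direction of least variance
    if n[2] < 0:
        n = -n                     # force the normal to point upward
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))
    dip_direction = np.degrees(np.arctan2(n[0], n[1])) % 360.0
    return dip_direction, dip

# Synthetic discontinuity: a plane dipping 30 degrees due east (090).
rng = np.random.default_rng(1)
xy = rng.uniform(-1.0, 1.0, size=(100, 2))
z = -np.tan(np.radians(30.0)) * xy[:, 0]   # surface falls toward +x (east)
pts = np.column_stack([xy, z])
dip_dir, dip = fit_plane_dip(pts)          # recovers (90.0, 30.0)
```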

  18. Sedentary Behaviour Profiling of Office Workers: A Sensitivity Analysis of Sedentary Cut-Points.

    Science.gov (United States)

    Boerema, Simone T; Essink, Gerard B; Tönis, Thijs M; van Velsen, Lex; Hermens, Hermie J

    2015-12-25

    Measuring sedentary behaviour and physical activity with wearable sensors provides detailed information on activity patterns and can serve health interventions. At the basis of activity analysis stands the ability to distinguish sedentary from active time. As there is no consensus regarding the optimal cut-point for classifying sedentary behaviour, we studied the consequences of using different cut-points for this type of analysis. We conducted a battery of sitting and walking activities with 14 office workers, wearing the Promove 3D activity sensor to determine the optimal cut-point (in counts per minute (m·s⁻²)) for classifying sedentary behaviour. Then, 27 office workers wore the sensor for five days. We evaluated the sensitivity of five sedentary pattern measures for various sedentary cut-points and found an optimal cut-point for sedentary behaviour of 1660 × 10⁻³ m·s⁻². Total sedentary time was not sensitive to cut-point changes within ±10% of this optimal cut-point; other sedentary pattern measures were not sensitive to changes within the ±20% interval. The results from studies analyzing sedentary patterns, using different cut-points, can be compared within these boundaries. Furthermore, commercial, hip-worn activity trackers can implement feedback and interventions on sedentary behaviour patterns, using these cut-points.
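
    The sensitivity check around the optimal cut-point can be illustrated with a toy activity trace (all signal values below are invented; only the 1660 × 10⁻³ m·s⁻² cut-point comes from the abstract):

```python
import numpy as np

OPTIMAL_CUT = 1660e-3   # m*s^-2, the optimal cut-point from the study

def total_sedentary_minutes(activity, cut_point):
    """Minutes whose activity level falls below the cut-point."""
    return int((np.asarray(activity) < cut_point).sum())

# Toy day of per-minute activity levels: 300 clearly sedentary minutes
# followed by 150 clearly active ones.
rng = np.random.default_rng(2)
day = np.concatenate([rng.uniform(0.0, 1.0, 300),    # sitting
                      rng.uniform(3.0, 6.0, 150)])   # walking

base = total_sedentary_minutes(day, OPTIMAL_CUT)
low = total_sedentary_minutes(day, OPTIMAL_CUT * 0.9)    # cut-point -10%
high = total_sedentary_minutes(day, OPTIMAL_CUT * 1.1)   # cut-point +10%
# base == low == high: total sedentary time is insensitive to a +/-10%
# shift when the signal stays well clear of the cut-point.
```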

  19. Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis

    Science.gov (United States)

    Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.

    2017-01-01

    This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
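
    PRECiSA's semantics is far more sophisticated, but the flavour of forward round-off error propagation can be sketched with first-order bounds: each operation adds one rounding of at most u·|result| (inputs assumed exactly representable; overflow/underflow ignored). This sketch is not PRECiSA's algorithm:

```python
import sys

u = sys.float_info.epsilon / 2   # unit round-off of IEEE-754 double, 2**-53

def add_bound(x, ex, y, ey):
    """z = x + y with an absolute error bound: incoming errors plus one
    new rounding of at most u*|z| (first order)."""
    z = x + y
    return z, ex + ey + u * abs(z)

def mul_bound(x, ex, y, ey):
    """z = x * y with a first-order absolute error bound."""
    z = x * y
    return z, abs(y) * ex + abs(x) * ey + u * abs(z)

# Bound the round-off error of (a + b) * c, inputs taken as exact.
a, b, c = 0.1, 0.2, 3.0
s, es = add_bound(a, 0.0, b, 0.0)
p, ep = mul_bound(s, es, c, 0.0)   # ep ends up a few multiples of u
```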

  20. Adaptive detection method of infrared small target based on target-background separation via robust principal component analysis

    Science.gov (United States)

    Wang, Chuanyun; Qin, Shiyin

    2015-03-01

    Motivated by robust principal component analysis, an infrared small-target image is regarded as a low-rank background matrix corrupted by sparse target and noise matrices, and a new target-background separation model is designed accordingly; on this basis, an adaptive detection method for infrared small targets is presented. Firstly, multi-scale transform and patch transform are used to generate an image patch set for infrared small target detection; secondly, target-background separation of each patch is achieved by recovering the low-rank and sparse matrices using an adaptive weighting parameter; thirdly, image reconstruction and fusion are carried out to obtain the entire separated background and target images; finally, infrared small target detection is realized by threshold segmentation of a template matching similarity measurement. To validate the performance of the proposed method, three experiments (target-background separation, background clutter suppression and infrared small target detection) were performed over different clutter backgrounds with real infrared small targets in single-frame and sequence images. The experimental results demonstrate that the proposed method not only suppresses background clutter effectively, even with strong noise interference, but also detects targets accurately with a low false alarm rate.
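
    The low-rank-plus-sparse idea can be illustrated with a crude alternating scheme: a truncated-SVD fit for the background and hard-thresholding for the target. This is a toy sketch with invented parameters, not the solver or the adaptive weighting used in the paper:

```python
import numpy as np

def separate(D, rank=1, thresh=1.0, n_iter=20):
    """Toy target-background separation: alternately fit a low-rank
    background L to D - S (truncated SVD) and keep only large residuals
    as the sparse target S (hard-thresholding)."""
    S = np.zeros_like(D)
    for _ in range(n_iter):
        U, sig, Vt = np.linalg.svd(D - S, full_matrices=False)
        L = (U[:, :rank] * sig[:rank]) @ Vt[:rank]   # low-rank background
        R = D - L
        S = np.where(np.abs(R) > thresh, R, 0.0)     # sparse target
    return L, S

# Smooth rank-1 "sky" background with one small bright target.
background = np.outer(np.linspace(1.0, 2.0, 20), np.linspace(1.0, 2.0, 20))
D = background.copy()
D[10, 10] += 5.0       # the small target
L, S = separate(D)     # S recovers the single bright pixel
```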

  1. Simultaneous colour visualizations of multiple ALS point cloud attributes for land cover and vegetation analysis

    Science.gov (United States)

    Zlinszky, András; Schroiff, Anke; Otepka, Johannes; Mandlburger, Gottfried; Pfeifer, Norbert

    2014-05-01

    LIDAR point clouds hold valuable information for land cover and vegetation analysis, not only in the spatial distribution of the points but also in their various attributes. However, LIDAR point clouds are rarely used for visual interpretation, since for most users, the point cloud is difficult to interpret compared to passive optical imagery. Meanwhile, point cloud viewing software is available allowing interactive 3D interpretation, but typically only one attribute at a time. This results in a large number of points with the same colour, crowding the scene and often obscuring detail. We developed a scheme for mapping information from multiple LIDAR point attributes to the Red, Green, and Blue channels of a widely used LIDAR data format, which are otherwise mostly used to add information from imagery to create "photorealistic" point clouds. The possible combinations of parameters are therefore represented in a wide range of colours, but relative differences in individual parameter values of points can be well understood. The visualization was implemented in OPALS software, using a simple and robust batch script, and is viewer independent since the information is stored in the point cloud data file itself. In our case, the following colour channel assignment delivered best results: Echo amplitude in the Red, echo width in the Green and normalized height above a Digital Terrain Model in the Blue channel. With correct parameter scaling (but completely without point classification), points belonging to asphalt and bare soil are dark red, low grassland and crop vegetation are bright red to yellow, shrubs and low trees are green and high trees are blue. Depending on roof material and DTM quality, buildings are shown from red through purple to dark blue. Erroneously high or low points, or points with incorrect amplitude or echo width usually have colours contrasting from terrain or vegetation. This allows efficient visual interpretation of the point cloud in planar
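
    The attribute-to-colour mapping described above can be sketched as a per-channel linear scaling to bytes; the attribute values and scaling ranges below are invented, and in practice the ranges are tuned per data set:

```python
import numpy as np

def scale_to_byte(values, lo, hi):
    """Linearly map an attribute to 0..255, clamping outside [lo, hi]."""
    v = np.clip((np.asarray(values, dtype=float) - lo) / (hi - lo), 0.0, 1.0)
    return np.round(v * 255).astype(np.uint8)

# Three toy points with the three attributes used in the scheme above.
amplitude = np.array([12.0, 80.0, 150.0])    # -> Red channel
echo_width = np.array([4.0, 5.5, 9.0])       # -> Green channel
height = np.array([0.1, 1.2, 18.0])          # -> Blue: height above DTM (m)

rgb = np.column_stack([
    scale_to_byte(amplitude, 0.0, 160.0),
    scale_to_byte(echo_width, 4.0, 10.0),
    scale_to_byte(height, 0.0, 20.0),
])   # one RGB triple per point, ready to store in the point cloud file
```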

  2. Global analysis of small molecule binding to related protein targets.

    Directory of Open Access Journals (Sweden)

    Felix A Kruger

    2012-01-01

    We report on the integration of pharmacological data and homology information for a large-scale analysis of small molecule binding to related targets. Differences in small molecule binding have been assessed for curated pairs of human to rat orthologs and also for recently diverged human paralogs. Our analysis shows that, in general, small molecule binding is conserved for pairs of human to rat orthologs. Using statistical tests, we identified a small number of cases where small molecule binding differs between human and rat, some of which had previously been reported in the literature. Knowledge of species-specific pharmacology can be advantageous for drug discovery, where rats are frequently used as a model system. For human paralogs, we demonstrate a global correlation between sequence identity and the binding of small molecules with equivalent affinity. Our findings provide an initial general model relating small molecule binding and sequence divergence, containing the foundations for a general model to anticipate and predict within-target-family selectivity.

  3. Potential Vaccine Targets against Rabbit Coccidiosis by Immunoproteomic Analysis.

    Science.gov (United States)

    Song, Hongyan; Dong, Ronglian; Qiu, Baofeng; Jing, Jin; Zhu, Shunxing; Liu, Chun; Jiang, Yingmei; Wu, Liucheng; Wang, Shengcun; Miao, Jin; Shao, Yixiang

    2017-02-01

    The aim of this study was to identify antigens as vaccine or drug targets to control rabbit coccidiosis. A combination of 2-dimensional electrophoresis, immunoblotting, and mass spectrometric analysis was used to identify novel antigens from the sporozoites of Eimeria stiedae. Protein spots were recognized by the sera of New Zealand rabbits infected artificially with E. stiedae. The proteins were characterized by matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF/TOF-MS) in combination with bioinformatics. Approximately 868 protein spots were detected by silver staining, and a total of 41 immunoreactive protein spots were recognized by anti-E. stiedae sera. Finally, 23 protein spots were successfully identified. Proteins such as heat shock protein 70 and aspartyl protease may have potential as immunodiagnostic or vaccine antigens. The immunoreactive proteins were found to possess a wide range of biological functions. This study is the first to report the proteins recognized by sera of rabbits infected with E. stiedae, which might be helpful in identifying potential targets for vaccine development to control rabbit coccidiosis.

  4. Developing Multidimensional Likert Scales Using Item Factor Analysis: The Case of Four-Point Items

    Science.gov (United States)

    Asún, Rodrigo A.; Rdz-Navarro, Karina; Alvarado, Jesús M.

    2016-01-01

    This study compares the performance of two approaches in analysing four-point Likert rating scales with a factorial model: the classical factor analysis (FA) and the item factor analysis (IFA). For FA, maximum likelihood and weighted least squares estimations using Pearson correlation matrices among items are compared. For IFA, diagonally weighted…

  5. Large deflection analysis of cantilever beam under end point and distributed load

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Tolou, N; Barari, Amin

    2014-01-01

    distributed loads. A direct nonlinear solution by use of the homotopy analysis method was implemented to derive the semi-exact solution for the trajectory position of any point along the beam length. For the purpose of comparison, the deflections were calculated and compared to those of the finite element method, which … requires numerical solution of simultaneous equations, a significant drawback for optimization or reliability analysis. This paper is motivated to overcome these shortcomings by presenting an analytical solution for the large deflection analysis of a cantilever beam under free end point and uniform

  6. Targeted DNA methylation analysis by next-generation sequencing.

    Science.gov (United States)

    Masser, Dustin R; Stanford, David R; Freeman, Willard M

    2015-02-24

    The role of epigenetic processes in the control of gene expression has been known for a number of years. DNA methylation at cytosine residues is of particular interest for epigenetic studies as it has been demonstrated to be both a long lasting and a dynamic regulator of gene expression. Efforts to examine epigenetic changes in health and disease have been hindered by the lack of high-throughput, quantitatively accurate methods. With the advent and popularization of next-generation sequencing (NGS) technologies, these tools are now being applied to epigenomics in addition to existing genomic and transcriptomic methodologies. For epigenetic investigations of cytosine methylation where regions of interest, such as specific gene promoters or CpG islands, have been identified and there is a need to examine significant numbers of samples with high quantitative accuracy, we have developed a method called Bisulfite Amplicon Sequencing (BSAS). This method combines bisulfite conversion with targeted amplification of regions of interest, transposome-mediated library construction and benchtop NGS. BSAS offers a rapid and efficient method for analysis of up to 10 kb of targeted regions in up to 96 samples at a time that can be performed by most research groups with basic molecular biology skills. The results provide absolute quantitation of cytosine methylation with base specificity. BSAS can be applied to any genomic region from any DNA source. This method is useful for hypothesis testing studies of target regions of interest as well as confirmation of regions identified in genome-wide methylation analyses such as whole genome bisulfite sequencing, reduced representation bisulfite sequencing, and methylated DNA immunoprecipitation sequencing.
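
    Downstream of sequencing, the per-cytosine quantitation that BSAS reports reduces to a simple ratio of unconverted (methylated) to total reads at each site; a sketch with invented pileup counts:

```python
def methylation_percent(methylated_reads, unmethylated_reads):
    """Per-cytosine methylation level from bisulfite sequencing counts:
    unconverted (methylated) reads over total coverage, in percent."""
    total = methylated_reads + unmethylated_reads
    if total == 0:
        raise ValueError("no coverage at this position")
    return 100.0 * methylated_reads / total

# Invented pileup counts (methylated, unmethylated) for three CpG sites.
sites = [("chr1:1000", 480, 20),
         ("chr1:1012", 250, 250),
         ("chr1:1030", 30, 970)]
levels = {pos: methylation_percent(m, u) for pos, m, u in sites}
# -> {"chr1:1000": 96.0, "chr1:1012": 50.0, "chr1:1030": 3.0}
```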

  7. Challenges in thermal and hydraulic analysis of ADS target systems

    International Nuclear Information System (INIS)

    Groetzbach, G.; Batta, A.; Lefhalm, C.-H.; Otic, I.

    2004-01-01

    The liquid-metal-cooled spallation targets of Accelerator Driven Systems are subject to high thermal loads; in addition, some flow and cooling conditions are of a prototypical character, while the operating margins for the materials involved are narrow; thus, target development requires very careful analysis by experimental and numerical means. The cooling of the steel window, which is heated by the proton beam, needs special care. Some of the main goals of the experimental and numerical analyses of the thermal hydraulics of these systems are discussed. The prediction of locally detached flows and of flows with larger recirculation areas suffers from insufficient turbulence modeling; this has to be compensated by using prototypical model experiments, e.g. with water, to select adequate models and numerical schemes. The well-known problems with the Reynolds analogy in predicting heat transfer in liquid metals always require prototypic liquid-metal experiments to select and adapt turbulent heat flux models. The uncertainties in liquid-metal experiments cannot be neglected, so it is necessary to perform CFD calculations and experiments hand in hand and to develop improved turbulent heat flux models. One contribution to an improved 3- or 4-equation model is deduced from recent Direct Numerical Simulation (DNS) data. (author)
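
    The failure of the Reynolds analogy mentioned here stems from the very low Prandtl numbers of liquid metals, which is why their heat-transfer correlations are written in terms of the Péclet number rather than carried over from ordinary fluids. As a hedged illustration (using the Skupinski correlation for liquid-metal pipe flow with uniform heat flux, which is a standard correlation and not one taken from this paper), the contrast with a Reynolds-analogy-style correlation is stark:

```python
def nusselt_skupinski(re, pr):
    """Skupinski correlation for fully developed liquid-metal pipe flow
    with uniform heat flux: Nu = 4.82 + 0.0185 * Pe**0.827, Pe = Re * Pr.
    Valid roughly for 1e2 < Pe < 1e4."""
    pe = re * pr
    return 4.82 + 0.0185 * pe ** 0.827

def nusselt_dittus_boelter(re, pr):
    """Dittus-Boelter correlation (Nu = 0.023 Re^0.8 Pr^0.4), derived for
    ordinary fluids with Pr near 1; shown only to illustrate how badly a
    Reynolds-analogy-based correlation extrapolates to liquid metals."""
    return 0.023 * re ** 0.8 * pr ** 0.4

# Illustrative heavy-liquid-metal conditions: Re = 5e5, Pr = 0.02 (Pe = 1e4)
print(nusselt_skupinski(5e5, 0.02))       # roughly 42
print(nusselt_dittus_boelter(5e5, 0.02))  # several times larger: unusable here
```

    The order-of-magnitude disagreement at low Prandtl number is exactly why the abstract insists on dedicated liquid-metal experiments to calibrate turbulent heat flux models.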

  8. SeedVicious: Analysis of microRNA target and near-target sites.

    Science.gov (United States)

    Marco, Antonio

    2018-01-01

    Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and also detect near-target sites, which have one nucleotide different from a canonical site. Near-target sites are important to study population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional sites, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program in a dedicated web-server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web-server and database (SeedBank) are distributed under the GPL/GNU license.
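
    The canonical site that seed-based predictors look for is defined by complementarity to the miRNA seed region (nucleotides 2-8 for a 7mer site), and a near-target site differs from it at exactly one position. A minimal sketch of that matching logic (the scanning function is illustrative and is not seedVicious code):

```python
def revcomp(seq):
    """Reverse complement of an RNA sequence."""
    comp = {"A": "U", "U": "A", "G": "C", "C": "G"}
    return "".join(comp[b] for b in reversed(seq))

def seed_sites(mirna, target, near=False):
    """Scan a target RNA for canonical seed matches: the reverse
    complement of miRNA nucleotides 2-8 (a 7mer site).  With near=True,
    windows differing from the site at exactly one position are also
    reported, mirroring the near-target definition above."""
    site = revcomp(mirna[1:8])
    hits = []
    for i in range(len(target) - len(site) + 1):
        window = target[i:i + len(site)]
        mismatches = sum(a != b for a, b in zip(window, site))
        if mismatches == 0 or (near and mismatches == 1):
            hits.append((i, window, mismatches))
    return hits

let7 = "UGAGGUAGUAGGUUGUAUAGUU"           # let-7a-5p
print(seed_sites(let7, "AAACUACCUCAAA"))  # [(3, 'CUACCUC', 0)]
```

    Real predictors additionally distinguish site subtypes (7mer-m8, 7mer-A1, 8mer) and weigh context; the sketch shows only the core seed-complementarity scan.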

  9. Analysis of point source size on measurement accuracy of lateral point-spread function of confocal Raman microscopy

    Science.gov (United States)

    Fu, Shihang; Zhang, Li; Hu, Yao; Ding, Xiang

    2018-01-01

    Confocal Raman Microscopy (CRM) has matured into one of the most powerful instruments in analytical science because of its molecular sensitivity and high spatial resolution. Compared with conventional Raman microscopy, CRM can perform three-dimensional mapping of tiny samples and achieves high spatial resolution thanks to its unique pinhole. With the wide application of the instrument, there is a growing requirement for evaluating the imaging performance of the system. The point-spread function (PSF) is an important approach to evaluating the imaging capability of an optical instrument. Among the various methods for measuring the PSF, the point source method has been widely used because it is easy to operate and its results approximate the true PSF. In the point source method, the point source size has a significant impact on the final measurement accuracy. In this paper, the influence of point source size on the measurement accuracy of the PSF is analyzed and verified experimentally. A theoretical model of the lateral PSF for CRM is established and the effect of point source size on the full-width at half maximum of the lateral PSF is simulated. For long-term preservation and measurement convenience, a PSF measurement phantom of polydimethylsiloxane resin doped with polystyrene microspheres of different sizes is designed. The PSFs of the CRM measured with the different microsphere sizes are compared with the simulation results. The results provide a guide for measuring the PSF of the CRM.
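
    The effect under study can be reproduced numerically: the measured profile is the convolution of the true PSF with the finite source, so larger microspheres inflate the apparent FWHM. A rough one-dimensional sketch assuming a Gaussian true PSF of 0.5 um FWHM and top-hat sources (all numbers hypothetical, not from the paper):

```python
import math

def gaussian_profile(fwhm, xs):
    """Ideal lateral PSF modeled as a Gaussian with the given FWHM."""
    sigma = fwhm / (2 * math.sqrt(2 * math.log(2)))
    return [math.exp(-0.5 * (x / sigma) ** 2) for x in xs]

def tophat_profile(diameter, xs):
    """1-D profile of a finite point source (microsphere) of given diameter."""
    return [1.0 if abs(x) <= diameter / 2 else 0.0 for x in xs]

def convolve(a, b):
    """Discrete convolution of two profiles sampled on the same grid."""
    n = len(a)
    out = [0.0] * n
    for i in range(n):
        for j in range(n):
            k = i - j + n // 2
            if 0 <= k < n:
                out[i] += a[j] * b[k]
    return out

def fwhm_of(xs, ys):
    """Full width at half maximum of a sampled profile."""
    peak = max(ys)
    above = [x for x, y in zip(xs, ys) if y >= peak / 2]
    return max(above) - min(above)

dx = 0.01
xs = [i * dx for i in range(-200, 201)]  # positions in micrometres
true_psf = gaussian_profile(0.5, xs)     # assumed true lateral FWHM: 0.5 um
for d in (0.1, 0.5, 1.0):                # microsphere diameters in um
    measured = convolve(true_psf, tophat_profile(d, xs))
    print(d, round(fwhm_of(xs, measured), 2))
```

    A sphere much smaller than the PSF barely changes the measured width, while one comparable to it dominates the result, which is why the phantom needs appropriately small microspheres.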

  10. Targeting Villages for Rural Development Using Satellite Image Analysis.

    Science.gov (United States)

    Varshney, Kush R; Chen, George H; Abelson, Brian; Nowocin, Kendall; Sakhrani, Vivek; Xu, Ling; Spatocco, Brian L

    2015-03-01

    Satellite imagery is a form of big data that can be harnessed for many social good applications, especially those focusing on rural areas. In this article, we describe the common problem of selecting sites for and planning rural development activities as informed by remote sensing and satellite image analysis. Effective planning in poor rural areas benefits from information that is not available and is difficult to obtain at any appreciable scale by any means other than algorithms for estimation and inference from remotely sensed images. We discuss two cases in depth: the targeting of unconditional cash transfers to extremely poor villages in sub-Saharan Africa and the siting and planning of solar-powered microgrids in remote villages in India. From these cases, we draw out some common lessons broadly applicable to informed rural development.

  11. Analysis of the three-point-bend test for materials with unequal tension and compression properties

    Science.gov (United States)

    Chamis, C. C.

    1974-01-01

    Structural resins have moduli and strengths which are different in tension and compression. The three-point-bend test is used extensively in their characterization. An investigation is performed to derive all the equations needed for the analysis and test data reduction of the three-point-bend test. The governing equations are derived using well-known linear structural mechanics principles and are represented graphically. The stress concentration effects in the vicinity of the load point are investigated, and failure stress and failure initiation are examined.
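
    For orientation, the classical equal-modulus three-point-bend relations that the paper generalizes can be sketched as follows; the unequal tension/compression case shifts the neutral axis and is not reproduced here:

```python
def three_point_bend(F, L, b, h, E):
    """Classical three-point-bend relations for a rectangular beam with a
    single modulus E (the symmetric special case of the analysis):
      max bending stress   sigma = 3*F*L / (2*b*h**2)
      midspan deflection   delta = F*L**3 / (48*E*I),  I = b*h**3 / 12
    F: center load, L: span, b: width, h: thickness.
    """
    I = b * h ** 3 / 12
    sigma = 3 * F * L / (2 * b * h ** 2)
    delta = F * L ** 3 / (48 * E * I)
    return sigma, delta

# Hypothetical resin coupon: 100 N over an 80 mm span, 10 x 4 mm section,
# E = 3 GPa (consistent units: N, mm, MPa)
sigma, delta = three_point_bend(F=100, L=80, b=10, h=4, E=3000)
print(sigma, delta)
```

    With unequal tensile and compressive moduli, the neutral axis moves toward the stiffer side and the stress formula above no longer holds, which is precisely why the paper derives the full set of governing equations.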

  12. Watershed-based point sources permitting strategy and dynamic permit-trading analysis.

    Science.gov (United States)

    Ning, Shu-Kuang; Chang, Ni-Bin

    2007-09-01

    Permit-trading policy in a total maximum daily load (TMDL) program may provide an additional avenue to produce environmental benefit that closely approximates what would be achieved through a command-and-control approach, at relatively lower cost. One of the important considerations affecting an effective trading mechanism is determining the dynamic transaction prices and trading ratios in response to seasonal changes of assimilative capacity in the river. Advanced studies associated with multi-temporal, spatially varied trading ratios among point sources to manage water pollution hold considerable potential for industries and policy makers alike. This paper presents an integrated simulation and optimization analysis for generating spatially varied trading ratios and evaluating seasonal transaction prices accordingly. It is designed to configure a basin-wide permit-trading structure and provide decision makers with a wealth of cost-effective, technology-oriented, risk-informed, and community-based management strategies. The case study, seamlessly integrating a QUAL2E simulation model with an optimal waste load allocation (WLA) scheme in a designated TMDL study area, helps in understanding the complexity of environmental resource values that vary over space and time. The pollutants of concern in this region, which are eligible for trading, mainly include biochemical oxygen demand (BOD) and ammonia-nitrogen (NH3-N). The problem solution, as a consequence, suggests an array of waste load reduction targets in a well-defined WLA scheme and exhibits a dynamic permit-trading framework among different sub-watersheds in the study area. Research findings gained in this paper may extend to any transferable dynamic-discharge permit (TDDP) program worldwide.
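
    A common way to define a spatially varied trading ratio is as the ratio of the two sources' transfer coefficients to the water-quality control point, so the ratio changes with season as assimilative capacity changes. A toy sketch of that bookkeeping (the coefficients are hypothetical and are not values from the QUAL2E case study):

```python
def trading_ratio(a_buyer, a_seller):
    """Permits a buyer must retire per unit of discharge sold, based on
    each source's transfer coefficient to the control point (impact per
    unit discharge).  A common definition in trading programs; the
    coefficients used below are hypothetical."""
    return a_buyer / a_seller

# Hypothetical seasonal BOD transfer coefficients to the control point
coeff = {"dry": {"upstream": 0.9, "downstream": 0.3},
         "wet": {"upstream": 0.6, "downstream": 0.4}}

for season, a in coeff.items():
    r = trading_ratio(a["upstream"], a["downstream"])
    print(season, round(r, 2))
```

    Because low-flow (dry) seasons reduce assimilative capacity, the upstream source's impact per unit discharge rises and the ratio it faces when buying permits rises with it, which is the dynamic behaviour the paper quantifies.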

  13. Audible sonar images generated with proprioception for target analysis.

    Science.gov (United States)

    Kuc, Roman B

    2017-05-01

    Some blind humans have demonstrated the ability to detect and classify objects with echolocation using palatal clicks. An audible-sonar robot mimics human click emissions, binaural hearing, and head movements to extract interaural time and level differences from target echoes. Targets of various complexity are examined by transverse displacements of the sonar and by target pose rotations that model movements performed by the blind. Controlled sonar movements executed by the robot provide data that model proprioception information available to blind humans for examining targets from various aspects. The audible sonar uses this sonar location and orientation information to form two-dimensional target images that are similar to medical diagnostic ultrasound tomograms. Simple targets, such as single round and square posts, produce distinguishable and recognizable images. More complex targets configured with several simple objects generate diffraction effects and multiple reflections that produce image artifacts. The presentation illustrates the capabilities and limitations of target classification from audible sonar images.
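
    The interaural time difference the robot extracts follows from simple geometry: the path-length difference between the two receivers divided by the speed of sound. A sketch with an illustrative receiver spacing (not the robot's actual dimensions):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air

def interaural_time_difference(target_x, target_y, ear_spacing=0.15):
    """ITD for a point target, from the exact path-length difference
    between two receivers at (-d/2, 0) and (+d/2, 0).  Positive ITD
    means the target is to the right (closer to the right receiver).
    Geometry and spacing are illustrative, not the robot's values."""
    d = ear_spacing / 2
    left = math.hypot(target_x + d, target_y)
    right = math.hypot(target_x - d, target_y)
    return (left - right) / SPEED_OF_SOUND

# Target dead ahead gives zero ITD; a target to the right, positive ITD
print(interaural_time_difference(0.0, 1.0))
print(interaural_time_difference(0.5, 1.0))
```

    Combining such cues with the known sonar pose from proprioception is what lets echo delays be back-projected into the tomogram-like images the abstract describes.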

  14. Localization of nerve entry points as targets to block spasticity of the deep posterior compartment muscles of the leg.

    Science.gov (United States)

    Hu, Shuaiyu; Zhuo, Lifan; Zhang, Xiaoming; Yang, Shengbo

    2017-10-01

    To identify the optimal body surface puncture locations and the depths of nerve entry points (NEPs) in the deep posterior compartment muscles of the leg, 60 lower limbs of 30 adult cadavers were dissected in the prone position. A curved line on the skin surface joining the lateral to the medial epicondyle of the femur was taken as the horizontal reference line (H). Another curved line joining the lateral epicondyle of the femur to the lateral malleolus was designated the longitudinal reference line (L). Following dissection, the NEPs were labeled with barium sulfate and then subjected to spiral computed tomography scanning. The projection point of the NEP on the posterior skin surface of the leg was designated P, and the projection in the opposite direction across the transverse plane was designated P'. The intersections of P on H and L were identified as PH and PL, and their positions and the depth of the NEP along PP' were measured using the Syngo system and expressed as percentages of H, L, and PP'. The PH points of the tibialis posterior, flexor hallucis longus and flexor digitorum longus muscles were located at 38.10, 46.20, and 55.21% of H, respectively. The PL points were located at 25.35, 41.30, and 45.39% of L, respectively. The depths of the NEPs were 49.11, 54.64, and 55.95% of PP', respectively. The accurate location of these NEPs should improve the efficacy and efficiency of chemical neurolysis for treating spasticity of the deep posterior compartment muscles of the leg. Clin. Anat. 30:855-860, 2017. © 2017 Wiley Periodicals, Inc.

  15. Structure Line Detection from LIDAR Point Clouds Using Topological Elevation Analysis

    Science.gov (United States)

    Lo, C. Y.; Chen, L. C.

    2012-07-01

    Airborne LIDAR point clouds, which provide considerable numbers of points on object surfaces, are essential to building modeling. In the last two decades, studies have developed approaches to identify structure lines using two main strategies, data-driven and model-driven. These studies have shown that automatic modeling processes depend on certain considerations, such as the thresholds used, initial values, designed formulas, and predefined cues. With the development of laser scanning systems, scanning rates have increased and can provide point clouds with higher point density. Therefore, this study proposes using topological elevation analysis (TEA) to detect structure lines instead of threshold-dependent concepts and predefined constraints. The analysis contains two parts: data pre-processing and structure line detection. To preserve the original elevation information, a pseudo-grid for generating digital surface models is produced in the first part. The highest point in each grid cell is set as the elevation value, and its original three-dimensional position is preserved. In the second part, using TEA, the structure lines are identified based on the topology of local elevation changes in two directions. Because structure lines have characteristic geometric properties, their locations show small relief in the radial direction and steep elevation changes in the circular direction. Following the proposed approach, TEA can determine 3D line information without selecting thresholds. For validation, the TEA results are compared with those of a region-growing approach. The results indicate that the proposed method can produce structure lines from dense point clouds.
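
    The pre-processing step described above, keeping the highest return per grid cell while preserving its original 3D coordinates, can be sketched as:

```python
def pseudo_grid(points, cell):
    """Build the pseudo-grid DSM: for each cell, keep the full 3D point
    with the highest elevation, so the original coordinates survive
    instead of being resampled to cell centers."""
    grid = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        if key not in grid or z > grid[key][2]:
            grid[key] = (x, y, z)
    return grid

# Tiny illustrative cloud: two returns fall in the same 1 m cell
cloud = [(0.2, 0.3, 5.0), (0.7, 0.1, 9.5), (1.4, 0.2, 3.2)]
dsm = pseudo_grid(cloud, cell=1.0)
print(dsm[(0, 0)])   # (0.7, 0.1, 9.5)
```

    Keeping the winning point's true (x, y, z), rather than a cell-center elevation, is what lets the later topological step recover 3D line positions without introducing grid-quantization error.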

  16. Point prevalence of access block and overcrowding in New Zealand emergency departments in 2010 and their relationship to the 'Shorter Stays in ED' target.

    Science.gov (United States)

    Jones, Peter G; Olsen, Sarah

    2011-10-01

    To document the extent of access block and ED overcrowding in New Zealand in 2010 and to determine whether these were linked to a hospital's ability to meet the Shorter Stays in ED target. Surveys of all New Zealand EDs were undertaken at two points in time in 2010 to determine ED occupancy. Data on target achievement during the corresponding time periods were obtained from the Ministry of Health. In tertiary and secondary hospitals, respectively, access block was seen in 64% versus 23% (P = 0.05) and overcrowding was seen in 57.1% versus 39% (P = 0.45). No hospital with access block met the 'Shorter Stays' target, compared with 60% without access block (P = 0.001). Twenty-three per cent of hospitals with ED overcrowding met the target, compared with 43% without ED overcrowding (P = 0.42). The number of patients experiencing a ≥8 h delay to admission was 25 in May and 59 in August (P = 0.04), representing 45.5% and 79.7% of patients waiting for admission, respectively (P = 0.08). Hospital access block was seen more often in larger hospitals and was significantly associated with failure to meet the 'Shorter Stays in ED' health target, whereas ED overcrowding was seen in both small and large hospitals but was not associated with failure to meet the target. © 2011 The Authors. EMA © 2011 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.

  17. Use of Point Clouds for River Corridor Analysis, Management, and Design

    Science.gov (United States)

    Pasternack, G. B.

    2015-12-01

    River scientists, managers, and engineers are increasingly working with 1-m scale point clouds of the Earth's surface collected using many different technologies. Although point clouds have tremendous potential, they are fraught with errors and challenges, which has led to an abundance of methodological studies about data quality and coping with uncertainty. In addition, the abundance and spatial autocorrelation of the data necessitate a paradigm shift in scientific analysis away from classic statistical analysis focusing on central tendency and towards an understanding of the primary importance of spatially organized landscape complexity. Looking beyond data processing methods, there are terrific opportunities for linking scientific data analysis of existing conditions with engineering data synthesis to build new landscapes or enhance existing ones to achieve more environmental functionality. In pursuit of these goals the new paradigm of near-census river science is emerging to support mutual analysis and synthesis of point clouds of river corridors focusing on the 1-m scale as the basic building block for characterizing geomorphic processes and ecological functions. Examples of near-census analysis of riverine topography and biota will be presented. A demonstration will be shown using a new software platform to illustrate how key metrics extracted from point clouds can be used to design synthetic river corridors with multiple scales of organized landscape complexity that yield specific geomorphic processes and ecological functions.

  18. Fourier domain target transformation analysis in the thermal infrared

    Science.gov (United States)

    Anderson, D. L.

    1993-01-01

    Remote sensing uses of principal component analysis (PCA) of multispectral images include band selection and optimal color selection for display of information content. PCA has also been used for quantitative determination of mineral types and abundances given end-member spectra. The preliminary results of an investigation of target transformation PCA (TTPCA) in the Fourier domain, both to identify end-member spectra in an unknown spectrum and then to calculate the relative concentrations of the selected end members, are presented. Identification of end-member spectra in an unknown sample has previously been performed through band matching, expert systems, and binary classifiers. Both band matching and expert-system techniques require the analyst to select bands or combinations of bands unique to each end member. Thermal infrared mineral spectra have broad spectral features which vary subtly with composition, which makes identification of unique features difficult. Alternatively, whole spectra can be used in the classification process, in which case there is no need for an expert to identify unique features. Use of binary classifiers on whole spectra to identify end-member components has met with some success. These techniques can be used, along with a least-squares fit on the identified end members, to derive compositional information. An alternative to the approach outlined above uses target transformation in conjunction with PCA to both identify and quantify the composition of unknown spectra. Preprocessing the library and unknown spectra into the Fourier domain, and retaining only a specific number of the components, allows significant data-volume reduction while maintaining a linear relationship in a Beer's-law sense. The approach taken here is to iteratively calculate concentrations, reducing the number of end-member components until only non-negative concentrations remain.

  19. Fixed-point bifurcation analysis in biological models using interval polynomials theory.

    Science.gov (United States)

    Rigatos, Gerasimos G

    2014-06-01

    The paper proposes a systematic method for fixed-point bifurcation analysis in circadian cells and similar biological models using interval polynomials theory. The stages for performing fixed-point bifurcation analysis in such biological systems comprise (i) the computation of fixed points as functions of the bifurcation parameter and (ii) the evaluation of the type of stability for each fixed point through the computation of the eigenvalues of the Jacobian matrix that is associated with the system's nonlinear dynamics model. Stage (ii) requires the computation of the roots of the characteristic polynomial of the Jacobian matrix. This problem is nontrivial since the coefficients of the characteristic polynomial are functions of the bifurcation parameter and the latter varies within intervals. To obtain a clear view about the values of the roots of the characteristic polynomial and about the stability features they provide to the system, the use of interval polynomials theory and particularly of Kharitonov's stability theorem is proposed. In this approach, the study of the stability of a characteristic polynomial with coefficients that vary in intervals is equivalent to the study of the stability of four polynomials with crisp coefficients computed from the boundaries of the aforementioned intervals. The efficiency of the proposed approach for the analysis of fixed-point bifurcations in nonlinear models of biological neurons is tested through numerical and simulation experiments.
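
    The four crisp polynomials invoked by Kharitonov's theorem are built by alternating lower and upper coefficient bounds in a fixed period-4 pattern. A sketch of their construction (the Hurwitz test of each resulting polynomial is omitted):

```python
def kharitonov_polynomials(lower, upper):
    """Build the four Kharitonov polynomials of an interval polynomial
    p(s) = sum_i [lower[i], upper[i]] * s**i, with coefficients listed
    from the constant term upward.  By Kharitonov's theorem, the entire
    interval family is Hurwitz-stable iff these four polynomials are."""
    # Bound-selection patterns repeat with period 4 from the constant term:
    patterns = {
        "K1": (0, 0, 1, 1),  # min, min, max, max, ...
        "K2": (1, 1, 0, 0),  # max, max, min, min, ...
        "K3": (0, 1, 1, 0),  # min, max, max, min, ...
        "K4": (1, 0, 0, 1),  # max, min, min, max, ...
    }
    bounds = (lower, upper)
    return {name: [bounds[pat[i % 4]][i] for i in range(len(lower))]
            for name, pat in patterns.items()}

# Interval characteristic polynomial [1,2] + [2,3]s + [3,4]s^2 + s^3
K = kharitonov_polynomials([1, 2, 3, 1], [2, 3, 4, 1])
print(K["K1"])   # [1, 2, 4, 1]
```

    Checking the stability of each of the four (e.g. with a Routh table) then settles the stability of the whole interval family, which is the reduction the paper exploits for the bifurcation-parameter intervals.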

  20. Error Analysis of Fast Moving Target Geo-location in Wide Area Surveillance Ground Moving Target Indication Mode

    Directory of Open Access Journals (Sweden)

    Zheng Shi-chao

    2013-12-01

    As an important mode in airborne radar systems, Wide Area Surveillance Ground Moving Target Indication (WAS-GMTI) mode has the ability to monitor a large area in a short time, after which the detected moving targets can be located quickly. In a real environment, however, many factors introduce considerable errors into the location of moving targets. In this paper, a fast location method based on the characteristics of moving targets in WAS-GMTI mode is utilized. In order to improve the location performance, the factors that introduce location errors are analyzed and the moving targets are relocated. Finally, the analysis of those factors is shown to be reasonable by simulation and real-data experiments.

  1. Hazard analysis and critical control point (HACCP) history and conceptual overview.

    Science.gov (United States)

    Hulebak, Karen L; Schlosser, Wayne

    2002-06-01

    Hazard Analysis and Critical Control Point (HACCP) is a system that enables the production of safe meat and poultry products through the thorough analysis of production processes, identification of all hazards that are likely to occur in the production establishment, identification of the critical points in the process at which these hazards may be introduced into the product and therefore should be controlled, establishment of critical limits for control at those points, verification of these prescribed steps, and the methods by which the processing establishment and the regulatory authority can monitor how well process control through the HACCP plan is working. The history of the development of HACCP is reviewed, and examples of practical applications of HACCP are described.

  2. The acquisition of full fluoroquinolone resistance in Salmonella Typhi by accumulation of point mutations in the topoisomerase targets.

    Science.gov (United States)

    Turner, Arthur K; Nair, Satheesh; Wain, John

    2006-10-01

    To determine the contribution of point mutations in the gyrA and parC genes to fluoroquinolone resistance in Salmonella Typhi. Point mutations resulting in Ser-83-->Phe, Ser-83-->Tyr and Asp-87-->Asn amino acid substitutions in GyrA and Glu-84-->Lys in ParC were introduced into a quinolone-susceptible, attenuated strain of Salmonella Typhi using suicide vector technology. This is the first time that this approach has been used in Salmonella; it abrogates the need for selection with quinolone antibacterials in the investigation of resistance mutations. A panel of mutants was created using this methodology and tested for quinolone resistance. The ParC substitution alone made no difference to quinolone susceptibility. Any single GyrA substitution resulted in resistance to nalidixic acid (MIC ≥ 512 mg/L) and increased the MICs of the fluoroquinolones, including ofloxacin, by up to 23-fold. Strains combining three substitutions (Ser-83-->Phe or Tyr and Asp-87-->Asn in GyrA with Glu-84-->Lys in ParC) showed high levels of resistance to all the fluoroquinolones tested (MICs: gatifloxacin, 3-4 mg/L; ofloxacin, 32 mg/L; ciprofloxacin, 32-64 mg/L). In Salmonella Typhi the fluoroquinolones tested act on GyrA and, at higher concentrations, on ParC. Single point mutations conferred reduced susceptibility to ofloxacin, ciprofloxacin and gatifloxacin. Three mutations conferred resistance to ofloxacin (32 mg/L), ciprofloxacin (32 mg/L) and to the more active fluoroquinolone gatifloxacin (MIC ≥ 3 mg/L). These results predict that the use of ofloxacin or ciprofloxacin will select for resistance to gatifloxacin in nature.

  3. Analysis of Fast Radix-10 Digit Recurrence Algorithms for Fixed-Point and Floating-Point Dividers on FPGAs

    Directory of Open Access Journals (Sweden)

    Malte Baesler

    2013-01-01

    and decimal formats, for instance, commercial, financial, and insurance applications. In this paper we present five different radix-10 digit recurrence dividers for FPGA architectures. The first one implements a simple restoring shift-and-subtract algorithm, whereas each of the other four implementations performs a nonrestoring digit recurrence algorithm with signed-digit redundant quotient calculation and carry-save representation of the residuals. More precisely, the quotient digit selection function of the second divider is implemented fully by means of a ROM, the quotient digit selection functions of the third and fourth dividers are based on carry-propagate adders, and the fifth divider decomposes each digit into three components and requires neither a ROM nor a multiplexer. Furthermore, the fixed-point divider is extended to support IEEE 754-2008 compliant decimal floating-point division for the decimal64 data format. Finally, the algorithms have been synthesized on a Xilinx Virtex-5 FPGA, and implementation results are given.
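
    The first (restoring shift-and-subtract) scheme is easy to sketch in software, even though the paper's implementations are hardware designs; each iteration produces one decimal quotient digit:

```python
def decimal_restoring_divide(dividend, divisor, ndigits):
    """Radix-10 restoring shift-and-subtract division, the simplest of
    the digit recurrence schemes: each iteration shifts the partial
    remainder one decimal place left and subtracts the divisor until the
    result would go negative, yielding one quotient digit in 0-9.
    Integer operands are scaled so that dividend < divisor, giving a
    purely fractional quotient."""
    assert 0 <= dividend < divisor
    rem = dividend
    q = []
    for _ in range(ndigits):
        rem *= 10                    # radix-10 left shift
        digit = 0
        while rem >= divisor:        # restoring subtraction loop
            rem -= divisor
            digit += 1
        q.append(digit)
    return q, rem

q, r = decimal_restoring_divide(1, 7, 6)
print(q)   # [1, 4, 2, 8, 5, 7], i.e. 1/7 = 0.142857...
```

    The nonrestoring variants in the paper avoid the worst-case nine subtractions per digit by using signed-digit quotients and carry-save residuals, trading this simple loop for faster hardware.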

  4. Study on characteristic points of boiling curve by using wavelet analysis and genetic algorithm

    International Nuclear Information System (INIS)

    Wei Huiming; Su Guanghui; Qiu Suizheng; Yang Xingbo

    2009-01-01

    Based on the wavelet-analysis theory of signal singularity detection, the critical heat flux (CHF) and the minimum film boiling starting point (q_min) of boiling curves can be detected and analyzed using wavelet multi-resolution analysis. To predict the CHF in engineering applications, empirical relations were obtained based on a genetic algorithm. The results of wavelet detection and genetic-algorithm prediction agree very well with experimental data. (authors)
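
    Singularity detection by wavelet analysis rests on the fact that detail coefficients are large where the signal changes abruptly. A one-level Haar sketch on a synthetic piecewise signal (illustrative data, not the authors' boiling curves):

```python
def haar_details(signal):
    """One-level Haar wavelet detail coefficients: half-differences of
    non-overlapping sample pairs.  Large magnitudes flag abrupt local
    changes (singular points) in the signal."""
    return [(signal[2 * i] - signal[2 * i + 1]) / 2
            for i in range(len(signal) // 2)]

# Synthetic signal with a jump between samples 8 and 9 (hypothetical)
sig = [1.0] * 9 + [5.0] * 11
d = haar_details(sig)
peak = max(range(len(d)), key=lambda i: abs(d[i]))
print(peak)   # 4: the pair straddling the jump
```

    A full multi-resolution analysis repeats this across scales and tracks how the modulus maxima decay, which is what lets characteristic points like CHF and q_min be localized robustly rather than by a single-scale difference.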

  5. Vapor Pressure Data Analysis and Correlation Methodology for Data Spanning the Melting Point

    Science.gov (United States)

    2013-10-01

    specimen is adequately degassed, the liquid menisci in the U-tube are brought to the same level and the pressure is read on the manometer. The measurement... (Final report ECBC-CR-135, covering March-June 2013.)

  6. Chopped or Long Roughage: What Do Calves Prefer? Using Cross Point Analysis of Double Demand Functions

    DEFF Research Database (Denmark)

    Webb, Laura E.; Jensen, Margit Bak; Engel, Bas

    2014-01-01

    The present study aimed to quantify calves' (Bos taurus) preference for long versus chopped hay and straw, and hay versus straw, using cross point analysis of double demand functions, in a context where energy intake was not a limiting factor. Nine calves, fed milk replacer and concentrate, were...

  7. Methods for cross point analysis of double-demand functions in assessing animal preferences

    DEFF Research Database (Denmark)

    Engel, Bas; Webb, Laura E.; Jensen, Margit Bak

    2014-01-01

    Cross point analysis of double demand functions provides a compelling way to quantify the strength of animal preferences for two simultaneously presented resources. During daily sessions, animals have to work to gain access to (a portion of) either resource, e.g. by pressing one of two panels a r...

  8. Size matters: point pattern analysis biases the estimation of spatial properties of stomata distribution.

    Science.gov (United States)

    Naulin, Paulette I; Valenzuela, Gerardo; Estay, Sergio A

    2017-03-01

    Stomata distribution is an example of biological patterning. Formal methods used to study stomata patterning are generally based on point-pattern analysis, which assumes that stomata are points and ignores the constraints imposed by size on the placement of neighbors. The inclusion of size in the analysis requires the use of a null model based on finite-size object geometry. In this study, we compare the results obtained by analyzing samples from several species using point and disc null models. The results show that depending on the null model used, there was a 20% reduction in the number of samples classified as uniform; these results suggest that stomata patterning is not as general as currently reported. Some samples changed drastically from being classified as uniform to being classified as clustered. In samples of Arabidopsis thaliana, only the disc model identified clustering at high densities of stomata. This reinforces the importance of selecting an appropriate null model to avoid incorrect inferences about underlying biological mechanisms. Based on the results gathered here, we encourage researchers to abandon point-pattern analysis when studying stomata patterning; more realistic conclusions can be drawn from finite-size object analysis. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
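
    The size bias described here can be made concrete with a standard point-pattern statistic, the Clark-Evans index, which compares the observed mean nearest-neighbour distance against the value expected for a random pattern of the same density. Being point-based, it ignores stomatal size, which is exactly the limitation at issue. A sketch (not the authors' analysis; edge correction omitted):

```python
import math

def clark_evans(points, area):
    """Clark-Evans aggregation index: observed mean nearest-neighbour
    distance divided by the value expected for a random (Poisson)
    pattern of the same density, 0.5 / sqrt(n / area).  R near 1 means
    random, R up to ~2.15 means regular, R < 1 means clustered.
    No edge correction is applied in this sketch."""
    n = len(points)
    nn = []
    for i, (xi, yi) in enumerate(points):
        nn.append(min(math.hypot(xi - xj, yi - yj)
                      for j, (xj, yj) in enumerate(points) if j != i))
    observed = sum(nn) / n
    expected = 0.5 / math.sqrt(n / area)
    return observed / expected

# A perfect 10 x 10 unit-spaced grid: strongly regular, R = 2 exactly
grid = [(x, y) for x in range(10) for y in range(10)]
print(clark_evans(grid, area=100.0))
```

    Under a disc (hard-core) null model, even "random" placements of finite-sized stomata cannot approach each other closer than one diameter, so the baseline R exceeds 1; judging the observed R against a point-based null therefore overstates uniformity, which is the paper's argument for finite-size null models.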

  9. Microchip capillary electrophoresis for point-of-care analysis of lithium

    NARCIS (Netherlands)

    Vrouwe, E.X.; Lüttge, Regina; Vermes, I.; van den Berg, Albert

    Background: Microchip capillary electrophoresis (CE) is a promising method for chemical analysis of complex samples such as whole blood. We evaluated the method for point-of-care testing of lithium. Methods: Chemical separation was performed on standard glass microchip CE devices with a conductivity

  10. Kinetic stability analysis of protein assembly on the center manifold around the critical point.

    Science.gov (United States)

    Tsuruyama, Tatsuaki

    2017-02-02

    Non-linear kinetic analysis is a useful method for illustrating the dynamic behavior of cellular biological systems. To date, center manifold theory (CMT) has not been sufficiently applied to stability analysis of biological systems. The aim of this study is to demonstrate the application of CMT to kinetic analysis of protein assembly and disassembly, and to propose a novel framework for nonlinear multi-parametric analysis. We propose a protein assembly model with nonlinear kinetics arising from fluctuations in monomer concentrations during diffusion. When the diffusion process of a monomer is self-limited, making the kinetics nonlinear, numerical simulations suggest that assembly and disassembly can oscillate near the critical point. We applied CMT to a detailed kinetic analysis of the center manifold around the critical point and successfully demonstrated a bifurcation there, which explains the observed oscillation. The stability kinetics of the present model based on CMT illustrates a unique feature of protein assembly, namely its non-linear behavior. Our findings are expected to provide a methodology for the analysis of biological systems.

  11. Radiological error: analysis, standard setting, targeted instruction and teamworking

    International Nuclear Information System (INIS)

    FitzGerald, Richard

    2005-01-01

    Diagnostic radiology does not have objective benchmarks for acceptable levels of missed diagnoses [1]. Until now, data collection of radiological discrepancies has been very time consuming. The culture within the specialty did not encourage it. However, public concern about patient safety is increasing. There have been recent innovations in compiling radiological interpretive discrepancy rates which may facilitate radiological standard setting. However standard setting alone will not optimise radiologists' performance or patient safety. We must use these new techniques in radiological discrepancy detection to stimulate greater knowledge sharing, targeted instruction and teamworking among radiologists. Not all radiological discrepancies are errors. Radiological discrepancy programmes must not be abused as an instrument for discrediting individual radiologists. Discrepancy rates must not be distorted as a weapon in turf battles. Radiological errors may be due to many causes and are often multifactorial. A systems approach to radiological error is required. Meaningful analysis of radiological discrepancies and errors is challenging. Valid standard setting will take time. Meanwhile, we need to develop top-up training, mentoring and rehabilitation programmes. (orig.)

  12. Nonlinear consider covariance analysis using a sigma-point filter formulation

    Science.gov (United States)

    Lisano, Michael E.

    2006-01-01

    The research reported here extends the mathematical formulation of nonlinear, sigma-point estimators to enable consider covariance analysis for dynamical systems. This paper presents a novel sigma-point consider filter algorithm, for consider-parameterized nonlinear estimation, following the unscented Kalman filter (UKF) variation on the sigma-point filter formulation, which requires no partial derivatives of dynamics models or measurement models with respect to the parameter list. It is shown that, consistent with the attributes of sigma-point estimators, a consider-parameterized sigma-point estimator can be developed entirely without requiring the derivation of any partial-derivative matrices related to the dynamical system, the measurements, or the considered parameters, which appears to be an advantage over the formulation of a linear-theory sequential consider estimator. It is also demonstrated that a consider covariance analysis performed with this 'partial-derivative-free' formulation yields equivalent results to the linear-theory consider filter, for purely linear problems.
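
    The derivative-free propagation at the heart of such sigma-point estimators is the unscented transform. The sketch below is a generic UKF-style transform with conventional defaults for `alpha`, `beta` and `kappa`, not Lisano's consider-filter algorithm itself:

```python
import numpy as np

def unscented_transform(m, P, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate mean m and covariance P through a nonlinear map f using
    the standard UKF sigma-point scheme -- no Jacobians of f required."""
    n = len(m)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)        # matrix square root
    sigma = np.vstack([m, m + S.T, m - S.T])     # 2n + 1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    Y = np.array([f(s) for s in sigma])          # propagate each point
    mean = Wm @ Y
    diff = Y - mean
    cov = (Wc[:, None] * diff).T @ diff
    return mean, cov
```

    For a linear map the transform reproduces the mean and covariance exactly, which is a convenient sanity check.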

  13. Comparison between non-invasive methods used on paintings by Goya and his contemporaries: hyperspectral imaging vs. point-by-point spectroscopic analysis.

    Science.gov (United States)

    Daniel, Floréal; Mounier, Aurélie; Pérez-Arantegui, Josefina; Pardos, Carlos; Prieto-Taboada, Nagore; Fdez-Ortiz de Vallejuelo, Silvia; Castro, Kepa

    2017-06-01

    The development of non-invasive techniques for the characterization of pigments is crucial in order to preserve the integrity of the artwork. In this sense, the usefulness of hyperspectral imaging was demonstrated. It allows pigment characterization of the whole painting. However, it also sometimes requires complementation by other point-by-point techniques. In the present article, the advantages of hyperspectral imaging over point-by-point spectroscopic analysis were evaluated. For that purpose, three paintings were analysed by hyperspectral imaging, handheld X-ray fluorescence and handheld Raman spectroscopy in order to determine the best non-invasive technique for pigment identification. Thanks to this work, the main pigments used in Aragonese artworks, and especially in Goya's paintings, were identified and mapped by imaging reflection spectroscopy. All the analysed pigments corresponded to those used at the time of Goya. Regarding the techniques used, the information obtained by the hyperspectral imaging and point-by-point analysis has been, in general, different and complementary. Given this fact, selecting only one technique is not recommended, and the present work demonstrates the usefulness of the combination of all the techniques used as the best non-invasive methodology for the pigments' characterization. Moreover, the proposed methodology is a relatively quick procedure that allows a larger number of Goya's paintings in the museum to be surveyed, increasing the possibility of obtaining significant results and providing a chance for extensive comparisons, which are relevant from the point of view of art history.

  14. Point pattern analysis applied to flood and landslide damage events in Switzerland (1972-2009)

    Science.gov (United States)

    Barbería, Laura; Schulte, Lothar; Carvalho, Filipe; Peña, Juan Carlos

    2017-04-01

    Damage caused by meteorological and hydrological extreme events depends on many factors, not only on hazard, but also on exposure and vulnerability. In order to reach a better understanding of the relation of these complex factors, their spatial pattern and underlying processes, the spatial dependency between values of damage recorded at sites of different distances can be investigated by point pattern analysis. For the Swiss flood and landslide damage database (1972-2009) first steps of point pattern analysis have been carried out. The most severe events have been selected (severe, very severe and catastrophic, according to GEES classification, a total number of 784 damage points) and Ripley's K-test and L-test have been performed, amongst others. For this purpose, R's library spatstat has been used. The results confirm that the damage points present a statistically significant clustered pattern, which could be connected to prevalence of damages near watercourses and also to rainfall distribution of each event, together with other factors. On the other hand, bivariate analysis shows there is no segregated pattern depending on process type: flood/debris flow vs landslide. This close relation points to a coupling between slope and fluvial processes, connectivity between small-size and middle-size catchments and the influence of spatial distribution of precipitation, temperature (snow melt and snow line) and other predisposing factors such as soil moisture, land-cover and environmental conditions. Therefore, further studies will investigate the relationship between the spatial pattern and one or more covariates, such as elevation, distance from watercourse or land use. The final goal will be to perform a regression model to the data, so that the adjusted model predicts the intensity of the point process as a function of the above mentioned covariates.
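
    Ripley's K-test compares an estimate of the K-function against its value under complete spatial randomness (pi*r^2 in the plane). A naive estimator, without the edge corrections that spatstat's implementations apply, can be sketched as:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley's K estimate (no edge correction): the expected number
    of further points within distance r of a typical point, divided by the
    intensity n/area."""
    pts = np.asarray(points, dtype=float)
    n = len(pts)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    close = (d <= r) & ~np.eye(n, dtype=bool)   # exclude self-pairs
    return area * close.sum() / (n * (n - 1))
```

    Values of the estimate well above pi*r^2 indicate clustering, as reported for the damage points.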

  15. Dosimetric analysis at ICRU reference points in HDR-brachytherapy of cervical carcinoma.

    Science.gov (United States)

    Eich, H T; Haverkamp, U; Micke, O; Prott, F J; Müller, R P

    2000-01-01

    In vivo dosimetry in bladder and rectum as well as determining doses at the reference points suggested in ICRU report 38 contribute to quality assurance in HDR-brachytherapy of cervical carcinoma, especially to minimize side effects. In order to gain information regarding the radiation exposure at ICRU reference points in rectum, bladder, ureter and regional lymph nodes, these points were calculated (by digitisation) by means of orthogonal radiographs of 11 applications in patients with cervical carcinoma, who received primary radiotherapy. In addition, the dose at the ICRU rectum reference point was compared to the results of in vivo measurements in the rectum. The in vivo measurements were a factor of 1.5 below the doses determined for the ICRU rectum reference point (4.05 +/- 0.68 Gy versus 6.11 +/- 1.63 Gy). Reasons for this were: calibration errors, non-orthogonal radiographs, movement of applicator and probe in the time span between X-ray and application, and missing contact between probe and anterior rectal wall. The standard deviation of calculations at ICRU reference points was on average +/- 30%. Possible reasons for the relatively large standard deviation were difficulties in defining the points, identifying them on radiographs and the different locations of the applicators. Although 3D CT, US or MR based treatment planning using dose volume histogram analysis is increasingly established, this simple procedure of marking and digitising the ICRU reference points lengthened treatment planning by only 5 to 10 minutes. The advantages of in vivo dosimetry are easy practicability and the possibility to determine rectum doses during radiation. The advantages of computer-aided planning at ICRU reference points are that calculations are available before radiation and that they can still be taken into account for treatment planning. Both methods should be applied in HDR-brachytherapy of cervical carcinoma.

  16. IMAGE-PLANE ANALYSIS OF n-POINT-MASS LENS CRITICAL CURVES AND CAUSTICS

    International Nuclear Information System (INIS)

    Danek, Kamil; Heyrovský, David

    2015-01-01

    The interpretation of gravitational microlensing events caused by planetary systems or multiple stars is based on the n-point-mass lens model. The first planets detected by microlensing were well described by the two-point-mass model of a star with one planet. By the end of 2014, four events involving three-point-mass lenses had been announced. Two of the lenses were stars with two planetary companions each; two were binary stars with a planet orbiting one component. While the two-point-mass model is well understood, the same cannot be said for lenses with three or more components. Even the range of possible critical-curve topologies and caustic geometries of the three-point-mass lens remains unknown. In this paper we provide new tools for mapping the critical-curve topology and caustic cusp number in the parameter space of n-point-mass lenses. We perform our analysis in the image plane of the lens. We show that all contours of the Jacobian are critical curves of re-scaled versions of the lens configuration. Utilizing this property further, we introduce the cusp curve to identify cusp-image positions on all contours simultaneously. In order to track cusp-number changes in caustic metamorphoses, we define the morph curve, which pinpoints the positions of metamorphosis-point images along the cusp curve. We demonstrate the usage of both curves on simple two- and three-point-mass lens examples. For the three simplest caustic metamorphoses we illustrate the local structure of the image and source planes
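
    In complex notation, the Jacobian determinant whose contours the authors analyze has a compact form for the n-point-mass lens, det J(z) = 1 - |sum_i m_i/(z - z_i)^2|^2 (positions in Einstein-radius units); critical curves are its zero contours. A minimal evaluation sketch:

```python
def lens_jacobian_det(z, masses, positions):
    """det J of the n-point-mass lens mapping at image-plane position z
    (complex), with masses m_i at complex positions z_i:
        det J = 1 - |sum_i m_i / (z - z_i)**2|**2
    Critical curves are the contours det J = 0."""
    shear = sum(m / (z - zi) ** 2 for m, zi in zip(masses, positions))
    return 1.0 - abs(shear) ** 2
```

    For a single unit mass at the origin the zero contour is the Einstein ring |z| = 1.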

  17. Hazard analysis and critical control point (HACCP) for an ultrasound food processing operation.

    Science.gov (United States)

    Chemat, Farid; Hoarau, Nicolas

    2004-05-01

    Emerging technologies, such as ultrasound (US), used for food and drink production often cause hazards for product safety. Classical quality control methods are inadequate to control these hazards. Hazard analysis and critical control point (HACCP) is the most secure and cost-effective method for controlling possible product contamination or cross-contamination due to physical or chemical hazards during production. The following case study on the application of HACCP to an US food-processing operation demonstrates how the hazards at the critical control points of the process are effectively controlled through the implementation of HACCP.

  18. Comparative analysis among several methods used to solve the point kinetic equations

    International Nuclear Information System (INIS)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da

    2007-01-01

    The main objective of this work is to develop a methodology for comparing several methods for solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of determining which method consumes the least computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated in order to reach this goal. Through analysis of this performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)

  19. Comparative analysis among several methods used to solve the point kinetic equations

    Energy Technology Data Exchange (ETDEWEB)

    Nunes, Anderson L.; Goncalves, Alessandro da C.; Martinez, Aquilino S.; Silva, Fernando Carvalho da [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear; E-mails: alupo@if.ufrj.br; agoncalves@con.ufrj.br; aquilino@lmp.ufrj.br; fernando@con.ufrj.br

    2007-07-01

    The main objective of this work is to develop a methodology for comparing several methods for solving the point kinetics equations. The evaluated methods are: the finite differences method, the stiffness confinement method, the improved stiffness confinement method and the piecewise constant approximations method. These methods were implemented and compared through a systematic analysis that consists basically of determining which method consumes the least computational time with the highest precision. A relative performance factor, whose function is to combine both criteria, was calculated in order to reach this goal. Through analysis of this performance factor it is possible to choose the best method for the solution of the point kinetics equations. (author)
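
    As a concrete illustration of the simplest of the compared methods, the one-delayed-group point kinetics equations can be advanced with an explicit finite-difference (forward Euler) step. The kinetic parameters below are illustrative textbook-scale values, not those used by the authors:

```python
def point_kinetics_euler(rho, n0=1.0, beta=0.0065, Lambda=1e-4,
                         lam=0.08, dt=1e-5, t_end=0.1):
    """Solve the one-delayed-group point kinetics equations
        dn/dt = ((rho - beta)/Lambda) * n + lam * C
        dC/dt = (beta/Lambda) * n - lam * C
    with a plain forward Euler step, starting from precursor equilibrium.
    Returns the relative power n at t_end."""
    n = n0
    C = beta * n0 / (Lambda * lam)       # delayed-precursor equilibrium
    steps = int(round(t_end / dt))
    for _ in range(steps):
        dn = ((rho - beta) / Lambda) * n + lam * C
        dC = (beta / Lambda) * n - lam * C
        n += dt * dn
        C += dt * dC
    return n
```

    The small step size reflects the stiffness of the system that the confinement methods in the paper are designed to handle more efficiently.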

  20. Meta-Analysis of Effect Sizes Reported at Multiple Time Points Using General Linear Mixed Model

    Science.gov (United States)

    Musekiwa, Alfred; Manda, Samuel O. M.; Mwambi, Henry G.; Chen, Ding-Geng

    2016-01-01

    Meta-analysis of longitudinal studies combines effect sizes measured at pre-determined time points. The most common approach involves performing separate univariate meta-analyses at individual time points. This simplistic approach ignores dependence between longitudinal effect sizes, which might result in less precise parameter estimates. In this paper, we show how to conduct a meta-analysis of longitudinal effect sizes where we contrast different covariance structures for dependence between effect sizes, both within and between studies. We propose new combinations of covariance structures for the dependence between effect sizes, and utilize a practical example involving meta-analysis of 17 trials comparing postoperative treatments for a type of cancer, where survival is measured at 6, 12, 18 and 24 months post randomization. Although the results from this particular data set show the benefit of accounting for within-study serial correlation between effect sizes, simulations are required to confirm these results. PMID:27798661

  1. Optimal Systolic Blood Pressure Target After SPRINT: Insights from a Network Meta-Analysis of Randomized Trials.

    Science.gov (United States)

    Bangalore, Sripal; Toklu, Bora; Gianos, Eugenia; Schwartzbard, Arthur; Weintraub, Howard; Ogedegbe, Gbenga; Messerli, Franz H

    2017-06-01

    The optimal on-treatment blood pressure (BP) target has been a matter of debate. The recent SPRINT trial showed significant benefits of a BP target of <120 mm Hg, prompting this network meta-analysis. Seventeen trials that enrolled 55,163 patients with 204,103 patient-years of follow-up were included. There was a significant decrease in stroke (rate ratio [RR] 0.54; 95% confidence interval [CI], 0.29-1.00) and myocardial infarction (RR 0.68; 95% CI, 0.47-1.00) with systolic BP <120 mm Hg (vs <160 mm Hg). Sensitivity analysis using achieved systolic BP showed a 72%, 97%, and 227% increase in stroke with systolic BP of <140 mm Hg, <150 mm Hg, and <160 mm Hg, respectively, when compared with systolic BP <120 mm Hg. There was no difference in death, cardiovascular death, or heart failure when comparing any of the BP targets. However, the point estimate favored lower BP targets (<120 mm Hg, <130 mm Hg) when compared with higher BP targets (<140 mm Hg or <150 mm Hg). BP targets of <120 mm Hg and <130 mm Hg ranked #1 and #2, respectively, as the most efficacious targets. There was a significant increase in serious adverse effects with systolic BP <120 mm Hg vs <150 mm Hg (RR 1.83; 95% CI, 1.05-3.20) or vs <140 mm Hg (RR 2.12; 95% CI, 1.46-3.08). BP targets of <140 mm Hg and <150 mm Hg ranked #1 and #2, respectively, as the safest targets for the outcome of serious adverse effects. Cluster plots for combined efficacy and safety showed that a systolic BP target of <130 mm Hg had an optimal balance between efficacy and safety. In patients with hypertension, an on-treatment systolic BP target of <130 mm Hg achieved an optimal balance between efficacy and safety. Copyright © 2017 Elsevier Inc. All rights reserved.

  2. Analysis of Features for Synthetic Aperture Radar Target Classification

    Science.gov (United States)

    2015-03-26

    Features such as histograms of oriented gradients (HOG), percent bright constant false alarm rate (CFAR), and fractal dimension of the target in the image have been used and compared to training data for automatic target recognition (ATR).

  3. CRISPRTarget: bioinformatic prediction and analysis of crRNA targets

    NARCIS (Netherlands)

    Biswas, A.; Gagnon, J.N.; Brouns, S.J.J.; Fineran, P.C.; Brown, C.M.

    2013-01-01

    The bacterial and archaeal CRISPR/Cas adaptive immune system targets specific protospacer nucleotide sequences in invading organisms. This requires base pairing between processed CRISPR RNA and the target protospacer. For type I and II CRISPR/Cas systems, protospacer adjacent motifs (PAM) are
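
    The core target-prediction step can be illustrated with a toy exact-match scan that requires an NGG-style PAM 3' of the protospacer (a type II convention mentioned in the abstract); CRISPRTarget itself also scores mismatches and degenerate PAMs, so this is only a sketch:

```python
def find_protospacers(spacer, target):
    """Toy protospacer scan: report 0-based positions where the spacer
    sequence matches the target exactly and is followed by an NGG PAM.
    Real tools such as CRISPRTarget score partial matches as well."""
    hits = []
    k = len(spacer)
    for i in range(len(target) - k - 2):
        # exact protospacer match plus the GG of an NGG PAM
        if target[i:i + k] == spacer and target[i + k + 1:i + k + 3] == "GG":
            hits.append(i)
    return hits
```
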

  4. Trend analysis and change point detection of annual and seasonal temperature series in Peninsular Malaysia

    Science.gov (United States)

    Suhaila, Jamaludin; Yusop, Zulkifli

    2017-06-01

    Most trend analyses that have been conducted have not considered the existence of a change point in the time series. If a change point exists, trend analysis will not be able to detect an obvious increasing or decreasing trend over certain parts of the time series. Furthermore, the lack of discussion of the possible factors that influenced either the decreasing or the increasing trend in a series needs to be addressed in any trend analysis. Hence, this study investigates the trends and change-point detection of mean, maximum and minimum temperature series, both annually and seasonally, in Peninsular Malaysia, and determines the possible factors that could contribute to the significant trends. In this study, the Pettitt and sequential Mann-Kendall (SQ-MK) tests were used to examine the occurrence of any abrupt climate changes in the independent series. The analyses of the abrupt changes in temperature series suggested that most of the change points in Peninsular Malaysia were detected during the years 1996, 1997 and 1998. These change points captured by the Pettitt and SQ-MK tests are possibly related to climatic factors, such as El Niño and La Niña events. The findings also showed that the majority of the significant change points in the series are related to significant trends at the stations. Significant increasing trends of annual and seasonal mean, maximum and minimum temperatures in Peninsular Malaysia were found, with a range of 2-5 °C/100 years during the last 32 years. It was observed that the magnitudes of the increasing trends in minimum temperatures were larger than those in maximum temperatures for most of the studied stations, particularly the urban stations. These increases are suspected to be linked with the urban heat island effect in addition to the El Niño event.
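
    The Pettitt test used here for change-point detection is compact enough to sketch directly; the p-value below uses the usual large-sample approximation 2*exp(-6K^2/(T^3 + T^2)):

```python
import numpy as np

def pettitt_test(x):
    """Pettitt's rank-based change-point test (sketch): U_t sums the signs
    of differences across each candidate split; the largest |U_t| locates
    the most probable change point."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    U = np.array([np.sign(np.subtract.outer(x[t:], x[:t])).sum()
                  for t in range(1, T)])
    K = np.abs(U).max()
    tau = int(np.argmax(np.abs(U))) + 1   # series changes after x[tau - 1]
    p = 2.0 * np.exp(-6.0 * K**2 / (T**3 + T**2))
    return tau, min(p, 1.0)
```
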

  5. Analysis of Advantages and Disadvantages of the Location Methods of International Auricular Acupuncture Points.

    Science.gov (United States)

    Rong, Pei-Jing; Zhao, Jing-Jun; Wang, Lei; Zhou, Li-Qun

    2016-01-01

    The international standardization of auricular acupuncture points (AAPs) is an important basis for auricular therapy and auricular diagnosis and treatment. The study of the international standardization of AAPs has gone through a long process, in which the location method is one of the key research projects. There are different points of view on AAPs among experts from different countries and regions. By analyzing nine representative location methods, this paper attempts to identify a proper method for locating AAPs. Through analysis of the pros and cons of each location method, the location method applied in the WFAS international standard of AAPs is, after thorough consideration, judged to be an appropriate method. It is important to keep the right direction while developing an International Organization for Standardization (ISO) international standard of auricular acupuncture points and to improve the quality of research on international standardization of AAPs.

  6. Targeting gender: A content analysis of alcohol advertising in magazines.

    Science.gov (United States)

    Jung, A-Reum; Hovland, Roxanne

    2016-01-01

    Creating target specific advertising is fundamental to maximizing advertising effectiveness. When crafting an advertisement, message and creative strategies are considered important because they affect target audiences' attitudes toward advertised products. This study endeavored to find advertising strategies that are likely to have special appeal for men or women by examining alcohol advertising in magazines. The results show that the substance of the messages is the same for men and women, but they only differ in terms of presentation. However, regardless of gender group, the most commonly used strategies in alcohol advertising are appeals to the target audience's emotions.

  7. Point-of-Care Hemoglobin A1c Testing: An Evidence-Based Analysis.

    Science.gov (United States)

    2014-01-01

    The increasing prevalence of diabetes in Ontario means that there will be growing demand for hemoglobin A1c (HbA1c) testing to monitor glycemic control for the management of this chronic disease. Testing HbA1c where patients receive their diabetes care may improve system efficiency if the results from point-of-care HbA1c testing are comparable to those from laboratory HbA1c measurements. To review the correlation between point-of-care HbA1c testing and laboratory HbA1c measurement in patients with diabetes in clinical settings. The literature search included studies published between January 2003 and June 2013. Search terms included glycohemoglobin, hemoglobin A1c, point of care, and diabetes. Studies were included if participants had diabetes; if they compared point-of-care HbA1c devices (licensed by Health Canada and available in Canada) with laboratory HbA1c measurement (reference method); if they performed point-of-care HbA1c testing using capillary blood samples (finger pricks) and laboratory HbA1c measurement using venous blood samples within 7 days; and if they reported a correlation coefficient between point-of-care HbA1c and laboratory HbA1c results. Three point-of-care HbA1c devices were reviewed in this analysis: Bayer's A1cNow+, Bio-Rad's In2it, and Siemens' DCA Vantage. Five observational studies met the inclusion criteria. The pooled results showed a positive correlation between point-of-care HbA1c testing and laboratory HbA1c measurement (correlation coefficient, 0.967; 95% confidence interval, 0.960-0.973). Outcomes were limited to the correlation coefficient, as this was a commonly reported measure of analytical performance in the literature. Results should be interpreted with caution due to risk of bias related to selection of participants, reference standards, and the multiple steps involved in point-of-care HbA1c testing. Moderate quality evidence showed a positive correlation between point-of-care HbA1c testing and laboratory HbA1c measurement.
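
    Pooling correlation coefficients across studies is typically done on Fisher's z scale. A fixed-effect sketch follows; the review's exact weighting scheme is not stated in this record, so the inverse-variance weights below are an assumption:

```python
import math

def pool_correlations(pairs):
    """Fixed-effect pooling of correlation coefficients via Fisher's
    z-transform. Each item is (r, n); the variance of z is 1/(n - 3).
    Returns the pooled r and its 95% confidence interval."""
    num = den = 0.0
    for r, n in pairs:
        z = math.atanh(r)      # Fisher z-transform
        w = n - 3              # inverse variance weight
        num += w * z
        den += w
    z_bar = num / den
    se = math.sqrt(1.0 / den)
    lo, hi = z_bar - 1.96 * se, z_bar + 1.96 * se
    return math.tanh(z_bar), (math.tanh(lo), math.tanh(hi))
```
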

  8. Pasteurised milk and implementation of HACCP (Hazard Analysis Critical Control Point

    Directory of Open Access Journals (Sweden)

    T.B Murdiati

    2004-10-01

    The purpose of pasteurisation is to destroy pathogenic bacteria without affecting the taste, flavor, and nutritional value. A study on the implementation of HACCP (Hazard Analysis Critical Control Point) in producing pasteurised milk was carried out in four processing units of pasteurised milk, one in Jakarta, two in Bandung and one in Bogor. The critical control points in the production line were identified. Milk samples were collected from the critical points and were analysed for the total number of microbes. Antibiotic residues were detected in raw milk. The study indicated that one unit in Bandung and one unit in Jakarta produced pasteurised milk with lower numbers of microbes than the other units, due to better management and control applied along the production chain. Penicillin residues were detected in the raw milk used by the unit in Bogor. Six critical points were identified, along with the hazards that might arise at those points and how to prevent them. A quality assurance system such as HACCP would be able to produce pasteurised milk of high quality and safety, and should be implemented gradually.

  9. Field evaluation of a rapid point-of-care assay for targeting antibiotic treatment for trachoma control: a comparative study.

    Science.gov (United States)

    Michel, Claude-Edouard C; Solomon, Anthony W; Magbanua, Jose P V; Massae, Patrick A; Huang, Ling; Mosha, Jonaice; West, Sheila K; Nadala, Elpidio C B; Bailey, Robin; Wisniewski, Craig; Mabey, David C W; Lee, Helen H

    2006-05-13

    Trachoma results from repeated episodes of conjunctival infection with Chlamydia trachomatis and is the leading infectious cause of blindness. To eliminate trachoma, control programmes use the SAFE strategy (Surgery, Antibiotics, Face cleanliness, and Environmental improvement). The A component is designed to treat C trachomatis infection, and is initiated on the basis of the prevalence of the clinical sign trachomatous inflammation-follicular (TF). Unfortunately, TF correlates poorly with C trachomatis infection. We sought to assess a newly developed point-of-care (POC) assay compared with presence of TF for guiding the use of antibiotics for trachoma control. We compared performance outcomes of the POC assay and presence of TF using commercial PCR as a comparator in 664 children aged 1-9 years in remote, trachoma-endemic villages in Tanzania. Signs of trachoma were graded according to the WHO simplified trachoma grading system. Of 664 participants, 128 (19%) were positive for ocular C trachomatis infection by PCR. Presence of TF had a sensitivity of 64.1% (95% CI 55.8-72.4), specificity of 80.2% (76.8-83.6), and positive predictive value of 43.6% (36.5-50.7). By contrast, the POC assay had a sensitivity of 83.6% (77.2-90.0), specificity of 99.4% (98.8-100.0), and positive predictive value of 97.3% (94.2-100.3). Inter-operator and intra-operator agreements among four novice operators were 0.988 (0.973-1.000) and 0.950 (0.894-1.000), respectively. The POC assay is substantially more accurate than TF prevalence in identifying the presence or absence of infection. Additional studies should assess the use of the assay in the planning and monitoring of trachoma control activities.
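
    The sensitivity, specificity and positive predictive value compared above come directly from the 2x2 table of test result versus PCR status. A small sketch with simple Wald 95% intervals (the study's own interval method is not stated in this record):

```python
import math

def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a
    2x2 diagnostic table, each with a simple Wald 95% confidence interval."""
    def prop_ci(k, n):
        p = k / n
        half = 1.96 * math.sqrt(p * (1.0 - p) / n)
        return p, (p - half, p + half)
    return {
        "sensitivity": prop_ci(tp, tp + fn),  # positives correctly detected
        "specificity": prop_ci(tn, tn + fp),  # negatives correctly cleared
        "ppv": prop_ci(tp, tp + fp),          # how trustworthy a positive is
    }
```

    An unbounded Wald interval can exceed 100%, which is why the PPV interval quoted above reaches 100.3%.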

  10. System implementation of hazard analysis and critical control points (HACCP) in a nitrogen production plant

    International Nuclear Information System (INIS)

    Barrantes Salazar, Alexandra

    2014-01-01

    A hazard analysis and critical control points system was deployed in a liquid nitrogen production plant. The main motivation is that nitrogen has become a complement to food packaging, used to increase shelf life or to provide an atmosphere that protects the product from manipulation. The methodology adopted was critical control point analysis for the nitrogen production plant. Knowledge of both the standard and the production process, as well as on-site verification, was necessary. In addition, all materials and/or processing units that come into contact with the raw material or the product under study were evaluated, such that the intrinsic physical, chemical and biological risks of each were identified according to the origin or source of contamination. For each risk found, the probability of occurrence was evaluated according to its frequency and severity; with these variables determined, the type of risk detected was defined. Cases presenting greater or critical risk were subjected to a decision tree, from which it was concluded that no critical control points were determined. However, maximum permitted limits were established for each of them. Each result is supported by literature or scientific references of reliable provenance, which properly indicate the basis of the matter evaluated. In general, the material matrix and the process matrix were found to have no critical control points, so the project concludes at the analysis stage, without generating a monitoring and verification system. Extending this project to cover the gaseous nitrogen packaging system is suggested, since it was limited to liquid nitrogen. Furthermore, liquid nitrogen production is a fully automated, closed process, so the introduction of contaminants is very limited, unlike the gaseous nitrogen process. (author)

  11. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Directory of Open Access Journals (Sweden)

    Ting Zhang

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology; these were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in the seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with a high relative amount or an increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve, "electrical medical equipment" with four, and "diagnosis, surgery" with four. The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and

  12. Identification of the Key Fields and Their Key Technical Points of Oncology by Patent Analysis.

    Science.gov (United States)

    Zhang, Ting; Chen, Juan; Jia, Xiaofeng

    2015-01-01

    This paper aims to identify the key fields and their key technical points of oncology by patent analysis. Patents of oncology applied from 2006 to 2012 were searched in the Thomson Innovation database. The key fields and their key technical points were determined by analyzing the Derwent Classification (DC) and the International Patent Classification (IPC), respectively. Patent applications in the top ten DC occupied 80% of all the patent applications of oncology, which were the ten fields of oncology to be analyzed. The number of patent applications in these ten fields of oncology was standardized based on patent applications of oncology from 2006 to 2012. For each field, standardization was conducted separately for each of the seven years (2006-2012) and the mean of the seven standardized values was calculated to reflect the relative amount of patent applications in that field; meanwhile, regression analysis using time (year) and the standardized values of patent applications in seven years (2006-2012) was conducted so as to evaluate the trend of patent applications in each field. Two-dimensional quadrant analysis, together with the professional knowledge of oncology, was taken into consideration in determining the key fields of oncology. The fields located in the quadrant with high relative amount or increasing trend of patent applications are identified as key ones. By using the same method, the key technical points in each key field were identified. Altogether 116,820 patents of oncology applied from 2006 to 2012 were retrieved, and four key fields with twenty-nine key technical points were identified, including "natural products and polymers" with nine key technical points, "fermentation industry" with twelve ones, "electrical medical equipment" with four ones, and "diagnosis, surgery" with four ones. 
The results of this study could provide guidance on the development direction of oncology, and also help researchers broaden innovative ideas and discover new
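
    The standardize-then-regress step described above can be sketched as follows; the field names come from the abstract, but the yearly counts are hypothetical:

```python
import numpy as np

# counts[field][year]; the numbers are illustrative, not the study's data
counts = np.array([
    [120, 135, 150, 160, 170, 182, 195],   # natural products and polymers
    [ 90, 100, 108, 118, 130, 141, 150],   # fermentation industry
    [200, 198, 202, 199, 201, 197, 203],   # electrical medical equipment
], dtype=float)
years = np.arange(2006, 2013)
names = ["natural products and polymers",
         "fermentation industry",
         "electrical medical equipment"]

# Standardize each year's column across fields (z-scores per year),
# mirroring the paper's per-year standardization.
z = (counts - counts.mean(axis=0)) / counts.std(axis=0)

for i, name in enumerate(names):
    relative_amount = z[i].mean()           # relative volume of the field
    slope = np.polyfit(years, z[i], 1)[0]   # trend of the field over time
    # Simplified quadrant rule for the sketch: high volume or rising trend
    quadrant = "key field" if (relative_amount > 0 or slope > 0) else "other"
    print(f"{name}: mean z = {relative_amount:+.2f}, "
          f"slope = {slope:+.3f} -> {quadrant}")
```

    The two numbers per field (mean standardized volume, trend slope) place it in one quadrant of the paper's two-dimensional analysis.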

  13. Gene microarray data analysis using parallel point-symmetry-based clustering.

    Science.gov (United States)

    Sarkar, Anasua; Maulik, Ujjwal

    2015-01-01

    Identification of co-expressed genes is the central goal in microarray gene expression analysis. Point-symmetry-based clustering is an important unsupervised learning technique for recognising symmetrical convex- or non-convex-shaped clusters. To enable fast clustering of large microarray data, we propose a distributed time-efficient scalable approach for point-symmetry-based K-Means algorithm. A natural basis for analysing gene expression data using symmetry-based algorithm is to group together genes with similar symmetrical expression patterns. This new parallel implementation also satisfies linear speedup in timing without sacrificing the quality of clustering solution on large microarray data sets. The parallel point-symmetry-based K-Means algorithm is compared with another new parallel symmetry-based K-Means and existing parallel K-Means over eight artificial and benchmark microarray data sets, to demonstrate its superiority, in both timing and validity. The statistical analysis is also performed to establish the significance of this message-passing-interface based point-symmetry K-Means implementation. We also analysed the biological relevance of clustering solutions.
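
    The point-symmetry distance underlying such clustering can be sketched as follows; this is a serial illustration of the Su/Chou-style distance, not the authors' parallel MPI implementation:

```python
import numpy as np

def point_symmetry_distance(x, center, data):
    """Point-symmetry distance of x w.r.t. a cluster center: reflect x
    through the center and measure how close the reflection lies to an
    actual data point, weighted by the Euclidean distance to the center."""
    reflected = 2.0 * center - x
    d_sym = np.min(np.linalg.norm(data - reflected, axis=1))
    return d_sym * np.linalg.norm(x - center)

def assign(data, centers):
    """K-Means-style assignment step using the symmetry distance."""
    labels = []
    for x in data:
        dists = [point_symmetry_distance(x, c, data) for c in centers]
        labels.append(int(np.argmin(dists)))
    return np.array(labels)

# Two roughly symmetric blobs around (0, 0) and (10, 10)
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(10, 1, (50, 2))])
centers = np.array([[0.0, 0.0], [10.0, 10.0]])
labels = assign(data, centers)
```

    The parallel version distributes the expensive nearest-reflection search across processes; the distance itself is unchanged.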

  14. Performance Analysis on ISAR Imaging of Space Targets

    Directory of Open Access Journals (Sweden)

    Zhou Yejian

    2017-02-01

    In traditional Inverse Synthetic Aperture Radar (ISAR) system design and mode selection for space satellite targets, the coherent integration gain in the azimuth direction can hardly be analyzed, because it depends on the target's motion. In this study, we combine the target's orbit parameters to determine its motion relative to the radar and derive a coherent integration equation for ISAR imaging, enabling the selection of imaging intervals based on coherent integration, which ensures the resolution in the azimuth direction. Meanwhile, we analyze the influence of target orbit altitude on echo power and imaging Signal-to-Noise Ratio (SNR), which provides a new indicator for the design of space-observation ISAR systems. The simulation results illustrate that as target orbit altitude increases, the coherent integration gain in the azimuth direction of large-angle observation offsets the decrease in imaging SNR to a degree, which provides a new perspective for space-observation ISAR system and signal-processing design.

  15. Monetary targeting and financial system characteristics : An empirical analysis

    NARCIS (Netherlands)

    Samarina, A.

    2012-01-01

    This paper investigates how reforms and characteristics of the financial system affect the likelihood of countries to abandon their strategy of monetary targeting. Apart from financial system characteristics, we include macroeconomic, fiscal, and institutional factors potentially associated with

  16. Integrated targeted and non-targeted analysis of water sample extracts with micro-scale UHPLC–MS

    Directory of Open Access Journals (Sweden)

    Dominik Deyerling

    2015-01-01

    • Filtering database hits on two criteria (exact mass and partition coefficient) significantly reduced the list of suspects and at the same time made it possible to perform non-target analysis with lower mass accuracy (no lock-spray) in the range of 20–500 ppm.

  17. Visualizing nD Point Clouds as Topological Landscape Profiles to Guide Local Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oesterling, Patrick [Univ. of Leipzig (Germany). Computer Science Dept.; Heine, Christian [Univ. of Leipzig (Germany). Computer Science Dept.; Federal Inst. of Technology (ETH), Zurich (Switzerland). Dept. of Computer Science; Weber, Gunther H. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Scheuermann, Gerik [Univ. of Leipzig (Germany). Computer Science Dept.

    2012-05-04

    Analyzing high-dimensional point clouds is a classical challenge in visual analytics. Traditional techniques, such as projections or axis-based techniques, suffer from projection artifacts, occlusion, and visual complexity. We propose to split data analysis into two parts to address these shortcomings. First, a structural overview phase abstracts data by its density distribution. This phase performs topological analysis to support accurate and non-overlapping presentation of the high-dimensional cluster structure as a topological landscape profile. Utilizing a landscape metaphor, it presents clusters and their nesting as hills whose height, width, and shape reflect cluster coherence, size, and stability, respectively. A second local analysis phase utilizes this global structural knowledge to select individual clusters or point sets for further, localized data analysis. Focusing on structural entities significantly reduces visual clutter in established geometric visualizations and permits a clearer, more thorough data analysis. In conclusion, this analysis complements the global topological perspective and enables the user to study subspaces or geometric properties, such as shape.

  18. Fuzzy Risk Analysis for a Production System Based on the Nagel Point of a Triangle

    Directory of Open Access Journals (Sweden)

    Handan Akyar

    2016-01-01

    Ordering and ranking fuzzy numbers and their comparisons play a significant role in decision-making problems such as social and economic systems, forecasting, optimization, and risk analysis problems. In this paper, a new method for ordering triangular fuzzy numbers using the Nagel point of a triangle is presented. With the aid of the proposed method, reasonable properties of ordering fuzzy numbers are verified. Certain comparative examples are given to illustrate the advantages of the new method. Many papers have been devoted to studies on fuzzy ranking methods, but some of these studies have certain shortcomings. The proposed method overcomes the drawbacks of the existing methods in the literature. The suggested method can order triangular fuzzy numbers as well as crisp numbers and fuzzy numbers with the same centroid point. An application to the fuzzy risk analysis problem is given, based on the suggested ordering approach.
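
    The Nagel point itself is straightforward to compute from its barycentric coordinates (s−a : s−b : s−c); the mapping from a triangular fuzzy number to a triangle and the ranking rule are the paper's contribution, so the snippet below only illustrates the geometric ingredient:

```python
import math

def nagel_point(A, B, C):
    """Nagel point of triangle ABC, from barycentric coordinates
    (s-a : s-b : s-c), where a, b, c are the side lengths opposite
    vertices A, B, C and s is the semiperimeter."""
    a = math.dist(B, C)   # side opposite A
    b = math.dist(A, C)   # side opposite B
    c = math.dist(A, B)   # side opposite C
    s = (a + b + c) / 2.0
    wa, wb, wc = s - a, s - b, s - c   # barycentric weights, summing to s
    return ((wa * A[0] + wb * B[0] + wc * C[0]) / s,
            (wa * A[1] + wb * B[1] + wc * C[1]) / s)

# A triangular fuzzy number (l, m, u) can be drawn as the triangle with
# vertices (l, 0), (m, 1), (u, 0); for an equilateral triangle the Nagel
# point coincides with the centroid.
print(nagel_point((0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3) / 2)))
```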

  19. Python Spectral Analysis Tool (PySAT) for Preprocessing, Multivariate Analysis, and Machine Learning with Point Spectra

    Science.gov (United States)

    Anderson, R. B.; Finch, N.; Clegg, S.; Graff, T.; Morris, R. V.; Laura, J.

    2017-06-01

    We present a Python-based library and graphical interface for the analysis of point spectra. The tool is being developed with a focus on methods used for ChemCam data, but is flexible enough to handle spectra from other instruments.

  20. A Deep Learning Prediction Model Based on Extreme-Point Symmetric Mode Decomposition and Cluster Analysis

    OpenAIRE

    Li, Guohui; Zhang, Songling; Yang, Hong

    2017-01-01

    Aiming at the irregularity of nonlinear signals and the difficulty of predicting them, a deep learning prediction model based on extreme-point symmetric mode decomposition (ESMD) and cluster analysis is proposed. Firstly, the original data are decomposed by ESMD to obtain a finite number of intrinsic mode functions (IMFs) and a residual. Secondly, fuzzy c-means is used to cluster the decomposed components, and a deep belief network (DBN) is then used to predict each cluster. Finally, the reconstructed ...

  1. PREMOR: a point reactor exposure model computer code for survey analysis of power plant performance

    International Nuclear Information System (INIS)

    Vondy, D.R.

    1979-10-01

    The PREMOR computer code was written to exploit a simple, two-group point nuclear reactor power plant model for survey analysis. Up to thirteen actinides, fourteen fission products, and one lumped absorber nuclide density are followed over a reactor history. Successive feed batches are accounted for with provision for from one to twenty batches resident. The effect of exposure of each of the batches to the same neutron flux is determined.
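
    The nuclide bookkeeping such a point-exposure code performs can be illustrated with a one-group, two-nuclide Bateman sketch; the cross sections and flux below are illustrative values, not PREMOR data:

```python
import math

# A parent nuclide burns out under a constant one-group flux while its
# capture product builds in (simplest case of the depletion chains a
# point-exposure code follows).
sigma_parent = 100e-24    # absorption cross section, cm^2 (100 barns, assumed)
sigma_child  = 50e-24     # cm^2 (assumed)
phi = 1e14                # neutron flux, n/cm^2/s
N0 = 1e20                 # initial parent density, atoms/cm^3

def densities(t):
    """Parent and child densities after irradiation time t (seconds)."""
    lp, lc = sigma_parent * phi, sigma_child * phi
    parent = N0 * math.exp(-lp * t)
    # Bateman solution for the capture product (initially absent)
    child = N0 * lp / (lc - lp) * (math.exp(-lp * t) - math.exp(-lc * t))
    return parent, child

one_year = 3.156e7  # seconds
p, c = densities(one_year)
print(f"after one year: parent {p:.3e}, child {c:.3e} atoms/cm^3")
```

    A survey code repeats this bookkeeping for every nuclide and feed batch resident in the core.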

  3. Identification of estrogen target genes during zebrafish embryonic development through transcriptomic analysis.

    Directory of Open Access Journals (Sweden)

    Ruixin Hao

    Estrogen signaling is important for vertebrate embryonic development. Here we have used zebrafish (Danio rerio) as a vertebrate model to analyze estrogen signaling during development. Zebrafish embryos were exposed to 1 µM 17β-estradiol (E2) or vehicle from 3 hours to 4 days post fertilization (dpf), harvested at 1, 2, 3 and 4 dpf, and subjected to RNA extraction for transcriptome analysis using microarrays. Genes differentially expressed upon E2 treatment were analyzed with hierarchical clustering followed by biological-process and tissue-enrichment analysis. Markedly distinct sets of genes were up- and down-regulated by E2 at the four different time points. Among these genes, only the well-known estrogenic marker vtg1 was co-regulated at all time points. Despite this, the biological functional categories targeted by E2 were relatively similar throughout zebrafish development. According to knowledge-based tissue enrichment, estrogen-responsive genes clustered mainly in the liver, pancreas and brain. This was in line with the developmental dynamics of estrogen-target tissues, which were visualized using transgenic zebrafish containing estrogen-responsive elements driving the expression of GFP (Tg(5xERE:GFP)). Finally, the identified embryonic estrogen-responsive genes were compared to previously published estrogen-responsive genes identified in male adult zebrafish (Gene Expression Omnibus database). The expression of a few genes was co-regulated by E2 in both embryonic and adult zebrafish; these could potentially be used as biomarkers of exposure to estrogens or estrogenic endocrine disruptors in zebrafish. In conclusion, our data suggest that estrogen effects on early embryonic zebrafish development are stage- and tissue-specific.

  4. Analysis method of beam pointing stability based on optical transmission matrix

    Science.gov (United States)

    Wang, Chuanchuan; Huang, PingXian; Li, Xiaotong; Cen, Zhaofen

    2016-10-01

    Many factors affect the beam pointing stability of an optical system; among them, element tolerance is one of the most important and most common. In some large laser systems, it causes the final beam spots on the image plane to deviate noticeably, so an effective and accurate theoretical analysis of element tolerance is essential. To make the analysis of beam pointing stability convenient and theoretical, we approximate the whole spot deviation by the transmission of a single chief ray rather than the full beam. Using optical matrices, we simplify the complex process of light transmission to the multiplication of a series of matrices. On this basis we set up an element tolerance model, that is, a mathematical expression for the spot deviation of an optical system with element tolerances, which allows a quantitative theoretical analysis of beam pointing stability. In the second half of the paper, we design an experiment to measure the spot deviation caused by element tolerance in a multipass optical system, adjust the tolerance step by step, compare the results with the data obtained from the tolerance model, and finally verify the correctness of the tolerance model.
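
    The matrix-multiplication idea can be sketched with standard ABCD ray-transfer matrices; the layout, focal length, and tilt tolerance below are hypothetical, not the paper's system:

```python
import numpy as np

# Chief-ray state is (height y, angle u); elements are 2x2 transfer
# matrices, and a tilt tolerance on a fold mirror adds a small angular
# perturbation (2 * tilt on reflection).
def free_space(d):
    return np.array([[1.0, d], [0.0, 1.0]])

def thin_lens(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

ray = np.array([0.0, 0.0])   # nominal on-axis chief ray
tilt = 1e-4                  # rad, assumed element tilt tolerance

# Tilted fold mirror, then 0.5 m to a lens (f = 0.2 m), then one focal
# length to the image plane.
ray[1] += 2 * tilt
ray = thin_lens(0.2) @ free_space(0.5) @ ray
ray = free_space(0.2) @ ray
print(f"spot deviation on image plane: {ray[0]*1e6:.1f} um")
```

    Chaining one matrix per element makes it easy to attribute the final spot deviation to each tolerance in turn.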

  5. Point-of-Care Hemoglobin A1c Testing: A Budget Impact Analysis.

    Science.gov (United States)

    Chadee, A; Blackhouse, G; Goeree, R

    2014-01-01

    The increasing prevalence of diabetes in Ontario means that there will be growing demand for hemoglobin A1c (HbA1c) testing to monitor glycemic control as part of managing this chronic disease. Testing HbA1c where patients receive their diabetes care may improve system efficiency if the results from point-of-care HbA1c testing are comparable to those from laboratory HbA1c measurements. To estimate the budget impact of point-of-care HbA1c testing to replace laboratory HbA1c measurement for monitoring glycemic control in patients with diabetes in 2013/2014. This analysis compared the average testing cost of 3 point-of-care HbA1c devices licensed by Health Canada and available on the market in Canada (Bayer's A1cNow+, Siemens's DCA Vantage, and Bio Rad's In2it), with that of the laboratory HbA1c reference method. The cost difference between point-of-care HbA1c testing and laboratory HbA1c measurement was calculated. Costs and the corresponding range of net impact were estimated in sensitivity analyses. The total annual costs of laboratory HbA1c measurement and point-of-care HbA1c testing for 2013/2014 were $91.5 million and $86.8 million, respectively. Replacing all laboratory HbA1c measurements with point-of-care HbA1c testing would save approximately $4.7 million over the next year. Savings could be realized by the health care system at each level that point-of-care HbA1c testing is substituted for laboratory HbA1c measurement. If physician fees were excluded from the analysis, the health care system would incur a net impact from using point-of-care HbA1c testing instead of laboratory A1c measurement. Point-of-care HbA1c technology is already in use in the Ontario health care system, but the current uptake is unclear. Knowing the adoption rate and market share of point-of-care HbA1c technology would allow for a more accurate estimate of budget impact. Replacing laboratory HbA1c measurement with point-of-care HbA1c testing or using point-of-care HbA1c testing in
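
    The headline budget figures reduce to simple arithmetic, sketched here; the 50% uptake scenario is hypothetical, not from the study:

```python
# Back-of-envelope check of the abstract's 2013/14 budget figures.
lab_total = 91.5e6   # annual cost, laboratory HbA1c measurement
poc_total = 86.8e6   # annual cost, point-of-care HbA1c testing

savings = lab_total - poc_total        # full-replacement savings

# Hypothetical partial-uptake scenario (adoption rate is unknown,
# as the abstract notes).
share_replaced = 0.5
partial_savings = savings * share_replaced

print(f"full replacement saves ${savings/1e6:.1f}M; "
      f"{share_replaced:.0%} uptake saves ${partial_savings/1e6:.2f}M")
```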

  6. Harmonic Analysis of DC-Link Capacitor Current in Sinusoidally Modulated Neutral-Point-Clamped Inverter

    OpenAIRE

    Gopalakrishnan, KS; Narayanan, G

    2013-01-01

    The voltage ripple and power loss in the DC-capacitor of a voltage source inverter depend on the harmonic currents flowing through the capacitor. This paper presents double Fourier series based harmonic analysis of DC capacitor current in a three-level neutral point clamped inverter, modulated with sine-triangle PWM. The analytical results are validated experimentally on a 5-kVA three-level inverter prototype. The results of the analysis are used for predicting the power loss in the DC cap...

  7. Analysis and Segmentation of Face Images using Point Annotations and Linear Subspace Techniques

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille

    2002-01-01

    This report provides an analysis of 37 annotated frontal face images. All results presented have been obtained using our freely available Active Appearance Model (AAM) implementation. To ensure the reproducibility of the presented experiments, the data set has also been made available. As such, the data and this report may serve as a point of reference against which to compare other AAM implementations. In addition, we address the problem of AAM model truncation using parallel analysis, along with a comparative study of the two prevalent AAM learning methods: principal component regression and estimation...

  8. Analysis of intraosseous blood samples using an EPOC point of care analyzer during resuscitation.

    Science.gov (United States)

    Tallman, Crystal Ives; Darracq, Michael; Young, Megann

    2017-03-01

    In the early phases of resuscitation in a critically ill patient, especially those in cardiac arrest, intravenous (IV) access can be difficult to obtain. Intraosseous (IO) access is often used in these critical situations to allow medication administration. When no IV access is available, it is difficult to obtain blood for point of care analysis, yet this information can be crucial in directing the resuscitation. We hypothesized that IO samples may be used with a point of care device to obtain useful information when seconds really do matter. Patients presenting to the emergency department requiring resuscitation and IO placement were prospectively enrolled in a convenience sample. 17 patients were enrolled. IO and IV samples obtained within five minutes of one another were analyzed using separate EPOC® point of care analyzers. Analytes were compared using Bland Altman Plots and intraclass correlation coefficients. In this analysis of convenience sampled critically ill patients, the EPOC® point of care analyzer provided results from IO samples. IO and IV samples were most comparable for pH, bicarbonate, sodium and base excess, and potentially for lactic acid; single outliers for bicarbonate, sodium and base excess were observed. Intraclass correlation coefficients were excellent for sodium and reasonable for pH, pO2, bicarbonate, and glucose. Correlations for other variables measured by the EPOC® analyzer were not as robust. IO samples can be used with a bedside point of care analyzer to rapidly obtain certain laboratory information during resuscitations when IV access is difficult. Copyright © 2016 Elsevier Inc. All rights reserved.
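
    The Bland-Altman comparison used in the study can be sketched as follows, with synthetic paired values in place of the patient data:

```python
import numpy as np

# Synthetic paired IO/IV measurements (e.g. pH); not the study's data.
io = np.array([7.31, 7.28, 7.35, 7.22, 7.40, 7.33, 7.26])
iv = np.array([7.33, 7.27, 7.36, 7.25, 7.38, 7.35, 7.27])

diff = io - iv
bias = diff.mean()                 # systematic offset between methods
loa = 1.96 * diff.std(ddof=1)      # half-width of the 95% limits of agreement
print(f"bias = {bias:+.3f}, limits of agreement = "
      f"[{bias - loa:.3f}, {bias + loa:.3f}]")
```

    Agreement is judged by whether the limits of agreement are narrow enough to be clinically acceptable for the analyte in question.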

  9. Genome-wide analysis of Polycomb targets in Drosophila

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, Yuri B.; Kahn, Tatyana G.; Nix, David A.; Li,Xiao-Yong; Bourgon, Richard; Biggin, Mark; Pirrotta, Vincenzo

    2006-04-01

    Polycomb Group (PcG) complexes are multiprotein assemblages that bind to chromatin and establish chromatin states leading to epigenetic silencing. PcG proteins regulate homeotic genes in flies and vertebrates but little is known about other PcG targets and the role of the PcG in development, differentiation and disease. We have determined the distribution of the PcG proteins PC, E(Z) and PSC and of histone H3K27 trimethylation in the Drosophila genome. At more than 200 PcG target genes, binding sites for the three PcG proteins colocalize to presumptive Polycomb Response Elements (PREs). In contrast, H3 me3K27 forms broad domains including the entire transcription unit and regulatory regions. PcG targets are highly enriched in genes encoding transcription factors but receptors, signaling proteins, morphogens and regulators representing all major developmental pathways are also included.

  10. Targets for bulk hydrogen analysis using thermal neutrons

    CERN Document Server

    Csikai, J; Buczko, C M

    2002-01-01

    The reflection property of substances can be characterized by the reflection cross-section of thermal neutrons, σβ. A combination of the targets with thin polyethylene foils allowed an estimation of the flux depression of thermal neutrons caused by a bulk sample containing highly absorbing elements or compounds. Some new and more accurate σβ values were determined by using the combined target arrangement. For the ratio R of the reflection and elastic scattering cross-sections of thermal neutrons, R = σβ/σel, a value of 0.60 ± 0.02 was found on the basis of the data obtained for a number of elements from H to Pb. Using this correlation factor and the σel values, unknown σβ data can be deduced. The equivalent thicknesses, in polyethylene or hydrogen, of the different target materials were determined from the σβ values.
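
    The reported correlation turns tabulated elastic cross-sections into reflection cross-section estimates; a minimal sketch, where the 20-barn input value is illustrative rather than from the paper:

```python
# Empirical ratio from the abstract: R = sigma_beta / sigma_el = 0.60 +/- 0.02
R, dR = 0.60, 0.02

def sigma_beta(sigma_el):
    """Estimate the thermal-neutron reflection cross-section (same units
    as sigma_el) and its uncertainty from the R correlation."""
    return R * sigma_el, dR * sigma_el

est, unc = sigma_beta(20.0)   # e.g. a 20-barn elastic cross-section
print(f"sigma_beta = {est:.1f} +/- {unc:.1f} barn")
```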

  11. Orbit Determination Error Analysis Results for the Triana Sun-Earth L2 Libration Point Mission

    Science.gov (United States)

    Marr, G.

    2003-01-01

    Using the NASA Goddard Space Flight Center's Orbit Determination Error Analysis System (ODEAS), orbit determination error analysis results are presented for all phases of the Triana Sun-Earth L1 libration point mission and for the science data collection phase of a future Sun-Earth L2 libration point mission. The Triana spacecraft was nominally to be released by the Space Shuttle in a low Earth orbit, and this analysis focuses on that scenario. From the release orbit, a transfer trajectory insertion (TTI) maneuver performed using a solid stage would increase the velocity by approximately 3.1 km/sec, sending Triana on a direct trajectory to its mission orbit. The Triana mission orbit is a Sun-Earth L1 Lissajous orbit with a Sun-Earth-vehicle (SEV) angle between 4.0 and 15.0 degrees, which would be achieved after a Lissajous orbit insertion (LOI) maneuver at approximately launch plus 6 months. Because Triana was to be launched by the Space Shuttle, TTI could potentially occur over a range of 16 orbits from low Earth orbit. This analysis assumed TTI was performed from a low Earth orbit with an inclination of 28.5 degrees, with support from a combination of three Deep Space Network (DSN) stations, Goldstone, Canberra, and Madrid, and four commercial Universal Space Network (USN) stations, Alaska, Hawaii, Perth, and Santiago. These ground stations would provide coherent two-way range and range-rate tracking data usable for orbit determination. Larger range and range-rate errors were assumed for the USN stations. Nominally, DSN support would end at TTI+144 hours assuming there were no USN problems. Post-TTI coverage for a range of TTI longitudes was analyzed for a given nominal trajectory case. The orbit determination error analysis after the first correction maneuver would be generally applicable to any libration point mission utilizing a direct trajectory.

  12. COREnet: The Fusion of Social Network Analysis and Target Audience Analysis

    Science.gov (United States)

    2014-12-01

    be integrated into an HTML5 web-based Target Audience Analysis Worksheet (TAAW)? D. SCOPE AND METHODOLOGY This is a three-phase capstone project. The...Harrison & S. Huntington (Eds.), Culture matters: How values shape human progress (pp. 98–111). New York: Basic Books. Gauchat, J. D. (2012). HTML5 for...networks to inform tactical engagement strategies that will influence the human domain. Small Wars Journal. MacDonald, M. (2012). HTML5: The

  13. Portable Dew Point Mass Spectrometry System for Real-Time Gas and Moisture Analysis

    Science.gov (United States)

    Arkin, C.; Gillespie, Stacey; Ratzel, Christopher

    2010-01-01

    A portable instrument incorporates both mass spectrometry and dew point measurement to provide real-time, quantitative gas measurements of helium, nitrogen, oxygen, argon, and carbon dioxide, along with real-time, quantitative moisture analysis. The Portable Dew Point Mass Spectrometry (PDP-MS) system comprises a single quadrupole mass spectrometer and a high vacuum system consisting of a turbopump and a diaphragm-backing pump. A capacitive membrane dew point sensor was placed upstream of the MS, but still within the pressure-flow control pneumatic region. Pressure-flow control was achieved with an upstream precision metering valve, a capacitance diaphragm gauge, and a downstream mass flow controller. User configurable LabVIEW software was developed to provide real-time concentration data for the MS, dew point monitor, and sample delivery system pressure control, pressure and flow monitoring, and recording. The system has been designed to include in situ, NIST-traceable calibration. Certain sample tubing retains sufficient water that even if the sample is dry, the sample tube will desorb water to an amount resulting in moisture concentration errors up to 500 ppm for as long as 10 minutes. It was determined that Bev-A-Line IV was the best sample line to use. As a result of this issue, it is prudent to add a high-level humidity sensor to PDP-MS so such events can be prevented in the future.
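
    Converting a measured dew point into a moisture concentration, the kind of real-time figure the PDP-MS reports, can be sketched with the Magnus approximation; the constants are textbook values, and the instrument's own calibration may differ:

```python
import math

def ppmv_from_dew_point(td_c, pressure_hpa=1013.25):
    """Water-vapour concentration (ppm by volume) at dew point td_c (deg C),
    using the Magnus formula for saturation vapour pressure over water."""
    e = 6.112 * math.exp(17.62 * td_c / (243.12 + td_c))   # hPa
    return 1e6 * e / pressure_hpa

print(f"{ppmv_from_dew_point(-40):.0f} ppmv at -40 C dew point")
print(f"{ppmv_from_dew_point(20):.0f} ppmv at +20 C dew point")
```

    Note that below 0 °C a frost-point (over-ice) formula gives noticeably lower values; which one a given sensor reports depends on its calibration.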

  14. One-point calibration for calibration-free laser-induced breakdown spectroscopy quantitative analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cavalcanti, G.H.; Teixeira, D.V. [Instituto de Física, Universidade Federal Fluminense, Av. Gal. Milton Tavares de Souza, s/n° – Campus da Praia Vermelha – CEP 24210-346 – Niterói – Rio de Janeiro (Brazil); Legnaioli, S.; Lorenzetti, G.; Pardini, L. [Institute of Chemistry of Organometallic Compounds, Research Area of National Research Council, Via G. Moruzzi, 1 — 56124 Pisa (Italy); Palleschi, V., E-mail: vincenzo.palleschi@cnr.it [Institute of Chemistry of Organometallic Compounds, Research Area of National Research Council, Via G. Moruzzi, 1 — 56124 Pisa (Italy)

    2013-09-01

    We present a new method for improving the reliability of quantitative analysis by laser-induced breakdown spectroscopy (LIBS). The method can be considered as a variation of the calibration-free LIBS approach; although not completely standard-less, only one standard of known composition and similar matrix to the one to be analyzed is needed. On the other hand, the one-point calibration approach allows the empirical determination of essential experimental and spectroscopic parameters, whose knowledge is often imprecise or lacking; the result is a definite improvement of the trueness of LIBS analysis with respect to the traditional calibration-free approach. The characteristics and advantages of the proposed one-point calibration LIBS approach are demonstrated on a set of copper-based samples of known composition. - Highlights: • A new method for improving the quantitative analysis by LIBS is presented. • Only one standard of known composition is needed for the analysis. • A set of copper-based samples of known composition is analyzed. • The calculated concentrations are remarkably close to the nominal ones.

  15. Inflation targeting and inflation performance : a comparative analysis

    NARCIS (Netherlands)

    Samarina, Anna; De Haan, Jakob; Terpstra, M.

    2014-01-01

    This article examines how the impact of inflation targeting on inflation performance depends on the choice of country samples, adoption dates, time periods and methodological approaches. We apply two different estimation methods - difference-in-differences and propensity score matching - for our

  16. Analysis of Myc-induced histone modifications on target chromatin.

    Directory of Open Access Journals (Sweden)

    Francesca Martinato

    The c-myc proto-oncogene is induced by mitogens and is a central regulator of cell growth and differentiation. The c-myc product, Myc, is a transcription factor that binds a multitude of genomic sites, estimated at over 10-15% of all promoter regions. Target promoters generally pre-exist in an active or poised chromatin state that is further modified by Myc, contributing to fine transcriptional regulation (activation or repression) of the afferent gene. Among other mechanisms, Myc recruits histone acetyl-transferases to target chromatin and locally promotes hyper-acetylation of multiple lysines on histones H3 and H4, although the identity and combination of the modified lysines is unknown. Whether Myc dynamically regulates other histone modifications (or marks) at its binding sites also remains to be addressed. Here, we used quantitative chromatin immunoprecipitation (qChIP) to profile a total of 24 lysine-acetylation and -methylation marks modulated by Myc at target promoters in a human B-cell line with a regulatable c-myc transgene. Myc binding promoted acetylation of multiple lysines, primarily of H3K9, H3K14, H3K18, H4K5 and H4K12, but significantly also of H4K8, H4K91 and H2AK5. Dimethylation of H3K79 was also selectively induced at target promoters. A majority of target promoters showed co-induction of multiple marks, in various combinations, correlating with recruitment of the two HATs tested (Tip60 and HBO1), incorporation of the histone variant H2A.Z and transcriptional activation. Based on this and previous findings, we surmise that Myc recruits the Tip60/p400 complex to achieve a coordinated histone acetylation/exchange reaction at activated promoters. Our data are also consistent with an additive and redundant role of multiple acetylation events in transcriptional activation.

  17. Second-order analysis of structured inhomogeneous spatio-temporal point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions, first for general inhomogeneous spatio-temporal point processes and second for inhomogeneous spatio-temporal Cox processes. Assuming spatio-temporal separability of the intensity function, we clarify different meanings of second-order spatio-temporal separability. One is second-order spatio-temporal independence and relates e.g. to log-Gaussian Cox processes with an additive covariance structure of the underlying spatio-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply to simulated and real data (the UK 2001 epidemic foot and mouth disease data).

  18. Using thermal analysis techniques for identifying the flash point temperatures of some lubricant and base oils

    Directory of Open Access Journals (Sweden)

    Aksam Abdelkhalik

    2018-03-01

    The flash point (FP) temperatures of some lubricant and base oils were measured according to ASTM D92 and ASTM D93. In addition, the thermal stability of the oils was studied using differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA) under a nitrogen atmosphere. The DSC results showed that the FP temperatures for each oil fell within the first decomposition step, and the temperature at the peak of the first decomposition step was usually higher than the FP temperature. The TGA results indicated that the temperature at which 17.5% weight loss takes place (T17.5%) was nearly identical to the FP temperature (±10 °C) measured according to ASTM D92. The deviation between FP and T17.5% ranged from −0.8% to 3.6%. Keywords: Flash point, TGA, DSC
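
    Reading T17.5% (the temperature at 17.5% weight loss) off a TGA curve is a simple interpolation; the curve below is synthetic, not the paper's data:

```python
import numpy as np

# Synthetic TGA curve: temperature vs. residual mass
temps = np.array([150., 180., 210., 240., 270.])      # deg C
residual = np.array([100., 95., 85., 70., 50.])       # weight %

target_residual = 100.0 - 17.5                        # 82.5% remaining
# np.interp requires increasing x, so interpolate on the reversed arrays.
t_175 = np.interp(target_residual, residual[::-1], temps[::-1])
print(f"T17.5% = {t_175:.1f} C")
```

    Per the abstract's correlation, this interpolated temperature should track the ASTM D92 flash point to within roughly ±10 °C.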

  19. LIFE CYCLE ASSESSMENT AND HAZARD ANALYSIS AND CRITICAL CONTROL POINTS TO THE PASTA PRODUCT

    Directory of Open Access Journals (Sweden)

    Yulexis Meneses Linares

    2016-10-01

    The objective of this work is to combine the Life Cycle Assessment (LCA) and Hazard Analysis and Critical Control Points (HACCP) methodologies to determine the risks that food production poses to human health and the ecosystem. The environmental performance of pasta production at the "Marta Abreu" Pasta Factory of Cienfuegos is assessed, where the critical control points, determined by the biological hazards (fungi and pests) and the physical hazards (wood, paper, thread and ferromagnetic particles), were the raw materials (flour, semolina and their mixtures) and their storage and extraction. Resources are the most affected damage category, owing to the consumption of fossil fuels.

  20. Analysis of three-point-bend test for materials with unequal tension and compression properties

    Science.gov (United States)

    Chamis, C. C.

    1974-01-01

    An analysis capability is described for the three-point-bend test applicable to materials with linear but unequal tensile and compressive stress-strain relations. The capability consists of numerous equations of simple form and their graphical representation. Procedures are described to examine local stress concentrations and failure-mode initiation. Examples are given to illustrate the usefulness and ease of application of the capability. Comparisons are made with materials which have equal tensile and compressive properties. The results indicate that flexural modulus or strength may be underestimated by 25 to 50 percent relative to the values predicted when unequal properties are accounted for. The capability can also be used to reduce test data from three-point-bend tests, extract material properties useful in design from these test data, select test specimen dimensions, and size structural members.
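    The essential effect in such an analysis is that unequal moduli shift the neutral axis of the bent beam. A standard bimodulus-beam result for a rectangular section is sketched below (a textbook relation given for illustration, not the paper's own equations):

```python
import math

def bimodulus_neutral_axis(E_t, E_c, h):
    """Split a rectangular beam depth h into tension (h_t) and compression
    (h_c) zones when the tensile and compressive moduli differ.
    Axial force balance in pure bending gives E_t * h_t**2 == E_c * h_c**2."""
    h_t = h * math.sqrt(E_c) / (math.sqrt(E_t) + math.sqrt(E_c))
    return h_t, h - h_t
```

    With equal moduli the split reduces to h/2 on each side; a stiffer compressive response pushes the neutral axis toward the compression face, which is why equal-property formulas misestimate flexural modulus and strength.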

  1. Pathogen Reduction and Hazard Analysis and Critical Control Point (HACCP) systems for meat and poultry. USDA.

    Science.gov (United States)

    Hogue, A T; White, P L; Heminover, J A

    1998-03-01

    The United States Department of Agriculture (USDA) Food Safety Inspection Service (FSIS) adopted Hazard Analysis and Critical Control Point Systems and established finished product standards for Salmonella in slaughter plants to improve food safety for meat and poultry. In order to make significant improvements in food safety, measures must be taken at all points in the farm-to-table chain, including production, transportation, slaughter, processing, storage, retail, and food preparation. Since pathogens can be introduced or multiplied anywhere along the continuum, success depends on consideration and comparison of intervention measures throughout the continuum. Food animal and public health veterinarians can create the necessary preventive environment that mitigates risks for foodborne pathogen contamination.

  2. Analysis of Point Based Image Registration Errors With Applications in Single Molecule Microscopy.

    Science.gov (United States)

    Cohen, E A K; Ober, R J

    2013-12-15

    We present an asymptotic treatment of errors involved in point-based image registration where control point (CP) localization is subject to heteroscedastic noise; a suitable model for image registration in fluorescence microscopy. Assuming an affine transform, CPs are used to solve a multivariate regression problem. With measurement errors existing for both sets of CPs this is an errors-in-variables problem and linear least squares is inappropriate; the correct method is generalized least squares (GLS). To allow for point-dependent errors, the equivalence of a generalized maximum likelihood and heteroscedastic generalized least squares model is established, allowing previously published asymptotic results to be extended to image registration. For a particularly useful model of heteroscedastic noise where covariance matrices are scalar multiples of a known matrix (including the case where covariance matrices are multiples of the identity) we provide closed form solutions to estimators and derive their distribution. We consider the target registration error (TRE) and define a new measure called the localization registration error (LRE), believed to be useful especially in microscopy registration experiments. Assuming Gaussianity of the CP localization errors, it is shown that the asymptotic distributions for the TRE and LRE are themselves Gaussian, and the parameterized distributions are derived. Results are successfully applied to registration in single molecule microscopy to derive the key dependence of the TRE and LRE variance on the number of CPs and their associated photon counts. Simulations show asymptotic results are robust for low CP numbers and non-Gaussianity. The method presented here is shown to outperform GLS on real imaging data.

  3. Validation of capillary blood analysis and capillary testing mode on the epoc Point of Care system

    Directory of Open Access Journals (Sweden)

    Jing Cao

    2017-12-01

    Full Text Available Background: Laboratory testing in transport is a critical component of patient care, and capillary blood is a preferred sample type, particularly in children. This study evaluated the performance of capillary blood testing on the epoc Point of Care Blood Analysis System (Alere Inc.). Methods: Ten fresh venous blood samples were tested on the epoc system in capillary mode. Correlation with the GEM 4000 (Instrumentation Laboratory) was examined for Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pO2, pCO2, and pH, and correlation with serum tested on the Vitros 5600 (Ortho Clinical Diagnostics) was examined for creatinine. Eight paired capillary and venous blood samples were tested on the epoc and ABL800 (Radiometer) for the correlation of Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Capillary blood from 23 apparently healthy volunteers was tested on the epoc system to assess concordance with the reference ranges used locally. Results: Deming regression correlation coefficients for all the comparisons were above 0.65 except for ionized Ca2+. Concordance of greater than 85% with the local reference ranges was found in all assays with the exception of pO2 and Cl-. Conclusion: Data from this study indicate that capillary blood tests on the epoc system provide results comparable to the reference methods for Na+, K+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Further validation in critically ill patients is needed before the epoc system is implemented in patient transport. Impact of the study: This study demonstrated that capillary blood tests on the epoc Point of Care Blood Analysis System give results comparable to other chemistry analyzers for major blood gas and critical tests. The results are informative for institutions where pre-hospital and inter-hospital laboratory testing on capillary blood is a critical component of point-of-care testing. Keywords: Epoc, Capillary, Transport, Blood gas, Point of care
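    Deming regression, used for the method comparisons above, fits a line while allowing measurement error in both methods, unlike ordinary least squares, which assumes the reference method is error-free. A minimal sketch (illustrative code, not the study's own analysis):

```python
import numpy as np

def deming(x, y, lam=1.0):
    """Deming regression: fit y = a + b*x allowing error in both variables.
    lam is the ratio of the y-error variance to the x-error variance;
    lam = 1 corresponds to orthogonal regression."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xbar, ybar = x.mean(), y.mean()
    sxx = ((x - xbar) ** 2).sum()
    syy = ((y - ybar) ** 2).sum()
    sxy = ((x - xbar) * (y - ybar)).sum()
    slope = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2
                                       + 4.0 * lam * sxy ** 2)) / (2.0 * sxy)
    intercept = ybar - slope * xbar
    r = np.corrcoef(x, y)[0, 1]   # plain Pearson r, for reference
    return slope, intercept, r
```

    A slope near 1 and intercept near 0 indicate agreement between the point-of-care and reference analyzers.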

  4. Thermal shock analysis of liquid-mercury spallation target

    CERN Document Server

    Ishikura, S; Futakawa, M; Hino, R; Date, H

    2002-01-01

    The development of neutron scattering facilities is being carried out under the high-intensity proton accelerator project promoted by JAERI and KEK. To estimate the structural integrity of the heavy liquid-metal (Hg) target used as a spallation neutron source in a MW-class neutron scattering facility, the dynamic stress behavior due to the incidence of a 1 MW pulsed proton beam was analyzed using an FEM code. Two types of target container, one with a semi-cylindrical window and one with a flat-plate window, were used as models for the analyses. As a result, it is confirmed that the stress (pressure wave) generated by the dynamic thermal shock becomes largest at the center of the window, and the flat-plate window is more advantageous from the structural viewpoint than the semi-cylindrical one. It has been understood that the stress generated in the window by the pressure wave can be treated as a secondary stress. (author)

  5. Analysis of Mo99 production irradiating 20% U targets

    International Nuclear Information System (INIS)

    Calabrese, C. Ruben; Grant, Carlos R.; Marajofsky, Andres; Parkansky, David G.

    1999-01-01

    At present, the National Atomic Energy Commission is producing about 800 Ci of Mo99 per week by irradiating 90% enriched uranium-aluminum alloy plate targets in the RA-3 reactor, a 5 MW MTR-type reactor. In order to change to 20% enriched uranium, and to increase production to about 3000 Ci per week, some configurations were studied with rod and plate geometry using uranium (20% enriched)-aluminum targets. The first case was the irradiation of a plate target element in the normal reactor configuration. Results showed a good efficiency, but both the reactivity value and the power density were too high. An element with rods was also analyzed, but results showed a poor efficiency and too much aluminum involved in the process, although a low reactivity and an acceptable rod power density. Finally, a solution consisting of plate elements with Zircaloy cladding was adopted, which has shown not only a good efficiency, but is also acceptable from the viewpoint of safety, heat transfer criteria and feasibility

  6. 75 FR 8239 - School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP...

    Science.gov (United States)

    2010-02-24

    ... 0584-AD65 School Food Safety Program Based on Hazard Analysis and Critical Control Point Principles... Safety Program Based on Hazard Analysis and Critical Control Point Principles (HACCP) was published on... school food safety program for the preparation and service of school meals served to children. The Office...

  7. Change-Point and Trend Analysis on Annual Maximum Discharge in Continental United States

    Science.gov (United States)

    Serinaldi, F.; Villarini, G.; Smith, J. A.; Krajewski, W. F.

    2008-12-01

    Annual maximum discharge records from 36 stations representing different hydro-climatic regimes in the continental United States with at least 100 years of records are used to investigate the presence of temporal trends and abrupt changes in mean and variance. Change point analysis is performed by means of two non-parametric (Pettitt and CUSUM), one semi-parametric (Guan), and two parametric (Rodionov and Bayesian Change Point) tests. Two non-parametric (Mann-Kendall and Spearman) and one parametric (Pearson) tests are applied to detect the presence of temporal trends. Generalized Additive Model for Location Scale and Shape (GAMLSS) models are also used to parametrically model the streamflow data exploiting their flexibility to account for changes and temporal trends in the parameters of distribution functions. Additionally, serial correlation is assessed in advance by computing the autocorrelation function (ACF), and the Hurst parameter is estimated using two estimators (aggregated variance and differenced variance methods) to investigate the presence of long range dependence. The results of this study indicate lack of long range dependence in the maximum streamflow series. At some stations the authors found a statistically significant change point in the mean and/or variance, while in general they detected no statistically significant temporal trends.
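    Of the change-point tests listed, the Pettitt test is the simplest to sketch: for each candidate split it sums the signs of all cross-split pairwise differences and flags the split with the largest imbalance (an illustrative implementation, not the authors' code):

```python
import numpy as np

def pettitt(x):
    """Pettitt's non-parametric test for a single change point.
    Returns the index closing the first segment and an approximate p-value."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # U_t = sum over i <= t < j of sign(x_j - x_i)
    u = np.array([np.sign(x[t + 1:, None] - x[:t + 1]).sum()
                  for t in range(n - 1)])
    k = np.abs(u).max()
    tau = int(np.abs(u).argmax())
    # Standard approximation for the significance of max |U_t|
    p = min(1.0, 2.0 * np.exp(-6.0 * k ** 2 / (n ** 3 + n ** 2)))
    return tau, p
```

    On a series with a clear jump, the statistic peaks at the last index of the first regime; the p-value approximation is asymptotic and conservative for short records.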

  8. Performance Analysis of Maximum Power Point Tracking Algorithms Under Varying Irradiation

    Directory of Open Access Journals (Sweden)

    Bhukya Krishna Naick

    2017-03-01

    Full Text Available Photovoltaic (PV) systems are one of the reliable alternative sources of energy, and their contribution to the energy sector is growing rapidly. The performance of a PV system depends upon the solar insolation, which varies throughout the day, season and year. The biggest challenge is to obtain the maximum power from the PV array at varying insolation levels. The maximum power point tracking (MPPT) controller, together with the tracking algorithm, acts as the principal element in driving the PV system at the maximum power point (MPP). In this paper, a simulation model has been developed and the results were compared for perturb and observe, incremental conductance, extremum seeking control and fuzzy logic controller based MPPT algorithms at different irradiation levels on a 10 kW PV array. The results obtained were analysed in terms of convergence rate and efficiency in tracking the MPP. Keywords: Photovoltaic system, MPPT algorithms, perturb and observe, incremental conductance, scalar gradient extremum seeking control, fuzzy logic controller. Article History: Received 3rd Oct 2016; Received in revised form 6th January 2017; Accepted 10th February 2017; Available online How to Cite This Article: Naick, B. K., Chatterjee, T. K. & Chatterjee, K. (2017) Performance Analysis of Maximum Power Point Tracking Algorithms Under Varying Irradiation. Int Journal of Renewable Energy Development, 6(1), 65-74. http://dx.doi.org/10.14710/ijred.6.1.65-74
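    The perturb-and-observe algorithm compared above reduces to a few lines: perturb the operating voltage, and reverse direction whenever the measured power drops. A sketch follows, where `measure` and `set_voltage` are stand-ins for the real converter interface (assumed names, not from the paper):

```python
def perturb_and_observe(measure, set_voltage, v0, dv=0.5, steps=50):
    """Perturb-and-observe MPPT: keep perturbing the operating voltage in
    the same direction while power rises; reverse when it falls."""
    v = v0
    p_prev = measure(v)
    direction = 1.0
    for _ in range(steps):
        v += direction * dv
        p = measure(v)
        if p < p_prev:           # power dropped -> reverse the perturbation
            direction = -direction
        p_prev = p
        set_voltage(v)           # command the converter to the new set point
    return v
```

    The fixed step `dv` is the classic trade-off: a large step converges quickly but oscillates widely around the MPP, which is exactly the convergence-rate/efficiency tension the paper evaluates.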

  9. Analysis of calibration interval of turbine meters used in natural gas delivery points

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, Fernanda M. [TBG - Transportadora Brasileira Gasoduto Bolivia Brasil, Campinas, SP (Brazil)]

    2009-12-19

    In natural gas pipeline operation, an accurate measurement of flow at delivery points provides the basis for the company's billing and ensures a relationship of credibility with customers. The measurement management system must therefore ensure that the equipment responsible for natural gas measurement is calibrated, although this does not mean that a high frequency of calibration must be adopted, since calibration costs may increase without relevant gains being perceived. Studying calibration frequency thus becomes important in order to find the optimal point between reliability and cost. This paper evaluates the calibration frequency of the turbine flow meters used by TBG - Transportadora Brasileira Gasoduto Bolivia Brasil to measure processed gas flow at delivery points. Historical calibration data were used in commercial statistical software normally used for equipment lifetime analysis; following this procedure, each time a meter was rejected at a calibration it was counted as a failure. The results obtained indicate that the two-year calibration interval used by TBG keeps the probability of failure, i.e., of producing incorrect results, at only 3%, considering their process conditions. (author)
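    The trade-off studied above, calibration interval versus probability of producing incorrect results, can be sketched with a Weibull life model (an assumed model for illustration; the paper used commercial lifetime-analysis software whose exact model is not stated):

```python
import math

def failure_probability(t, eta, beta):
    """Weibull CDF: probability a meter is out of tolerance by time t
    (eta = characteristic life, beta = shape parameter)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def max_calibration_interval(p_target, eta, beta):
    """Longest interval whose failure probability stays at or below p_target
    (the inverse of the Weibull CDF above)."""
    return eta * (-math.log(1.0 - p_target)) ** (1.0 / beta)
```

    Given parameters fitted from historical pass/fail calibration records, the second function answers the paper's question directly: how long an interval keeps the probability of an out-of-tolerance meter below a chosen target such as 3%.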

  10. Electron-density critical points analysis and catastrophe theory to forecast structure instability in periodic solids.

    Science.gov (United States)

    Merli, Marcello; Pavese, Alessandro

    2018-03-01

    The critical points analysis of electron density, i.e. ρ(x), from ab initio calculations is used in combination with the catastrophe theory to show a correlation between ρ(x) topology and the appearance of instability that may lead to transformations of crystal structures, as a function of pressure/temperature. In particular, this study focuses on the evolution of coalescing non-degenerate critical points, i.e. such that ∇ρ(x_c) = 0 and λ1, λ2, λ3 ≠ 0 [λ being the eigenvalues of the Hessian of ρ(x) at x_c], towards degenerate critical points, i.e. ∇ρ(x_c) = 0 and at least one λ equal to zero. The catastrophe theory formalism provides a mathematical tool to model ρ(x) in the neighbourhood of x_c and allows one to rationalize the occurrence of instability in terms of electron-density topology and Gibbs energy. The phase/state transitions that TiO2 (rutile structure), MgO (periclase structure) and Al2O3 (corundum structure) undergo because of pressure and/or temperature are here discussed. An agreement of 3-5% is observed between the theoretical model and experimental pressure/temperature of transformation.

  11. Analysis of tree stand horizontal structure using random point field methods

    Directory of Open Access Journals (Sweden)

    O. P. Sekretenko

    2015-06-01

    Full Text Available This paper uses the model approach to analyze the horizontal structure of forest stands. The main types of models of random point fields and statistical procedures that can be used to analyze spatial patterns of trees of uneven and even-aged stands are described. We show how modern methods of spatial statistics can be used to address one of the objectives of forestry – to clarify the laws of natural thinning of forest stand and the corresponding changes in its spatial structure over time. Studying natural forest thinning, we describe the consecutive stages of modeling: selection of the appropriate parametric model, parameter estimation and generation of point patterns in accordance with the selected model, the selection of statistical functions to describe the horizontal structure of forest stands and testing of statistical hypotheses. We show the possibilities of a specialized software package, spatstat, which is designed to meet the challenges of spatial statistics and provides software support for modern methods of analysis of spatial data. We show that a model of stand thinning that does not consider inter-tree interaction can project the size distribution of the trees properly, but the spatial pattern of the modeled stand is not quite consistent with observed data. Using data from three even-aged pine forest stands aged 25, 55, and 90 years, we demonstrate that spatial point process models are useful for combining measurements in forest stands of different ages to study natural stand thinning.

  12. An Automatic Target Detection Algorithm for Swath Sonar Backscatter Imagery, Using Image Texture and Independent Component Analysis

    Directory of Open Access Journals (Sweden)

    Elias Fakiris

    2016-04-01

    Full Text Available In the present paper, a methodological scheme, bringing together common Acoustic Seabed Classification (ASC) systems and a powerful data decomposition approach, called Independent Component Analysis (ICA), is demonstrated regarding its suitability for detecting small targets in Side Scan Sonar imagery. Traditional ASC systems extract numerous texture descriptors, leading to a large feature vector, the dimensionality of which is reduced by means of data decomposition techniques, usually Principal Component Analysis (PCA), prior to classification. However, in the target detection issue, data decomposition should point towards finding components that represent sub-ordinary image information (i.e., small targets) rather than a dominant one. ICA has long been proved to be suitable for separating targets from a background, and this study represents a novel exhibition of its applicability to Side Scan Sonar (SSS) images. The present study attempts to build a fully automated target detection approach that combines image based feature extraction, ICA, and unsupervised classification. The suitability of the proposed approach has been demonstrated using an SSS data-set containing more than 70 man-made targets, most of them metallic, validated through a marine magnetic survey or ground truthing inspection. The method exhibited very good performance as it was able to detect more than 77% of the targets and it produced less than seven false alarms per km². Moreover, it was compared to cases where, in the exact same methodological scheme, no decomposition technique is used, or PCA is employed instead of ICA, achieving the highest detection rate, but, more importantly, producing more than six times fewer false alarms, thus proving that ICA successfully manages to maximize target-to-background separation.
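    The key idea, that ICA can pull a sub-ordinary component (a target) out of a dominant background, can be sketched with a minimal two-channel FastICA on synthetic signals (our own toy implementation for illustration, not the authors' pipeline):

```python
import numpy as np

def fastica_2channel(X, iters=200, seed=0):
    """Toy deflationary FastICA (tanh nonlinearity) for a 2 x N mixed signal.
    Returns the two estimated independent components (up to sign and order)."""
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=1, keepdims=True)
    # Whiten: rotate/scale so the channels are uncorrelated with unit variance
    d, E = np.linalg.eigh(np.cov(X))
    Z = E @ np.diag(d ** -0.5) @ E.T @ X
    W = np.zeros((2, 2))
    for i in range(2):
        w = rng.normal(size=2)
        w /= np.linalg.norm(w)
        for _ in range(iters):
            g = np.tanh(w @ Z)
            w_new = (Z * g).mean(axis=1) - (1.0 - g ** 2).mean() * w
            w_new -= W[:i].T @ (W[:i] @ w_new)   # deflate: stay orthogonal
            w_new /= np.linalg.norm(w_new)
            converged = abs(abs(w_new @ w) - 1.0) < 1e-10
            w = w_new
            if converged:
                break
        W[i] = w
    return W @ Z
```

    On a toy mixture of a sine and a square wave, the recovered components correlate strongly with the original sources; in the paper's setting, the "sources" are the texture-feature components carrying target versus background information.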

  13. Compact polarimetric SAR product and calibration considerations for target analysis

    Science.gov (United States)

    Sabry, Ramin

    2016-10-01

    Compact polarimetric (CP) data exploitation is currently of growing interest considering the new generation of such Synthetic Aperture Radar (SAR) systems. These systems offer target detection and classification capabilities comparable to those of polarimetric SARs (PolSAR) with less stringent requirements. A good example is the RADARSAT Constellation Mission (RCM). In this paper, some characteristic CP products are described and effects of CP mode deviation from ideal circular polarization transmit on classifications are modeled. The latter is important for operation of typical CP modes (e.g., RCM). The developed model can be used to estimate the ellipticity variation from CP measured data, and hence, calibrate the classification products.

  14. Analysis of the ball-plate laser fusion target experiments

    International Nuclear Information System (INIS)

    Pan, Y.L.

    1975-01-01

    Two dimensional computer simulation results of the two exploding pusher ball-plate targets are in approximate agreement with the experimental space and time integrated x-ray spectra, x-ray microscope data, neutron yields, and laser energy absorptions. Three parameters were used to characterize the laser absorption due to plasma instabilities. Two dumpall parameters were used to model the energy absorption and a single variable was used to define the electron temperature. The values, as well as the selection procedure for these parameters are discussed

  15. Sustainable Process Design under uncertainty analysis: targeting environmental indicators

    DEFF Research Database (Denmark)

    L. Gargalo, Carina; Gani, Rafiqul

    2015-01-01

    This study focuses on uncertainty analysis of environmental indicators used to support sustainable process design efforts. To this end, the Life Cycle Assessment methodology is extended with a comprehensive uncertainty analysis to propagate the uncertainties in input LCA data to the environmental...

  16. Spectral analysis of growing graphs a quantum probability point of view

    CERN Document Server

    Obata, Nobuaki

    2017-01-01

    This book is designed as a concise introduction to the recent achievements on spectral analysis of graphs or networks from the point of view of quantum (or non-commutative) probability theory. The main topics are spectral distributions of the adjacency matrices of finite or infinite graphs and their limit distributions for growing graphs. The main vehicle is quantum probability, an algebraic extension of the traditional probability theory, which provides a new framework for the analysis of adjacency matrices revealing their non-commutative nature. For example, the method of quantum decomposition makes it possible to study spectral distributions by means of interacting Fock spaces or equivalently by orthogonal polynomials. Various concepts of independence in quantum probability and corresponding central limit theorems are used for the asymptotic study of spectral distributions for product graphs. This book is written for researchers, teachers, and students interested in graph spectra, their (asymptotic) spectr...

  17. APPLICATION OF THE HAZARD ANALYSIS CRITICAL CONTROL POINT (HACCP) SYSTEM IN THE PRODUCTION PROCESS OF TEMPE CHIPS

    Directory of Open Access Journals (Sweden)

    Rahmi Yuniarti

    2015-06-01

    Full Text Available Malang is one of the industrial centers of tempe chips. To maintain quality and food safety, analysis is required to identify hazards during the production process. This study was conducted to identify the hazards during the production process of tempe chips and to provide recommendations for developing a HACCP system. The production process of tempe chips starts with slicing the tempe, moving it to the kitchen, coating it with flour dough, frying it in the pan, draining it, packaging it, and then storing it. There are three types of potential hazards, biological, physical, and chemical, during the production process. CCP identification found three processes with critical control points: slicing the tempe, immersing the tempe in the flour mixture, and draining. Recommendations for the development of the HACCP system include recommendations related to employee hygiene, supporting equipment, 5-S analysis, and the production layout.

  18. Validation of capillary blood analysis and capillary testing mode on the epoc Point of Care system.

    Science.gov (United States)

    Cao, Jing; Edwards, Rachel; Chairez, Janette; Devaraj, Sridevi

    2017-12-01

    Laboratory testing in transport is a critical component of patient care, and capillary blood is a preferred sample type, particularly in children. This study evaluated the performance of capillary blood testing on the epoc Point of Care Blood Analysis System (Alere Inc). Ten fresh venous blood samples were tested on the epoc system in capillary mode. Correlation with the GEM 4000 (Instrumentation Laboratory) was examined for Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pO2, pCO2, and pH, and correlation with serum tested on the Vitros 5600 (Ortho Clinical Diagnostics) was examined for creatinine. Eight paired capillary and venous blood samples were tested on the epoc and ABL800 (Radiometer) for the correlation of Na+, K+, Cl-, Ca2+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Capillary blood from 23 apparently healthy volunteers was tested on the epoc system to assess concordance with the reference ranges used locally. Deming regression correlation coefficients for all the comparisons were above 0.65 except for ionized Ca2+. Concordance of greater than 85% with the local reference ranges was found in all assays with the exception of pO2 and Cl-. Data from this study indicate that capillary blood tests on the epoc system provide results comparable to the reference methods for Na+, K+, glucose, lactate, hematocrit, hemoglobin, pCO2, and pH. Further validation in critically ill patients is needed before the epoc system is implemented in patient transport. This study demonstrated that capillary blood tests on the epoc Point of Care Blood Analysis System give results comparable to other chemistry analyzers for major blood gas and critical tests. The results are informative for institutions where pre-hospital and inter-hospital laboratory testing on capillary blood is a critical component of patient point-of-care testing.

  19. Bayesian change-point analysis reveals developmental change in a classic theory of mind task.

    Science.gov (United States)

    Baker, Sara T; Leslie, Alan M; Gallistel, C R; Hood, Bruce M

    2016-12-01

    Although learning and development reflect changes situated in an individual brain, most discussions of behavioral change are based on the evidence of group averages. Our reliance on group-averaged data creates a dilemma. On the one hand, we need to use traditional inferential statistics. On the other hand, group averages are highly ambiguous when we need to understand change in the individual; the average pattern of change may characterize all, some, or none of the individuals in the group. Here we present a new method for statistically characterizing developmental change in each individual child we study. Using false-belief tasks, fifty-two children in two cohorts were repeatedly tested for varying lengths of time between 3 and 5 years of age. Using a novel Bayesian change point analysis, we determined both the presence and, just as importantly, the absence of change in individual longitudinal cumulative records. Whenever the analysis supports a change conclusion, it identifies in that child's record the most likely point at which change occurred. Results show striking variability in patterns of change and stability across individual children. We then group the individuals by their various patterns of change or no change. The resulting patterns provide scarce support for sudden changes in competence and shed new light on the concepts of "passing" and "failing" in developmental studies. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  20. Application of hazard analysis critical control points (HACCP) to organic chemical contaminants in food.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-03-01

    Hazard Analysis Critical Control Points (HACCP) is a systematic approach to the identification, assessment, and control of hazards that was developed as an effective alternative to conventional end-point analysis to control food safety. It has been described as the most effective means of controlling foodborne diseases, and its application to the control of microbiological hazards has been accepted internationally. By contrast, relatively little has been reported relating to the potential use of HACCP, or HACCP-like procedures, to control chemical contaminants of food. This article presents an overview of the implementation of HACCP and discusses its application to the control of organic chemical contaminants in the food chain. Although this is likely to result in many of the advantages previously identified for microbiological HACCP, that is, more effective, efficient, and economical hazard management, a number of areas are identified that require further research and development. These include: (1) a need to refine the methods of chemical contaminant identification and risk assessment employed, (2) develop more cost-effective monitoring and control methods for routine chemical contaminant surveillance of food, and (3) improve the effectiveness of process optimization for the control of chemical contaminants in food.

  1. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    Science.gov (United States)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, which enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory monitoring impedance cardiography (AICG) device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt) sampled at 200 Hz, and the base impedance signal (Z0) sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option to make manual corrections, which may be necessary to avoid "false positive" recognitions. This application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.
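    The hemodynamic variables listed above are derived from the detected characteristic points. As one common example, stroke volume can be computed from LVET and the dz/dt peak via the Kubicek relation (shown here as a generic illustration; the abstract does not state which ICG formula the program actually uses):

```python
def stroke_volume_kubicek(rho, L, z0, dzdt_max, lvet):
    """Kubicek ICG estimate (one common formula, shown for illustration):
    SV = rho * (L / Z0)**2 * LVET * (dZ/dt)_max
    rho in ohm*cm, L in cm, Z0 in ohm, LVET in s, dZ/dt in ohm/s -> SV in mL."""
    return rho * (L / z0) ** 2 * lvet * dzdt_max

def cardiac_output(sv_ml, hr_bpm):
    """Cardiac output in L/min from stroke volume (mL) and heart rate (bpm)."""
    return sv_ml * hr_bpm / 1000.0
```

    This also shows why manual correction of mis-detected points matters: an error in the LVET or dz/dt-peak landmarks propagates linearly into SV and CO.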

  2. AUTOMATED VOXEL MODEL FROM POINT CLOUDS FOR STRUCTURAL ANALYSIS OF CULTURAL HERITAGE

    Directory of Open Access Journals (Sweden)

    G. Bitelli

    2016-06-01

    Full Text Available In the context of cultural heritage, an accurate and comprehensive digital survey of a historical building is today essential in order to measure its geometry in detail for documentation or restoration purposes, to support special studies regarding materials and constructive characteristics, and finally for structural analysis. Proven geomatic techniques, such as photogrammetry and terrestrial laser scanning, are increasingly used to survey buildings of different complexity and dimensions; one typical product is a point cloud. We developed a semi-automatic procedure to convert point clouds, acquired by laser scanning or digital photogrammetry, into a filled volume model of the whole structure. The filled volume model, in a voxel format, can be useful for further analysis and also for the generation of a Finite Element Model (FEM) of the surveyed building. In this paper a new approach is presented with the aim of decreasing operator intervention in the workflow and obtaining a better description of the structure. In order to achieve this result, a voxel model with variable resolution is produced. Different parameters are compared and different steps of the procedure are tested and validated in the case study of the North tower of the San Felice sul Panaro Fortress, a monumental historical building located in San Felice sul Panaro (Modena, Italy) that was hit by an earthquake in 2012.
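    The point-cloud-to-voxel conversion at the heart of the procedure can be sketched as simple occupancy binning (an illustrative numpy sketch with a fixed voxel size; the paper's variable-resolution, volume-filling approach is more involved):

```python
import numpy as np

def voxelize(points, voxel_size):
    """Bin an N x 3 point cloud into a boolean occupancy grid.
    Returns the grid and its origin, so voxel (i, j, k) covers the cube
    starting at origin + [i, j, k] * voxel_size."""
    pts = np.asarray(points, dtype=float)
    origin = pts.min(axis=0)
    idx = np.floor((pts - origin) / voxel_size).astype(int)
    grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return grid, origin
```

    An occupancy grid like this marks only surface voxels seen by the scanner; producing the *filled* volume model required for FEM meshing is the additional step the paper's procedure addresses.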

  3. Thermal analysis of titanium drive-in target for D-D neutron generation.

    Science.gov (United States)

    Jung, N S; Kim, I J; Kim, S J; Choi, H D

    2010-01-01

    Thermal analysis was performed for a titanium drive-in target of a D-D neutron generator. Computational fluid dynamics code CFX-5 was used in this study. To define the heat flux term for the thermal analysis, beam current profile was measured. Temperature of the target was calculated at some of the operating conditions. The cooling performance of the target was evaluated by means of the comparison of the calculated maximum target temperature and the critical temperature of titanium. Copyright 2009 Elsevier Ltd. All rights reserved.

  4. Micro-Doppler Analysis of Rotating Target in SAR

    National Research Council Canada - National Science Library

    Thayaparan, T; Abrol, S; Qian, S

    2005-01-01

    .... The phase modulation may be seen as a time-dependent micro-Doppler (m-D) frequency. Due to their superior resolution potential, it is useful to analyze such signals with time-frequency analysis methods...

  5. Multivariate analysis for the estimation of target localization errors in fiducial marker-based radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Takamiya, Masanori [Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University, Kyoto 606-8501, Japan and Department of Radiation Oncology and Image-applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto 606-8507 (Japan); Nakamura, Mitsuhiro, E-mail: m-nkmr@kuhp.kyoto-u.ac.jp; Akimoto, Mami; Ueki, Nami; Yamada, Masahiro; Matsuo, Yukinori; Mizowaki, Takashi; Hiraoka, Masahiro [Department of Radiation Oncology and Image-applied Therapy, Graduate School of Medicine, Kyoto University, Kyoto 606-8507 (Japan); Tanabe, Hiroaki [Division of Radiation Oncology, Institute of Biomedical Research and Innovation, Kobe 650-0047 (Japan); Kokubo, Masaki [Division of Radiation Oncology, Institute of Biomedical Research and Innovation, Kobe 650-0047, Japan and Department of Radiation Oncology, Kobe City Medical Center General Hospital, Kobe 650-0047 (Japan); Itoh, Akio [Department of Nuclear Engineering, Graduate School of Engineering, Kyoto University, Kyoto 606-8501 (Japan)

    2016-04-15

    Purpose: To assess the target localization error (TLE) in terms of the distance between the target and the localization point estimated from the surrogates (|TMD|), the average respiratory motion of the surrogates and the target (|aRM|), and the number of fiducial markers used for estimating the target (n). Methods: This study enrolled 17 lung cancer patients who subsequently underwent four fractions of real-time tumor tracking irradiation. Four or five fiducial markers were implanted around the lung tumor. The three-dimensional (3D) distance between the tumor and markers was at most 58.7 mm. One of the markers was used as the target (P_t), and those markers with a 3D |TMD_n| ≤ 58.7 mm at end-exhalation were then selected. The estimated target position (P_e) was calculated from a localization point consisting of one to three markers other than P_t. Respiratory motion for P_t and P_e was defined as the root mean square of each displacement, and |aRM| was calculated from the mean value. TLE was defined as the root mean square of each difference between P_t and P_e during the monitoring of each fraction. These procedures were performed repeatedly using the remaining markers. To provide the best guidance on the choice of n and |TMD|, fiducial markers with a 3D |aRM| ≥ 10 mm were selected. Finally, a total of 205, 282, and 76 TLEs that fulfilled the 3D |TMD| and 3D |aRM| criteria were obtained for n = 1, 2, and 3, respectively. Multiple regression analysis (MRA) was used to evaluate TLE as a function of |TMD| and |aRM| for each n. Results: |TMD| for n = 1 was larger than that for n = 3. Moreover, |aRM| was almost constant for all n, indicating a similar scale of marker motion near the lung tumor. MRA showed that |aRM| in the left–right direction was the major cause of TLE; however, this contribution made little difference to the 3D TLE because of the small amount of motion in the left–right direction. The TLE
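
    The multiple regression step above fits TLE as a linear function of |TMD| and |aRM|. A minimal ordinary-least-squares sketch follows, on synthetic noise-free data (the study's actual measurements and regression coefficients are not reproduced here).

    ```python
    # OLS via the normal equations (X^T X) b = X^T y, solved by Gaussian
    # elimination with partial pivoting. X includes an intercept column.
    def ols(X, y):
        n, p = len(X), len(X[0])
        A = [[sum(X[k][i] * X[k][j] for k in range(n)) for j in range(p)]
             for i in range(p)]
        b = [sum(X[k][i] * y[k] for k in range(n)) for i in range(p)]
        for i in range(p):                       # forward elimination
            piv = max(range(i, p), key=lambda r: abs(A[r][i]))
            A[i], A[piv] = A[piv], A[i]
            b[i], b[piv] = b[piv], b[i]
            for r in range(i + 1, p):
                f = A[r][i] / A[i][i]
                for c in range(i, p):
                    A[r][c] -= f * A[i][c]
                b[r] -= f * b[i]
        coef = [0.0] * p
        for i in reversed(range(p)):             # back substitution
            coef[i] = (b[i] - sum(A[i][j] * coef[j]
                                  for j in range(i + 1, p))) / A[i][i]
        return coef

    # Synthetic, hypothetical data: TLE = 0.5 + 0.02*|TMD| + 0.1*|aRM| (mm)
    tmd = [10, 20, 30, 40, 50]
    arm = [5, 8, 6, 12, 9]
    tle = [0.5 + 0.02 * t + 0.1 * a for t, a in zip(tmd, arm)]
    X = [[1.0, t, a] for t, a in zip(tmd, arm)]
    coef = ols(X, tle)
    print([round(c, 3) for c in coef])
    ```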

  6. Simultaneous Determination of Aspirin, Salicylamide, and Caffeine in Pain Relievers by Target Factor Analysis

    Science.gov (United States)

    Msimanga, Huggins Z.; Charles, Melissa J.; Martin, Nea W.

    1997-09-01

    A factor analysis-based experiment for the undergraduate instrumental analysis labs is reported. Target factor analysis (TFA) is investigated as an option to the use of high-performance liquid chromatography (HPLC) in the analysis of a pain reliever sample containing aspirin, caffeine, and salicylamide.

  7. Singular point analysis during rail deployment into vacuum vessel for ITER blanket maintenance

    International Nuclear Information System (INIS)

    Kakudate, Satoshi; Shibanuma, Kiyoshi

    2007-05-01

    Remote maintenance of the ITER blanket, composed of about 400 modules in the vessel, must be performed by a maintenance robot owing to the high gamma radiation of ∼500 Gy/h in the vessel. A concept of a rail-mounted vehicle manipulator system has been developed for maintenance of the ITER blanket. The most critical issue for the vehicle manipulator system is the feasibility of deploying the articulated rail, composed of eight rail links, into the donut-shaped vessel without any driving mechanism in the rail. To solve this issue, a new driving mechanism and procedure for rail deployment have been proposed, taking into account a repeated operation in which the multiple rail links are deployed in the same kinematical manner. The new driving mechanism, which differs from that of a usual 'articulated arm' equipped with an actuator in every joint for movement, is composed of three mechanisms. To assess the feasibility of the kinematics of the articulated rail for rail deployment, a kinematical model composed of three rail links, corresponding to one cycle of the repeated deployment operation, was considered. The determinant det J' of the Jacobian matrix J' was solved in order to establish whether a singular point exists in the transformation during rail deployment. As a result, it was found that there is a singular point, since det J'=0. To avoid the singular point of the rail links, a new location of the second driving mechanism and a related rail deployment procedure are proposed. In a rail deployment test based on the new proposal using a full-scale vehicle manipulator system, the respective rail links were successfully deployed within 6 h, less than the target of 8 h, in the same repeated manner of operation under synchronized cooperation among the three driving mechanisms. It is therefore concluded that the feasibility of deploying an articulated rail composed of simple structures without any driving mechanism has been demonstrated.
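
    The det J' = 0 singularity check above can be illustrated on the simplest articulated mechanism, a planar two-link arm, whose Jacobian determinant is det J = l1·l2·sin(θ2). The link lengths and angles below are hypothetical and unrelated to the ITER rail geometry; the point is only that the determinant vanishing flags configurations where the mechanism loses a degree of mobility.

    ```python
    # Singularity test for a planar 2R arm: det J = l1 * l2 * sin(theta2),
    # so the arm is singular when fully stretched or folded (theta2 = 0, pi).
    import math

    def det_jacobian(l1, l2, theta2):
        """Determinant of the 2x2 velocity Jacobian of a planar 2R arm."""
        return l1 * l2 * math.sin(theta2)

    print(abs(det_jacobian(1.0, 1.0, 0.0)) < 1e-12)   # singular configuration
    print(det_jacobian(1.0, 1.0, math.pi / 2))        # regular configuration
    ```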

  8. Targeted drugs for pulmonary arterial hypertension: a network meta-analysis of 32 randomized clinical trials

    Directory of Open Access Journals (Sweden)

    Gao XF

    2017-05-01

    Full Text Available Xiao-Fei Gao,1 Jun-Jie Zhang,1,2 Xiao-Min Jiang,1 Zhen Ge,1,2 Zhi-Mei Wang,1 Bing Li,1 Wen-Xing Mao,1 Shao-Liang Chen1,2 1Department of Cardiology, Nanjing First Hospital, Nanjing Medical University, Nanjing, 2Department of Cardiology, Nanjing Heart Center, Nanjing, People’s Republic of China Background: Pulmonary arterial hypertension (PAH) is a devastating disease that ultimately leads to right heart failure and premature death. A total of four classes of targeted drugs, prostanoids, endothelin receptor antagonists (ERAs), phosphodiesterase 5 inhibitors (PDE-5Is), and soluble guanylate cyclase stimulators (sGCSs), have been proven to improve exercise capacity and hemodynamics compared to placebo; however, direct head-to-head comparisons of these drugs are lacking. This network meta-analysis was conducted to comprehensively compare the efficacy of these targeted drugs for PAH. Methods: Medline, the Cochrane Library, and other Internet sources were searched for randomized clinical trials exploring the efficacy of targeted drugs for patients with PAH. The primary efficacy end point of this network meta-analysis was the 6-minute walk distance (6MWD). Results: Thirty-two eligible trials including 6,758 patients were identified. There was a statistically significant improvement in 6MWD, mean pulmonary arterial pressure, pulmonary vascular resistance, and clinical worsening events associated with each of the four targeted drug classes compared with placebo. Combination therapy improved 6MWD by 20.94 m (95% confidence interval [CI]: 6.94, 34.94; P=0.003) vs prostanoids, and 16.94 m (95% CI: 4.41, 29.47; P=0.008) vs ERAs. PDE-5Is improved 6MWD by 17.28 m (95% CI: 1.91, 32.65; P=0.028) vs prostanoids, with a similar result for combination therapy. In addition, combination therapy reduced mean pulmonary artery pressure by 3.97 mmHg (95% CI: -6.06, -1.88; P<0.001) vs prostanoids, 8.24 mmHg (95% CI: -10.71, -5.76; P<0.001) vs ERAs, 3.38 mmHg (95% CI: -6.30, -0.47; P=0.023) vs

  9. Miniature near-infrared spectrometer for point-of-use chemical analysis

    Science.gov (United States)

    Friedrich, Donald M.; Hulse, Charles A.; von Gunten, Marc; Williamson, Eric P.; Pederson, Christopher G.; O'Brien, Nada A.

    2014-03-01

    Point-of-use chemical analysis holds tremendous promise for a number of industries, including agriculture, recycling, pharmaceuticals and homeland security. Near infrared (NIR) spectroscopy is an excellent candidate for these applications, with minimal sample preparation for real-time decision-making. We will detail the development of a golf ball-sized NIR spectrometer developed specifically for this purpose. The instrument is based upon a thin-film dispersive element that is very stable over time and temperature, with less than 2 nm change expected over the operating temperature range and lifetime of the instrument. This filter is coupled with an uncooled InGaAs detector array in a small, rugged, environmentally stable optical bench ideally suited to unpredictable environments. The resulting instrument weighs less than 60 grams, includes onboard illumination and collection optics for diffuse reflectance applications in the 900-1700 nm wavelength range, and is USB-powered. It can be driven in the field by a laptop, tablet or even a smartphone. The software design includes the potential for both on-board and cloud-based storage, analysis and decision-making. The key attributes of the instrument and the underlying design tradeoffs will be discussed, focusing on miniaturization, ruggedization, power consumption and cost. The optical performance of the instrument, as well as its fitness for purpose, will be detailed. Finally, we will show that our manufacturing process has enabled us to build instruments with excellent unit-to-unit reproducibility. We will show that this is a key enabler for instrument-independent chemical analysis models, a requirement for mass point-of-use deployment.

  10. Analysis of web-based online services for GPS relative and precise point positioning techniques

    Directory of Open Access Journals (Sweden)

    Taylan Ocalan

    Full Text Available Nowadays, the Global Positioning System (GPS) is used effectively in several engineering applications for surveying purposes by multiple disciplines. Web-based online services developed by several organizations, which are user-friendly, unrestricted and mostly free, have become a significant alternative to high-cost scientific and commercial software for post-processing and analyzing GPS data. Centimeter- (cm) or decimeter- (dm) level accuracies, when desired, can easily be obtained through these services for engineering applications of differing quality requirements. In this paper, a test study was conducted on the ISKI-CORS network (Istanbul, Turkey) in order to assess the accuracy of the most widely used web-based online services (namely OPUS, AUSPOS, SCOUT, CSRS-PPP, GAPS, APPS, magicGNSS). These services use relative and precise point positioning (PPP) solution approaches. In this test study, the coordinates of eight stations were estimated using both the online services and the Bernese 5.0 scientific GPS processing software from a 24-hour GPS data set, and the coordinate differences between the online services and the Bernese solution were computed. From the evaluations, it was seen that each individual difference was less than 10 mm for the relative online services and less than 20 mm for the precise point positioning services. The accuracy analysis was derived from these coordinate differences and the standard deviations of the coordinates obtained by the different techniques, and the online services were then compared to each other. The results show that the position accuracies obtained by the online services are high enough for use in many engineering applications and geodetic analyses.
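
    The evaluation above reduces to per-station coordinate differences between each online service and the Bernese solution, summarized by their standard deviation. A minimal sketch follows; the coordinate values (metres) are hypothetical, not the study's data.

    ```python
    # Per-station differences (in mm) and their sample standard deviation.
    import math

    def diff_stats_mm(service, bernese):
        """Differences service-minus-reference (mm) and their std dev (mm)."""
        d = [s - b for s, b in zip(service, bernese)]
        mean = sum(d) / len(d)
        sd = math.sqrt(sum((x - mean) ** 2 for x in d) / (len(d) - 1))
        return [round(x * 1000, 1) for x in d], round(sd * 1000, 1)

    opus_n = [4449672.012, 4449671.009, 4449670.502]  # made-up northings (m)
    bern_n = [4449672.005, 4449671.012, 4449670.498]  # made-up Bernese values
    diffs_mm, sd_mm = diff_stats_mm(opus_n, bern_n)
    print(diffs_mm, sd_mm)
    ```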

  11. Evaluating Google, Twitter, and Wikipedia as Tools for Influenza Surveillance Using Bayesian Change Point Analysis: A Comparative Analysis.

    Science.gov (United States)

    Sharpe, J Danielle; Hopkins, Richard S; Cook, Robert L; Striley, Catherine W

    2016-10-20

    Traditional influenza surveillance relies on influenza-like illness (ILI) syndrome that is reported by health care providers. It primarily captures individuals who seek medical care and misses those who do not. Recently, Web-based data sources have been studied for application to public health surveillance, as there is a growing number of people who search, post, and tweet about their illnesses before seeking medical care. Existing research has shown some promise of using data from Google, Twitter, and Wikipedia to complement traditional surveillance for ILI. However, past studies have evaluated these Web-based sources individually or in pairs without comparing all 3 of them, and it would be beneficial to know which of the Web-based sources performs best as a complement to traditional methods. The objective of this study is to comparatively analyze Google, Twitter, and Wikipedia by examining which best corresponds with Centers for Disease Control and Prevention (CDC) ILI data. It was hypothesized that Wikipedia would best correspond with CDC ILI data, as previous research found it to be least influenced by high media coverage in comparison with Google and Twitter. Publicly available, deidentified data were collected from the CDC, Google Flu Trends, HealthTweets, and Wikipedia for the 2012-2015 influenza seasons. Bayesian change point analysis was used to detect seasonal changes, or change points, in each of the data sources. Change points in Google, Twitter, and Wikipedia that occurred during the exact week, 1 preceding week, or 1 week after the CDC's change points were compared with the CDC data as the gold standard. All analyses were conducted using the R package "bcp" version 4.0.0 in RStudio version 0.99.484 (RStudio Inc). In addition, sensitivity and positive predictive values (PPV) were calculated for Google, Twitter, and Wikipedia. During the 2012-2015 influenza seasons, a high sensitivity of 92% was found for Google, whereas the PPV for
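
    The sensitivity/PPV step described above (a detected change point "corresponds" if it lands within one week of a CDC change point) can be sketched directly. The week numbers below are illustrative, not the study's actual change points.

    ```python
    # Match detected change points to reference (CDC) change points
    # within a +/-1 week tolerance, then compute sensitivity and PPV.
    def match_change_points(detected, reference, tolerance=1):
        true_pos = sum(
            any(abs(d - r) <= tolerance for r in reference) for d in detected
        )
        sensitivity = sum(
            any(abs(r - d) <= tolerance for d in detected) for r in reference
        ) / len(reference)
        ppv = true_pos / len(detected)
        return sensitivity, ppv

    cdc_weeks = [5, 18, 31, 44]         # hypothetical CDC change-point weeks
    google_weeks = [5, 19, 30, 44, 50]  # hypothetical Google change-point weeks
    sens, ppv = match_change_points(google_weeks, cdc_weeks)
    print(round(sens, 2), round(ppv, 2))
    ```

    The actual change-point detection in the study is Bayesian (the R package "bcp"); this sketch covers only the subsequent scoring step.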

  12. Kinetic analysis of the effects of target structure on siRNA efficiency

    Science.gov (United States)

    Chen, Jiawen; Zhang, Wenbing

    2012-12-01

    RNAi efficiency for target cleavage and protein expression is related to the target structure. Considering the RNA-induced silencing complex (RISC) as a multiple-turnover enzyme, we investigated the effect of target mRNA structure on siRNA efficiency with kinetic analysis. A 4-step model was used to study the target cleavage kinetic process: hybridization nucleation at an accessible target site, RISC-mRNA hybrid elongation along with melting of the mRNA target structure, target cleavage, and enzyme reactivation. In this model, terms accounting for target accessibility, stability, and the seed and nucleation site effects are all included. The results are in good agreement with those of experiments, which report differing conclusions about the effects of structure on siRNA efficiency. It is shown that siRNA efficiency is influenced by the combined factors of the target's accessibility, stability, and seed effects. To study off-target effects, a simple model of one siRNA binding to two mRNA targets was designed. Using this model, the possibility of diminishing off-target effects by adjusting the siRNA concentration is discussed.
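
    The multiple-turnover-enzyme picture above can be caricatured with a single Michaelis-Menten-style rate equation, where RISC is recycled after each cleavage. This is a deliberately reduced sketch, not the paper's full 4-step model, and every rate constant below is hypothetical.

    ```python
    # Euler integration of d[mRNA]/dt = -kcat*[RISC]*[mRNA]/(Km + [mRNA]),
    # a minimal multiple-turnover caricature of RISC-mediated cleavage.
    def simulate_cleavage(mrna0=100.0, risc=1.0, kcat=2.0, km=50.0,
                          dt=0.01, t_end=10.0):
        mrna, t = mrna0, 0.0
        while t < t_end:
            mrna -= dt * kcat * risc * mrna / (km + mrna)
            t += dt
        return mrna

    remaining = simulate_cleavage()
    print(remaining < 100.0)  # the target pool shrinks as RISC turns over
    ```

    The off-target model in the paper would add a second target species competing for the same RISC pool, coupling the two decay equations through the shared enzyme concentration.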

  13. Bioinformatic analysis to discover putative drug targets against ...

    African Journals Online (AJOL)


    2012-01-26

    Jan 26, 2012 ... GELBANK was available from the NCBI FTP server. This website incorporates only completed genomes and information pertinent to 2-DE; the link is available at www.gelbank.anl.gov. JVirGel is software for the simulation and analysis of proteomics data (http://www.jvirgel.de/).

  14. Meta-analysis of targeted small-group reading interventions.

    Science.gov (United States)

    Hall, Matthew S; Burns, Matthew K

    2018-02-01

    Small-group reading interventions are commonly used in schools, but the components that make them effective are still debated or unknown. The current study meta-analyzed 26 small-group reading intervention studies that yielded 27 effect sizes. Findings suggested a moderate overall effect for small-group reading interventions (weighted g=0.54). Interventions were more effective when targeted to a specific skill (g=0.65) than when delivered as part of a comprehensive intervention program that addressed multiple skills (g=0.35). There was a small correlation between intervention effects and group size (r=0.21) and duration (r=0.11). Small-group interventions led to a larger median effect size for elementary-aged students (g=0.64) than for those in middle or high school (g=0.20), but the two confidence intervals overlapped. Implications for research and practice are discussed. Copyright © 2017 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.
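
    A pooled "weighted g" like the 0.54 above is conventionally an inverse-variance weighted mean of the study effect sizes. The sketch below shows the fixed-effect version on made-up effect sizes and variances (the study itself may have used a random-effects model).

    ```python
    # Inverse-variance pooling of standardized mean differences (Hedges' g).
    import math

    def pooled_effect(effects, variances):
        """Fixed-effect pooled estimate and its standard error."""
        weights = [1.0 / v for v in variances]
        g = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
        se = math.sqrt(1.0 / sum(weights))
        return g, se

    g, se = pooled_effect([0.65, 0.35, 0.60], [0.02, 0.03, 0.05])
    print(round(g, 3), round(se, 3))
    ```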

  15. The chaotic points and XRD analysis of Hg-based superconductors

    Energy Technology Data Exchange (ETDEWEB)

    Aslan, Oe [Anatuerkler Educational Consultancy and Trading Company, Orhan Veli Kanik Cad., 6/1, Kavacik 34810 Beykoz, Istanbul (Turkey); Oezdemir, Z Gueven [Physics Department, Yildiz Technical University, Davutpasa Campus, Esenler 34210, Istanbul (Turkey); Keskin, S S [Department of Environmental Eng., University of Marmara, Ziverbey, 34722, Istanbul (Turkey); Onbasli, Ue, E-mail: ozdenaslan@yahoo.co [Physics Department, University of Marmara, Ridvan Pasa Cad. 3. Sok. 85/12 Goztepe, Istanbul (Turkey)

    2009-03-01

    In this article, high-T_c mercury-based cuprate superconductors with different oxygen doping rates have been examined by means of magnetic susceptibility (magnetization) versus temperature data and X-ray diffraction (XRD) pattern analysis. The under-, optimally, and over-oxygen-doped states have been identified from the magnetic susceptibility versus temperature data of the superconducting samples by extracting the Meissner critical transition temperature T_c and the paramagnetic Meissner temperature T_PME, referred to as the critical quantum chaos points. Moreover, the optimally oxygen-doped samples have been investigated under both a.c. and d.c. magnetic fields. The a.c. data for virgin (uncut) and cut samples with optimal doping were obtained under an a.c. magnetic field of 1 Gauss. For the cut sample with rectangular shape, the chaotic points were found to occur at 122 and 140 K, respectively. The Meissner critical temperature of 140 K is the new world record for high temperature oxide superconductors under normal atmospheric pressure. Moreover, the crystallographic lattice parameters of the superconducting samples, determined from the XRD patterns, are of crucial importance in calculating the Josephson penetration depth. From the XRD data obtained for under- and optimally doped samples, the crystal symmetries were found to be tetragonal.

  16. Testing to fulfill HACCP (Hazard Analysis Critical Control Points) requirements: principles and examples.

    Science.gov (United States)

    Gardner, I A

    1997-12-01

    On-farm HACCP (hazard analysis critical control points) monitoring requires cost-effective, yet accurate and reproducible tests that can determine the status of cows, milk, and the dairy environment. Tests need to be field-validated, and their limitations need to be established so that appropriate screening strategies can be initiated and test results can be rationally interpreted. For infections and residues of low prevalence, tests or testing strategies that are highly specific help to minimize false-positive results and excessive costs to the dairy industry. The determination of the numbers of samples to be tested in HACCP monitoring programs depends on the specific purpose of the test and the likely prevalence of the agent or residue at the critical control point. The absence of positive samples from a herd test should not be interpreted as freedom from a particular agent or residue unless the entire herd has been tested with a test that is 100% sensitive. The current lack of field-validated tests for most of the chemical and infectious agents of concern makes it difficult to ensure that the stated goals of HACCP programs are consistently achieved.
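
    The warning above, that a herd test with no positives does not establish freedom from an agent, is easy to quantify: with imperfect test sensitivity and low prevalence, the probability that every sampled animal tests negative in an infected herd can remain substantial. The prevalence, sensitivity, and sample size below are illustrative, not values from the article.

    ```python
    # P(no positive results | herd infected) = (1 - prevalence*Se)^n,
    # assuming independent sampling and perfect specificity.
    def prob_all_negative(n, prevalence, sensitivity):
        p_detect_one = prevalence * sensitivity
        return (1 - p_detect_one) ** n

    # 30 animals tested, 5% prevalence, 90% sensitive test:
    print(round(prob_all_negative(n=30, prevalence=0.05, sensitivity=0.9), 2))
    ```

    Even with 30 animals tested, an infected herd slips through undetected roughly a quarter of the time under these assumptions, which is why all-negative results must be interpreted against the test's sensitivity and the sampling fraction.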

  17. Controlling organic chemical hazards in food manufacturing: a hazard analysis critical control points (HACCP) approach.

    Science.gov (United States)

    Ropkins, K; Beck, A J

    2002-08-01

    Hazard analysis by critical control points (HACCP) is a systematic approach to the identification, assessment and control of hazards. Effective HACCP requires the consideration of all hazards, i.e., chemical, microbiological and physical. However, to date most 'in-place' HACCP procedures have tended to focus on the control of microbiological and physical food hazards. In general, the chemical component of HACCP procedures is either ignored or limited to applied chemicals, e.g., food additives and pesticides. In this paper we discuss the application of HACCP to a broader range of chemical hazards, using organic chemical contaminants as examples, and the problems that are likely to arise in the food manufacturing sector. Chemical HACCP procedures are likely to deliver many of the advantages previously identified for microbiological HACCP procedures: they are more effective, efficient and economical than conventional end-point-testing methods. However, the high costs of analytical monitoring of chemical contaminants and a limited understanding of formulation and process optimisation as means of controlling chemical contamination of foods are likely to prevent chemical HACCP from becoming as effective as microbiological HACCP.

  18. Analysis of tangent hyperbolic nanofluid impinging on a stretching cylinder near the stagnation point

    Directory of Open Access Journals (Sweden)

    T. Salahuddin

    Full Text Available An analysis is carried out to study the influence of heat generation/absorption on a tangent hyperbolic nanofluid near the stagnation point over a stretching cylinder. In this study, the developed model of a tangent hyperbolic nanofluid in boundary layer flow with Brownian motion and thermophoresis effects is discussed. The governing partial differential equations for continuity, momentum, temperature and concentration are transformed into ordinary differential form and then solved numerically using the shooting method. The results indicate that the addition of nanoparticles to the tangent hyperbolic fluid yields an increment in the skin friction coefficient and the heat transfer rate at the surface. Comparison of the present results with previously published literature shows good agreement. It is noticed that the velocity profile reduces with increasing Weissenberg number λ and power-law index n. The skin friction coefficient, local Nusselt number and local Sherwood number increase for large values of the stretching ratio parameter A. Keywords: Stagnation point flow, Tangent hyperbolic nanofluid, Stretching cylinder, Heat generation/absorption, Boundary layer, Shooting method
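
    The shooting method named above converts a boundary-value problem into an initial-value problem: guess the unknown initial slope, integrate to the far boundary, and adjust the guess until the boundary condition is met. The sketch below applies it to a toy problem (y'' = -y, y(0)=0, y(π/2)=1, whose exact unknown slope is y'(0)=1) rather than the nanofluid equations.

    ```python
    # Shooting method: RK4 integration + bisection on the initial slope.
    import math

    def integrate(slope, n=1000):
        """RK4 for y'' = -y on [0, pi/2] with y(0)=0, y'(0)=slope; returns y(pi/2)."""
        h = (math.pi / 2) / n
        y, v = 0.0, slope

        def f(y, v):  # first-order system: (y', v') = (v, -y)
            return v, -y

        for _ in range(n):
            k1y, k1v = f(y, v)
            k2y, k2v = f(y + h / 2 * k1y, v + h / 2 * k1v)
            k3y, k3v = f(y + h / 2 * k2y, v + h / 2 * k2v)
            k4y, k4v = f(y + h * k3y, v + h * k3v)
            y += h / 6 * (k1y + 2 * k2y + 2 * k3y + k4y)
            v += h / 6 * (k1v + 2 * k2v + 2 * k3v + k4v)
        return y

    lo, hi = 0.0, 2.0  # bracket for the unknown slope y'(0)
    for _ in range(60):
        mid = (lo + hi) / 2
        if integrate(mid) < 1.0:  # undershot the far boundary condition
            lo = mid
        else:
            hi = mid
    print(round(mid, 4))  # exact answer is y'(0) = 1
    ```

    For the coupled momentum/temperature/concentration system of the paper, the same idea applies with several unknown initial slopes adjusted simultaneously (e.g., by a Newton iteration instead of bisection).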

  19. An econometric analysis of the effects of the penalty points system driver's license in Spain.

    Science.gov (United States)

    Castillo-Manzano, José I; Castro-Nuño, Mercedes; Pedregal, Diego J

    2010-07-01

    This article seeks to quantify the effects of the penalty-points driver's license system during the 18-month period following its coming into force. This is achieved by means of univariate and multivariate unobserved-components models set up in a state space framework and estimated by maximum likelihood. A detailed intervention analysis is carried out in order to test for the effects, and their duration, of the introduction of the penalty-points driver's license system in Spain. Other variables, mainly indicators of the level of economic activity in Spain, are also considered. Among the main effects, we can mention an average reduction of almost 12.6% in the number of deaths in highway accidents. It would take at least 2 years for that effect to disappear. For the rest of the safety indicator variables (vehicle occupants injured in highway accidents and vehicle occupants injured in accidents in built-up areas), the effects disappeared 1 year after the law came into force. Copyright 2010 Elsevier Ltd. All rights reserved.

  20. Linear stability analysis of laminar flow near a stagnation point in the slip flow regime

    Science.gov (United States)

    Essaghir, E.; Oubarra, A.; Lahjomri, J.

    2017-12-01

    The aim of the present contribution is to analyze the effect of the slip parameter on the stability of a laminar incompressible flow near a stagnation point in the slip flow regime. The analysis is based on the traditional normal mode approach and assumes the parallel flow approximation. The Orr-Sommerfeld equation, which governs the infinitesimal disturbance of the stream function imposed on the steady main flow (an exact solution of the Navier-Stokes equations satisfying slip boundary conditions), is solved by using the powerful spectral Chebyshev collocation method. The results of the effect of the slip parameter K on the hydrodynamic characteristics of the base flow, namely the velocity profile, the shear stress profile, the boundary layer, and the displacement and momentum thicknesses, are illustrated and discussed. The numerical data for these characteristics, as well as those of the eigenvalues and the corresponding wave numbers, recover the results of the special case of no-slip boundary conditions. They are found to be in good agreement with previous numerical calculations. The effects of the slip parameter on the neutral curves of stability, for two-dimensional disturbances in the Reynolds-wave number plane, are then obtained for the first time in the slip flow regime for stagnation point flow. Furthermore, the evolution of the critical Reynolds number against the slip parameter is established. The results show that the critical Reynolds number for instability is significantly increased with the slip parameter, and the flow turns out to be more stable when the effect of rarefaction becomes important.
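
    The Chebyshev collocation method mentioned above rests on a differentiation matrix built on Chebyshev points; applying it to sampled function values yields spectrally accurate derivatives, and eigenvalue problems like Orr-Sommerfeld are then posed on those matrices. The sketch below follows the standard construction and simply differentiates x² as a sanity check; it does not set up the Orr-Sommerfeld operator itself.

    ```python
    # Standard Chebyshev points and first-derivative matrix on [-1, 1].
    import math

    def cheb(N):
        x = [math.cos(math.pi * j / N) for j in range(N + 1)]
        c = [2.0 if j in (0, N) else 1.0 for j in range(N + 1)]
        D = [[0.0] * (N + 1) for _ in range(N + 1)]
        for i in range(N + 1):
            for j in range(N + 1):
                if i != j:
                    D[i][j] = (c[i] / c[j]) * (-1) ** (i + j) / (x[i] - x[j])
            # diagonal from the negative row-sum identity (exact for constants)
            D[i][i] = -sum(D[i][j] for j in range(N + 1) if j != i)
        return x, D

    x, D = cheb(8)
    u = [xi ** 2 for xi in x]
    du = [sum(D[i][j] * u[j] for j in range(len(u))) for i in range(len(u))]
    # du approximates 2*x at every collocation point (exact for polynomials)
    ```

    Higher derivatives are obtained by repeated application of D (e.g., D² and D⁴ for the Orr-Sommerfeld operator), with boundary conditions imposed on the edge rows.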

  1. Implementation of hazard analysis and critical control point (HACCP) in dried anchovy production process

    Science.gov (United States)

    Citraresmi, A. D. P.; Wahyuni, E. E.

    2018-03-01

    The aim of this study was to examine the implementation of Hazard Analysis and Critical Control Point (HACCP) for the identification and prevention of potential hazards in the production process of dried anchovy at PT. Kelola Mina Laut (KML), Lobuk unit, Sumenep. Cold storage is needed in each anchovy processing step in order to maintain the product's physical and chemical condition. In addition, a quality assurance system should be implemented to maintain product quality. The research was conducted using a survey method, following the whole anchovy production process from the receipt of raw materials to the packaging of the final product. Descriptive analysis was used for data analysis. Implementation of HACCP at PT. KML, Lobuk unit, Sumenep was conducted by applying Pre-Requisite Programs (PRP) and a preparation stage consisting of the 5 initial stages and 7 principles of HACCP. The results showed that CCPs were found in the boiling step, with the significant hazard of Listeria monocytogenes, and in the final sorting step, with the significant hazard of foreign material contamination in the product. The actions taken were controlling the boiling temperature at 100-105°C for 3-5 minutes and training the sorting process employees.

  2. Multivariate analysis and extraction of parameters in resistive RAMs using the Quantum Point Contact model

    Science.gov (United States)

    Roldán, J. B.; Miranda, E.; González-Cordero, G.; García-Fernández, P.; Romero-Zaliz, R.; González-Rodelas, P.; Aguilera, A. M.; González, M. B.; Jiménez-Molinos, F.

    2018-01-01

    A multivariate analysis of the parameters that characterize the reset process in Resistive Random Access Memory (RRAM) has been performed. The different correlations obtained can help to shed light on the current components that contribute to the Low Resistance State (LRS) of the technology considered. In addition, a screening method for the Quantum Point Contact (QPC) current component is presented. For this purpose, the second derivative of the current has been obtained using a novel numerical method which allows the QPC model parameters to be determined. Once the procedure is completed, a whole Resistive Switching (RS) series of thousands of curves is studied by means of a genetic algorithm. The extracted QPC parameter distributions are characterized in depth to obtain information about the filamentary pathways associated with the LRS in the low-voltage conduction regime.
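
    The "second derivative of the current" step above can be illustrated with a plain central-difference formula on a smooth toy I-V curve; the paper uses a more elaborate numerical method on measured, noisy data, so everything below is a simplification.

    ```python
    # Central-difference estimate of f''(v); exact up to rounding for cubics.
    def second_derivative(f, v, h=1e-3):
        return (f(v + h) - 2.0 * f(v) + f(v - h)) / (h * h)

    current = lambda v: v ** 3  # toy I(V) curve; analytically I''(V) = 6V
    print(round(second_derivative(current, 0.5), 4))
    ```

    On real measured curves, naive finite differences amplify noise, which is presumably why the authors needed a dedicated numerical method before fitting the QPC parameters.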

  3. Indian Point Nuclear Power Station: verification analysis of County Radiological Emergency-Response Plans

    International Nuclear Information System (INIS)

    Nagle, J.; Whitfield, R.

    1983-05-01

    This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations

  4. Comparative rainfall data analysis from two vertically pointing radars, an optical disdrometer, and a rain gauge

    Directory of Open Access Journals (Sweden)

    E. I. Nikolopoulos

    2008-12-01

    Full Text Available The authors present results of a comparative analysis of rainfall data from several ground-based instruments. The instruments include two vertically pointing Doppler radars, S-band and X-band, an optical disdrometer, and a tipping-bucket rain gauge. All instruments were collocated at the Iowa City Municipal Airport in Iowa City, Iowa, for a period of several months. The authors used the rainfall data derived from the four instruments to first study the temporal variability and scaling characteristics of rainfall and subsequently assess the instrumental effects on these derived properties. The results revealed obvious correspondence between the ground and remote sensors, which indicates the significance of the instrumental effect on the derived properties.

  5. Ergodic Capacity Analysis of Free-Space Optical Links with Nonzero Boresight Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique

    2015-04-01

    A unified capacity analysis of a free-space optical (FSO) link that accounts for nonzero boresight pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection as well as heterodyne detection) is addressed in this work. More specifically, an exact closed-form expression for the moments of the end-to-end signal-to-noise ratio (SNR) of a single link FSO transmission system is presented in terms of well-known elementary functions. Capitalizing on these new moment expressions, we present approximate and simple closed-form results for the ergodic capacity at high and low SNR regimes. All the presented results are verified via computer-based Monte-Carlo simulations.

  6. Simplified Probabilistic Analysis of Settlement of Cyclically Loaded Soil Stratum by Point Estimate Method

    Science.gov (United States)

    Przewłócki, Jarosław; Górski, Jarosław; Świdziński, Waldemar

    2016-12-01

    The paper deals with the probabilistic analysis of the settlement of a non-cohesive soil layer subjected to cyclic loading. Originally, the settlement assessment is based on a deterministic compaction model, which requires integration of a set of differential equations. However, with the use of the Bessel functions, the settlement of a soil stratum can be calculated by a simplified algorithm. The compaction model parameters were determined for soil samples taken from subsoil near the Izmit Bay, Turkey. The computations were performed for various sets of random variables. The point estimate method was applied, and the results were verified by the Monte Carlo method. The outcome leads to a conclusion that can be useful in the prediction of soil settlement under seismic loading.
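
The point estimate method mentioned above can be illustrated with Rosenblueth's classic two-point scheme: the response function is evaluated at every mean-plus/minus-one-standard-deviation combination of the (assumed uncorrelated, symmetric) random inputs, and the evaluations are averaged to approximate the output's mean and variance. A minimal sketch; the bilinear toy function is illustrative, not the paper's compaction model:

```python
from itertools import product
from statistics import mean

def two_point_estimate(f, means, stds):
    """Rosenblueth's two-point estimate method for uncorrelated,
    symmetrically distributed inputs: evaluate f at every
    mean +/- one-standard-deviation corner and average."""
    evaluations = [f(*[m + s * sign for m, s, sign in zip(means, stds, signs)])
                   for signs in product((-1.0, 1.0), repeat=len(means))]
    m1 = mean(evaluations)                 # estimated E[f]
    m2 = mean(v * v for v in evaluations)  # estimated E[f^2]
    return m1, m2 - m1 * m1                # mean and variance of the output

# Toy response f(a, b) = a*b with two independent random parameters
mu, var = two_point_estimate(lambda a, b: a * b, means=[2.0, 3.0], stds=[0.2, 0.3])
```

For this bilinear toy case the estimates are exact (mean 6.0, variance 0.7236), which is why PEM is attractive as a cheap alternative to Monte Carlo when the response is smooth.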

  7. Performance Analysis of a Maximum Power Point Tracking Technique using Silver Mean Method

    Directory of Open Access Journals (Sweden)

    Shobha Rani Depuru

    2018-01-01

    Full Text Available The proposed paper presents a simple and particularly efficacious Maximum Power Point Tracking (MPPT) algorithm based on the Silver Mean Method (SMM). This method operates by choosing a search interval from the P-V characteristics of the given solar array and converges to the MPP of the Solar Photo-Voltaic (SPV) system by shrinking this interval. After achieving the maximum power, the algorithm stops shrinking and maintains a constant voltage until the next interval is decided. The tracking capability, efficiency and performance of the proposed algorithm are validated by simulation and experimental results with a 100 W solar panel under variable temperature and irradiance conditions. The results obtained confirm that, even without any perturbation and observation process, the proposed method still outperforms the traditional perturb and observe (P&O) method by demonstrating far better steady-state output, more accuracy and higher efficiency.
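
The interval-shrinking idea can be sketched as a golden-section-style search over a unimodal P-V curve. Here the shrink ratio defaults to the silver ratio 1 + √2; the paper's exact SMM update rule and the synthetic P-V curve below are assumptions for illustration only:

```python
import math

def mpp_search(power, v_lo, v_hi, tol=1e-3, ratio=1 + math.sqrt(2)):
    """Shrink a bracketing voltage interval around the maximum of a
    unimodal P-V curve; each iteration discards the sub-interval that
    cannot contain the maximum power point."""
    while v_hi - v_lo > tol:
        d = (v_hi - v_lo) / ratio
        v1, v2 = v_lo + d, v_hi - d       # interior probe points (v1 < v2)
        if power(v1) < power(v2):
            v_lo = v1                     # MPP cannot lie in [v_lo, v1]
        else:
            v_hi = v2                     # MPP cannot lie in [v2, v_hi]
    return 0.5 * (v_lo + v_hi)

# Synthetic P-V curve of a small panel (illustrative), MPP near 13 V
pv = lambda v: v * max(0.0, 5.0 - 0.004 * math.exp(v / 2.5))
v_mpp = mpp_search(pv, 0.0, 22.0)
```

Once the interval is smaller than the tolerance, a real controller would hold the converged voltage constant, matching the "stop shrinking" behaviour described in the abstract.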

  8. A Deep Learning Prediction Model Based on Extreme-Point Symmetric Mode Decomposition and Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Guohui Li

    2017-01-01

    Full Text Available Aiming at the irregularity of nonlinear signals and the difficulty of predicting them, a deep learning prediction model based on extreme-point symmetric mode decomposition (ESMD) and cluster analysis is proposed. Firstly, the original data is decomposed by ESMD to obtain a finite number of intrinsic mode functions (IMFs) and a residual. Secondly, fuzzy c-means is used to cluster the decomposed components, and a deep belief network (DBN) is then used to predict each cluster. Finally, the predicted IMFs and residual are reconstructed to give the final prediction result. Six prediction models are compared: the DBN, EMD-DBN, EEMD-DBN, CEEMD-DBN and ESMD-DBN models, and the model proposed in this paper. The same sunspot time series is predicted with all six models. The experimental results show that the proposed model has better prediction accuracy and smaller error.

  9. Readiness to implement Hazard Analysis and Critical Control Point (HACCP) systems in Iowa schools.

    Science.gov (United States)

    Henroid, Daniel; Sneed, Jeannie

    2004-02-01

    To evaluate current food-handling practices, food safety prerequisite programs, and employee knowledge and food safety attitudes and provide baseline data for implementing Hazard Analysis and Critical Control Point (HACCP) systems in school foodservice. One member of the research team visited each school to observe food-handling practices and assess prerequisite programs using a structured observation form. A questionnaire was used to determine employees' attitudes, knowledge, and demographic information. A convenience sample of 40 Iowa schools was recruited with input from the Iowa Department of Education. Descriptive statistics were used to summarize data. One-way analysis of variance was used to assess differences in attitudes and food safety knowledge among managers, cooks, and other foodservice employees. Multiple linear regression assessed the relationship between manager and school district demographics and the food safety practice score. Proper food-handling practices were not being followed in many schools and prerequisite food safety programs for HACCP were found to be inadequate for many school foodservice operations. School foodservice employees were found to have a significant amount of food safety knowledge (15.9±2.4 out of 20 possible points). School districts with managers (P=.019) and employees (P=.030) who had a food handler certificate were found to have higher food safety practice scores. Emphasis on implementing prerequisite programs in preparation for HACCP is needed in school foodservice. Training programs, both basic food safety such as ServSafe and HACCP, will support improvement of food-handling practices and implementation of prerequisite programs and HACCP.

  10. Analysis of temperature data over semi-arid Botswana: trends and break points

    Science.gov (United States)

    Mphale, Kgakgamatso; Adedoyin, Akintayo; Nkoni, Godiraone; Ramaphane, Galebonwe; Wiston, Modise; Chimidza, Oyapo

    2017-06-01

    Climate change is a global challenge which impacts negatively on sustainable rural livelihoods, public health and economic development, especially for communities in Southern Africa. Assessment of indices that signify climate change can inform the formulation of relevant adaptation strategies and policies for these communities. Diurnal temperature range (DTR) is acknowledged as an expedient measure of the scourge, as it is sensitive to variations in the radiative energy balance. In this study, long-term (1961-2010) daily temperature data obtained from nine (9) synoptic stations in Botswana were analyzed for monotonic trends and epochal changes in annual maximum (Tmax) and minimum (Tmin) temperatures and in the DTR time series. Most of the stations considered were along the Kalahari Transect, a region at high risk of extensive environmental change due to climate change. The Mann-Kendall and Lepage tests were applied for trend and change-point analysis, respectively. The statistical analysis shows that stations in the southern part of the country experienced significant negative trends in annual DTR, at rates of -0.09 to -0.30 °C per decade, because annual Tmin warmed more steeply than annual Tmax. By contrast, stations in the northern part of the country experienced positive trends in annual DTR, brought about either by annual Tmin decreasing faster than annual Tmax or by annual Tmax increasing faster than annual Tmin. The increasing trends in DTR varied from 0.25 to 0.67 °C per decade. For most of the stations, the most significant change point in the annual DTR trend was in 1982, which coincided with a reversal of atmospheric circulation patterns.
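
The Mann-Kendall test applied in such trend studies is easy to reproduce: count the sign of every pairwise difference to get the S statistic, then standardize with the large-sample variance. A minimal sketch, with no tie or autocorrelation correction, and a short made-up warming series:

```python
import math

def mann_kendall(x):
    """Mann-Kendall monotonic trend test: S statistic and the
    normal-approximation Z score (continuity-corrected, no ties)."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# Illustrative annual Tmin-like series (not the study's data)
s, z = mann_kendall([10.1, 10.4, 10.2, 10.8, 11.0, 10.9, 11.3, 11.6])
```

A |Z| above about 1.96 would flag a significant monotonic trend at the 5% level; operational analyses add tie corrections and Sen's slope for the trend magnitude.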

  11. Emergency medical technician-performed point-of-care blood analysis using the capillary blood obtained from skin puncture.

    Science.gov (United States)

    Kim, Changsun; Kim, Hansol

    2017-12-09

    Comparing a point-of-care (POC) test using the capillary blood obtained from skin puncture with conventional laboratory tests. In this study, which was conducted at the emergency department of a tertiary care hospital in April-July 2017, 232 patients were enrolled, and three types of blood samples (capillary blood from skin puncture, arterial and venous blood from blood vessel puncture) were simultaneously collected. Each blood sample was analyzed using a POC analyzer (epoc® system, USA), an arterial blood gas analyzer (pHOx®Ultra, Nova biomedical, USA) and venous blood analyzers (AU5800, DxH2401, Beckman Coulter, USA). Twelve parameters were compared between the epoc and reference analyzers, with an equivalence test, Bland-Altman plot analysis and linear regression employed to show the agreement or correlation between the two methods. The pH, HCO3, Ca2+, Na+, K+, Cl-, glucose, Hb and Hct measured by the epoc were equivalent to the reference values (95% confidence interval of mean difference within the range of the agreement target) with clinically inconsequential mean differences and narrow limits of agreement. All of them, except pH, had clinically acceptable agreement between the two methods (results within target value ≥80%). Of the remaining three parameters (pCO2, pO2 and lactate), the epoc pCO2 and lactate values were highly correlated with the reference device values, whereas pO2 was not (pCO2: R² = 0.824, y = -1.411 + 0.877·x; lactate: R² = 0.902, y = -0.544 + 0.966·x; pO2: R² = 0.037, y = 61.6 + 0.431·x). Most parameters, except only pO2, measured by the epoc were equivalent to or correlated with those from the reference method. Copyright © 2017 Elsevier Inc. All rights reserved.
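
The Bland-Altman analysis used to judge agreement in studies like this reduces to a bias (mean paired difference) and 95% limits of agreement (bias ± 1.96 SD of the differences). A sketch with made-up paired pH readings, not the study's data:

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Bland-Altman agreement between two measurement methods:
    returns the bias (mean difference) and the 95% limits of
    agreement (bias +/- 1.96 * SD of the paired differences)."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative paired pH readings from a POC and a reference analyzer
poc = [7.38, 7.41, 7.35, 7.44, 7.40, 7.37]
ref = [7.37, 7.42, 7.36, 7.43, 7.41, 7.36]
bias, (lo, hi) = bland_altman(poc, ref)
```

If the limits of agreement fall inside a clinically chosen target range, the two methods are judged interchangeable for that analyte, which is the criterion the abstract applies analyte by analyte.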

  12. Integrative Analysis of CRISPR/Cas9 Target Sites in the Human HBB Gene

    Directory of Open Access Journals (Sweden)

    Yumei Luo

    2015-01-01

    Full Text Available Recently, the clustered regularly interspaced short palindromic repeats (CRISPR) system has emerged as a powerful customizable artificial nuclease to facilitate precise genetic correction for tissue regeneration and isogenic disease modeling. However, previous studies have reported substantial off-target activity of the CRISPR system in human cells, and the enormous number of putative off-target sites is labor-intensive to validate experimentally, motivating bioinformatics methods for the rational design of CRISPR systems and the prediction of their potential off-target effects. Here, we describe an integrative analytical process to identify specific CRISPR target sites in the human β-globin gene (HBB) and predict their off-target effects. Our method includes off-target analysis in both coding and noncoding regions, which was neglected by previous studies. It was found that CRISPR target sites in the introns have fewer off-target sites in the coding regions than those in the exons. Remarkably, target sites containing a given transcription factor motif have binding sites of that transcription factor enriched in their off-target sets. We also found that the intron sites have fewer SNPs, which leads to less variation of CRISPR efficiency across individuals in clinical applications. Our studies provide a standard analytical procedure for selecting specific CRISPR targets for genetic correction.
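
A core step in such pipelines is scanning a genome for guide-like sequences followed by an NGG PAM within a mismatch budget. The sketch below uses a fabricated guide and mini-"genome" for illustration; real tools also scan the reverse strand and weight mismatch positions by their distance from the PAM:

```python
def find_off_targets(genome, guide, max_mismatches=3):
    """Report (position, mismatches) for every guide-length window on
    the + strand that is followed by an NGG PAM and differs from the
    guide in at most max_mismatches bases."""
    k = len(guide)
    hits = []
    for i in range(len(genome) - k - 2):
        if genome[i + k + 1: i + k + 3] != "GG":      # require NGG PAM
            continue
        mm = sum(a != b for a, b in zip(genome[i:i + k], guide))
        if mm <= max_mismatches:
            hits.append((i, mm))
    return hits

guide = "GTAACGGCAGACTTCTCCTC"                  # fabricated 20-nt guide
on_target = guide + "TGG"                       # perfect site + PAM
off_target = "ATAACAGCAGACTTCTCCTC" + "AGG"     # two mismatches + PAM
genome = "AT" + on_target + "CC" + off_target + "AA"
hits = find_off_targets(genome, guide)          # [(2, 0), (27, 2)]
```

Filtering the reported positions by genomic annotation (exon, intron, regulatory motif) is then what separates the intron-versus-exon comparison described in the abstract.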

  13. Error analysis of dimensionless scaling experiments with multiple points using linear regression

    International Nuclear Information System (INIS)

    Guercan, Oe.D.; Vermare, L.; Hennequin, P.; Bourdelle, C.

    2010-01-01

    A general method of error estimation in the case of multiple point dimensionless scaling experiments, using linear regression and standard error propagation, is proposed. The method reduces to the previous result of Cordey (2009 Nucl. Fusion 49 052001) in the case of a two-point scan. On the other hand, if the points follow a linear trend, it explains how the estimated error decreases as more points are added to the scan. Based on the analytical expression that is derived, it is argued that for a low number of points, adding points to the ends of the scanned range, rather than the middle, results in a smaller error estimate. (letter)
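
The underlying effect is standard least-squares error propagation: the slope's standard error is sqrt(s²/Sxx), so with fixed noise it shrinks fastest when added points enlarge the spread in the scanned variable. A sketch comparing an end-loaded scan with a middle-loaded one (the numbers are invented):

```python
import math

def slope_stderr(xs, ys):
    """Least-squares slope and its standard error, sqrt(s^2 / Sxx);
    larger spread in x (bigger Sxx) means a smaller error."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
    a = ybar - b * xbar
    resid_var = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)
    return b, math.sqrt(resid_var / sxx)

# Same number of points, same noise level: points at the ends of the
# scanned range constrain the slope more than points bunched mid-range.
b_ends, se_ends = slope_stderr([0, 0, 1, 9, 10, 10], [0.1, -0.1, 1.0, 9.0, 10.1, 9.9])
b_mid,  se_mid  = slope_stderr([0, 4, 5, 5, 6, 10], [0.1, 4.0, 5.1, 4.9, 6.0, 9.9])
```

The end-loaded design gives the smaller slope error, which is the letter's point about where to add points in a low-count scan.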

  14. Temperature calibration procedure for thin film substrates for thermo-ellipsometric analysis using melting point standards

    Energy Technology Data Exchange (ETDEWEB)

    Kappert, Emiel J.; Raaijmakers, Michiel J.T.; Ogieglo, Wojciech; Nijmeijer, Arian; Huiskes, Cindy; Benes, Nieck E., E-mail: n.e.benes@utwente.nl

    2015-02-10

    Highlights: • Facile temperature calibration method for thermo-ellipsometric analysis. • The melting point of thin films of indium, lead, zinc, and water can be detected by ellipsometry. • In-situ calibration of ellipsometry hot stage, without using any external equipment. • High-accuracy temperature calibration (±1.3 °C). - Abstract: Precise and accurate temperature control is pertinent to studying thermally activated processes in thin films. Here, we present a calibration method for the substrate–film interface temperature using spectroscopic ellipsometry. The method is adapted from temperature calibration methods that are well developed for thermogravimetric analysis and differential scanning calorimetry instruments, and is based on probing a transition temperature. Indium, lead, and zinc could be spread on a substrate, and the phase transition of these metals could be detected by a change in the Ψ signal of the ellipsometer. For water, the phase transition could be detected by a loss of signal intensity as a result of light scattering by the ice crystals. The combined approach allowed for construction of a linear calibration curve with an accuracy of 1.3 °C or lower over the full temperature range.

  15. Combined impedance and dielectrophoresis portable device for point-of-care analysis

    Science.gov (United States)

    del Moral Zamora, B.; Colomer-Farrarons, J.; Mir-Llorente, M.; Homs-Corbera, A.; Miribel-Català, P.; Samitier-Martí, J.

    2011-05-01

    In the 1990s, efforts arose in the scientific world to automate and integrate one or several laboratory applications into tiny devices by using microfluidic principles and fabrication technologies drawn mainly from the microelectronics field. This proved to be a valid approach to obtain better reaction efficiency, shorter analysis times, and lower reagent consumption than existing analytical techniques. Traditionally, these fluidic microsystems able to perform laboratory assays are known as Lab-On-a-Chip (LOC) devices. The capability to transport cells, bacteria or biomolecules in an aqueous medium has significant potential for these microdevices, also known as micro-Total-Analysis Systems (µTAS) when their application is of an analytical nature. In particular, the technique of dielectrophoresis (DEP) opened the possibility to manipulate, actuate or transport such biological particles, with great potential in medical diagnostics, environmental control or food processing. This technique consists of applying an amplitude- and frequency-controlled AC signal to a given microsystem in order to manipulate or sort cells. Furthermore, the combination of this technique with electrical impedance measurements, at a single or multiple frequencies, is of great importance for achieving novel, reliable diagnostic devices, because the sorting and manipulation mechanism can easily be combined with a fully characterizing method able to discriminate cells. The paper focuses on the electronic design of the quadrature DEP generator and the four-electrode impedance measurement modules. These, together with the lab-on-a-chip device, define the full conception of an envisaged Point-of-Care (POC) device.

  16. Diagnosis of inverter switch open circuit faults based on neutral point voltage signal analysis

    Directory of Open Access Journals (Sweden)

    Liwei GUO

    Full Text Available Using the current signal to diagnose inverter faults is apt to be affected by the load, noise and other factors; moreover, it requires a long diagnosis period with special algorithms, and the diagnosis result is easily incorrect under no-load or light-load conditions. Focusing on this issue, a logical analysis method is proposed that performs correlation logic analysis of the leg neutral-point voltage and the pulse signals to diagnose open-circuit faults of inverter switches. Logical expressions for the output signals under inverter power tube open-circuit faults are put forward, and the related hardware circuit design is elaborated. Delaying the rising edge of the inverter power tube's pulse signal can effectively avoid diagnosis errors caused by the power tube switching on and off. The experimental results show that the method can effectively diagnose, in real time, open-circuit faults of a single-phase single-power-tube inverter at low hardware cost, which shows it is effective and feasible.

  17. Biospectral analysis of the bladder channel point in chronic low back pain patients

    Science.gov (United States)

    Vidal, Alberto Espinosa; Nava, Juan José Godina; Segura, Miguel Ángel Rodriguez; Bastida, Albino Villegas

    2012-10-01

    Chronic pain is the main cause of disability among people of productive age and is a public health problem that affects both the patient and society. On the other hand, there is no instrument to measure it; pain is estimated only through subjective variables. A biospectral analysis of the bladder channel point is proposed as a diagnostic method for chronic low back pain patients. Materials and methods: We employed a study group of chronic low back pain patients and a control group of subjects without low back pain. The visual analog scale (VAS) was applied to determine the level of pain. Bioelectric variables were measured for 10 seconds and the respective biostatistical analyses were made. Results: Biospectral analysis in the frequency domain shows a depression in the 60-300 Hz frequency range proportional to the chronicity of low back pain, compared against healthy subjects.

  18. Point Cloud Generation from sUAS-Mounted iPhone Imagery: Performance Analysis

    Science.gov (United States)

    Ladai, A. D.; Miller, J.

    2014-11-01

    The rapidly growing use of sUAS technology and fast sensor developments continuously inspire mapping professionals to experiment with low-cost airborne systems. Smartphones have all the sensors used in modern airborne surveying systems, including GPS, IMU, camera, etc. Of course, the performance level of these sensors differs by orders of magnitude from that of dedicated instruments, yet it is intriguing to assess the potential of using inexpensive sensors installed on sUAS systems for topographic applications. This paper focuses on the quality analysis of point clouds generated from overlapping images acquired by an iPhone 5s mounted on a sUAS platform. To support the investigation, test data was acquired over an area with complex topography and varying vegetation. In addition, extensive ground control, including GCPs and transects, was collected with GPS and traditional geodetic surveying methods. After a successful data collection, the main question is always the reliability and accuracy of the georeferenced data; the statistical and visual analysis is therefore based on a comparison of the sUAS data and the reference dataset. The results and their evaluation provide a realistic measure of data acquisition system performance. The paper also gives a recommendation for a data processing workflow to achieve the best quality of the final products: the digital terrain model and orthophoto mosaic.

  19. Are your hands clean enough for point-of-care electrolyte analysis?

    Science.gov (United States)

    Lam, Hugh S; Chan, Michael H M; Ng, Pak C; Wong, William; Cheung, Robert C K; So, Alan K W; Fok, Tai F; Lam, Christopher W K

    2005-08-01

    To investigate clinically significant analytical interference in point-of-care electrolyte analysis caused by contamination of blood specimens with hand disinfectant. Six different hand hygiene products were added separately to heparinised blood samples in varying amounts as contaminant. The contaminated samples were analysed by three different blood gas and electrolyte analysers for assessing interference on measured whole blood sodium and potassium concentrations. There were significant analytical interferences caused by hand hygiene product contamination that varied depending on the combination of disinfectant and analyser. Small amounts of Microshield Antibacterial Hand Gel contamination caused large increases in measured sodium concentration. This effect was much greater than with the other five products tested and started to occur at much lower levels of contamination. There was a trend towards lower sodium results in blood samples contaminated with Hexol Antiseptic Lotion (Hexol), the hand hygiene product that we used initially. Apart from AiE Hand Sanitizer, all the other hand disinfectants, especially Hexol, significantly elevated the measured potassium concentration, particularly when a direct ion-selective electrode method was used for measurement. Hand disinfectant products can significantly interfere with blood electrolyte analysis. Proper precautions must be taken against contamination since the resultant errors can adversely affect the clinical management of patients.

  20. Challenges of teacher-based clinical evaluation from nursing students' point of view: Qualitative content analysis.

    Science.gov (United States)

    Sadeghi, Tabandeh; Seyed Bagheri, Seyed Hamid

    2017-01-01

    Clinical evaluation is very important in the educational system of nursing. One of the most common methods of clinical evaluation is evaluation by the teacher, but the challenges that students face in this evaluation method have not been examined. Thus, this study aimed to explore the experiences and views of nursing students about the challenges of teacher-based clinical evaluation. This study was a descriptive qualitative study with a qualitative content analysis approach. Data were gathered through semi-structured focus group sessions with undergraduate nursing students who were in their 8th semester at Rafsanjan University of Medical Sciences. Data were analyzed using Graneheim and Lundman's proposed method. Data collection and analysis were concurrent. According to the findings, "factitious evaluation" was the main theme of the study, consisting of three categories: "personal preferences," "unfairness" and "shirking responsibility." These categories are explained using quotes derived from the data. According to the results of this study, teacher-based clinical evaluation leads to factitious evaluation. Thus, a shift from this approach toward modern methods of evaluation is suggested. The findings can help nursing instructors gain a better understanding of the nursing students' point of view on this evaluation approach and, as a result, plan for changing it.

  1. Parameter uncertainty analysis of non-point source pollution from different land use types.

    Science.gov (United States)

    Shen, Zhen-yao; Hong, Qian; Yu, Hong; Niu, Jun-feng

    2010-03-15

    Land use type is one of the most important factors that affect the uncertainty in non-point source (NPS) pollution simulation. In this study, seventeen sensitive parameters were screened from the Soil and Water Assessment Tool (SWAT) model for parameter uncertainty analysis for different land use types in the Daning River Watershed of the Three Gorges Reservoir area, China. First-Order Error Analysis (FOEA) method was adopted to analyze the effect of parameter uncertainty on model outputs under three types of land use, namely, plantation, forest and grassland. The model outputs selected in this study consisted of runoff, sediment yield, organic nitrogen (N), and total phosphorus (TP). The results indicated that the uncertainty conferred by the parameters differed among the three land use types. In forest and grassland, the parameter uncertainty in NPS pollution was primarily associated with runoff processes, but in plantation, the main uncertain parameters were related to runoff process and soil properties. Taken together, the study suggested that adjusting the structure of land use and controlling fertilizer use are helpful methods to control the NPS pollution in the Daning River Watershed.
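
First-Order Error Analysis propagates parameter variances through local sensitivities: Var(y) ≈ Σ (∂y/∂x_i)² Var(x_i) for independent parameters. A sketch with a toy SCS curve-number runoff formula standing in for SWAT; the parameter values and variances are invented:

```python
def foea_variance(grad, variances):
    """First-order (Taylor) propagation for independent parameters:
    Var(y) ~= sum_i (dy/dx_i)^2 * Var(x_i)."""
    return sum(g * g * v for g, v in zip(grad, variances))

def numeric_grad(f, x, h=1e-6):
    """Central-difference sensitivities dy/dx_i at the point x."""
    return [(f(x[:i] + [xi + h] + x[i+1:]) - f(x[:i] + [xi - h] + x[i+1:])) / (2 * h)
            for i, xi in enumerate(x)]

def runoff(params):
    """Toy SCS curve-number runoff (mm), a stand-in for a SWAT output."""
    cn, p = params
    s = 25400.0 / cn - 254.0            # potential retention from curve number
    ia = 0.2 * s                        # initial abstraction
    return (p - ia) ** 2 / (p + 0.8 * s) if p > ia else 0.0

x0 = [70.0, 80.0]                       # curve number, storm depth (mm)
var_q = foea_variance(numeric_grad(runoff, x0), [4.0, 25.0])
```

Comparing the individual (∂y/∂x_i)² Var(x_i) terms is what ranks parameters by their contribution to output uncertainty, the kind of ranking the study performs per land use type.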

  2. Cost analysis of premixed multichamber bags versus compounded parenteral nutrition: breakeven point.

    Science.gov (United States)

    Bozat, Erkut; Korubuk, Gamze; Onar, Pelin; Abbasoglu, Osman

    2014-02-01

    Industrially premixed multichamber bags or hospital-manufactured compounded products can be used for parenteral nutrition. The aim of this study was to compare the cost of these 2 approaches. Costs of compounded parenteral nutrition bags in a university hospital were calculated. A total of 600 bags that were administered during 34 days between December 10, 2009 and February 17, 2010 were included in the analysis. For quality control, specific gravity evaluation of the filled bags was performed. It was calculated that the variable cost of a hospital compounded bag was $26.15. If we take the annual fixed costs into consideration, the production cost reaches $36.09 for each unit. It was estimated that the cost for the corresponding multichamber bag was $37.79. Taking the fixed and the variable costs into account, the breakeven point of the hospital compounded and the premixed multichamber bags was seen at 5,404 units per year. In specific gravity evaluation, it was observed that the mean and interval values were inside the upper and lower control margins. In this analysis, usage of hospital-compounded parenteral nutrition bags showed a cost advantage in hospitals that treat more than 15 patients per day. In small volume hospitals, premixed multichamber bags may be more beneficial.
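
The breakeven arithmetic is simple: annual fixed costs divided by the per-unit saving of compounding over the premixed price. Back-solving from the abstract's figures ($37.79 premixed versus $26.15 variable compounding cost, breakeven at 5,404 units/year) implies annual fixed costs of roughly $62,900, an inference rather than a reported number:

```python
def breakeven_units(annual_fixed_cost, premixed_unit_cost, compounded_variable_cost):
    """Annual volume at which in-house compounding (fixed + variable
    cost) matches buying premixed multichamber bags."""
    return annual_fixed_cost / (premixed_unit_cost - compounded_variable_cost)

# Fixed cost back-calculated from the reported breakeven (an inference)
implied_fixed = 5404 * (37.79 - 26.15)          # roughly $62,900 per year
units = breakeven_units(implied_fixed, 37.79, 26.15)
per_day = units / 365                           # roughly 14.8 bags/day
```

The per-day figure of about 14.8 bags is consistent with the abstract's conclusion that compounding pays off above roughly 15 patients per day.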

  3. Intraosseous blood samples for point-of-care analysis: agreement between intraosseous and arterial analyses.

    Science.gov (United States)

    Jousi, Milla; Saikko, Simo; Nurmi, Jouni

    2017-09-11

    Point-of-care (POC) testing is highly useful when treating critically ill patients. In case of difficult vascular access, the intraosseous (IO) route is commonly used, and blood is aspirated to confirm the correct position of the IO-needle. Thus, IO blood samples could be easily accessed for POC analyses in emergency situations. The aim of this study was to determine whether IO values agree sufficiently with arterial values to be used for clinical decision making. Two samples of IO blood were drawn from 31 healthy volunteers and compared with arterial samples. The samples were analysed for sodium, potassium, ionized calcium, glucose, haemoglobin, haematocrit, pH, blood gases, base excess, bicarbonate, and lactate using the i-STAT® POC device. Agreement and reliability were estimated by using the Bland-Altman method and intraclass correlation coefficient calculations. Good agreement was evident between the IO and arterial samples for pH, glucose, and lactate. Potassium levels were clearly higher in the IO samples than those from arterial blood. Base excess and bicarbonate were slightly higher, and sodium and ionised calcium values were slightly lower, in the IO samples compared with the arterial values. The blood gases in the IO samples were between arterial and venous values. Haemoglobin and haematocrit showed remarkable variation in agreement. POC diagnostics of IO blood can be a useful tool to guide treatment in critical emergency care. Seeking out the reversible causes of cardiac arrest or assessing the severity of shock are examples of situations in which obtaining vascular access and blood samples can be difficult, though information about the electrolytes, acid-base balance, and lactate could guide clinical decision making. 
The analysis of IO samples should, however, be limited to situations in which no other option is available, and the results should be interpreted with caution, because there is not yet enough scientific evidence regarding the agreement of IO

  4. A novel Bayesian change-point algorithm for genome-wide analysis of diverse ChIPseq data types.

    Science.gov (United States)

    Xing, Haipeng; Liao, Willey; Mo, Yifan; Zhang, Michael Q

    2012-12-10

    ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein(1). For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment(2). Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics(3-5) to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)(6-8). We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple, finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by subsequent application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model, which can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs(9), which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which further define segments of enrichment. Our analysis revealed how
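
The full Bayesian HMM is beyond a few lines, but the change-point idea itself can be shown with a much simpler stand-in: pick the split that minimizes the total within-segment squared deviation of the read density (the profile below is a toy example, not ChIPseq data):

```python
def single_change_point(xs):
    """Exhaustively locate one change point in the mean of a sequence
    (a simple frequentist stand-in for the Bayesian model above)."""
    def sse(seg):
        # within-segment sum of squared deviations from the segment mean
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    return min(range(1, len(xs)), key=lambda k: sse(xs[:k]) + sse(xs[k:]))

# Toy read-density profile: flat background, then an enriched segment
density = [2, 3, 2, 2, 3, 2, 9, 10, 9, 11, 10, 9]
cp = single_change_point(density)               # boundary at index 6
```

Applying such splits recursively (binary segmentation) yields multiple change points; the Bayesian formulation in the abstract instead infers the number and position of change points jointly.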

  5. Gas analysis within remote porous targets using LIDAR multi-scatter techniques

    Science.gov (United States)

    Guan, Z. G.; Lewander, M.; Grönlund, R.; Lundberg, H.; Svanberg, S.

    2008-11-01

    Light detection and ranging (LIDAR) experiments are normally pursued for range-resolved atmospheric gas measurements or for analysis of solid target surfaces using fluorescence or laser-induced breakdown spectroscopy. In contrast, we now demonstrate the monitoring of free gas enclosed in pores of materials, subject to impinging laser radiation, employing the photons emerging back to the surface laterally of the injection point after penetrating the medium in heavy multiple scattering processes. The directly reflected light is blocked by a beam stop. The technique presented is a remote version of the newly introduced gas in scattering media absorption spectroscopy (GASMAS) technique, which had so far been pursued with the injection optics and the detector in close contact with the sample. Feasibility measurements of LIDAR-GASMAS on oxygen in polystyrene foam were performed at a distance of 6 m. Multiple-scattering induced delays of the order of 50 ns, which corresponds to 15 m optical path length, were observed. First extensions to a range of 60 m are discussed. Remote observation of gas composition anomalies in snow using differential absorption LIDAR (DIAL) may find application in avalanche victim localization or for leak detection in snow-covered natural gas pipelines. Further, the techniques may be even more useful for short-range, non-intrusive GASMAS measurements, e.g., on packed food products.

  6. Target detection in SAR images via radiometric multi-resolution analysis

    Science.gov (United States)

    Hu, Jingwen; Xia, Gui-Song; Sun, Hong

    2013-10-01

    This paper presents a target detection method for synthetic aperture radar (SAR) images based on radiometric multiresolution analysis (RMA). The idea is that target saliency can be efficiently computed by comparing the statistics of targets with those of the local background around them. In order to compute reliable statistics of targets, which usually involve a small number of pixels, RMA is adopted. The RMA preprocessing method performs well in stabilizing the statistical characteristics of SAR images: it can effectively restrain the speckle noise while keeping the statistical characteristics of the original image. Based on the computed target saliency, adaptive decision thresholds are obtained using the constant false alarm rate (CFAR) target detection framework. Our experiments on real SAR images show that the proposed method achieves better performance than the traditional cell averaging-constant false alarm rate (CA-CFAR) method.
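
The CA-CFAR baseline referenced above is a sliding-window detector: estimate the clutter power from training cells around each cell under test (excluding guard cells) and declare a detection when the cell exceeds a scaled estimate. A 1-D sketch, assuming exponentially distributed clutter power and made-up signal values:

```python
def ca_cfar(xs, train=8, guard=2, pfa=1e-3):
    """1-D cell-averaging CFAR: for each cell, average the training
    cells on both sides (skipping guard cells) and declare a detection
    when the cell exceeds alpha times that clutter estimate."""
    n_train = 2 * train
    # threshold multiplier for exponential clutter at the desired Pfa
    alpha = n_train * (pfa ** (-1.0 / n_train) - 1.0)
    hits = []
    for i in range(train + guard, len(xs) - train - guard):
        left = xs[i - train - guard: i - guard]
        right = xs[i + guard + 1: i + guard + 1 + train]
        noise = (sum(left) + sum(right)) / n_train
        if xs[i] > alpha * noise:
            hits.append(i)
    return hits

# Flat unit clutter with one strong point target at index 25
signal = [1.0] * 50
signal[25] = 40.0
detections = ca_cfar(signal)                    # [25]
```

Because the threshold scales with the local clutter estimate, the false alarm rate stays constant as the background level varies, which is the property the paper's RMA-based saliency feeds into.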

  7. Beyond typing and grading: target analysis in individualized therapy as a new challenge for tumour pathology.

    Science.gov (United States)

    Kreipe, Hans H; von Wasielewski, Reinhard

    2007-01-01

    In order to bring about its beneficial effects in oncology, targeted therapy depends on accurate target analysis. Whether the cells of a tumour will be sensitive to a specific treatment is predicted by the detection of appropriate targets in cancer tissue by immunohistochemistry or molecular methods. In most instances this is performed by histopathologists. Reliability and reproducibility of tissue-based target analysis in histopathology require novel measures of quality assurance through internal and external controls. As a model for external quality assurance in targeted therapy, an annual inter-laboratory trial has been set up in Germany, applying tissue arrays with up to 60 mammary cancer samples that participants test for expression of HER2/neu and steroid hormone receptors.

  8. Analysis of drug advertising targeted to health professionals

    Directory of Open Access Journals (Sweden)

    Marcela Campos Esqueff Abdalla

    2017-08-01

    Full Text Available The advertising of medicines is the promotion of a product by the pharmaceutical industry, with emphasis on the brand, aiming to encourage its prescription and/or purchase. This practice must comply with the legal provisions in effect as determined by the Brazilian National Health Surveillance Agency. The present work aimed to analyze advertisements for medicines offered by the industry to health professionals. The collection of advertisements covered physicians' offices of various specialties, public and private hospitals, and magazines directed at health professionals. The analysis of the collected pieces involved verifying the legibility and visibility of required information, as well as compliance with the health legislation that regulates the promotion and advertising of medicines in Brazil (the agency's resolution n. 96/2008). The results showed that no piece complied with the health legislation in full. Most companies employ strategies that hinder access to restricted-use information about the medicine, such as contraindications, constituting an obstacle to rational use. Also observed were indications other than those approved by the agency, and indications for age groups different from those specified in the product registration. There is a clear need for a new, more rigid control and regulatory model that places the interests of society above private interests, protecting the public from false and abusive advertising and promoting the rational use of medicines.

  9. Evenly spaced Detrended Fluctuation Analysis: Selecting the number of points for the diffusion plot

    Science.gov (United States)

    Liddy, Joshua J.; Haddad, Jeffrey M.

    2018-02-01

    Detrended Fluctuation Analysis (DFA) has become a widely used tool for examining the correlation structure of a time series and has provided insights into neuromuscular health and disease states. As the popularity of DFA in the human behavioral sciences has grown, understanding its limitations and how to properly determine its parameters is becoming increasingly important. DFA examines the correlation structure of variability in a time series by computing α, the slope of the log SD vs. log n diffusion plot. When using the traditional DFA algorithm, the timescales, n, are often selected as the set of integers between a minimum and maximum length based on the number of data points in the time series. This produces values of n that are non-uniformly distributed on a logarithmic scale, which influences the estimation of α through a disproportionate weighting of the long-timescale region of the diffusion plot. Recently, the evenly spaced DFA and evenly spaced average DFA algorithms were introduced. Both algorithms compute α by selecting k points for the diffusion plot based on the minimum and maximum timescales of interest, and they improve the consistency of α estimates for simulated fractional Gaussian noise and fractional Brownian motion time series. Two issues that remain unaddressed are (1) how to select k and (2) whether the evenly spaced DFA algorithms show similar benefits when assessing human behavioral data. We manipulated k and examined its effects on the accuracy, consistency, and confidence limits of α in simulated and experimental time series. We demonstrate that the accuracy and consistency of α are relatively unaffected by the selection of k. However, the confidence limits of α narrow as k increases, dramatically reducing measurement uncertainty for single trials. We provide guidelines for selecting k and discuss potential uses of the evenly spaced DFA algorithms when assessing human behavioral data.
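
    The evenly spaced selection of k timescales amounts to sampling n uniformly in log scale between the minimum and maximum timescales of interest; a minimal sketch (an illustration of the idea, not the authors' implementation):

```python
import numpy as np

def evenly_spaced_timescales(n_min, n_max, k):
    """Pick k timescales evenly spaced in log scale between n_min and n_max,
    rounded to unique integers, so that every region of the log n axis of
    the diffusion plot carries comparable weight when fitting alpha."""
    n = np.logspace(np.log10(n_min), np.log10(n_max), k)
    return np.unique(np.round(n).astype(int))

# Contrast with the traditional choice (all integers n_min..n_max), which
# crowds points toward the long-timescale end of the log-log plot:
print(evenly_spaced_timescales(4, 64, 9).tolist())  # → [4, 6, 8, 11, 16, 23, 32, 45, 64]
```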

  10. Analysis of step-up transformer tap change on the quantities at the point of connection to transmission grid

    Directory of Open Access Journals (Sweden)

    Đorđević Dragan

    2017-01-01

    Full Text Available The analysis of the effect of a step-up transformer tap change on the quantities at the point of connection to the transmission grid is presented in this paper. The point of connection of generator TENT A6 was analyzed, using a detailed model of this generator available in the software package DIgSILENT PowerFactory. The effect of a step-up transformer tap change on the quantities at the point of connection was compared between automatic and manual operation of the voltage regulator. To analyze manual operation of the voltage regulator, different methods of modeling this mode were compared. Several generator operating points, selected to represent the need for a tap change, were analyzed. All of the above analyses were also performed taking into account the voltage-reactive stiffness at the point of connection.

  11. Drug target mining and analysis of the Chinese tree shrew for pharmacological testing.

    Directory of Open Access Journals (Sweden)

    Feng Zhao

    Full Text Available The discovery of new drugs requires the development of improved animal models for drug testing. The Chinese tree shrew is considered to be a realistic candidate model. To assess the potential of the Chinese tree shrew for pharmacological testing, we performed drug target prediction and analysis on genomic and transcriptomic scales. Using our pipeline, 3,482 proteins were predicted to be drug targets. Of these predicted targets, the 446 and 1,049 proteins with the highest rank and total scores, respectively, included homologs of targets for cancer chemotherapy, depression, age-related decline and cardiovascular disease. Based on comparative analyses, more than half of the drug target proteins identified in the tree shrew genome showed higher similarity to human targets than their mouse counterparts do. Target validation also demonstrated that the constitutive expression of the proteinase-activated receptors of tree shrew platelets is similar to that of human platelets but differs from that of mouse platelets. We developed an effective pipeline and search strategy for drug target prediction and the evaluation of model-based target identification for drug testing. This work provides useful information for future studies of the Chinese tree shrew as a source of novel targets for drug discovery research.

  12. Novel Strategy for Non-Targeted Isotope-Assisted Metabolomics by Means of Metabolic Turnover and Multivariate Analysis

    Directory of Open Access Journals (Sweden)

    Yasumune Nakayama

    2014-08-01

    Full Text Available Isotope-labeling is a useful technique for understanding cellular metabolism. Recent advances in metabolomics have extended the capability of isotope-assisted studies to reveal global metabolism. For instance, isotope-assisted metabolomics technology has enabled the mapping of a global metabolic network, estimation of flux at branch points of metabolic pathways, and assignment of elemental formulas to unknown metabolites. Furthermore, some data processing tools have been developed to apply these techniques to a non-targeted approach, which plays an important role in revealing unknown or unexpected metabolism. However, data collection and integration strategies for non-targeted isotope-assisted metabolomics have not been established. Therefore, a systematic approach is proposed to elucidate metabolic dynamics without targeting pathways by means of time-resolved isotope tracking, i.e., “metabolic turnover analysis”, as well as multivariate analysis. We applied this approach to study the metabolic dynamics in amino acid perturbation of Saccharomyces cerevisiae. In metabolic turnover analysis, 69 peaks including 35 unidentified peaks were investigated. Multivariate analysis of metabolic turnover successfully detected a pathway known to be inhibited by amino acid perturbation. In addition, our strategy enabled identification of unknown peaks putatively related to the perturbation.
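
    Metabolic turnover analysis tracks the time-resolved replacement of unlabeled by labeled metabolite pools; a minimal sketch of estimating a turnover half-time, assuming first-order labeling kinetics (illustrative data, not the paper's measurements):

```python
import numpy as np

def turnover_halftime(times_h, unlabeled_frac):
    """Estimate a metabolite's turnover half-time by fitting exponential
    decay of the unlabeled (12C) fraction after a switch to 13C medium:
    f(t) = exp(-k t), fitted by least squares on log(f)."""
    t = np.asarray(times_h, dtype=float)
    f = np.asarray(unlabeled_frac, dtype=float)
    k = -np.polyfit(t, np.log(f), 1)[0]  # slope of the log-linear fit
    return np.log(2) / k

# Fast-turnover metabolites lose their unlabeled fraction quickly:
t = [0.0, 0.5, 1.0, 2.0]
f = [1.0, 0.5, 0.25, 0.0625]  # consistent with a 0.5 h half-time
print(round(turnover_halftime(t, f), 2))  # → 0.5
```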

  13. Quantitative functional analysis of Late Glacial projectile points from northern Europe

    DEFF Research Database (Denmark)

    Dev, Satya; Riede, Felix

    2012-01-01

    that, based on metric considerations, arch-backed points (pen-knife points or Federmesser) most likely were part of a bow-and-arrow weapon system, while large tanged points (Bromme points) most likely tipped spears propelled with the help of a spear-thrower/atlatl. This paper then presents...... surely fully serviceable, diverged considerably from the functional optimum predicted by ballistic theory. These observations relate directly to southern Scandinavian Late Glacial culture-history, which is marked by a sequence of co-occurrence of arch-backed and large tanged points in the earlier part

  14. SCAP-82, Single Scattering, Albedo Scattering, Point-Kernel Analysis in Complex Geometry

    International Nuclear Information System (INIS)

    Disney, R.K.; Vogtman, S.E.

    1987-01-01

    1 - Description of problem or function: SCAP solves for radiation transport in complex geometries using the single-scatter or albedo-scatter point kernel method. The program is designed to calculate the neutron or gamma-ray radiation level at detector points located within or outside a complex radiation scatter source geometry or a user-specified discrete scattering volume. Geometry is describable by zones bounded by intersecting quadratic surfaces, with an arbitrary maximum number of boundary surfaces per zone. Anisotropic point sources are describable as pointwise energy-dependent distributions of polar angles on a meridian; isotropic point sources may also be specified. The attenuation function for gamma rays is an exponential function on the primary source leg and the scatter leg, with a build-up factor approximation to account for multiple scatter on the scatter leg. The neutron attenuation function is an exponential function using neutron removal cross sections on the primary source leg and scatter leg. Line or volumetric sources can be represented as a distribution of isotropic point sources, with uncollided line-of-sight attenuation and buildup calculated between each source point and the detector point. 2 - Method of solution: A point kernel method with an anisotropic or isotropic point source representation is used; line-of-sight material attenuation and inverse-square spatial attenuation are employed between the source point and the scatter points, and between the scatter points and the detector point. A direct summation of individual point source results is obtained. 3 - Restrictions on the complexity of the problem: The SCAP program is written with completely flexible dimensioning, so that no restrictions are imposed on the number of energy groups or geometric zones. The geometric zone description is restricted to zones defined by boundary surfaces given by the general quadratic equation or one of its degenerate forms.
The only restriction in the program is that the total
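
    The point kernel described above combines exponential line-of-sight attenuation, inverse-square spreading, and (for gamma rays) a build-up factor, summed over point sources; a minimal sketch with illustrative values, not SCAP itself:

```python
import math

def point_kernel_flux(source_strength, mu, r, buildup=1.0):
    """Uncollided line-of-sight flux at distance r from an isotropic point
    source: exponential attenuation exp(-mu*r) along the leg, inverse-square
    1/(4*pi*r^2) spatial spreading, and an optional build-up factor to
    approximate multiple scatter (gamma rays)."""
    return source_strength * buildup * math.exp(-mu * r) / (4.0 * math.pi * r**2)

def summed_flux(points, mu, detector):
    """Represent a line or volumetric source as a distribution of isotropic
    point sources (x, y, z, strength) and sum their individual contributions
    at the detector point, as the point kernel method prescribes."""
    total = 0.0
    for (x, y, z, s) in points:
        r = math.dist((x, y, z), detector)
        total += point_kernel_flux(s, mu, r)
    return total
```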

  15. Hazard analysis and critical control point to irradiated food in Brazil

    International Nuclear Information System (INIS)

    Boaratti, Maria de Fatima Guerra

    2004-01-01

    Foodborne diseases, in particular gastro-intestinal infections, represent a very large group of pathologies with a strong negative impact on the health of the population because of their widespread nature. Little consideration is given to such conditions because their symptoms are often moderate and self-limiting. This has led to a general underestimation of their importance, and consequently to incorrect practices during the preparation and preservation of food, resulting in the frequent occurrence of outbreaks involving groups of varying numbers of consumers. Despite substantial efforts to avoid contamination, an upward trend in the number of outbreaks of foodborne illnesses caused by non-spore-forming pathogenic bacteria is reported in many countries. Good hygienic practices can reduce the level of contamination, but the most important pathogens cannot presently be eliminated from most farms, nor is it possible to eliminate them by primary processing, particularly from those foods which are sold raw. Several decontamination methods exist, but the most versatile treatment among them is ionizing radiation. HACCP (Hazard Analysis and Critical Control Point) is a management system in which food safety is addressed through the analysis and control of biological, chemical, and physical hazards from raw material production, procurement and handling, to manufacturing, distribution and consumption of the finished product. For successful implementation of a HACCP plan, management must be strongly committed to the HACCP concept. A firm commitment to HACCP by top management provides company employees with a sense of the importance of producing safe food. At the same time, it must always be emphasized that, like other intervention strategies, irradiation must be applied as part of a total sanitation program. The benefits of irradiation should never be considered an excuse for poor quality or for poor handling and storage conditions.

  16. Highly-integrated lab-on-chip system for point-of-care multiparameter analysis.

    Science.gov (United States)

    Schumacher, Soeren; Nestler, Jörg; Otto, Thomas; Wegener, Michael; Ehrentreich-Förster, Eva; Michel, Dirk; Wunderlich, Kai; Palzer, Silke; Sohn, Kai; Weber, Achim; Burgard, Matthias; Grzesiak, Andrzej; Teichert, Andreas; Brandenburg, Albrecht; Koger, Birgit; Albers, Jörg; Nebling, Eric; Bier, Frank F

    2012-02-07

    A novel innovative approach towards a marketable lab-on-chip system for point-of-care in vitro diagnostics is reported. In a consortium of seven Fraunhofer Institutes a lab-on-chip system called "Fraunhofer ivD-platform" has been established which opens up the possibility for an on-site analysis at low costs. The system features a high degree of modularity and integration. Modularity allows the adaption of common and established assay types of various formats. Integration lets the system move from the laboratory to the point-of-need. By making use of the microarray format the lab-on-chip system also addresses new trends in biomedicine. Research topics such as personalized medicine or companion diagnostics show that multiparameter analyses are an added value for diagnostics, therapy as well as therapy control. These goals are addressed with a low-cost and self-contained cartridge, since reagents, microfluidic actuators and various sensors are integrated within the cartridge. In combination with a fully automated instrumentation (read-out and processing unit) a diagnostic assay can be performed in about 15 min. Via a user-friendly interface the read-out unit itself performs the assay protocol, data acquisition and data analysis. So far, example assays for nucleic acids (detection of different pathogens) and protein markers (such as CRP and PSA) have been established using an electrochemical read-out based on redoxcycling or an optical read-out based on total internal reflectance fluorescence (TIRF). It could be shown that the assay performance within the cartridge is similar to that found for the same assay in a microtiter plate. Furthermore, recent developments are the integration of sample preparation and polymerase chain reaction (PCR) on-chip. Hence, the instrument is capable of providing heating-and-cooling cycles necessary for DNA-amplification. 
In addition to scientific aspects also the production of such a lab-on-chip system was part of the development since

  17. Whole-Genome Thermodynamic Analysis Reduces siRNA Off-Target Effects

    Science.gov (United States)

    Chen, Xi; Liu, Peng; Chou, Hui-Hsien

    2013-01-01

    Small interfering RNAs (siRNAs) are important tools for knocking down targeted genes, and have been widely applied to biological and biomedical research. To design siRNAs, two important aspects must be considered: the potency in knocking down target genes and the off-target effect on any nontarget genes. Although many studies have produced useful tools to design potent siRNAs, off-target prevention has mostly been delegated to sequence-level alignment tools such as BLAST. We hypothesize that whole-genome thermodynamic analysis can identify potential off-targets with higher precision and help us avoid siRNAs that may have strong off-target effects. To validate this hypothesis, two siRNA sets were designed to target three human genes IDH1, ITPR2 and TRIM28. They were selected from the output of two popular siRNA design tools, siDirect and siDesign. Both siRNA design tools have incorporated sequence-level screening to avoid off-targets, thus their output is believed to be optimal. However, one of the sets we tested has off-target genes predicted by Picky, a whole-genome thermodynamic analysis tool. Picky can identify off-target genes that may hybridize to a siRNA within a user-specified melting temperature range. Our experiments validated that some off-target genes predicted by Picky can indeed be inhibited by siRNAs. Similar experiments were performed using commercially available siRNAs and a few off-target genes were also found to be inhibited as predicted by Picky. In summary, we demonstrate that whole-genome thermodynamic analysis can identify off-target genes that are missed in sequence-level screening. Because Picky prediction is deterministic according to thermodynamics, if a siRNA candidate has no Picky predicted off-targets, it is unlikely to cause off-target effects. Therefore, we recommend including Picky as an additional screening step in siRNA design. PMID:23484018
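
    The principle of flagging off-targets by melting temperature can be illustrated with a deliberately simplified sketch; the Wallace rule below stands in for the nearest-neighbor thermodynamics a tool like Picky actually uses, and the threshold is an arbitrary assumption:

```python
# Complement table for a guide strand (RNA or DNA) pairing with a DNA site.
COMP = {"A": "T", "T": "A", "U": "A", "G": "C", "C": "G"}

def duplex_tm(guide: str, site: str) -> float:
    """Crude duplex melting temperature: apply the Wallace rule
    (2 degrees C per A/T pair, 4 degrees C per G/C pair) only at positions
    where the guide is complementary to the site. A stand-in for proper
    nearest-neighbor thermodynamic analysis."""
    tm = 0.0
    for g, s in zip(guide.upper(), site.upper()):
        if COMP.get(g) == s:
            tm += 4.0 if g in "GC" else 2.0
    return tm

def flag_off_targets(guide, sites, tm_min=40.0):
    """Flag candidate sites whose estimated duplex Tm exceeds a user-set
    threshold, i.e. sites the siRNA may hybridize to stably."""
    return [s for s in sites if duplex_tm(guide, s) >= tm_min]
```

    A sequence-level aligner scores matched positions, while a thermodynamic screen like this weights them by binding energy, which is why the two can disagree on which sites are dangerous.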

  18. [Effect of ear point embedding on plasma and effect site concentrations of propofol-remifentanil in elderly patients after target-controlled induction].

    Science.gov (United States)

    Zheng, Xiaochun; Wan, Liling; Gao, Fei; Chen, Jianghu; Tu, Wenshao

    2017-08-12

    To observe the clinical effect of ear point embedding on the plasma and effect site concentrations of propofol-remifentanil in elderly patients undergoing abdominal external hernia surgery, at the times of loss of consciousness and of pain disappearing, using target-controlled infusion (TCI) and the bispectral index (BIS). Fifty patients who underwent elective abdominal hernia surgery were randomly assigned to an observation group and a control group, 25 cases in each. In the observation group, 30 minutes before anesthesia induction, Fugugou (Extra), Gan (CO12), Pizhixia (AT4) and Shenmen (TF4) were embedded with auricular needles until the end of surgery, with 10 counter-presses at each point. In the control group, the same amount of auricular tape was applied to the same points 30 minutes before anesthesia induction, without stimulation, until the end of surgery. Patients in both groups were given total intravenous anesthesia, and BIS was monitored with a BIS anesthesia depth monitor. Propofol was infused by TCI at a beginning concentration of 1.5 μg/L, increased by 0.3 μg/L every 30 s until the patients lost consciousness. After that, remifentanil was infused by TCI at a beginning concentration of 2.0 μg/L, increased by 0.3 μg/L every 30 s until the patients had no body reaction to pain stimulation (orbital reflex). The recorded indices included mean arterial pressure (MAP), heart rate (HR) and BIS values at T0 (entering the operating room), T1 (loss of consciousness) and T2 (pain relief); the plasma and effect site concentrations of propofol at T1; and the plasma and effect site concentrations of remifentanil at T2. After surgery, the total amounts of propofol and remifentanil, surgery time and anesthesia time were recorded. At T1 and T2, MAP and HR in the observation group were higher than those in the control group (P<0.05), and the plasma and effect site concentrations of propofol in the observation group were significantly lower than those in the control group.

  19. Whole-genome analysis of herbicide-tolerant mutant rice generated by Agrobacterium-mediated gene targeting.

    Science.gov (United States)

    Endo, Masaki; Kumagai, Masahiko; Motoyama, Ritsuko; Sasaki-Yamagata, Harumi; Mori-Hosokawa, Satomi; Hamada, Masao; Kanamori, Hiroyuki; Nagamura, Yoshiaki; Katayose, Yuichi; Itoh, Takeshi; Toki, Seiichi

    2015-01-01

    Gene targeting (GT) is a technique used to modify endogenous genes in target genomes precisely via homologous recombination (HR). Although GT plants are produced using genetic transformation techniques, if the difference between the endogenous and the modified gene is limited to point mutations, GT crops can be considered equivalent to non-genetically modified mutant crops generated by conventional mutagenesis techniques. However, it is difficult to guarantee the non-incorporation of DNA fragments from Agrobacterium in GT plants created by Agrobacterium-mediated GT, despite screening with conventional Southern blot and/or PCR techniques. Here, we report a comprehensive analysis of herbicide-tolerant rice plants generated by inducing point mutations in the rice ALS gene via Agrobacterium-mediated GT. We performed comparative genomic hybridization (CGH) array analysis and whole-genome sequencing to evaluate the molecular composition of the GT rice plants. Thus far, no integration of Agrobacterium-derived DNA fragments has been detected in the GT rice plants. However, >1,000 single nucleotide polymorphisms (SNPs) and insertions/deletions (InDels) were found in the GT plants. Among these mutations, 20-100 variants might have some effect on expression levels and/or protein function. Information about such additive mutations should be useful for clearing out unwanted mutations by backcrossing. © The Author 2014. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  20. Transcriptome Analysis of Targeted Mouse Mutations Reveals the Topography of Local Changes in Gene Expression.

    Directory of Open Access Journals (Sweden)

    David B West

    2016-02-01

    Full Text Available The unintended consequences of gene targeting in mouse models have not been thoroughly studied, and a more systematic analysis is needed to understand the frequency and characteristics of off-target effects. Using RNA-seq, we evaluated targeted and neighboring gene expression in tissues from 44 homozygous mutants compared with C57BL/6N control mice. Two allele types were evaluated: 15 targeted trap mutations (TRAP) and 29 deletion alleles (DEL), usually a deletion between the translational start and the 3' UTR. Both targeting strategies insert a bacterial beta-galactosidase reporter (LacZ) and a neomycin resistance selection cassette. Evaluating transcription of genes within ±500 kb of flanking DNA around the targeted gene, we found up-regulated genes more frequently around DEL than around TRAP alleles; however, the frequency of alleles with locally down-regulated genes flanking DEL and TRAP targets was similar. Down-regulated genes around both DEL and TRAP targets were found at a higher frequency than expected from a genome-wide survey. However, only around DEL targets were up-regulated genes found at a significantly higher frequency than in genome-wide sampling. Transcriptome analysis confirms targeting in 97% of DEL alleles, but in only 47% of TRAP alleles, probably due to non-functional splice variants and some splicing around the gene trap. Local effects on gene expression are likely due to a number of factors, including compensatory regulation, loss or disruption of intragenic regulatory elements, the exogenous promoter in the neo selection cassette, removal of insulating DNA in the DEL mutants, and local silencing due to disruption of normal chromatin organization or the presence of exogenous DNA. An understanding of local position effects is important for understanding and interpreting any phenotype attributed to targeted gene mutations, or to spontaneous indels.

  1. Comparative analysis of predicted plastid-targeted proteomes of sequenced higher plant genomes.

    Directory of Open Access Journals (Sweden)

    Scott Schaeffer

    Full Text Available Plastids are actively involved in numerous plant processes critical to growth, development and adaptation. They play a primary role in photosynthesis, pigment and monoterpene synthesis, gravity sensing, starch and fatty acid synthesis, as well as oil and protein storage. We applied two complementary methods to analyze the recently published apple genome (Malus × domestica) to identify putative plastid-targeted proteins, the first using TargetP and the second using a custom workflow utilizing a set of predictive programs. Apple shares roughly 40% of its 10,492 putative plastid-targeted proteins with the Arabidopsis (Arabidopsis thaliana) plastid-targeted proteome as identified by the Chloroplast 2010 project, and ∼57% of its entire proteome with Arabidopsis. This suggests that the plastid-targeted proteomes of apple and Arabidopsis differ, and, interestingly, alludes to differential targeting of homologs between the two species. Co-expression analysis of 2,224 genes encoding putative plastid-targeted apple proteins suggests that they play a role in plant development and intermediary metabolism. Further, an inter-specific comparison of Arabidopsis, Prunus persica (peach), Malus × domestica (apple), Populus trichocarpa (black cottonwood), Fragaria vesca (woodland strawberry), Solanum lycopersicum (tomato) and Vitis vinifera (grapevine) also identified a large number of novel species-specific plastid-targeted proteins. This analysis also revealed the presence of alternatively targeted homologs across species. Two separate analyses revealed that a small subset of proteins, one representing 289 protein clusters and the other 737 unique protein sequences, is conserved between seven plastid-targeted angiosperm proteomes. The majority of the novel proteins were annotated to play roles in stress response, transport, catabolic processes, and cellular component organization. Our results suggest that the current state of

  2. Global analysis of plasticity in turgor loss point, a key drought tolerance trait.

    Science.gov (United States)

    Bartlett, Megan K; Zhang, Ya; Kreidler, Nissa; Sun, Shanwen; Ardy, Rico; Cao, Kunfang; Sack, Lawren

    2014-12-01

    Many species face increasing drought under climate change. Plasticity has been predicted to strongly influence species' drought responses, but broad patterns in plasticity have not been examined for key drought tolerance traits, including turgor loss or 'wilting' point (πtlp ). As soil dries, plants shift πtlp by accumulating solutes (i.e. 'osmotic adjustment'). We conducted the first global analysis of plasticity in Δπtlp and related traits for 283 wild and crop species in ecosystems worldwide. Δπtlp was widely prevalent but moderate (-0.44 MPa), accounting for 16% of post-drought πtlp. Thus, pre-drought πtlp was a considerably stronger predictor of post-drought πtlp across species of wild plants. For cultivars of certain crops Δπtlp accounted for major differences in post-drought πtlp. Climate was correlated with pre- and post-drought πtlp, but not Δπtlp. Thus, despite the wide prevalence of plasticity, πtlp measured in one season can reliably characterise most species' constitutive drought tolerances and distributions relative to water supply. © 2014 John Wiley & Sons Ltd/CNRS.
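
    The reported averages allow a back-of-envelope check: if an osmotic adjustment of −0.44 MPa accounts for 16% of the post-drought πtlp, the typical post-drought value is near −2.75 MPa. A minimal sketch of that arithmetic:

```python
def post_drought_tlp(pre_tlp_mpa, adjustment_mpa):
    """Post-drought turgor loss point = pre-drought value plus the
    osmotic-adjustment shift (MPa; more negative = more drought-tolerant)."""
    return pre_tlp_mpa + adjustment_mpa

# Back-of-envelope from the abstract's global means: an adjustment of
# -0.44 MPa that makes up 16% of the post-drought value implies:
post = -0.44 / 0.16
print(round(post, 2))  # → -2.75
```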

  3. An end-point method based on graphene oxide for RNase H analysis and inhibitors screening.

    Science.gov (United States)

    Zhao, Chuan; Fan, Jialong; Peng, Lan; Zhao, Lijian; Tong, Chunyi; Wang, Wei; Liu, Bin

    2017-04-15

    As a highly conserved damage repair protein, RNase H can hydrolyze DNA-RNA heteroduplexes endonucleolytically and cleave RNA-DNA junctions as well. In this study, we have developed an accurate and sensitive RNase H assay based on the hydrolysis of a fluorophore-labeled chimeric substrate and the differential affinity of graphene oxide for RNA strands of different lengths. This end-point measurement method can detect RNase H in a range of 0.01 to 1 units/mL, with a detection limit of 5.0×10⁻³ units/mL under optimal conditions. We demonstrate the utility of the assay by screening antibiotics, identifying gentamycin, streptomycin and kanamycin as inhibitors with IC50 values of 60±5 μM, 70±8 μM and 300±20 μM, respectively. Furthermore, the assay was reliably used to detect RNase H in complex biosamples, where we found that RNase H activity in tumor cells was inhibited by gentamycin and streptomycin sulfate in a concentration-dependent manner. The average level of RNase H in the serum of the HBV infection group was similar to that of the control group. In summary, the assay provides an alternative tool for biochemical analysis of this enzyme and indicates the feasibility of high-throughput screening of RNase H inhibitors in vitro and in vivo. Copyright © 2016 Elsevier B.V. All rights reserved.
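
    IC50 values such as those reported are read off a dose-inhibition curve; a minimal sketch that estimates IC50 by interpolating inhibition against log concentration (illustrative data, not the paper's measurements):

```python
import numpy as np

def ic50_interpolate(conc_um, inhibition_frac):
    """Estimate IC50 (the concentration giving 50% inhibition) by linear
    interpolation of the inhibition fraction versus log10(concentration).
    Assumes inhibition increases monotonically with concentration."""
    logc = np.log10(conc_um)
    return 10 ** np.interp(0.5, inhibition_frac, logc)

# Illustrative dose-response points bracketing 50% inhibition near 60 uM:
conc = np.array([10.0, 30.0, 60.0, 120.0, 300.0])
inh = np.array([0.10, 0.30, 0.50, 0.70, 0.90])
print(round(float(ic50_interpolate(conc, inh)), 1))  # → 60.0
```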

  4. Analysis of In-Situ Vibration Monitoring for End-Point Detection of CMP Planarization Processes

    International Nuclear Information System (INIS)

    Hetherington, Dale L.; Lauffer, James P.; Shingledecker, David M.; Stein, David J.; Wyckoff, Edward E.

    1999-01-01

    This paper details the analysis of vibration monitoring for end-point control in oxide CMP processes. Two piezoelectric accelerometers were integrated onto the backside of a stainless steel polishing head of an IPEC 472 polisher. One sensor was placed perpendicular to the carrier plate (vertical) and the other parallel to the plate (horizontal). Wafers patterned with metal and coated with oxide material were polished at different speeds and pressures. Our results show that it is possible to sense a change in the vibration signal over time during planarization of oxide material on patterned wafers. The horizontal accelerometer showed more sensitivity to changes in vibration amplitude than the vertical accelerometer for a given polish condition. At low carrier and platen rotation rates, the change in the vibration signal over time at fixed frequencies decreased by approximately ½ to 1 order of magnitude over the 2 to 10 psi range of polish pressures. At high rotation speeds, the vibration signal remained essentially constant, indicating that other factors dominated the vibration signal. These results show that while it is possible to sense changes in acceleration during polishing, more robust hardware and signal-processing algorithms are required to ensure its use over a wide range of process conditions.
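
    One way to turn accelerometer output into an end-point trend, offered here as an illustration rather than the authors' signal processing, is to track the spectral amplitude of a fixed frequency band over successive time windows:

```python
import numpy as np

def band_amplitude_trend(signal, fs, f_lo, f_hi, window_s=1.0):
    """Track the amplitude of a fixed frequency band over time: split the
    signal into non-overlapping windows, FFT each, and sum the spectral
    magnitude between f_lo and f_hi (Hz). A sustained change in this trend
    is one candidate end-point indicator."""
    n = int(fs * window_s)
    trend = []
    for start in range(0, len(signal) - n + 1, n):
        seg = signal[start:start + n]
        spec = np.abs(np.fft.rfft(seg))
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        band = (freqs >= f_lo) & (freqs <= f_hi)
        trend.append(spec[band].sum())
    return np.array(trend)

# A tone that disappears halfway through produces a clear drop in the trend:
fs = 1000
t = np.arange(0, 4.0, 1.0 / fs)
sig = np.where(t < 2.0, np.sin(2 * np.pi * 100 * t), 0.0)
trend = band_amplitude_trend(sig, fs, 90, 110)
```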

  5. An analysis of seismic risk from a tourism point of view.

    Science.gov (United States)

    Mäntyniemi, Päivi

    2012-07-01

    Global awareness of natural calamities increased after the destructive Indian Ocean tsunami of December 2004, largely because many foreigners lost their lives, especially in Thailand. This paper explores how best to communicate the seismic risk posed by different travel destinations to crisis management personnel in tourists' home countries. The analysis of seismic risk should be straightforward enough for non-specialists, yet powerful enough to identify the travel destinations most at risk. The output for each location is a point in a 3D space whose dimensions characterize the natural environment, the built-up environment, and local tourism. The tourism-specific factors can be tailored according to the tourists' nationality. The necessary information can be collected from various directories and statistics, much of it available over the Internet. The output helps to illustrate the overall seismic risk conditions of different travel destinations, allows for comparison across destinations, and identifies the places that are most at risk. © 2012 The Author(s). Journal compilation © Overseas Development Institute, 2012.

  6. An automated smartphone-based diagnostic assay for point-of-care semen analysis

    Science.gov (United States)

    Kanakasabapathy, Manoj Kumar; Sadasivam, Magesh; Singh, Anupriya; Preston, Collin; Thirumalaraju, Prudhvi; Venkataraman, Maanasa; Bormann, Charles L.; Draz, Mohamed Shehata; Petrozza, John C.; Shafiee, Hadi

    2017-01-01

    Male infertility affects up to 12% of the world's male population and is linked to various environmental and medical conditions. Manual microscope-based testing and computer-assisted semen analysis (CASA) are the current standard methods to diagnose male infertility; however, these methods are labor-intensive, expensive, and laboratory-based. Cultural and social stigma against male infertility testing hinders a large number of men from getting tested, especially in resource-limited African countries. We describe the development and clinical testing of an automated smartphone-based semen analyzer designed for quantitative measurement of sperm concentration and motility for point-of-care male infertility screening. Using a total of 350 clinical semen specimens at a fertility clinic, we have shown that our assay can analyze an unwashed, unprocessed liquefied semen sample using a smartphone, which can make remote semen quality testing accessible to people in both developed and developing countries who have access to smartphones. PMID:28330865

  7. HACCP (Hazard Analysis Critical Control Points): is it coming to the dairy?

    Science.gov (United States)

    Cullor, J S

    1997-12-01

    The risks and consequences of foodborne and waterborne pathogens are coming to the forefront of public health concerns, and strong pressure is being applied on agriculture for immediate implementation of on-farm controls. The FDA is considering HACCP (Hazard Analysis Critical Control Points) as the new foundation for revision of the US Food Safety Assurance Program because HACCP is considered to be a science-based, systematic approach to the prevention of food safety problems. In addition, the implementation of HACCP principles permits more government oversight through requirements for standard operating procedures and additional systems for keeping records, places primary responsibility for ensuring food safety on the food manufacturer or distributor, and may assist US food companies in competing more effectively in the world market. With the HACCP-based program in place, a government investigator should be able to determine and evaluate both current and past conditions that are critical to ensuring the safety of the food produced by the facility. When this policy is brought to the production unit, the impact for producers and veterinarians will be substantial.

  8. Analysis of three geopressured geothermal aquifer-natural gas fields; Duson Hollywood and Church Point, Louisiana

    Energy Technology Data Exchange (ETDEWEB)

    Rogers, L.A.; Boardman, C.R.

    1981-05-01

    The available well logs, production records, and geological structure maps were analyzed for the Hollywood, Duson, and Church Point, Louisiana oil and gas fields to determine the areal extent of the sealed geopressured blocks and to identify which aquifer sands within the blocks are connected to commercial production of hydrocarbons. The analysis showed that, over the depth intervals of the geopressured zones shown on the logs, essentially all of the sands of any substantial thickness had gas produced from them somewhere in the fault block. It is therefore expected that the sands which are fully brine saturated in many of the wells form the water-drive portion of gas/oil production elsewhere within the fault block. In this study only one deep sand was identified, in the Hollywood field, that was not connected to a producing horizon somewhere else in the field. Estimates of the reservoir parameters were made, and a hypothetical production calculation showed the probable production to be less than 10,000 b/d. The gas price required to produce this gas profitably is well above the current market price.

  9. Design Parameters Analysis of Point Absorber WEC via an evolutionary-algorithm-based Dimensioning Tool

    Directory of Open Access Journals (Sweden)

    Marcos Blanco

    2015-10-01

    Wave energy conversion differs essentially from other renewable energies in that the dependence between device design and the energy resource is stronger. Dimensioning is therefore considered a key stage when a design project of Wave Energy Converters (WECs) is undertaken. Location, WEC concept, Power Take-Off (PTO) type, control strategy and hydrodynamic resonance considerations are some of the critical aspects to take into account to achieve a good performance. The paper proposes an automatic dimensioning methodology to be accomplished at the initial design project stages, and the following elements are described to carry out the study: an optimization design algorithm, its objective functions and restrictions, a PTO model, as well as a procedure to evaluate the WEC energy production. After that, a parametric analysis is included considering different combinations of the key parameters previously introduced. A variety of study cases are analysed from the point of view of energy production for different design parameters, and all of them are compared with a reference case. Finally, a discussion is presented based on the results obtained, and some recommendations to face the WEC design stage are given.
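
The evolutionary dimensioning loop can be sketched with a toy (mu + lambda)-style selection-mutation scheme over two design parameters. The `annual_energy` function below is a made-up quadratic surrogate, not the paper's hydrodynamic/PTO model, and the parameter names and bounds are assumptions for illustration only.

```python
import random

def annual_energy(radius, damping):
    # Toy surrogate for WEC energy production, peaking at an intermediate
    # radius/PTO-damping pair (purely illustrative, not a hydrodynamic model).
    return 100.0 - (radius - 5.0) ** 2 - 0.5 * (damping - 2.0) ** 2

def evolve(pop_size=30, generations=60, seed=1):
    """Keep the best half of the population, mutate it, repeat."""
    rng = random.Random(seed)
    pop = [(rng.uniform(1, 10), rng.uniform(0.1, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda ind: annual_energy(*ind), reverse=True)
        parents = pop[:pop_size // 2]
        children = [(max(1.0, r + rng.gauss(0, 0.2)),
                     max(0.1, d + rng.gauss(0, 0.1))) for r, d in parents]
        pop = parents + children
    return max(pop, key=lambda ind: annual_energy(*ind))

best_radius, best_damping = evolve()
```

In a real dimensioning tool the objective function would wrap a hydrodynamic and PTO simulation, and restrictions (e.g. resonance conditions) would enter as penalties or bounds.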

  10. A parallelized Python based Multi-Point Thomson Scattering analysis in NSTX-U

    Science.gov (United States)

    Miller, Jared; Diallo, Ahmed; Leblanc, Benoit

    2014-10-01

    Multi-Point Thomson Scattering (MPTS) is a reliable and accurate method of finding the temperature, density, and pressure of a magnetically confined plasma. Nd:YAG (1064 nm) lasers are fired into the plasma with a frequency of 60 Hz, and the light is Doppler shifted by Thomson scattering. Polychromators on the midplane of the tokamak pick up the light at various radii/scattering angles, and the avalanche photodiodes' voltages are added to an MDSplus tree for later analysis. This project ports and optimizes the prior serial IDL MPTS code into a well-documented Python package that runs in parallel. Since there are 30 polychromators in the current NSTX setup (12 more will be added when NSTX-U is completed), using parallelism offers substantial savings in performance. NumPy and SciPy further accelerate numerical calculations and matrix operations, Matplotlib and PyQt make an intuitive GUI with plots of the output, and Multiprocessing parallelizes the computationally intensive calculations. The Python package was designed with portability and flexibility in mind so it can be adapted for use in any polychromator-based MPTS system.

  11. Instantaneous normal mode analysis for intermolecular and intramolecular vibrations of water from atomic point of view

    Science.gov (United States)

    Chen, Yu-Chun; Tang, Ping-Han; Wu, Ten-Ming

    2013-11-01

    By exploiting the instantaneous normal mode (INM) analysis for models of flexible molecules, we investigate intermolecular and intramolecular vibrations of water from the atomic point of view. With two flexible SPC/E models, our investigation covers three aspects of their INM spectra, which are separated into the unstable, intermolecular, bending, and stretching bands. First, the O- and H-atom contributions in the four INM bands are calculated and their stable INM spectra are compared with the power spectra of the atomic velocity autocorrelation functions. The unstable and intermolecular bands of the flexible models are also compared with those of the SPC/E model of rigid molecules. Second, we formulate the inverse participation ratio (IPR) of the INMs for the O atom, the H atom, and the molecule, respectively. With the IPRs, the numbers of the three species participating in the INMs are estimated, and the localization character of the INMs in each band is studied. Further, from the ratio of the IPR of the H atom to that of the O atom, we estimate the number of OH bonds per molecule involved in the INMs. Third, by classifying simulated molecules into subensembles according to the geometry of their local environments or their H-bond configurations, we examine the local-structure effects on the bending and stretching INM bands. All of our results are verified to be insensitive to the definition of the H-bond. Our conclusions about the intermolecular and intramolecular vibrations in water are given.
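
The participation number that an IPR yields can be illustrated with a minimal sketch. This uses one common convention, N = 1 / sum of squared normalized weights, rather than the paper's exact per-species definitions, so treat it as an assumption-laden illustration.

```python
import numpy as np

def participation_number(weights):
    """Participation number from the inverse participation ratio: with
    per-atom weights w_i (normalized to sum to 1), the number of atoms
    taking part in a mode is N = 1 / sum_i w_i**2. A fully localized
    mode gives N = 1; a mode spread evenly over M atoms gives N = M."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return 1.0 / np.sum(w ** 2)

# A mode localized on a single atom vs. one spread evenly over 100 atoms.
localized = np.zeros(100)
localized[0] = 1.0
delocalized = np.ones(100)
```

Comparing such participation numbers band by band is how localization characters like those in the abstract are quantified.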

  12. Instantaneous normal mode analysis for intermolecular and intramolecular vibrations of water from atomic point of view.

    Science.gov (United States)

    Chen, Yu-Chun; Tang, Ping-Han; Wu, Ten-Ming

    2013-11-28

    By exploiting the instantaneous normal mode (INM) analysis for models of flexible molecules, we investigate intermolecular and intramolecular vibrations of water from the atomic point of view. With two flexible SPC/E models, our investigation covers three aspects of their INM spectra, which are separated into the unstable, intermolecular, bending, and stretching bands. First, the O- and H-atom contributions in the four INM bands are calculated and their stable INM spectra are compared with the power spectra of the atomic velocity autocorrelation functions. The unstable and intermolecular bands of the flexible models are also compared with those of the SPC/E model of rigid molecules. Second, we formulate the inverse participation ratio (IPR) of the INMs for the O atom, the H atom, and the molecule, respectively. With the IPRs, the numbers of the three species participating in the INMs are estimated, and the localization character of the INMs in each band is studied. Further, from the ratio of the IPR of the H atom to that of the O atom, we estimate the number of OH bonds per molecule involved in the INMs. Third, by classifying simulated molecules into subensembles according to the geometry of their local environments or their H-bond configurations, we examine the local-structure effects on the bending and stretching INM bands. All of our results are verified to be insensitive to the definition of the H-bond. Our conclusions about the intermolecular and intramolecular vibrations in water are given.

  13. A node-based smoothed point interpolation method for dynamic analysis of rotating flexible beams

    Science.gov (United States)

    Du, C. F.; Zhang, D. G.; Li, L.; Liu, G. R.

    2017-10-01

    We propose a mesh-free method, called the node-based smoothed point interpolation method (NS-PIM), for dynamic analysis of rotating beams. A gradient smoothing technique is used, and the requirements on the consistency of the displacement functions are further weakened. In static problems, beams with three types of boundary conditions are analyzed, and the results are compared with the exact solution, which shows the effectiveness of the method and provides an upper-bound solution for the deflection. This means that the NS-PIM softens the system. The NS-PIM is then further extended to solve a rigid-flexible coupled system dynamics problem, considering a rotating flexible cantilever beam that undergoes not only transverse but also longitudinal deformations. The rigid-flexible coupled dynamic equations of the system are derived by employing Lagrange's equations of the second kind. Simulation results of the NS-PIM are compared with those obtained using the finite element method (FEM) and the assumed mode method. It is found that, compared with FEM, the NS-PIM is more robust against ill-conditioning under the same calculation conditions.

  14. Optimized Granularity Analysis of Maximum Power Point Trackers in Low Power Applications

    Science.gov (United States)

    2017-06-01

    The inductor stores energy in its magnetic field, Estored = ½LI², where L is the inductance and I is the electric current. Likewise, the capacitor builds charge and stores electrostatic energy in the form of voltage, Estored = ½CV². Consumer products are typically designed to function for a matter of hours between charges; however, military operations may last days or weeks without...

  15. Spatial analysis of ecosystem service relationships to improve targeting of payments for hydrological services.

    Science.gov (United States)

    Mokondoko, Pierre; Manson, Robert H; Ricketts, Taylor H; Geissert, Daniel

    2018-01-01

    Payment for hydrological services (PHS) programs are popular tools for conserving ecosystems and their water-related services. However, improving the spatial targeting and impacts of PHS, as well as their ability to foster synergies with other ecosystem services (ES), remains challenging. We aimed to use spatial analyses to evaluate the targeting performance of México's National PHS program in central Veracruz. We quantified the effectiveness of areas targeted for PHS in actually covering areas of high HS provision and social priority during 2003-2013. First, we quantified the provisioning and spatial distributions of two target ES (water yield and soil retention) and one non-target ES (carbon storage) using InVEST. Subsequently, pairwise relationships among ES were quantified using spatial correlation and overlap analyses. Finally, we evaluated targeting by: (i) prioritizing areas of individual and overlapping ES; (ii) quantifying spatial co-occurrences of these priority areas with those targeted by PHS; (iii) evaluating the extent to which PHS directly contribute to HS delivery; and (iv) testing whether PHS targeted areas disproportionately covered areas of high ecological and social priority. We found that modelled priority areas exhibited non-random distributions and distinct spatial patterns. Our results show significant pairwise correlations between all ES, suggesting synergistic relationships. However, our analysis showed a significantly lower overlap than expected, and thus significant mismatches between PHS targeted areas and all types of priority areas. These findings suggest that the targeting of areas with high HS provisioning and social priority by Mexico's PHS program could be improved significantly.
This study underscores: (1) the importance of using maps of HS provisioning as main targeting criteria in PHS design to channel payments towards areas that require future conservation, and (2) the need for future research that helps balance ecological and socioeconomic

  16. An assessment of independent component analysis for detection of military targets from hyperspectral images

    Science.gov (United States)

    Tiwari, K. C.; Arora, M. K.; Singh, D.

    2011-10-01

    Hyperspectral data acquired over hundreds of narrow contiguous wavelength bands are extremely suitable for target detection due to their high spectral resolution. Though the spectral response of every material is expected to be unique, in practice it exhibits variations, a phenomenon known as spectral variability. Most target detection algorithms depend on spectral modelling using a priori available target spectra. In practice, however, target spectra are seldom available a priori. Independent component analysis (ICA) is a new evolving technique that aims at finding components which are statistically independent, or as independent as possible. The technique therefore has the potential of being used for target detection applications. An assessment of target detection from hyperspectral images using ICA and other algorithms based on spectral modelling may be of immense interest, since ICA does not require a priori target information. The aim of this paper is, thus, to assess the potential of an ICA based algorithm vis-à-vis other prevailing algorithms for military target detection. Four spectral matching algorithms, namely Orthogonal Subspace Projection (OSP), Constrained Energy Minimisation (CEM), Spectral Angle Mapper (SAM) and Spectral Correlation Mapper (SCM), and four anomaly detection algorithms, namely the OSP anomaly detector (OSPAD), the Reed-Xiaoli anomaly detector (RXD), the Uniform Target Detector (UTD) and a combination of the Reed-Xiaoli anomaly detector and the Uniform Target Detector (RXD-UTD), were considered. The experiments were conducted using a set of synthetic and AVIRIS hyperspectral images containing aircraft as military targets. A comparison of the true positive and false positive rates of target detections obtained from ICA and the other algorithms, plotted in receiver operating characteristic (ROC) space, indicates the superior performance of ICA over the other algorithms.
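
Of the algorithms compared, the anomaly detectors need no prior target spectrum; a minimal numpy sketch of Reed-Xiaoli (RX) scoring plus a rank-based AUC on synthetic data illustrates the idea. This is not the paper's ICA method or its AVIRIS imagery; the scene statistics below are invented.

```python
import numpy as np

def rx_detector(pixels):
    """Reed-Xiaoli anomaly score: Mahalanobis distance of each pixel
    spectrum from the scene mean; no prior target spectrum is needed."""
    mu = pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pixels, rowvar=False))
    diff = pixels - mu
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = labels == 1
    n_pos, n_neg = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Synthetic scene: 500 background pixels, 20 target pixels with a
# spectral offset (invented numbers, not the paper's data).
rng = np.random.default_rng(0)
bands = 20
background = rng.normal(0.0, 1.0, size=(500, bands))
targets = rng.normal(0.0, 1.0, size=(20, bands)) + 2.0
pixels = np.vstack([background, targets])
labels = np.r_[np.zeros(500), np.ones(20)]
scores = rx_detector(pixels)
```

Plotting true-positive against false-positive rates at each score threshold gives exactly the ROC space in which the paper compares the detectors.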

  17. Focal point analysis of the singlet-triplet energy gap of octacene and larger acenes.

    Science.gov (United States)

    Hajgató, Balázs; Huzak, Matija; Deleuze, Michael S

    2011-08-25

    A benchmark theoretical study of the electronic ground state and of the vertical and adiabatic singlet-triplet (ST) excitation energies of n-acenes (C(4n+2)H(2n+4)) ranging from octacene (n = 8) to undecacene (n = 11) is presented. The T1 diagnostics of coupled cluster theory and further energy-based criteria demonstrate that all investigated systems exhibit predominantly a (1)A(g) singlet closed-shell electronic ground state. Singlet-triplet (S(0)-T(1)) energy gaps can therefore be very accurately determined by applying the principle of a focal point analysis (FPA) onto the results of a series of single-point and symmetry-restricted calculations employing correlation consistent cc-pVXZ basis sets (X = D, T, Q, 5) and single-reference methods [HF, MP2, MP3, MP4SDQ, CCSD, and CCSD(T)] of improving quality. According to our best estimates, which amount to a dual extrapolation of energy differences to the level of coupled cluster theory including single, double, and perturbative estimates of connected triple excitations [CCSD(T)] in the limit of an asymptotically complete basis set (cc-pV∞Z), the S(0)-T(1) vertical (adiabatic) excitation energies of these compounds amount to 13.40 (8.21), 10.72 (6.05), 8.05 (3.67), and 7.10 (2.58) kcal/mol, respectively. In line with the absence of Peierls distortions (bond length alternations), extrapolations of results obtained at this level for benzene (n = 1) and all studied n-acenes so far (n = 2-11) indicate a vanishing S(0)-T(1) energy gap, in the limit of an infinitely large polyacene, within an uncertainty of 1.5 kcal/mol (0.06 eV). Lacking experimental values for the S(0)-T(1) energy gaps of n-acenes larger than hexacene, comparison is made with recent optical and electrochemical determinations of the HOMO-LUMO band gap. Further issues such as scalar relativistic, core correlation, and diagonal Born-Oppenheimer corrections (DBOCs) are tentatively examined. © 2011 American Chemical Society
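
The basis-set-limit step of a focal point analysis can be illustrated with the standard two-point inverse-cubic extrapolation of correlation energies. This is a textbook formula, not the paper's full dual extrapolation over methods and basis sets, and the energies below are invented numbers.

```python
def cbs_extrapolate(e_x, e_y, x, y):
    """Two-point inverse-cubic extrapolation of correlation energies:
    E(X) = E_CBS + A / X**3, solved from cardinal numbers X < Y
    (e.g. X = 3 for cc-pVTZ, Y = 4 for cc-pVQZ)."""
    a = (e_x - e_y) / (x ** -3 - y ** -3)
    return e_x - a * x ** -3

# Invented correlation energies (hartree) at triple- and quadruple-zeta.
e_tz, e_qz = -1.0520, -1.0610
e_cbs = cbs_extrapolate(e_tz, e_qz, 3, 4)
```

In an FPA table, such extrapolated increments at successive correlation levels are summed to estimate the CCSD(T)/cc-pV∞Z energy difference.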

  18. Sentiment analysis enhancement with target variable in Kumar’s Algorithm

    Science.gov (United States)

    Arman, A. A.; Kawi, A. B.; Hurriyati, R.

    2016-04-01

    Sentiment analysis (also known as opinion mining) refers to the use of text analysis and computational linguistics to identify and extract subjective information in source materials. Sentiment analysis is widely applied to reviews and discussions taking place in social media for many purposes, ranging from marketing and customer service to gauging public opinion of public policy. One of the popular algorithms for sentiment analysis is the Kumar algorithm, developed by Kumar and Sebastian. The Kumar algorithm can compute the sentiment score of a statement, sentence, or tweet, but cannot determine the object or target to which that sentiment relates. This research proposes a solution to that challenge by adding a component representing the object or target to the existing Kumar algorithm. The result of this research is a modified algorithm that can give a sentiment score based on a given object or target.
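
The idea of attaching a sentiment score to a target term rather than to the whole tweet can be sketched with a toy lexicon scorer. This is a hypothetical illustration, not Kumar and Sebastian's actual algorithm; the lexicon and the window rule are invented.

```python
POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "poor", "terrible", "hate"}

def target_sentiment(tweet, target, window=2):
    """Toy target-aware scorer: only sentiment words within `window`
    tokens of the target term count, so each object mentioned in the
    text gets its own score."""
    tokens = tweet.lower().split()
    if target not in tokens:
        return 0
    idx = tokens.index(target)
    score = 0
    for i, tok in enumerate(tokens):
        if abs(i - idx) <= window:
            score += (tok in POSITIVE) - (tok in NEGATIVE)
    return score

text = "the camera is great but the battery is terrible"
camera_score = target_sentiment(text, "camera")
battery_score = target_sentiment(text, "battery")
```

A whole-sentence scorer would net these out to zero; the target-aware version keeps the opposite sentiments about "camera" and "battery" separate, which is the gap the modified algorithm addresses.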

  19. Targeted DNA sequencing and in situ mutation analysis using mobile phone microscopy

    Science.gov (United States)

    Kühnemund, Malte; Wei, Qingshan; Darai, Evangelia; Wang, Yingjie; Hernández-Neuta, Iván; Yang, Zhao; Tseng, Derek; Ahlford, Annika; Mathot, Lucy; Sjöblom, Tobias; Ozcan, Aydogan; Nilsson, Mats

    2017-01-01

    Molecular diagnostics is typically outsourced to well-equipped centralized laboratories, often far from the patient. We developed molecular assays and portable optical imaging designs that permit on-site diagnostics with a cost-effective mobile-phone-based multimodal microscope. We demonstrate that targeted next-generation DNA sequencing reactions and in situ point mutation detection assays in preserved tumour samples can be imaged and analysed using mobile phone microscopy, achieving a new milestone for tele-medicine technologies.

  20. Type of evidence behind point-of-care clinical information products: a bibliometric analysis.

    Science.gov (United States)

    Ketchum, Andrea M; Saleh, Ahlam A; Jeong, Kwonho

    2011-02-18

    Point-of-care (POC) products are widely used as information reference tools in the clinical setting. Although usability, scope of coverage, ability to answer clinical questions, and impact on health outcomes have been studied, no comparative analysis of the characteristics of the references, the evidence for the content, in POC products is available. The objective of this study was to compare the type of evidence behind five POC clinical information products. This study is a comparative bibliometric analysis of references cited in monographs in POC products. Five commonly used products served as subjects for the study: ACP PIER, Clinical Evidence, DynaMed, FirstCONSULT, and UpToDate. The four clinical topics examined to identify content in the products were asthma, hypertension, hyperlipidemia, and carbon monoxide poisoning. Four indicators were measured: distribution of citations, type of evidence, product currency, and citation overlap. The type of evidence was determined based primarily on the publication type found in the MEDLINE bibliographic record, as well as the Medical Subject Headings (MeSH), both assigned by the US National Library of Medicine. MeSH is the controlled vocabulary used for indexing articles in MEDLINE/PubMed. FirstCONSULT had the greatest proportion of references with higher levels of evidence publication types such as systematic review and randomized controlled trial (137/153, 89.5%), although it contained the lowest total number of references (153/2330, 6.6%). DynaMed had the largest total number of references (1131/2330, 48.5%) and the largest proportion of current (2007-2009) references (170/1131, 15%). The distribution of references cited for each topic varied between products. For example, asthma had the most references listed in DynaMed, Clinical Evidence, and FirstCONSULT, while hypertension had the most references in UpToDate and ACP PIER. 
An unexpected finding was that the rate of citation overlap was less than 1% for each topic
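
The citation-overlap indicator reduces to a pairwise set comparison over each product's cited references. A minimal sketch with hypothetical reference IDs (not the study's actual reference lists) is:

```python
def pairwise_overlap(product_refs):
    """Jaccard overlap, as a percentage, between each pair of products'
    cited-reference sets: 100 * |A & B| / |A | B|."""
    names = sorted(product_refs)
    out = {}
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            sa, sb = product_refs[a], product_refs[b]
            out[(a, b)] = 100.0 * len(sa & sb) / len(sa | sb)
    return out

# Hypothetical PubMed IDs per product, for illustration only.
refs = {
    "ProductA": {"pmid1", "pmid2", "pmid3"},
    "ProductB": {"pmid3", "pmid4"},
    "ProductC": {"pmid5"},
}
overlap = pairwise_overlap(refs)
```

Overlap rates under 1%, as the study found, mean such sets are nearly disjoint across products for the same clinical topic.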

  1. Analysis of multi-species point patterns using multivariate log Gaussian Cox processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Guan, Yongtao; Jalilian, Abdollah

    Multivariate log Gaussian Cox processes are flexible models for multivariate point patterns. However, they have so far only been applied in bivariate cases. In this paper we move beyond the bivariate case in order to model multi-species point patterns of tree locations. In particular we address...

  2. Point-of-care lactate and creatinine analysis for sick obstetric ...

    African Journals Online (AJOL)

    2016-03-15

    Mar 15, 2016 ... and while point-of-care devices are often used for research studies, they are scarcely available for routine ... Point-of-care clinical chemistry testing was feasible, practical, and well received by staff, and was considered to have a useful role to play in the clinical care of sick obstetric patients at this referral ...

  3. Fixed Points of Belief Propagation - An Analysis via Polynomial Homotopy Continuation.

    Science.gov (United States)

    Knoll, Christian; Mehta, Dhagash; Chen, Tianran; Pernkopf, Franz

    2017-09-07

    Belief propagation (BP) is an iterative method to perform approximate inference on arbitrary graphical models. Whether BP converges and if the solution is a unique fixed point depends on both the structure and the parametrization of the model. To understand this dependence it is interesting to find all fixed points.

  4. Functional Equivalence of Autistic Leading and Communicative Pointing: Analysis and Treatment.

    Science.gov (United States)

    Carr, Edward G.; Kemp, Duane C.

    1989-01-01

    Autistic leading in four autistic children, aged three-five, was treated by strengthening pointing as an alternative form of request. Following intervention, pointing gradually replaced leading, and stimulus generalization was observed. Results indicate that functional equivalence and response efficiency can be procedurally combined to…

  5. Analysis of the Neutron Generator and Target for the LSDTS System

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Lee, Yong Deok; Song, Jae Hoon; Song, Kee Chan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2008-11-15

    A preliminary analysis was performed based on the literature and patents on neutron generators and targets for the lead slowing-down time spectrometer (LSDTS) system. It was found that the local neutron generator did not provide sufficient neutron intensity, namely 1E+12 n/s, which is the minimum requirement for the LSDTS system to overcome curium backgrounds. However, a neutron generator implemented with an electron accelerator may provide a higher intensity, around 1E+13 n/s, and further investigation, including a detailed analysis, is required. In addition to the neutron generator, a study of the target was performed with Monte Carlo simulation. In the study, an optimal target design was suggested to provide a high neutron yield and better thermal resistance. The suggested target consists of several cylindrical plates separated by cooling gaps, with increasing thickness and increasing radius.

  6. Analysis of the Neutron Generator and Target for the LSDTS System

    International Nuclear Information System (INIS)

    Park, Chang Je; Lee, Yong Deok; Song, Jae Hoon; Song, Kee Chan

    2008-11-01

    A preliminary analysis was performed based on the literature and patents on neutron generators and targets for the lead slowing-down time spectrometer (LSDTS) system. It was found that the local neutron generator did not provide sufficient neutron intensity, namely 1E+12 n/s, which is the minimum requirement for the LSDTS system to overcome curium backgrounds. However, a neutron generator implemented with an electron accelerator may provide a higher intensity, around 1E+13 n/s, and further investigation, including a detailed analysis, is required. In addition to the neutron generator, a study of the target was performed with Monte Carlo simulation. In the study, an optimal target design was suggested to provide a high neutron yield and better thermal resistance. The suggested target consists of several cylindrical plates separated by cooling gaps, with increasing thickness and increasing radius.

  7. Target preparation and neutron activation analysis: a successful story at IRMM

    International Nuclear Information System (INIS)

    Robouch, P.; Arana, G.; Eguskiza, M.; Maguregui, M.I.; Pomme, S.; Ingelbrecht, C.

    2002-01-01

    The main task of a target producer is to make well characterized and homogeneous deposits on specific supports. Alpha and/or gamma spectrometry are traditionally used to monitor the quality of actinide deposits. With the increasing demand for enriched stable isotope targets, other analytical techniques, such as ICP-MS and NAA, are needed. This paper presents the application of neutron activation analysis to quality control of 'thin' targets, 'thicker' neutron dosimeters and 'thick' bronze disks prepared by the Reference Materials Unit at the Institute of Reference Materials and Measurements

  8. Target preparation and neutron activation analysis a successful story at IRMM

    CERN Document Server

    Robouch, P; Eguskiza, M; Maguregui, M I; Pommé, S; Ingelbrecht, C

    2002-01-01

    The main task of a target producer is to make well characterized and homogeneous deposits on specific supports. Alpha and/or gamma spectrometry are traditionally used to monitor the quality of actinide deposits. With the increasing demand for enriched stable isotope targets, other analytical techniques, such as ICP-MS and NAA, are needed. This paper presents the application of neutron activation analysis to quality control of 'thin' targets, 'thicker' neutron dosimeters and 'thick' bronze disks prepared by the Reference Materials Unit at the Institute of Reference Materials and Measurements.

  9. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    OpenAIRE

    Yu-Ting Hung; Chi-Te Liu; I-Chen Peng; Chin Hsu; Roch-Chui Yu; Kuan-Chen Cheng

    2015-01-01

    To ensure the safety of the peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan has been designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. Critical control points for the peanut butter ice cream were then determined as the pasteurization and freezing process. The establishment of a monitoring system, corrective actions, verification procedures, and doc...

  10. Point-of-sale tobacco promotion and youth smoking: a meta-analysis.

    Science.gov (United States)

    Robertson, Lindsay; Cameron, Claire; McGee, Rob; Marsh, Louise; Hoek, Janet

    2016-12-01

    Previous systematic reviews have found consistent evidence of a positive association between exposure to point-of-sale (POS) tobacco promotion and increased smoking and smoking susceptibility among children and adolescents. No meta-analysis has been conducted on these studies to date. Systematic literature searches were carried out to identify all quantitative observational studies that examined the relationship between POS tobacco promotion and individual-level smoking and smoking-related cognitions among children and adolescents, published between January 1990 and June 2014. Random-effects meta-analyses were used. Subgroup analyses were conducted according to extent of tobacco POS advertising environment in the study environment. Sensitivity analyses were performed according to study size and quality. 13 studies met the inclusion criteria; 11 reported data for behavioural outcomes, 6 for cognitive outcomes (each of these assessed smoking susceptibility). The studies were cross-sectional, with the exception of 2 cohort studies. For the behavioural outcomes, the pooled OR was 1.61 (95% CI 1.33 to 1.96) and for smoking susceptibility the pooled OR was 1.32 (95% CI 1.09 to 1.61). Children and adolescents more frequently exposed to POS tobacco promotion have around 1.6 times higher odds of having tried smoking and around 1.3 times higher odds of being susceptible to future smoking, compared with those less frequently exposed. Together with the available evaluations of POS display bans, the results strongly indicate that legislation banning tobacco POS promotion will effectively reduce smoking among young people. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
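
The random-effects pooling behind figures like the OR of 1.61 (95% CI 1.33 to 1.96) can be sketched with the DerSimonian-Laird estimator. The per-study odds ratios and variances below are invented for illustration; they are not the review's extracted data.

```python
import math

def dersimonian_laird(log_ors, variances):
    """Random-effects pooling of log odds ratios with the
    DerSimonian-Laird estimate of between-study variance tau^2.
    Returns the pooled OR and its 95% confidence interval."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, log_ors)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_ors))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(log_ors) - 1)) / c)
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, log_ors)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))

# Illustrative per-study ORs (log scale) and variances.
log_ors = [math.log(1.5), math.log(1.8), math.log(1.4), math.log(2.0)]
variances = [0.04, 0.06, 0.05, 0.08]
or_pooled, ci_lo, ci_hi = dersimonian_laird(log_ors, variances)
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau^2 is truncated to zero and the estimate coincides with the fixed-effect pooled OR.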

  11. Performance Analysis of Free-Space Optical Links Over Malaga (M) Turbulence Channels with Pointing Errors

    KAUST Repository

    Ansari, Imran Shafique

    2015-08-12

In this work, we present a unified performance analysis of a free-space optical (FSO) link that accounts for pointing errors and both types of detection techniques (i.e. intensity modulation/direct detection (IM/DD) as well as heterodyne detection). More specifically, we present unified exact closed-form expressions for the cumulative distribution function, the probability density function, the moment generating function, and the moments of the end-to-end signal-to-noise ratio (SNR) of a single-link FSO transmission system, all in terms of the Meijer's G function except for the moments, which are in terms of simple elementary functions. We then capitalize on these unified results to offer unified exact closed-form expressions for various performance metrics of FSO link transmission systems, such as the outage probability, the scintillation index (SI), the average error rate for binary and M-ary modulation schemes, and the ergodic capacity (except for the IM/DD technique, for which we present closed-form lower-bound results), all in terms of Meijer's G functions except for the SI, which is in terms of simple elementary functions. Additionally, for all the expressions derived earlier in terms of the Meijer's G function, we derive asymptotic results in the high-SNR regime in terms of simple elementary functions via an asymptotic expansion of the Meijer's G function. We also derive new asymptotic expressions for the ergodic capacity in the low- as well as high-SNR regimes in terms of simple elementary functions by utilizing moments. All the presented results are verified via computer-based Monte-Carlo simulations.
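The Monte-Carlo verification mentioned at the end of the abstract can be illustrated with a much simpler channel than the Málaga model: the sketch below checks a closed-form outage probability against simulation for a hypothetical lognormal fading gain. All parameter values are assumptions chosen for illustration only.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 0.0, 0.5         # log-gain parameters (hypothetical)
snr0, gamma_th = 10.0, 5.0   # average SNR and outage threshold (linear scale)

# Monte-Carlo estimate: outage occurs when instantaneous SNR falls below gamma_th
h = np.exp(rng.normal(mu, sigma, 200_000))   # lognormal channel gain samples
p_mc = np.mean(snr0 * h < gamma_th)

# Closed form for the lognormal case: P_out = Phi((ln(gamma_th/snr0) - mu)/sigma)
z = (math.log(gamma_th / snr0) - mu) / sigma
p_exact = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

The two values agree to within Monte-Carlo sampling error, mirroring the verification methodology of the paper on a toy model.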

  12. Screening of point mutations by multiple SSCP analysis in the dystrophin gene

    Energy Technology Data Exchange (ETDEWEB)

    Lasa, A.; Baiget, M.; Gallano, P. [Hospital Sant Pau, Barcelona (Spain)]

    1994-09-01

Duchenne muscular dystrophy (DMD) is a lethal, X-linked neuromuscular disorder. The population frequency of DMD is one in approximately 3500 boys, of whom about one-third are thought to be new mutants. The DMD gene is the largest known to date, spanning over 2.3 Mb in band Xp21.2; 79 exons are transcribed into a 14 kb mRNA coding for a 427 kDa protein which has been named dystrophin. It has been shown that about 65% of affected boys have a gene deletion with a wide variation in localization and size. The remaining affected individuals, who have no detectable deletions or duplications, probably carry more subtle mutations that are difficult to detect. These mutations occur in several different exons and seem to be unique to single patients. Their identification represents a formidable goal because of the large size and complexity of the dystrophin gene. SSCP is a very efficient method for the detection of point mutations if the parameters that affect the separation of the strands are optimized for a particular DNA fragment. Multiple SSCP allows the simultaneous study of several exons and implies the use of different conditions, because no single set of conditions will be optimal for all fragments. Seventy-eight DMD patients with no deletion or duplication in the dystrophin gene were selected for the multiple SSCP analysis. Genomic DNA from these patients was amplified using the primers described for the diagnostic procedure (muscle promoter and exons 3, 8, 12, 16, 17, 19, 32, 45, 48 and 51). We have observed different mobility shifts in bands corresponding to exons 8, 12, 43 and 51. In exons 17 and 45, altered electrophoretic patterns were found in different samples, identifying polymorphisms already described.

  13. Flux balance analysis of ammonia assimilation network in E. coli predicts preferred regulation point.

    Directory of Open Access Journals (Sweden)

    Lu Wang

Full Text Available Nitrogen assimilation is a critical biological process for the synthesis of biomolecules in Escherichia coli. The central ammonium assimilation network in E. coli converts the carbon skeleton α-ketoglutarate and ammonium into glutamate and glutamine, which further serve as nitrogen donors for nitrogen metabolism in the cell. This reaction network involves three enzymes: glutamate dehydrogenase (GDH), glutamine synthetase (GS) and glutamate synthase (GOGAT). In minimal media, E. coli tries to maintain an optimal growth rate by regulating the activity of the enzymes to match the availability of the external ammonia. The molecular mechanism and the strategy of the regulation in this network have been research topics for many investigators. In this paper, we develop a flux balance model for nitrogen metabolism, taking into account the cellular composition and biosynthetic requirements for nitrogen. The model agrees well with known experimental results. Specifically, it reproduces all the ¹⁵N isotope labeling experiments in the wild type and the two mutant (ΔGDH and ΔGOGAT) strains of E. coli. Furthermore, the predicted catalytic activities of GDH, GS and GOGAT at different ammonium concentrations and growth rates for the wild-type, ΔGDH and ΔGOGAT strains agree well with the enzyme concentrations obtained from western blots. Based on this flux balance model, we show that GS is the preferred regulation point among the three enzymes in the nitrogen assimilation network. Our analysis reveals the pattern of regulation in this central and highly regulated network, thus providing insights into the regulation strategy adopted by the bacteria. Our model and methods may also be useful in future investigations of this and other networks.
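A flux balance model of the kind described reduces to a linear program: maximize a target flux subject to the steady-state stoichiometric constraint S·v = 0 and capacity bounds on each reaction. A minimal sketch with a hypothetical three-reaction toy network (not the paper's full nitrogen model):

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix: rows = metabolites (AKG, GLU), cols = reactions
#   v1: AKG uptake, v2: GDH-like conversion (AKG -> GLU), v3: GLU drain to biomass
S = np.array([[1.0, -1.0,  0.0],
              [0.0,  1.0, -1.0]])
bounds = [(0, 10), (0, None), (0, None)]  # uptake capped at 10 (hypothetical)
c = [0.0, 0.0, -1.0]                      # maximize v3 by minimizing -v3
res = linprog(c, A_eq=S, b_eq=[0.0, 0.0], bounds=bounds)
```

At the optimum, flux is limited by the uptake bound, so the biomass drain v3 equals 10; real FBA models differ only in the size of S and the choice of objective.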

  14. Building Change Detection from LIDAR Point Cloud Data Based on Connected Component Analysis

    Science.gov (United States)

    Awrangjeb, M.; Fraser, C. S.; Lu, G.

    2015-08-01

Building data are one of the important data types in a topographic database. Building change detection after a period of time is necessary for many applications, such as identification of informal settlements. Based on the detected changes, the database has to be updated to ensure its usefulness. This paper proposes an improved building detection technique, which is a prerequisite for many building change detection techniques. The improved technique examines the gap between neighbouring buildings in the building mask in order to avoid undersegmentation errors. Then, a new building change detection technique from LIDAR point cloud data is proposed. Buildings which are totally new or demolished are directly added to the change detection output. However, for demolished or extended building parts, a connected component analysis algorithm is applied and for each connected component its area, width and height are estimated in order to ascertain if it can be considered as a demolished or new building part. Finally, a graphical user interface (GUI) has been developed to update detected changes to the existing building map. Experimental results show that the improved building detection technique can offer not only higher performance in terms of completeness and correctness, but also a lower number of undersegmentation errors as compared to its original counterpart. The proposed change detection technique produces no omission errors and thus it can be exploited for enhanced automated building information updating within a topographic database. Using the developed GUI, the user can quickly examine each suggested change and indicate his/her decision with a minimum number of mouse clicks.
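The connected component step described above, estimating the area, width and height of each component in a binary change mask, can be sketched with scipy.ndimage. The mask and the minimum-size threshold below are hypothetical, for illustration only.

```python
import numpy as np
from scipy import ndimage

# Hypothetical binary change mask (1 = changed building pixel)
mask = np.zeros((8, 10), dtype=int)
mask[1:3, 1:4] = 1                       # candidate part A (2 x 3 block)
mask[5:8, 6:9] = 1                       # candidate part B (3 x 3 block)

labels, n = ndimage.label(mask)          # connected component labelling
areas = np.bincount(labels.ravel())[1:]  # pixel count per component
parts = []
for i, sl in enumerate(ndimage.find_objects(labels)):
    height = sl[0].stop - sl[0].start    # bounding-box height
    width = sl[1].stop - sl[1].start     # bounding-box width
    if areas[i] >= 6:                    # hypothetical minimum-size test
        parts.append((int(areas[i]), width, height))
```

Each surviving component would then be classified as a demolished or new building part based on its measured dimensions.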

  15. Optimization of an Optical Inspection System Based on the Taguchi Method for Quantitative Analysis of Point-of-Care Testing

    Directory of Open Access Journals (Sweden)

    Chia-Hsien Yeh

    2014-09-01

Full Text Available This study presents an optical inspection system for detecting a commercial point-of-care testing product and a new detection model covering qualitative to quantitative analysis. Human chorionic gonadotropin (hCG) strips (the cut-off value of the hCG commercial product is 25 mIU/mL) were the detection target in our study. We used a complementary metal-oxide semiconductor (CMOS) sensor to detect the colors of the test line and control line in the specific strips and to reduce the observation errors of the naked eye. To achieve better linearity between the grayscale and the concentration, and to decrease the standard deviation (i.e., increase the signal-to-noise ratio, S/N), the Taguchi method was used to find the optimal parameters for the optical inspection system. The pregnancy test used the principles of the lateral flow immunoassay, and the colors of the test and control lines were caused by gold nanoparticles. Because of the sandwich immunoassay model, the color of the gold nanoparticles in the test line darkened with increasing hCG concentration. As the results reveal, the S/N increased from 43.48 dB to 53.38 dB, and the detectable hCG concentration range extended from 6.25 to 50 mIU/mL with a standard deviation of less than 10%. With the optimal parameters determined by the Taguchi method to decrease the detection limit and increase the linearity, the optical inspection system can be applied to various commercial rapid tests for the detection of ketamine, troponin I, and fatty acid binding protein (FABP).
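The larger-the-better signal-to-noise ratio used in Taguchi optimization has a standard form, S/N = -10·log10(mean(1/y²)). A small sketch follows; the measurement values are hypothetical, not the study's data.

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-the-better S/N ratio in dB: -10*log10(mean(1/y^2))."""
    return -10.0 * math.log10(sum(1.0 / (v * v) for v in values) / len(values))

# Hypothetical grayscale readings under two parameter settings
sn_stable = sn_larger_is_better([10.0, 10.0, 10.0])  # consistent readings
sn_noisy = sn_larger_is_better([5.0, 15.0, 10.0])    # same mean, more spread
```

The setting with the higher S/N (here the stable one) would be selected, which is how the Taguchi method trades off mean response against variability.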

  16. Application of hazard analysis and critical control point methodology and risk-based grading to consumer food safety surveys.

    Science.gov (United States)

    Røssvoll, Elin Halbach; Ueland, Øydis; Hagtvedt, Therese; Jacobsen, Eivind; Lavik, Randi; Langsrud, Solveig

    2012-09-01

Traditionally, consumer food safety survey responses have been classified as either "right" or "wrong" and food handling practices that are associated with high risk of infection have been treated in the same way as practices with lower risks. In this study, a risk-based method for consumer food safety surveys has been developed, and HACCP (hazard analysis and critical control point) methodology was used for selecting relevant questions. We conducted a nationally representative Web-based survey (n = 2,008), and to fit the self-reported answers we adjusted a risk-based grading system originally developed for observational studies. The results of the survey were analyzed both with the traditional "right" and "wrong" classification and with the risk-based grading system. The results using the two methods were very different. Only 5 of the 10 most frequent food handling violations were among the 10 practices associated with the highest risk. These 10 practices dealt with different aspects of heat treatment (lacking or insufficient), whereas the majority of the most frequent violations involved storing food at room temperature for too long. Use of the risk-based grading system for survey responses gave a more realistic picture of risks associated with domestic food handling practices. The method distinguished important violations from minor errors that are committed by most people but are not associated with significant risk. Surveys built on a HACCP-based approach with risk-based grading will contribute to a better understanding of domestic food handling practices and will be of great value for targeted information and educational activities.

  17. Acupuncture-Point Stimulation for Postoperative Pain Control: A Systematic Review and Meta-Analysis of Randomized Controlled Trials

    Directory of Open Access Journals (Sweden)

    Xian-Liang Liu

    2015-01-01

Full Text Available The purpose of this study was to evaluate the effectiveness of acupuncture-point stimulation (APS) in postoperative pain control compared with sham/placebo acupuncture or standard treatments (usual care or no treatment). Only randomized controlled trials (RCTs) were included. Meta-analysis results indicated that APS interventions improved VAS scores significantly and also reduced total morphine consumption. No serious APS-related adverse effects (AEs) were reported. There is Level I evidence for the effectiveness of body-point plaster therapy, and Level II evidence for body-point electroacupuncture (EA), body-point acupressure, body-point APS for abdominal surgery patients, auricular-point seed embedding, manual auricular acupuncture, and auricular EA. We obtained Level III evidence for body-point APS in patients who underwent cardiac surgery and cesarean section and for auricular-point stimulation in patients who underwent abdominal surgery. There is insufficient evidence to conclude that APS is an effective postoperative pain therapy in surgical patients, although the evidence does support the conclusion that APS can reduce analgesic requirements without AEs. The best level of evidence was not adequate in most subgroups. Some limitations of this study may have affected the results, possibly leading to an overestimation of APS effects.

  18. Building Point Detection from Vehicle-Borne LiDAR Data Based on Voxel Group and Horizontal Hollow Analysis

    Directory of Open Access Journals (Sweden)

    Yu Wang

    2016-05-01

Full Text Available Information extraction and three-dimensional (3D) reconstruction of buildings using a vehicle-borne laser scanning (VLS) system is significant for many applications. Extracting LiDAR points from VLS belonging to various types of buildings in large-scale complex urban environments still presents some problems. In this paper, a new technical framework for automatic and efficient building point extraction is proposed, including three main steps: (1) voxel group-based shape recognition; (2) category-oriented merging; and (3) building point identification by horizontal hollow ratio analysis. This article proposes the concept of a “voxel group” based on the voxelization of VLS points: each voxel group is composed of several voxels that belong to one single real-world object. The shapes of the point clouds in each voxel group are then recognized, and this shape information is utilized to merge voxel groups. The article puts forward a characteristic of vehicle-borne LiDAR building points, called the “horizontal hollow ratio”, for efficient extraction. Experiments are analyzed from two aspects: (1) building-based evaluation for the overall experimental area; and (2) point-based evaluation for individual buildings using completeness and correctness. The experimental results indicate that the proposed framework is effective for the extraction of LiDAR points belonging to various types of buildings in large-scale complex urban environments.
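The voxelization underlying the “voxel group” concept can be sketched as a simple spatial hash: each point is assigned to the grid cell containing it, and points sharing a cell key are grouped. The voxel size and points below are hypothetical, for illustration only.

```python
from collections import defaultdict

def voxelize(points, size):
    """Group 3D points by the integer voxel cell that contains them."""
    voxels = defaultdict(list)
    for p in points:
        key = tuple(int(c // size) for c in p)  # floor division -> cell index
        voxels[key].append(p)
    return voxels

pts = [(0.1, 0.2, 0.3), (0.4, 0.4, 0.4), (1.2, 0.0, 0.0)]
groups = voxelize(pts, 1.0)   # first two points share a voxel
```

A full pipeline would then merge adjacent voxels belonging to the same object into voxel groups before shape recognition.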

  19. Hydro-physical processes at the plunge point: an analysis using satellite and in situ data

    Directory of Open Access Journals (Sweden)

    A. T. Assireu

    2011-12-01

Full Text Available The plunge point is the main mixing point between river and epilimnetic reservoir water. Plunge point monitoring is essential for understanding the behavior of density currents and their implications for the reservoir. The use of satellite imagery products from different sensors (Landsat TM band 6 thermal signatures and visible channels) for the characterization of the river-reservoir transition zone is presented in this study. The feasibility of using Landsat TM imagery to discern subsurface river plumes and the plunge point is demonstrated. The spatial variability of the plunge point evident in the hydrologic data illustrates the advantages of synoptic satellite measurements over in situ point measurements alone in detecting the river-reservoir transition zone. During the dry season, when the river-reservoir water temperature differences vanish and the river circulation is characterized by interflow-overflow, the river water inserts into the upper layers of the reservoir, affecting water quality. The results indicate good agreement between hydrologic and satellite data and show that the joint use of thermal and visible channel data for the operational monitoring of a plunge point is feasible. The information about the density current deduced from this study could potentially be assimilated into numerical models and hence be of significant interest for environmental and climatological research.

  20. A SWOT Analysis of the Nabucco Pipeline from Romania’s Point of View

    Directory of Open Access Journals (Sweden)

    Mariana Papatulica

    2009-07-01

Full Text Available Energy sources available to the European Union are supposed to be sufficient to cover the expected growth of natural gas demand for the coming decades, but there are not enough opportunities/infrastructure to transport these volumes of gas to European markets. Arbitrary interruptions of Russian gas deliveries to Europe, delays in the rehabilitation of Russia's obsolete pipeline network, and the interdiction of direct Asian gas exports transiting through the Russian transport infrastructure have made it strictly necessary for European countries to diversify their gas supplier portfolios by avoiding Russian territory. The Nabucco pipeline was conceived as an alternative to the European Union countries' high dependence on Russian gas (about 40% of their consumption is provided by Russia), connecting European Union countries directly to the huge natural gas resources of Central Asia on the route Turkey – Bulgaria – Romania – Hungary – Austria. The purpose of this paper is to make a SWOT analysis of this project, highlighting its strengths and weaknesses from Romania's point of view, as well as the opportunities and threats as external factors. The main idea resulting from the analysis is that the strengths are prevailing for Romania. Realizing this project will ensure the diversification of gas sources and the development of competitive markets, which can entail price reductions. It is supposed to be a fair and advantageous option, economically reliable, that will reduce dependence on deliveries of gas from a single source – Russia – ensuring two undeniable prerequisites: accessibility (to new supply sources) and availability (which refers to guarantees of long-term sustainability of gas deliveries). The project implementation will allow energy to help establish new structural links between the EU, Turkey and the Caspian Sea states and will ensure cross-border cooperation possibilities inside some euro-regions already constituted, by accessing regional development

  1. Cloud point extraction for analysis of antiretrovirals in human plasma by UFLC-ESI-MS/MS

    Directory of Open Access Journals (Sweden)

    Gabriel A. Hunzicker

    2015-12-01

Full Text Available An analytical methodology based on cloud point extraction (CPE) coupled to Ultra-Fast Liquid Chromatography with electrospray tandem mass spectrometry (UFLC-MS/MS) was developed for the analysis of Abacavir (ABC), Efavirenz (EFV), Lamivudine (3TC) and Nelfinavir (NFV) in human plasma. This is the first time that CPE has been used for the extraction of antiretrovirals (ARVs) from plasma. The effects of relevant physicochemical variables on the analytical response of each ARV, including pH, surfactant concentration, equilibration time and temperature, were studied and optimized, as was the coupling to UFLC-ESI-MS/MS. Under optimized conditions, the resulting methodology was as follows: a 500 μL aliquot of human plasma was diluted with 2 mL deionized water in a 10 mL centrifuge tube. A 500 μL aliquot of Triton X-114 5% w/v was added and homogenized using a vortex stirrer. The resulting cloudy solution was kept at 65 °C for 20 min to promote the condensation of surfactant micelles. It was then centrifuged at 3000 × g for 5 min for separation of the surfactant-rich phase. After discarding the aqueous supernatant, 400 μL ACN were added to the remaining surfactant-rich phase and centrifuged in order to precipitate and separate proteins. A 150 μL aliquot of the supernatant was transferred to a 2 mL vial and further diluted with 400 μL deionized water. A 30 μL aliquot of the so-prepared solution was injected into the UFLC-MS/MS and analyzed. The method detection limits for ABC, EFV, 3TC and NFV under optimized conditions were 31, 77, 57 and 21 ng mL−1, respectively. The RSD% values for the studied analytes were <15%, except at the LOQ, where they were <19%. Recovery values ranged from 81 to 107%. The proposed methodology was successfully applied to the analysis of ABC, EFV, 3TC and NFV in human plasma within the concentration ranges of 43–6816, 125–4992, 81–3248 and 49–7904 ng mL−1, respectively. Under optimized working conditions the proposed
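Method detection limits of the kind reported are often estimated from a calibration slope and the blank's standard deviation via the 3σ criterion. A sketch with hypothetical calibration numbers (not the paper's data):

```python
import numpy as np

# Hypothetical calibration data: concentration (ng/mL) vs instrument response
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
resp = np.array([1.0, 101.0, 201.0, 401.0, 801.0])  # perfectly linear here

slope, intercept = np.polyfit(conc, resp, 1)  # least-squares calibration line
sd_blank = 2.0                                # hypothetical blank response SD
lod = 3.0 * sd_blank / slope                  # 3-sigma detection limit (ng/mL)
```

Quantitation limits are computed the same way with a factor of 10 instead of 3; the RSD% criterion at the LOQ then validates the estimate experimentally.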

  2. Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition (Dagstuhl Seminar 17281)

    OpenAIRE

    Zennou, Sarah; Debray, Saumya K.; Dullien, Thomas; Lakhothia, Arun

    2018-01-01

This report summarizes the program and the outcomes of the Dagstuhl Seminar 17281, entitled "Malware Analysis: From Large-Scale Data Triage to Targeted Attack Recognition". The seminar brought together practitioners and researchers from industry and academia to discuss the state of the art in the analysis of malware, from both a big-data triage perspective and a fine-grained analysis perspective. Obfuscation was also considered. The meeting created new links within this very diverse community.

  3. Fire hazard analysis of alcohol aqueous solution and Chinese liquor based on flash point

    Science.gov (United States)

    Chen, Qinpei; Kang, Guoting; Zhou, Tiannian; Wang, Jian

    2017-10-01

In this paper, a series of experiments was conducted to study the flash point of alcohol aqueous solutions and Chinese liquor. The fire hazard implied by the experimental results was analysed based on the Chinese standard GB50160-2008. The results show that the open-cup method is not suitable for alcohol aqueous solutions, whereas the closed-cup method shows good applicability. There is a non-linear relationship between the closed-cup flash point and the alcohol volume concentration, and the prediction equation established in this paper fits the flash point data well and supports the fire hazard classification of Chinese liquor.
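A non-linear flash-point-versus-concentration relationship can be fitted as described. The sketch below uses a hypothetical exponential-plateau model and synthetic data; it is not the paper's actual prediction equation.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical model: flash point falls toward a plateau with increasing
# alcohol volume fraction x (NOT the equation from the paper).
def model(x, a, b, k):
    return a + b * np.exp(-k * x)

x = np.linspace(0.1, 0.9, 9)
y = model(x, 10.0, 60.0, 3.0)            # noiseless synthetic flash points
popt, _ = curve_fit(model, x, y, p0=[5.0, 50.0, 1.0])
```

With real measurements one would also report the residual standard error and compare predicted flash points against the hazard-class thresholds of GB50160-2008.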

  4. Nuclear microbeam analysis of ICF target material made by GDP technique

    Energy Technology Data Exchange (ETDEWEB)

    Rong, C.; He, X. [Applied Ion Beam Physics Laboratory, Institute of Modern Physics, Department of Nuclear Science and Technology, Fudan University, Shanghai 200433 (China); Meng, J., E-mail: eleanor920@163.com [Research Center of Laser Fusion, CAEP, Mianyang 621000 (China); Gao, D. [Research Center of Laser Fusion, CAEP, Mianyang 621000 (China); Zhang, Y.; Li, X.; Lyu, H.; Zhu, Y. [Applied Ion Beam Physics Laboratory, Institute of Modern Physics, Department of Nuclear Science and Technology, Fudan University, Shanghai 200433 (China); Zheng, Y. [Shanghai Synchrotron Radiation Facility, Shanghai Institute of Applied Physics, Chinese Academy of Sciences, Shanghai 201204 (China); Wang, X. [Applied Ion Beam Physics Laboratory, Institute of Modern Physics, Department of Nuclear Science and Technology, Fudan University, Shanghai 200433 (China); Shen, H., E-mail: haoshen@fudan.edu.cn [Applied Ion Beam Physics Laboratory, Institute of Modern Physics, Department of Nuclear Science and Technology, Fudan University, Shanghai 200433 (China)

    2015-04-01

Germanium-doped carbon–hydrogen polymer (CH) produced by the Glow Discharge Polymer (GDP) technique has become the preferred Inertial Confinement Fusion (ICF) target material. The nondestructive measurement of element content in ICF targets has become an important task in recent years. This paper presents the compositional and distributional results of the analysis of germanium-doped CH. The Ge-doped CH materials, as thin film and as hollow sphere, were investigated by Rutherford Backscattering Spectroscopy (RBS) combined with particle-induced X-ray emission (PIXE) and Elastic Recoil Detection Analysis (ERDA). The samples were a thin film of 36 μm thickness and ICF targets of 500–2000 μm diameter. The calibration and geometrical arrangement in the analysis of a spherical target must be carefully considered in order to acquire accurate results. In this work, the uniformity of the sphere is shown and the ratio of carbon, hydrogen and germanium has been measured. The ratio values are in good agreement with the results obtained by the combustion method. In addition, the difference in composition between the thin film and the hollow sphere is also discussed. This work demonstrates that nuclear microbeam analysis is an ideal method to evaluate ICF target quality.

  5. Genome-Wide Analysis of miRNA targets in Brachypodium and Biomass Energy Crops

    Energy Technology Data Exchange (ETDEWEB)

    Green, Pamela J. [Univ. of Delaware, Newark, DE (United States)

    2015-08-11

    MicroRNAs (miRNAs) contribute to the control of numerous biological processes through the regulation of specific target mRNAs. Although the identities of these targets are essential to elucidate miRNA function, the targets are much more difficult to identify than the small RNAs themselves. Before this work, we pioneered the genome-wide identification of the targets of Arabidopsis miRNAs using an approach called PARE (German et al., Nature Biotech. 2008; Nature Protocols, 2009). Under this project, we applied PARE to Brachypodium distachyon (Brachypodium), a model plant in the Poaceae family, which includes the major food grain and bioenergy crops. Through in-depth global analysis and examination of specific examples, this research greatly expanded our knowledge of miRNAs and target RNAs of Brachypodium. New regulation in response to environmental stress or tissue type was found, and many new miRNAs were discovered. More than 260 targets of new and known miRNAs with PARE sequences at the precise sites of miRNA-guided cleavage were identified and characterized. Combining PARE data with the small RNA data also identified the miRNAs responsible for initiating approximately 500 phased loci, including one of the novel miRNAs. PARE analysis also revealed that differentially expressed miRNAs in the same family guide specific target RNA cleavage in a correspondingly tissue-preferential manner. The project included generation of small RNA and PARE resources for bioenergy crops, to facilitate ongoing discovery of conserved miRNA-target RNA regulation. By associating specific miRNA-target RNA pairs with known physiological functions, the research provides insights about gene regulation in different tissues and in response to environmental stress. This, and release of new PARE and small RNA data sets should contribute basic knowledge to enhance breeding and may suggest new strategies for improvement of biomass energy crops.

  6. Numerical analysis of free surface instabilities in the IFMIF lithium target

    Energy Technology Data Exchange (ETDEWEB)

    Gordeev, S.; Heinzel, V. [Research Centre of Karlsruhe (Germany). Inst. for Reactor Safety; Moeslang, A. [Research Centre of Karlsruhe (Germany). Inst. for Material Research I

    2007-07-01

The International Fusion Materials Irradiation Facility (IFMIF) uses a high-speed (10-20 m/s) lithium (Li) jet flow as a target for two 40 MeV/125 mA deuteron beams. The major function of the Li target is to provide a stable Li jet for the production of an intense neutron flux. For an understanding of the lithium jet behaviour and the elimination of free-surface flow instabilities, a detailed analysis of the Li jet flow is necessary. Different kinds of instability mechanisms in the liquid jet flow have been evaluated and classified based on analytical and experimental data. Numerical investigations of the target free-surface flow have been performed. Previous numerical investigations have shown in principle the suitability of the CFD code Star-CD for the simulation of the Li target flow. The main objective of this study is a detailed numerical analysis of instabilities in the Li jet flow caused by boundary layer relaxation near the nozzle exit, transition to turbulent flow and back-wall curvature. A number of CFD models are developed to investigate the formation of instabilities on the target surface. Turbulence models are validated on the experimental data. Experimental observations have shown that a change of the nozzle geometry at the outlet, such as a slight divergence of the nozzle surfaces or nozzle edge defects, causes flow separation and the occurrence of longitudinal periodic structures on the free surface with an amplitude up to 5 mm. Target surface fluctuations of this magnitude can lead to penetration of the deuteron beam into the target structure and cause local overheating of the back plate. An analysis of large instabilities in the Li target flow, combined with the heat distribution in lithium depending on the free-surface shape, is performed in this study. (orig.)

  7. Numerical analysis of free surface instabilities in the IFMIF lithium target

    International Nuclear Information System (INIS)

    Gordeev, S.; Heinzel, V.; Moeslang, A.

    2007-01-01

The International Fusion Materials Irradiation Facility (IFMIF) uses a high-speed (10-20 m/s) lithium (Li) jet flow as a target for two 40 MeV/125 mA deuteron beams. The major function of the Li target is to provide a stable Li jet for the production of an intense neutron flux. For an understanding of the lithium jet behaviour and the elimination of free-surface flow instabilities, a detailed analysis of the Li jet flow is necessary. Different kinds of instability mechanisms in the liquid jet flow have been evaluated and classified based on analytical and experimental data. Numerical investigations of the target free-surface flow have been performed. Previous numerical investigations have shown in principle the suitability of the CFD code Star-CD for the simulation of the Li target flow. The main objective of this study is a detailed numerical analysis of instabilities in the Li jet flow caused by boundary layer relaxation near the nozzle exit, transition to turbulent flow and back-wall curvature. A number of CFD models are developed to investigate the formation of instabilities on the target surface. Turbulence models are validated on the experimental data. Experimental observations have shown that a change of the nozzle geometry at the outlet, such as a slight divergence of the nozzle surfaces or nozzle edge defects, causes flow separation and the occurrence of longitudinal periodic structures on the free surface with an amplitude up to 5 mm. Target surface fluctuations of this magnitude can lead to penetration of the deuteron beam into the target structure and cause local overheating of the back plate. An analysis of large instabilities in the Li target flow, combined with the heat distribution in lithium depending on the free-surface shape, is performed in this study. (orig.)

  8. Analysis of the stochastic channel model by Saleh & Valenzuela via the theory of point processes

    DEFF Research Database (Denmark)

    Jakobsen, Morten Lomholt; Pedersen, Troels; Fleury, Bernard Henri

    2012-01-01

In this paper we revisit the classical channel model by Saleh & Valenzuela via the theory of spatial point processes. By reformulating this model as a particular point process and by repeated application of Campbell's Theorem we provide concise and elegant access to its overall structure and underlying features. This framework is general enough to define, analyze, and compare most channel models already suggested in the literature, and its powerful tools have not been fully exploited in this context yet.
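Campbell's Theorem, the tool applied repeatedly in the paper, states that for a Poisson point process of intensity λ, E[Σᵢ f(xᵢ)] = λ·∫f(x)dx. A quick Monte-Carlo check on the unit interval with f(x) = x (the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 100.0    # intensity of a homogeneous Poisson process on [0, 1]
reps = 2000

totals = []
for _ in range(reps):
    n = rng.poisson(lam)          # number of points in one realization
    pts = rng.random(n)           # point locations, uniform on [0, 1]
    totals.append(np.sum(pts))    # sum of f(x) = x over the points

estimate = np.mean(totals)
expected = lam * 0.5              # Campbell: lam * integral of x over [0, 1]
```

The empirical mean of the shot-noise sum matches λ/2, which is the same mechanism the paper uses to obtain moments of the Saleh-Valenzuela impulse response.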

  9. Do points, levels and leaderboards harm intrinsic motivation? An empirical analysis of common gamification elements

    DEFF Research Database (Denmark)

    Mekler, Elisa D.; Brühlmann, Florian; Opwis, Klaus

    2013-01-01

It is heavily debated within the gamification community whether specific game elements may actually undermine users' intrinsic motivation. This online experiment examined the effects of three commonly employed game design elements - points, leaderboard, levels - on users' performance and intrinsic motivation.

  10. Transient analysis mode participation for modal survey target mode selection using MSC/NASTRAN DMAP

    Science.gov (United States)

    Barnett, Alan R.; Ibrahim, Omar M.; Sullivan, Timothy L.; Goodnight, Thomas W.

    1994-01-01

    Many methods have been developed to aid analysts in identifying component modes which contribute significantly to component responses. These modes, typically targeted for dynamic model correlation via a modal survey, are known as target modes. Most methods used to identify target modes are based on component global dynamic behavior. It is sometimes unclear if these methods identify all modes contributing to responses important to the analyst. These responses are usually those in areas of hardware design concerns. One method used to check the completeness of target mode sets and identify modes contributing significantly to important component responses is mode participation. With this method, the participation of component modes in dynamic responses is quantified. Those modes which have high participation are likely modal survey target modes. Mode participation is most beneficial when it is used with responses from analyses simulating actual flight events. For spacecraft, these responses are generated via a structural dynamic coupled loads analysis. Using MSC/NASTRAN DMAP, a method has been developed for calculating mode participation based on transient coupled loads analysis results. The algorithm has been implemented to be compatible with an existing coupled loads methodology and has been used successfully to develop a set of modal survey target modes.
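
    As a minimal sketch of the mode-participation idea described above (a generic stand-in, not the MSC/NASTRAN DMAP implementation), the following Python fragment quantifies each mode's contribution to a transient response at the instant of peak response; all data are synthetic.

```python
import numpy as np

# Synthetic stand-in data: mode shape ordinates at one response DOF and
# modal coordinate time histories from a transient analysis (hypothetical).
rng = np.random.default_rng(1)
n_modes, n_steps = 5, 1000
t = np.linspace(0.0, 1.0, n_steps)
phi = rng.normal(size=n_modes)                      # mode shape ordinates
freqs = np.array([3.0, 7.0, 12.0, 20.0, 31.0])      # modal frequencies [Hz]
q = np.sin(2 * np.pi * freqs[:, None] * t) * np.exp(-t)  # modal coordinates

# The physical response is the modal superposition of per-mode contributions.
contrib = phi[:, None] * q
r = contrib.sum(axis=0)

# One common participation measure: each mode's share of the total response
# at the instant of peak response; modes with a large share are candidate
# modal survey target modes.
k = np.argmax(np.abs(r))
participation = contrib[:, k] / r[k]
print(participation)  # shares sum to 1 by construction
```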

  11. Transient analysis mode participation for modal survey target mode selection using MSC/NASTRAN DMAP

    Science.gov (United States)

    Barnett, Alan R.; Ibrahim, Omar M.; Sullivan, Timothy L.; Goodnight, Thomas W.

    1994-03-01

    Many methods have been developed to aid analysts in identifying component modes which contribute significantly to component responses. These modes, typically targeted for dynamic model correlation via a modal survey, are known as target modes. Most methods used to identify target modes are based on component global dynamic behavior. It is sometimes unclear if these methods identify all modes contributing to responses important to the analyst. These responses are usually those in areas of hardware design concerns. One method used to check the completeness of target mode sets and identify modes contributing significantly to important component responses is mode participation. With this method, the participation of component modes in dynamic responses is quantified. Those modes which have high participation are likely modal survey target modes. Mode participation is most beneficial when it is used with responses from analyses simulating actual flight events. For spacecraft, these responses are generated via a structural dynamic coupled loads analysis. Using MSC/NASTRAN DMAP, a method has been developed for calculating mode participation based on transient coupled loads analysis results. The algorithm has been implemented to be compatible with an existing coupled loads methodology and has been used successfully to develop a set of modal survey target modes.

  12. Hazard analysis and critical control point systems in the United States Department of Agriculture regulatory policy.

    Science.gov (United States)

    Billy, T J; Wachsmuth, I K

    1997-08-01

    Recent outbreaks of foodborne illness and studies by expert groups have established the need for fundamental change in the United States meat and poultry inspection programme to reduce the risk of foodborne illness. The Food Safety and Inspection Service (FSIS) of the United States Department of Agriculture (USDA) has embarked on a broad effort to bring about such change, with particular emphasis on the reduction of pathogenic micro-organisms in raw meat and poultry products. The publication on 25 July 1996 of the Final Rule on pathogen reduction and hazard analysis and critical control point (HACCP) systems was a major milestone in the FSIS strategy for change. The Final Rule provides a framework for change and clarifies the respective roles of industry and government in ensuring the safety of meat and poultry products. With the implementation of this Final Rule underway, the FSIS has been exploring ways in which slaughter inspection carried out under an HACCP-based system can be changed so that food safety risks are addressed more adequately and the allocation of inspection resources is improved further. In addition, the FSIS is broadening the focus of food safety activities to extend beyond slaughter and processing plants by working with industry, academia and other government agencies. Such co-operation should lead to the development of measures to improve food safety before animals reach the slaughter plant and after products leave the inspected establishment for distribution to the retail level. For the future, the FSIS believes that quantitative risk assessments will be at the core of food safety activities. Risk assessments provide the most effective means of identifying how specific pathogens and other hazards may be encountered throughout the farm-to-table chain and of measuring the potential impact of various interventions. In addition, these assessments will be used in the development and evaluation of HACCP systems. 
The FSIS is currently conducting a

  13. Demographic transition in India: an evolutionary interpretation of population and health trends using 'change-point analysis'.

    Science.gov (United States)

    Goli, Srinivas; Arokiasamy, Perianayagam

    2013-01-01

    Lack of a robust analytical tool for trend analysis of population and health indicators is the basic rationale of this study. In an effort to fill this gap, this study advances the 'Change-Point Analyzer' as a new analytical tool for assessing progress and its pattern in population and health indicators. The defining feature of the 'Change-Point Analyzer' is that it detects subtle changes that are often missed in simple trend-line plots and also quantifies the magnitude of change, which is not possible with simple trend-line plots. A long-term 'change-point analysis' of trends in population and health indicators such as IMR, population size, TFR, and LEB in India shows multiple points of critical change. Measured change points of demographic and health trends help in understanding demographic transitional shifts by connecting them to contextual policy shifts. Critical change points in population and health indicators in India are associated with the evolution of structural changes in the population and health policy framework. This study therefore adds significantly to the evolutionary interpretation of critical change points in long-term trajectories of population and health indicators vis-a-vis population and health policy shifts in India. The results have not only helped in reassessing the historical past and the current demographic transition trajectory but also advance a new method of assessing population and health trends, which is necessary for robust monitoring of progress in population and health policies.
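
    The change-point idea can be sketched with a least-squares single-break detector in Python (a generic stand-in, not the specific 'Change-Point Analyzer' tool used in the study):

```python
import numpy as np

def single_changepoint(x):
    """Locate the most likely single mean-shift change point by minimising
    the total squared error of a two-segment piecewise-constant fit."""
    n = len(x)
    best_k, best_cost = None, np.inf
    for k in range(2, n - 1):        # at least two points per segment
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

# Synthetic indicator series with a level shift at index 30
rng = np.random.default_rng(0)
series = np.concatenate([rng.normal(70, 1, 30), rng.normal(50, 1, 30)])
print(single_changepoint(series))  # detected break index (30 for this series)
```

    Real trend data usually warrant multiple change points (e.g. via binary segmentation, repeatedly applying the detector to each segment), which is what distinguishes this approach from a simple trend-line plot.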

  14. Retrospective analysis of the financial break-even point for intrathecal morphine pump use in Korea.

    Science.gov (United States)

    Kim, Eun Kyoung; Shin, Ji Yeon; Castañeda, Anyela Marcela; Lee, Seung Jae; Yoon, Hyun Kyu; Kim, Yong Chul; Moon, Jee Youn

    2017-10-01

    The high cost of intrathecal morphine pump (ITMP) implantation may be the main obstacle to its use. Since July 2014, the Korean national health insurance (NHI) program has paid 50% of the ITMP implantation cost for select refractory chronic pain patients. The aims of this study were to investigate the financial break-even point and patient satisfaction after the initiation of NHI reimbursement. We collected data retrospectively or via direct phone calls to patients who underwent ITMP implantation at a single university-based tertiary hospital between July 2014 and May 2016. Pain severity, changes in the morphine equivalent daily dosage (MEDD), adverse events, and patient satisfaction were determined. We calculated the financial break-even point of ITMP implantation by investigating patients' actual medical costs and insurance information. During the study period, 23 patients received ITMP implantation, and 20 were included in our study. Scores on an 11-point numeric rating scale (NRS) for pain were significantly reduced compared to the baseline value (P < …). The financial break-even point was 28 months for ITMP treatment after the NHI reimbursement policy. ITMP provided effective chronic pain management with improved satisfaction and a reasonable financial break-even point of 28 months with 50% financial coverage by the NHI program.
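
    The break-even computation itself is simple: the number of months until cumulative savings offset the up-front cost. A hedged Python sketch with illustrative numbers (the actual costs and savings are not given in the abstract):

```python
import math

def break_even_months(upfront_cost, monthly_saving):
    """Months until cumulative savings equal the up-front cost."""
    return math.ceil(upfront_cost / monthly_saving)

# Hypothetical example: a pump costing 14,000 USD whose use saves
# 500 USD per month relative to conventional therapy.
print(break_even_months(14000, 500))  # 28
```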

  15. Multi-temporal UAV-borne LiDAR point clouds for vegetation analysis - a case study

    Science.gov (United States)

    Mandlburger, Gottfried; Wieser, Martin; Hollaus, Markus; Pfennigbauer, Martin; Riegl, Ursula

    2016-04-01

    In the recent past the introduction of compact and lightweight LiDAR (Light Detection And Ranging) sensors, together with progress in UAV (Unmanned Aerial Vehicle) technology, has allowed the integration of laser scanners on remotely piloted multicopter, helicopter-type and even fixed-wing platforms. The multi-target capabilities of state-of-the-art time-of-flight full-waveform laser sensors operated from low-flying UAV platforms have enabled capturing the entire 3D structure of semi-transparent objects like deciduous forests under leaf-off conditions in unprecedented density and completeness. For such environments it has already been demonstrated that UAV-borne laser scanning combines the advantages of terrestrial laser scanning (high point density, short range) and airborne laser scanning (bird's eye perspective, homogeneous point distribution). Especially the oblique looking capabilities of scanners with a large field of view (>180°) enable capturing of vegetation from different sides, resulting in a constantly high point density also in the sub-canopy domain. Whereas the findings stated above were drawn from a case study carried out in February 2015 with the Riegl VUX-1UAV laser scanner mounted on a Riegl RiCopter octocopter UAV platform over an alluvial forest at the Pielach River (Lower Austria), the site was captured a second time with the same sensor system and mission parameters at the end of the vegetation period, on October 28th, 2015. The main goal of this experiment was to assess the impact of the late autumn foliage on the achievable 3D point density. Especially the entire understory vegetation and certain tree species (e.g. willow) were still in full leaf, whereas the bigger trees (poplar) were already partly defoliated. The comparison revealed that, although both campaigns featured virtually the same laser shot count, the ground point density dropped from 517 points/m2 in February (leaf-off) to 267 points/m2 at the end of October (leaf-on). The

  16. Data analysis strategies for the characterization of normal: superconductor point contacts by barrier strength parameter

    Science.gov (United States)

    Smith, Charles W.; Reinertson, Randal C.; Dolan, P. J., Jr.

    1993-05-01

    The theoretical description by Blonder, Tinkham, and Klapwijk [Phys. Rev. B 25, 4515 (1982)] of the I-V curves of normal: superconductor point contacts encompasses a broad range of experimental behavior, from the tunnel junction case, on the one hand, to the clean metallic microconstriction limit on the other. The theory characterizes point contacts in terms of a single parameter, the barrier strength. The differential conductance of a point contact, at zero bias, as a function of temperature, offers a direct experimental method by which the barrier strength parameter can be evaluated. In view of the full range of phenomena incorporated by this theory, we suggest several different strategies for the evaluation of the barrier strength parameter from data in the low and intermediate barrier strength regimes and for measurements in the low temperature (near T=0 K) and high temperature (near T=Tc) limits.

  17. Plasma triglycerides and cardiovascular events in the Treating to New Targets and Incremental Decrease in End-Points through Aggressive Lipid Lowering trials of statins in patients with coronary artery disease

    DEFF Research Database (Denmark)

    Faergeman, Ole; Holme, Ingar; Fayyad, Rana

    2009-01-01

    We determined the ability of in-trial measurements of triglycerides (TGs) to predict new cardiovascular events (CVEs) using data from the Incremental Decrease in End Points through Aggressive Lipid Lowering (IDEAL) and Treating to New Targets (TNT) trials. The trials compared atorvastatin 80 mg...... adjusting for age, gender, and study, risk of CVEs increased with increasing TGs (p relation of TGs...

  18. TRAC analysis of design basis events for the accelerator production of tritium target/blanket

    International Nuclear Information System (INIS)

    Lin, J.C.; Elson, J.

    1997-01-01

    A two-loop primary cooling system with a residual heat removal system was designed to remove the heat generated in the tungsten neutron source rods inside the rungs of the ladders and the shell of the rungs. The Transient Reactor Analysis Code (TRAC) was used to analyze the thermal-hydraulic behavior of the primary cooling system during a pump coastdown transient; a cold-leg, large-break loss-of-coolant accident (LBLOCA); a hot-leg LBLOCA; and a target downcomer LBLOCA. The TRAC analysis results showed that the heat generated in the tungsten neutron source rods can be removed by the primary cooling system for the pump coastdown transient and all the LBLOCAs except the target downcomer LBLOCA. For the target downcomer LBLOCA, a cavity flood system is required to fill the cavity with water to a level above the large fixed headers.

  19. HACCP (Hazard Analysis and Critical Control Points) to guarantee safe water reuse and drinking water production--a case study.

    Science.gov (United States)

    Dewettinck, T; Van Houtte, E; Geenens, D; Van Hege, K; Verstraete, W

    2001-01-01

    To obtain a sustainable water catchment in the dune area of the Flemish west coast, the integration of treated domestic wastewater in the existing potable water production process is planned. The hygienic hazards associated with the introduction of treated domestic wastewater into the water cycle are well recognised. Therefore, the concept of HACCP (Hazard Analysis and Critical Control Points) was used to guarantee hygienically safe drinking water production. Taking into account the literature data on the removal efficiencies of the proposed advanced treatment steps with regard to enteric viruses and protozoa and after setting high quality limits based on the recent progress in quantitative risk assessment, the critical control points (CCPs) and points of attention (POAs) were identified. Based on the HACCP analysis a specific monitoring strategy was developed which focused on the control of these CCPs and POAs.

  20. Analysis of Arbitrary Reflector Antennas Applying the Geometrical Theory of Diffraction Together with the Master Points Technique

    Directory of Open Access Journals (Sweden)

    María Jesús Algar

    2013-01-01

    An efficient approach for the analysis of surface-conformed reflector antennas with arbitrary feeds is presented. The near field at a large number of sampling points in the aperture of the reflector is obtained by applying the Geometrical Theory of Diffraction (GTD). A new technique named Master Points has been developed to reduce the complexity of the ray-tracing computations. The combination of GTD and Master Points reduces the time requirements of this kind of analysis. To validate the new approach, several reflectors and the effects on the radiation pattern caused by shifting the feed and introducing different obstacles have been considered, covering both simple and complex geometries. The results of these analyses have been compared with Method of Moments (MoM) results.

  1. Sensitivity analysis of point and parametric pedotransfer functions for estimating water retention of soils in Algeria

    Directory of Open Access Journals (Sweden)

    S. Touil

    2016-12-01

    0.0636 cm3 cm−3, respectively. The results of the global sensitivity analyses (GSAs) showed that the mathematical formalism of the PTFs and their input variables reacted differently in terms of point pressure and texture. The point and parametric PTFs were sensitive mainly to the sand fraction in the fine- and medium-textural classes. The use of clay percentage (C %) and bulk density (BD) as inputs in the medium-textural class improved the estimation of the PTFs at −33 kPa.

  2. RETRAN operational transient analysis of the Big Rock Point plant boiling water reactor

    International Nuclear Information System (INIS)

    Sawtelle, G.R.; Atchison, J.D.; Farman, R.F.; VandeWalle, D.J.; Bazydlo, H.G.

    1983-01-01

    Energy Incorporated used the RETRAN computer code to model and analyze nine Consumers Power Company Big Rock Point Nuclear Power Plant transients. RETRAN, a best-estimate, one-dimensional, homogeneous-flow, thermal-equilibrium code, is applicable to FSAR Chapter 15 transients for Conditions I through IV. The BWR analyses were performed in accordance with USNRC Standard Review Plan criteria and in response to the USNRC Systematic Evaluation Program. The RETRAN Big Rock Point model was verified by comparison to plant startup test data. This paper discusses the unique modeling techniques used in RETRAN to model this steam-drum-type BWR. Transient analysis results are also presented.

  3. Development, validation and application of multi-point kinetics model in RELAP5 for analysis of asymmetric nuclear transients

    Energy Technology Data Exchange (ETDEWEB)

    Pradhan, Santosh K., E-mail: santosh@aerb.gov.in [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Obaidurrahman, K. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India); Iyer, Kannan N. [Department of Mechanical Engineering, IIT Bombay, Mumbai 400076 (India); Gaikwad, Avinash J. [Nuclear Safety Analysis Division, Atomic Energy Regulatory Board, Mumbai 400094 (India)

    2016-04-15

    Highlights: • A multi-point kinetics model is developed for the RELAP5 system thermal-hydraulics code. • The model is validated against an extensive 3D kinetics code. • The RELAP5 multi-point kinetics formulation is used to investigate the critical break for LOCA in a PHWR. - Abstract: The point kinetics approach in the system code RELAP5 limits its use for many reactivity-induced transients that involve asymmetric core behaviour. Development of fully coupled 3D core kinetics with system thermal-hydraulics is the ultimate requirement in this regard; however, coupling and validating a 3D kinetics module with a system code is cumbersome and also requires access to the source code. An intermediate approach with multi-point kinetics is appropriate and relatively easy to implement for the analysis of several asymmetric transients in large cores. The multi-point kinetics formulation is based on dividing the entire core into several regions and solving the ordinary differential equations (ODEs) describing the kinetics in each region. These regions are interconnected by spatial coupling coefficients which are estimated from a diffusion theory approximation. This model offers the advantage that the coupled ODEs governing the multi-point kinetics formulation can be solved using numerical methods to the desired level of accuracy, and thus allows a formulation based on user-defined control variables, i.e., without disturbing the source code, hence also avoiding the associated coupling issues. Euler's method has been used in the present formulation to solve the several coupled ODEs internally at each time step. The results have been verified against the in-built point kinetics model of RELAP5 and validated against the 3D kinetics code TRIKIN. The model was used to identify the critical break in the RIH of a typical large PHWR core. The neutronic asymmetry produced in the core by the system-induced transient was effectively handled by the multi-point kinetics model, overcoming the limitation of the in-built point kinetics model.
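
    A toy version of the multi-point formulation can be sketched in Python: two regions, one delayed-neutron group, diffusion-style coupling between regions, integrated with Euler's method as in the paper. All parameter values are illustrative and the coupling coefficient is assumed, not taken from the study.

```python
import numpy as np

# Two-region, one-delayed-group multi-point kinetics, explicit Euler.
beta, lam, LAM = 0.0065, 0.08, 1.0e-3  # delayed fraction, decay const [1/s], generation time [s]
alpha = 50.0                           # spatial coupling coefficient [1/s] (assumed)
rho = np.array([0.001, -0.001])        # asymmetric reactivity insertion

n = np.array([1.0, 1.0])               # regional neutron populations
c = beta / (lam * LAM) * n             # precursors at initial equilibrium
dt, t_end = 1.0e-5, 0.5
for _ in range(int(t_end / dt)):
    # Regional point kinetics plus exchange with the neighbouring region
    dn = (rho - beta) / LAM * n + lam * c + alpha * (n[::-1] - n)
    dc = beta / LAM * n - lam * c
    n, c = n + dt * dn, c + dt * dc

print(n)  # region 1 (positive reactivity) ends above region 2
```

    The coupling term limits the flux tilt between regions, which is the behaviour an in-built single-point model cannot represent.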

  4. In Vivo Phosphoproteomics Analysis Reveals the Cardiac Targets of β-Adrenergic Receptor Signaling

    DEFF Research Database (Denmark)

    Lundby, Alicia; Andersen, Martin N; Steffensen, Annette B

    2013-01-01

    -X-X-pS/T), and integrative analysis of sequence motifs and interaction networks suggested that the kinases AMPK (adenosine 5'-monophosphate-activated protein kinase), Akt, and mTOR (mammalian target of rapamycin) mediate βAR signaling, in addition to the well-established pathways mediated by PKA (cyclic adenosine...

  5. Doing Televised Rhetorical Analysis as a Means of Promoting College Awareness in a Target Market.

    Science.gov (United States)

    Schnell, Jim

    This paper describes aspects of doing televised rhetorical analysis as they relate to the promotion of college awareness in a particular target market. Considerations in the paper include variables that most professors encounter in their efforts to address the "service" expectations of their employment and how these variables can be…

  6. Analysis of shots on target and goals scored in soccer matches ...

    African Journals Online (AJOL)

    The aim of this study was to analyse the characteristics and patterns of shots on target and goals scored during the 2012-European Championship. The broadcasted matches were recorded and converted into electronic video files for a computerbased analysis. This quantitative study examined 31 matches of the ...

  7. PowerPoint® presentation flaws and failures: a psychological analysis

    NARCIS (Netherlands)

    Kosslyn, S.M.; Kievit, R.A.; Russell, A.G.; Shepard, J.M.

    2012-01-01

    Electronic slideshow presentations are often faulted anecdotally, but little empirical work has documented their faults. In Study 1 we found that eight psychological principles are often violated in PowerPoint® slideshows, and are violated to similar extents across different fields - for example,

  8. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    International Nuclear Information System (INIS)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard

    2015-01-01

    The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population; and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)
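
    The flavour of the stochastic extension can be sketched in Python by adding white-noise reactivity to a one-group point kinetics model and comparing the Monte Carlo mean against the deterministic run. This mirrors the spirit, not the exact equations, of the Hayes-Allen formulation, and all parameters are illustrative.

```python
import numpy as np

# One-group point kinetics with an optional white-noise term on reactivity,
# integrated with explicit Euler / Euler-Maruyama.
rng = np.random.default_rng(0)
beta, lam, LAM, rho = 0.0065, 0.08, 1.0e-3, 0.002
dt, steps = 1.0e-4, 5000
sigma = 0.0005                       # assumed reactivity noise amplitude

def run(stochastic):
    n, c = 1.0, beta / (lam * LAM)   # start from precursor equilibrium
    for _ in range(steps):
        noise = sigma * rng.normal() / np.sqrt(dt) if stochastic else 0.0
        dn = (rho + noise - beta) / LAM * n + lam * c
        dc = beta / LAM * n - lam * c
        n, c = n + dt * dn, c + dt * dc
    return n

det = run(False)
sto = float(np.mean([run(True) for _ in range(50)]))
print(det, sto)  # the Monte Carlo mean tracks the deterministic trajectory
```

    The stiffness the paper mentions is visible here too: the prompt term (rho - beta)/LAM forces the small time step, which motivates the piecewise-constant approximation used in the study.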

  9. Stability analysis of the Gyroscopic Power Take-Off wave energy point absorber

    Science.gov (United States)

    Nielsen, Søren R. K.; Zhang, Zili; Kramer, Morten M.; Olsen, Jan

    2015-10-01

    The Gyroscopic Power Take-Off (GyroPTO) wave energy point absorber consists of a float rigidly connected to a lever. The operational principle is somewhat similar to that of the so-called gyroscopic hand wrist exercisers, where the rotation of the float is brought forward by the rotational particle motion of the waves. At first, the equations of motion of the system are derived based on analytical rigid body dynamics. Next, assuming monochromatic waves simplified equations are derived, valid under synchronisation of the ring of the gyro to the angular frequency of the excitation. Especially, it is demonstrated that the dynamics of the ring can be described as an autonomous nonlinear single-degree-of-freedom system, affected by three different types of point attractors. One where the ring vibrations are attracted to a static equilibrium point indicating unstable synchronisation and two types of attractors where the ring is synchronised to the wave angular frequency, either rotating in one or the opposite direction. Finally, the stability conditions and the basins of attraction to the point attractors defining the synchronised motion are determined.

  10. One-point fluctuation analysis of the high-energy neutrino sky

    DEFF Research Database (Denmark)

    Feyereisen, Michael R.; Tamborra, Irene; Ando, Shin'ichiro

    2017-01-01

    of shower events that can reasonably be associated to blazars. We also find that upper limits on the contribution of blazars to the measured flux are unfavourably affected by the skewness of the blazar flux distribution. One-point event clustering and likelihood analyses of the IceCube HESE data suggest...

  11. The solution of the neutron point kinetics equation with stochastic extension: an analysis of two moments

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Milena Wollmann da; Vilhena, Marco Tullio M.B.; Bodmann, Bardo Ernst J.; Vasques, Richard, E-mail: milena.wollmann@ufrgs.br, E-mail: vilhena@mat.ufrgs.br, E-mail: bardobodmann@ufrgs.br, E-mail: richard.vasques@fulbrightmail.org [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica

    2015-07-01

    The neutron point kinetics equation, which models the time-dependent behavior of nuclear reactors, is often used to understand the dynamics of nuclear reactor operations. It consists of a system of coupled differential equations that models the interaction between (i) the neutron population; and (ii) the concentration of the delayed neutron precursors, which are radioactive isotopes formed in the fission process that decay through neutron emission. These equations are deterministic in nature, and therefore can provide only average values of the modeled populations. However, the actual dynamical process is stochastic: the neutron density and the delayed neutron precursor concentrations vary randomly with time. To address this stochastic behavior, Hayes and Allen have generalized the standard deterministic point kinetics equation. They derived a system of stochastic differential equations that can accurately model the random behavior of the neutron density and the precursor concentrations in a point reactor. Due to the stiffness of these equations, this system was numerically implemented using a stochastic piecewise constant approximation method (Stochastic PCA). Here, we present a study of the influence of stochastic fluctuations on the results of the neutron point kinetics equation. We reproduce the stochastic formulation introduced by Hayes and Allen and compute Monte Carlo numerical results for examples with constant and time-dependent reactivity, comparing these results with stochastic and deterministic methods found in the literature. Moreover, we introduce a modified version of the stochastic method to obtain a non-stiff solution, analogous to a previously derived deterministic approach. (author)

  12. A note on the statistical analysis of point judgment matrices | Kabera ...

    African Journals Online (AJOL)

    The Analytic Hierarchy Process is a multicriteria decision making technique developed by Saaty in the 1970s. The core of the approach is the pairwise comparison of objects according to a single criterion using a 9-point ratio scale and the estimation of weights associated with these objects based on the resultant judgment ...

  13. Sexually selected signaling in birds: A case for Bayesian change-point analysis of behavioral routines

    NARCIS (Netherlands)

    Roth, T.; Sprau, P.; Naguib, M.; Amrhein, V.

    2012-01-01

    Responses of organisms to environments or to conspecifics may abruptly change once the organism has changed its state. For example, the expression of sexually selected signals often depends on the pairing status of the sender. A likely change in signaling routines at the point of pair formation

  14. Sexually selected signaling in birds: a case for Bayesian change-point analysis of behavioral routines

    NARCIS (Netherlands)

    Roth, T.; Sprau, P.; Naguib, M.; Amrhein, V.

    2012-01-01

    Responses of organisms to environments or to conspecifics may abruptly change once the organism has changed its state. For example, the expression of sexually selected signals often depends on the pairing status of the sender. A likely change in signaling routines at the point of pair formation

  15. An Error Analysis of the Phased Array Antenna Pointing Algorithm for STARS Flight Demonstration No. 2

    Science.gov (United States)

    Carney, Michael P.; Simpson, James C.

    2005-01-01

    STARS is a multicenter NASA project to determine the feasibility of using space-based assets, such as the Tracking and Data Relay Satellite System (TDRSS) and Global Positioning System (GPS), to increase flexibility (e.g. increase the number of possible launch locations and manage simultaneous operations) and to reduce operational costs by decreasing the need for ground-based range assets and infrastructure. The STARS project includes two major systems: the Range Safety and Range User systems. The latter system uses broadband communications (125 kbps to 500 kbps) for voice, video, and vehicle/payload data. Flight Demonstration #1 revealed the need to increase the data rate of the Range User system. During Flight Demo #2, a Ku-band antenna will generate a higher data rate and will be designed with an embedded pointing algorithm to guarantee that the antenna is pointed directly at TDRS. This algorithm will utilize the onboard position and attitude data to point the antenna to TDRS within a 2-degree full-angle beamwidth. This report investigates how errors in aircraft position and attitude, along with errors in satellite position, propagate into the overall pointing vector.
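
    The geometry of such an error analysis can be sketched in Python: perturb the aircraft attitude and measure the angle between the true and perturbed lines of sight to TDRS. The positions and error magnitudes below are illustrative, not taken from the report.

```python
import numpy as np

def unit(v):
    return v / np.linalg.norm(v)

def rot_z(deg):
    """Rotation about the z-axis, standing in for a yaw attitude error."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0, 0.0, 1.0]])

# Notional geometry [km]: aircraft near the surface, TDRS near GEO altitude.
aircraft = np.array([0.0, 0.0, 10.0])
tdrs = np.array([20000.0, 30000.0, 35786.0])
los = unit(tdrs - aircraft)                 # true line-of-sight unit vector

yaw_error_deg = 0.5                         # assumed attitude knowledge error
los_perturbed = rot_z(yaw_error_deg) @ los  # line of sight as the system sees it

pointing_error = np.degrees(np.arccos(np.clip(los @ los_perturbed, -1.0, 1.0)))
print(pointing_error)  # never exceeds the attitude error for a pure rotation
```

    Combining several such error sources (position, roll/pitch/yaw, satellite ephemeris) and comparing the total against the half-beamwidth budget is the essence of the report's analysis.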

  16. Trends in miniaturized total analysis systems for point-of-care testing in clinical chemistry

    NARCIS (Netherlands)

    Tudos, Anna J.; Besselink, G.A.J.; Schasfoort, Richardus B.M.

    2001-01-01

    A currently emerging approach enables more widespread monitoring of health parameters in disease prevention and biomarker monitoring. Miniaturisation provides the means for the production of small, fast and easy-to-operate devices for reduced-cost healthcare testing at the point-of-care (POC) or

  17. Stress state analysis of sub-sized pre-cracked three-point-bend specimen

    Czech Academy of Sciences Publication Activity Database

    Stratil, Luděk; Kozák, Vladislav; Hadraba, Hynek; Dlouhý, Ivo

    2012-01-01

Vol. 19, Nos. 2/3 (2012), pp. 121-129. ISSN 1802-1484. R&D Projects: GA ČR GD106/09/H035; GA ČR(CZ) GAP107/10/0361. Institutional support: RVO:68081723. Keywords: KLST * three-point bending * side grooving * Eurofer97 * J-integral. Subject RIV: JL - Materials Fatigue, Friction Mechanics

  18. Computational Analysis of Distance Operators for the Iterative Closest Point Algorithm.

    Directory of Open Access Journals (Sweden)

    Higinio Mora

Full Text Available The Iterative Closest Point (ICP) algorithm is currently one of the most popular methods for rigid registration, having become the standard in the Robotics and Computer Vision communities. Many applications take advantage of its popularity and simplicity to align 2D/3D surfaces. Nevertheless, some of its phases have a high computational cost that rules out certain applications. In this work, an efficient approach for the matching phase of the Iterative Closest Point algorithm is proposed. This stage is the main bottleneck of the method, so any efficiency improvement has a great positive impact on the performance of the algorithm. The proposal consists in using low-computational-cost point-to-point distance metrics instead of the classic Euclidean one. The candidates analysed are the Chebyshev and Manhattan distance metrics, due to their simpler formulation. The experiments carried out have validated the performance, robustness and quality of the proposal. Different experimental cases and configurations were set up, including a heterogeneous set of 3D figures and several scenarios with partial data and random noise. The results prove that an average speed-up of 14% can be obtained while preserving the convergence properties of the algorithm and the quality of the final results.
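
The metric substitution described above is easy to sketch. The following minimal example (not the paper's implementation) pairs each source point with its nearest target point under the Euclidean, Manhattan and Chebyshev metrics; the cheaper metrics avoid squaring and the square root, which is where the speed-up comes from, and on well-separated data they typically yield the same pairings.

```python
def euclidean(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

def manhattan(p, q):
    return sum(abs(a - b) for a, b in zip(p, q))

def chebyshev(p, q):
    return max(abs(a - b) for a, b in zip(p, q))

def match(source, target, dist):
    """ICP matching phase: pair every source point with its nearest target point."""
    return [min(range(len(target)), key=lambda j: dist(p, target[j]))
            for p in source]

# Toy 3D point sets: a slightly perturbed copy of the target.
source = [(0.0, 0.1, 0.0), (1.0, 1.0, 0.9), (2.0, 0.0, 0.1)]
target = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0), (2.0, 0.0, 0.0)]

for d in (euclidean, manhattan, chebyshev):
    print(d.__name__, match(source, target, d))  # all three agree here
```

In a full ICP loop this matching step would be followed by estimating the rigid transform from the pairings and iterating to convergence.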

  19. Uncertainty analysis of point-by-point sampling complex surfaces using touch probe CMMs DOE for complex surfaces verification with CMM

    DEFF Research Database (Denmark)

    Barini, Emanuele Modesto; Tosello, Guido; De Chiffre, Leonardo

    2010-01-01

    The paper describes a study concerning point-by-point sampling of complex surfaces using tactile CMMs. A four factor, two level completely randomized factorial experiment was carried out, involving measurements on a complex surface configuration item comprising a sphere, a cylinder and a cone, co...

  20. Analysis of Deregulated microRNAs and Their Target Genes in Gastric Cancer.

    Directory of Open Access Journals (Sweden)

    Simonas Juzėnas

Full Text Available MicroRNAs (miRNAs) are widely studied non-coding RNAs that modulate gene expression. MiRNAs are deregulated in different tumors including gastric cancer (GC) and have potential diagnostic and prognostic implications. The aim of our study was to determine the miRNA profile in GC tissues, followed by evaluation of deregulated miRNAs in plasma of GC patients. Using available databases and bioinformatics methods we also aimed to evaluate potential target genes of confirmed differentially expressed miRNAs and validate these findings in GC tissues. The study included 51 GC patients and 51 controls. Initially, we screened the miRNA expression profile in 13 tissue samples of GC and 12 normal gastric tissues with a TaqMan low density array (TLDA). In the second stage, differentially expressed miRNAs were validated in a replication cohort using qRT-PCR in tissue and plasma samples. Subsequently, we analyzed potential target genes of deregulated miRNAs using a bioinformatics approach, determined their expression in GC tissues and performed correlation analysis with targeting miRNAs. Profiling with TLDA revealed 15 deregulated miRNAs in GC tissues compared to normal gastric mucosa. Replication analysis confirmed that miR-148a-3p, miR-204-5p, miR-223-3p and miR-375 were consistently deregulated in GC tissues. Analysis of GC patients' plasma samples showed significant down-regulation of miR-148a-3p and miR-375 and up-regulation of miR-223-3p compared to healthy subjects. Further, using bioinformatic tools we identified targets of the replicated miRNAs and performed disease-associated gene enrichment analysis. Ultimately, we evaluated potential target gene BCL2 and DNMT3B expression by qRT-PCR in GC tissue, which correlated with targeting miRNA expression. Our study revealed the miRNA profile in GC tissues and showed that miR-148a-3p, miR-223-3p and miR-375 are deregulated in GC plasma samples, but these circulating miRNAs showed relatively weak diagnostic performance as sole biomarkers

  1. Mechanical analysis of the joint between Wendelstein 7-X target elements and the divertor frame structure

    Energy Technology Data Exchange (ETDEWEB)

    Smirnow, M., E-mail: michael.smirnow@ipp.mpg.de; Kuchelmeister, M.; Boscary, J.; Tittes, H.; Peacock, A.

    2014-10-15

The target elements of the actively cooled high heat flux (HHF) divertor of Wendelstein 7-X are made of CFC (carbon fiber-reinforced carbon composite) tiles bonded to a CuCrZr heat sink and are mounted onto a support frame. During operation, the power loading will result in thermal expansion of the target elements. Their attachment to the support frame needs to provide, on the one hand, enough flexibility to allow some movement to release the induced thermal stresses and, on the other hand, enough stiffness to avoid a misalignment of one target element relative to the others. This flexibility is realized by a spring element made of a stack of disc springs together with a sliding support at one of the two or three mounting points. Detailed finite element calculations have shown that the deformation of the heat sink leads to some non-axial deformation of the spring elements. A mechanical test was performed to validate the attachment design under cyclic loading and to measure deformations typical of the expected deformation of the elements. The outcome of this study is the validation of the selected attachment design, which experimentally survived the applied mechanical cycling that simulates the thermal cycling under operation.

  2. Comparison Between Interactive Closest Point and Procrustes Analysis for Determining the Median Sagittal Plane of Three-Dimensional Facial Data.

    Science.gov (United States)

    Xiong, Yuxue; Zhao, Yijiao; Yang, Huifang; Sun, Yucun; Wang, Yong

    2016-03-01

To compare 2 digital methods of determining the median sagittal plane of three-dimensional facial data: the interactive closest point algorithm and Procrustes analysis. The three-dimensional facial data of 30 volunteers were acquired with the Face Scan 3D optical sensor (3D-Shape GmbH, Erlangen, Germany) and then imported into the reverse engineering software Imageware 13.0 (Siemens, Plano, TX) and Geomagic 2012 (Cary, NC). Mirrored data were generated and superimposed onto the original data by the interactive closest point and Procrustes analysis methods. The median sagittal planes of the 2 methods were extracted from the original and mirrored facial data respectively, and 3 asymmetry indices were measured for comparison. Differences between the facial asymmetry indices of the 2 methods were evaluated using the paired-sample t-test. In terms of the 3 asymmetry indices, there were no significant differences between interactive closest point and Procrustes analysis for extracting the median sagittal plane from three-dimensional facial data (t = 0.060, P = 0.953 for asymmetry index (AI) 1; t = -0.926, P = 0.362 for AI 2; t = 1.172, P = 0.251 for AI 3). In this evaluation of 30 subjects, the Procrustes analysis and interactive closest point median sagittal planes were similar in terms of the 3 asymmetry indices. Thus, both Procrustes analysis and interactive closest point can be used to extract the median sagittal plane from three-dimensional facial data.
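
Both methods superimpose the mirrored scan onto the original. As an illustration of the Procrustes step only (a 2D sketch with hypothetical points; the facial-data case is 3D and is usually solved with an SVD), the optimal rotation after centering has a closed form:

```python
import math

def procrustes_2d(src, dst):
    """Least-squares translation+rotation mapping src onto dst.
    In 2D the optimal angle comes directly from the sums of cross and dot
    products of the centered coordinates (the 3D case needs an SVD)."""
    n = len(src)
    cs = [sum(p[i] for p in src) / n for i in (0, 1)]  # centroids
    cd = [sum(p[i] for p in dst) / n for i in (0, 1)]
    s = [(x - cs[0], y - cs[1]) for x, y in src]       # centered sets
    d = [(x - cd[0], y - cd[1]) for x, y in dst]
    num = sum(sx * dy - sy * dx for (sx, sy), (dx, dy) in zip(s, d))
    den = sum(sx * dx + sy * dy for (sx, sy), (dx, dy) in zip(s, d))
    theta = math.atan2(num, den)
    co, si = math.cos(theta), math.sin(theta)
    def apply(p):
        x, y = p[0] - cs[0], p[1] - cs[1]
        return (co * x - si * y + cd[0], si * x + co * y + cd[1])
    return theta, [apply(p) for p in src]

# A triangle and a 30-degree rotated + translated copy of it.
src = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.5)]
ang = math.radians(30)
dst = [(math.cos(ang) * x - math.sin(ang) * y + 1.0,
        math.sin(ang) * x + math.cos(ang) * y - 0.5) for x, y in src]

theta, fitted = procrustes_2d(src, dst)
print(f"recovered rotation: {math.degrees(theta):.1f} deg")  # ≈ 30.0
```

In the facial application, the "dst" set is the mirrored scan, and the plane fixed by the superimposition yields the median sagittal plane.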

  3. Analysis of company success potential from the point of view of success-ability

    Directory of Open Access Journals (Sweden)

    Robert Zich

    2009-01-01

Full Text Available The success-ability conception represents a specific approach to company competitive strategy creation. Because of a different philosophy, especially as regards the concept of competitive advantages, it requires a specific approach to evaluation of a company's market position from the point of view of its competitiveness. Basic evaluation includes four perspectives: the profile of company competitiveness, suitability of the adopted approach, ability to develop the adopted approach, and evaluation from the point of view of the benefit for the customer together with the ability of the competition to imitate the company's approach. This method has application not only in the area of company strategy creation but can also be used when investigating the competitiveness of companies.

  4. Fault Detection and Diagnosis of Railway Point Machines by Sound Analysis

    Science.gov (United States)

    Lee, Jonguk; Choi, Heesu; Park, Daihee; Chung, Yongwha; Kim, Hee-Young; Yoon, Sukhan

    2016-01-01

    Railway point devices act as actuators that provide different routes to trains by driving switchblades from the current position to the opposite one. Point failure can significantly affect railway operations, with potentially disastrous consequences. Therefore, early detection of anomalies is critical for monitoring and managing the condition of rail infrastructure. We present a data mining solution that utilizes audio data to efficiently detect and diagnose faults in railway condition monitoring systems. The system enables extracting mel-frequency cepstrum coefficients (MFCCs) from audio data with reduced feature dimensions using attribute subset selection, and employs support vector machines (SVMs) for early detection and classification of anomalies. Experimental results show that the system enables cost-effective detection and diagnosis of faults using a cheap microphone, with accuracy exceeding 94.1% whether used alone or in combination with other known methods. PMID:27092509
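
The MFCC feature-extraction step can be sketched compactly. The block below is a didactic, pure-Python version (direct DFT, triangular mel filterbank, DCT-II) and omits the paper's attribute subset selection and SVM stages; the frame size, sample rate and filter counts are illustrative, not the paper's settings.

```python
import cmath
import math

def dft_mag(frame):
    """Magnitude spectrum via a direct DFT (fine for a short demo frame)."""
    n = len(frame)
    return [abs(sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) for k in range(n // 2 + 1)]

def hz_to_mel(f):
    return 2595.0 * math.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mfcc(frame, sr, n_filters=12, n_coeffs=5):
    spec = dft_mag(frame)
    n_bins = len(spec)
    # Triangular mel filterbank between 0 Hz and the Nyquist frequency.
    mels = [i * hz_to_mel(sr / 2) / (n_filters + 1) for i in range(n_filters + 2)]
    edges = [mel_to_hz(m) * 2 * (n_bins - 1) / sr for m in mels]  # bin indices
    energies = []
    for f in range(1, n_filters + 1):
        lo, c, hi = edges[f - 1], edges[f], edges[f + 1]
        e = 0.0
        for b in range(n_bins):
            if lo < b < hi:
                w = (b - lo) / (c - lo) if b <= c else (hi - b) / (hi - c)
                e += w * spec[b] ** 2
        energies.append(math.log(e + 1e-12))
    # DCT-II of the log filterbank energies gives the cepstral coefficients.
    return [sum(energies[m] * math.cos(math.pi * k * (m + 0.5) / n_filters)
                for m in range(n_filters)) for k in range(n_coeffs)]

sr = 8000
frame = [math.sin(2 * math.pi * 440 * t / sr) for t in range(256)]
coeffs = mfcc(frame, sr)
print([round(c, 2) for c in coeffs])
```

In the paper's pipeline, vectors like these (after feature selection) become the inputs to the SVM classifier that separates normal from faulty point-machine sounds.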

  7. [Powdered infant formulae preparation guide for hospitals based on Hazard Analysis and Critical Control Points (HACCP) principles].

    Science.gov (United States)

    Vargas-Leguás, H; Rodríguez Garrido, V; Lorite Cuenca, R; Pérez-Portabella, C; Redecillas Ferreiro, S; Campins Martí, M

    2009-06-01

    This guide for the preparation of powdered infant formulae in hospital environments is a collaborative work between several hospital services and is based on national and European regulations, international experts meetings and the recommendations of scientific societies. This guide also uses the Hazard Analysis and Critical Control Point principles proposed by Codex Alimentarius and emphasises effective verifying measures, microbiological controls of the process and the corrective actions when monitoring indicates that a critical control point is not under control. It is a dynamic guide and specifies the evaluation procedures that allow it to be constantly adapted.

  8. The implementation of a Hazard Analysis and Critical Control Point management system in a peanut butter ice cream plant

    Directory of Open Access Journals (Sweden)

    Yu-Ting Hung

    2015-09-01

Full Text Available To ensure the safety of peanut butter ice cream manufacture, a Hazard Analysis and Critical Control Point (HACCP) plan was designed and applied to the production process. Potential biological, chemical, and physical hazards in each manufacturing procedure were identified. The critical control points for the peanut butter ice cream were then determined to be the pasteurization and freezing processes. The establishment of a monitoring system, corrective actions, verification procedures, and documentation and record keeping followed to complete the HACCP program. The results of this study indicate that implementing the HACCP system in food industries can effectively enhance food safety and quality while improving production management.
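
A monitoring step for one of the critical control points named above can be sketched as a simple limit check with mandatory record keeping and a corrective action. The temperature/time limits below are hypothetical placeholders, not values from the plan.

```python
# Hypothetical critical limits for the two CCPs identified in the abstract
# (pasteurization and freezing); real limits come from the plant's HACCP plan.
CCP_LIMITS = {
    "pasteurization": {"min_temp_c": 79.4, "min_hold_s": 25.0},
    "freezing": {"max_temp_c": -18.0},
}

def monitor_pasteurization(temp_c, hold_s):
    """Check one pasteurization batch against its critical limits and
    produce the record that HACCP documentation requires."""
    lim = CCP_LIMITS["pasteurization"]
    ok = temp_c >= lim["min_temp_c"] and hold_s >= lim["min_hold_s"]
    return {
        "ccp": "pasteurization",
        "temp_c": temp_c,
        "hold_s": hold_s,
        "in_control": ok,
        # Corrective action is triggered whenever monitoring shows a CCP
        # out of control.
        "action": None if ok else "hold batch, reprocess or discard",
    }

print(monitor_pasteurization(80.1, 26.0)["in_control"])  # True
print(monitor_pasteurization(77.0, 26.0)["action"])
```

Verification procedures would then periodically audit these records against calibrated instruments, closing the HACCP loop.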

  9. Virtual diplomacy: an analysis of the structure of the target audiences

    Directory of Open Access Journals (Sweden)

    V. V. Verbytska

    2016-03-01

Full Text Available In the context of the global information society, communication processes, especially at the international level, become more important. The effectiveness of communication depends primarily on its focus, i.e. on clearly defining the target audience it should address. Virtual diplomacy, as a kind of political communication at the international level, is no exception. The novelty, rapid development and dissemination of this phenomenon require profound analysis and the elaboration of effective utilization strategies, including studying its recipients and target audiences. Purpose: identification, structuring and analysis of the recipients of virtual diplomacy as the audiences of international political communication. The study uses research methods such as system analysis, structural functionalism, dialectics and synergy, comparison, and critical analysis. Main results of the research: 1. The study examined the specifics of political communication at the international level in the context of the development of the global information society. 2. It analyzed the recipients of virtual diplomacy as a kind of political communication at the international level. 3. The study highlighted the key target groups in the global Internet network based on the tasks performed by virtual diplomacy. 4. It proved the effectiveness of cooperation with each target group in the framework of virtual diplomacy. 5. It described the specifics of the work with each target group in the context of virtual diplomacy. Practical implications: The article may be useful for writing scientific theoretical studies, tests, essays and term papers, and for designing special courses in universities in the sphere of international relations and international information. It can also be a guide for authorities carrying out diplomatic activities and international information cooperation. Findings: In the context of the establishment of the global information society political

  10. System for automatic x-ray-image analysis, measurement, and sorting of laser fusion targets

    International Nuclear Information System (INIS)

    Singleton, R.M.; Perkins, D.E.; Willenborg, D.L.

    1980-01-01

    This paper describes the Automatic X-Ray Image Analysis and Sorting (AXIAS) system which is designed to analyze and measure x-ray images of opaque hollow microspheres used as laser fusion targets. The x-ray images are first recorded on a high resolution film plate. The AXIAS system then digitizes and processes the images to accurately measure the target parameters and defects. The primary goals of the AXIAS system are: to provide extremely accurate and rapid measurements, to engineer a practical system for a routine production environment and to furnish the capability of automatically measuring an array of images for sorting and selection

  11. Hot-spot analysis for drug discovery targeting protein-protein interactions.

    Science.gov (United States)

    Rosell, Mireia; Fernández-Recio, Juan

    2018-04-01

    Protein-protein interactions are important for biological processes and pathological situations, and are attractive targets for drug discovery. However, rational drug design targeting protein-protein interactions is still highly challenging. Hot-spot residues are seen as the best option to target such interactions, but their identification requires detailed structural and energetic characterization, which is only available for a tiny fraction of protein interactions. Areas covered: In this review, the authors cover a variety of computational methods that have been reported for the energetic analysis of protein-protein interfaces in search of hot-spots, and the structural modeling of protein-protein complexes by docking. This can help to rationalize the discovery of small-molecule inhibitors of protein-protein interfaces of therapeutic interest. Computational analysis and docking can help to locate the interface, molecular dynamics can be used to find suitable cavities, and hot-spot predictions can focus the search for inhibitors of protein-protein interactions. Expert opinion: A major difficulty for applying rational drug design methods to protein-protein interactions is that in the majority of cases the complex structure is not available. Fortunately, computational docking can complement experimental data. An interesting aspect to explore in the future is the integration of these strategies for targeting PPIs with large-scale mutational analysis.

  12. An analysis of health promotion materials for Dutch truck drivers: Off target and too complex?

    Science.gov (United States)

    Boeijinga, Anniek; Hoeken, Hans; Sanders, José

    2017-01-01

    Despite various health promotion initiatives, unfavorable figures regarding Dutch truck drivers' eating behaviors, exercise behaviors, and absenteeism have not improved. The aim was to obtain a better understanding of the low level of effectiveness of current health interventions for Dutch truck drivers by examining to what extent these are tailored to the target group's particular mindset (focus of content) and health literacy skills (presentation of content). The article analyzes 21 health promotion materials for Dutch truck drivers using a two-step approach: (a) an analysis of the materials' focus, guided by the Health Action Process Approach; and (b) an argumentation analysis, guided by pragma-dialectics. The corpus analysis revealed: (a) a predominant focus on the motivation phase; and (b) in line with the aim of motivating the target group, a consistent use of pragmatic arguments, which were typically presented in an implicit way. The results indicate that existing health promotion materials for Dutch truck drivers are not sufficiently tailored to the target group's mindset and health literacy skills. Recommendations are offered to develop more tailored/effective health interventions targeting this high-risk, underserved occupational group.

  13. TargetVue: Visual Analysis of Anomalous User Behaviors in Online Communication Systems.

    Science.gov (United States)

    Cao, Nan; Shi, Conglei; Lin, Sabrina; Lu, Jie; Lin, Yu-Ru; Lin, Ching-Yung

    2016-01-01

Users with anomalous behaviors in online communication systems (e.g. email and social media platforms) are potential threats to society. Automated anomaly detection based on advanced machine learning techniques has been developed to combat this issue; challenges remain, though, due to the difficulty of obtaining proper ground truth for model training and evaluation. Therefore, substantial human judgment on the automated analysis results is often required to better adjust the performance of anomaly detection. Unfortunately, techniques that allow users to understand the analysis results more efficiently, to make a confident judgment about anomalies, and to explore data in their context, are still lacking. In this paper, we propose a novel visual analysis system, TargetVue, which detects anomalous users via an unsupervised learning model and visualizes the behaviors of suspicious users in behavior-rich context through novel visualization designs and multiple coordinated contextual views. Particularly, TargetVue incorporates three new ego-centric glyphs to visually summarize a user's behaviors which effectively present the user's communication activities, features, and social interactions. An efficient layout method is proposed to place these glyphs on a triangle grid, which captures similarities among users and facilitates comparisons of behaviors of different users. We demonstrate the power of TargetVue through its application in a social bot detection challenge using Twitter data, a case study based on email records, and an interview with expert users. Our evaluation shows that TargetVue is beneficial to the detection of users with anomalous communication behaviors.

  14. Analysis, Thematic Maps and Data Mining from Point Cloud to Ontology for Software Development

    Science.gov (United States)

    Nespeca, R.; De Luca, L.

    2016-06-01

The primary purpose of the survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of the remote sensing systems that generate dense point clouds (range-based or image-based) are not limited only to the acquired data. The paper shows that it is possible to extrapolate very useful diagnostic information using spatial annotation, with algorithms already implemented in open-source software. Generally, the drawing of degradation maps is the result of manual work, and so depends on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud ASCII format. The first result is the extrapolation of new geometric descriptors. First, we create the digital maps with the calculated quantities. Subsequently, we move to semi-quantitative analyses that transform the new data into useful information. We have written algorithms for accurate selection, for the segmentation of the point cloud, and for automatic calculation of the real surface and the volume. Furthermore, we have created the graph of the spatial distribution of the descriptors. This work shows that by working on the data during processing we can transform the point cloud into an enriched database: its use, management and mining are easy, fast and effective for everyone involved in the restoration process.
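
One of the quantities mentioned above, the real surface of a selected region of the cloud, reduces to summing triangle areas once the selected points are meshed. A minimal sketch (not the authors' algorithm), using cross-product magnitudes:

```python
import math

def tri_area(p, q, r):
    """Area of a 3D triangle as half the cross-product magnitude."""
    u = [q[i] - p[i] for i in range(3)]
    v = [r[i] - p[i] for i in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * math.sqrt(cx * cx + cy * cy + cz * cz)

def surface_area(points, triangles):
    """Sum triangle areas over a mesh built from the point cloud."""
    return sum(tri_area(points[a], points[b], points[c])
               for a, b, c in triangles)

# A 1 m x 1 m planar patch split into two triangles: area must be 1.0 m^2.
points = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
triangles = [(0, 1, 2), (0, 2, 3)]
print(surface_area(points, triangles))  # 1.0
```

On a real facade scan the triangulation would come from a meshing step over the segmented points, and the same cross products also feed signed-volume calculations.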

  16. Second-order analysis of inhomogeneous spatial point processes with proportional intensity functions

    DEFF Research Database (Denmark)

    Guan, Yongtao; Waagepetersen, Rasmus; Beale, Colin M.

    2008-01-01

    of the intensity functions. The first approach is based on nonparametric kernel-smoothing, whereas the second approach uses a conditional likelihood estimation approach to fit a parametric model for the pair correlation function. A great advantage of the proposed methods is that they do not require the often...... to two spatial point patterns regarding the spatial distributions of birds in the U.K.'s Peak District in 1990 and 2004....

  17. Design Analysis of SNS Target Station Biological Shielding Monolith with Proton Power Uprate

    Energy Technology Data Exchange (ETDEWEB)

    Bekar, Kursat B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ibrahim, Ahmad M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-05-01

This report documents the analysis of the dose rate in the experiment area outside the Spallation Neutron Source (SNS) target station shielding monolith with a proton beam energy of 1.3 GeV. The analysis implemented a coupled three-dimensional (3D)/two-dimensional (2D) approach that used both the Monte Carlo N-Particle Extended (MCNPX) 3D Monte Carlo code and the Discrete Ordinates Transport (DORT) 2D deterministic code. The analysis with a proton beam energy of 1.3 GeV showed that the dose rate in continuously occupied areas on the lateral surface outside the SNS target station shielding monolith is less than 0.25 mrem/h, which complies with the SNS facility design objective. However, the methods and codes used in this analysis are out of date and unsupported, and the 2D approximation of the target shielding monolith does not accurately represent the geometry. We recommend that this analysis be updated with modern codes and libraries such as ADVANTG or SHIFT, which have demonstrated very high efficiency in full 3D radiation shielding analyses of similar and even more difficult problems.

  18. Feasibility Study of Using Mobile Laser Scanning Point Cloud Data for GNSS Line of Sight Analysis

    Directory of Open Access Journals (Sweden)

    Yuwei Chen

    2017-01-01

Full Text Available The positioning accuracy with good GNSS observations can easily reach centimetre level, supported by advanced GNSS technologies. However, it is still a challenge to offer a robust GNSS-based positioning solution in a GNSS-degraded area. The concept of GNSS shadow matching has been proposed to enhance GNSS-based position accuracy in city canyons, where nearby high buildings block parts of the GNSS radio frequency (RF) signals. However, the results rely on the accuracy of the ready-made 3D city model utilized. In this paper, we investigate a solution to generate a GNSS shadow mask with mobile laser scanning (MLS) point cloud data. The solution includes removal of noise points, determination of the objects that only attenuate the RF signal, extraction of the highest obstruction point, and finally angle calculation for the GNSS shadow mask generation. By analysing the data with the proposed methodology, it is concluded that the MLS point cloud data can be used to extract the GNSS shadow mask, after several processing steps that filter out hanging objects and plantings, without generating an accurate 3D model; this depicts the boundary of GNSS signal coverage more precisely in city canyon environments compared to traditional 3D models.
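
The final angle-calculation step reduces to simple trigonometry: the elevation of the highest obstruction point seen from the receiver defines the shadow boundary in that azimuth. A minimal sketch with hypothetical local coordinates:

```python
import math

def mask_elevation_deg(receiver, obstruction):
    """Elevation angle (deg) from the receiver to the highest obstruction
    point; satellites below this elevation in that azimuth are shadowed."""
    dx = obstruction[0] - receiver[0]
    dy = obstruction[1] - receiver[1]
    dz = obstruction[2] - receiver[2]
    return math.degrees(math.atan2(dz, math.hypot(dx, dy)))

# Hypothetical local ENU coordinates (metres): a receiver at street level
# and the roof edge of a 30 m building 20 m away.
receiver = (0.0, 0.0, 1.5)
roof_edge = (20.0, 0.0, 30.0)
elev = mask_elevation_deg(receiver, roof_edge)
print(f"mask elevation: {elev:.1f} deg")  # about 55 deg
# A satellite at 60 deg elevation in this azimuth would be visible,
# one at 45 deg would be shadowed.
```

Repeating this over all azimuths (after the filtering steps the abstract describes) yields the full shadow mask for shadow matching.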

  19. An analysis of the dependence of saccadic latency on target position and target characteristics in human subjects

    Directory of Open Access Journals (Sweden)

    Rosenberg Jay R

    2001-09-01

Full Text Available Background: Predictions from conduction velocity data for primate retinal ganglion cell axons indicate that the conduction time to the lateral geniculate nucleus for stimulation of peripheral retina should be no longer than for stimulation of central retina. On this basis, the latency of saccadic eye movements should not increase for more peripherally located targets. However, previous studies have reported very large increases, which implies a considerable increase in central processing time for the saccade-generating system. Results: In order to resolve this paradox, we undertook an extended series of experiments in which saccadic eye movements were recorded by electro-oculography in response to targets presented in the horizontal meridian in normal young subjects. For stationary or moving targets of either normal beam intensity or reduced red intensity, with the direction of gaze either straight ahead with respect to the head or directed eccentrically, the saccadic latency was shown to remain invariant with respect to a wide range of target angular displacements. Conclusions: These results indicate that, irrespective of the angular displacement of the target, the direction of gaze or the target intensity, the saccade-generating system operates with a constant generation time.

  20. CFD Analysis of the Active Part of the HYPER Spallation Target

    International Nuclear Information System (INIS)

    Nam-il Tak; Chungho Cho; Tae-Yung Song

    2006-01-01

    KAERI (Korea Atomic Energy Research Institute) is developing an accelerator driven system (ADS) named HYPER (HYbrid Power Extraction Reactor) for a transmutation of long-lived nuclear wastes. One of the challenging tasks for the HYPER system is to design a large spallation target having a beam power of 15∼25 MW. The present paper focuses on the thermal-hydraulic performance of the active part of the HYPER target. Computational fluid dynamics (CFD) analysis was performed using a commercial code CFX 5.7.1. Several advanced turbulence models with different grid structures were applied. The CFX results show the significant impact of the turbulence model on the window temperature. It is concluded that experimental verifications are very important for the design of the HYPER target. (authors)

  1. Mapping Long Noncoding RNA Chromatin Occupancy Using Capture Hybridization Analysis of RNA Targets (CHART).

    Science.gov (United States)

    Vance, Keith W

    2017-01-01

    Capture Hybridization Analysis of RNA Targets (CHART) has recently been developed to map the genome-wide binding profile of chromatin-associated RNAs. This protocol uses a small number of 22-28 nucleotide biotinylated antisense oligonucleotides, complementary to regions of the target RNA that are accessible for hybridization, to purify RNAs from a cross-linked chromatin extract. RNA-chromatin complexes are next immobilized on beads, washed, and specifically eluted using RNase H. Associated genomic DNA is then sequenced using high-throughput sequencing technologies and mapped to the genome to identify RNA-chromatin associations on a large scale. CHART-based strategies can be applied to determine the nature and extent of long noncoding RNA (long ncRNA) association with chromatin genome-wide and identify direct long ncRNA transcriptional targets.

  2. TARGETED AND OFF-TARGET (BYSTANDER AND ABSCOPAL) EFFECTS OF RADIATION THERAPY: REDOX MECHANISMS AND RISK-BENEFIT ANALYSIS.

    Science.gov (United States)

    Pouget, Jean-Pierre; Georgakilas, Alexandros G; Ravanat, Jean-Luc

    2018-01-19

Radiation therapy (from external beams to unsealed and sealed radionuclide sources) takes advantage of the detrimental effects of the clustered production of radicals and reactive oxygen species (ROS). Research has mainly focused on the interaction of radiation with water, which is the major constituent of living beings, and with nuclear DNA, which contains the genetic information. This led to the so-called "target" theory, according to which cells have to be hit by ionizing particles to elicit an important biological response, including cell death. In cancer therapy, the Poisson law and linear quadratic mathematical models have been used to describe the probability of hits per cell as a function of the radiation dose. However, in the last twenty years, many studies have shown that radiation generates "danger" signals that propagate from irradiated to non-irradiated cells, leading to bystander and other off-target effects. As with targeted effects, redox mechanisms also play a key role in off-target effects, through transmission not only of ROS and reactive nitrogen species (RNS) but also of cytokines, ATP and extracellular DNA. In particular, nuclear factor kappa B is essential for triggering self-sustained production of ROS and RNS, making the bystander response similar to inflammation. In some therapeutic situations, this phenomenon is associated with recruitment of immune cells that are involved in distant irradiation effects (so-called "away-from-target", i.e. abscopal, effects). Determining the contribution of targeted and off-target effects in the clinic is still challenging. This has important consequences in radiotherapy, but also possibly in diagnostic procedures and in radiation protection.

  3. Transcriptome profiling to identify ATRA-responsive genes in human iPSC-derived endoderm for high-throughput point of departure analysis (SOT Annual Meeting)

    Science.gov (United States)

    Toxicological tipping points occur at chemical concentrations that overwhelm a cell’s adaptive response leading to permanent effects. We focused on retinoid signaling in differentiating endoderm to identify developmental pathways for tipping point analysis. Human induced pluripot...

  4. A psychological and Islamic analysis of corporal punishment from children’s point of view

    Directory of Open Access Journals (Sweden)

    Mahbobeh Alborzi

    2012-04-01

Full Text Available There are various methods and ways for the socialization of children. To this end, this study analytically examines children's point of view about corporal punishment from an Islamic and psychological perspective. To do so, 40 male and female preschoolers, selected on the basis of availability, were tested using drawing and interview tests. The results from statistical analyses showed that most of the participating children had experienced mild corporal punishment. They did not have a positive view of corporal punishment, but they approved of other disciplining methods such as deprivation, penalization and not being spoken to. Children who experienced severe corporal punishment used dark colors, light lines and the least space in their paintings. Also, results from regression analysis showed that among the demographic variables of parents, the age and education of mothers were negative and significant predictors of the use of corporal punishment. The results were analyzed from both Islamic and psychological perspectives.

  5. HYGIENE PRACTICES IN URBAN RESTAURANTS AND CHALLENGES TO IMPLEMENTING FOOD SAFETY AND HAZARD ANALYSIS CRITICAL CONTROL POINTS (HACCP) PROGRAMMES IN THIKA TOWN, KENYA.

    Science.gov (United States)

    Muinde, R K; Kiinyukia, C; Rombo, G O; Muoki, M A

    2012-12-01

To determine the microbial load in food, examine safety measures and assess the possibility of implementing a Hazard Analysis Critical Control Points (HACCP) system. The target population for this study consisted of restaurant owners in Thika Municipality (n = 30). Simple random samples of restaurants were selected using a systematic sampling method for microbial analysis of cooked food, non-cooked food, raw food and water sanitation in the selected restaurants. Two hundred and ninety-eight restaurants within Thika Municipality were identified. Of these, 30 were sampled for microbiological testing. From the study, 221 (74%) of the restaurants were ready-to-eat establishments where food was prepared early enough to hold, and in only 77 (26%) of the restaurants did customers order the food they wanted. 118 (63%) of the restaurant operators/staff had knowledge of quality control and food safety measures, 24 (8%) of the restaurants applied this knowledge, while 256 (86%) of the restaurant staff indicated that food contains ingredients that are hazardous if poorly handled. 238 (80%) of the restaurants used weighing and sorting of food materials, 45 (15%) used preservation methods, and the rest used dry foods as critical control points in their food safety measures. The study showed the need to implement a Hazard Analysis Critical Control Points (HACCP) system to enhance food safety. Knowledge of HACCP was very low, with 89 (30%) of the restaurants applying some quality measures to the food production process. There was contamination with coliforms, Escherichia coli and Staphylococcus aureus, though at very low levels. The mean counts of coliforms, Escherichia coli and Staphylococcus aureus in sampled food were 9.7 × 10³ CFU/g, 8.2 × 10³ CFU/g and 5.4 × 10³ CFU/g respectively, with coliforms showing the highest mean.

  6. Stability analysis of the Gyroscopic Power Take-Off wave energy point absorber

    DEFF Research Database (Denmark)

    Nielsen, Søren R. K.; Zhang, Zili; Kramer, Morten Mejlhede

    2015-01-01

The Gyroscopic Power Take-Off (GyroPTO) wave energy point absorber consists of a float rigidly connected to a lever. The operational principle is somewhat similar to that of the so-called gyroscopic hand wrist exercisers, where the rotation of the float is brought forward by the rotational particle motion of the waves. At first, the equations of motion of the system are derived based on analytical rigid body dynamics. Next, assuming monochromatic waves, simplified equations are derived, valid under synchronisation of the ring of the gyro to the angular frequency of the excitation. Especially…

  7. Seismic analysis of fuel and target assemblies at a production reactor

    International Nuclear Information System (INIS)

    Braverman, J.I.; Wang, Y.K.

    1991-01-01

This paper describes the unique modeling and analysis considerations used to assess the seismic adequacy of the fuel and target assemblies in a production reactor at the Savannah River Site. This confirmatory analysis was necessary to provide assurance that the reactor can operate safely during a seismic event and be brought to a safe shutdown condition. The plant, originally designed in the 1950s, had to be assessed against more current seismic criteria. The design of the reactor internals and the magnitude of the structural responses enabled the use of a linear elastic dynamic analysis. A seismic analysis was performed using a finite element model consisting of the fuel and target assemblies, the reactor tank, and a portion of the concrete structure supporting the reactor tank. The submergence of the fuel and target assemblies in the water contained within the reactor tank can have a significant effect on their seismic response. Thus, the model included hydrodynamic fluid coupling effects between the assemblies and the reactor tank. Fluid coupling mass terms were based on formulations for solid bodies immersed in incompressible and frictionless fluids. The potential effects of gap conditions were also assessed in this evaluation. 5 refs., 6 figs., 1 tab

  8. Object-Based Point Cloud Analysis of Full-Waveform Airborne Laser Scanning Data for Urban Vegetation Classification

    Directory of Open Access Journals (Sweden)

    Norbert Pfeifer

    2008-08-01

Full Text Available Airborne laser scanning (ALS) is a remote sensing technique well-suited for 3D vegetation mapping and structure characterization because the emitted laser pulses are able to penetrate small gaps in the vegetation canopy. The backscattered echoes from the foliage, woody vegetation, the terrain, and other objects are detected, leading to a cloud of points. Higher echo densities (> 20 echoes/m²) and additional classification variables from full-waveform (FWF) ALS data, namely echo amplitude, echo width and information on multiple echoes from one shot, offer new possibilities in classifying the ALS point cloud. Currently FWF sensor information is hardly used for classification purposes. This contribution presents an object-based point cloud analysis (OBPA) approach, combining segmentation and classification of the 3D FWF ALS points, designed to detect tall vegetation in urban environments. The definition of tall vegetation includes trees and shrubs, but excludes grassland and herbage. In the applied procedure FWF ALS echoes are segmented by a seeded region growing procedure. All echoes, sorted in descending order by their surface roughness, are used as seed points. Segments are grown based on echo width homogeneity. Next, segment statistics (mean, standard deviation, and coefficient of variation) are calculated by aggregating echo features such as amplitude and surface roughness. For classification, a rule base is derived automatically from a training area using a statistical classification tree. To demonstrate our method we present data of three sites with around 500,000 echoes each. The accuracy of the classified vegetation segments is evaluated for two independent validation sites. In a point-wise error assessment, where the classification is compared with manually classified 3D points, completeness and correctness better than 90% are reached for the validation sites. In comparison to many other algorithms the proposed 3D point classification works on the original

  9. Towards understanding the lifespan extension by reduced insulin signaling: bioinformatics analysis of DAF-16/FOXO direct targets in Caenorhabditis elegans.

    Science.gov (United States)

    Li, Yan-Hui; Zhang, Gai-Gai

    2016-04-12

DAF-16, the C. elegans FOXO transcription factor, is an important determinant in aging and longevity. In this work, we manually curated FOXODB http://lyh.pkmu.cn/foxodb/, a database of FOXO direct targets. It now covers 208 genes. Bioinformatics analysis of 109 DAF-16 direct targets in C. elegans produced interesting results: (i) DAF-16 and the transcription factor PQM-1 co-regulate some targets; (ii) seventeen targets directly regulate lifespan; (iii) four targets are involved in lifespan extension induced by dietary restriction; and (iv) DAF-16 direct targets might play global roles in lifespan regulation.

  10. Four-dimensional targeting error analysis in image-guided radiotherapy

    International Nuclear Information System (INIS)

    Riboldi, M; Baroni, G; Sharp, G C; Chen, G T Y

    2009-01-01

Image-guided therapy (IGT) involves acquisition and processing of biomedical images to actively guide medical interventions. The proliferation of IGT technologies has been particularly significant in image-guided radiotherapy (IGRT), as a way to increase the tumor targeting accuracy. When IGRT is applied to moving tumors, image guidance becomes challenging, as motion leads to increased uncertainty. Different strategies may be applied to mitigate the effects of motion: each technique is related to a different technological effort and complexity in treatment planning and delivery. The objective comparison of different motion mitigation strategies can be achieved by quantifying the residual uncertainties in tumor targeting, to be detected by means of IGRT technologies. Such quantification requires an extension of targeting error theory to a 4D space, in which the 3D tumor trajectory is measured as a function of time (4D targeting error, 4DTE). Accurate 4DTE analysis can be represented by a motion probability density function, describing the statistical fluctuations of tumor trajectory. We illustrate the application of 4DTE analysis through examples, including weekly variations in tumor trajectory as detected by 4DCT, respiratory gating via external surrogates and real-time tumor tracking.

  11. Analysis of the thermomechanical behavior of the IFMIF bayonet target assembly under design loading scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Bernardi, D., E-mail: davide.bernardi@enea.it [ENEA Brasimone, Camugnano, BO (Italy); Arena, P.; Bongiovì, G.; Di Maio, P.A. [Dipartimento di Energia, Ingegneria dell’Informazione e Modelli Matematici, Università di Palermo, Viale delle Scienze, Palermo (Italy); Frisoni, M. [ENEA Bologna, Via Martiri di Monte Sole 4, Bologna (Italy); Miccichè, G.; Serra, M. [ENEA Brasimone, Camugnano, BO (Italy)

    2015-10-15

    In the framework of the IFMIF Engineering Validation and Engineering Design Activities (IFMIF/EVEDA) phase, ENEA is responsible for the design of the European concept of the IFMIF lithium target system which foresees the possibility to periodically replace only the most irradiated and thus critical component (i.e., the backplate) while continuing to operate the rest of the target for a longer period (the so-called bayonet backplate concept). In this work, the results of the steady state thermomechanical analysis of the IFMIF bayonet target assembly under two different design loading scenarios (a “hot” scenario and a “cold” scenario) are briefly reported highlighting the relevant indications obtained with respect to the fulfillment of the design requirements. In particular, the analyses have shown that in the hot scenario the temperatures reached in the target assembly are within the material acceptable limits while in the cold scenario transition below the ductile to brittle transition temperature (DBTT) cannot be excluded. Moreover, results indicate that the contact between backplate and high flux test module is avoided and that the overall structural integrity of the system is assured in both scenarios. However, stress linearization analysis reveals that ITER Structural Design Criteria for In-vessel Components (SDC-IC) design rules are not always met along the selected paths at backplate middle plane section in the hot scenario, thus suggesting the need of a revision of the backplate design or a change of the operating conditions.

  12. Global analysis of p53-regulated transcription identifies its direct targets and unexpected regulatory mechanisms.

    Science.gov (United States)

    Allen, Mary Ann; Andrysik, Zdenek; Dengler, Veronica L; Mellert, Hestia S; Guarnieri, Anna; Freeman, Justin A; Sullivan, Kelly D; Galbraith, Matthew D; Luo, Xin; Kraus, W Lee; Dowell, Robin D; Espinosa, Joaquin M

    2014-05-27

    The p53 transcription factor is a potent suppressor of tumor growth. We report here an analysis of its direct transcriptional program using Global Run-On sequencing (GRO-seq). Shortly after MDM2 inhibition by Nutlin-3, low levels of p53 rapidly activate ∼200 genes, most of them not previously established as direct targets. This immediate response involves all canonical p53 effector pathways, including apoptosis. Comparative global analysis of RNA synthesis vs steady state levels revealed that microarray profiling fails to identify low abundance transcripts directly activated by p53. Interestingly, p53 represses a subset of its activation targets before MDM2 inhibition. GRO-seq uncovered a plethora of gene-specific regulatory features affecting key survival and apoptotic genes within the p53 network. p53 regulates hundreds of enhancer-derived RNAs. Strikingly, direct p53 targets harbor pre-activated enhancers highly transcribed in p53 null cells. Altogether, these results enable the study of many uncharacterized p53 target genes and unexpected regulatory mechanisms.DOI: http://dx.doi.org/10.7554/eLife.02200.001. Copyright © 2014, Allen et al.

  13. Unsteady-state analysis of a counter-flow dew point evaporative cooling system

    KAUST Repository

    Lin, J.

    2016-07-19

Understanding the dynamic behavior of the dew point evaporative cooler is crucial in achieving efficient cooling for real applications. This paper details the development of a transient model for a counter-flow dew point evaporative cooling system. The transient model agreed well with the steady-state model as it approached steady conditions. Additionally, it accurately predicts the experimental data, with discrepancies within 4.3%. The transient responses of the cooling system were investigated under different inlet air conditions. Temporal temperature and humidity profiles were analyzed for different transient and step responses. The key findings from this study include: (1) the response trend and settling time are markedly dependent on the inlet air temperature, humidity and velocity; (2) the settling time of the transient response ranges from 50 s to 300 s when the system operates under different inlet conditions; and (3) the average transient wet bulb effectiveness (1.00–1.06) of the system is observed to be higher than the steady state wet bulb effectiveness (1.01) for our range of study. © 2016 Elsevier Ltd

  14. Delay analysis of a point-to-multipoint spectrum sharing network with CSI based power allocation

    KAUST Repository

    Khan, Fahd Ahmed

    2012-10-01

In this paper, we analyse the delay performance of a point-to-multipoint cognitive radio network which is sharing the spectrum with a point-to-multipoint primary network. The channels are assumed to be independent but not identically distributed, with Nakagami-m fading. A constraint on the peak transmit power of the secondary user transmitter (SU-Tx) is also considered in addition to the peak interference power constraint. Based on the constraints, a power allocation scheme which requires knowledge of the instantaneous channel state information (CSI) of the interference links is derived. The SU-Tx is assumed to be equipped with a buffer and is modelled using the M/G/1 queueing model. Closed form expressions for the probability distribution function (PDF) and cumulative distribution function (CDF) of the packet transmission time are derived. Using the PDF, the expressions for the moments of transmission time are obtained. In addition, using the moments, the expressions for the performance measures such as the total average waiting time of packets and the average number of packets waiting in the buffer of the SU-Tx are also obtained. Numerical simulations corroborate the theoretical results. © 2012 IEEE.
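The buffer metrics described in the abstract follow the standard M/G/1 machinery. As an illustration (not the paper's closed-form fading-channel expressions), the Pollaczek-Khinchine formula gives the average waiting time once the first two moments of the packet transmission time are known; the parameter values below are hypothetical:

```python
def mg1_metrics(lam, es, es2):
    """Mean performance measures of an M/G/1 queue.

    lam : packet arrival rate (packets/s)
    es  : E[S], mean transmission (service) time
    es2 : E[S^2], second moment of the transmission time
    """
    rho = lam * es                      # server utilisation, must be < 1
    assert rho < 1, "queue is unstable"
    wq = lam * es2 / (2 * (1 - rho))    # Pollaczek-Khinchine mean wait in buffer
    return {"rho": rho,
            "Wq": wq,                   # average waiting time in the buffer
            "W": wq + es,               # total average delay per packet
            "Lq": lam * wq}             # average number of packets in the buffer

# Example: exponential service with E[S] = 1 s (so E[S^2] = 2 s^2), lam = 0.5
metrics = mg1_metrics(0.5, 1.0, 2.0)    # -> Wq = 1.0 s, W = 2.0 s, Lq = 0.5
```

In the paper, the moments of the transmission time come from the derived PDF under Nakagami-m fading; here they are simply supplied as numbers.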

  15. Data for Suspect Screening and Non-Targeted Analysis of Drinking Water Using Point-Of-Use Filters

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset contains information about all the features extracted from the raw data files, the formulas that were assigned to some of these features, and the...

  16. A single point in protein trafficking by Plasmodium falciparum determines the expression of major antigens on the surface of infected erythrocytes targeted by human antibodies.

    Science.gov (United States)

    Chan, Jo-Anne; Howell, Katherine B; Langer, Christine; Maier, Alexander G; Hasang, Wina; Rogerson, Stephen J; Petter, Michaela; Chesson, Joanne; Stanisic, Danielle I; Duffy, Michael F; Cooke, Brian M; Siba, Peter M; Mueller, Ivo; Bull, Peter C; Marsh, Kevin; Fowkes, Freya J I; Beeson, James G

    2016-11-01

    Antibodies to blood-stage antigens of Plasmodium falciparum play a pivotal role in human immunity to malaria. During parasite development, multiple proteins are trafficked from the intracellular parasite to the surface of P. falciparum-infected erythrocytes (IEs). However, the relative importance of different proteins as targets of acquired antibodies, and key pathways involved in trafficking major antigens remain to be clearly defined. We quantified antibodies to surface antigens among children, adults, and pregnant women from different malaria-exposed regions. We quantified the importance of antigens as antibody targets using genetically engineered P. falciparum with modified surface antigen expression. Genetic deletion of the trafficking protein skeleton-binding protein-1 (SBP1), which is involved in trafficking the surface antigen PfEMP1, led to a dramatic reduction in antibody recognition of IEs and the ability of human antibodies to promote opsonic phagocytosis of IEs, a key mechanism of parasite clearance. The great majority of antibody epitopes on the IE surface were SBP1-dependent. This was demonstrated using parasite isolates with different genetic or phenotypic backgrounds, and among antibodies from children, adults, and pregnant women in different populations. Comparisons of antibody reactivity to parasite isolates with SBP1 deletion or inhibited PfEMP1 expression suggest that PfEMP1 is the dominant target of acquired human antibodies, and that other P. falciparum IE surface proteins are minor targets. These results establish SBP1 as part of a critical pathway for the trafficking of major surface antigens targeted by human immunity, and have key implications for vaccine development, and quantifying immunity in populations.

  17. Life-Cycle Cost-Benefit (LCCB) Analysis of Bridges from a User and Social Point of View

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle

    2009-01-01

During the last two decades, important progress has been made in the life-cycle cost-benefit (LCCB) analysis of structures, especially offshore platforms, bridges and nuclear installations, due to the large uncertainties related to the deterioration, maintenance, and benefits of such structures… The aim is to present and discuss some of these problems from a user and social point of view. A brief presentation of a preliminary study of the importance of including benefits in life-cycle cost-benefit analysis in management systems for bridges is shown. Benefits may be positive as well as negative from the user point of view. In the paper, negative benefits (user costs) are discussed in relation to the maintenance of concrete bridges, together with a limited number of excerpts from published reports related to the importance of estimating user costs when repairs of bridges are planned and when optimized strategies…

  18. [Introduction of hazard analysis and critical control points (HACCP) principles at the flight catering food production plant].

    Science.gov (United States)

    Popova, A Yu; Trukhina, G M; Mikailova, O M

The article describes the quality control and safety system implemented at one of the largest flight catering food production plants serving airline passengers and flight crews. The control system was based on Hazard Analysis and Critical Control Points (HACCP) principles and on the hygienic and anti-epidemic measures developed. The identification of hazard factors at each stage of the technical process is considered, and monitoring results for 6 critical control points over a five-year period are presented. The quality control and safety system reduces the risk of food contamination during the acceptance, preparation and supply of in-flight meals. The efficiency of the implemented system was demonstrated, and further ways to harmonize and implement HACCP principles at the plant are identified.

  19. Elemental chemical analysis of submerged targets by double-pulse laser-induced breakdown spectroscopy.

    Science.gov (United States)

    De Giacomo, A; Dell'Aglio, M; Casavola, A; Colonna, G; De Pascale, O; Capitelli, M

    2006-05-01

Double-pulse laser-induced plasma spectroscopy (DP-LIPS) is applied to submerged targets to investigate its feasibility for elemental analysis. The role of experimental parameters, such as inter-pulse delay and detection time, has been discussed in terms of the dynamics of the laser-induced bubble produced by the first pulse and its confinement effect on the plasma produced by the second laser pulse. The analytical performance of this technique applied to targets in a water environment is discussed. The elemental analysis of submerged copper alloys by DP-LIPS has been compared with conventional (single-pulse) LIBS in air. Theoretical investigation of the plasma dynamics in water bubbles and open air has been performed.

  20. Allocating the Fixed Resources and Setting Targets in Integer Data Envelopment Analysis

    Directory of Open Access Journals (Sweden)

    Kobra Gholami

    2013-11-01

Full Text Available Data envelopment analysis (DEA) is a non-parametric approach to evaluating a set of decision making units (DMUs) that consume multiple inputs to produce multiple outputs. Formally, DEA estimates the efficiency score of each DMU relative to the empirical efficient frontier. DEA can also be used to allocate resources and set targets for future forecasts. Data are continuous in the standard DEA model, whereas many real-life problems require integer data, such as the number of employees, machines or experts. Thus, in this paper we propose, for the first time, an approach to allocate fixed resources and set fixed targets under a selective integer assumption, based on an integer data envelopment analysis (IDEA) approach. The major aim of this approach is to preserve the efficiency scores of the DMUs, using the concept of benchmarking. A numerical example illustrates the applicability of the proposed method.
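As a sketch of the underlying machinery, the continuous input-oriented CCR envelopment model can be solved as a small linear program (here with SciPy on a hypothetical three-DMU, one-input, one-output dataset). The integer DEA variant of the paper would additionally impose integrality on selected inputs and outputs, which requires a MILP solver:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o.

    X: (m, n) input matrix, Y: (s, n) output matrix; columns are DMUs.
    Solves: min theta  s.t.  sum_j lam_j x_ij <= theta * x_io,
                             sum_j lam_j y_rj >= y_ro,  lam_j >= 0.
    """
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                 # minimise theta; vars = [theta, lam]
    A_in = np.hstack([-X[:, [o]], X])           # sum lam_j x_ij - theta x_io <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])   # -sum lam_j y_rj <= -y_ro
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

X = np.array([[2.0, 4.0, 8.0]])   # one input, three DMUs
Y = np.array([[2.0, 3.0, 4.0]])   # one output
scores = [ccr_efficiency(X, Y, o) for o in range(3)]   # ~ [1.0, 0.75, 0.5]
```

The first DMU lies on the efficient frontier (score 1.0); the others receive input-contraction targets proportional to their scores.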

  1. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    Science.gov (United States)

    Deng, Xinyi

    2016-08-01

A common interest of scientists in many fields is to understand the relationship between the dynamics of a physical system and the occurrences of discrete events within such physical system. Seismologists study the connection between mechanical vibrations of the Earth and the occurrences of earthquakes so that future earthquakes can be better predicted. Astrophysicists study the association between the oscillating energy of celestial regions and the emission of photons to learn the Universe's various objects and their interactions. Neuroscientists study the link between behavior and the millisecond-timescale spike patterns of neurons to understand higher brain functions. Such relationships can often be formulated within the framework of state-space models with point process observations. The basic idea is that the dynamics of the physical systems are driven by the dynamics of some stochastic state variables and the discrete events we observe in an interval are noisy observations with distributions determined by the state variables. This thesis proposes several new methodological developments that advance the framework of state-space models with point process observations at the intersection of statistics and neuroscience. In particular, we develop new methods 1) to characterize the rhythmic spiking activity using history-dependent structure, 2) to model population spike activity using marked point process models, 3) to allow for real-time decision making, and 4) to take into account the need for dimensionality reduction for high-dimensional state and observation processes. We applied these methods to a novel problem of tracking rhythmic dynamics in the spiking of neurons in the subthalamic nucleus of Parkinson's patients with the goal of optimizing placement of deep brain stimulation electrodes. We developed a decoding algorithm that can make decisions in real time (for example, whether or not to stimulate the neurons) based on various sources of information present in
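The state-space-with-point-process-observations idea can be sketched in a few lines: a latent stochastic state drives the conditional intensity of a spike train, which is observed only as discrete events. All parameter values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
T, dt = 1000, 0.001              # 1 s of data in 1 ms bins
a, sigma = 0.99, 0.05            # AR(1) latent dynamics: x_t = a*x_{t-1} + noise
mu = np.log(20 * dt)             # baseline log-intensity (about 20 spikes/s)

# Latent state trajectory (the unobserved dynamics of the physical system)
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + sigma * rng.normal()

lam = np.exp(mu + x)             # conditional intensity per bin
spikes = rng.random(T) < lam     # Bernoulli approximation of the point process
```

Decoding inverts this generative model: the latent state is estimated from the observed spike train alone, e.g. with a point-process filter, which is what enables real-time decisions such as whether to stimulate.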

  2. Thoracic endovascular aortic repair migration and aortic elongation differentiated using dual reference point analysis.

    Science.gov (United States)

    Alberta, Hillary B; Takayama, Toshio; Panthofer, Annalise; Cambria, Richard P; Farber, Mark A; Jordan, William D; Matsumura, Jon S

    2018-02-01

    We evaluated images of patients undergoing a thoracic endovascular aortic repair procedure using two reference points as a means for differentiating stent graft migration from aortic elongation. Conventional standards define migration of a stent graft as an absolute change in the distance from the distal graft ring to a distal landmark ≥10 mm compared with a baseline measurement. Aortic elongation occurs over time in both healthy individuals and patients with aortic disease. Aortic elongation in patients with stent grafts may result in increased distal thoracic aortic lengths over time. False-positive stent graft migration would be defined when these patients meet the standard definition for migration, even if the stent has not moved in relation to the elongating aorta. This retrospective study evaluated the aortic length of 23 patients treated with the conformable GORE TAG thoracic endoprosthesis (W. L. Gore & Associates, Flagstaff, Ariz) in three clinical trials (dissection, traumatic injury, and aneurysm). Patients who met the standard definition for migration were selected. A standardized protocol was used to measure aortic centerline lengths, including the innominate artery (IA) to the most distal device ring, the IA to the celiac artery (CA), and the distal ring to the CA. Baseline lengths obtained from the first postoperative image were compared with length measurements obtained from the first interval at which they met the standard definition for migration. The conventional standards for migration using a single reference point were compared with the use of dual reference points. Of the 23 patients with endograft changes, 20 were deemed to have aortic elongation rather than true migration. The remaining three patients were deemed to have migration on the basis of the IA to distal ring position compared with the IA to CA length change. The IA to CA interval length change was markedly greater in those with elongation compared with migration (23.8 ± 8.4
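The dual-reference-point logic of the study can be expressed as a simple decision rule. This is an illustrative reconstruction, not the authors' protocol: a distal-ring shift is flagged using the conventional ≥10 mm standard, and is attributed to elongation rather than migration when the IA-to-CA length has grown by at least as much:

```python
def classify_endograft_change(ring_base, ring_follow, ca_base, ca_follow,
                              threshold_mm=10.0):
    """Differentiate stent graft migration from aortic elongation.

    ring_*: IA-to-distal-device-ring centerline length (mm)
    ca_*  : IA-to-celiac-artery centerline length (mm)
    """
    ring_change = ring_follow - ring_base   # apparent distal movement of the graft
    ca_change = ca_follow - ca_base         # overall aortic lengthening
    if abs(ring_change) < threshold_mm:
        return "stable"
    # If the whole aorta lengthened at least as much as the ring position
    # changed, the graft moved with the elongating aorta, not relative to it.
    return "elongation" if ca_change >= ring_change else "migration"

classify_endograft_change(100, 125, 300, 330)  # -> "elongation"
classify_endograft_change(100, 115, 300, 303)  # -> "migration"
```

With a single reference point (the first call's 25 mm ring shift), the conventional definition would report a false-positive migration.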

  3. Analysis of the dynamics of a nutating body. [numerical analysis of displacement, velocity, and acceleration of point on mechanical drives

    Science.gov (United States)

    Anderson, W. J.

    1974-01-01

    The equations for the displacement, velocity, and acceleration of a point in a nutating body are developed. These are used to derive equations for the inertial moment developed by a nutating body of arbitrary shape. Calculations made for a previously designed nutating plate transmission indicate that the device is severely speed limited because of the very high magnitude of its inertial moment.

  4. Simulating Serial-Target Antibacterial Drug Synergies Using Flux Balance Analysis

    DEFF Research Database (Denmark)

    Krueger, Andrew S.; Munck, Christian; Dantas, Gautam

    2016-01-01

    Flux balance analysis (FBA) is an increasingly useful approach for modeling the behavior of metabolic systems. However, standard FBA modeling of genetic knockouts cannot predict drug combination synergies observed between serial metabolic targets, even though such synergies give rise to some...... the possibility for more accurate genome-scale predictions of drug synergies, which can be used to suggest treatments for infections and other diseases....

  5. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article presents a new methodological approach to target-oriented forecasting of company cash flows based on analysis of the company's financial position. The approach is universal and presumes application of the following techniques developed by the author: a financial ratio values correction technique and a correcting cash flows technique. The financial ratio values correction technique is used to analyze and forecast the company's financial position, while the correcting cash flows technique i...

  6. Targeting khat or targeting Somalis? A discourse analysis of project evaluations on khat abuse among Somali immigrants in Scandinavia

    Directory of Open Access Journals (Sweden)

    Nordgren Johan

    2015-09-01

    Full Text Available BACKGROUND – In Denmark, Norway and Sweden, the use of the psychoactive plant khat is widely seen as a social and health problem exclusively affecting the Somali immigrant population. Several projects by governmental and municipal bodies and agencies have been initiated to reduce khat use and abuse within this target population.

  7. Determination of fluidized bed granulation end point using near-infrared spectroscopy and phenomenological analysis.

    Science.gov (United States)

    Findlay, W Paul; Peck, Garnet R; Morris, Kenneth R

    2005-03-01

    Simultaneous real-time monitoring of particle size and moisture content by near-infrared spectroscopy through a window into the bed of a fluidized bed granulator is used to determine the granulation end point. The moisture content and particle size determined by the near-infrared monitor correlates well with off-line moisture content and particle size measurements. The measured particle size is modeled using a population balance approach, and the moisture content is shown to follow accepted models during drying. Given a known formulation, with predefined parameters for peak moisture content, final moisture content, and final granule size, the near-infrared monitoring system can be used to control a fluidized bed granulation by determining when binder addition should be stopped and when drying of the granules is complete. Copyright 2005 Wiley-Liss, Inc. and the American Pharmacists Association.
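The control idea in this abstract (stop binder addition once the target granule size is reached, then declare the batch finished once drying brings moisture down to the final target) can be sketched as a toy state function. The thresholds and the decision rule are hypothetical placeholders, not the study's formulation-specific parameters.

```python
def granulation_state(moisture_frac, granule_size_um,
                      final_moisture_frac, final_size_um):
    """Toy fluidized-bed end-point logic sketched from the abstract:
    binder addition continues until the target granule size is reached,
    then drying proceeds until the final moisture target. All thresholds
    are formulation-specific assumptions."""
    if granule_size_um < final_size_um:
        return "continue binder addition"
    if moisture_frac > final_moisture_frac:
        return "stop binder addition; start drying"
    return "granulation complete"
```

In the paper's setup the two inputs would come from the in-line near-infrared monitor rather than off-line measurements.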

  8. Analysis on signal properties due to concurrent leaks at two points in water supply pipelines

    International Nuclear Information System (INIS)

    Lee, Young Sup

    2015-01-01

    Intelligent leak detection is an essential component of an underground water supply pipeline network, such as a smart water grid system. In such a network, numerous leak detection sensors, installed at regular intervals, are needed to cover all of the pipelines in a given area. It is also necessary to determine the existence of any leaks and estimate their locations within a short time after they occur. In this study, the leak signal properties and the feasibility of leak location detection were investigated when concurrent leaks occurred at two points in a pipeline. The straight distance between the two leak sensors in the 100A-sized cast-iron pipeline was 315.6 m, and their signals were measured with one leak and with two concurrent leaks. Each leak location was determined after analyzing the frequency properties and cross-correlation of the measured signals.
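The standard single-leak version of the cross-correlation localization used in this record can be sketched as follows: the lag of the cross-correlation peak between the two sensor signals gives the difference in arrival times, from which the leak position between the sensors follows. The sampling rate, wave speed, and the noiseless synthetic burst below are assumptions for illustration, not the paper's cast-iron pipeline data.

```python
import numpy as np

def locate_leak(sig_a, sig_b, fs_hz, spacing_m, wave_speed_mps):
    """Estimate a single leak's distance from sensor A via the
    cross-correlation time delay tau = t_a - t_b, which gives
    d_a = (spacing + c * tau) / 2. Illustrative noiseless sketch."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag_samples = np.argmax(corr) - (len(sig_b) - 1)  # peak lag in samples
    tau_s = lag_samples / fs_hz
    return (spacing_m + wave_speed_mps * tau_s) / 2.0

# Synthetic check: leak 100 m from sensor A on a 300 m span,
# wave speed 1000 m/s, sampling at 1 kHz (all assumed values).
fs, c, span = 1000.0, 1000.0, 300.0
sig_a, sig_b = np.zeros(1000), np.zeros(1000)
burst = np.hanning(32)
sig_a[100:132] = burst   # arrives after 0.1 s (100 m / 1000 m/s)
sig_b[200:232] = burst   # arrives after 0.2 s (200 m / 1000 m/s)
d_a = locate_leak(sig_a, sig_b, fs, span, c)
```

With two concurrent leaks, as studied in the paper, the correlation function exhibits multiple peaks and the analysis becomes correspondingly harder.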

  9. PowerPoint® Presentation Flaws and Failures: A Psychological Analysis

    Science.gov (United States)

    Kosslyn, Stephen M.; Kievit, Rogier A.; Russell, Alexandra G.; Shephard, Jennifer M.

    2012-01-01

    Electronic slideshow presentations are often faulted anecdotally, but little empirical work has documented their faults. In Study 1 we found that eight psychological principles are often violated in PowerPoint® slideshows, and are violated to similar extents across different fields – for example, academic research slideshows generally were no better or worse than business slideshows. In Study 2 we found that respondents reported having noticed, and having been annoyed by, specific problems in presentations arising from violations of particular psychological principles. Finally, in Study 3 we showed that observers are not highly accurate in recognizing when particular slides violated a specific psychological rule. Furthermore, even when they correctly identified the violation, they often could not explain the nature of the problem. In sum, the psychological foundations for effective slideshow presentation design are neither obvious nor necessarily intuitive, and presentation designers in all fields, from education to business to government, could benefit from explicit instruction in relevant aspects of psychology. PMID:22822402

  11. Nitrate transport and supply limitations quantified using high-frequency stream monitoring and turning point analysis

    Science.gov (United States)

    Jones, Christopher S.; Wang, Bo; Schilling, Keith E.; Chan, Kung-sik

    2017-06-01

    Agricultural landscapes often leak inorganic nitrogen to the stream network, usually in the form of nitrate-nitrite (NOx-N), degrading downstream water quality on both the local and regional scales. While the spatial distribution of nitrate sources has been delineated in many watersheds, less is known about the complicated temporal dynamics that drive stream NOx-N because traditional methods of stream grab sampling are often conducted at a low frequency. Deployment of accurate real-time, continuous measurement devices that have been developed in recent years enables high-frequency sampling that provides detailed information on the concentration-discharge relation and the timing of NOx-N delivery to streams. We aggregated 15-min interval NOx-N and discharge data over a nine-year period into daily averages and then used robust statistical methods to identify how the discharge regime within an artificially-drained agricultural watershed reflected catchment hydrology and NOx-N delivery pathways. We then quantified how transport and supply limitations varied from year-to-year and how dependence of these limitations varied with climate, especially drought. Our results show NOx-N concentrations increased linearly with discharge up to an average "turning point" of 1.42 mm of area-normalized discharge, after which concentrations decline with increasing discharge. We estimate transport and supply limitations to govern 57 and 43 percent, respectively, of the NOx-N flux over the nine-year period. Drought effects on the NOx-N flux linger for multiple years and this is reflected in a greater tendency toward supply limitations in the three years following drought. How the turning point varies with climate may aid in prediction of NOx-N loading in future climate regimes.
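The turning-point behaviour described in this abstract (concentration rising with discharge up to a breakpoint, then declining) can be sketched as a broken-stick regression found by grid search. This is a minimal illustration of the concept only; the paper's robust statistical machinery, and its real nine-year 15-minute record, are not reproduced, and the synthetic data below are assumptions.

```python
import numpy as np

def fit_turning_point(q, c, candidates):
    """Grid-search a piecewise-linear (broken-stick) concentration-
    discharge model: one line below the candidate turning point, another
    above it; return the candidate minimizing total squared error."""
    best_qt, best_sse = None, np.inf
    for qt in candidates:
        lo, hi = q <= qt, q > qt
        if lo.sum() < 2 or hi.sum() < 2:
            continue  # need at least two points per segment
        sse = 0.0
        for mask in (lo, hi):
            A = np.vstack([q[mask], np.ones(mask.sum())]).T
            coef, *_ = np.linalg.lstsq(A, c[mask], rcond=None)
            sse += float(((c[mask] - A @ coef) ** 2).sum())
        if sse < best_sse:
            best_qt, best_sse = qt, sse
    return best_qt

# Synthetic concentration-discharge data with a true turning point at
# 1.4 mm area-normalized discharge (an assumed, not measured, value).
q = np.linspace(0.1, 3.0, 60)
conc = np.where(q <= 1.4, 2.0 + 3.0 * q, 2.0 + 3.0 * 1.4 - 2.0 * (q - 1.4))
qt_hat = fit_turning_point(q, conc, np.linspace(0.5, 2.5, 21))
```

Flux on the rising limb is then attributed to transport limitation and flux above the turning point to supply limitation, mirroring the 57/43 percent split reported above.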

  12. Three-dimensional distribution of cortical synapses: a replicated point pattern-based analysis

    Directory of Open Access Journals (Sweden)

    Laura eAnton-Sanchez

    2014-08-01

    Full Text Available The biggest problem when analyzing the brain is that its synaptic connections are extremely complex. Generally, the billions of neurons making up the brain exchange information through two types of highly specialized structures: chemical synapses (the vast majority) and so-called gap junctions (a substrate of one class of electrical synapse). Here we are interested in exploring the three-dimensional spatial distribution of chemical synapses in the cerebral cortex. Recent research has shown that the three-dimensional spatial distribution of synapses in layer III of the neocortex can be modeled by a random sequential adsorption (RSA) point process, i.e., synapses are distributed in space almost randomly, with the only constraint that they cannot overlap. In this study we hypothesize that RSA processes can also explain the distribution of synapses in all cortical layers. We also investigate whether there are differences in both the synaptic density and spatial distribution of synapses between layers. Using combined focused ion beam milling and scanning electron microscopy (FIB/SEM), we obtained three-dimensional samples from the six layers of the rat somatosensory cortex and identified and reconstructed the synaptic junctions. A total tissue volume of approximately 4500 μm3 and around 4000 synapses from three different animals were analyzed. Different samples, layers and/or animals were aggregated and compared using RSA replicated spatial point processes. The results showed no significant differences in the synaptic distribution across the different rats used in the study. We found that RSA processes described the spatial distribution of synapses in all samples of each layer. We also found that the synaptic distribution in layers II to VI conforms to a common underlying RSA process with different densities per layer. Interestingly, the results showed that synapses in layer I had a slightly different spatial distribution from the other layers.
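The RSA null model this study fits is simple to simulate: candidate points are drawn uniformly in a volume and accepted only if they keep a hard-core distance from every previously accepted point. Below is a minimal 3-D sketch; the box dimensions and exclusion radius are arbitrary assumptions, not the study's measured synapse sizes.

```python
import random

def rsa_points(n_target, box, radius, max_attempts=100000, seed=0):
    """Random sequential adsorption in 3-D: accept a uniform candidate
    centre only if it lies at least 2*radius from every accepted centre
    (non-overlapping hard spheres). Returns the accepted centres."""
    rng = random.Random(seed)
    pts, min_d2 = [], (2 * radius) ** 2
    attempts = 0
    while len(pts) < n_target and attempts < max_attempts:
        attempts += 1
        p = tuple(rng.uniform(0, s) for s in box)
        if all(sum((a - b) ** 2 for a, b in zip(p, q)) >= min_d2
               for q in pts):
            pts.append(p)
    return pts
```

At low densities, as here, the accepted pattern is close to uniformly random; the hard-core constraint only becomes dominant as the volume fills toward the jamming limit.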

  13. The application of specific point energy analysis to laser cutting with 1 μm laser radiation

    OpenAIRE

    Hashemzadeh, M.; Suder, W; Williams, S.; Powell, J.; Kaplan, A.F.H.; Voisey, K.T.

    2014-01-01

    Specific point energy (SPE) is a concept that has been successfully used in laser welding where SPE and power density determine penetration depth. This type of analysis allows the welding characteristics of different laser systems to be directly compared. This paper investigates if the SPE concept can usefully be applied to laser cutting. In order to provide data for the analysis laser cutting of various thicknesses of mild steel with a 2kW fibre laser was carried out over a wide range of par...
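In the welding-derived formulation this abstract refers to, specific point energy is the laser power multiplied by the beam-material interaction time, with the interaction time taken as spot diameter divided by travel speed. A minimal sketch of that calculation follows; the exact formulation used in the cutting study, and the example parameter values, are assumptions.

```python
def specific_point_energy(power_w, spot_diameter_m, travel_speed_mps):
    """SPE (joules) as laser power times beam-material interaction time,
    t_i = spot diameter / travel speed (welding-derived definition)."""
    interaction_time_s = spot_diameter_m / travel_speed_mps
    return power_w * interaction_time_s

# Assumed example: 2 kW beam, 0.5 mm spot, 50 mm/s travel speed.
spe_j = specific_point_energy(2000.0, 0.0005, 0.05)
```

Because SPE collapses power, spot size, and speed into one quantity, it allows welding (and potentially cutting) results from different laser systems to be compared directly, which is the premise the paper tests.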

  14. Globally Optimized Targeted Mass Spectrometry: Reliable Metabolomics Analysis with Broad Coverage.

    Science.gov (United States)

    Gu, Haiwei; Zhang, Ping; Zhu, Jiangjiang; Raftery, Daniel

    2015-12-15

    Targeted detection is one of the most important methods in mass spectrometry (MS)-based metabolomics; however, its major limitation is the reduced metabolome coverage that results from the limited set of targeted metabolites typically used in the analysis. In this study we describe a new approach, globally optimized targeted (GOT)-MS, that combines many of the advantages of targeted detection and global profiling in metabolomics analysis, including the capability to detect unknowns, broad metabolite coverage, and excellent quantitation. The key step in GOT-MS is a global search of precursor and product ions using a single liquid chromatography-triple quadrupole (LC-QQQ) mass spectrometer. Here, focused on measuring serum metabolites, we obtained 595 precursor ions and 1 890 multiple reaction monitoring (MRM) transitions, under positive and negative ionization modes in the mass range of 60-600 Da. For many of the MRMs/metabolites under investigation, the analytical performance of GOT-MS is better than or at least comparable to that obtained by global profiling using a quadrupole-time-of-flight (Q-TOF) instrument of similar vintage. Using a study of serum metabolites in colorectal cancer (CRC) as a representative example, GOT-MS significantly outperformed a large targeted MS assay containing ∼160 biologically important metabolites and provided a complementary approach to traditional global profiling using Q-TOF-MS. GOT-MS thus expands and optimizes the detection capabilities for QQQ-MS through a novel approach and should have the potential to significantly advance both basic and clinical metabolic research.

  15. Mechanical behavior analysis of a submerged fixed point anchoring system for a hydroacoustic signature measuring sensor for divers and ships

    Science.gov (United States)

    Slamnoiu, G.; Radu, O.; Surdu, G.; Roşca, V.; Damian, R.; Pascu, C.; Curcă, E.; Rădulescu, A.

    2016-08-01

    The main objective of this paper is to present and analyse numerical results from the study of a fixed-point anchoring system for a hydroacoustic sensor used to measure the hydroacoustic signature of divers and ships in real sea conditions. The study of the mechanical behavior of this system has two main objectives: optimization of the shape and weight of the anchorage ballast for the metallic structure, while considering the necessity of maintaining the sensor at a fixed point, and analysis of the sensor movements and the influence of sea current streams on the measurements. The study focused on the 3D model of the metallic structure design, numerical modeling of the water flow around the sensor anchoring structure using volume-of-fluid analysis, and analysis of forces and displacements using FEM where needed. In this paper we have used data for the sea motion dynamics, in particular the velocity of the sea current streams, as determined by experimental measurements conducted for the western area of the Black Sea.

  16. A new integrated dual time-point amyloid PET/MRI data analysis method

    Energy Technology Data Exchange (ETDEWEB)

    Cecchin, Diego; Zucchetta, Pietro; Turco, Paolo; Bui, Franco [University Hospital of Padua, Nuclear Medicine Unit, Department of Medicine - DIMED, Padua (Italy); Barthel, Henryk; Tiepolt, Solveig; Sabri, Osama [Leipzig University, Department of Nuclear Medicine, Leipzig (Germany); Poggiali, Davide; Cagnin, Annachiara; Gallo, Paolo [University Hospital of Padua, Neurology, Department of Neurosciences (DNS), Padua (Italy); Frigo, Anna Chiara [University Hospital of Padua, Biostatistics, Epidemiology and Public Health Unit, Department of Cardiac, Thoracic and Vascular Sciences, Padua (Italy)

    2017-11-15

    In the initial evaluation of patients with suspected dementia and Alzheimer's disease, there is no consensus on how to perform semiquantification of amyloid in such a way that it: (1) facilitates visual qualitative interpretation, (2) takes the kinetic behaviour of the tracer into consideration, particularly with regard to at least partially correcting for blood flow dependence, (3) analyses the amyloid load based on accurate parcellation of cortical and subcortical areas, (4) includes partial volume effect correction (PVEC), (5) includes MRI-derived topographical indexes, (6) enables application to PET/MRI images and PET/CT images with separately acquired MR images, and (7) allows automation. A method with all of these characteristics was retrospectively tested in 86 subjects who underwent amyloid (18F-florbetaben) PET/MRI in a clinical setting (using images acquired 90-110 min after injection, 53 were classified visually as amyloid-negative and 33 as amyloid-positive). Early images after tracer administration were acquired between 0 and 10 min after injection, and later images were acquired between 90 and 110 min after injection. PVEC of the PET data was carried out using the geometric transfer matrix method. Parametric images and some regional output parameters, including two innovative "dual time-point" indexes, were obtained. Subjects classified visually as amyloid-positive showed a sparse tracer uptake in the primary sensory, motor and visual areas in accordance with the isocortical stage of the topographic distribution of the amyloid plaque (Braak stages V/VI). In patients classified visually as amyloid-negative, the method revealed detectable levels of tracer uptake in the basal portions of the frontal and temporal lobes, areas that are known to be sites of early deposition of amyloid plaques, probably representing early accumulation (Braak stage A) typical of normal ageing. There was a strong correlation between

  17. Fed-state gastric media and drug analysis techniques: Current status and points to consider.

    Science.gov (United States)

    Baxevanis, Fotios; Kuiper, Jesse; Fotaki, Nikoletta

    2016-10-01

    Gastric fed state conditions can have a significant effect on drug dissolution and absorption. In vitro dissolution tests with simple aqueous media cannot usually predict drugs' in vivo response, as several factors such as the meal content, gastric emptying, and possible interactions between food and drug formulations can affect a drug's pharmacokinetics. A good understanding of the effect of in vivo fed gastric conditions on the drug is essential for the development of biorelevant dissolution media simulating the gastric environment after administration of the standard high-fat meal proposed by the FDA and the EMA in bioavailability/bioequivalence (BA/BE) studies. The analysis of drugs in fed state media can be quite challenging, as most analytical protocols currently employed are time consuming and labour intensive. In this review, an overview of the in vivo gastric conditions and the biorelevant media used for their in vitro simulation is given. Furthermore, an analysis of the physicochemical properties of the drugs and formulations related to food effect is provided. In terms of drug analysis, the protocols currently used for fed state media sample treatment and analysis are discussed, together with the analytical challenges and the emerging need for more efficient and time-saving techniques for a broad spectrum of compounds. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. On the Lie point symmetry analysis and solutions of the inviscid ...

    Indian Academy of Sciences (India)


  19. Points of convergence between functional and formal approaches to syntactic analysis

    DEFF Research Database (Denmark)

    Bjerre, Tavs; Engels, Eva; Jørgensen, Henrik

    2008-01-01

    respectively: The functional approach is represented by Paul Diderichsen's (1936, 1941, 1946, 1964) sætningsskema, ‘sentence model', and the formal approach is represented by analysis whose main features are common to the principles and parameters framework (Chomsky 1986) and the minimalist programme (Chomsky...

  20. An analysis of possible off target effects following CAS9/CRISPR targeted deletions of neuropeptide gene enhancers from the mouse genome.

    Science.gov (United States)

    Hay, Elizabeth Anne; Khalaf, Abdulla Razak; Marini, Pietro; Brown, Andrew; Heath, Karyn; Sheppard, Darrin; MacKenzie, Alasdair

    2017-08-01

    We have successfully used comparative genomics to identify putative regulatory elements within the human genome that contribute to the tissue-specific expression of neuropeptides such as galanin and receptors such as CB1. However, a previous inability to rapidly delete these elements from the mouse genome has prevented optimal assessment of their function in vivo. This has been solved using CAS9/CRISPR genome editing technology, which uses a bacterial endonuclease called CAS9 that, in combination with specifically designed guide RNA (gRNA) molecules, cuts specific regions of the mouse genome. However, reports of "off-target" effects, whereby the CAS9 endonuclease is able to cut sites other than those targeted, limit the appeal of this technology. We used cytoplasmic microinjection of gRNA and CAS9 mRNA into 1-cell mouse embryos to rapidly generate enhancer knockout mouse lines. The current study describes our analysis of the genomes of these enhancer knockout lines to detect possible off-target effects. Bioinformatic analysis was used to identify the most likely putative off-target sites and to design PCR primers that would amplify these sequences from genomic DNA of founder enhancer deletion mouse lines. Amplified DNA was then sequenced and blasted against the mouse genome sequence to detect off-target effects. Using this approach we were unable to detect any evidence of off-target effects in the genomes of three founder lines using any of the four gRNAs used in the analysis. This study suggests that the problem of off-target effects in transgenic mice has been exaggerated and that CAS9/CRISPR represents a highly effective and accurate method of deleting putative neuropeptide gene enhancer sequences from the mouse genome. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Critical analysis of the potential for therapeutic targeting of mammalian target of rapamycin (mTOR) in gastric cancer

    Directory of Open Access Journals (Sweden)

    Inokuchi M

    2014-04-01

    Full Text Available Mikito Inokuchi,1 Keiji Kato,1 Kazuyuki Kojima,2 Kenichi Sugihara1 1Department of Surgical Oncology, 2Department of Minimally Invasive Surgery, Tokyo Medical and Dental University, Tokyo, Japan Abstract: Multidisciplinary treatment including chemotherapy has become the global standard of care for patients with metastatic gastric cancer (mGC); nonetheless, survival remains poor. Although many molecular-targeted therapies have been developed for various cancers, only anti-HER2 treatment has produced promising results in patients with mGC. Mammalian target of rapamycin (mTOR) plays a key role in cell proliferation, antiapoptosis, and metastasis in signaling pathways from the tyrosine kinase receptor, and its activation has been demonstrated in gastric cancer (GC) cells. This review discusses the clinical relevance of mTOR in GC and examines its potential as a therapeutic target in patients with mGC. Preclinical studies in animal models suggest that suppression of the mTOR pathway inhibits the proliferation of GC cells and delays tumor progression. The mTOR inhibitor everolimus has been evaluated as second- or third-line treatment in clinical trials. Adverse events were well tolerated, although the effectiveness of everolimus alone was limited. Everolimus is now being evaluated in combination with chemotherapy in Phase III clinical studies in this subgroup of patients. Two Phase III studies include exploratory biomarker research designed to evaluate the predictive value of the expression or mutation of molecules related to the Akt/mTOR signaling pathway. These biomarker studies may lead to the realization of targeted therapy for selected patients with mGC in the future. Keywords: gastric cancer, mTOR, everolimus

  2. Genetic analysis of the gravitropic set-point angle in lateral roots of arabidopsis

    Science.gov (United States)

    Mullen, J. L.; Hangarter, R. P.

    2003-05-01

    Research on gravity responses in plants has mostly focused on primary roots and shoots, which typically orient to a vertical orientation. However, the distribution of lateral organs and their characteristically non-vertical growth orientation are critical for the determination of plant form. For example, in Arabidopsis, when lateral roots emerge from the primary root, they grow at a nearly horizontal orientation. As they elongate, the roots slowly curve until they eventually reach a vertical orientation. The regulation of this lateral root orientation is an important component affecting overall root system architecture. We found that this change in orientation is not simply due to the onset of gravitropic competence, as non-vertical lateral roots are capable of both positive and negative gravitropism. Thus, the horizontal growth of new lateral roots appears to be determined by what is called the gravitropic set-point angle (GSA). This developmental control of the GSA of lateral roots in Arabidopsis provides a useful system for investigating the components involved in regulating gravitropic responses. Using this system, we have identified several Arabidopsis mutants that have altered lateral root orientations but maintain normal primary root orientation.

  3. Smart point-of-care systems for molecular diagnostics based on nanotechnology: whole blood glucose analysis

    Science.gov (United States)

    Devadhasan, Jasmine P.; Kim, Sanghyo

    2015-07-01

    Complementary metal oxide semiconductor (CMOS) image sensors have received great attention for their high efficiency in biological applications. The present work describes a CMOS image sensor-based whole blood glucose monitoring system using a point-of-care (POC) approach. A simple poly-ethylene terephthalate (PET) film chip was developed to carry out the enzyme kinetic reaction at various concentrations of blood glucose. In this technique, assay reagent was adsorbed onto amine-functionalized silica (AFSiO2) nanoparticles in order to achieve glucose oxidation on the PET film chip. The AFSiO2 nanoparticles immobilize the assay reagent through electrostatic attraction and facilitated the development of an opaque platform, a chip technically suitable for analysis by the camera module. The oxidized glucose then produces a green color according to the glucose concentration and is analyzed by the camera module using a photon-detection technique; the photon number decreases with increasing glucose concentration. This simple sensing approach, utilizing an enzyme-immobilized AFSiO2 nanoparticle chip and an assay detection method, was developed for quantitative glucose measurement.

  4. Analysis of lawsuits related to point-of-care ultrasonography in neonatology and pediatric subspecialties.

    Science.gov (United States)

    Nguyen, J; Cascione, M; Noori, S

    2016-09-01

    Point-of-care ultrasonography (POCUS) is becoming increasingly available for neonatologists and pediatric subspecialists (PSS); however, concerns over potential litigation from possible missed diagnoses or incorrect management have been documented. This study aims to define the extent and quality of lawsuits filed against neonatologists and PSS related to POCUS. We conducted a retrospective study of all United States reported state and federal cases in the Westlaw database from January 1990 through October 2015. Cases were reviewed and included if either a neonatologist or PSS were accused of misconduct or the interpretation or failure to perform an ultrasound/echocardiogram was discussed. Descriptive statistics were used to evaluate the data. Our search criteria returned 468 results; 2 cases were determined to be relevant to the study objective. The two cases alleged a failure to perform a diagnostic test and implicated POCUS as an option. There were no cases of neonatologists and PSS being sued for POCUS performance or interpretation. This study of a major legal database suggests that POCUS use and interpretation is not a significant cause of lawsuits against neonatologists and PSS.

  5. Performance analysis of commercial multiple-input-multiple-output access point in distributed antenna system.

    Science.gov (United States)

    Fan, Yuting; Aighobahi, Anthony E; Gomes, Nathan J; Xu, Kun; Li, Jianqiang

    2015-03-23

    In this paper, we experimentally investigate the throughput of IEEE 802.11n 2x2 multiple-input-multiple-output (MIMO) signals in a radio-over-fiber-based distributed antenna system (DAS) with different fiber lengths and power imbalance. Both a MIMO-supported access point (AP) and a spatial-diversity-supported AP were separately employed in the experiments. Throughput measurements were carried out with wireless users at different locations in a typical office environment. For the different fiber length effect, the results indicate that MIMO signals can maintain high throughput when the fiber length difference between the two remote antenna units (RAUs) is under 100 m and falls quickly when the length difference is greater. For the spatial diversity signals, high throughput can be maintained even when the difference is 150 m. On the other hand, the separation of the MIMO antennas allows additional freedom in placing the antennas in strategic locations for overall improved system performance, although it may also lead to received power imbalance problems. The results show that the throughput performance drops in specific positions when the received power imbalance is above around 13 dB. Hence, there is a trade-off between the extent of the wireless coverage for moderate bit-rates and the area over which peak bit-rates can be achieved.

  6. Parametric analysis of a combined dew point evaporative-vapour compression based air conditioning system

    Directory of Open Access Journals (Sweden)

    Shailendra Singh Chauhan

    2016-09-01

    Full Text Available A dew point evaporative-vapour compression based combined air conditioning system for providing good human comfort conditions at a low cost has been proposed in this paper. The proposed system has been parametrically analysed for a wide range of ambient temperatures and specific humidities under some reasonable assumptions. The proposed system has also been compared with the conventional vapour compression air conditioner on the basis of cooling load on the cooling coil, assuming operation on 100% fresh air. The saving in cooling load on the coil was found to be greatest, at 60.93%, at 46 °C and 6 g/kg specific humidity, while it was negative for very high ambient humidity, which indicates that the proposed system is applicable to dry and moderately humid conditions but not to very humid conditions. The system performs well, with an average net monthly power saving of 192.31 kW h for hot and dry conditions and 124.38 kW h for hot and moderately humid conditions. It could therefore be a better alternative for dry and moderately humid climates, with a payback period of 7.2 years.
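The payback figure quoted in this abstract follows from a simple undiscounted calculation: extra capital cost divided by the annual energy-cost saving. A sketch of that arithmetic follows; the capital cost and electricity tariff in the example are placeholders, not figures from the study.

```python
def payback_years(extra_capital_cost, monthly_kwh_saving,
                  tariff_per_kwh, months_per_year=12):
    """Simple (undiscounted) payback period in years: extra cost of the
    combined system divided by the annual energy-cost saving."""
    annual_saving = monthly_kwh_saving * months_per_year * tariff_per_kwh
    return extra_capital_cost / annual_saving

# Placeholder example: 1200 currency units extra cost, 100 kWh/month
# saved, 0.10 per kWh -> 10-year simple payback.
years = payback_years(1200.0, 100.0, 0.10)
```

A discounted-cash-flow treatment would lengthen the payback somewhat; the 7.2-year figure above presumably reflects the study's own cost and tariff assumptions.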

  7. A meshfree radial point interpolation method for analysis of functionally graded material (FGM) plates

    Science.gov (United States)

    Dai, K. Y.; Liu, G. R.; Lim, K. M.; Han, X.; Du, S. Y.

    A meshfree model is presented for the static and dynamic analyses of functionally graded material (FGM) plates based on the radial point interpolation method (PIM). In the present method, the mid-plane of an FGM plate is represented by a set of distributed nodes while the material properties in its thickness direction are computed analytically to take into account their continuous variations from one surface to another. Several examples are successfully analyzed for static deflections, natural frequencies and dynamic responses of FGM plates with different volume fraction exponents and boundary conditions. The convergence rate and accuracy are studied and compared with the finite element method (FEM). The effects of the constituent fraction exponent on static deflection as well as natural frequency are also investigated in detail using different FGM models. Based on the current material gradient, it is found that as the volume fraction exponent increases, the mechanical characteristics of the FGM plate approach those of the pure metal plate blended in the FGM.
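
    The continuous through-thickness variation described above is commonly modelled with a power-law rule of mixtures governed by the volume fraction exponent n. The sketch below uses typical alumina/aluminium values as an illustration; they are not taken from the paper. As n grows, the mid-plane property approaches that of the pure metal, consistent with the trend reported above.

```python
# Power-law FGM through-thickness property variation (rule of mixtures).
def fgm_property(z: float, h: float, p_ceramic: float, p_metal: float, n: float) -> float:
    """Effective property at height z (-h/2 <= z <= h/2) for volume-fraction exponent n."""
    vc = (z / h + 0.5) ** n  # ceramic volume fraction at height z
    return (p_ceramic - p_metal) * vc + p_metal

E_c, E_m = 380e9, 70e9  # Young's moduli: alumina (ceramic), aluminium (metal), Pa
h = 0.02                # plate thickness, m
for n in (0.5, 1.0, 5.0):
    E_mid = fgm_property(0.0, h, E_c, E_m, n)
    print(f"n = {n}: mid-plane E = {E_mid / 1e9:.1f} GPa")
```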

  8. An analysis of silver candlesticks from a casting point of view: originals, copies, forgeries

    Directory of Open Access Journals (Sweden)

    Jaromir Audy

    2010-02-01

    Full Text Available Collecting silver artefacts has traditionally been a very popular hobby. Silver is addictive, and the number of potential collectors and investors appears to grow each year. Unfortunately, this growing interest and buying potential has resulted in a number of forgeries being manufactured and introduced to the open antique market. Items such as early silver candlesticks command very high prices, and many high-quality fakes look convincing and closely match the originals. Such copies are traditionally manufactured by casting, using the original items as patterns. Small details and variances in design features, the position and shape of hallmarks, and the final surface quality are the usual features by which fakes can be distinguished from originals. This paper presents the results of a study conducted on several silver candlesticks, including two artefacts bearing features of those produced in the mid-18th century, one original Italian candelabrum from the Fascist era, and small candlesticks made in the early 20th century. The paper also examines some contemporary coins, replicas of those produced in different countries, which were offered for sale by unscrupulous dealers via auctions and eBay. Finally, the main results and findings from this study are discussed from a manufacturing point of view, covering fabrication technology, surface quality and hallmarks, to help collectors, dealers and investors detect and avoid forgeries.

  9. Accuracy Analysis of Precise Point Positioning of Compass Navigation System Applied to Crustal Motion Monitoring

    Science.gov (United States)

    Wang, Yuebing

    2017-04-01

    Based on Compass/GPS observation data collected at five stations over the period from July 1, 2014 to June 30, 2016, and using the PPP positioning model of the PANDA software developed by Wuhan University, we analyzed the positioning accuracy of single-system and combined Compass/GPS solutions and assessed the capability of the Compass navigation system for crustal motion monitoring. The results show that the positioning accuracy of the Compass navigation system in the east-west direction is lower than in the north-south direction (accuracy taken as 3 times RMS); in general, the positioning accuracy is about 1-2 cm in the horizontal direction and about 5-6 cm in the vertical direction. The GPS positioning accuracy is better than 1 cm in the horizontal direction and about 1-2 cm in the vertical direction. The accuracy of the combined Compass/GPS solution is comparable to that of GPS. It is worth mentioning that, although Compass precise point positioning is less accurate than GPS, two sets of velocity fields were obtained by analyzing the Compass and GPS time series with the Nikolaidis (2002) model, and the maximum difference between the two horizontal velocity fields is 1.8 mm/a. Based on its horizontal velocity field, the Compass navigation system can already be used to monitor crustal movement in areas of large deformation.
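
    The velocity-field step can be sketched as a least-squares fit of a daily position time series with a linear trend plus annual and semi-annual terms, in the spirit of the Nikolaidis (2002) model (offsets and post-seismic terms omitted here). The station data below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0, 2, 1 / 365.25)   # two years of daily epochs, in years
true_v = 0.012                    # synthetic 12 mm/a motion, in m/a
y = (true_v * t
     + 0.003 * np.sin(2 * np.pi * t)          # 3 mm annual signal
     + rng.normal(0, 0.002, t.size))          # 2 mm daily noise

# Design matrix: offset, velocity, annual and semi-annual sinusoids
A = np.column_stack([
    np.ones_like(t), t,
    np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
    np.sin(4 * np.pi * t), np.cos(4 * np.pi * t),
])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(f"estimated velocity = {coef[1] * 1000:.1f} mm/a")
```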

  10. Measurement standards and the general problem of reference points in chemical analysis

    International Nuclear Information System (INIS)

    Richter, W.; Dube, G.

    2002-01-01

    Besides the measurement standards available in general metrology in the form of the realisations of the units of measurement, measurement standards of chemical composition are needed for the vast field of chemical measurement (measurements of the chemical composition), because it is the main aim of such measurements to quantify non-isolated substances, often in complicated matrices, to which the 'classical' measurement standards and their lower-level derivatives are not directly applicable. At present, material artefacts as well as standard measurement devices serve as chemical measurement standards. These are measurement standards in the full metrological sense only, however, if they are firmly linked to the SI unit in which the composition represented by the standard is expressed. This requirement has the consequence that only a very restricted number of really reliable chemical measurement standards exist at present. Since it is very difficult and time consuming to increase this number substantially and, on the other hand, reliable reference points are increasingly needed for all kinds of chemical measurements, primary methods of measurement and high-level reference measurements will play an increasingly important role for the establishment of worldwide comparability and hence mutual acceptance of chemical measurement results. (author)

  11. [Analysis of the key points in the micro-endodontic treatment].

    Science.gov (United States)

    Hou, B X

    2016-08-01

    Micro-endodontic treatment refers to the microscope-assisted endodontic treatment techniques. The microscope offers a stereoscopic, enlarged image under great magnification and illumination at a comfortable working position. It will greatly promote the precision and improve the outcomes of endodontic treatment through enhancing the ability to detect the complexity of the root canal system of teeth that probably cannot be seen by the naked eyes, remove the infectious substances in root canal more efficiently, provide a tight root canal obturation and carry out effective retreatment procedures. The requirements of micro-endodontic treatment are different from the conventional root canal therapy carried out without microscope due to the complicated structure of the microscope. In order to make the use of microscope easier, it is of great importance to learn how to adjust the position of the operator and the patient, preset the angle of objective lens and the eyepiece, select the proper magnification and instruments, practice eye-hand cooperation under the microscope, etc. The purpose of this article was to analyze the key points in the applications of the microscope in endodontic treatment by reviewing the literature together with the author's clinical experience.

  12. PowerPoint® Presentation Flaws and Failures: A Psychological Analysis

    Directory of Open Access Journals (Sweden)

    Stephen Michael Kosslyn

    2012-07-01

    Full Text Available Electronic slideshow presentations are often faulted anecdotally, but little empirical work has documented their faults. Three studies reported here document psychological causes of their flaws. In Study 1 we found that eight psychological principles are often violated in PowerPoint® presentations, across different fields—for example, academic research presentations generally were no better or worse than business presentations. In Study 2 we found that respondents reported having noticed, and having been annoyed by, specific problems in presentations arising from violations of particular psychological principles. Finally, in Study 3 we showed that observers are not highly accurate in recognizing when slides violated a specific psychological rule. Furthermore, even when they correctly identified the violation, they often could not explain the nature of the problem. In sum, the psychological foundations for effective slideshow presentation design are neither obvious nor necessarily intuitive, and presentation designers in all fields, from education to business to government, could benefit from explicit instruction in relevant aspects of psychology.

  13. Common data model access; a unified layer to access data from data analysis point of view

    International Nuclear Information System (INIS)

    Poirier, S.; Buteau, A.; Ounsy, M.; Rodriguez, C.; Hauser, N.; Lam, T.; Xiong, N.

    2012-01-01

    For almost 20 years, the scientific community of neutron and synchrotron institutes have been dreaming of a common data format for exchanging experimental results and applications for reducing and analyzing the data. Using HDF5 as a data container has become the standard in many facilities. The big issue is the standardization of the data organization (schema) within the HDF5 container. By introducing a new level of indirection for data access, the Common-Data-Model-Access (CDMA) framework proposes a solution and allows separation of responsibilities between data reduction developers and the institute. Data reduction developers are responsible for data reduction code; the institute provides a plug-in to access the data. The CDMA is a core API that accesses data through a data format plug-in mechanism and scientific application definitions (sets of keywords) coming from a consensus between scientists and institutes. Using an innovative 'mapping' system between application definitions and physical data organizations, the CDMA allows data reduction application development independent of the data file container AND schema. Each institute develops a data access plug-in for its own data file formats along with the mapping between application definitions and its data files. Thus data reduction applications can be developed from a strictly scientific point of view and are immediately able to process data acquired from several institutes. (authors)
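
    The indirection described above can be illustrated with a minimal sketch: application code asks for data by keyword, and a per-institute plug-in maps that keyword to a path inside its own file schema. The class, institute and keyword names below are illustrative, not the actual CDMA API.

```python
class InstitutePlugin:
    """Maps application-definition keywords to institute-specific HDF5 paths."""
    def __init__(self, mapping: dict):
        self.mapping = mapping

    def resolve(self, keyword: str) -> str:
        return self.mapping[keyword]

# Two institutes store the same physical quantity under different schemas
soleil = InstitutePlugin({"detector_data": "/entry/scan_data/data_01"})
ansto = InstitutePlugin({"detector_data": "/entry1/data/hmm"})

def load(plugin: InstitutePlugin, keyword: str) -> str:
    # A real implementation would open the HDF5 file; here we only resolve paths.
    return plugin.resolve(keyword)

# The same application-level request works against both schemas
print(load(soleil, "detector_data"))  # /entry/scan_data/data_01
print(load(ansto, "detector_data"))   # /entry1/data/hmm
```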

  14. Maximum power point tracking analysis of a coreless ironless electric generator for renewable energy application

    Science.gov (United States)

    Razali, Akhtar; Rahman, Fadhlur; Leong, Yap Wee; Razali Hanipah, Mohd; Azri Hizami, Mohd

    2018-04-01

    The magnetic attraction between the permanent magnets and the soft iron-core laminations in a conventional iron-core electric generator is known as cogging. Overcoming cogging requires additional input power, so it is one of the sources of power loss. As the power output increases, cogging also increases proportionally, which in turn increases the power the driver motor must supply to overcome it. This research was therefore undertaken to study, at a fundamental level, the possibility of removing the iron-core laminations from an electric generator and to characterize the resulting performance. In the maximum power point tracking test, the fabricated coreless ironless electricity generator was loaded so as to maximize the power, voltage and current produced as the rotational speed of the rotor was increased throughout the test. The rotational torque and power output were measured, and the efficiency was then analyzed. Results indicate that the generator produced an RMS voltage of 200 VAC at a rotational speed of 318 RPM. The torque required to rotate the generator was 10.8 Nm. The generator had a working efficiency of 77.73%, and the power generated was 280 W.
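
    The reported operating point can be cross-checked by computing the mechanical input power from torque and speed; the figures below are taken from the abstract.

```python
import math

torque_nm = 10.8   # measured shaft torque, Nm (from the abstract)
speed_rpm = 318    # rotor speed, RPM (from the abstract)
p_out_w = 280.0    # electrical output power, W (from the abstract)

omega = 2 * math.pi * speed_rpm / 60  # angular speed, rad/s
p_in_w = torque_nm * omega            # mechanical input power, W
efficiency = 100 * p_out_w / p_in_w
print(f"input = {p_in_w:.0f} W, efficiency = {efficiency:.1f} %")
```

    The computed value (~77.9%) agrees with the reported 77.73% to within rounding of the measured quantities.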

  15. UHR-Q-TOF Analysis Can Address Common Challenges in Targeted and Untargeted Metabolomics

    Science.gov (United States)

    Zurek, G.; Krug, D.; Muller, R.; Barsch, A.

    2011-01-01

    Here, we present an ESI-UHR-Q-TOF based analysis of myxobacterial secondary metabolites, which makes it possible to address several challenges frequently encountered in metabolite profiling studies. Myxobacteria are promising producers of natural products exhibiting potent biological activities, and several myxobacterial metabolites are currently under investigation as potential leads for novel drugs. However, the myxobacteria are also a striking example of the divergence between the genetic capacity for the production of secondary metabolites and the number of compounds that have been characterised to date. Wild-type and mutant strains were analyzed for the production patterns of known metabolites and with regard to the discovery of new metabolites. Sample throughput: Since the mass accuracy and resolution of TOF instruments are independent of the acquisition rate, they are well suited to coupling with UHPLC separations. This hyphenation enables a reduction in analysis time combined with high chromatographic resolution, and therefore permits increased sample throughput. The UHR-TOF analysis revealed that an acquisition rate of up to 20 Hz did not compromise the achieved mass accuracy or resolution. Targeted and untargeted metabolite profiling: Acquisition of full-scan accurate mass spectra enables targeted screening for known compounds, e.g. from the class of DKxanthenes, based on very selective high-resolution EIC (hrEIC) traces with small mass windows of 0.5–1.0 mDa. A comparison of several datasets following a "comprehensive feature extraction" combined with statistical analysis permits untargeted discovery of novel biomarkers using the same data files as the targeted analysis. Identification: Even a mass accuracy of 0.1 ppm is not sufficient for unambiguous formula identification for m/z values above 500. A combination of accurate mass data and isotopic pattern information in MS and MS/MS spectra can extend this m/z range for reliable
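
    The hrEIC idea above can be sketched as follows: for each scan, keep only the intensity within a narrow m/z window (here ±0.5 mDa) around the target. The spectra below are synthetic, and the function name is illustrative.

```python
def hreic(scans, target_mz, window_mda=0.5):
    """Sum intensity per scan within +/- window_mda millidaltons of target_mz."""
    tol = window_mda / 1000.0  # convert mDa to Da
    return [
        sum(intensity for mz, intensity in scan if abs(mz - target_mz) <= tol)
        for scan in scans
    ]

# Three synthetic scans, each a list of (m/z, intensity) pairs
scans = [
    [(523.2501, 1200.0), (523.2600, 300.0)],  # target plus a nearby interference
    [(523.2498, 2500.0)],                     # target only
    [(610.1200, 900.0)],                      # unrelated ion only
]
print(hreic(scans, target_mz=523.2500))  # [1200.0, 2500.0, 0]
```

    The interference 10 mDa away is rejected, which is exactly what makes hrEIC traces so selective compared with conventional unit-resolution EICs.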

  16. Change-point analysis data of neonatal diffusion tensor MRI in preterm and term-born infants

    Directory of Open Access Journals (Sweden)

    Dan Wu

    2017-06-01

    Full Text Available The data presented in this article are related to the research article entitled "Mapping the Critical Gestational Age at Birth that Alters Brain Development in Preterm-born Infants using Multi-Modal MRI" (Wu et al., 2017 [1]. Brain immaturity at birth poses critical neurological risks in preterm-born infants. We used a novel change-point model to analyze the critical gestational age at birth (GAB) that could affect postnatal development, based on diffusion tensor MRI (DTI) acquired from 43 preterm and 43 term-born infants in 126 brain regions. In the corresponding research article, we presented change-point analysis of fractional anisotropy (FA) and mean diffusivity (MD) measurements in these infants. In this article, we offer the relative changes of axial and radial diffusivities (AD and RD) in relation to the changes in FA and the FA-based change-points, and we also provide the AD- and RD-based change-point results.
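
    A change-point model of this kind can be sketched as a piecewise-linear fit: scan candidate breakpoints for an imaging metric (e.g. FA) versus gestational age at birth and keep the one with the lowest residual sum of squares. This is a generic illustration with synthetic data, not the specific model of Wu et al.

```python
import numpy as np

rng = np.random.default_rng(1)
ga = np.linspace(28, 41, 60)  # gestational age at birth, weeks (synthetic cohort)
true_cp = 34.0                # synthetic change-point
fa = (np.where(ga < true_cp, 0.30 + 0.010 * (ga - true_cp), 0.30)
      + rng.normal(0, 0.004, ga.size))

def rss_at(cp):
    """Residual sum of squares for a hinge (piecewise-linear) model with break at cp."""
    x = np.column_stack([np.ones_like(ga), ga, np.maximum(ga - cp, 0.0)])
    _, res, *_ = np.linalg.lstsq(x, fa, rcond=None)
    return res[0]

candidates = np.arange(30.0, 39.0, 0.25)
best = min(candidates, key=rss_at)
print(f"estimated change-point = {best:.2f} weeks")
```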

  17. Transitivity analysis: a framework for the study of social values in the context of points of view.

    Science.gov (United States)

    Tsirogianni, Stavroula; Sammut, Gordon

    2014-09-01

    Since its inception, psychology has struggled with issues of conceptualization and operationalization of social-psychological phenomena. The study of social values and points of view has been prone to such difficulties, despite a predominant concern of qualitative distinctions in the variability of both of these phenomena across different individuals and social groups. And while interest in both traces a common origin in Rokeach's studies of narrow mindedness, the study of both phenomena has since proceeded apace. In this study, we posit a renewed reconciliation between the two that is best served through a social-psychological model of points of view in terms of the values that inspire them. We draw on critical linguistics to propose a theoretical and methodological framework that can aid a systematic study of value structures as they take different forms and meanings through particular types of points of view. In five stages of qualitative analysis, the model deconstructs utterances into distinct terms that reveal a predominant perspective-taking style that can be utilized towards the categorization of different points of view, in terms of values that imbue them and that serve to provide them with a coherent angle of constructing a particular narrative. © 2013 The British Psychological Society.

  18. Point-process analysis of neural spiking activity of muscle spindles recorded from thin-film longitudinal intrafascicular electrodes.

    Science.gov (United States)

    Citi, Luca; Djilas, Milan; Azevedo-Coste, Christine; Yoshida, Ken; Brown, Emery N; Barbieri, Riccardo

    2011-01-01

    Recordings from thin-film Longitudinal Intra-Fascicular Electrodes (tfLIFE) together with a wavelet-based de-noising and a correlation-based spike sorting algorithm, give access to firing patterns of muscle spindle afferents. In this study we use a point process probability structure to assess mechanical stimulus-response characteristics of muscle spindle spike trains. We assume that the stimulus intensity is primarily a linear combination of the spontaneous firing rate, the muscle extension, and the stretch velocity. By using the ability of the point process framework to provide an objective goodness of fit analysis, we were able to distinguish two classes of spike clusters with different statistical structure. We found that spike clusters with higher SNR have a temporal structure that can be fitted by an inverse Gaussian distribution while lower SNR clusters follow a Poisson-like distribution. The point process algorithm is further able to provide the instantaneous intensity function associated with the stimulus-response model with the best goodness of fit. This important result is a first step towards a point process decoding algorithm to estimate the muscle length and possibly provide closed loop Functional Electrical Stimulation (FES) systems with natural sensory feedback information.
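
    The goodness-of-fit comparison described above can be sketched by fitting inter-spike intervals (ISIs) with an inverse Gaussian and with an exponential (Poisson-like) model and comparing Kolmogorov-Smirnov statistics. The ISIs below are synthetic, and this is a simplified stand-in for the full point-process framework of the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
isi = rng.wald(0.05, 0.2, size=2000)  # synthetic inverse-Gaussian ISIs (mean 50 ms)

# Fit both candidate models with location fixed at zero
ig_params = stats.invgauss.fit(isi, floc=0)
exp_params = stats.expon.fit(isi, floc=0)

# Smaller KS statistic indicates the better-fitting model
ks_ig = stats.kstest(isi, "invgauss", args=ig_params).statistic
ks_exp = stats.kstest(isi, "expon", args=exp_params).statistic
print(f"KS invgauss = {ks_ig:.3f}, exponential = {ks_exp:.3f}")
print("inverse Gaussian fits better" if ks_ig < ks_exp else "exponential fits better")
```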

  19. In-silico Metabolome Target Analysis Towards PanC-based Antimycobacterial Agent Discovery.

    Science.gov (United States)

    Khoshkholgh-Sima, Baharak; Sardari, Soroush; Izadi Mobarakeh, Jalal; Khavari-Nejad, Ramezan Ali

    2015-01-01

    Mycobacterium tuberculosis, the main cause of tuberculosis (TB), remains a global health crisis, especially in developing countries. Tuberculosis treatment is a laborious and lengthy process with a high risk of noncompliance, cytotoxic adverse events and drug resistance in patients. Recently, there has been an alarming rise in drug resistance in TB. In this regard, there is an unmet need to develop novel antitubercular medicines that target new or more effective biochemical pathways to counter drug-resistant Mycobacterium. An integrated study of metabolic pathways through an in-silico approach played a key role in the antimycobacterial design process of this study. Our results suggest that pantothenate synthetase (PanC), anthranilate phosphoribosyltransferase (TrpD) and 3-isopropylmalate dehydratase (LeuD) might be appropriate drug targets. In the next step, in-silico ligand analysis was used for a more detailed study of the chemical tractability of these targets. This helped identify pantothenate synthetase (PanC, Rv3602c) as the best target for the antimycobacterial design procedure. Virtual library screening on the best ligand of PanC was then performed for inhibitory ligand design. In the end, five chemical intermediates showed significant inhibition of Mycobacterium bovis with good selectivity indices (SI) ≥10 according to the criteria of the US Tuberculosis Antimicrobial Acquisition & Coordinating Facility for antimycobacterial screening programs.

  20. Identification of Cell Surface Targets through Meta-analysis of Microarray Data

    Directory of Open Access Journals (Sweden)

    Henry Haeberle

    2012-07-01

    Full Text Available High-resolution image guidance for resection of residual tumor cells would enable more precise and complete excision for more effective treatment of cancers such as medulloblastoma, the most common pediatric brain cancer. Numerous studies have shown that brain tumor patient outcomes correlate with the precision of resection. To enable guided resection with molecular specificity and cellular resolution, molecular probes that effectively delineate brain tumor boundaries are essential. We therefore developed a bioinformatics approach to analyze microarray datasets for the identification of transcripts that encode candidate cell surface biomarkers highly enriched in medulloblastoma. The results identified 380 genes with greater than a two-fold increase in expression in medulloblastoma compared with normal cerebellum. To enrich for targets accessible to extracellular molecular probes, we further refined this list by filtering it with gene ontology annotations to identify genes whose protein products localize on, or within, the plasma membrane. To validate this meta-analysis, the top 10 candidates were evaluated with immunohistochemistry. We identified two targets, fibrillin 2 and EphA3, which specifically stain medulloblastoma. These results demonstrate a novel bioinformatics approach that successfully identified cell surface and extracellular candidate markers enriched in medulloblastoma versus adjacent cerebellum. These two proteins are high-value targets for the development of tumor-specific probes in medulloblastoma. This bioinformatics method has broad utility for the identification of accessible molecular targets in a variety of cancers and will enable probe development for guided resection.
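
    The two filtering steps described above (>2-fold enrichment, then plasma-membrane localization) can be sketched as follows. The gene names, expression values and GO annotations below are illustrative, not data from the study.

```python
expression = {
    # gene: (mean tumor signal, mean normal cerebellum signal) -- illustrative
    "FBN2":  (850.0, 120.0),
    "EPHA3": (640.0, 200.0),
    "GAPDH": (5000.0, 4800.0),
}
# Genes annotated to the GO term "plasma membrane" (illustrative subset)
go_membrane = {"FBN2", "EPHA3"}

# Step 1: keep transcripts with more than two-fold tumor/cerebellum enrichment
enriched = {
    gene for gene, (tumor, normal) in expression.items()
    if tumor / normal > 2.0
}

# Step 2: intersect with membrane-localized genes to get probe-accessible targets
candidates = sorted(enriched & go_membrane)
print(candidates)  # ['EPHA3', 'FBN2']
```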